Tuesday, April 24, 2018

Who is Watching Wall Street?

Since the ink dried on the GOP tax plan, officially known as the Tax Cuts and Jobs Act, back in December 2017, companies have spent over $218 billion repurchasing their own stock at the going price on the open market. The point of the tax law, according to Republicans, was to free up corporate cash so that companies could create jobs. Instead, it seems, companies are using the cash windfall to reward shareholders. Daily spending on buybacks has doubled from just a year ago and could reach a record high of $800 billion this year.

Why do companies buy back their own shares? Because buying back shares raises the price of the remaining ones—each share is now a slightly bigger slice of the corporate pie. Buybacks push up share prices easily and instantly, without the hard work of improving how the company attracts customers or makes its products.
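
[ed. To make the buyback arithmetic concrete, here is a minimal sketch in Python. The company, its earnings, and its share counts are invented for illustration; nothing below comes from the article.]

```python
# Hypothetical numbers: a company earning $1B with 500M shares outstanding
# buys back 10 percent of its stock. Earnings don't change; per-share
# figures do.
earnings = 1_000_000_000       # annual earnings, unchanged by the buyback
shares_before = 500_000_000    # shares outstanding before the buyback
shares_bought = 50_000_000     # shares repurchased on the open market

eps_before = earnings / shares_before
eps_after = earnings / (shares_before - shares_bought)

print(f"EPS before: ${eps_before:.2f}")  # $2.00
print(f"EPS after:  ${eps_after:.2f}")   # $2.22
# Earnings per share jumps about 11 percent even though the business is
# no more profitable: each remaining share is simply a bigger slice of
# the same pie.
```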

They are also a perfectly legal way for corporate executives, who hold huge chunks of stock, to juice their own pay. Executives decide which days to buy back shares and can then sell their own shares at the newly bumped-up price. Top executives generally make the majority of their compensation through performance-based pay, which is either directly or indirectly tied to stock prices. Even though the rules of performance-based pay changed under tax reform, it is likely that executives will remain large shareholders.

But the problem with stock buybacks isn’t just frustration with the 1 percent getting even richer. Nor is it just the hypocrisy of how the tax bill was sold by the Republican Party—though there is plenty of that. While Republicans promised the bill would raise worker wages, all of the analyses comparing spending on buybacks to spending on workers tell the same story: massive amounts of money are moving out to shareholders while very little is trickling down to workers. Moreover, Republicans promised improved innovation, but it should surprise no one that corporate investment relative to profits has declined from historical levels—hurting corporate potential in the long run—just as stock buybacks are on the rise.

Ending stock buybacks could be straightforward. Congress could amend the Securities Exchange Act to simply make open-market share repurchases illegal. Or it could impose limits on buybacks for companies that aren’t investing in their employees or funding their pension commitments, or allow buybacks only when workers also receive a dividend. The Securities and Exchange Commission (SEC) could also repeal the “safe-harbor” rule, which lets companies spend massively on buybacks, or at the very least make companies justify why buybacks are a good use of corporate cash.

But the current surge of stock buybacks is a symptom of a much larger problem: how deeply corporate leaders are able to manipulate our economy for their own gain, without oversight from those who are supposed to hold them accountable. We’re in the grip of a shareholder-primacy ideology, which posits that the purpose of corporate tax reform is to benefit shareholders because shareholders alone have a right to the spoils.

To find our way out of this mess, we must first understand how we got here.

Shareholder primacy as a framework for corporate behavior only became entrenched in the 1980s. The postwar era was dominated by “managerial capitalism,” in which the management of big corporations focused on sales growth and, in some cases, labor peace to ensure growing productivity. A white male worker could get a steady job that paid the bills, promised a pension, and was all but guaranteed for life. Shareholders were an afterthought.

In the 1960s, the big firms grew into conglomerates—highly diversified companies that by the 1970s ended up being worth less than the sum of their parts. Shareholders grew restless in the 1970s as the economy slowed and interest rates rose, but they were stymied from takeovers by prohibitive state corporate law and by federal antitrust regulation that still held back some industry consolidation. As the 1970s came to a close, prominent economists reframed the responsibility of executives from ensuring rising sales to maximizing shareholder value. Further, they claimed that the ideal executive compensation package should include large chunks of shares to align executives’ interests with shareholders’.

This has—not surprisingly—led to an obsessive focus by corporate leaders on short-term share prices and cost-cutting, with the workforce as the first cost cut. There has been a significant shift—in power and in material rewards—away from workers and towards shareholders since shareholder primacy rose to dominance in the 1970s. But it wasn’t a gradual or cultural shift—key policy interventions under Ronald Reagan broke the back of managerial capitalism and ushered in shareholder primacy.

In 1982, four key policy changes occurred that allowed shareholders to take over, or threaten to take over, companies, and pushed executives to focus on the share price or get out of the way. The first was an overhaul of the Department of Justice’s antitrust merger review guidelines so that industry consolidation was welcomed, not forbidden. The second was a Supreme Court case, Edgar v. MITE, which made state antitakeover statutes unconstitutional and allowed for the rise of the hostile takeover. The third was Reagan’s wholesale attack on unions, which ended an era of fragile labor peace.

The fourth policy change was Rule 10b-18, a Securities and Exchange Commission rule that ushered in the era of stock buybacks. Back in 1968, under the Williams Act amendments, Congress gave the SEC the authority to prohibit buybacks if it so chose. The SEC never prohibited them, but throughout the 1970s it proposed a rule that would have limited buybacks to 15 percent of the volume of a company’s shares trading on the open market and, more importantly, presumed that any buybacks over this limit were stock price manipulation and therefore likely illegal.

But the rule never passed. And in a turnaround that is familiar today, the Reagan Administration came in and promulgated a new rule that allowed companies to do whatever level of buybacks they liked. In 1982, under the leadership of John S. R. Shad—the first SEC Chair from Wall Street since the Great Depression—the Commission passed Rule 10b-18, the “safe harbor rule,” which limits a company’s buybacks to a daily amount equal to 25 percent of the stock’s average daily trading volume on the open market. But the rule is superficial: companies do not have to disclose how many shares they buy back each day, only per quarter, and even if they exceed that limit, there is no presumption that the purchases are market manipulation.
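
[ed. A minimal sketch of how the safe harbor's volume condition works. The 25 percent limit and the four-week averaging window follow common descriptions of Rule 10b-18; the trading volumes below are invented.]

```python
# Rule 10b-18's volume condition, sketched. The 25 percent figure is the
# rule's; the volumes are invented for illustration.
daily_volumes = [2_000_000] * 20   # shares traded per session, past 4 weeks

adtv = sum(daily_volumes) / len(daily_volumes)   # average daily trading volume
daily_cap = 0.25 * adtv
print(f"Safe-harbor daily cap: {daily_cap:,.0f} shares")   # 500,000

# The article's point: disclosure is quarterly, not daily, so nobody can
# check a given day's purchases against the cap. And exceeding the cap
# merely forfeits the safe harbor; unlike the SEC's abandoned 1970s
# proposal, it carries no presumption of manipulation.
todays_purchases = 750_000
print("Within safe harbor?", todays_purchases <= daily_cap)   # False
```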

The safe harbor rule is akin to a speed limit: you have to stay under a certain speed. But imagine if no one ever sat by the side of the road to see how fast you drove. What would you do?

The answer is buy back a huge volume of stock in order to keep those share prices rising. Economist William Lazonick, who has done the most to bring attention to the harm of stock buybacks, calculated that from 2003 to 2012, public companies in the S&P 500 index spent over 90 percent of their earnings on buybacks and dividends. You might think the Obama Administration’s SEC would have given more attention to this practice, but you’d be wrong. In 2015, former SEC Chair Mary Jo White admitted that the agency does not collect the data to know if companies are staying within the daily volume and timing limits. All of this precedes the avalanche of buybacks we’re seeing now.

This practice is the heart of “shareholder primacy”—executives claim that they’re helpless to raise wages, slow down the scale of stock buybacks, or stop the fissuring of the workplace because they have to meet the insatiable demand for an ever-rising share price. The dollar amounts that companies spend on buybacks show just how easily corporations could afford to pay decent wages. Walmart’s base wage, for instance, rose from $10 to $11 an hour this year, a change announced to great fanfare after tax reform. But for a starting worker, it still means that he or she earns $19,448 a year. Meanwhile, Walmart authorized a buyback program of $20 billion in 2017.
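
[ed. A quick check of where the $19,448 figure comes from: at $11 an hour it implies roughly a 34-hour week. That hours figure is an inference, not stated in the article, and the 1.5 million U.S. workforce used for scale below is likewise an assumption.]

```python
# Back-of-the-envelope check on the article's pay figure.
hourly_wage = 11.00
annual_pay = 19_448

hours_per_year = annual_pay / hourly_wage
print(f"Implied hours/year: {hours_per_year:,.0f}")       # ~1,768
print(f"Implied hours/week: {hours_per_year / 52:.1f}")   # ~34

# For scale against the $20B buyback: spread across an assumed 1.5 million
# U.S. Walmart workers, the authorization comes to about $13,000 each.
print(f"Buyback per worker: ${20e9 / 1.5e6:,.0f}")        # ~$13,333
```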

by Lenore Palladino, Boston Review |  Read more:
Image: Sam Valadi

Bolivia’s Quest to Spread the Gospel of Coca

One thing about chewing coca leaves that is weird to the neophyte is their specific, sylvan kind of taste. Unlike the chemical stain that cocaine burns on the back of the throat, coca can seem like a hippie cleanse for the mouth. To start, there is the inescapable fibrousness; even with some dexterous tongue and tooth work, little twig-like stems end up pressed against the inside of the cheek or stabbing at the gums. Then there is the flavor, a musty piquancy of autumn leaves suffused with a tannic tang. The effect is slightly astringent. Chewing is generally a misnomer, since coca is piled up into a wad on one side of the mouth and sucked on, but some people gnash at the lanceolate leaves until tiny green specks garnish the teeth like dried parsley.

When a person chews coca, a cocktail of compounds is secreted from the leaves and absorbed into the body. This contains dozens of alkaloids that include the cocaine compound, and it has mild psychotropic effects in its unprocessed form. Its processed form, obviously, is a different matter. People from Andean countries like to say that coca’s relationship to cocaine is like the grape to wine. The equivalence isn’t totally precise, but coca is a centerpiece in traditional ceremonies and has the status of a sacred substance and so it enjoys, like the Holy Eucharist, a certain factual leniency.

Of course, neither its natural consumption nor its spiritual status has saved the coca plant from becoming a harbinger of bloodshed. Coca garnered its peculiar status when a German graduate student isolated a pure form of its electrifying alkaloid from a fresh shipment of leaves in 1859 using alcohol, sulfuric acid, sodium carbonate and ether. Cocaine’s global market is now worth around 80 billion dollars per year. It is also illicit. An untold number of people have been killed for having some connection, tenuous or not, to the trade. Drug-related violence has made parts of Latin America among the most dangerous places on the planet.

Nowhere has coca been more important than in Bolivia, South America’s poorest country. Though its governments have traditionally toed the line of U.S. foreign policy on drugs since at least the 1980s, Bolivia’s current president, Evo Morales, threw out the Drug Enforcement Administration (DEA) nearly a decade ago while vowing to resuscitate coca’s sullied reputation. “Coca,” Morales has said so often that the phrase could be printed on the currency, “is not cocaine.” After decades of sweaty counter-narcotics operations, during which U.S.-trained soldiers scoured the jungle uprooting coca bushes and Americans and Europeans snorted cocaine anyway, Morales called a stop to eradication campaigns in his country. Instead, the cocaleros of Bolivia have cultivated the conviction that they can spread the gospel of coca. “Our philosophy is clear,” the country’s leading anti-drug official, Sabino Mendoza, told me. “Coca should be consumed, in its natural state.” To that end, the Bolivian government has spent millions of dollars and put forward a law to support its coca market. It has shunned the War on Drugs and sought instead to create alternate markets for coca leaf by supporting industrialization. Teas, shampoos, wines, cakes, liquors, flour, toothpastes, energy drinks and candies that feature the leaf have been produced, some in government-backed factories.

It sometimes seems like Bolivians will market anything that contains their quasi-magical plant. Anything that could lure investors. Anything that could trade internationally. Anything, anything but cocaine. (...)

Coca, especially in the highlands, enjoys near panacea status. It has deep ties to indigenous culture, and the 30 percent of Bolivians who chew it regularly believe that it can alleviate most ills. In the new and growing coca product market, this tonic-like reputation is its most marketable aspect. “With Coca Real, it’s just the same,” one of Bolivia’s rising coca entrepreneurs, Juan Manuel Rivero, told me, referring to his flagship product, a carbonated energy drink containing coca extract. “A healthy beverage that will effectively combat sorojchi [altitude sickness], alleviate exhaustion, and eliminate physical or mental fatigue.” Rivero is one of a dozen or so entrepreneurs who have obtained permission from the government to purchase coca for industrial development. While it’s not illegal to have coca in Bolivia, there is a limit on the amount that can be transported without a permit, and the movement of leaves is closely monitored. His Coca Real drink is one of the products that have entered the market seeking to capitalize on a sympathetic regime and shifting global attitudes about regulating certain kinds of substances.

At Rivero’s factory, where he produces soda concentrate, he offered me some of the finished, neon-green liquid product in a glass to try. It tasted like coca’s distant cousin, just arrived from Miami smacking bubble gum and raving about party yachts. Sweet, bubbly; the unmistakable descendant of Red Bull. I drank it quickly, and recognized an afternote redolent of coca’s tang. “Coca has one bad alkaloid, which is cocaine, and the rest of its alkaloids are good,” Rivero said. (The white powder cocaine is usually the cocaine alkaloid isolated in hydrochloride salt form, occasionally cut with other substances.) “We are sure that our product does not contain a single bad alkaloid. We want to show Bolivia and the world that it’s possible to make appealing derivatives that can be consumed and don’t cause addiction.” (...)

In July 2017, I travelled to the Chapare, a tropical province north of Cochabamba and one of Bolivia’s two major coca-growing regions, to meet Rivero’s outreach team. The road from the highlands down to the rainforest river basin traces its way along mountain saddles overhung with clouds and neon panicles of lobster claw flowers. It is also punctuated by checkpoints. Just a few decades ago, growing coca in the Chapare was prohibited. The area became ground zero in the U.S. War on Drugs. Interdiction forces conducted merciless campaigns against coca growers, who still bitterly resent the authors of their suffering.

I was going to the annual coca fair, where Coca Real was making a pitch, held just up the road from a mirrored glass-plated factory that was built to produce coca products. Flanked on all sides by the hyper-green rainforest, the fair stalls created haphazard corridors where revelers wandered, their cheeks bulging with coca. One vendor, selling frosting-smeared cupcakes topped with decorative coca leaf, told me that she had experimented for months to get the flavor right–there can’t be too much coca, she said, or the cake turns bitter. A man hawked coca shampoo as a cure for hair loss.

Nowhere in Bolivia has the impact of President Evo Morales’s 2005 election been felt more dramatically than the Chapare, where his activism leading one of the major coca unions thrust him into the national political spotlight and ultimately carried him to electoral victory thirteen years ago. Morales, who is the country’s first indigenous president and who was raised in poverty in the highlands before moving to the Chapare as a young man, has remained loyal to his base. Duly, he had promised to make an appearance at the fair. On the day of his scheduled arrival, farmers stood in their mud-splattered shoes and Sunday shirts with eyes turned skyward waiting for a sign of his helicopter.

Morales has increasingly become a subject of controversy in Bolivia, ever more with his recent efforts to massage the constitution to extend his long tenure in the presidential palace. But in the Chapare, support for him is unflagging. Asterio Romero, Morales’s friend and union colleague and currently the mayor of one of the region’s largest cities, told me he believed Morales was sent by God. That, he said, was the only explanation for Morales’s famous work ethic–the president sleeps little, and has been known to call ministers to the palace for meetings at 5 o’clock in the morning. To the people of the Chapare, he also represents someone who understands the pain of the drug war years.

For Morales, the piecemeal documentation of atrocities committed in the 1980s and 1990s in the name of eradicating coca plants is not jarring. He was there for clashes that produced albums filled with grainy photos of men and women with lash-like bruises and gaping bullet wounds, undergoing emergency outdoor surgeries or building barricades to block police trucks; the medical certificates of hematomas, contusions, puncture wounds and edemas; the autopsy reports documenting bullet trajectories. One report from 2008, published by Bolivian government agencies, in which Morales says he was tortured during his many detentions by anti-narcotics squads, includes photographs of the president himself. In them, he has the same mop-top haircut, but his face has the sheen of youth, and he is propped on a medical examining table with purple lesions crisscrossing his back and snaking over his shoulder.

By the time the report was published, Morales had been elected to his first presidential term, and he would soon expel the U.S. Ambassador and the DEA from the country. Although many of the boots-on-the-ground anti-narcotics campaigns were carried out by special Bolivian police and military forces like UMOPAR, the Chapare was one of the first places where the DEA began its foreign War on Drugs operations, and many Bolivians still hold the U.S. responsible for the squads’ violence and corruption. “From the U.S., they made the DEA pressure us at gunpoint and with gas,” a coca union leader named Isidora Coronado told me. “It was a difficult time. A lot of women, especially, were traumatized; there were assaults, and the men in uniform could do whatever they wanted. But from the moment [Morales] became president we haven’t had those kinds of clashes anymore.”

If anything, the coca fair was a celebration of the victories won, and by extension, each artisanal coca product on offer seemed a small tribute to the struggle (Coca Real’s stand, with its flashy cardboard cutout of a life-sized bottle, was nearly alone in its unabashedly commercial design). Wherever Morales goes, he is greeted by garlands of flowers; shortly after his helicopter landed on the afternoon of the fair and he emerged from a black SUV among a flock of bodyguards, Morales was garlanded with coca leaves and presented with a shamrock-colored cake made from ground leaves and varnished with white icing. Farmers with pleated skirts and long braids presented him with baskets of guava and sweet potatoes. He spoke for 15 minutes, praising the new coca policies and promising more industrialization. To finish his speech, he chanted a famous slogan in Quechua, joined by hundreds of voices: “Kawsachun coca! Huañuchun Yanquis!” Long live coca. Down with the Yankees.

About 17 million people around the world used cocaine at some point in 2015, according to the latest data from the United Nations Office on Drugs and Crime (UNODC). A third of those people were in North America. While the DEA estimates that cocaine use is increasing in the U.S., most of its field divisions don’t consider the drug to be as urgent a threat as other controlled substances. Cocaine-related deaths have spiked, but this is largely due to a fad of speedballing it with fentanyl. In any case, the agency’s laboratory analyses conclude that 92 percent of cocaine in the U.S. market originated in Colombia and six percent in Peru–two countries where American interdiction programs are still robustly in place.

Bolivia, however, has been singled out by the U.S. government as being a special pain in the ass. Its truculence has earned it repeat mention on the White House’s annual presidential memorandum on illicit drug producing countries, where it is rebuked for having “failed demonstrably” to adequately enact counternarcotics policies. Since it’s an illegal market, drug production can only be measured by proxy, and so the UNODC calculates the number of hectares of coca cultivated using satellite and aerial imagery to guess at the amount of cocaine produced (it also looks at police seizures of finished cocaine and of the intermediary butter-like paste product). Its most recent data for the three major coca producing countries put cultivation at 146,000 hectares in Colombia, 43,900 in Peru, and 23,100 in Bolivia. The U.S. Department of State disagrees with the methodology and says there are more hectares in cultivation, though still less than in Colombia or Peru. But in September’s memo, the White House exempted Colombia, reasoning that its police and army are close security allies.

Bolivia is something else entirely.

by Jessica Camille Aguirre, Guernica | Read more:
Image: Ansellia Kulikku

Sonny Boy Williamson

Oregon Grew More Cannabis Than Customers Can Smoke


A recent Sunday afternoon at the Bridge City Collective cannabis shop in North Portland saw a steady flow of customers.

Little wonder: A gram of weed was selling for less than the price of a glass of wine.

The $4 and $5 grams enticed Scotty Saunders, a 24-year-old sporting a gray hoodie, to spend $88 picking out new products to try with a friend. "We've definitely seen a huge drop in prices," he says.

Across the wood-and-glass counter, Bridge City owner David Alport was less delighted. He says he's never sold marijuana this cheap before.

"We have standard grams on the shelf at $4," Alport says. "Before, we didn't see a gram below $8."

The scene at Bridge City Collective is playing out across the city and state. Three years into Oregon's era of recreational cannabis, the state is inundated with legal weed.

It turns out Oregonians are good at growing cannabis—too good.

In February, state officials announced that 1.1 million pounds of cannabis flower were logged in the state's database.

If a million pounds sounds like a lot of pot, that's because it is: Last year, Oregonians smoked, vaped or otherwise consumed just under 340,000 pounds of legal bud.

That means Oregon farmers have grown three times what their clientele can smoke in a year.
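
[ed. The "three times" figure is a one-line calculation from the two numbers above; a quick check using only the article's own figures.]

```python
# Oregon's oversupply, from the article's figures.
grown_lbs = 1_100_000    # flower logged in the state's database
consumed_lbs = 340_000   # legal consumption last year ("just under" this)

print(f"Ratio: {grown_lbs / consumed_lbs:.1f}x")       # ~3.2x annual demand
print(f"Surplus: {grown_lbs - consumed_lbs:,} lbs")    # 760,000 lbs unsold
```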

Yet state documents show the number of Oregon weed farmers is poised to double this summer—without much regard to whether there's demand to fill.

The result? Prices are dropping to unprecedented lows in auction houses and on dispensary counters across the state.

Wholesale sun-grown weed fell from $1,500 a pound last summer to as low as $700 by mid-October. On store shelves, that means the price of sun-grown flower has been sliced in half to those four-buck grams.

For Oregon customers, this is a bonanza. A gram of the beloved Girl Scout Cookies strain now sells for little more than two boxes of actual Girl Scout cookies.

by Matt Stangel and Katie Shepherd, Willamette Week |  Read more:
Image: East Fork Cultivars

Your Sea Wall Won’t Save You

In 2011, a catastrophic flood washed through greater Bangkok. Hydrologically, this was not so unusual; Bangkok occupies the Chao Phraya River Delta, and although the rainfall that year was higher than normal, the waters didn’t reach the hundred-year flood level. But the landscape was more vulnerable than in past cycles. Factory development in the flood plain, subsidence caused by groundwater extraction, and mismanagement of dams upriver led to severe flooding that killed more than 800 people and affected some 13 million lives. Protected by the King’s dike, which encircles the Bangkok Metropolitan Area, the capital city was largely spared, but displaced floodwaters made conditions worse in outlying districts. The sacrifice zones were inundated for weeks, and then months. As angry “flood mobs” descended on the protected areas, opening flood gates and tearing holes in the sandbag walls, the prime minister counseled them to think of the national good. If the city center flooded, she said, it would cause “foreigners to lose confidence in us and wonder why we cannot save our own capital.”

And here is a dark truth of planning for “climate resilience.” Decisions about which areas will be protected are not only about whose safety will be guaranteed; they also involve transnational concerns like reassuring global investors and preserving manufacturing supply chains. In Thailand, thousands of soldiers were dispatched to patrol the floodwalls. They were enforcing resilience. This is both a rational decision and a disturbing vision of our climate-changed future. We are heading toward a world in which the unequal distribution of environmental risks is administered by state violence. How did we get here?

This article looks at four large cities in Southeast Asia facing major climate risks: Jakarta, Manila, Ho Chi Minh City, and Bangkok. Each is home to at least 8 million people living in a low-lying delta threatened by rapid urbanization, sinking ground, and rising seas. As officials seek to make their cities more resilient, they bring in outside planning experts who push “climate-proofing” models developed in Japan and Europe, especially in the Netherlands, which has a long history of advanced water management strategies. Highly engineered, technocratic programs come with readymade slogans, like “making room for the river,” a concept which works well along the banks of the Rhine but can mean mass evictions in the Global South. When “slums” (often a slur for urbanized villages with deep histories) are represented as a blight, to be scraped away with little if any recompense, and their people resettled in untenable locations far from the city center, we must ask whose resilience is really being promoted. Too often, the rhetoric of climate adaptation is doublespeak for the displacement of poor, informal communities, and an alibi for unsustainable growth. (...)

Jakarta

Let’s start in Jakarta, where the “Great Garuda” is the charismatic megafauna of resilience infrastructures. About 40 percent of the city is below sea level, and regular flooding along the highly polluted rivers and colonial canals is a fact of life. In 2007, floods forced 300,000 evacuations and spurred new plans to fortify the city against rising waters. An international team led by the Dutch engineering firms Witteveen+Bos and Grontmij proposed to build the National Capital Integrated Coastal Development, which envisions artificial islands in the Jakarta Bay anchored by the world’s largest sea wall. The scheme, which resembles a garuda, the mythical bird that is a national symbol of Indonesia, is financed largely by private development on the islands, including a new Central Business District housing 1.5 million people.

Victor Coenen, the project manager for Witteveen+Bos, describes the NCICD as “one big polder,” referencing the Dutch strategy for enclosing land within dikes to artificially control its hydrology. Essentially, Jakarta Bay will be a bathtub, completely separated from the Java Sea; the city’s rivers will drain here and then be pumped out to the ocean. Critics argue that disrupting the hydrology will harm local fisheries, trap polluted waters within the city, and exacerbate flooding outside the wall. In response to these concerns, as well as allegations of corruption, the sea wall was redesigned to be a mere(!) 30 km long. Now nearing completion, it will be Jakarta’s iron lung, requiring a whole secondary life-support system of pumps and drainage systems. To prevent retention areas from becoming polluted “black lagoons,” the city will need major sanitation upgrades, which are being led by German and Japanese partners. Getting the various projects to play well together, on time and within scope, is an immense challenge.
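
[ed. A toy water balance makes the "iron lung" point concrete. Every number below is invented for illustration; none comes from the NCICD plans.]

```python
# Toy water balance for a polder: rivers drain into the enclosed bay,
# and pumps must lift the water over the sea wall to the Java Sea.
river_inflow = 60.0     # m^3/s entering the retention basin
pump_capacity = 50.0    # m^3/s the pumps can move out to sea
basin_area = 30e6       # m^2 of retention basin behind the wall

net_inflow = river_inflow - pump_capacity            # m^3/s accumulating
rise_per_day = net_inflow * 86_400 / basin_area      # meters per day
print(f"Basin rises {rise_per_day:.3f} m/day")       # ~0.029 m/day

# Manageable on paper, but the margin vanishes if a storm multiplies the
# inflow or a pump fails: the basin stays dry only as long as the
# life-support system keeps running.
```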

by Lizzie Yarina, Places Journal |  Read more:
Image: KuiperCompagnons

What Went Wrong With the Internet

Over the last few months, Select All has interviewed more than a dozen prominent technology figures about what has gone wrong with the contemporary internet for a project called “The Internet Apologizes.” We’re now publishing lengthier transcripts of each individual interview. This interview features Jaron Lanier, a pioneer in the field of virtual reality and the founder of the first company to sell VR goggles. Lanier currently works at Microsoft Research as an interdisciplinary scientist. He is the author of the forthcoming book Ten Arguments for Deleting Your Social Media Accounts Right Now.

You can find other interviews from this series here.


Jaron Lanier: Can I just say one thing now, just to be very clear? Professionally, I’m at Microsoft, but when I speak to you, I’m not representing Microsoft at all. There’s not even the slightest hint that this represents any official Microsoft thing. I have an agreement within which I’m able to be an independent public intellectual, even if it means criticizing them. I just want to be very clear that this isn’t a Microsoft position.

Noah Kulwin: Understood.
Yeah, sorry. I really just wanted to get that down. So now please go ahead, I’m so sorry to interrupt you.

In November, you told Maureen Dowd that it’s scary and awful how out of touch Silicon Valley people have become. It’s a pretty forward remark. I’m kind of curious what you mean by that.

To me, one of the patterns we see that makes the world go wrong is when somebody acts as if they aren’t powerful when they actually are powerful. So if you’re still reacting against whatever you used to struggle for, but actually you’re in control, then you end up creating great damage in the world. Like, oh, I don’t know, I could give you many examples. But let’s say like Russia’s still acting as if it’s being destroyed when it isn’t, and it’s creating great damage in the world. And Silicon Valley’s kind of like that.

We used to be kind of rebels, like, if you go back to the origins of Silicon Valley culture, there were these big traditional companies like IBM that seemed to be impenetrable fortresses. And we had to create our own world. To us, we were the underdogs and we had to struggle. And we’ve won. I mean, we have just totally won. We run everything. We are the conduit of everything else happening in the world. We’ve disrupted absolutely everything. Politics, finance, education, media, relationships — family relationships, romantic relationships — we’ve put ourselves in the middle of everything, we’ve absolutely won. But we don’t act like it.

We have no sense of balance or modesty or graciousness having won. We’re still acting as if we’re in trouble and we have to defend ourselves, which is preposterous. And so in doing that we really kind of turn into assholes, you know?

How do you think that siege mentality has fed into the ongoing crisis with the tech backlash?

One of the problems is that we’ve isolated ourselves through extreme wealth and success. Before, we might’ve been isolated because we were nerdy insurgents. But now we’ve found a new method to isolate ourselves, where we’re just so successful and so different from so many other people that our circumstances are different. And we have less in common with all the people whose lives we’ve disrupted. I’m just really struck by that. I’m struck with just how much better off we are financially, and I don’t like the feeling of it.

Personally, I would give up a lot of the wealth and elite status that we have in order to just live in a friendly, more connected world where it would be easier to move about and not feel like everything else is insecure and falling apart. People in the tech world, they’re all doing great, they all feel secure. I mean they might worry about a nuclear attack or something, but their personal lives are really secure.

And then when you move out of the tech world, everybody’s struggling. It’s a very strange thing. The numbers show an economy that’s doing well, but the reality is that the way it’s doing well doesn’t give many people a feeling of security or confidence in their futures. It’s like everybody’s working for Uber in one way or another. Everything’s become the gig economy. And we routed it that way, that’s our doing. There’s this strange feeling when you just look outside of the tight circle of Silicon Valley, almost like entering another country, where people are less secure. It’s not a good feeling. I don’t think it’s worth it, I think we’re wrong to want that feeling.

It’s not so much that they’re doing badly, but they have only labor and no capital. Or the way I used to put it is, they have to sing for their supper, for every single meal. It’s making everyone else take on all the risk. It’s like we’re the people running the casino and everybody else takes the risks and we don’t. That’s how it feels to me. It’s not so much that everyone else is doing badly as that they’ve lost economic capital and standing, and momentum and plannability. It’s a subtle difference.

There’s still this rhetoric of being the underdog in the tech industry. The attitude within the Valley is “Are you kidding? You think we’re resting on our laurels? No! We have to fight for every yard.”

There’s this question of whether what you’re fighting for is something that’s really new and a benefit for humanity, or if you’re only engaged in a sort of contest with other people that’s fundamentally not meaningful to anyone else. The theory of markets and capitalism is that when we compete, what we’re competing for is to get better at something that’s actually a benefit to people, so that everybody wins. So if you’re building a better mousetrap, or a better machine-learning algorithm, then that competition should generate improvement for everybody.

But if it’s a purely abstract competition set up between insiders to the exclusion of outsiders, it might feel like a competition, it might feel very challenging and stressful and hard to the people doing it, but it doesn’t actually do anything for anybody else. It’s no longer genuinely productive for anybody, it’s a fake. And I’m a little concerned that a lot of what we’ve been doing in Silicon Valley has started to take on that quality. I think that’s been a problem in Wall Street for a while, but the way it’s been a problem in Wall Street has been aided by Silicon Valley. Everything becomes a little more abstract and a little more computer-based. You have this very complex style of competition that might not actually have much substance to it.

You look at the big platforms, and it’s not like there’s this bountiful ecosystem of start-ups. The rate of small-business creation is at its lowest in decades, and instead you have a certain number of start-ups competing to be acquired by a handful of companies. There are not that many varying powers, there’s just a few.

That’s something I’ve been complaining about and I’ve written about for a while, that Silicon Valley used to be this place where people could do a start-up and the start-up might become a big company on its own, or it might be acquired, or it might merge into things. But lately it kind of feels like both at the start and at the end of the life of a start-up, things are a little bit more constrained. It used to be that you didn’t have to know the right people, but now you do. You have to get in with the right angel investors or incubator or whatever at the start. And they’re just a small number, it’s like a social order, you have to get into them. And then the output on the other side is usually being acquired by one of a very small number of top companies.

There are a few exceptions, you can see Dropbox’s IPO. But they’re rarer and rarer. And I suspect Dropbox in the future might very well be acquired by one of the giants. It’s not clear that it’ll survive as its own thing in the long term. I mean, we don’t know. I have no inside information about that, I’m just saying that the much more typical scenario now, as you described, is that the companies go to one of the biggies.

I’m kind of curious what you think needs to happen to prevent future platforms, like VR, from going the way of social media and reaching this really profitable crisis state.

A lot of the rhetoric of Silicon Valley that has the utopian ring about creating meaningful communities where everybody’s creative and people collaborate and all this stuff — I don’t wanna make too much of my own contribution, but I was kind of the first author of some of that rhetoric a long time ago. So it kind of stings for me to see it misused. Like, I used to talk about how virtual reality could be a tool for empathy, and then I see Mark Zuckerberg talking about how VR could be a tool for empathy while being profoundly nonempathic, using VR to tour Puerto Rico after the storm, after Maria. One has this feeling of having contributed to something that’s gone very wrong.

So I guess the overall way I think of it is, first, we might remember ourselves as having been lucky that some of these problems started to come to a head during the social-media era, before tools like virtual reality become more prominent, because the technology is still not as intense as it probably will be in the future. So as bad as it’s been, as bad as the election interference and the fomenting of ethnic warfare, and the empowering of neo-Nazis, and the bullying — as bad as all of that has been, we might remember ourselves as having been fortunate that it happened when the technology was really just little slabs we carried around in our pockets that we could look at and that could talk to us, or little speakers we could talk to. It wasn’t yet a whole simulated reality that we could inhabit.

Because that will be so much more intense, and that has so much more potential for behavior modification, and fooling people, and controlling people. So things potentially could get a lot worse, and hopefully they’ll get better as a result of our experiences during this era.

As far as what to do differently, I’ve had a particular take on this for a long time that not everybody agrees with. I think the fundamental mistake we made is that we set up the wrong financial incentives, and that’s caused us to turn into jerks and screw around with people too much. Way back in the ’80s, we wanted everything to be free because we were hippie socialists. But we also loved entrepreneurs because we loved Steve Jobs. So you wanna be both a socialist and a libertarian at the same time, and it’s absurd. But that’s the kind of absurdity that Silicon Valley culture has to grapple with.

And there’s only one way to merge the two things, which is what we call the advertising model, where everything’s free but you pay for it by selling ads. But then because the technology gets better and better, the computers get bigger and cheaper, there’s more and more data — what started out as advertising morphed into continuous behavior modification on a mass basis, with everyone under surveillance by their devices and receiving calculated stimulus to modify them. So you end up with this mass behavior-modification empire, which is straight out of Philip K. Dick, or from earlier generations, from 1984.

It’s this thing that we were warned about. It’s this thing that we knew could happen. Norbert Wiener, who coined the term cybernetics, warned about it as a possibility. And despite all the warnings, and despite all of the cautions, we just walked right into it, and we created mass behavior-modification regimes out of our digital networks. We did it out of this desire to be both cool socialists and cool libertarians at the same time.

by Noah Kulwin, Select All |  Read more:
Image: Brian Ach/Getty Images

Monday, April 23, 2018


Mark Anderson
via: Andertoon.com

Where Have All the Pilots Gone?


— well, the instructor who made that first takeoff seem easy told me, later that same day, that most people who begin pilot training never finish it. There are plenty of good reasons for that. It is, as my friend Dillo put it, more expensive than a crack habit. People hit plateaus and get frustrated and give up. But I think the main reason is because it’s complicated, and difficult, and stressful, and when the lessons stop being novel, people stop forcing themselves to do the hard thing, despite the ultimate rewards.

Where Have All the Pilots Gone? (TechCrunch)
Image: FAA

via: The Guardian
repost

Sunday, April 22, 2018

Is the Internet Complete?

In 2013, a debate was held between friends Peter Thiel and Marc Andreessen, the thrust of which was to determine whether we were living through an innovation golden age, or whether innovation was in fact stalling. Thiel, of course, played the innovation sceptic, and it is interesting now, with five years’ remove, to look back on the debate to see how history has vindicated his position. In short, all of those things that were ‘just around the corner’ in 2013 are, sure enough, still ‘just around the corner.’

One strand of Thiel’s argument at the time (and since) was that the ostentatious progress made in computing in the last 15 years has blinded us to the lack of technological progress made elsewhere. We can hardly have failed to notice the internet revolution, and thus we map that progress onto everything, assuming that innovation is a cosmic force rather than something which happens on a piecemeal basis.

Certainly, this argument has gained more traction since 2013. However, in this piece I’d like to add an extra layer to it. Is it possible that innovation is not only stalling in non-tech areas, but in tech itself? Could we make an argument to say that the internet itself is, in fact, complete?

The driving logic for this argument is easy to dismiss—namely that all of the big ‘possible’ ideas associated with the internet have been taken. One might say that companies like Google, Facebook, and Amazon were all inevitabilities from the moment computers around the world started to link up, and that once these roles were filled, innovation started to dry up as there was fundamentally ‘nothing left to do.’

The first counter to this is that it’s easy to say in hindsight. Sure, Amazon—or a company like it—seems like an inevitability now, but there was once a time when people were highly sceptical of the idea that anyone would want to conduct any type of financial transaction over the internet. The second counter, proved by the first, is to say that we can’t possibly know what might be coming over the horizon at any given time. The next Google might be just about to break, and if it were to do so then it would make a mockery of such defeatism.

Both of these arguments are fair and true. However they simply refute the idea that the internet is finished at this moment, rather than the more fundamental idea that it’s possible for the internet to be finished at all. It is this second idea—or at least the theoretical possibility of it—that I want to illustrate here.

Let’s compare the internet to another world-changing innovation—the car. The car started as a ‘base concept’: a motorised chassis to transport you from point A to point B. That was the car on ‘day one,’ and this underlying concept has remained true up to the present day. However, that does not mean that the idea was complete on ‘day one.’ Over time, the car was innovated upon and developed. We added passenger seats so you could take people with you. We added a roof, so it wasn’t only suitable for fair weather. We added air conditioning to keep us comfortable, and a radio to keep us entertained. And, of course, we dramatically improved its performance and reliability. All in all, it probably took about 60 years for the car to go from ‘base concept’ to ‘finished article,’ from which point all cars have remained, on the whole, the same. Sure, a car from 2018 is far more advanced than a car from 1965, but it isn’t fundamentally different. It’s just a more polished version of the same thing. The 1965 car is, however, quite a lot different from an 1895 car, because that was the period of true innovation that fleshed out the idea.

We can say, therefore, that the car—as a concept—is ‘finished.’ Now, that isn’t to say, of course, that there has been no innovation since 1965, or that there won’t be any innovation in the future. Far from it. But it is to say that this innovation has been, on the whole, mere improvement on a static idea. Cars are cars, TVs are TVs, washing machines are washing machines. Once the idea is complete, we merely fiddle at the edges.

In spite of this precedent, we don’t see the internet in the same way. We don’t see the internet as a ‘base concept’ (i.e. a vast directory of information), which is gradually being shaped and polished into a finished article, from which point it will just tick along. Why not? I would suggest it’s because of the business structure. With the car, you had competing businesses each turning out their own version of the idea. Ford versus Mercedes versus Nissan. However, with the internet, you don’t have different ‘competing internets,’ you just have one—and business’s role within it is to look after the component pieces.

It’s a bit like there had only ever been one car, and different brands had each brought a new addition to the table to create the final useful thing. Facebook came along and put in the seats, Google the driving interface, YouTube the radio, and so on until the car was finished.

Seeing the internet this way, we might speculate that we have come to the end of the initial shaping of the idea, and that from this point on we shall merely be optimising it. We have on our hands the internet equivalent of a ’58 Chevy—there’s a long way to go, but fundamentally it does what we want it to do.

by Alex Smith, Quillette |  Read more:
Image: uncredited
[ed. See also: The Comments section.]

Peer Pressure

As I was writing this review, two friends called to ask me about “that book that says parents don’t matter.” Well, that’s not what it says. What “The Nurture Assumption” does say about parents and children, however, warrants the lively controversy it began generating even before publication.

Judith Rich Harris was chucked out of graduate school at Harvard 38 years ago, on the grounds that she was unlikely to become a proper experimental psychologist. She never became an academic and instead turned her hand to writing textbooks in developmental psychology. From this bird’s-eye vantage point, she began to question widespread belief in the “nurture assumption—the notion that parents are the most important part of a child’s environment and can determine, to a large extent, how the child turns out.” She believes that parents must share credit (or blame) with the child’s own temperament and, most of all, with the child’s peers. “The world that children share with their peers is what shapes their behavior and modifies the characteristics they were born with,” Harris writes, “and hence determines the sort of people they will be when they grow up.”

The public may be forgiven for saying, “Here we go again.” One year we’re told bonding is the key, the next that it’s birth order. Wait, what really matters is stimulation. The first five years of life are the most important; no, the first three years; no, it’s all over by the first year. Forget that: It’s all genetics! Cancel those baby massage sessions!

What makes Harris’s book important is that it puts all these theories into larger perspective, showing what each contributes and where it’s flawed. Some critics may pounce on her for not having a Ph.D. or an academic position, and others will quarrel with the importance she places on peers and genes, but they cannot fault her scholarship. Harris is not generalizing from a single study that can be attacked on statistical grounds, or even from a single field; she draws on research from behavior genetics (the study of genetic contributions to personality), social psychology, child development, ethology, evolution and culture. Lively anecdotes about real children suffuse this book, but Harris never confuses anecdotes with data. The originality of “The Nurture Assumption” lies not in the studies she cites, but in the way she has reconfigured them to explain findings that have puzzled psychologists for years.

First, researchers have been unable to find any child-rearing practice that predicts children's personalities, achievements or problems outside the home. Parents don't have a single child-rearing style anyway, because how they treat their children depends largely on what the children are like. They are more permissive with easy children and more punitive with defiant ones.

Second, even when parents do treat their children the same way, the children turn out differently. The majority of children of troubled and even abusive parents are resilient and do not suffer lasting psychological damage. Conversely, many children of the kindest and most nurturing parents succumb to drugs, mental illness or gangs.

Third, there is no correlation—zero—between the personality traits of adopted children and their adoptive parents or other children in the home, as there should be if “home environment” had a strong influence.

Fourth, how children are raised—in day care or at home, with one parent or two, with gay parents or straight ones, with an employed mom or one who stays home—has little or no influence on children’s personalities.

Finally, what parents do with and for their children affects children mainly when they are with their parents. For instance, mothers influence their children's play only while the children are playing with them; when the child is playing alone or with a playmate, it makes no difference what games were played with mom.

Most psychologists have done what anyone would do when faced with this astonishing, counterintuitive evidence—they’ve tried to dismiss it. Yet eventually the most unlikely idea wins if it has the evidence to back it up. As Carole Wade, a behavioral scientist, puts it, trying to squeeze existing facts into an outdated theory is like trying to fit a double-sized sheet onto a queen-sized bed. One corner fits, but another pops out. You need a new sheet or a new bed.

“The Nurture Assumption” is a new sheet, one that covers the discrepant facts. I don’t agree with all the author’s claims and interpretations; often she reaches too far to make her case—throwing the parent out with the bath water, as it were. But such criticisms should not detract from her accomplishment, which is to give us a richer, more accurate portrait of how children develop than we’ve had from outdated Freudianism or piecemeal research.

The first problem with the nurture assumption is nature. The findings of behavior genetics show, incontrovertibly, that many personality traits and abilities have a genetic component. No news here; many others have reported this research, notably the psychologist Jerome Kagan in “The Nature of the Child.” But genes explain only about half of the variation in people’s personalities and abilities. What’s the other half?
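
[ed. The "other half" question maps onto the standard behavior-genetics decomposition of trait variance into genes, shared (home) environment, and everything else. A sketch using the review's rough 50/0/50 characterization, not precise estimates.]

```python
# ACE decomposition of personality variance (a sketch; the numbers are
# the review's rough characterization, not precise estimates).
A = 0.50          # additive genetic component: "about half"
C = 0.00          # shared home environment: the near-zero
                  # adoptive-sibling correlation
E = 1.0 - A - C   # nonshared environment: everything outside the home

print(f"Nonshared environment: {E:.0%}")   # 50%
# This residual half is the territory Harris assigns largely to peers.
```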

Harris’s brilliant stroke was to change the discussion from nature (genes) and nurture (parents) to its older version: heredity and environment. “Environment” is broader than nurture. Children, like adults, have two environments: their homes and their world outside the home; their behavior, like ours, changes depending on the situation they are in. Many parents know the eerie experience of having their child’s teacher describe their child in terms they barely recognize (“my kid did what?”). Children who fight with their siblings may be placid with friends. They can be honest at home and deceitful at school, or vice versa. At home children learn how their parents want them to behave and what they can get away with; but, Harris shows, “These patterns of behavior are not like albatrosses that we have to drag along with us wherever we go, all through our lives. We don’t even drag them to nursery school.”

Harris has taken a factor, peers, that everyone acknowledges is important, but instead of treating it as a nuisance in children’s socialization, she makes it a major player. Children are merciless in persecuting a kid who is different—one who says “Warshington” instead of “Washington,” one who has a foreign accent or wears the wrong clothes. (Remember?) Parents have long lamented the apparent cruelty of children and the obsessive conformity of teen-agers, but, Harris argues, they have missed the point: children’s attachment to their peer groups is not irrational, it’s essential. It is evolution’s way of seeing to it that kids bond with each other, fit in and survive. Identification with the peer group, not identification with the parent, is the key to human survival. That is why children have their own traditions, words, rules, games; their culture operates in opposition to adult rules. Their goal is not to become successful adults but successful children. Teen-agers want to excel as teen-agers, which means being unlike adults.

It has been difficult to tease apart the effects of parents and peers, Harris observes, because children’s environments often duplicate parental values, language and customs. (Indeed, many parents see to it that they do.) To see what factors are strongest, therefore, we must look at situations in which these environments clash. For example, when parents value academic achievement and a student’s peers do not, who wins? Typically, peers. Differences between black and white teen-agers in achievement have variously been attributed to genes or single mothers, but differences vanish when researchers control for the peer group: whether its members value achievement and expect to go to college, or regard academic success as a hopeless dream or sellout to “white” values.

Are there exceptions? Of course, and Harris anticipates them. Some children in anti-intellectual peer groups choose the lonely path of nerdy devotion to schoolwork. And some have the resources, from genes or parents, to resist peer pressure. But exceptions should not detract from the rule: that children, like adults, are oriented to their peers. Do you dress, think and behave more like others of your generation, your parents or the current crop of adolescents?

by Carol Tavris, NY Times (1998) | Read more:
Image: Goodreads
[ed. See also: The Nurture Assumption: First Chapter (Judith Rich Harris, NY Times).]

Ryan Shorosky

Jean Francois De Witte
via:

Saturday, April 21, 2018

The End of the Joint As We Know It

Willie Nelson may be a legendary country musician, but he is first and foremost the world’s most famous joint ambassador. Legend has it that he once smoked a joint — what he referred to in his 1988 autobiography as an “Austin Torpedo” — on the roof of the White House with Jimmy Carter’s middle son. Snoop Dogg, another self-appointed sticky-icky spokesman, says that when the two met for an Amsterdam stoner summit in 2008, Nelson showed up with not one but three smoking devices, and promptly puffed him to the floor. (“I had to hit the timeout button,” Snoop later said of smoking with Nelson.) A quick Google search will turn up an entire genre of Nelson portraiture in which the singer is framed by the haze of a freshly lit jay.

All that to say, you might be surprised to hear that Nelson is no longer much of a joint guy. “I use a vaporizer these days,” he told the British magazine Uncut in 2015. “Even though marijuana smoke is not as dangerous as cigarette smoke, any time you put any kind of smoke in your lungs it takes a toll of some kind.” GQ investigated Nelson’s claims later that year, uncovering that, while joints were still very much part of his rotation, a good portion of his pot consumption had shifted to vape pens so as to be more discreet. “And he eats candy or has oil at night for sleeping,” Nelson’s wife, Annie, added.

That the most famous stoner in the world is now exploring more healthful avenues for pot consumption is a sign of the times. According to the cannabis consumer insights firm BDS Analytics, which has logged more than 800 million transactions at dispensaries across Colorado, Washington, Oregon, and California, legal sales of concentrates (vape pen cartridges and dabs), topicals (patches, salves, lotions), and edibles are rapidly outgrowing those of loose-leaf weed product — what cannabis industry types refer to as “flower.” In 2014, the year that Colorado first began selling legal pot, 65 percent of sales revenue came from flower, while only 13 percent came from concentrates. Last year, flower made up only 47 percent of total sales in the state. The new majority of the market is distributed to concentrates at 29 percent and edibles — which barely existed at the dawn of the legal pot movement — at 15 percent. “There’s more choices available to people, and in that respect we’re seeing a lot of evolution in terms of consumption methods,” Linda Gilbert, the managing director of BDS Analytics’ consumer research division, told me. “There is an evolution of looking at marijuana in the consumer’s mind, from being about getting stoned to actually thinking of it as a wellness product.”

And also as a part of everyday life. Where there were once bowls, grinders, and rolling papers, there are now myriad sleek contraptions: dainty plastic oil pens and weed walkie-talkies and smokable iPhone cases. These days, the consumption method of choice may not even be inhalable. Maybe it’s a canister of Auntie Dolores’s vegan, sugar-free pretzels. Or a $6 bottle of Washington state’s Happy Apple cider. Perhaps you go the transdermal route and slather on some $90 Papa & Barkley THC-and-CBD-infused Releaf Balm. No matter the product, the packaging has traded the psychedelic pot leaves of yore for clean lines and Helvetica fonts. (...)

As smoking accessories have modernized in the past 10 years, and as more states have legalized sales, grinding and rolling up bud has gradually become a more obscure ritual. And the era of the hastily rolled marijuana cigarette — crystallized by everyone from Cheech and Chong to Barack Obama — is slowly coming to a close. “If you fast-forward 10 years and look back at the cannabis market, I’ll take a guess that in some ways we’ll think about consuming cannabis flower like we think about consuming a cigar now,” said Alan Gertner, the CEO of Hiku, a Canadian cannabis producer and retailer that aims to make pot consumption more mainstream. “It’s a ritual, it’s a heritage moment, it’s about celebration. But ultimately cannabis flower for any individual is somewhat hard to interact with. The idea that a 20-year-old is going to learn to roll a joint is sort of ludicrous.” (...)

As wellness-mania swept the nation, pot-trepreneurs saw a chance to capitalize on a portion of the estimated $3.7 trillion worldwide wellness market. Over the past few years, cannabis and its nonpsychoactive byproducts have taken the form of medicine: inhalers designed to dole out exact dosages, supplements, patches, and tinctures. Though state laws still prohibit pot-related companies from advertising on any mainstream platform, many of them now see the value of building recognizable, commercially viable brands. The idea is that to encourage more first-time pot consumers, the point of entry must be significantly less complicated than it used to be. That can mean anything from offering a prerolled joint to a pill you take before going to bed. “Right now the market is still dominated by hardcore stoners,” Micah Tapman, a cofounder and managing director at the Colorado investment firm CanopyVentures, said. “If they’re hitting something they want 50 milligrams. Whereas the new consumer that is coming up will be a much lighter-weight consumption. The soccer mom demographic is probably going to gravitate toward the very discreet vaporizers or topicals. They’re not typically going to want a bong sitting on their coffee table.”

In other words, less horticulture, more convenience. Gertner, who previously worked as a head of sales at Google before starting his own coffee, cannabis, and clothing brand, likens the current weed consumption landscape to that of the North American coffee market over the past 30 years. People smoke joints for the same reason they used to drink only plain black coffee: potency. “It was basically like: How quickly can I get caffeine into my system?” Gertner said. As companies like Starbucks introduced new nomenclature around coffee and a reworked guidebook for how to consume it, people began to see the beverage differently. “The coffee experience is now grounded in community, as opposed to grounded in the idea of just straight-up caffeine consumption,” Gertner said. “You went from a world where we optimized for potency to a world where we started to optimize for brand, convenience, and taste. You’re not necessarily drinking a Frappuccino because of caffeine content, you’re drinking a Frappuccino for other reasons.” The cannabis market is on a similar path of mass consumption. The earthy taste, smell, and delivery of weed smoke are being muted and manipulated. Just like drinking Frappuccinos, that means customers are sometimes ingesting extra calories or unsavory fillers in the process. And like most artisanal coffee brands, these professionalized cannabis brands can also charge a premium. The joint will always have a place in weed culture, but advanced technology has made it functionally outdated. “You start to think of this future where you say, I can have a cannabis drink, why would I smoke a joint?” Gertner said.

by Alyssa Bereznak, The Ringer | Read more:
Image: uncredited

Michael Cohen and the End Stage of the Trump Presidency

On May 1, 2003, the day President George W. Bush landed on the U.S.S. Abraham Lincoln in front of the massive “Mission Accomplished” sign, I was in Baghdad performing what had become a daily ritual. I went to a gate on the side of the Republican Palace, in the Green Zone, where an American soldier was receiving, one by one, a long line of Iraqis who came with questions and complaints. I remember a man complaining that his house had been run over by a tank. There was a woman who had been a government employee and wanted to know about her salary. The soldier had a form he was supposed to fill out with each person’s request and that person’s contact information. I stood there as the soldier talked to each person and, each time, said, “Phone number?” And each person would answer some version of “The phone system of Iraq has been destroyed and doesn’t work.” Then the soldier would turn to the next person, write down the question or complaint, and ask, “Phone number?”

I arrived in Baghdad on April 12th of that year, a few days after Saddam’s statue at Firdos Square had been destroyed. There were a couple of weeks of uncertainty as reporters and Iraqis tried to gauge who was in charge of the country and what the general plan was. There was no electricity, no police, no phones, no courts, no schools. More than half of Iraqis worked for the government, and there was no government, no Army, and so no salaries for most of the country. At first, it seemed possible that the Americans simply needed a bit of time to communicate the new rules. By the end of April, though, it was clear: there was no plan, no new order. Iraq was anarchic.

We journalists were able to use generators and satellite dishes to access outside information, and what we saw was absurd. Americans seemed convinced things were going well in Iraq. The war—and the President who launched it—were seen favorably by seventy per cent of Americans. Then came these pictures of a President touting “Mission Accomplished”—the choice of words that President Trump used in a tweet on Saturday, the morning after he ordered an air strike on Syria. On the ground, we were not prophets or political geniuses. We were sentient adults who were able to see the clear, obvious truth in front of us. The path of Iraq would be decided by those who thrived in chaos.

I had a similar feeling in December, 2007. I came late to the financial crisis. I had spent much of 2006 and 2007 naïvely swatting away warnings from my friends and sources who told me of impending disaster. Finally, I decided to take a deep look at collateralized debt obligations, or C.D.O.s, those financial instruments that would soon be known as toxic assets. I read technical books, talked to countless experts, and soon learned that these were, in Warren Buffett’s famous phrase, financial weapons of mass destruction. They were engineered in such a way that they could exponentially increase profits but would also exponentially increase losses. Worse, they were too complex to be fully understood. It was impossible, even with all the information, to figure out what they were worth once they began to fail. Because these C.D.O.s had come to form the core value of most major banks’ assets, no major bank had clear value. With that understanding, the path was clear. Eventually, people would realize that the essential structure of our financial system was about to implode. Yet many political figures and TV pundits were happily touting the end of a crisis. (Larry Kudlow, now Trump’s chief economic adviser, led the charge of ignorance.)

In Iraq and with the financial crisis, it was helpful, as a reporter, to be able to divide the world into those who actually understand what was happening and those who said hopeful nonsense. The path of both crises turned out to be far worse than I had imagined.

I thought of those earlier experiences this week as I began to feel a familiar clarity about what will unfold next in the Trump Presidency. There are lots of details and surprises to come, but the endgame of this Presidency seems as clear now as those of Iraq and the financial crisis did months before they unfolded. Last week, federal investigators raided the offices of Michael Cohen, the man who has been closer than anybody to Trump’s most problematic business and personal relationships. This week, we learned that Cohen has been under criminal investigation for months—his e-mails have been read, presumably his phones have been tapped, and his meetings have been monitored. Trump has long declared a red line: Robert Mueller must not investigate his businesses, and must only look at any possible collusion with Russia. That red line is now crossed and, for Trump, in the most troubling of ways. Even if he were to fire Deputy Attorney General Rod Rosenstein and then have Mueller and his investigation put on ice, and even if—as is disturbingly possible—Congress did nothing, the Cohen prosecution would continue. Even if Trump pardons Cohen, the information the Feds have on him can become the basis for charges against others in the Trump Organization.

This is the week we know, with increasing certainty, that we are entering the last phase of the Trump Presidency. This doesn’t feel like a prophecy; it feels like a simple statement of the apparent truth. I know dozens of reporters and other investigators who have studied Donald Trump and his business and political ties. Some have been skeptical of the idea that President Trump himself knowingly colluded with Russian officials. It seems not at all Trumpian to participate in a complex plan with a long-term, uncertain payoff. Collusion is an imprecise word, but it does seem close to certain that his son Donald, Jr., and several people who worked for him colluded with people close to the Kremlin; it is up to prosecutors and then the courts to figure out if this was illegal or merely deceitful. We may have a hard time finding out what President Trump himself knew and approved.

However, I am unaware of anybody who has taken a serious look at Trump’s business who doesn’t believe that there is a high likelihood of rampant criminality. In Azerbaijan, he did business with a likely money launderer for Iran’s Revolutionary Guard. In the Republic of Georgia, he partnered with a group that was being investigated for a possible role in the largest known bank-fraud and money-laundering case in history. In Indonesia, his development partner is “knee-deep in dirty politics”; there are criminal investigations of his deals in Brazil; the F.B.I. is reportedly looking into his daughter Ivanka’s role in the Trump hotel in Vancouver, for which she worked with a Malaysian family that has admitted to financial fraud. Back home, Donald, Jr., and Ivanka were investigated for financial crimes associated with the Trump hotel in SoHo—an investigation that was halted suspiciously. His Taj Mahal casino received what was then the largest fine in history for money-laundering violations.

Listing all the financial misconduct can be overwhelming and tedious. I have limited myself to some of the deals over the past decade, thus ignoring Trump’s long history of links to New York Mafia figures and other financial irregularities. It has become commonplace to say that enough was known about Trump’s shady business before he was elected; his followers voted for him precisely because they liked that he was someone willing to do whatever it takes to succeed, and they also believe that all rich businesspeople have to do shady things from time to time. In this way of thinking, any new information about his corrupt past has no political salience. Those who hate Trump already think he’s a crook; those who love him don’t care.

I believe this assessment is wrong. Sure, many people have a vague sense of Trump’s shadiness, but once the full details are better known and digested, a fundamentally different narrative about Trump will become commonplace. Remember: we knew a lot about problems in Iraq in May, 2003. Americans saw TV footage of looting and heard reports of U.S. forces struggling to gain control of the entire country. We had plenty of reporting, throughout 2007, about various minor financial problems. Somehow, though, these specific details failed to impress upon most Americans the over-all picture. It took a long time for the nation to accept that these were not minor aberrations but, rather, signs of fundamental crisis. Sadly, things had to get much worse before Americans came to see that our occupation of Iraq was disastrous and, a few years later, that our financial system was in tatters.

The narrative that will become widely understood is that Donald Trump did not sit atop a global empire. He was not an intuitive genius and tough guy who created billions of dollars of wealth through fearlessness. He had a small, sad global operation, mostly run by his two oldest children and Michael Cohen, a lousy lawyer who barely keeps up the pretenses of lawyering and who now faces an avalanche of charges, from taxicab-backed bank fraud to money laundering and campaign-finance violations.

by Adam Davidson, New Yorker | Read more:
Image: Yana Paskova / Getty
[ed. I'm usually loath to post anything about Trump, but in this case making an exception. With our lickspittle Congress (Republicans and Democrats) and generally absent and clueless American electorate, we shall see.]