Culture war games: self-licking ice-cream cones

Vicious Cycles
By Greg Jackson

What is the news? That which is new. But everything is new: a flower blooms; a man hugs his daughter, not for the first time, but for the first time this time . . . That which is important and new. Important in what sense? In being consequential. And this has been measured? What? The relationship between what is covered in the news and what is consequential. Not measured. Why? Its consequence is ensured. Ensured. . . ? It’s in the news. But then who makes it news? Editors. Editors dictate consequence? Not entirely. Not entirely? It matters what people read and watch—you can’t bore them. Then boredom decides? Boredom and a sense of what’s important. But what is important? What’s in the news.

We need it: the Fourth Estate, complement to government, scourge of corruption, orchestrator of public discourse. No one thinks we could get by without a press. No one who understands the work of journalism has anything but admiration for its honest practice.

But this work—to hold power to account, to safeguard the truth, to comfort the afflicted and afflict the comfortable, in Finley Peter Dunne’s immortal words—has entered into a fatal bargain with an effluvium that demeans and yet supports it. Traditional reporting becomes the loss leader. It exchanges its status for a subsidy, and slowly a reluctant embrace of this co-optation—by the very forces a profession that stands in opposition to power should repel—turns into an erotic grapple, because the apotheosis of market logic is the jittery Stockholm syndrome that makes the prisoners of the market insist that it has set them free.

So we find ourselves in a situation in which an entertainment industry of specious value (called “news”) subsidizes a much smaller and less popular subindustry (real news), which lends its prestige to the former and permits it to call itself by the same name. As this entertainment industry subsumes and replaces the news industry, a little game takes place, more or less in public. The game involves pretending—journalist and audience alike—that they have gathered to discuss a truth that exists outside the media, when, except in the rarest cases, they intend to discuss the processes of the media itself: the drama of how information and sentiment evolve and are influenced within a media environment. Like sports fans, news consumers learn the subtleties of the game. They grow “media-savvy,” and media-savvy becomes the hope of an industry. Members of the news business (and practically they alone) call for greater “media literacy”—a solution to a problem they have created that expects the reformation of their audience but not of their industry—because they do not want to choose between responsibility and popularity, or principle and career. They are selling a healthy product, they imply, which people are using the wrong way. But this is confused. No one wants the healthy product. They want its misuse. They want to believe something so stimulating can be healthy, and they rely on media members to help perpetuate this lie.

As civic discourse—the news—becomes increasingly shaped by media-savvy and game-play, as it becomes a metadiscourse not about actual events but about the translation and distortion of actual events within a virtual sphere, the little lie about what the news is and why we follow it permits bigger lies. Charlatans, con men, demagogues, and cheats crawl out of the woodwork and operate with impunity, knowing they need not win on truth or merit, but simply win the news cycle, win within the rules of a confected game. Playing the game well, being stimulating and likable in a media environment, suffices to justify one’s ascendancy within it, because—despite protestations to the contrary—this logic of celebrity explains why anyone is a media figure in the first place, and why we attend them. The sober, responsible news, now in its watchdog guise, enters here—when the mechanism of its own industry has elevated a crook or a scoundrel to a position of power—promising to solve, through exposure, a problem it helped create. But it can’t undo the media mechanism without relinquishing its own power and profitability by copping to the lie on which its prestige rests.

The news may be judged by what it crowds out. "Democracy dies in darkness," the Washington Post motto reads. For billions who live in countries without a free press, this is true. But our problem in the United States is not an absence but a glut. Truth dies in darkness, but it also dies in blinding light. Separating what’s important from what’s trivial is as essential as revealing what’s important. A needle in the haystack isn’t much better than no needle.

The problem of distinguishing the important from the trivial is a problem for all of us—for our educators, our politicians, our leaders, for us as individuals, as citizens, as friends. To lay this problem at the feet of the news industry is unfair. The news is trapped in a business model that makes no sense, that rewards it for its worst behavior and refuses to pay it for what of greatest value it contributes. But the news can be blamed for confusing the issue. We need to know when we are being entertained and when we are having a different experience. Being fed trivialities when we need importance, like empty calories when we need nourishment, makes us sick. We grow to mistake bigness for importance, when importance is a measure of our involvement. Big trivialities make us psychically obese, with nowhere to expend this pent-up energy. “What a story. What a fucking story,” Dean Baquet said, watching Trump’s inauguration.

“This Gravy Train Is Coming to an End”: News Media Begins to Contemplate a Post-Trump White House
By Tom Kludt

Though Baquet rejects the “resistance” label for his newsroom, the Times launched an advertising campaign just weeks after the inauguration that made thinly veiled reference to Trump’s attacks. In the 24 hours after its first ad premiered during the Academy Award ceremony, the Times generated more subscribers than it had in the previous six weeks. At that point the Times was already riding the so-called Trump bump thanks to a wave of new subscriptions that followed the 2016 election. Today the Times has around 6.5 million total subscribers, more than twice as many as at the beginning of Trump’s presidency, putting the paper on track to hit its lofty goal of 10 million by 2025.

“They began to understand that they were viewed as the loyal opposition,” news-industry analyst Ken Doctor said of publications like the Times. “Now there are all kinds of editorial and ethical questions about that—and about how they’ve covered it with their headlining and all that kind of stuff—but from a business point of view, they came to understand that they were viewed by many as the loyal opposition and that that could spur subscriptions.” Doctor pointed to another example of the trend: the Washington Post adopting the slogan “Democracy Dies in Darkness” a month into the Trump presidency. “Without saying the word Trump, that’s saying, ‘You need us to push back the darkness,’” Doctor told me. “That’s potent. It’s playing off the same thing.”

Trump bump: NYT and WaPo digital subscriptions tripled since 2016
By Sara Fischer

The big picture: Sources tell Axios that the Post is nearing 3 million digital subscribers, a 50% year-over-year growth in subscriptions and more than 3x the number of digital-only subscribers it had in 2016. The New York Times now has more than 6 million digital-only subscribers, nearly 3x its number from 2016.

The shame of the press
By W. Joseph Campbell

A defining ethos of American journalism that emerged during the second half of the twentieth century emphasized even-handed treatment of the news and an avoidance of overt partisanship.

Rank-and-file journalists tended to regard politicians of both major parties with a mixture of suspicion and mild contempt. It was a kind of “fie on both houses” attitude. Running interference for a politician was considered more than a little unsavory.

Not so much anymore. Not in American corporate media, where an overt partisanship has become not only acceptable but unmistakable.

With the decline of advertising revenues, the business model has moved toward a digital-subscriber base. As readers pay, they are prone to make clear their preferences, and the news report tilts to reflect their partisan expectations.

Evidence of the tilt was striking enough four years ago, when Liz Spayd, an advocate of even-handedness in reporting, was public editor at the New York Times. She lasted less than a year before the position was dissolved and she was let go.

Spayd, whom I favorably mention in my latest book, Lost in a Gallup: Polling Failure in U.S. Presidential Elections, hadn’t been on the job a month when she wrote this about the Times in July 2016:

“Imagine what would be missed by journalists who felt no pressing need to see the world through others’ eyes. Imagine the stories they might miss, like the groundswell of isolation that propelled a candidate like Donald Trump to his party’s nomination. Imagine a country where the greatest, most powerful newsroom in the free world was viewed not as a voice that speaks to all but as one that has taken sides.

“Or has that already happened?”

Don’t Get Too Nostalgic About Old Media
By Justin Fox

In the mid-1950s, William Greider worked for a couple of summers as a reporter at the Cincinnati Post, an afternoon newspaper that was the flagship of the scrappy Scripps-Howard chain.

Greider came from an affluent, Republican Cincinnati suburb, and was spending the non-summer months as a student at Princeton University, from which he graduated in 1958. The other reporters at the Post didn’t have backgrounds like that. They had generally grown up in the city proper or in grittier surrounding areas, and started out at the paper as copy boys right after high school. They were pro-union, pro-police and anti-politician. None was black, few were women, and the paper they produced was relentlessly parochial. But it did, Greider recalled decades later, “cast itself as a representative voice” of “the people who were least likely to be heard on political issues.”

By the 1970s, Greider was working at another Post, the one in Washington. Virtually all the reporters there had college degrees, many from fancy places like Princeton. The newspaper they produced was sophisticated and at times brilliant, and some of its journalists were disrespectful enough of authority to help topple a president. But by burnishing the paper’s reputation and authority, Watergate may actually have accelerated its transition to pillar of the Washington establishment, a representative voice not so much of the people as of those who governed them.

For a paper based in the nation’s capital that approach made a certain amount of sense — the Post was arguably serving its readers. But as most of the rest of the news media, especially at the national level, followed a similar path of credentialization, professionalization and cozying up to power, it left a lot of its audience behind, Greider argued:

I have always thought [this] is a central element feeding the collective public resentment that surrounds the news media. People sense the difference, even if they cannot identify it. Conservative critics usually call it a “liberal bias” in the press, but I think it may be more accurately understood as social distance.

Lots of Americans were hungry for something other than what the increasingly elitist mainstream media was offering, and radio talkers such as Rush Limbaugh were able to deliver it on a cost-effective basis.

Limbaugh and a few others of course soon became so successful that they formed a new elite; the subtitle of Rosenwald’s book is “How an Industry Took Over a Political Party That Took Over the United States.” And the overall evolution of the news media since the early 1990s — with the rise of not just talk radio but also cable TV news, Facebook, Twitter, YouTube and other new-media phenomena, along with the precipitous decline of regional and local newspapers — is I think the single best explanation for why U.S. politics have become so fractured and dysfunctional. Most Americans’ policy views don’t divide neatly along partisan lines, the cultural distance between different groups in the U.S. has grown modestly if at all, and it appears that those who follow the news most closely have the most distorted picture of what followers of the other political party believe. The changes in how we communicate have been much bigger than those in how we think and act.

Before one starts waxing nostalgic, though, it’s useful to return to Greider’s clear-eyed view of the state of affairs in 1992. Among other things, he described how what had once been a great variety of daily newspapers aiming to serve very different groups of readers had consolidated into a smaller number of publications “with an angle of vision that presumes an idyllic class-free community — a city where everyone has more or less the same view on things.”

Popular Journalism’s Day in ‘The Sun’
By Batya Ungar-Sargon

… in 1937, when the social scientist and writer Leo C. Rosten conducted a survey of what was the journalistic elite of his time, he found that of 127 Washington-based newspaper correspondents, fewer than half had finished college. Three in ten had attended college for a year or two but hadn’t finished. Eight of the reporters hadn’t finished high school, and two of them had no high school education at all. A more extensive study from earlier in the decade found that just 40 percent of journalists were college grads, and nearly 10 percent had not gone to high school. Two thirds came from working-class families.

Over the course of the twentieth century, however, journalism became an increasingly elite profession. If, in the 1930s, just three in ten journalists had finished college, by 1960, it was two thirds. By 1971, just over half of US journalists had graduated from college. By 1983, that number had jumped to 75 percent. By 1992, it was 82 percent of all journalists; in 2002, it was 89 percent. And by 2015, the figure had reached 92 percent. Today, just 8 percent of all journalists have not been to college. As for the next generation, a recent survey revealed that The New York Times, The Washington Post, and The Wall Street Journal were recruiting their summer interns from the top 1 percent of universities in the country. Meanwhile, the number of Americans with a bachelor’s degree remains at just about a third of the population; 46 percent of American adults have never attended a single college class.

The people tasked with writing the first draft of history every day for the US news media have thus become less and less typical of most Americans. And they’re not just more educated than the average American; they are also more secular than the average American, make more money, and—with the decimation of local news in the Internet era—are more coastal than other Americans. And it’s no secret that they are more liberal: when researchers from Arizona State University and Texas A&M University surveyed business journalists from Reuters, the Associated Press, The Wall Street Journal, Financial Times, Bloomberg News, Forbes, The New York Times, and The Washington Post in 2018, they found that just 4 percent of respondents had conservative political views.

All of this has affected how journalists cover their fellow Americans, and how, in return, those Americans view journalists.

College-Educated Professionals Are Capitalism’s Useful Idiots
By Kurt Andersen

What happened with organized labor in journalism during the 1970s is an excellent illustration of those early days of the deepening fracture between upper-middle-class and lower-middle-class (white) Americans. It encompasses both the cultural split (yuppies versus yahoos) and the introduction of transformative technology in the workplace.

From the publication of the Pentagon Papers in 1971 through the end of Watergate in 1974, The Washington Post became a celebrated national institution, sexy liberalism incarnate. Following immediately on those two heroic achievements was another milestone episode, not very celebrated or heroic but likewise emblematic of the moment. In the spring of 1974, the journalists of the Post went out on strike—bumblingly. They didn’t even ask the paper’s blue-collar unions to join them, they refused their own Newspaper Guild leaders’ request to walk a picket line, the paper continued publishing, and after two weeks they gave up and accepted management’s offer.

It was a generation before websites and browsers, universal PCs and cell phones, 30 years before print dailies entered their death spiral, but technology was already changing newspapers in a big way, in the manufacturing part of the operation. Owners were eliminating typographers, who operated obsolete, elephantine Brazil-meets–Willy Wonka linotype machines that turned molten lead into blocks of type, and they also wanted to pay fewer people to operate the printing presses. A large majority of the Post’s 2,000 employees were those blue-collar workers, a large majority of whom were suddenly redundant. In 1975 the 200 pressmen wouldn’t come to terms and went on strike, and the other blue-collar unions at the Post went on strike in solidarity, as unions are supposed to do.

Absolutely key to how it played out was the behavior of the Post’s journalists. Just as the recent exposure of the secret Pentagon report on Vietnam and Nixon’s crimes had been game-changing work by journalists with the essential support of management, the crushing of the strike and pressmen’s union, also a game changer, was the work of management with the essential support of journalists.

Two-thirds of the Post’s unionized editorial employees didn’t stop working at all, and a majority voted again and again against striking in solidarity with the pressmen. “What I find ominous is that a number of Guild people don’t think they have common cause with craftsmen,” a Post journalist told a reporter at the time. “They feel professionally superior to guys with dirt under their fingernails.” At a guild meeting, a Post reporter referred to the striking pressmen as “slack-jawed cretins.” Four weeks into the five-month strike, a Times article reported that “if a Post Guild member is asked why he or she is not supporting the strike,” many “say they do not see themselves as ordinary working people. One said, ‘We go to the same parties as management. We know Kissinger, too.’” And while probably none of the pressmen knew the secretary of state, their average pay was the equivalent of $111,000, about as much as reporters, which is the excuse one of the paper’s reporters gave for crossing the picket line from day one. “If they got slave wages, I’d be out on the line myself,” said the 32-year-old Bob Woodward, co-author of the second-best-selling nonfiction book of the previous year.

The strike ended just before the release of the film adaptation of All the President’s Men, a fictionalization that only intensified the love of American liberals for The Washington Post, even though the Post pressroom was about to become nonunion. As a Post columnist wrote back then in The New Republic, “The pressmen’s strike was crushed with methods and with a severity that the press in general or the Post in particular would not be likely to regard as acceptable from the owners of steel mills. Yet because it was a newspaper management that broke the strike, no other newspaper has touched it properly, or even whimpered a protest.”

When I arrived at Time as a writer five years later, I went out of my way to produce copy the modern way—abandoning my office Selectric to use one of the special computer terminals crammed into a special little room, holed up with a few of the other young writers. That technology presently enabled the company to eliminate the jobs of the people downstairs who were employed to retype our stories. At the time I probably shrugged, like the newspaper reporters who hadn’t cared much about the redundant linotype operators and pressmen.

I think that if I had been one of those unionized craft workers who were abandoned by my unionized journalist colleagues 45 years ago, I would have watched journalists getting washed away and drowned by the latest wave of technology-induced creative destruction over the past 15 years with some schadenfreude.

What happened at newspapers (and magazines) back then also had disproportionate impact on this history of the right’s hijacking of America’s political economy, because once journalists were actively ambivalent about organized labor, that disenchantment spread more contagiously than if it had just been random young professionals bad-mouthing unions. News stories about labor now tended to be framed this way rather than that way or were not covered at all. Thus like most Democratic politicians at the same time, media people became enablers of the national change in perspective from left to right concerning economics.

During the 1930s and ’40s and ’50s, the right had derided liberal writers and editors as Communists’ “useful idiots,” unwittingly doing the Communists’ propaganda work; it looks in retrospect as if, starting in the 1970s, a lot of them—of us—became capitalists’ useful idiots. A huge new cohort of college-educated liberal professionals got co-opted.

The American media elite has learned nothing from 2016. It will only get worse
By Jessa Crispin

Yes, the media made a few cosmetic changes to prove they understood the diversity of thought across this nation. The New York Times, for example, decided it lacked the conservative voices that could explain the populist rightward tilt the electorate took, so the paper hired man-of-the-people Bret Stephens – son of a corporate scion, graduate of the University of Chicago and the London School of Economics – as a columnist. You know, someone who could really give some insight into the kind of opinions circulating at the Beloit, Kansas, diner at harvest time. Other publications followed suit, fawning over anyone who might have accidentally found themselves briefly living in a conservative state before their inevitable elevation to the elite university system, like the Yale graduate/Appalachia expert/venture capitalist JD Vance, despite the multitude of critics who say his theories about the white working class are naive at best.

Journalism has become a well-gate-kept little bubble, if bubbles were created out of ignorance and contempt for what lay outside of them instead of just soap. And unlike soap bubbles, so easily pricked and burst, the walls of the ideological bubbles of our professional class are nearly impenetrable.

As local newspapers disappear due to the financial meddling of Facebook and venture capitalists, and as journalism becomes a career that requires advanced degrees rather than apprenticeships, it is harder and harder for anyone who does not come from an upper-middle-class background or elite education to find work and a voice in our media institutions. What gives you access to these realms is not a unique insight or an empathetic perspective or access to overlooked populations, but instead credentials only attainable by wealth or privilege. Vance went to Yale, so whatever he says about Appalachia must be right, because his editor also probably went to Yale and so did his editor’s boss and so on.

The biases of the professional classes replicate themselves, and we find figures as horrified and baffled by the progressive left as they are by the reactionary right. Often, as with Chuck Todd and Chris Matthews, they can’t even distinguish between the two groups. Anyone working as a collective must be essentially the same, even if one group is chanting white nationalist slogans and the other is asking for racial and economic justice. But who can tell the difference, watching the demonstration, as they do, through the glass of their building on Eighth Avenue?

Swat Team
By Thomas Frank

This stuff is not mysterious. We can easily identify the political orientation behind it from one of the very first pages of the Roger Tory Peterson Field Guide to the Ideologies. This is common Seaboard Centrism, its markings of complacency and smugness as distinctive as ever, its habitat the familiar Beltway precincts of comfort and exclusivity. Whether you encounter it during a recession or a bull market, its call is the same: it reassures us that the experts who head up our system of government have everything well under control.

It is, of course, an ideology of the professional class, of sound-minded East Coast strivers, fresh out of Princeton or Harvard, eagerly quoting as “authorities” their peers in the other professions, whether economists at MIT or analysts at Credit Suisse or political scientists at Brookings. Above all, this is an insider’s ideology; a way of thinking that comes from a place of economic security and takes a view of the common people that is distinctly patrician.

Now, here’s the mystery. As a group, journalists aren’t economically secure. The boom years of journalistic professionalization are long over. Newspapers are museum pieces every bit as much as Bernie Sanders’s New Deal policies. The newsroom layoffs never end: in 2014 alone, 3,800 full-time editorial personnel got the axe, and the bloodletting continues, with Gannett announcing in September a plan to cut more than 200 staffers from its New Jersey papers. Book-review editors are so rare a specimen that they may disappear completely, unless somebody starts breeding them in captivity. The same thing goes for the journalists who once covered police departments and city government. At some papers, opinion columnists are expected to have day jobs elsewhere, and copy editors have largely gone the way of the great auk.

In other words, no group knows the story of the dying middle class more intimately than journalists. So why do the people at the very top of this profession identify themselves with the smug, the satisfied, the powerful? Why would a person working in a moribund industry compose a paean to the Wall Street bailouts? Why would someone like Post opinion writer Stephen Stromberg drop megatons of angry repudiation on a certain Vermont senator for his “outrageous negativity about the state of the country”? For the country’s journalists—Stromberg’s colleagues, technically speaking—that state is pretty goddamned negative.

Maybe it’s something about journalism itself. This is a field, after all, that has embraced the forces that are killing it to an almost pathological degree. No institution has a greater appetite for trendy internet thinkers than journalism schools. We are all desperately convincing ourselves that we need to become entrepreneurs, or to get ourselves attuned to the digital future—the future, that is, as it is described for us hardheaded journalists by a cast of transparent bullshit artists. When the TV comedian John Oliver recently did a riff on the tragic decline of newspaper journalism, just about the only group in America that didn’t like it was—that’s right, the Newspaper Association of America, which didn’t think we should be nostalgic about the days when its members were successful. Truly, we are like buffalo nuzzling the rifles of our hunters.

Or maybe the answer is that people at the top of the journalism hierarchy don’t really identify with their plummeting peers. Maybe the pundit corps thinks it will never suffer the same fate as, say, the Tampa Tribune. And maybe they’re right. As I wrote this story, I kept thinking back to Sound and Fury, a book that Eric Alterman published in 1992, when the power of pundits was something new and slightly alarming. Alterman suggested that the rise of the commentariat was dangerous, since it supplanted the judgment of millions with the clubby perspective of a handful of bogus experts. When he wrote that, of course, newspapers were doing great. Today they are dying, and as they gutter out, one might expect the power of this phony aristocracy to diminish as well. Instead, the opposite has happened: as serious journalism dies, Beltway punditry goes from strength to strength.

It was during that era, too, that the old-school Post columnist David Broder gave a speech deploring the rise of journalistic insiders, who were too chummy with the politicians they were supposed to be covering. This was, he suggested, not only professionally questionable. It also bespoke a fundamental misunderstanding of the journalist’s role as gadfly and societal superego:

I can’t for the life of me fathom why any journalists would want to become insiders, when it’s so damn much fun to be outsiders—irreverent, inquisitive, impudent, incorrigibly independent outsiders—thumbing our nose at authority and going our own way.

Yes, it’s fun to be an outsider, but it’s not particularly remunerative. As the rising waters inundate the Fourth Estate, it is increasingly obvious that becoming an insider is the only way to hoist yourself above the deluge. Maybe that is one reason why the Washington Post attracted the fancy of megabillionaire Jeff Bezos, and why the Post seems to be thriving, with a fancy new office building on K Street and a swelling cohort of young bloggers ravening to be the next George Will, the next Sid Blumenthal. It remains, however precariously, the cradle of the punditocracy.

Meanwhile, between journalism’s insiders and outsiders—between the ones who are rising and the ones who are sinking—there is no solidarity at all. Here in the capital city, every pundit and every would-be pundit identifies upward, always upward. We cling to our credentials and our professional-class fantasies, hobnobbing with senators and governors, trading witticisms with friendly Cabinet officials, helping ourselves to the champagne and lobster. Everyone wants to know our opinion, we like to believe, or to celebrate our birthday, or to find out where we went for cocktails after work last night.

Until the day, that is, when you wake up and learn that the tycoon behind your media concern has changed his mind and everyone is laid off and that it was never really about you in the first place. Gone, the private office or award-winning column or cable-news show. The checks start bouncing. The booker at MSNBC stops calling. And suddenly you find that you are a middle-aged maker of paragraphs—of useless things—dumped out into a billionaire’s world that has no need for you, and doesn’t really give a damn about your degree in comparative literature from Brown. You start to think a little differently about universal health care and tuition-free college and Wall Street bailouts. But of course it is too late now. Too late for all of us.

Jon Stewart Is Back to Weigh In
By David Marchese

We used to have news and we had entertainment. Now those categories are totally intertwined — to the extent that it’s not far-fetched to say that we just have varieties of entertainment. And similarly, people are looking at entertainers, rather than politicians, as political authorities. I don’t think it’s too far off base to suggest that, unintentionally or not, ‘‘The Daily Show’’ played a part in that transformation. What do you think about those changes and what they’ve wrought?

I think you have to look at what incentivized the system. The news didn’t become ‘‘The Daily Show,’’ because at its core, ‘‘The Daily Show’’ was a critique of the news and a critique of those systems. If they’d taken in what we were saying, they wouldn’t be doing what they’re doing now: creating urgency through conflict. Conflict has become the catalyst for the economic model. The entire system functions that way now. We are two sides — in a country of 350 million people.

That reminds me of the old George Carlin joke about how in America you have 23 kinds of bagels to choose from but only two political parties. Politically in this country, you have Coke or Pepsi. Every now and again, Dr Pepper comes along and everybody is like, ‘‘You ruined this for everyone else.’’ Dr Pepper is Ralph Nader, let’s say. But getting back to your question — it plays into that scenario of looking for the scapegoat. ‘‘Well, it’s ‘The Daily Show.’ They popularized news-as-entertainment.’’ It’s the New York Times trend-piece thing of somebody getting hold of an idea and amplifying it even though it really has no breadth or depth to it.

What do you think of the news media’s handle on this political moment more generally? I don’t think it has ever had a good handle on a political moment. It’s not designed for that. It’s designed for engagement. It’s like YouTube and Facebook: an information-laundering perpetual-radicalization machine. It’s like porn. I don’t mean that to be flip. When you were pubescent, the mere hint of a bra strap could send you into ecstasy. I’m 57 now. If it’s not two nuns and a mule, I can’t even watch it. Do you understand my point? The algorithm is not designed for thoughtful engagement and clarity. It’s designed to make you look at it longer.

Confessions of a Tabloid ‘Extremist’
By Byline Investigates

Few know more about Britain’s dysfunctional tabloid culture than Graham Johnson. A former star investigative reporter at the Sunday Mirror and Rupert Murdoch’s infamous (and now defunct) News of the World, Mr. Johnson spent years ‘turning over’ gangsters, the occasional celebrity, and fellow reporters from London to LA. He is also the producer of Vice’s hugely successful and well-regarded documentary The Debt Collector, and a convict, having recently pled guilty to phone hacking during his NoTW days.

DT: Where did this culture of doing absolutely anything to get a story come from?

GJ: I think it started in the 1980s, with the introduction of intense competition between tabloids. The Sun came in and took the mantle off The Mirror, and there was The Star too.

[But when I was at] The News of the World, you didn’t even bother about other papers. You think you’re an elite, you buy into this brainwashing that you’re like a death squad, like an SAS of tabloids. They tell you that you’re the best – don’t even worry about the Sunday Mirror, they sell two million less than you. So you just compete against yourselves. The News department competes against Features, and the managers would set up this false competition. It’s intense pressure from managers.

There was a culture of fabrication. And even some of their [the News of the World’s] best reporters fabricated stories, quotes, and parts of stories.

DT: And were the bosses smart in not specifically saying ‘we want you to do [something unethical]…’

GJ: Yeah, it’s euphemisms. The best one is ‘you’ve gotta make this work’. ‘No, listen to me – I don’t give a fuck about X,Y, and Z, and she [the source] isn’t saying anything, you’ve gotta make this work.’ I know what he means. It’s got to be made to work. ‘Can’t we engineer this?’ And you get to understand what needs to happen.

DT: What was the worst thing you saw as a tabloid reporter?

GJ: That’s like asking a member of a Colombian death squad what the worst massacre they were involved in was! You can’t remember all the bad things you’ve done, never mind what anyone else has done! There were some terrible things, though. I remember hearing about when someone committed suicide, and a cheer went up [in the newsroom], because they’d driven him to it [by accusing the man of being a paedophile].

There were some terrible things going on. And you’re psychologically equipped to do it…

DT: And when everyone around you is doing it as well, it's hard to be the one…

GJ: … who stops it, yeah.

You do go along with it. At the end of the day, you’re a functionary. I do what needs to be done, and go back to my family and say I’m a success. It’s all about money, and status, and status anxiety.

DT: Talking to you now, it is clear you think this culture is awful. So what was going through your mind when you were part of it?

GJ: It’s ambition. You’re driven by an extremist level of ambition. You’re brainwashed. And you want to keep your job. You’ve got that constant fear inside of you.

Everybody Sucks
By Vanessa Grigoriadis

It’s long been known to magazine journalists that there’s an audience out there that’s hungry to see the grasping and vainglorious and undeservedly successful (“douchebags” or “asshats,” in Gawker parlance) put in the tumbrel and taken to their doom. It’s not necessarily a pleasant job, but someone’s got to do it. Young writers have always had the option of making their name by meting out character assassinations—I have been guilty of taking this path myself—but Gawker’s ad hominem attacks and piss-on-a-baby humor far outstrip even Spy magazine’s. It’s an inevitable consequence of living in today’s New York: Youthful anxiety and generational angst about having been completely cheated out of ownership of Manhattan, and only sporadically gaining it in Brooklyn and Queens, has fostered a bloodlust for the heads of the douchebags who stole the city. It’s that old story of haves and have-nots, rewritten once again.

Journalists are both haves and have-nots. They’re at the feast, but know they don’t really belong—they’re fighting for table scraps, essentially—and it could all fall apart at any moment. Success is not solid. That’s part of the weird fascination with Gawker, part of why it still works, five years on—it’s about the anxiety and class rage of New York’s creative underclass. Gawker’s social policing and snipe-trading sideshow has been impossible to resist as a kind of moral drama about who deserves success and who doesn’t. It supplies a Manhattan version of social justice. In the past couple of years, Gawker has expanded its mission to include celebrity gossip, sacrificing some of its insider voice in the process, but on a most basic level, it remains a blog about being a writer in New York, with all the competition, envy, and self-hate that goes along with the insecurity of that position.

Most bloggers in Denton’s network work under the most severe deadlines imaginable, with many contracted to write twelve posts per day. At the same time, they are unbelievably fulfilled: Bloggers get to experience the fantastic feeling of looking at everything in the world and then having everyone look at them through their blog, of being both subject and object, voyeur and voyeurant. To get more of that feeling, some bloggers—if we were a blog, we’d tell you who—are in the bathroom snorting cocaine, or Adderall, the ADHD drug popular among college kids on finals week, the constant use of which is one of the only ways a blogger can write that much (“We’re a drug ring, not a bunch of bloggers,” one Gawker Media employee tells me cheerily). Pinched nerves, carpal tunnel, swollen feet—it’s all part of the dastardly job, which at the top level can involve editing one post every fifteen minutes for nine hours a day, scanning 500 Websites via RSS for news every half-hour, and on “off-hours” keeping up with the news to prepare for tomorrow.

Until recently, most Gawker bloggers were paid a flat rate of $12 per post for twelve posts a day, with quarterly bonuses adding to the bottom line; these bonuses could be used to buy equity in the company, which took two years to vest. Now, Denton is moving to a pay-for-performance system. He has always tracked the page views of each individual Gawker Media writer, thinking of them like stocks in a portfolio, with whoever generates the most page views as his favorite. If each writer was only as valuable as the page views he drew, then why shouldn’t Denton pay him accordingly?

Balk, the site’s primary troublemaker, quickly posted an item on Gawker about this change with the slug “Like Rain on Your Wedding Day, Except for Instead of Rain It’s Knives.” Denton wasn’t amused. “Your item makes the argument for performance pay even stronger,” he responded in the post’s comments. “This awesomely self-indulgent post—of interest to you, me, and you, and me—will struggle to get 1,000 views. Which, under the new and improved pay system, Balk, will not even buy you a minute on your bourbon drip.” (Balk gave notice two weeks later.)

Elijah Pollack Is Going To Be A Horror
By Joshua Stein

A good rule of thumb, I think, is that the level of adult hatred towards a minor should be commensurate not with his biological age but with the age of his precocity. So it is both a compliment and just to describe Elijah Pollack as big, big trouble in the making.

Christina Hendricks Denies Her Bountiful Bare Breasts Are Internet Stars
By Luke Malone

You know you’ve made it in Hollywood when someone hacks your phone or releases alleged nude photos online, so Christina Hendricks can sleep soundly tonight knowing that someone cared enough to help her reach both of those milestones. With a series of sexy times photos [NSFW] leaking over the weekend, the actress’s rep confirms she got hacked but adds that Christina prefers to leave things to the imagination and the topless shot isn’t actually her. “Christina’s phone was in fact hacked and photos were stolen. The proper authorities have been contacted in hopes of rectifying this situation,” said her rep. “The topless image is fake and not an image of Christina.”

We’re Offering $10,000 for Unretouched Images of Lena Dunham in Vogue
By Jessica Coen

Lena Dunham is a woman who trumpets body positivity, who’s unabashedly feminist, who has said that her naked body is “a realistic expression of what it’s like to be alive” and “if you are not into me, that’s your problem.” Her body is real. She is real. And for as lovely as the Vogue pictures are, they’re probably not terribly real. So Jezebel is offering $10,000 for pre-Photoshop images from Lena’s Vogue shoot.

Search And Destroy
By Ben McGrath

Denton used to tell people who asked what he did for a living that he was a pornographer. This was true, in a limited way: he publishes Fleshbot, a blog that boasts of its devotion to “Pure Filth,” and features a great many explicit anatomical images. But Fleshbot, which receives about a million unique domestic visitors each month, is now the worst-performing of the nine titles that Denton puts out, and you won’t find any mention of it on the mastheads of the other eight; it’s a drag on the reputable kind of advertising that Denton now covets. Denton’s best-performing site, Gizmodo, reaches nearly six million Americans a month. It’s a punchy consumer guide to gadgets: cell phones, camcorders, turntables. Denton is a kind of gadget fetishist, but you’d be unlikely to hear him telling a stranger that he is a technology watchdog or a trade publisher. That would be boring, and insufficiently mysterious. Also, it could be interpreted as an attempt at a whitewash, which is something that Denton scorns in others with the ferocity of Mencken and Winchell. On his Twitter feed, Denton identifies himself as a “gossip merchant.”

Like all gossip merchants, Denton fancies himself a truth-teller who relishes flouting the conventions of good taste and privilege. He grew up in London, where the Fleet Street tabloid culture is cutthroat, and he shares the Murdochian view of American journalism as effete, earnest, and uncompetitive. “The staples of old yellow journalism are the staples of the new yellow journalism: sex; crime; and, even better, sex crime,” he wrote in a memo to his staff.

Denton arrived in Manhattan with a list of important people he planned to meet, and a personal mission to unsettle many of them. That summer, he abandoned the science-fiction novel he’d been working on and started what became Gawker Media. He saw in the traditional blog format—links with commentary, presented in reverse chronological order—the potential for a leaner, more accountable publishing model aimed at niche audiences, or verticals, that could be bundled together when selling advertising. Just how lean? He paid Elizabeth Spiers, the original Gawker writer, two thousand dollars a month, on the assumption that posting twelve short items a day, mostly in response to things she’d read in the Times or gleaned from fashion-magazine sources, was a part-time commitment. When Spiers complained, after several months, that the gig was taking over her life, he told her to relax on weekends and pro-rated her pay downward. Later, as the brand grew more established, and as the number of writers in his stable increased, he settled on a new payment scheme: twelve dollars a post, with a pool of bonus money paid out according to the number of page views generated.

Paying bonuses for traffic meant not only keeping statistics about what readers did and didn’t like but sharing that information with writers—a supreme journalistic taboo, as it could easily lead to pandering. Pandering was precisely Denton’s aim, and he took it one step further when he started publishing his traffic data alongside the stories themselves. It almost felt like a sociological experiment designed to prove the obvious: that readers are herd animals, that heat begets heat. A photograph of an unidentifiable mammalian carcass on a beach, cleverly dubbed the Montauk Monster, is viewed two million times: go figure. “I think people are sort of waking up to it now, how probably the biggest change in Internet media isn’t the immediacy of it, or the low costs, but the measurability,” Denton told me. “Which is actually terrifying if you’re a traditional journalist, and used to pushing what people ought to like, or what you think they ought to like.”

Gawker’s implicit mission seemed to be to destroy the established media, both by cannibalizing its content and by obliterating the reputation of everyone who produced it, without any apparent conviction about what ought to follow.

“I think of us as being a little like the friendly barbarians,” Denton said. “You know, like, when the Roman Empire fell, there were the tribes that had come out of Mongolia, and each one that came was fleeing some other yet more barbarian group of barbarians. We’re the barbarians who can actually—probably—be hired to defend your gates.”

Denton’s greatest publishing feat, objectively speaking, occurred about six months ago. “It was the ultimate story,” he told me. “There is no comparison. ‘Obama Caught on Camera with Tranny,’ maybe. Or ‘Global Nuclear War.’ ” The story, which appeared on Gizmodo, was about a guy who lost his cell phone in a bar. The phone in question was a prototype of the iPhone 4G, which had not yet been released, and the guy was a software engineer at Apple who was out celebrating his twenty-seventh birthday. Another bar patron found the phone, and, instead of returning it to Apple, attempted to recover his beer money by selling it to the media. Denton, ever eager to scandalize the J-school puritans by indulging in checkbook journalism, offered five grand—and was rewarded with roughly twenty million page views. (His rule on “bounties,” as he calls them, is that you should be willing to pay ten dollars for every thousand new visitors you hope to attract.) Thirteen million of these came from the initial post, “This Is Apple’s Next iPhone,” which was straightforward gadget porn, featuring photographs of the device from every possible angle. A few million more views were captured when Gizmodo posted a gloating play-by-play account of the transaction, in the process outing the unlucky birthday boy (“Those beers may have turned out to be the bitterest of his life”).

“It was like snatching defeat from the jaws of victory,” Joel Johnson told me. “In the morning, we’re outlaw journalists, taking on Apple, and by the afternoon we were the assholes who made fun of a helpless engineer. It does really typify what it’s like working for Nick. You’re always going to push it a little too far.”

Peter Thiel is totally gay, people
By Owen Thomas

VCs fund so few of the companies they talk to that it’s hard to prove a case of discrimination; there are a hundred reasons why they might pass on any given startup. But gay and lesbian entrepreneurs I’ve spoken to agree it’s real. PlanetOut, the gay and lesbian portal, had to buy out Sequoia Capital, which had come to regret its investment in the company, before it found braver VCs and eventually went public. And really: How many out gay VCs do you know?

I think it explains a lot about Thiel: His disdain for convention, his quest to overturn established rules. Like the immigrant Jews who created Hollywood a century ago, a gay investor has no way to fit into the old establishment. That frees him or her to build a different, hopefully better system for identifying and rewarding talented individuals, and unleashing their work on the world.

That’s why I think it’s important to say this: Peter Thiel, the smartest VC in the world, is gay. More power to him.

Gawker Is Removing Story About Condé Nast CFO
By J.K. Trotter

“The point of this story was not in my view sufficient to offset the embarrassment to the subject and his family,” Denton wrote in a lengthy statement issued on Friday afternoon. “Accordingly, I have had the post taken down. It is the first time we have removed a significant news story for any reason other than factual error or legal settlement.”

Tommy Craggs and Max Read Are Resigning from Gawker
By J.K. Trotter

Tommy Craggs, the executive editor of Gawker Media, and Max Read, the editor-in-chief of Gawker.com, are resigning from the company. In letters sent today, Craggs and Read informed staff members that the managing partnership’s vote to remove a controversial post about the CFO of Condé Nast—an unprecedented act endorsed by zero editorial employees—represented an indefensible breach of the notoriously strong firewall between Gawker’s business interests and the independence of its editorial staff. Under those conditions, Craggs and Read wrote, they could not possibly guarantee Gawker’s editorial integrity.

Was the Deleted Gawker Post Any Worse Than Its Old Stuff?
By Jeremy Stahl

Gawker’s decision last week to publish and then withdraw a story about a male media executive soliciting a male prostitute has sent the company into turmoil, with two of its top editors resigning in protest on Monday. The whole kerfuffle raises significant questions about the ethics of outing, the importance of editorial independence, and what constitutes news.

One other simple question, though, can help clarify everything that has happened in the past few days: Was the post in question consistent with what Gawker had published in the past and with its historic editorial directive? In defending the story’s publication, Gawker Media Executive Editor for Investigations John Cook said on Twitter that the answer was clearly yes: “[The] post was solidly in line with what Gawker has asked its writers and editors to do for years.” That is still a questionable enough proposition, though, that Gawker Media founder and CEO Nick Denton initially opposed the story’s publication—along with several staff members—and ultimately took the dramatic step of actually having the story removed.

So, what is it the site is supposed to be doing, and was this just a case of Gawker being Gawker? Last month Denton described the site’s prime directive like so: “I have a simple editorial litmus test, which is: is it true, and is it interesting?”

I Had a One-Night Stand With Christine O’Donnell
By Remy Stern

I won’t get into the nitty gritty details of what happened between the sheets that evening. But I will say that it wasn’t half as exciting as I’d been hoping it would be. Christine was a decent kisser, but as soon as her clothes came off and she was naked in my bed, Christine informed me that she was a virgin.

“You’ve got to be kidding,” I said. She didn’t explain at the time that she was a “born-again virgin.” She made it seem like she’d never had sex in her life, which seemed pretty improbable for a woman her age. And she made it clear that she was planning on staying a virgin that night. But there were signs that she wasn’t very experienced sexually. When her underwear came off, I immediately noticed that the waxing trend had completely passed her by.

Gawker Honcho: “Writers are Successful to the Extent That They Can Sublimate Their Egotism”
By Choire Sicha

This Gawker scoop is an example of brilliant packaging. The composite image that shows up on the front is good; the pull quotes; etc.

But, best of all: the story was written in the first person. The journalist is a ghost-writer. The account is much more compelling as a result. As is the headline.

And this points to a general rule on the web. Writers are successful to the extent that they can sublimate their egotism and get out of the way of the story.

Justine Sacco Is Good at Her Job, and How I Came To Peace With Her
By Sam Biddle

One year ago today, Justine Sacco was the global head of communications for the digital media conglomerate IAC. Getting on a plane for a trip to South Africa, to visit family, she published a tweet: “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m White!”

At the time, I was editing Valleywag, Gawker’s tech-industry blog. As soon as I saw the tweet, I posted it. I barely needed to write anything to go with it: This woman’s job was carefully managing the words of a large tech-media conglomerate, and she’d worded something terribly.

It was a natural post. Twitter disasters are the quickest source of outrage, and outrage is traffic. I didn’t think about whether or not I might be ruining Sacco’s life. The tweet was a bad tweet, and seeing it would make people feel good and angry—a simple social and emotional transaction that had happened before and would happen again and again. The minimal post set off a 48-hour paroxysm of fury, an eruption of internet vindictiveness.

I said I was sorry posting her tweet had teleported her into a world of media scrutiny and misery. I’d tried not admitting even to myself that I was sorry, toying with various exculpatory principles like a child’s wooden blocks: posting her tweet had been media criticism, industry watchdoggery, social justice, karma.

This Is Why Billionaire Peter Thiel Wants to End Gawker
By J.K. Trotter

Thiel has a very specific sense of how Silicon Valley investors should exert their power. And his vaunted prowess as an investor has not always been borne out by reality. As Gawker has noted over the past decade:

  1. His vaunted hedge fund Clarium Capital was an abject failure, losing more than 90% of its $7 billion in assets, a decline that Valleywag assiduously chronicled.
  2. He is an arch libertarian who believes that central mechanisms of contemporary society—including representative democracy, universal suffrage, and formalized education—are either outdated or incompatible with human freedom.
  3. He is a loud proponent of “seasteading,” the movement to establish sovereign communities on permanent ocean vessels for the purpose of developing legal systems unencumbered by taxes or any other kind of traditional government policies.
  4. He believes death itself can and should be cheated, and even intends to be cryogenically frozen after he passes away, in hopes that science will one day be capable of reviving him. He literally wants to live forever.
  5. He has backed efforts to question the legitimacy of climate change science as well as political groups opposed to immigration—even though the industry that minted him as a billionaire is heavily dependent on immigrant labor.
  6. Gizmodo’s recent coverage of Facebook, in which Thiel was an early investor and on which he has a board seat, launched a congressional investigation into the company’s news curation practices, and inspired a national conversation about the vast amount of power the company wields—with no transparency and minimal accountability—over who reads what.

Gawker’s Gone. Long Live Gawker.
By Farhad Manjoo

Gawker’s formula worked. Its sites found lots of traffic, and until recently it often minted profits. More than that, it played an outsize part in online culture just as that culture was becoming the center of society. No wonder, then, that the rest of the media began to ape its style and format.

Washington Post settles lawsuit with family of Kentucky teenager
By Paul Farhi

The Washington Post has settled a lawsuit brought by the parents of a teenager who alleged that news coverage of the teen’s encounter with a Native American activist on the steps of the Lincoln Memorial last year was defamatory.

The Post admitted no wrongdoing in settling with the family of Nicholas Sandmann, the Covington, Ky., high school student who was involved in the episode during a school trip to Washington in January 2019.

The family’s suit against NBC is still pending. They have also filed suits against Gannett, ABC, CBS, the New York Times and Rolling Stone.

The Sandmanns sought $250 million in damages — which their attorneys noted was the same amount Amazon founder and chief executive Jeff Bezos paid in 2013 to purchase The Post.

CNN settles libel lawsuit with Covington Catholic student
By Paul Farhi

CNN agreed to settle a libel lawsuit filed by the family of a Covington, Ky., teenager who gained national attention during an encounter with a Native American activist on the steps of the Lincoln Memorial in Washington last year.

The network said on Tuesday that it settled the suit with the family of Nicholas Sandmann, but neither side disclosed the terms of their agreement.

Sandmann, who was 16 at the time, was part of a group of students from Covington Catholic High School who had traveled to Washington to attend an antiabortion march on the Mall. Many were wearing red “Make America Great Again” hats and waiting for a bus to pick them up at the memorial when Nathan Phillips, a Native American activist, approached.

What happened next sparked a national debate about the behavior and motives of the participants. Initial news accounts said Sandmann blocked Phillips’ way and smiled at him as Phillips beat a small drum and chanted. In the aftermath, Phillips said Sandmann was the aggressor, but Sandmann later said he had no intent to impede or harass Phillips. Video that later emerged showed that a third group, who identified themselves as Black Israelites, had been taunting the group of Kentucky teenagers.

In the past, news organizations have settled claims of defamation rather than try the case in court, a route that can be expensive, even with a favorable judgment. CNN, for example, settled a defamation suit filed by Richard Jewell, the Atlanta security guard who was suspected of planting a pipe bomb during the 1996 Olympics. The network paid Jewell and his mother, Bobi, $350,000 but maintained that it had acted properly, according to “The Suspect,” a book published last year about the Jewell case.

The death of the private citizen
By Amber Athey

CNN has a similar track record of thrusting private individuals into the spotlight because of their internet activities. In 2017, the outlet dug into a Redditor who created a meme shared by the President, and a CNN editor said the only reason they did not reveal the man’s identity was that he apologized for the meme and promised not to do it again. The admission was akin to political blackmail. Similarly, in 2018, a CNN reporter showed up with a cameraman on a woman’s lawn to scold her for sharing a pro-Trump Facebook account that was allegedly set up by Russians.

The exposing of random internet users is not just a sign of the media’s narcissism. It also has dire consequences for online discourse. Some warn that granting internet users a cloak of anonymity leads to extremism. It also leads to progress. Good ideas that are nonetheless considered controversial by society’s standards can never gain traction if people are too scared of cancellation to post and debate them. The Times and its brethren are doing society a great disservice by becoming the gatekeepers of internet anonymity.

The Latest Squabble Inside The New York Times
By Maxwell Tani

Late on Monday, the science and philosophy writer who founded the blog Slate Star Codex announced that he was shutting down his widely read site over what he said was a forthcoming story in the Times that would share his real identity.

Scott Alexander, the psychiatrist who helms the blog and writes under his first and middle name, said that he did not want to reveal his full name because of past death threats made against him, as well as fear that it would put his psychiatric clinic and patients at risk. Instead, he asked his readers to spam The New York Times demanding the paper not publish his full name.

“After considering my options, I decided on the one you see now,” he wrote in the only post now atop the scrubbed website. “If there’s no blog, there’s no story. Or at least the story will have to include some discussion of NYT’s strategy of doxxing random bloggers for clicks.”

The Slate Star Codex incident set off a tense conversation in the Times’ “newsroom-feedback” Slack channel, an internal message board in which staff have felt increasingly emboldened to criticize and raise questions about the paper and, inevitably, the work of their own colleagues.

Following Alexander’s blog post, several non-editorial newsroom staffers from the paper’s tech and product teams asked why he was being “doxxed,” and pointed out that the blogger’s cause was gaining traction on the computer-science site Hacker News. Another non-editorial staffer said they flagged the not-yet-published story for the paper’s customer care department “in case this snowballs into a spike in cancellations.”

“One of our developers is asking about a tech blogger who took their blog down rather than be doxed by us,” one employee wrote. “Is this something arising because of our standards? Can someone from the technology desk provide some insight?”

“I sincerely hope that the reporters did not intend to print the full name of someone who seems to have several damned good reasons to blog with a pseudonym,” fumed another non-editorial Times staffer, echoing a sentiment reflected by other tech and product staffers.

Several Times staffers pushed back, noting that the paper was not “doxxing” Alexander, as that term is widely used to describe situations where the goal of revealing a person’s identity is specifically to encourage harassment.

The dust-up is just the latest in a series of internal tensions that have roiled the Times in recent weeks. Many editorial staffers openly revolted earlier this month after the paper’s opinion page published a column from Republican Sen. Tom Cotton calling for President Donald Trump to “send in the troops” in response to nationwide protests against police brutality. Dozens of employees openly admonished the paper on social media, saying the op-ed put Black Times staffers in danger. That internal uprising led to a town hall in which the paper’s executives took turns apologizing to irate staffers for running the senator’s column; and it resulted in the ouster of opinions editor James Bennet, who admitted to having not reviewed the piece before its publication.

Other recent incidents have further exposed fissures at the paper, with Slack channels often playing host to the conflicts.

Insiders told The Daily Beast that over the past few weeks, some non-editorial staffers at the Times have become increasingly active in raising issues in the newsroom-feedback room, a channel with more than 2,000 Times staffers that has provided a convenient forum for feedback but has also sparked controversy and internal headaches. While some editorial staffers have appreciated and accepted the criticism, other reporters and editors in the newsroom have bristled at the zealous incursion by non-newsroom staff into the newsgathering, reporting, and editing process.

Some employees have appeared to tire of the constant back-and-forth in the channel, particularly among employees with no direct involvement in particular decisions being debated.

On Tuesday, one person chimed in to remind staff that their comments were widely available for thousands of people to see. “I’d like to gently remind people that there are two thousand people in this slack channel and we should be mindful of that,” the employee wrote.

Slack is fueling media’s bottom-up revolution
By Steven Perlberg

Times employees were in open revolt following the publication of an op-ed by US Senator Tom Cotton arguing for a military crackdown in response to protests in American cities. The “Newsroom Feedback” Slack channel, a venue for more than 2,000 editorial and non-editorial employees to speak out, became a rapid-fire feed of criticism and emoji reactions on how the paper covers race. One employee remarked that many customer care representatives, the people forced to respond to readers cancelling their subscriptions, were themselves people of color.

“While I don’t run Opinion, I’m the senior leader of the newsroom,” Baquet wrote to the group in a Slack message obtained by Digiday. “I also believe that some of the points being raised in this channel do point to things the news side can do better. So I’m reading. Thank you.”

When media executives have to put out fires, they meet staffers where they live — on Slack, the enterprise software service where employees communicate, plan, gossip, talk shit about bosses and each other — and increasingly, organize themselves to fight for their rights. The irony of Slack is that media business leaders gravitated to it years ago as a tool to make the labor force more efficient and available at all hours. And now, those same workers are using Slack to fight back against their capitalist bosses. It has become the central forum for the media’s bottom-up revolt, in which empowered, often young staffers are demanding accountability from their managers, outing racist and discriminatory practices, and openly organizing unions to rebalance the power dynamic between management and the rank and file. In the coronavirus work-from-home era, Slack has taken on even greater importance as a mechanism for internal change. It is the new water cooler — one of substance to figure out stories in addition to inane banter about the latest Twitter outrage — and media workers are dumping it on their bosses’ heads.

Slack Has Made Remote Office Communication Easier. It Can Also Be Less Civil.
By Chip Cutter and Aaron Tilley

The popular messaging tool makes instant communication across a company simple—one reason its usage has surged during the pandemic. But when everyone is working virtually, instant-messaging platforms like Slack can become a dumping ground for grievances, passive aggressiveness and other exchanges that are best left for private conversations, says Victor Cho, Evite’s CEO.

“You can’t have large, nuanced conversations over Slack,” Mr. Cho says. “That’s where you just see it going off the rails.” Evite advised its roughly 100 employees to resolve conflicts and complex issues by picking up the phone or scheduling a video meeting, which Mr. Cho says staffers have heeded.

Many companies have embraced Slack and tools like it as a more efficient way to communicate. In the first weeks after many companies issued work-from-home orders in March, usage soared: On March 25, Slack Technologies Inc. said its platform had 12.5 million simultaneous users, up from 10 million two weeks earlier. Microsoft Corp. said in April that the number of daily users on its Teams platform, a competitor to Slack, had grown to 75 million people, more than double the number in early March. The companies haven’t disclosed more recent user numbers.

The technology allows workers to swap information in seconds and respond more quickly than in email with emojis and funny videos, making it easy to set an informal tone. As many offices remain closed, such platforms have become virtual water coolers, one of the primary ways homebound staffers stay in touch with each other.

But the casual nature of many interactions also means some people let their guards down, trash talk and act unprofessionally on the channels, some executives say. Since the pandemic, California employment lawyer Amber Bissell says she has noticed an uptick in harassment complaints related to online communications. Some companies say they have installed tracking tools to police online channels for signs of bullying.

Bari Weiss Resigns From New York Times Opinion Post
By Edmund Lee

Ms. Weiss recently came under fire for online comments on the staff unrest that followed the publication of a Times Op-Ed piece by Senator Tom Cotton calling for a military response to civic unrest in American cities during the widespread protests against racism and police violence.

More than 1,000 Times staff members signed a letter protesting the Op-Ed’s publication, and James Bennet, the editorial page editor, resigned days after it was published. An editors’ note was added to the essay, saying it “fell short of our standards and should not have been published.” The opinion department of The Times is run separately from the newsroom.

In a tweet, Ms. Weiss described the turmoil inside the paper as a “civil war” between “the (mostly young) wokes” and “the (mostly 40+) liberals.” Many staff members objected on Twitter to her comment, saying it was inaccurate or misrepresented their concerns.

Twitter is editing the New York Times
By Bari Weiss

Twitter is not on the masthead of the New York Times. But Twitter has become its ultimate editor. As the ethics and mores of that platform have become those of the paper, the paper itself has increasingly become a kind of performance space. Stories are chosen and told in a way to satisfy the narrowest of audiences, rather than to allow a curious public to read about the world and then draw their own conclusions. I was always taught that journalists were charged with writing the first rough draft of history. Now, history itself is one more ephemeral thing moulded to fit the needs of a predetermined narrative.

My own forays into Wrongthink have made me the subject of constant bullying by colleagues who disagree with my views. They have called me a Nazi and a racist; I have learned to brush off comments about how I’m “writing about the Jews again.” Several colleagues perceived to be friendly with me were badgered by coworkers. My work and my character are openly demeaned on company-wide Slack channels where masthead editors regularly weigh in. There, some coworkers insist I need to be rooted out if this company is to be a truly “inclusive” one, while others post ax emojis next to my name. Still other New York Times employees publicly smear me as a liar and a bigot on Twitter with no fear that harassing me will be met with appropriate action. They never are.

There are terms for all of this: unlawful discrimination, hostile work environment, and constructive discharge. I’m no legal expert. But I know that this is wrong.

I do not understand how you have allowed this kind of behaviour to go on inside your company in full view of the paper’s entire staff and the public. And I certainly can’t square how you and other Times leaders have stood by while simultaneously praising me in private for my courage. Showing up for work as a centrist at an American newspaper should not require bravery.

The New York Times’s self-inflicted fiasco
By Kathleen Parker

Cotton’s essential argument was that an “overwhelming show of force” was needed as the protests unfolded and that President Trump should invoke the 200-year-old Insurrection Act to “restore order to our streets.” Bad idea, Tom. See how easy that was? I for one am glad to know what’s inside Cotton’s cerebral cavity. I disagree with his thinking for the same reasons raised by others, including former defense secretary and retired Marine general Jim Mattis. As a member of the Kent State generation, I find turning our military on our own people to be against my remaining liberal sensibilities, not to mention American values.

The angry Times staffers also claimed that the op-ed was inflammatory and “contained assertions debunked as misinformation by the Times’s own reporting.” They pointed to Cotton’s claim that antifa, a self-described anti-fascism movement opposed to the far-right that can seem sort of fascist in its disruptive tactics, was behind the unrest. The piece should have been more carefully edited to make it clear that the evidence behind Cotton’s claim about antifa’s role was not very convincing. While his piece was far from perfect, Cotton tried to draw a distinction between violent actors and peaceful protesters.

There are many reasons and ways to disagree with Cotton’s ideas and the way he presented them without censorship as prequel or apology as sequel. It is sadly ironic that the Times ultimately aided and abetted Cotton’s larger goals. The senator’s presidential ambitions are well-known and, thanks to the Times, have been well-served. Already, Cotton has added at least $200,000 to his coffers and made an instant name for himself in those quarters of the Republican Party in which it is never bad politics to do harm to the media. Here’s his schadenfreude-drenched tweet: “How is everyone at the @nytimes doing this morning? Did you have a late night trying to come up with an excuse to pretend you didn’t cave to the woke mob?”

Bennet’s mistake, which he has admitted, was in not reading the Cotton op-ed before running it. He likely assumed it had been sufficiently vetted by other editors, who have said they fact-checked and approved it. Bennet’s deputy editor, James Dao, tweeted that he “oversaw the acceptance and review” of the op-ed. He, too, has been removed from the masthead but has moved to another position in the newsroom.

It is probably telling that the Cotton protest largely took place on Twitter, where it was sure to gain momentum. It doesn’t take much courage to join a gang and cancel an opinion — or ruin a career. It does take great courage, on the other hand, to stand alone against a tide of pitchfork-wielding Twitter tyrants and defend a free exchange of ideas, even if some of them are bad.

A crisis of conviction at the New York Times
By Erik Wemple

Such is the left’s scrutiny of the New York Times that the public knows about seemingly every imperfection that has surfaced in its pages since Jan. 20, 2017.

And much of the attention has fallen on Bennet’s opinion pages. He recruited former Wall Street Journal columnist Bret Stephens in April 2017 and watched as his new hire’s first column — about climate change — got mauled on social media. The column itself still bears a scar from that set-to, a consequential correction about how Stephens characterized the impact of climate change.

Other gaffes stemmed from management and process flubs. A year ago, the Opinion section published an anti-Semitic cartoon, prompting the newspaper to acknowledge that the responsible editor was “working without adequate oversight” in a “faulty process.” The section hired journalist Quinn Norton in 2018 to write about technology, only to then realize that she had written about neo-Nazi friendships and other troublesome material. She was fired hours after her hiring was announced. Another 2018 hire, Sarah Jeong, had written derisive remarks about white people; she lasted about a year.

In June 2017, the New York Times published an editorial suggesting that Sarah Palin’s political action committee had incited the murderous 2011 rampage of Jared Lee Loughner in Arizona. Palin sued for defamation, a step that opened the editorial process to a blast of sunlight. As it turned out, Bennet had inserted problematic language in the editorial without having taken basic, essential steps to confirm the details.

Missteps notwithstanding, Bennet has long been regarded as a possible successor to Executive Editor Dean Baquet. As recently as last fall, Sulzberger said this about Bennet to The Post: “Under his leadership, Opinion has been vital, creative and unafraid to tackle big issues, from privacy to domestic abuse to the legacy of slavery. He’s not only a great editor, but a deeply honorable one. As much as any journalist I’ve worked with, he’s constantly pushing himself to make the right journalistic decision.”

How the masthead of the New York Times looks back on all this is difficult to discern. In Friday’s staff meeting, Sulzberger said that the op-ed never should have been published and didn’t meet the newspaper’s standards — this, after writing on Thursday that it embodied the paper’s spirit. In explaining that contradiction to colleagues at the meeting, Sulzberger downplayed the memo as a “placeholder” while the newspaper looked into the matter, according to sources logged into the meeting.

This particular placeholder isn’t holding anything.

The New York Times is experiencing a crisis of leadership and conviction. In just two days, it has alienated staffers, readers, liberals, conservatives, free-expression absolutists of all political persuasions and Tom Cotton. There’s a saying in Washington that if you’re angering both sides, you must be doing something right. The Times’s recent actions prove that such “wisdom” is a crock.

‘Threw Him Under the Bus’: NY Times Publisher A.G. Sulzberger Laments Bennet’s Ouster
By Lloyd Grove

“I really lament the loss of a talent that I respect and admire more than you could know,” Sulzberger, 39, told The Daily Beast about Bennet’s abrupt forced resignation this past Sunday—a mere four days after Bennet’s deputy Jim Dao and a junior editor, former Weekly Standard staffer Adam Rubenstein, published Cotton’s online screed. It was jarringly titled “Send in the Troops,” a polemic in which the Donald Trump-loving Republican demanded that the U.S. military be deployed in response to widespread protests against police brutality.

“But at the end of the day, the most important thing, when you have these crises, is: Can you show up on Monday morning and lead the team out of it,” Sulzberger added. “I really regret that the answer we all got [for Bennet] was ‘no.’”

In a series of widely criticized social media posts—apparently live-tweeted during a private meeting between Bennet and his editorial and opinion staff—right-leaning opinion editor Bari Weiss caricatured the divide as a “civil war” between “the (mostly young) ‘wokes’ ” who allegedly seek to quash uncomfortable ideas and “the (mostly 40+) liberals” who honor the values of free speech. (During the combative town hall last Friday, one questioner asked if Weiss will be fired; the answer was no, although bosses were said to be evaluating her social media conduct.)

Yet A.G., who comes across as an empathetic boss trying to clean up a mess made by others, should be held largely accountable for the circumstances that led to the latest troubles, especially the business imperative to churn out massive amounts of content under demanding time pressures to keep readers engaged, said a prominent journalist who asked not to be further identified.

“I was shocked that A.G. didn’t accept any responsibility himself for the circumstances that surrounded this particular controversy,” this person told The Daily Beast. “The editorial pages and the op-ed pages traditionally report to the publisher… The publisher is supposed to be shaping the strategy of the editorial and opinion pages. That’s always been the fun of owning a newspaper.”

Under Sulzberger, “there has been a heavy investment in the growth of opinion at the Times,” the journalist continued, noting that Bennet is a friend. “That was something that A.G. wanted and approved, because it drives their subscription strategy. New York Times readers like to read opinions—especially opinions that align with their own—and they increasingly don’t like to read opinions that don’t align with their own.”

It’s the publisher, not the editor, who sets goals for subscription sign-ups, the journalist noted. “That is a business strategy. That is what the subscriber feedback loop is telling them. It’s great that they’ve achieved this sustainability by moving away from advertising to subscriptions”—around six million Times subscribers to date, including almost a million who pay for the ink-on-paper edition—“but let’s be clear, they’re following their audience and looking at what their audience reads the longest and where they feel emotionally attached to the Times file. And opinion is one of the areas where that data lights up…

“So to call out the Tom Cotton op-ed, as stupid and offensive as it may have been, as some kind of lapse of judgment, it just doesn’t add up. You would expect a leader like A.G. to share some of the responsibility for this. Instead, he threw James under the bus.”

Inside the Revolts Erupting in America’s Big Newsrooms
By Ben Smith

The fights at The Times are particularly intense because Mr. Sulzberger is now considering candidates to replace the executive editor, Dean Baquet, when he reaches the mandatory retirement age of 65 in 2022. Competing candidates represent different visions for the paper, and Mr. Bennet had embodied a particular kind of ecumenical establishment politics. But the Cotton debacle had clearly endangered Mr. Bennet’s future. When the highly regarded Sunday Business editor, Nick Summers, said in a Google Hangout meeting last Thursday that he wouldn’t work for Mr. Bennet, he drew agreement from colleagues in a chat window.

Inside the New York Times’ Heated Reckoning With Itself
By Reeves Wiedeman

During a company town hall two days later, while Bennet got teary answering questions, employees took to Slack again to express their frustration at the company’s seeming lack of action to rectify the situation. Bennet had joined the Times in 2016 with an explicit mandate to expand the voices in the op-ed pages beyond the center-left consensus in which most of its columnists fit. The “Opinion” section had suffered a number of controversies, and the newsroom had become frustrated with what seemed to be an alternate set of standards. Employees were galled to find out that Bennet had not read the column before it was published — while a Black photo editor had done so and objected to no avail. A development editor connected Cotton’s op-ed with a profile of Adolf Hitler from 1922, while an employee in brand marketing asked why Alison Roman, the food writer who had recently been suspended for disparaging comments she made about Chrissy Teigen and Marie Kondo, was seemingly being treated more severely than Bennet.

The conversation turned into what more than one Times employee described to me as a “food fight.” During the mêlée, “Opinion” columnist Elizabeth Bruenig uploaded a PDF of John Rawls’s treatise on public reason, in an attempt to elevate the discussion. “What we’re having is really a philosophical conversation, and it concerns the unfinished business of liberalism,” Bruenig wrote. “I think that all human beings are born philosophers, that is, that we all have an innate desire to understand what our world means and what we owe to one another and how to live good lives.”

“Philosophy schmosiphy,” wrote a researcher at the Times whose Slack avatar was the logo for the hamburger chain Jack in the Box. “We’re at a barricades moment in our history. You decide: which side are you on?”

By Monday morning, Bennet was out. To those who saw the op-ed as one in a series of screwups, Bennet’s ouster was a long time coming. To those who believed his effort to present occasionally controversial views for public consideration was core to the Times’ mission, the decision was a retreat from principle. “I call it a fucking disgrace,” said Daniel Okrent, the former public editor. “I think that James’s firing was as meaningful for the paper’s existence and how it’s perceived as Jayson Blair was.”

It is difficult to think of many businesses that have benefited more from Donald Trump’s presidency — aside from the Trump-family empire — than the Times. After Trump’s election, in 2016, subscriptions grew at ten times their usual rate, and they have never looked back. The Times has gone from just over three million subscribers at the beginning of the Trump presidency to its record of more than seven million last month. It has hired hundreds of journalists to staff a newsroom that is now 1,700 people strong — bigger than ever. Its stock has risen fourfold since Trump took office, and the Times has consolidated its Trump bump into a business that includes Serial Productions, the podcast juggernaut; Audm, the audio-translation business; and a TV show based on the Times series “Modern Love” that was filming its second season this summer, until a COVID-19 false positive on-set forced it to halt shooting. More than a million people subscribe to its Crossword and Cooking apps alone, and the company has been able to weather the pandemic in part because it now has more cash on hand—$800 million—than at any point in its history. It has become the news-media organization to rule them all.

Identifying as a reader of the Times has become a marker of resistance, and parts of the paper amount to service journalism for participatory democracy — even if the journalists doing the work don’t see it that way. “There’s still this huge gap between what the staff and audience and management want,” one prominent Times reporter said. “The audience is Resistance Moms and overwhelmingly white. The staff is more interested in identity politics. And management is newspaper people. There’s an impulse to want to be writing for a different audience.”

In 2018, a group of data scientists at the Times unveiled Project Feels, a set of algorithms that could determine what emotions a given article might induce. “Hate” was associated with stories that used the words tax, corrupt, or Mr. — the initial study took place in the wake of the Me Too movement — while stories that included the words first, met, and York generally produced “happiness.” But the “Modern Love” column was only so appealing. “Hate drives readership more than any of us care to admit,” one employee on the business side told me.

Twitter presented innumerable headaches, with reporters having to be chastised for being overtly political, or simply for sounding un-Timesian in their pursuit of likes and retweets. “There’s a very sad need for validation,” one Times journalist who has tweeted tens of thousands of times told me.

After Bennet’s ouster, Sulzberger met with a columnist for the “Opinion” section who had expressed consternation about the decision. Sulzberger promised the columnist that the Times would not shy away from publishing pieces to which the Times’ core audience might object. “We haven’t lost our nerve,” Sulzberger said.

“Yes, you have,” the columnist told Sulzberger. “You lost your nerve in the most explicit way I’ve ever seen anyone lose their nerve. You can say people are still gonna be able to do controversial work, but I’m not gonna be the first to try. You don’t know what you’ll be able to do, because you are not in charge of this publication — Twitter is. As long as Twitter is editing this bitch, you cannot promise me anything.”

Cancel Culture Journalism
By The Editorial Board

An ostensibly independent opinion section was ransacked because the social-justice warriors in the newsroom opposed a single article espousing a view that polls show tens of millions of Americans support if the police can’t handle rioting and violence. The publisher failed to back up his editors, which means the editors no longer run the place. The struggle sessions on Twitter and Slack channels rule.

All of this shows the extent to which American journalism is now dominated by the same moral denunciation, “safe space” demands, and identity-politics dogmas that began in the universities. The agents of this politics now dominate nearly all of America’s leading cultural institutions—museums, philanthropy, Hollywood, book publishers, even late-night talk shows.

On matters deemed sacrosanct—and today that includes the view that America is root-and-branch racist—there is no room for debate. You must admit your failure to appreciate this orthodoxy and do penance, or you will not survive in the job.

The Most Intolerant Wins: The Dictatorship of the Small Minority
By Nassim Nicholas Taleb

The main idea behind complex systems is that the ensemble behaves in ways not predicted by its components. The interactions matter more than the nature of the units. Studying individual ants will never (one can safely say never for most such situations) give us an idea of how an ant colony operates. For that, one needs to understand an ant colony as an ant colony, no less, no more, not a collection of ants. This is called an “emergent” property of the whole, by which parts and whole differ because what matters is the interactions between such parts. And interactions can obey very simple rules. The rule we discuss in this chapter is the minority rule.

The minority rule will show us how all it takes is a small number of intolerant, virtuous people with skin in the game, in the form of courage, for society to function properly.

… you think that because some extreme right- or left-wing party has, say, the support of ten percent of the population, its candidate would get ten percent of the votes. No: these baseline voters should be classified as “inflexible” and will always vote for their faction. But some of the flexible voters can also vote for that extreme faction, just as non-kosher people can eat kosher, and these people are the ones to watch out for, as they may swell the number of votes for the extreme party.

How do books get banned? Certainly not because they offend the average person; most people are passive and don’t really care, or don’t care enough to request the banning. Judging from past episodes, all it takes is a few (motivated) activists to get some books banned or some people blacklisted. The great philosopher and logician Bertrand Russell lost his job at the City College of New York owing to a letter from an angry, and stubborn, mother who did not wish to have her daughter in the same room as the fellow with a dissolute lifestyle and unruly ideas.

Let us conjecture that the formation of moral values in society doesn’t come from the evolution of the consensus. No, it is the most intolerant person who imposes virtue on others precisely because of that intolerance. The same can apply to civil rights.

As I am writing these lines, people are disputing whether the freedom of the enlightened West can be undermined by the intrusive policies that would be needed to fight fundamentalists.

… can democracy, by definition the rule of the majority, tolerate enemies? The question is as follows: “Would you agree to deny the freedom of speech to every political party that has in its charter the banning of freedom of speech?” Let’s go one step further: “Should a society that has elected to be tolerant be intolerant about intolerance?”

Refinery29 Editor Resigns After Former Employees Describe ‘Toxic Culture’
By Katie Robertson

The women’s lifestyle publication Refinery29 is the latest media organization to undergo a change in leadership during the cultural reckoning that has accompanied the widespread protests against racism and police violence.

Christene Barberich, the top editor and a co-founder of the 15-year-old site, said on Monday that she would step down after a number of former Refinery29 employees came forward on social media in recent days to describe discrimination they experienced while working at the company.

“I’ve read and taken in the raw and personal accounts of Black women and women of color regarding their experiences inside our company at Refinery29,” Ms. Barberich wrote in a post on Instagram. “And, what’s clear from these experiences, is that R29 has to change. We have to do better, and that starts with making room. And, so I will be stepping aside in my role at R29 to help diversify our leadership in editorial and ensure this brand and the people it touches can spark a new defining chapter.”

Ms. Barberich, 51, will step down from her job immediately but will stay on as an adviser until the fall, according to two people with knowledge of the matter. Ms. Barberich did not reply to a request for comment.

The writer Ashley C. Ford was one of the former employees who described her experience on Twitter. “I worked at Refinery29 for less than nine months due to a toxic company culture where white women’s egos ruled the near nonexistent editorial processes,” she wrote. “One of the founders consistently confused myself and one of our full-time front desk associates & pay disparity was atrocious.”

A former Refinery29 senior editor, Ashley Alese Edwards, who now works at Google, wrote on Twitter that the concerns she raised were routinely ignored and that her job title did not reflect her experience or duties at the company.

Bon Appetit’s editor resigns after ‘brownface’ photo and allegations of discrimination
By Emily Heil

Bon Appétit magazine’s top editor, Adam Rapoport, resigned his position Monday night after growing calls from former and current staffers and contributors for him to step down. The outcry came after a series of damning online disclosures, including an allegation by an editor that people of color are not paid for video appearances while their white colleagues are — which a magazine spokeswoman later denied — and the surfacing of an undated photo that appeared to show a younger Rapoport wearing a racist costume.

“I am stepping down as editor in chief of Bon Appétit to reflect on the work that I need to do as a human being and to allow Bon Appétit to get to a better place,” he wrote in an Instagram post.

“From my extremely ill-conceived Halloween costume 16 years ago to my blind spots as an editor, I’ve not championed an inclusive vision,” he wrote on Instagram, apologizing for his “failings.” He said the magazine’s staff “deserved better” and that it has “been working hard to evolve the brand in a positive, more diverse, direction.”

Blackface incident at Post cartoonist’s 2018 Halloween party resurfaces amid protests
By Marc Fisher and Sydney Trent

A middle-aged white woman named Sue Schafer wore a conservative business suit and a name tag that said, “Hello, My Name is Megyn Kelly.” Her face was almost entirely blackened with makeup. Kelly, then an NBC News anchor, had just that week caused a stir by defending the use of blackface by white people: “When I was a kid, that was okay, as long as you were dressing up as, like, a character.”

So just before heading over to the party, Schafer, a graphic designer and friend of Toles, decided to dress as Kelly in blackface to mock her, she said.

Some of the approximately 100 guests at the home of the cartoonist in the District’s American University Park neighborhood said they didn’t notice the blackface. Some noticed it and said nothing. A few people walked up to Schafer, who was then 54, and challenged her about her costume. Gruber, who is of Puerto Rican descent, and her friend Lyric Prince, who is African American, confronted Schafer directly.

“You understand how offensive that could be to a person of color?” Gruber said, according to two witnesses.

“I’m Megyn Kelly — it’s funny!” Schafer replied, the witnesses said.

Nearly two years later, the incident, which has bothered some people ever since but which many guests remember only barely or not at all, has resurfaced in the nationwide reckoning over race after George Floyd, an unarmed African American man, was killed when a white police officer in Minneapolis knelt on his neck for nearly nine minutes. Many protesters have called on white Americans to reassess their own actions or inactions when confronting violent and everyday racism alike.

Why Did the Washington Post Get This Woman Fired?
By Josh Barro and Olivia Nuzzi

It’s not unusual for publications to report on themselves, though ordinarily they do so because they are already in the news. Proactive reporting creates significant concerns about conflicts of interest: If the Post shapes the narrative of a new story about itself, it may do so in a way that is designed to protect the paper’s interests. When politicians do this, reporters cynically refer to it as “getting out ahead of the story” — the idea being that new information has more impact on public perception than a clarification or correction or addition to already public information.

While the piece failed as a journalistic investigation of the culture of Washington, D.C.’s “media figures,” it succeeded by a different metric: It ensured that Schafer bore the brunt of the criticism in the piece — for example, describing her insensitive interaction in the taxi en route to the party, an incident that occurred outside the presence of any media figures — with minimum organizational exposure to the Post. If it had leaked that the Post had neglected to pursue a story about blackface, or if the women who brought the tip to the Post had taken it to another outlet or simply tweeted about it, who knows what direction the story might have taken.

The Schafer story ran at a time of severe pressure on editorial management at the Post over issues related to diversity and news philosophy. Conflicts between the paper’s executive editor, Marty Baron, and two reporters — Wesley Lowery, who made high-profile arguments for newspapers to take clearer moral stances in their reporting and has since left the Post for CBS News, and Felicia Sonmez — over issues of objectivity on social media have spilled into public view. As the story about the two-year-old Halloween party was being reported, senior managers at the Post had recently received 32 pages of employees’ stories about racism and discrimination within the company, gathered by the employee union, according to an email reviewed by New York. A Post employee petition demanding changes in hiring and editorial practices drew nearly 500 signatures in 48 hours. Basically, it was the worst possible time for the Post to have a “blackface scandal,” especially one whose frame it did not control.

“From the outside, it seems clear someone complained to the Post about this stupid incident and rather than handling it as an HR matter, they decided the best thing for public relations would be to project transparency by reporting on it themselves,” said Lowery. “But what no one appears to have thought of is the way giving a massive amount of attention to a dumb incident involving private citizens would invariably do negligible good and cause massive amounts of harm—including to the Post itself.”

At the same time as U.S. newspapers face staff pressure, they face pressure from news consumers who are skeptical about their commitment to fairness. A challenge is that “fairness” means different things to different people, and a move toward what Lowery calls “moral clarity” in the news pages will appeal to some readers while alienating others. But one key element of fairness that is shared across competing visions of journalism is that the news pages should be used for news, not as a PR shield to protect the newspaper’s business.

As is so often the case, if the Post had simply followed its own published editorial guidance that “fairness includes relevance,” it would have made the right decision and passed on this story. Asked repeatedly to define that phrase, newsroom leadership didn’t respond to New York.

Hundreds Of Staff At The Guardian Have Signed A Letter To The Editor Criticising Its “Transphobic Content”
By Patrick Strudwick

The letter, which was organised over the last few days in response to a column by Suzanne Moore that has been widely criticised as anti-trans, said the staff were “deeply distressed” by the resignation of a transgender member of staff who said they’d received anti-trans comments from “influential editorial staff” and who criticised the publication of Moore’s column at the editorial morning conference.

The column was “the straw that broke the camel’s back,” the trans employee said, following a series of pieces that pitted trans people against women and against women’s rights. One leader article — the publicly stated position of the newspaper — claimed that trans rights are in “collision” with women’s rights.

The letter points out that this was the third trans staff member to resign over alleged anti-trans bias.

Earlier on Friday, Viner, along with chief executive Annette Thomas, emailed all staff defending its decision to publish pieces that “never shy away from difficult or divisive subjects” and pledging to represent “a wide range of views on many topics”.

The editor and CEO then castigated staff for publicly criticising the work of coworkers: “It is never acceptable to attack colleagues whose views you do not agree with, whether in meetings, on email, publicly or on social media.”

Washington Post Suspends a Reporter After Her Tweets on Kobe Bryant
By Rachel Abrams

The Washington Post suspended one of its reporters, Felicia Sonmez, after she posted tweets on Sunday about Kobe Bryant in the hours after his death. Over 200 Post journalists criticized the paper’s decision on Monday.

Ms. Sonmez on Sunday posted a link on Twitter to a 2016 Daily Beast article that detailed an allegation of sexual assault made against Mr. Bryant in 2003. Her tweet appeared amid a flood of public tributes to the retired Los Angeles Lakers star, who died earlier that day in a helicopter crash at age 41.

Ms. Sonmez received an email from The Post’s executive editor, Martin Baron, at 5:38 p.m., before she was told that she would be placed on leave. The reporter shared the three-sentence email with The New York Times.

“Felicia,” Mr. Baron wrote. “A real lack of judgment to tweet this. Please stop. You’re hurting this institution by doing this.”

The text of Mr. Baron’s email was attached to a screen shot of Ms. Sonmez’s tweet linking to the Daily Beast article. A spokeswoman for The Post and Mr. Baron did not reply to requests for comment on the email.

Mr. Bryant was arrested in 2003 after a complaint by a hotel employee in Colorado. A charge of felony sexual assault was dropped in 2005, and Mr. Bryant settled with his accuser out of court, saying in a statement that he believed the encounter with the woman was “consensual,” although he had come to understand that she did not see it the same way.

Ms. Sonmez’s tweet drew a swift backlash from other Twitter users. She followed it with a post about the negative responses she had received.

“Well, THAT was eye-opening,” she wrote. “To the 10,000 people (literally) who have commented and emailed me with abuse and death threats, please take a moment and read the story — which was written (more than three) years ago, and not by me.”

Ms. Sonmez also posted what appeared to be a screenshot of an email she had received that used offensive language, called her a lewd name and displayed the sender’s full name.

She deleted the three tweets after being told to do so by Tracy Grant, the newspaper’s managing editor, but not before other journalists captured them in screen shots.

The Post confirmed the paid suspension on Monday, but didn’t specify which of the tweets had prompted it to take action.

“National political reporter Felicia Sonmez was placed on administrative leave while The Post reviews whether tweets about the death of Kobe Bryant violated the Post newsroom’s social media policy,” Ms. Grant said in a statement. “The tweets displayed poor judgment that undermined the work of her colleagues.”

Condé Nast Staffers Expose Entertainment Chief’s Old Tweets About Mexicans and Women
By Maxwell Tani

During an all-staff meeting on Tuesday, one anonymous employee posted a question asking about Condé Nast Entertainment chief Oren Katzeff’s old tweet about sexual consent.

“Wonder why CNE has a company culture that allows leadership to have posts like this on their timelines,” the person wrote, linking to a 2010 tweet in which Katzeff wrote: “Earlier today, I saw a girl wearing a shirt that said ‘No means Yes!’ That might explain why the guy holding her hand was smiling…”

That tweet has since been deleted, but following the meeting, several irate staffers sent The Daily Beast other years-old tweets from Katzeff.

“Millie went up to the Mexican waiter and asked him for paper (to draw on). He thought she was asking for his papers. Comedy ensued,” Katzeff wrote in one 2014 tweet being shared by Condé employees.

“My 2 yr old gets a present for pottying like a big girl. Now she wants presents for all big girl things, like nagging and being irrational,” read another post, from 2011, which was deleted Tuesday.

“There either is a cat on my flight, meowing repeatedly a few rows behind me, or a REALLY horny woman,” Katzeff wrote in another 2014 post, which was also removed.

In a statement to The Daily Beast, Katzeff apologized but noted that he was working at a comedic publication at the time he made the social-media posts.

“These tweets were made at a time when I was working in comedy and in a different role in my life, but that doesn’t excuse them,” he said. “History has shown that they are not funny and I regret posting them. I’m sorry for the offense and pain they may have caused.”

Stan Wischnowski resigns as The Philadelphia Inquirer’s top editor
By Craig R. McCoy

It was the placement of an insensitive headline over Inga Saffron’s column in the Tuesday newspaper that may have set the stage for Wischnowski’s departure. He joined the two other top editors in signing an apology to readers and staff, characterizing the headline, “Buildings Matter, Too,” as “deeply offensive” and apologizing for it. The column had explored the destruction of buildings amid the looting that accompanied some of the nationwide protests over police violence.

Even before the headline was published, Wischnowski and other editors had scheduled a staff-wide Zoom meeting to discuss race at the Inquirer and the pressures in particular faced by journalists of color.

Wischnowski, low-key and measured, as is his personality, told staffers on Wednesday that the paper had made strides in diversifying its 213-member newsroom, boosting minority representation to 27 percent of the editorial workforce, about a doubling in four years. He promised more such hires.

The session turned intense and emotional. Some journalists could be seen in tears in their Zoom frames. Critics, black and white, denounced the pace of change at the paper, sharply criticizing both coverage and the racial and gender mix of the staff. Several journalists pointed out that the newspaper could muster only one male African American reporter to cover the protests and police response convulsing a city that is majority minority.

Hours after the wrenching Zoom session, about 50 journalists of color signed a public statement calling for faster changes at the paper. The following day, most of the minority staff took the day off from work in protest.

“It’s no coincidence that communities hurt by systemic racism only see journalists in their neighborhoods when people are shot or buildings burn down,” the statement read in part. It added: “We’re tired of shouldering the burden of dragging this 200-year-old institution kicking and screaming into a more equitable age.”

America’s Newsrooms Face a Reckoning on Race After Floyd Protests
By Khadeeja Safdar, Jeffrey A. Trachtenberg and Benjamin Mullin

Some journalists want more freedom to express their views on social media or attend protests and rallies. In many newsrooms, including the Journal’s, social-media guidelines for staff members are meant to avoid any perception that reporting is biased. Whether a journalist’s commentary or activities constitutes a political statement—or simply a statement of social values—can sometimes be a subject of debate.

At CNN, network president Jeff Zucker was asked at a recent town hall why most editorial staffers can’t express their views on social media but the channel’s anchors can. He responded that anchors are treated differently, according to people familiar with the meeting.

Mr. Zucker said at the meeting the network was reviewing its restrictions on participating in protests. In a memo, CNN said that employees could donate to nonprofits but should refrain from taking part in protests the network was covering and from editorializing on social media. They can attend community vigils, CNN said in the memo.

Mr. Zucker said at the meeting that he was sympathetic with staffers’ desire to protest. A CNN spokesman said employees are generally expected to avoid saying anything on social media that they wouldn’t say on air.

On-Air Reporter Yells ‘Shit’ as Firecracker Explodes During Protest at CNN Center in Atlanta
By Peter Wade

As protesters gathered near the CNN Center in Atlanta, an explosion described as a firecracker detonated and understandably startled CNN’s Nick Valencia, leading him to scream: “SHIT!” during the live Friday night broadcast.

According to Mediaite, in-studio host Chris Cuomo called the scene in Atlanta “unreal.” The host then described some of the havoc that had taken place while the protest against police brutality continued.

“They’ve broken out the windows here, they’re continuing to throw objects, another projectile fired, appeared to be a full water bottle… At least two officers have been injured, one of them appeared to be seriously injured, being dragged out,” Cuomo said.

Earlier Friday evening, Valencia told Anderson Cooper that the scene near the CNN Center was getting dangerous.

“Anderson, it’s ugly out here,” Valencia said, “These demonstrators came here ready to confront the police.”

Valencia, according to Mediaite, went on to say that the protesters were “throwing objects at CNN Center, breaking windows.”

Rancor Erupts In ‘LA Times’ Newsroom Over Race, Equity And Protest Coverage
By David Folkenflik

In an internal Slack exchange last week about recent coverage of protests, however, LA Times film reporter Sonaiya Kelley, who is black, said the newspaper had focused too squarely and too often on the question of looting.

“We can’t constantly pander to our primarily white audience with stories like this that affirm their biases,” she wrote. “One of the responsibilities of the job is to state the facts and tell it true. There’s so much implicit bias in those few sentences alone. And it’s alienating the viewers we’re trying to attract. As well as the [people of color] journalists like me who contribute so much to this paper and then have to read stories like this that oversimplify our struggles and realities.”

The American Press Is Destroying Itself
By Matt Taibbi

Probably the most disturbing story involved Intercept writer Lee Fang, one of a fast-shrinking number of young reporters actually skilled in investigative journalism. Fang’s work in the area of campaign finance especially has led to concrete impact, including a record fine to a conservative Super PAC: few young reporters have done more to combat corruption.

Yet Fang found himself denounced online as a racist, then hauled before H.R. His crime? During protests, he tweeted this interview with an African-American man named Maximum Fr, who described having two cousins murdered in the East Oakland neighborhood where he grew up. Saying his aunt is still not over those killings, Max asked:

I always question, why does a Black life matter only when a white man takes it?… Like, if a white man takes my life tonight, it’s going to be national news, but if a black man takes my life, it might not even be spoken of… It’s stuff just like that that I just want in the mix.

Shortly after, a co-worker of Fang’s, Akela Lacy, wrote, “Tired of being made to deal continually with my co-worker @lhfang continuing to push black on black crime narratives after being repeatedly asked not to. This isn’t about me and him, it’s about institutional racism and using free speech to couch anti-blackness. I am so fucking tired.” She followed with, “Stop being racist Lee.”

The tweet received tens of thousands of likes and responses along the lines of, “Lee Fang has been like this for years, but the current moment only makes his anti-Blackness more glaring,” and “Lee Fang spouting racist bullshit it must be a day ending in day.” A significant number of Fang’s co-workers, nearly all white, as well as reporters from other major news organizations like the New York Times and MSNBC and political activists (one former Elizabeth Warren staffer tweeted, “Get him!”), issued likes and messages of support for the notion that Fang was a racist. Though he had support within the organization, no one among his co-workers was willing to say anything in his defense publicly.

Like many reporters, Fang has always viewed it as part of his job to ask questions in all directions. He’s written critically of political figures on the center-left, the left, and “obviously on the right,” and his reporting has inspired serious threats in the past. None of those past experiences were as terrifying as this blitz by would-be colleagues, which he described as “jarring,” “deeply isolating,” and “unique in my professional experience.”

To save his career, Fang had to craft a public apology for “insensitivity to the lived experience of others.” According to one friend of his, it’s been communicated to Fang that his continued employment at The Intercept is contingent upon avoiding comments that may upset colleagues. Lacy to her credit publicly thanked Fang for his statement and expressed willingness to have a conversation; unfortunately, the throng of Intercept co-workers who piled on her initial accusation did not join her in this.

I first met Lee Fang in 2014 and have never known him to be anything but kind, gracious, and easygoing. He also appears earnestly committed to making the world a better place through his work. It’s stunning that so many colleagues are comfortable using a word as extreme and villainous as racist to describe him.

Though he describes his upbringing as “solidly middle-class,” Fang grew up in a diverse community in Prince George’s County, Maryland, and attended public schools where he was frequently among the few non-African Americans in his class. As a teenager, he was witness to the murder of a young man outside his home by police who were never prosecuted, and also volunteered at a shelter for trafficked women, two of whom were murdered. If there’s an edge to Fang at all, it seems geared toward people in our business who grew up in affluent circumstances and might intellectualize topics that have personal meaning for him.

In the tweets that got him in trouble with Lacy and other co-workers, he questioned the logic of protesters attacking immigrant-owned businesses “with no connection to police brutality at all.” He also offered his opinion on Martin Luther King’s attitude toward violent protest (Fang’s take was that King did not support it; Lacy responded, “you know they killed him too right”). These are issues around which there is still considerable disagreement among self-described liberals, even among self-described leftists. Fang also commented, presciently as it turns out, that many reporters were “terrified of openly challenging the lefty conventional wisdom around riots.”

Lacy says she never intended for Fang to be “fired, ‘canceled,’ or deplatformed,” but appeared irritated by questions on the subject, which she says suggest, “there is more concern about naming racism than letting it persist.”

Max himself was stunned to find out that his comments on all this had created a Twitter firestorm. “I couldn’t believe they were coming for the man’s job over something I said,” he recounts. “It was not Lee’s opinion. It was my opinion.”

By phone, Max spoke of a responsibility he feels Black people have to speak out against all forms of violence, “precisely because we experience it the most.” He described being affected by the Floyd story, but also by the story of retired African-American police captain David Dorn, shot to death in recent protests in St. Louis. He also mentioned Tony Timpa, a white man whose 2016 asphyxiation by police was only uncovered last year. In body-camera footage, police are heard joking after Timpa passed out and stopped moving, “I don’t want to go to school! Five more minutes, Mom!”

“If it happens to anyone, it has to be called out,” Max says.

Max described discussions in which it was argued to him that bringing up these other incidents now is not helpful to the causes being articulated at the protests. He understands that point of view. He just disagrees.

“They say, there has to be the right time and a place to talk about that,” he says. “But my point is, when? I want to speak out now.” He pauses. “We’ve taken the narrative, and instead of being inclusive with it, we’ve become exclusive with it. Why?”

How objectivity in journalism became a matter of opinion
By The Economist

More than 150 Wall Street Journal employees signed a letter saying that they “find the way we cover race to be problematic”. Over 500 at the Washington Post endorsed demands for “combating racism and discrimination” at the paper. Journalists at the New York Times tweeted that a senator’s op-ed advocating a show of military force to restore order “puts black @nytimes staff in danger”.

But at the heart of many of these arguments is another disagreement, about the nature and purpose of journalism. As a Bloomberg employee is said to have remarked at a recent meeting, reporters are meant to be objective, but to many the distinction between right and wrong now seems obvious. A new generation of journalists is questioning whether, in a hyper-partisan, digital world, objectivity is even desirable. “American view-from-nowhere, ‘objectivity’-obsessed, both-sides journalism is a failed experiment,” tweeted Wesley Lowery, a Pulitzer-winning 30-year-old now at CBS News. The dean of Columbia Journalism School described objectivity as an “inherited shibboleth” in a message to students. The Columbia Journalism Review pondered: “What comes after we get rid of objectivity in journalism?”

Objectivity hasn’t always been a journalistic ideal. Early American newspapers read a bit like today’s blogs, says Tom Rosenstiel of the American Press Institute (API), an industry group. Benjamin Franklin’s Pennsylvania Gazette and Alexander Hamilton’s Gazette of the United States were unashamedly partisan. As they sought wider audiences in the 19th century, newspapers became more concerned with what they called “realism”. Some of this was provided by the Associated Press (AP), founded in 1846, which supplied stories to papers of diverse political leanings and so stuck to the facts. As the news pages became more even-handed, publishers established editorial pages, on which they could continue to back their favoured politicians.

The shift away from partisanship a century ago was driven partly by advertisers. Today, as ad revenues leak away to search engines and social networks, newspapers have come to rely more on paying readers. Unlike advertisers, readers love opinion. Moreover, digital publication means American papers no longer compete regionally, but nationally. “The local business model was predicated on dominating coverage of a certain place; the national business model is about securing the loyalties of a certain kind of person,” wrote Ezra Klein of Vox. Left-leaning New Yorkers may switch to the Washington Post if the Times upsets them. The incentive to keep readers happy—and the penalty for failing—are greater than ever.

Disenchanted with objectivity, some journalists have alighted on a new ideal: “moral clarity”. The phrase, initially popularised on the right, has been adopted by those who want newspapers to make clearer calls on matters such as racism. Mr Lowery repeatedly used the phrase in a recent Times op-ed, in which he called for the industry “to abandon the appearance of objectivity as the aspirational journalistic standard, and for reporters instead to focus on being fair and telling the truth, as best as one can, based on the given context and available facts.” The editor of the Times, Dean Baquet, called Mr Lowery’s column “terrific” in an interview with the “Longform” podcast. Objectivity has been “turned into a cartoon”, he said. Better to aim for values such as fairness, independence and empathy.

The danger is that advocates of moral clarity slide self-righteously towards crude subjectivity. This week Bari Weiss, a Times editor, resigned, criticising what she said was the new consensus at the paper: “that truth isn’t a process of collective discovery, but an orthodoxy already known to an enlightened few whose job is to inform everyone else.” Earlier Mr Rosenstiel warned, in a largely supportive response to Mr Lowery’s column, that “if journalists replace a flawed understanding of objectivity by taking refuge in subjectivity and think their opinions have more moral integrity than genuine inquiry, journalism will be lost.”

As reporters learn more about a subject, he adds, the truth tends to become less clear, not more so. Recognising and embracing the uncertainty means being humble—but not timid.

My run-in with the New York Times
By Andrew Sullivan

Wokeness, in case you hadn’t noticed, has entered a more intense phase. Not so long ago, you were canceled for something you did or said or wrote. Now you’re canceled just for saying absolutely nothing at all.

I had a much milder experience of this during the past week when the New York Times decided to run a profile of me. The hook was that I was forced to leave New York magazine last month because, according to the NYT, I had not publicly recanted editing an issue of the New Republic published…in 1994. The issue was a symposium on The Bell Curve, a book by Charles Murray and Richard Herrnstein that explored the connection between IQ, class, social mobility and race. My crime was to arrange a symposium around an extract, with 13 often stinging critiques published alongside it. The fact I had not recanted that decision did not, mind you, prevent TIME, the Atlantic, Newsweek, the NYT and New York magazine from publishing me in the following years. But suddenly, a decision I made a quarter of a century ago required my being canceled. The NYT reporter generously gave me a chance to apologize and recant, and when I replied that I thought the role of genetics in intelligence among different human populations was still an open question, he had his headline: ‘I won’t stop reading Andrew Sullivan, but I can’t defend him.’ In other words, the media reporter in America’s paper of record said he could not defend a writer because I refused to say something I don’t believe. He said this while arguing that I was ‘one of the most influential journalists of the last three decades’. To be fair to him, he would have had no future at the NYT if he had not called me an indefensible racist. His silence on that would have been as unacceptable to his woke bosses as my refusal to recant. But this is where we now are. A reporter is in fear of being canceled if he doesn’t cancel someone else.

A Note to Readers
By The Editorial Board

We’ve been gratified this week by the outpouring of support from readers after some 280 of our Wall Street Journal colleagues signed (and someone leaked) a letter to our publisher criticizing the opinion pages. But the support has often been mixed with concern that perhaps the letter will cause us to change our principles and content. On that point, reassurance is in order.

In the spirit of collegiality, we won’t respond in kind to the letter signers. Their anxieties aren’t our responsibility in any case. The signers report to the News editors or other parts of the business, and the News and Opinion departments operate with separate staffs and editors. Both report to Publisher Almar Latour. This separation allows us to pursue stories and inform readers with independent judgment.

It was probably inevitable that the wave of progressive cancel culture would arrive at the Journal, as it has at nearly every other cultural, business, academic and journalistic institution. But we are not the New York Times. Most Journal reporters attempt to cover the news fairly and down the middle, and our opinion pages offer an alternative to the uniform progressive views that dominate nearly all of today’s media.

As long as our proprietors allow us the privilege to do so, the opinion pages will continue to publish contributors who speak their minds within the tradition of vigorous, reasoned discourse. And these columns will continue to promote the principles of free people and free markets, which are more important than ever in what is a culture of growing progressive conformity and intolerance.

The woke revolution in American journalism has begun
By Damon Linker

The journalists Smith quotes and paraphrases believe that “fairness on issues from race to Donald Trump requires clear moral calls.” That news organizations need to be devoted to “the truth” rather than some spurious ideal of “objectivity.” That in all things “moral clarity” is required. And that a journalist determines whether he or she has achieved such righteousness by measuring the volume of applause from likeminded followers on Twitter.

But what’s absent from Smith’s essay may be even more illuminating than what’s in it. No one acknowledges the difficulty of achieving moral clarity. No one notes that there are precious few “clear moral calls” in life. No one demonstrates awareness that “the truth,” like justice, is something our country is deeply divided about. No one expresses an understanding of how those divisions shape everyone’s standpoint, very much including that of journalists themselves. Or concedes that understanding a country as complex and divided as the United States might require a little humility and willingness to suspend judgment for a time.

In place of difficulty, complexity, and complication, today’s journalistic revolutionaries crave tidy moral lessons with clear villains and heroes. They champion simplicity, embrace moral uplift, and seek out evildoers to demonize.

Liberals aren’t relativists. They’re people who recognize that achieving understanding is hard, that what justice entails and requires is deeply contested in the United States, and that a news organization that aspires to explain our fractious country to itself cannot be guided by the sensibility of a single-issue activist. Lines need to be drawn, but they should be drawn broadly. A serious news organization cannot exclude views championed by one of the country’s two major political parties and held by more than 40 percent of the country’s voters.

A low-circulation magazine with an explicitly partisan agenda can have a narrower scope. So can a propaganda network like Fox News. But a news outlet with national ambitions needs to aim higher — and broader — than that.

That’s why the woke revolution in American newsrooms is so disheartening — not because it’s a victory for the left, but because it’s yet another sign of the hollowing out of the nation’s public life, as individuals and institutions burrow ever-deeper into ideological enclaves. It is a victory for narrowness and dogmatism, for unearned certainty and facile simplifications. Which means it’s also a defeat for the American mind, which finds itself ever more alienated from reality itself.

New research explores how conservative media misinformation may have intensified the severity of the pandemic
By Christopher Ingraham

In recent weeks, three studies have focused on conservative media’s role in fostering confusion about the seriousness of the coronavirus. Taken together, they paint a picture of a media ecosystem that amplifies misinformation, entertains conspiracy theories and discourages audiences from taking concrete steps to protect themselves and others.

The end result, according to one of the studies, is that infection and mortality rates are higher in places where one pundit who initially downplayed the severity of the pandemic — Fox News’s Sean Hannity — reaches the largest audiences.

Fox News has never been so right
By Erik Wemple

… Hannity heads up a crew of Fox News opinionators who minimized the coronavirus outbreak in late February and early March, a period when President Trump was coming under scrutiny for a slow-footed response to the virus’s spread. In an infamous Feb. 27 broadcast, for instance, Hannity opened with a derisive message for those who were ringing alarms. “We’re all doomed. The end is near,” said Hannity. On March 8, Fox Newser Pete Hegseth said, “I feel like the more I learn about this, the less there is to worry about.” On March 7, host Jeanine Pirro said, “All the talk about coronavirus being so much more deadly doesn’t reflect reality. Without a vaccine, the flu would be far more deadly.”

The pooh-poohing commentary was indeed troublesome, especially as it beamed from Hannity’s 9 p.m. perch, when millions of loyal viewers pay an unsettling degree of attention to the host. We here at the Erik Wemple Blog have advocated his sacking.

As the unscientific, pro-Trump statements piled up on Fox News, critics of the network fantasized on social media about a courtroom reckoning. Somehow, the input of the Hannity-Pirro-Hegseth crowd would, they dreamed, trigger massive liability for Fox News. Joe Lockhart, who was press secretary in the Clinton White House, has been outspoken on the matter. “I do believe there will be a number of lawsuits because of the apparent coordination between the President and Fox [News] pushing fake science,” Lockhart told the Erik Wemple Blog via Twitter DM. “They both first denied it would happen and then repeatedly repeated false information to the public, information that cost lives.”

Why we should be wary of an aggressive government response to coronavirus
By Wendy Parmet and Michael Sinha

As the novel coronavirus has spread, so, too, has fear: Beijing has imposed a travel ban on 16 cities, quarantining more than 50 million people, and several countries, including Australia and Singapore, have also imposed travel restrictions. The World Health Organization declared a public health emergency of international concern. Anti-Chinese sentiment, too, has spread rapidly: In France, a newspaper headline warned of a “Yellow Alert,” and restaurants in South Korea, Hong Kong and Vietnam have reportedly refused to serve Chinese customers. In the United States, customers are avoiding cities’ Chinatown neighborhoods, and students report anti-Chinese sentiment growing on their college campuses.

Initially, the U.S. government adopted a relatively measured response, even as political commentators and some members of Congress demanded that the United States ban all travel from China. It resisted travel bans, instead warning people to avoid nonessential travel to China and screening passengers from China at 20 international airports. Its decision Friday to deny entry to non-nationals who have recently traveled to China and to quarantine Americans returning from that country marks a significant, and potentially counterproductive, escalation in the U.S. response to the coronavirus crisis.

Get a grippe, America. The flu is a much bigger threat than coronavirus, for now.
By Lenny Bernstein

The rapidly spreading virus has closed schools in Knoxville, Tenn., cut blood donations to dangerous levels in Cleveland and prompted limits on hospital visitors in Wilson, N.C. More ominously, it has infected as many as 26 million people in the United States in just four months, killing up to 25,000 so far.

In other words, a difficult but not extraordinary flu season in the United States, the kind most people shrug off each winter or handle with rest, fluids and pain relievers if they contract the illness.

But this year, a new coronavirus from China has focused attention on diseases that can sweep through an entire population, rattling the public despite the current magnitude of the threat. Clearly, the flu poses the bigger and more pressing peril; a handful of cases of the new respiratory illness have been reported in the United States, none of them fatal or apparently even life-threatening.

“Anything that we don’t feel we have sufficient information about feels like a threat,” said Lynn Bufka, senior director of practice research and policy at the American Psychological Association and an expert on anxiety. “The flu doesn’t feel novel. Most people’s experience with the flu is they’ve had it, they’ve recovered, it’s not a big deal — despite the fact that thousands of people die every year.”

When she counsels anxious clients, she said, she urges them to “try to get a good assessment of what is actually the risk. How likely is it? And try to distinguish between possibility and probability.

“Sure, it’s possible there will be more cases in the U.S. It’s probable there will be more cases in the U.S. But it’s not likely to be the person standing next to me in the grocery store.”

The Virus Killing U.S. Kids Isn’t the One Dominating the Headlines
By Michael Daly

No flu panic swept the Lake View area or other places in America even as the pediatric death toll for this year rose to 68, markedly higher than the same time last year. The death of a 2-year-old from Park Slope in Brooklyn in December caused barely a ripple in the neighborhood.

But with the end of January came talk of a new threat, coronavirus. The pharmacy nearest to the ill-fated little girl’s home reported that there had been no surge in face mask sales after her death. But the entire stock vanished as talk of coronavirus filled the news.

“Sold out,” the cashier at Bare Essentials said.

Up by Niagara Falls, people began fretting about the large population of Chinese immigrants across the border in Canada. Students at the University at Buffalo worried aloud about what the school was doing to protect them.

All that, even though there had not been a single confirmed case of coronavirus in New York state and no American deaths at all, while thousands have died from the flu.

Why we panic about coronavirus, but not the flu
By Bob Herman

By the numbers: This new strain of coronavirus has killed 132 people so far, all of them in China. More than 6,000 total cases have been reported worldwide, although experts believe that total is underestimated.

  - By comparison, this year’s flu season has killed 8,200 people, with at least 15 million cases — and that’s just in the U.S.

Between the lines: James Lawler, an infectious disease physician at the University of Nebraska, said pandemic viruses like the coronavirus cause more anxiety because, unlike the flu, there are not any initial countermeasures like vaccines, antivirals, diagnostic testing and monitoring systems.

  - Those things exist for the flu, yet vaccination rates are low.
  - “The flu is just not as new and headline-grabbing because we see it every year,” said Emily Martin, an epidemiologist at the University of Michigan.

The bottom line: The coronavirus has upended the lives of many Chinese citizens, and it warrants a strong public-health response. It’s just important to remember that, if you’re concerned about viruses, the flu kills far more people each year, and a lot of those deaths are preventable.

  - “When we think about the relative danger of this new coronavirus and influenza … coronavirus will be a blip on the horizon in comparison,” William Schaffner, a professor of preventive medicine at Vanderbilt University, told Kaiser Health News.

Beware the Pandemic Panic
By Farhad Manjoo

You might wonder what the great harm is — if the government needs to temporarily limit people’s movements or prevent some people from entering the country, shouldn’t it take those measures in the face of dangerous illness? Perhaps, but given the checkered history of quarantines — throughout history, they have been used to persecute the marginalized — lawmakers and the media should rigorously examine the bases for any such restrictions. We should especially make sure any restrictions imposed are indeed temporary and adhere to the science — something we don’t always do. Consider that 30 years after the AIDS outbreak, men who have sex with men are still restricted from donating blood in the United States, long after the scientific basis for such a ban has passed.

So far, President Trump has offered a measured response to the virus. “We have it totally under control,” he said this week. But online, misinformation about the scale of the virus is already trending. (It is not, as you might have read on Twitter, “thermonuclear pandemic level bad.”) So are racist memes blaming Chinese people and Chinese culture for the virus.

I fear that the conditions are ripe for a situation similar to what occurred in the summer of 2014, when an outbreak of Ebola overran West Africa. After the Obama administration scrambled to bring home two American health workers who had become infected with the disease in Liberia, Trump went on a monthslong Twitter tirade about Ebola.

The future president favored extreme isolationism, seeing no benefit to American help in Africa. “People that go to far away places to help out are great-but must suffer the consequences!” he wrote — ultimately sparking a growing partisan movement against the government’s response to the virus.

Today, Trump runs an administration that is hemorrhaging scientific expertise, and his political agenda is rife with efforts to target immigrants, minorities and the poor. In 2016, a terrorist attack prompted Trump to propose banning Muslims from the country; when he won the White House, he instituted a version of that ban.

I fear that panic about a foreign virus offers society another chance to target marginalized people. So let’s keep our fear in check; panic will hurt us far more than it’ll help.

The Cognitive Bias That Makes Us Panic About Coronavirus
By Cass R. Sunstein

… A lot of people are more scared than they have any reason to be. They have an exaggerated sense of their own personal risk.

How come?

The best answer goes by an unlovely name: “probability neglect.” Suppose that a potential outcome grips your emotions, maybe because it is absolutely terrifying, maybe because it is amazingly wonderful. If so, there is an excellent chance that you will focus on it — and pay far less attention than you should to a crucial question, which is how likely it is to occur.

One of the simplest and most vivid demonstrations comes from Christopher Hsee of the University of Chicago and Yuval Rottenstreich of the University of California at San Diego. They asked a group of people how much they would pay to avoid a 1% chance of a “short, painful, but not dangerous electric shock.” They asked another, similar group of people how much they would pay to avoid a 99% chance of getting such a shock.

There’s a massive difference between a 1% chance and a 99% chance. But people didn’t register that difference. To avoid a 1% chance of an electric shock, the median amount that people were willing to pay was $7. To avoid a 99% chance, the number was $10 – not a whole lot higher.

Hsee and Rottenstreich contend that when an outcome triggers strong negative emotions, people tend not to think a whole lot about the issue of probability.

Their argument is supported by their finding that for ordinary gambles involving small sums of money, people are far more sensitive to probability than in the case of electric shocks. The median person would pay $1 to avoid a 1% chance of losing $20 – and $18 to avoid a 99% chance of losing $20.

Something similar happens when an outcome triggers strong positive emotions. That’s one reason that state lotteries make so much money. True, people are highly unlikely to win. But vivid advertisements, pointing to the amazing things that lottery winners can do, are highly effective, and for one reason: The advertisers benefit from probability neglect, and so get a lot of people to waste their money on lottery tickets.

Turn to the coronavirus in this light. The situation is very fluid, but as of now, most people in North America and Europe do not need to worry much about the risk of contracting the disease. That’s true even for people who are traveling to nations such as Italy that have seen outbreaks of the disease.

Still, the disease is new, and it can be fatal. That’s more than enough to trigger probability neglect.

The Economic Hit From Coronavirus Is All in Your Mind
By Daniel Moss

Hindsight can be an asset during an epidemic: Lessons from the past help steer public decision-making and avoid repeating mistakes. Unfortunately, rearview mirrors appear to be in short supply these days.

“Individuals, under prevailing circumstances of poor information and stress, can arrive at biased subjective assessments concerning the risk of disease contraction,” Ilan Noy and Sharlan Shields of Victoria University of Wellington in New Zealand wrote in the ADB paper. “This leads to panic and suboptimal decisions, which in turn result in an excessively high cost.”

To limit the impact on growth, then, leaders need to think carefully about how to minimize our natural impulse to be afraid.

How Fear Distorts Our Thinking About the Coronavirus
By David DeSteno

The mix of miscalibrated emotion and limited knowledge, the exact situation in which many people now find themselves with respect to the coronavirus, can set in motion a worsening spiral of irrational behavior. As news about the virus’s toll in China stokes our fears, it makes us not only more worried than we need be about contracting it, but also more susceptible to embracing fake claims and potentially problematic, hostile or fearful attitudes toward those around us — claims and attitudes that in turn reinforce our fear and amp up the cycle.

So how to fix the problem? Again, the solution isn’t to try to think more carefully about the situation. Most people don’t possess the medical knowledge to know how and when to best address viral epidemics, and as a result, their emotions hold undue sway. Rather, the solution is to trust data-informed expertise. But in today’s world, I worry a firm trust in expertise is lacking, making us too much the victim of fear.

Don’t trust the psychologists on coronavirus
By Stuart Ritchie

Psychologists haven’t had a great few years. First there was the “replication crisis”, which kicked off in about 2011 and involved the gradual realisation that many of our best-known, most-hyped results couldn’t be repeated in independent experiments. Then there were the revelations that the American Psychological Association, one of the field’s most important professional bodies, had colluded with the US government on its torture programme during the Iraq War, then attempted to cover it up.

Then some of the most famous studies from social psychology’s 1970s heyday fell apart on closer scrutiny. The Stanford Prison Experiment, where people were assigned roles as “prisoners” and “guards”, and the guards ended up treating the prisoners abominably? Probably misreported. The study where “pseudopatients” admitted themselves to psychiatric hospitals, acted entirely sane, and were locked up and medicated regardless? Possibly fraudulent.

Now, psychologists are disgracing themselves anew over the coronavirus.

It started with articles relying on psychological “insights” to downplay the severity of the problem. In early February, social psychologist David DeSteno wrote a piece in the New York Times arguing that people get so caught up with their fear of the virus, they fail to understand that they’re unlikely to get it. Referencing some of his own lab experiments, DeSteno wrote that “…quarantine or monitoring policies can make great sense when the threat is real and the policies are based on accurate data. But the facts on the ground, as opposed to the fear in the air, don’t warrant such actions.”

Two days later, the New York Times’s Interpreter column quoted psychologist Paul Slovic, who noted that “[o]ur feelings don’t do arithmetic very well”, and that focusing on the coronavirus fatalities, and not the “98% or so of people who are recovering from it and may have mild cases” is skewing our judgement. The article argued that our fears, triggered by disturbing reports of “city-scale lockdowns and overcrowded hospitals”, overload our critical faculties, making us overreact to the threat the virus poses. The thought that those city-scale lockdowns and overcrowded hospitals might be a mere month away from the United States didn’t seem to occur.

Further psychological insights were provided by Cass Sunstein, co-author of the best-selling book Nudge, which used lessons from behavioural economics (essentially psychology by another name) that could inform attempts to change people’s behaviour. In an article for Bloomberg Opinion on 28 February (by which point there were over 83,000 confirmed coronavirus cases worldwide), Sunstein wrote that anxiety regarding the coronavirus pandemic was mainly due to something called “probability neglect”.

Because the disease is both novel and potentially fatal, Sunstein reasoned, we suffer from “excessive fear” and neglect the fact that our probability of getting it is low. “Unless the disease is contained in the near future,” he continued, “it will induce much more fear, and much more in the way of economic and social dislocation, than is warranted by the actual risk”.

Sunstein himself appears to have undergone something of a Damascene corona-conversion. On 26 March, he wrote another column for Bloomberg Opinion, entitled “This Time the Numbers Show We Can’t Be Too Careful”. In an argument that was directly opposed to his own reasoning of a month earlier, Sunstein suggested that the potentially ruinous effects of the pandemic mean that “the benefits of aggressive social distancing greatly exceed the cost”. Remarkably, he made no reference whatsoever to his previous article on this specific crisis — an article which, as the writer Ari Schulman has pointed out, had said that if you held the view Sunstein now holds, you were the sad victim of a cognitive malfunction.

There should never be heroes in science
By Stuart Ritchie

It’s fair to say that Stanford University’s John Ioannidis is a hero of mine. He’s the medical researcher who made waves in 2005 with a paper carrying the firecracker title “Why Most Published Research Findings are False”, and who has published an eye-watering number of papers outlining problems in clinical trials, economics, psychology, statistics, nutrition research and more.

Like Eysenck, he’s been a critic of meta-analysis: in a 2016 paper, he argued that scientists were cranking out far too many such analyses — not only because of the phenomenon of Garbage-In-Garbage-Out, but because the meta-analyses themselves are done poorly. He’s also argued that we should be much more transparent about conflicts of interest in research: even about conflicts we wouldn’t normally think of, such as nutrition researchers being biased towards finding health effects of a particular diet because it’s the one that they themselves follow.

First, in mid-March, as the pandemic was making its way to America, Ioannidis wrote an article for STAT News where he argued that we should avoid rushing into big decisions like country-wide lockdowns without what he called “reliable data” on the virus. The most memorable part of the article was his prediction — on the basis of his analysis of the cursed cruise ship Diamond Princess — that around 10,000 people in the US would die from COVID-19 — a number that, he said, “is buried within the noise of the estimate of deaths from ‘influenza-like illness’”. As US deaths have just hit 125,000, I don’t need to emphasise how wrong that prediction was.

So far, so fair enough: everyone makes bad predictions sometimes. But some weeks later, it emerged that Ioannidis had helped co-author the infamous Santa Clara County study, where Stanford researchers estimated that the number of people who had been infected with the coronavirus was considerably higher than had been previously supposed. The message was that the “infection fatality rate” of the virus (the proportion of people who, once infected, die from the disease), must be very low, since the death rate had to be divided across a much larger number of infections. The study became extremely popular in anti-lockdown quarters and in the Right-wing populist media. The virus is hardly a threat, they argued — lift the lockdown now!

But the study had serious problems. When you do a study of the prevalence of a virus, your sample needs to be as random as possible. Here, though, the researchers had recruited participants using Facebook and via email, emphasising that they could get a test if they signed up to the study. In this way, it’s probable that they recruited disproportionate numbers of people who were worried they were (or had been) infected, and who thus wanted a test. If so, the study was fundamentally broken, with an artificially high COVID infection rate that didn’t represent the real population level of the virus (there were also other issues relating to the false-positive rate of the test they used).

Then, an investigation by Stephanie Lee of BuzzFeed News revealed that the study had been part-funded by David Neeleman, the founder of the airline JetBlue — a company that would certainly have benefited from a shorter lockdown. Lee reported that Neeleman appeared to have been in direct contact with Ioannidis and the other Stanford researchers while the study was going on, and knew about their conclusions before they published their paper. Even if these conversations didn’t influence the conduct of the study in any way (as argued by Ioannidis and his co-authors), it was certainly odd — given Ioannidis’s record of advocating for radical transparency — that none of this was mentioned in the paper, even just to be safe.

Ioannidis didn’t stop there. He then did his own meta-analysis of prevalence studies, in an attempt to estimate the true infection fatality rate of the virus. His conclusion — once again — was that the infection fatality rate wasn’t far off that for the flu. But he had included flawed studies like his own one from Santa Clara, as well as several studies of the prevalence that only included young people — biasing the death rate substantially downwards and, again, not representing the rate in the population (several other issues are noted in a critique by the epidemiologist Hilda Bastian). That German accent you can hear faintly in the background is the ghost of Hans Eysenck, warning us about the “mega-silliness” of meta-analysing low-quality studies.

His most recent contribution is an article on forecasting COVID-19, upbraiding the researchers and politicians who predicted doomsday scenarios with overwhelmed hospitals. His own drastic under-prediction of 10,000 US deaths? Not mentioned once.

Although Ioannidis has at least sounded as if he’s glad to receive criticism, some of his discussion of the more mainstream epidemiological models has sounded Eysenckian — for instance, where he described the Imperial College model of the pandemic as having been “astronomically wrong”. There is, of course, a genuine debate to be had on how and when we should lift our lockdowns. There’s also a great deal that we don’t know about the virus (though more reliable estimates suggest, contra Ioannidis, that its infection fatality rate is many times higher than that of the flu). But Ioannidis’s constant string of findings that all confirm his initial belief — that the virus is far less dangerous than scientists are telling you — gives the impression of someone who has taken a position and is now simply defending it against all comers.

And for that reason, it’s an important reminder of what we often forget: scientists are human beings, and are subject to very human flaws. Most notably, they’re subject to bias, and a strong aversion to having their cherished theories proved wrong. The fact that Ioannidis, the world’s most famous sceptic of science, is himself subject to this bias is the strongest possible confirmation of its psychological power. The Eysenck and Ioannidis stories differ in very many ways, but they both tell us how contrarianism and iconoclasm — both crucial forces for the process of constant scepticism that science needs to progress — can go too far, leading researchers not to back down, but to double-down in the face of valid criticism.

Above, I should really have said that John Ioannidis was a hero of mine. Because this whole episode has reminded me that those self-critical, self-correcting principles of science simply don’t allow for hero-worship. Even the strongest critics of science need themselves to be criticised; those who raise the biggest questions about the way we do research need themselves to be questioned. Healthy science needs a whole community of sceptics, all constantly arguing with one another — and it helps if they’re willing to admit their own mistakes. Who watches the watchmen in science? The answer is, or at least should be: all of us.

Do We Want to Be Credible or Incredible?
By Simine Vazire

How is it possible that almost 9 out of 10 Americans do not agree that medical researchers admit and take responsibility for their mistakes, yet 86% trust science? One clue is the finding, from the same Pew survey, that 57% of Americans say they would trust research more when the data are openly available (vs. 8% who say they would trust it less and 34% who say it makes no difference). The public doesn’t trust us as individuals, but they trust science because of the expectation of transparency and accountability. If we continue to make transparency and quality control optional — which we effectively do when we continue to give our seal of approval (and put out press releases) for research that is not transparent and has not passed through careful scrutiny — we are putting our long-term credibility in jeopardy. We may score more points in the short term by putting out more frequent and dramatic headlines, but we risk losing credibility in the long term when the public realizes we don’t make transparency and verification requirements for endorsing such claims.

Are Protests Dangerous? What Experts Say May Depend on Who’s Protesting What
By Michael Powell

Some public health scientists publicly waved off the conflicted feelings of their colleagues, saying the country now confronts a stark moral choice. The letter signed by more than 1,300 epidemiologists and health workers urged Americans to adopt a “consciously anti-racist” stance and framed the difference between the anti-lockdown demonstrators and the protesters in moral, ideological and racial terms.

Protests against stay-at-home orders were “rooted in white nationalism and run contrary to respect for Black lives,” the letter stated.

By contrast, it said, those protesting systemic racism “must be supported.”

“As public health advocates,” they stated, “we do not condemn these gatherings as risky for Covid-19 transmission. We support them as vital to the national public health.”

There is as yet no firm evidence that protests against police violence led to noticeable spikes in infection rates. A study published by the National Bureau of Economic Research found no overall rise in infections, but could not rule out that infections might have risen in the age demographic of the protesters. Health officials in Houston and Los Angeles have suggested the demonstrations there led to increased infections, but they have not provided data. In New York City, Mayor Bill de Blasio has instructed contact tracers not to ask if infected people attended protests.

The ten epidemiologists interviewed for this article said near-daily marches and rallies are nearly certain to result in some transmission. Police use of tear gas and pepper spray, and crowding protesters into police vans and buses, puts people further at risk.

“In all likelihood, some infections occurred at the protests; the question is how much,” said Professor Lurie. “No major new evidence has emerged that suggests the protests were superspreader events.”

The coronavirus has infected 2.89 million Americans, and at least 129,800 have died.

The virus has hit Black and Latino Americans with a particular ferocity, hospitalizing those populations at more than four times the rate of white Americans. Many face underlying health issues, and are more likely than most Americans to live in densely populated housing and to work on the front lines of this epidemic. As a result, Latinos and Black people are dying at rates well in excess of white Americans.

How the experts messed up on Covid
By Stuart Ritchie and Michael Story

Throughout the pandemic, experts have been all too willing to make claims about the virus that bordered on the hubristic. It’s easy to forget, for example, that the first months of 2020 saw widespread and confident scepticism of any risk from the novel disease that was sweeping Wuhan. As an infamous (and since updated) BuzzFeed article advised readers not to worry about the coronavirus, but instead to worry about the flu, leading political figures found it difficult to resile from their initial “don’t panic!” stance, even as the pandemic began to take hold.

As late as 3 March, New York mayor Bill de Blasio, supported by his health commissioner, told residents to “go on with your lives + get out on the town despite coronavirus”. This plea for normalcy was made on the day that global coronavirus cases reached 90,000, and a mere month before the total Covid deaths in New York City exceeded those of 9/11.

It shouldn’t be damning to make a bad call — nobody gets every prediction right; any professional forecaster will tell you that the best you can hope for is to be wrong less often. But the fundamental problem is the high degree of confidence with which the bad calls were expressed. Stating a position without the attendant uncertainty makes it very difficult, should the situation change, to update your views without losing face. In a crisis, slowing down that view-updating process could cost time, money, and even lives. Even after the update is made, those who heard your original, dogmatically expressed view might have lingering doubts — or might even use it against you.

Few areas better illustrate the pitfalls of expert overconfidence than the question of facemasks.

The initial messaging on masks from the WHO emphasised that masks were needed in medical settings where infected patients were likely to be coughing directly on or near healthcare workers. The message was echoed by the US Surgeon General on Twitter, who exhorted citizens to “STOP BUYING MASKS!”, since they “are NOT effective in preventing [the] general public from catching #Coronavirus”. Both emphasised the need to preserve mask supplies for healthcare workers.

This might have made sense given the assumption that the coronavirus behaved like influenza — the model disease most countries were using to plan for a pandemic. If that was the case, it was thought, masks would likely be ineffective at preventing transmission outside of hospitals. Indeed, studies of facemask use for influenza had found mixed results. This, coupled with a norm that health organisations should require ultra-solid evidence before making recommendations, somewhat paradoxically meant that the “masks don’t work” message became ever-more-confidently projected — even as the evidence behind it looked shakier and shakier.

In the UK, the public health community embraced the anti-mask message more strongly than most. For example, in care homes, where around half of the UK’s Covid deaths have occurred, staff were informed by the government in February that “face masks do not provide protection from respiratory viruses such as Covid-19 and do not need to be worn by staff” (incidentally, the following statement from the same document — “[i]t remains very unlikely that people receiving care in a care home or the community will be infected” — stands as one of the most tragic of the entire pandemic).

On 4 March, Chief Medical Officer Chris Whitty argued masks would reduce the risks of the non-infected “almost not at all” and said he would not advise wearing them. One medical website noted that its readers might see people wearing masks out and about. “Don’t worry if you haven’t bought one”, they wrote. “The masks are fairly ineffective for the average person. Only people caring for infected people and the infected people themselves needs [sic] to wear masks.” Even volunteer sewing groups were asked to make morale-boosting patterned scrubs for NHS staff instead of using their skills to boost the mask supply.

The anti-mask campaign soon escalated beyond messaging. In early March, two businesses selling facemasks were banned from producing adverts that claimed their products offered protection from “viruses, bacteria, and other air pollutants”. The adverts, it was said, were “likely to cause fear”. This was on the advice of Public Health England, which not only declined to recommend masks but claimed they might raise the risk of transmission, since they were “likely to reduce compliance with good universal hygiene behaviours”. Professor Stephen Powis, the National Medical Director of NHS England, said that the firms selling facemasks that linked their product to the pandemic were “callous” and “outright dangerous”, and that advertising masks in this way had “rightly been banned”.

The campaign to keep British faces uncovered culminated in a video released by the UK Government on 11 March, where the Deputy Chief Medical Officer, Jenny Harries, told the Prime Minister in no uncertain terms that wearing a facemask was “not a good idea and doesn’t help”.

From the vantage point of mid-October, this seems more than a little odd. Masks are now mandated on public transport and in shops, are worn prominently in public by politicians and their public health advisers, and have become a normal part of life. The US Surgeon General who had so vehemently decried the buying of masks on Twitter now has a picture of himself wearing one at the top of his page. Once again, the issue was not necessarily that the original messaging was wrong — as we stated above, the evidence was patchy, and decisions still have to be made.

But the sheer effort undertaken to double down on what could only ever have been an uncertain message helped to narrow the range of options, and likely slowed the eventual change in policy as studies, models and reviews arrived to bolster the case for masks. Overall, while the data is far from knock-down, and there hasn’t yet been time to run and publish high-quality randomised trials, the observational and other evidence does point towards a protective effect of masks for this disease.

Had the evidence for the experts’ views been rock-solid, some impatience with public disagreement might have been justified — particularly given the emergency situation. As it was, though, those who even gently queried the evidence quickly became subject to ridicule.

In a difficult-to-watch clip from Jeremy Vine’s Channel 5 show in mid-March, the model and businesswoman Caprice Bourret was upbraided by the show’s medical expert, Dr. Sarah Jarvis, after suggesting that a country-wide lockdown might be helpful. Jarvis perfectly encapsulated the expert attitude with which we’re familiar, telling Caprice that “unless you have read every scientific paper… you cannot argue with me on that. You can have an opinion, but it’s not a fact.” Undeterred, Caprice went on to note that in some East Asian countries that had enjoyed (and still enjoy) a better Covid trajectory than the UK, a large proportion of citizens wore surgical masks. “…which make no difference at all”, Jarvis snapped in rebuttal, laughing at Caprice’s attempts to respond.

Other experts followed suit in suggesting that coronavirus worries — and in particular, disagreements with expert advice — were indications of ignorance, or of low-quality thinking. The American Psychological Association’s Senior Director of Practice, Research and Policy, Lynn Bufka, told Time magazine in March that mask-wearing was a “superstitious behaviour”: “Even if experts are saying it’s really not going to make a difference,” she argued, “a little [part of] people’s brains is thinking, well, it’s not going to hurt.”

Research that looks at people’s beliefs about Covid often bolsters the narrative of “the experts” versus “the unthinking”. A recent paper from academics at Cambridge surveyed people’s beliefs in Covid misinformation. Some of the beliefs they examined were bonkers conspiracy theories (“the new 5G network may be making us more susceptible to the virus”), and some were just daft (“breathing in hot air… (e.g. from a hair dryer) kills the coronavirus as it can only live in cool places”). They were all taken from the WHO’s Covid “Myth Busters” page, and they are all indeed myths that it would be silly to believe. But it is hard not to find it ironic that, at the time these academics accessed its website, the World Health Organisation was spreading its own confident misinformation: just one click away was the WHO’s page telling people not just that they shouldn’t wear masks if they weren’t directly looking after a Covid patient, but that it might actively be dangerous to do so.

Imagine travelling back in time to various points in 2020, knowing everything we now know about the coronavirus and its spread. In February, as you warned the world of the grimness to come, some experts would have told you that you were suffering from “probability neglect”, and that you should be more worried about the flu. In March, your view that masks would be helpful in controlling the disease would have marked you down as an agent of misinformation — psychologists might have suggested techniques to cope with your superstitious thinking, while Public Health England and the WHO might have said you were increasing the risk of spreading the virus.

Equally, if a time-traveller with the approved public health opinions of March 2020 journeyed just seven months into the future to the present day, their mainstream anti-mask position would be viewed with similar derision: they would perhaps have their thinking explained with regard to their gender, or might even be labelled a likely sociopath.

The “expert view” vs “the misinformation” is a false dichotomy. It can be tempting to offer only certainty in times of crisis — but part of being an expert is knowing when to be uncertain, and being strongly aware of the provisional nature of our knowledge. Trustworthiness doesn’t just come from being right, but from communicating the limits of the evidence, and regularly updating one’s view in light of new data and analysis.

Overconfidence from the experts, coupled with a willingness to denigrate and even pathologise those who publicly dissented, might have made it harder for us to change course during the pandemic, costing us precious time that we couldn’t afford. If experts fail to reckon with the inevitable uncertainties of our current times, we risk delaying the next crucial update — or worse, overlooking it altogether.

Nassim Nicholas Taleb on the Pandemic
By EconTalk

Russ Roberts: So, back in January of this year, January 26, you wrote a paper with Yaneer Bar-Yam and Joe Norman warning about the pandemic. And, it got some attention. But, if you were in your shoes then, looking back at that moment, which was–in America, nobody really was paying any attention to this–what were you hoping would happen when you wrote that paper? Besides raising an alarm, which you did.

Nassim Nicholas Taleb: Okay. No, no, no. Two things, and it was depressing that we had to do that. Our team was basically monitoring for pandemics because, in my opinion, they are the fattest tails, and fat tails means that these things could really go out of control. And, we haven’t had many pandemics in history, but we’ve had very lethal ones. They represent an existential risk. And, we knew that, so you have to stop some things in the egg, when it’s cheap to do so.

And, when Ebola started, people were not reacting to it properly. They were comparing a process that is multiplicative to risks that were in a Gaussian distribution. So, here we had fat-tailed risk–Ebola or any pandemic–versus people dying in their swimming pool, people falling from ladders, and stupidities like that. They do not compare–

Russ Roberts: Rare events–

Nassim Nicholas Taleb: Yeah, but you cannot compare these two, because one is multiplicative, and then the other one follows what I call ‘Chernoff bound.’ So–

Russ Roberts: Yeah: when I drown in my bathtub, you don’t drown in your bathtub as a result.

Nassim Nicholas Taleb: Exactly. Exactly. So, they’re not multiplicative.

And, then the way to look at it is they have extreme value properties that are very different. If I tell you a billion people died, you know they did not die drowning in their bathtub in any given year. Whereas if I tell you, ‘A billion people died. Try to guess,’ odds are it’s going to be either a nuclear war or a pandemic. More likely to be a pandemic. It’s actually the fattest tail.

So, just to alert people to the nature of pandemics, that’s the thing we have to worry about the most. And, we failed during Ebola because people kept using–especially psychologists–rationality arguments.

And, our argument was twofold. One, the multiplicative notion; and the second one is that rationality point that I just made–that rationality doesn’t scale.

And, this is completely absent: the fat-tailedness and the scaling are absent from the Decision Science literature, which effectively makes people doing the right thing look like they’re irrational.

Russ Roberts: Right. And, no one wants to look irrational.

Astrologer Chani Nicholas Says 2021 Will Still Be Tough, but There’s Hope
By Claire Landsbaum

When we spoke early in January, astrologer Chani Nicholas warned me about 2020. Saturn would move into conjunction with Pluto in the beginning of the year, she said, a uniquely inhospitable planetary placement that would lead to some turbulence, to put it mildly. She mentioned this in passing, but the comment stuck with me as the ensuing months unfolded in a veritable yellow brick road of flaming shit, leaving deep gouges in our collective human psyche. She also mentioned, in passing, that 2021 would bring its own set of challenges.

Obviously you don’t know specifics when you’re looking at how things will play out, but you have a rough idea. What is it like to watch it all unfold in real time?

I think Chris Brennan said this on Twitter once: It’s like looking at a shadow in the distance and thinking, That looks ominous. I wonder what it’s going to be like. And then you approach it, and you’re like, Oh, my God, you are a horrific monster. Or, Okay, I get it now. You know the tone, and you know these years are going to be challenging and these months specifically are going to be really challenging, but you don’t know exactly what it is until you get up close.

In terms of 2020, Saturn-Pluto conjunctions only happen once every 36 years, and usually when they happen they mark the year in a gruesome way. You can look back at history and be like, What happened last time this happened? Will something in that genre happen again? And you can piece things together that way. The last time Saturn and Pluto came together was in the midst of the AIDS epidemic. It also happened during the bubonic plague. Obviously there’ve been a lot of Saturn-Pluto conjunctions that weren’t associated with a global health crisis, but enough have that some astrologers were like: plague.

The C.R.A.P. Framework for Addressing Workplace Bullshit
By Brian Gallagher

McCarthy and his colleagues, following the philosopher Harry Frankfurt, of On Bullshit fame, tell us that bullshit is not a lie. People utter the former, but not the latter, without regard to truth; the bullshitter is often unaware, whether intentionally or not, of what the relevant facts are. They are just, as we say, “bullshitting,” often in order to persuade others to go along with something—like a plan or an explanation. Lies, on the other hand, are consciously offered instead of the truth; the liar is a calculating deceiver who knows what’s true and obscures it, with language or charts and figures. It is perhaps partly because this distinction so often goes unnoticed that bullshit in the workplace proliferates.

Most people, to greater or lesser degrees, contribute to the problem of workplace bullshit. This is because psychological biases play a significant role in both the defense against, and the dissemination of, bullshit. People believe the bullshit they like in one moment and call out the bullshit they don’t like in the next.

A 2019 paper in Nature titled “Resolving uncertainty in a social world” makes this crystal clear. It describes why people are motivated to reduce the aversive feelings social forms of uncertainty cause. In short, our aversion to being uncertain arises from our ingrained preference for social predictability—not knowing what people are thinking about us, or how they might act toward us (or in ways that may affect us) is a reliable source of anxiety. People tend to be more anxious when the social-prediction space is vast in any given scenario, meaning there are many possibilities to consider and track, each with a low likelihood of arising. On the whole we feel better when we can “prune the prediction space,” as the authors, Oriel FeldmanHall and Amitai Shenhav, put it, or in other words have some sense of what’s true, or what will happen (to us). Which is where bullshit comes in: Bullshit can both make us more uncertain about some issue or relationship at work, because of how it clouds the truth, and it can make us feel more certain, because of how the bullshit appeals to what we think or wish to be true.

Asia Today: Sri Lanka minister who drank potion is positive
By The Associated Press

Sri Lanka’s health minister, who has faced criticism for consuming and endorsing a herbal syrup made by a sorcerer, has tested positive for COVID-19.

A Health Ministry official on Saturday confirmed that Pavithra Wanniarachchi became the highest-ranking official to be infected with the virus. She and her immediate contacts have been asked to self-quarantine.

Doctors have said there is no scientific basis for the syrup as a remedy for the coronavirus. It’s said to contain honey and nutmeg.

Thousands of people gathered in long queues in December in the town of Kegalle, northeast of the capital Colombo, to obtain the syrup, just days after Wanniarachchi and several other government officials publicly consumed it.

The maker of the syrup said he got the formula through his divine powers. In local media, he claimed the Hindu goddess Kaali appeared to him in a dream and gave the recipe to save humanity from the coronavirus.

The Odds of That
By Lisa Belkin

Much religious faith is based on the idea that almost nothing is coincidence; science is an exercise in eliminating the taint of coincidence; police work is often a feint and parry between those trying to prove coincidence and those trying to prove complicity. Without coincidence, there would be few movies worth watching (“Of all the gin joints in all the towns in all the world, she walks into mine”), and literary plots would come grinding to a disappointing halt. (What if Oedipus had not happened to marry his mother? If Javert had not happened to arrive in the town where Valjean was mayor?)

The true meaning of the word is “a surprising concurrence of events, perceived as meaningfully related, with no apparent causal connection.” In other words, pure happenstance. Yet by merely noticing a coincidence, we elevate it to something that transcends its definition as pure chance. We are discomforted by the idea of a random universe. Like Mel Gibson’s character Graham Hess in M. Night Shyamalan’s new movie “Signs,” we want to feel that our lives are governed by a grand plan.

The need is especially strong in an age when paranoia runs rampant. “Coincidence feels like a loss of control perhaps,” says John Allen Paulos, a professor of mathematics at Temple University and the author of “Innumeracy,” the improbable best seller about how Americans don’t understand numbers. Finding a reason or a pattern where none actually exists “makes it less frightening,” he says, because events get placed in the realm of the logical. “Believing in fate, or even conspiracy, can sometimes be more comforting than facing the fact that sometimes things just happen.”

Robert J. Tibshirani, a statistician at Stanford University who proved that it was probably not coincidence that accident rates increase when people simultaneously drive and talk on a cellphone, leading some states to ban the practice, uses the example of a hand of poker. “The chance of getting a royal flush is very low,” he says, “and if you were to get a royal flush, you would be surprised. But the chance of any hand in poker is low. You just don’t notice when you get all the others; you notice when you get the royal flush.”
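Tibshirani’s poker numbers are easy to check directly. A minimal sketch using only Python’s standard library (the variable names are mine, not the article’s):

```python
from math import comb

total_hands = comb(52, 5)   # distinct five-card hands: 2,598,960
royal_flushes = 4           # one royal flush per suit
p_royal = royal_flushes / total_hands

# Roughly 1 in 650,000. But any one *specific* hand is rarer still,
# at exactly 1 in 2,598,960. We simply don't notice the others.
```

The point stands exactly as Tibshirani makes it: every particular hand is astonishingly unlikely; only the royal flush gets noticed.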

When these professors talk, they do so slowly, aware that what they are saying is deeply counterintuitive. No sooner have they finished explaining that the world is huge and that any number of unlikely things are likely to happen than they shift gears and explain that the world is also quite small, which explains an entire other type of coincidence. One relatively simple example of this is “the birthday problem.” There are as many as 366 days in a year (accounting for leap years), and so you would have to assemble 367 people in a room to absolutely guarantee that two of them have the same birthday. But how many people would you need in that room for a 50 percent chance of at least one birthday match?
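The question above has a standard answer, and the arithmetic is short enough to verify. A minimal sketch in Python, assuming 365 equally likely birthdays and ignoring leap years:

```python
def prob_shared_birthday(n: int, days: int = 365) -> float:
    """Probability that at least two of n people share a birthday."""
    # Multiply the chances that each successive person avoids
    # all the birthdays already taken, then take the complement.
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct
```

With 22 people the probability is still just under one half; at 23 it reaches about 50.7 percent, which is why 23 is the standard answer.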

As a species, we appear to be biologically programmed to see patterns and conspiracies, and this tendency increases when we sense that we’re in danger. “We are hard-wired to overreact to coincidences,” says Persi Diaconis. “It goes back to primitive man. You look in the bush, it looks like stripes, you’d better get out of there before you determine the odds that you’re looking at a tiger. The cost of being flattened by the tiger is high. Right now, people are noticing any kind of odd behavior and being nervous about it.”

Adds John Allen Paulos: “Human beings are pattern-seeking animals. It might just be part of our biology that conspires to make coincidences more meaningful than they really are. Look at the natural world of rocks and plants and rivers: it doesn’t offer much evidence for superfluous coincidences, but primitive man had to be alert to all anomalies and respond to them as if they were real.”

For decades, all academic talk of coincidence has been in the context of the mathematical. New work by scientists like Joshua B. Tenenbaum, an assistant professor in the department of brain and cognitive sciences at M.I.T., is bringing coincidence into the realm of human cognition. Finding connections is not only the way we react to the extraordinary, Tenenbaum postulates, but also the way we make sense of our ordinary world. “Coincidences are a window into how we learn about things,” he says. “They show us how minds derive richly textured knowledge from limited situations.”

To put it another way, our reaction to coincidence shows how our brains fill in the factual blanks. In an optical illusion, he explains, our brain fills the gaps, and although people take it for granted that seeing is believing, optical illusions prove that’s not true. “Illusions also prove that our brain is capable of imposing structure on the world,” he says. “One of the things our brain is designed to do is infer the causal structure of the world from limited information.”

If not for this ability, he says, a child could not learn to speak. A child sees a conspiracy, he says, in that others around him are obviously communicating and it is up to the child to decode the method. But these same mechanisms can misfire, he warns. They were well suited to a time of cavemen and tigers and can be overloaded in our highly complex world. “It’s why we have the urge to work everything into one big grand scheme,” he says. “We do like to weave things together.

“But have we evolved into fundamentally rational or fundamentally irrational creatures? That is one of the central questions.”

We pride ourselves on being independent and original, and yet our reactions to nearly everything can be plotted along a predictable spectrum. When the grid is coincidences, one end of the scale is for those who believe that these are entertaining events with no meaning; at the other end are those who believe that coincidence is never an accident.

“We forget all the times that nothing happens,” says Ruma Falk, a professor emeritus of psychology at the Hebrew University in Jerusalem, who studied years ago with Tversky. “Dreams are another example,” Falk says. “We dream a lot. Every night and every morning. But it sometimes happens that the next day something reminds you of that dream. Then you think it was a premonition.”

One of the many experiments she has conducted since then proceeded as follows: she visited several large university classes, with a total of 200 students, and asked each student to write his or her birth date on a card. She then quietly sorted the cards and found the handful of birthdays that students had in common. Falk wrote those dates on the blackboard. April 10, for instance, Nov. 8, Dec. 16. She then handed out a second card and asked all the students to use a scale to rate how surprised they were by these coincidences.

The cards were numbered, so Falk could determine which answers came from respondents who found their own birth date written on the board. Those in that subgroup were consistently more surprised by the coincidence than the rest of the students. “It shows the stupid power of personal involvement,” Falk says.

The fact that personal attachment adds significance to an event is the reason we tend to react so strongly to the coincidences surrounding Sept. 11. In a deep and lasting way, that tragedy feels as if it happened to us all.

Falk’s findings also shed light on the countless times that pockets of the general public find themselves at odds with authorities and statisticians. Her results might explain, for instance, why lupus patients are certain their breast implants are the reason for their illness, despite the fact that epidemiologists conclude there is no link, or why parents of autistic children are resolute in their belief that childhood immunizations or environmental toxins or a host of other suspected pathogens are the cause, even though experts are skeptical. They might also explain the outrage of all the patients who are certain they live in a cancer cluster, but who have been told otherwise by researchers.

The numbers 9/11 (9 plus 1 plus 1) equal 11, and American Airlines Flight 11 was the first to hit the twin towers, and there were 92 people on board (9 plus 2), and Sept. 11 is the 254th day of the year (2 plus 5 plus 4).

What of the deaths of nearly a dozen scientists? Is it really possible that they all just happened to die, most in such jarring ways, within so short a time?

Presidents Kennedy and Lincoln were elected 100 years apart. Both men were succeeded by Johnsons, who were also born 100 years apart. Their names each contain seven letters. Their successors’ names each contain 13 letters. Their assassins’ names each contain 15 letters.

In a room of 23 people, the odds are even that two of them will have the same birthday. Reduce the occupants to 14, and it’s even money that two of them were born within one day of each other. And for a 50-50 chance of finding a pair with birthdays a week apart, you need to invite only seven to the party.

No One Knows What’s Going to Happen
By Mark Lilla

Ancient augurs and prophets were in high-risk professions. When their predictions failed to materialize, many were executed by sovereigns or pulled apart by mobs. We see a bloodless version of this reaction today in the public’s declining confidence in both the news media and the government.

Take a banal example: snowstorms and school closings. A half century ago, when meteorological forecasting was less sophisticated, parents and children would not learn that classes were canceled until the storm began and it was announced on radio and television that very morning. We lived in harmless uncertainty, which for kids was thrilling. When snowflakes fell they even looked like manna from heaven.

Today, mayors and school superintendents, putting their faith in the meteorologists, routinely announce closings a day or more in advance. If the storm fails to arrive, though, they are sharply criticized by parents who lost a day of work or had to find day care. And if an unforeseen storm paralyzes the city, leaving streets unsalted and children stranded at school, the reaction is far worse. More than one mayor has lost a re-election bid because of failed prophecies, victim of our collective overconfidence in human foresight.

Apart from the actual biology of the coronavirus — which we are only beginning to understand — nothing is predestined. How many people fall ill with it depends on how they behave, how we test them, how we treat them and how lucky we are in developing a vaccine.

The result of those decisions will then limit the choices about reopening that employers, mayors, university presidents and sports club owners are facing. Their decisions will then feed back into our own decisions, including whom we choose for president this November. And the results of that election will have the largest impact on what the next four years will hold.

The pandemic has brought home just how great a responsibility we bear toward the future, and also how inadequate our knowledge is for making wise decisions and anticipating consequences. Perhaps that is why our prophets and augurs can’t keep up with the demand for foresight.

At some level, people must be thinking that the more they learn about what is predetermined, the more control they will have. This is an illusion. Human beings want to feel that they are on a power walk into the future, when in fact we are always just tapping our canes on the pavement in the fog.

Predictable Surprises: The Disasters You Should Have Seen Coming
By Michael D. Watkins and Max H. Bazerman

When fanatics commandeered jetliners on September 11, 2001, and steered them into buildings full of people, it came as a horrifying shock to most of the world. But however difficult it might have been to imagine individuals carrying out such an act, it shouldn’t have been a surprise. Portents had been building up for years. It was well known that Islamic militants were willing to become martyrs for their cause and that their hatred and aggression toward the United States had been mounting throughout the 1990s. In 1993, terrorists set off a car bomb under the World Trade Center in an attempt to destroy the building. In 1995, other terrorists hijacked an Air France plane and made an aborted attempt to fly it into the Eiffel Tower. Also in 1995, the U.S. government learned of a failed Islamic terrorist plot to simultaneously hijack 11 U.S. commercial airplanes over the Pacific Ocean and then crash a light plane filled with explosives into the CIA’s headquarters near Washington, DC. Meanwhile, dozens of federal reports, including one issued by then Vice President Al Gore’s special commission on aviation security, provided comprehensive evidence that the U.S. aviation security system was full of holes. Anyone who flew on a regular basis knew how simple it was to board an airplane with items, such as small knives, that could be used as weapons.

But despite the signals, no precautionary measures were taken. The failure can be traced to lapses in recognition, prioritization, and mobilization. Information that might have been pieced together to highlight the precise contours of the threat remained fragmented among the FBI, the CIA, and other governmental agencies. No one gave priority to plugging the security holes in the aviation system because, psychologically, the substantial and certain short-term costs of fixing the problems loomed far larger than the uncertain long-term costs of inaction. And the organizations responsible for airline security, the airlines, had the wrong incentives, desiring faster, lower-cost screening to boost profitability. Inevitably, plans to fix the system fell afoul of concerted political lobbying by the airline industry.

As coronavirus threatened invasion, a new ‘Red Dawn’ team tried to save America
By Matthew Mosk, Kaitlyn Folmer, and Josh Margolin

A group of public health and national security experts who sent some of the earliest and most dire warnings to officials across the Trump administration about the gathering coronavirus crisis is now offering a searing assessment of how the federal government blundered through the critical first months of a lethal outbreak.

Members of the group, whose lengthy string of emails now read like a chilling foreshadowing of the unfolding deadly pandemic, came to be known by the chain’s dark-humored subject line, “Red Dawn Rising,” a reference to the campy 1984 Cold War movie about a gritty band of Americans who fend off foreign invaders. Now several have broken their silence about the early warnings in interviews with ABC News to describe their lingering distress about the missed chances to spare lives.

“We did not step up and meet the challenge that we needed to meet,” said Dr. Jeffrey Duchin, Seattle-King County Public Health Officer, and a contributor to the email chain. “We didn’t act quickly enough to do the things that we needed to do early enough. And we still are not doing the things we need to do to get this outbreak under control.”

After he started seeing alerts about the mystery illness in China, the Red Dawn members began to “look at these things [and] were giving each other the play by play on what we were hearing and what we were seeing,” Lawler recalled. “And it was obvious very early on, in January, that this had the potential to be a serious global event.”

At the time, the administration was still struggling to interpret the signs from China, said Tom Bossert, an ABC News contributor who was on the Red Dawn email chain and who served as a top Homeland Security Advisor to President Trump.

Bossert, who left the Trump administration in 2018, said government officials were so focused on containing the virus – keeping it from crossing the ocean – they were missing signs that people with no symptoms were capable of circulating it. Trump would announce a ban on most travel from China at the end of January.

“To contain this in China or in Wuhan, that’s a really noble objective,” Bossert said. But that strategy, he said, “didn’t seem to recognize or understand the notion that you can have a lot of sick people, infectious people walking around in any community.”

In those initial weeks, Lawler said the group was just starting their efforts to persuade leaders to look beyond efforts to block the virus from entering the U.S., and in the direction of bracing the public for potentially dramatic lifestyle changes that could slow down the spread.

“These signs were out there pretty early — good indications that asymptomatic infections were occurring and that those people were then able to transmit to others,” Lawler said.

At one point, Fauci was asked to explain why the U.S. government was still so focused on keeping the virus from entering the population, instead of turning more attention to preparing for it to spread.

“That’s the message that is very fine-line sensitive,” Fauci responded. “To let the American people know that, at present, given everything that is going on the risk is really relatively low.”

Branswell told ABC News she remembered being puzzled. And it showed. “Explain to me why the risk is low, somebody?” she responded. “I can’t see why – there’s no force field around China.”

Fauci said his caution stemmed from the fact that, by this point in mid-February, the U.S. had only 13 confirmed cases of coronavirus. But he acknowledged this view could be wrong.

“Is there a risk that this is going to turn into a global pandemic? Absolutely yes,” he said. “There is. There is.”

In an interview with ABC News, Fauci said that, even looking back now, he believes it was “reasonable” to make the assumption that the risk of spread was low, because, at that moment, so few cases had made it across the ocean.

“As a scientist, the thing you must always do is to be humble enough to know that when you get additional information, even information that might conflict what was felt earlier on, you then change your viewpoint and you change your recommendations based on the data that you have at that time,” he said.

“Science is a learning process,” he said. “To think that we knew everything right at the first day that we knew that there was a new virus, I think is just unrealistic.”

Perhaps the biggest challenge confronting federal leaders during a pandemic, Lawler said, is knowing when to acknowledge that it is occurring.

In one of the Red Dawn email exchanges, Lawler chided the assertions by President Trump that the spreading virus would be no worse than a “bad flu.”

Dr. Matthew Hepburn, a U.S. Army infectious disease expert, replied with his advice: “Team, am dealing with a very similar scenario, in terms of not trying to overreact and damage credibility. My argument is that we should treat this as the next pandemic for now, and we can always scale back if the outbreak dissipates, or is not as severe.”

Redfield, the CDC director, described the phenomenon as he experienced it, acknowledging he may have been “lulled” into a false sense of confidence that the virus would be more easily contained.

The CDC responded quickly, he said, when the first person in the U.S. was identified with coronavirus on Jan. 21. That person, Redfield said, had made 50 to 60 contacts before being isolated, and his agency worked hard to evaluate all of them.

“None of them were infected,” he said.

After the CDC had identified 12 more cases involving people traveling into the U.S. from Wuhan, they traced some 850 more people who had been in contact with those travelers.

“We only found two individuals that were infected, and both of them were intimate spouses,” he said. “So initially it didn’t seem like this was infectious-infectious-infectious.”

Lee, the Georgia Tech mathematician, was one of several of the experts who tried to flag the significance of the unfolding outbreak on a cruise ship docked in Yokohama, Japan — the Diamond Princess.

An 80-year-old passenger who became sick while the ship was at sea had disembarked on Jan. 25. His coronavirus diagnosis was confirmed as the ship sailed on for Yokohama. Soon after it arrived on Feb. 3, health officials found 10 more passengers were infected, and the passengers were asked to quarantine on board.

“It was, in a perverse way, a bit of a natural experiment,” Lawler said. “And so, being able to put the pandemic under a microscope and really look at the details of what happens in an enclosed community where you know there’s nobody coming and going.”

To the experts on the email chain, the outcomes were deeply concerning.

In contrast to Redfield’s observations of the first U.S. cases, which appeared to have indicated a slow-moving virus, on the ship it was spreading with stealthy speed. Even passengers who had been confined in their cabins – with virtually no contact with others – were catching it. In a little more than two weeks, the virus had spread to 691 passengers.

“That really brought home to us the potentially explosive transmission that could occur, particularly in that type of enclosed community,” said Lawler, who was dispatched to the ship to help rescue Americans trapped on board and fly with them to be treated.

Those on the Red Dawn email chain tried to signal to federal officials that the cruise ship was a troubling omen for what was to come. Hanfling noted that the data offered a crucial bit of evidence for U.S. officials about the stealthy way the virus was moving. He said a significant percentage of the passengers had tested positive for the virus, even though they had no symptoms.

“I think [that] was the big red flag that the government missed,” Hanfling told ABC News.

Confusion about the potential for people without symptoms to carry and spread the disease was not a U.S. government trademark alone. Well into the outbreak, the World Health Organization and European health officials also issued conflicting statements about the potential.

But the Red Dawn group seized on the issue as critical.

Dr. Carter Mecher, a Department of Veterans Affairs physician who was a frequent contributor to the email chain, wrote on Feb. 28 that he was “worried what happened on the cruise ship is a preview of what will happen when this virus makes its way to the U.S. health care system.”

“I think this data is close enough to convince people that this is going to be bad,” he wrote. “All that’s left is when.”

Duchin, the Seattle health official who was part of the Red Dawn email chain, said he believes some of the solutions were sitting in plain view.

He recalled a report released just recently, in October 2019, by the Nuclear Threat Initiative, a D.C.-based nonprofit organization, and the Johns Hopkins Center for Health Security, that ranked U.S. readiness for a pandemic as one of the best in the world.

“We may have taken false reassurance in that,” Duchin said.

“I think it was clear to us early on that this outbreak was going to be very difficult to manage,” he said. “And that, regardless of how we compare to other nations in surveys and international assessments, we were still not prepared enough to optimally meet this challenge.”

Tim Harford: why we fail to prepare for disasters
By Tim Harford

In 2003, the Harvard Business Review published an article titled “Predictable Surprises: The Disasters You Should Have Seen Coming”. The authors, Max Bazerman and Michael Watkins, both business school professors, followed up with a book of the same title.

Bazerman and Watkins argued that while the world is an unpredictable place, unpredictability is often not the problem. The problem is that faced with clear risks, we still fail to act.

For Watkins, the coronavirus pandemic is the ultimate predictable surprise. “It’s not like this is some new issue,” he says, before sending over the notes for a pandemic response exercise that he ran at Harvard University.

It’s eerily prescient: a shortage of masks; a scramble for social distance; university leaders succumbing to the illness. The date on the document is October 12 2002. We’ve been thinking about pandemics for a long time.

Other warnings have been more prominent. In 2015, Bill Gates gave a TED talk called “The next outbreak? We’re not ready”; 2.5 million people had watched it by the end of 2019. In 2018, the science journalist Ed Yong wrote a piece in The Atlantic titled “The Next Plague Is Coming. Is America Ready?” Now we know the answer, and it wasn’t just the Americans who were unprepared.

Officialdom had also been sounding the alarm. The World Health Organization and the World Bank had convened the Global Preparedness Monitoring Board (GPMB), chaired by Elhadj As Sy of the Red Cross and Gro Harlem Brundtland, a former director of the WHO.

The GPMB published a report in October warning of “a cycle of panic and neglect” and calling for better preparation for “managing the fallout of a high-impact respiratory pathogen”. It noted that a pandemic “akin to the scale and virulence of the one in 1918 would cost the modern economy $3 trillion”.

Alongside these authoritative warnings were the near misses, the direct parallels to Hurricane Ivan: Sars in 2003; two dangerous influenza epidemics, H5N1 in 2006 and H1N1 in 2009; Ebola in 2013; and Mers in 2015. Each deadly outbreak sparked brief and justifiable alarm, followed by a collective shrug of the shoulders.

It is understandable that we have too few doctors, nurses and hospital beds to cope with a pandemic: spare doctors are expensive. It is less clear why we have so few masks, are so unprepared to carry out widespread testing and didn’t do more to develop coronavirus vaccines after the Sars epidemic of 2003, which involved a strain related to the current outbreak. (There was a flurry of activity, but interest waned after 2004.)

We were warned, both by the experts and by reality. Yet on most fronts, we were still caught unprepared. Why?

Wilful blindness is not confined to those in power. The rest of us should acknowledge that we too struggled to grasp what was happening as quickly as we should.

Psychologists describe this inaction in the face of danger as normalcy bias or negative panic. In the face of catastrophe, from the destruction of Pompeii in AD79 to the September 11 2001 attacks on the World Trade Center, people have often been slow to recognise the danger and confused about how to respond. So they do nothing, until it is too late.

Part of the problem may simply be that we get our cues from others. In a famous experiment conducted in the late 1960s, the psychologists Bibb Latané and John Darley pumped smoke into a room in which their subjects were filling in a questionnaire.

When the subject was sitting alone, he or she tended to note the smoke and calmly leave to report it. When subjects were in a group of three, they were much less likely to react: each person remained passive, reassured by the passivity of the others.

As the new coronavirus spread, social cues influenced our behaviour in a similar way. Harrowing reports from China made little impact, even when it became clear that the virus had gone global.

We could see the metaphorical smoke pouring out of the ventilation shaft, and yet we could also see our fellow citizens acting as though nothing was wrong: no stockpiling, no self-distancing, no Wuhan-shake greetings. Then, when the social cues finally came, we all changed our behaviour at once. At that moment, not a roll of toilet paper was to be found.

Normalcy bias and the herd instinct are not the only cognitive shortcuts that lead us astray. Another is optimism bias. Psychologists have known for half a century that people tend to be unreasonably optimistic about their chances of being the victim of a crime, a car accident or a disease, but, in 1980, the psychologist Neil Weinstein sharpened the question.

Was it a case of optimism in general, a feeling that bad things rarely happened to anyone? Or perhaps it was a more egotistical optimism: a sense that while bad things happen, they don’t happen to me.

Weinstein asked more than 250 students to compare themselves to other students. They were asked to ponder pleasant prospects such as a good job or a long life, and vivid risks such as an early heart attack or venereal disease. Overwhelmingly, the students felt that good things were likely to happen to them, while unpleasant fates awaited their peers.

Robert Meyer’s research, set out in The Ostrich Paradox, shows this effect in action as Hurricane Sandy loomed in 2012. He found that coastal residents were well aware of the risks of the storm; they expected even more damage than professional meteorologists did. But they were relaxed, confident that it would be other people who suffered.

While I realise some people are paranoid about catching Covid-19, it’s egotistical optimism that I see in myself. Although I know that millions of people in the UK will catch this disease, my gut instinct, against all logic, is that I won’t be one of them.

Meyer points out that such egotistical optimism is particularly pernicious in the case of an infectious disease. A world full of people with the same instinct is a world full of disease vectors.

We find exponential growth counterintuitive to the point of being baffling — we tend to think of it as a shorthand for “fast”.

An epidemic that doubles in size every three days will turn one case into a thousand within a month — and into a million within two months if the growth does not slow.

In 1975, the psychologists Willem Wagenaar and Sabato Sagaria found that when asked to forecast an exponential process, people often underestimated by a factor of 10. The process in that study was much slower than this epidemic, doubling in 10 months rather than a few days. No wonder we find ourselves overtaken by events.
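The arithmetic behind that doubling claim is easy to verify. A minimal Python sketch (the function name and parameters are illustrative, not from the article):

```python
def cases(days, doubling_time=3, initial=1):
    """Cases after `days` of unchecked exponential growth,
    doubling every `doubling_time` days from `initial` cases."""
    return initial * 2 ** (days / doubling_time)

# One month: 2**10 = 1024, roughly "a thousand"
print(round(cases(30)))   # 1024
# Two months: 2**20 = 1048576, roughly "a million"
print(round(cases(60)))   # 1048576
```

Ten doublings per month is what makes the growth feel like it comes out of nowhere: each month multiplies the count by about a thousand.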

Both Robert Meyer and Michael Watkins made an observation that surprised me: previous near misses such as Sars or Hurricane Ivan don’t necessarily help citizens prepare. It is all too easy for us to draw the wrong lesson, which is that the authorities have it under control. We were fine before and we’ll be fine this time.

This, then, is why you and I did not see this coming: we couldn’t grasp the scale of the threat; we took complacent cues from each other, rather than digesting the logic of the reports from China and Italy; we retained a sunny optimism that no matter how bad things got, we personally would escape harm; we could not grasp what an exponentially growing epidemic really means; and our wishful thinking pushed us to look for reasons to ignore the danger.

While politicians have access to the best advice, they may not feel obliged to take experts seriously. Powerful people, after all, feel sheltered from many everyday concerns.

Heffernan argues that this sense of distance between the powerful and the problem shaped the awful response to Hurricane Katrina.

Leaked emails show the response of Michael Brown, then the director of Fema. One subordinate wrote: “Sir, I know that you know the situation is past critical. Here some things you might not know. Hotels are kicking people out, thousands gathering in the streets with no food or water… dying patients at the DMAT tent being medivac. Estimates are many will die within hours…”

Brown’s response, in its entirety, was: “Thanks for update. Anything specific I need to do or tweak?”

That’s a sense of distance and personal impunity distilled to its purest form.

Because Covid-19 has spread much faster than HIV and is more dangerous than the flu, it is easy to imagine that this is as bad as it is possible to get. It isn’t.

Perhaps this pandemic, like the financial crisis, is a challenge that should make us think laterally, applying the lessons we learn to other dangers, from bioterrorism to climate change.

Or perhaps the threat really is a perfectly predictable surprise: another virus, just like this one, but worse. Imagine an illness as contagious as measles and as virulent as Ebola, a disease that disproportionately kills children rather than the elderly.

What if we’re thinking about this the wrong way? What if instead of seeing Sars as the warning for Covid-19, we should see Covid-19 itself as the warning?

Next time, will we be better prepared?

Chronicle of a Pandemic Foretold
By Michael T. Osterholm and Mark Olshaker

Some are calling the COVID-19 pandemic a once-in-100-year event, comparable to 100-year floods or earthquakes. But the fact that the world is enduring a pandemic right now is no more predictive of when the next one will occur than one roll of dice is of the result of the next roll. (Although the 1918 flu was the most devastating influenza pandemic in history, an 1830–32 outbreak was similarly severe, only in a world with around half of 1918’s population.) The next roll, or the one after that, could really be “the Big One,” and it could make even the current pandemic seem minor by comparison.

When it comes, a novel influenza pandemic could truly bring the entire world to its knees—killing hundreds of millions or more, devastating commerce, destabilizing governments, skewing the course of history for generations to come. Unlike COVID-19, which tends to most seriously affect older people and those with preexisting medical problems, the 1918 influenza took a particularly heavy toll on otherwise healthy men and women between the ages of 18 and 40 (thought to be a result of their more robust immune systems overreacting to the threat through a “cytokine storm”). There is no reason to think that the next big novel influenza pandemic couldn’t have similar results.

The masks and the experts
By Matthew Yglesias

… what I do think we need is for some time in 2021 for the government to order a serious review of what the public health agencies were saying and doing about this stuff and why. Not to point fingers, but in the spirit of the kind of after-action reports the military routinely does. Because clearly something went wrong and there’s a need to assess how decisions were made and how we can do better next time.

I’m afraid it won’t happen though.

Sources I’ve spoken to in the political world tell me that Biden’s vague campaign messages to “listen to the experts” and “follow the science” tested very well. There’s always a lot of emphasis in the press on loud anti-maskers and covid denialists, but the evidence is that most people want to follow the experts’ lead and strongly disapprove of the way Trump derided them. So the political safe zone is to just ride the “experts are good” message, hand out vaccines, and leave the Republicans to deal with the fact that they now have a bunch of maniacs in their base.

But while a 60/40 issue is really good in electoral politics, in actual public health it’s not good enough. We need to get to a point where a larger share of the public has confidence in the experts and the public health authorities. And the fact that the authorities seem to have been deliberately misleading people about masks — and doing so because they lacked the creativity to think of the cloth masks as an alternate solution — is not great for building anyone’s confidence.

A Failure, But Not Of Prediction
By Scott Alexander

But getting back to the media:

Their main excuse is that they were just relaying expert opinion – the sort of things the WHO and CDC and top epidemiologists were saying. I believe them. People on Twitter howl and gnash their teeth at this, asking why the press didn’t fact-check or challenge those experts. But I’m not sure I want to institute a custom of journalists challenging experts. Journalist Johann Hari decided to take it upon himself to challenge psychiatric experts, and wrote a series of terrible articles and a terrible book saying they were wrong about everything. I am a psychiatrist and I can tell you he is so wrong that it is physically painful to read his stuff (though of course I would say that…). Most journalists stick to assuming the experts know more about their subject of expertise than they do, and I think this is wise. The role of science journalists is primarily to relay, explain, and give context to the opinions of experts, not to try to out-medicine the doctors. So I think this is a good excuse.

But I would ask this of any journalist who pleads that they were just relaying and providing context for expert opinions: what was the experts’ percent confidence in their position?

I am so serious about this. What fact could possibly be more relevant? What context could it possibly be more important to give? I’m not saying you need to have put a number in your articles, maybe your readers don’t go for that. But were you working off of one? Did this question even occur to you?

Nate Silver said there was a 29% chance Trump would win. Most people interpreted that as “Trump probably won’t win” and got shocked when he did. What was the percent attached to your “coronavirus probably won’t be a disaster” prediction? Was it also 29%? 20%? 10%? Are you sure you want to go lower than 10%? Wuhan was already under total lockdown, they didn’t even have space to bury all the bodies, and you’re saying that there was less than 10% odds that it would be a problem anywhere else? I hear people say there’s a 12 – 15% chance that future civilizations will resurrect your frozen brain, surely the risk of coronavirus was higher than that?

And if the risk was 10%, shouldn’t that have been the headline? “TEN PERCENT CHANCE THAT THERE IS ABOUT TO BE A PANDEMIC THAT DEVASTATES THE GLOBAL ECONOMY, KILLS HUNDREDS OF THOUSANDS OF PEOPLE, AND PREVENTS YOU FROM LEAVING YOUR HOUSE FOR MONTHS”? Isn’t that a better headline than “Coronavirus panic sells as alarmist information spreads on social media”? But that’s the headline you could have written if your odds were ten percent!

People were presented with a new idea: a global pandemic might arise and change everything. They waited for proof. The proof didn’t arise, at least at first. I remember hearing people say things like “there’s no reason for panic, there are currently only ten cases in the US”. This should sound like “there’s no reason to panic, the asteroid heading for Earth is still several weeks away”. The only way I can make sense of it is through a mindset where you are not allowed to entertain an idea until you have proof of it. Nobody had incontrovertible evidence that coronavirus was going to be a disaster, so until someone does, you default to the null hypothesis that it won’t be.

The Lost Days That Made Bergamo a Coronavirus Tragedy
By Jason Horowitz

Bergamo became one of the deadliest killing fields for the virus in the Western world, a place marked by inconceivable suffering and a dreadful soundtrack of ambulance sirens as emergency medical workers peeled parents away from children, husbands from wives, grandparents from their families.

Hospitals became makeshift morgues and produced parades of coffins and scenes of devastation that became a warning to officials in other Western countries of how the virus could rapidly overwhelm health systems and turn infirmaries into incubators.

Officials confirmed that more than 3,300 people died with the virus in Bergamo, though they said the actual toll was probably double that. Mr. Orlandi’s town, Nembro, became perhaps Italy’s hardest struck, with an 850 percent increase in deaths in March. So many, the local priest ordered a stop to the incessant tolling of the bells for the dead.

The question of how such a tragedy could unfold in Bergamo, a wealthy, well-educated province of just more than a million, with top-level hospitals, has remained an uneasy mystery, a blood stain that the government prefers to avoid as it points with pride to Italy’s success in flattening the first wave of infections.

All the authorities involved now recognize Bergamo’s losses as a tragedy. But invariably they lay blame for it elsewhere.

The World Health Organization says that it limited its case definitions for practical reasons, primarily not to waste resources at the outset of an uncertain contagion. The rationale, said Dr. Margaret Harris, a spokeswoman for the organization, was “to limit the testing to a specific population at risk.” It is a position that past W.H.O. officials considered reasonable.

But Dr. Harris also argued that when the agency updated the guidelines at the end of January, it made clear “that the patient’s doctor is the one, ultimately, to decide who to test.”

Doctors in Bergamo considered that a convenient caveat.

The guidance was “the thing that generated the huge problem of the spread of the pandemic,” Dr. Avogadri said. “It was a big limitation.”

The W.H.O. “made a mistake,” said Giuseppe Ruocco, Italy’s chief medical officer and a senior official in its health ministry, adding that if Italy hadn’t automatically followed the organization’s lead it “could have certainly avoided cases and the infection of medical staff.”

In June, Italy bestowed a knighthood in the Order of Merit of the Italian Republic on Dr. Malara, the physician who exposed the outbreak by disregarding the protocol.

Demoralized health workers struggle as virus numbers surge
By Marion Renault

Although concerns remain about getting enough beds, masks and other equipment, many frontline health workers are most worried about staff shortages.

Nurses are the scarcest resource of all, said Kiersten Henry, an ICU nurse practitioner at MedStar Montgomery Medical Center in Olney, Maryland.

“I feel we’ve already run a marathon, and this is our second one. Even people who are upbeat are feeling run down at this point,” Henry said.

Many expressed frustration over some Americans’ disregard and even contempt for basic precautions against the virus.

Dr. Lew Kaplan, a critical care surgeon at the University of Pennsylvania’s Perelman School of Medicine, said health care workers are treated as “heroes” for helping patients but are seen as “close to evil incarnate” when they ask people to wear masks.

“It is very disheartening, while you are struggling to manage the influx of patients, there are others who won’t accept public health measures,” said Kaplan, president of the Society of Critical Care Medicine.

Raju Mehta, a critical care physician at Advocate Health and Hospital in the Chicago area, said that early on in the pandemic, many frontline workers were energized by a sense of purpose. Now, that morale is beginning to crumble.

“Seeing what we’re seeing, day in, day out, for eight months, takes a toll,” Mehta said. “It’s tough knowing what we see, and then what happens outside our walls.”

Italian nurse, 34, kills herself after testing positive for coronavirus and worrying she had infected others

An Italian nurse killed herself after testing positive for coronavirus and fearing she had infected others, a nursing federation has revealed.

Daniela Trezzi, 34, was working on the front line of the coronavirus crisis at a hospital in Lombardy, the worst-affected region of Italy.

The National Federation of Nurses of Italy confirmed her death and expressed its ‘pain and dismay’ in a statement last night.

The federation said the nurse had been suffering ‘heavy stress’ because she feared she was spreading the virus while trying to bring the crisis under control.

The nursing group also revealed that ‘a similar episode had happened a week ago in Venice, with the same underlying reasons’.

‘Each of us has chosen this profession for good and, unfortunately, also for bad: we are nurses,’ the federation said.

‘The condition and stress to which our professionals are subjected is under the eyes of all.’

‘I Couldn’t Do Anything’: The Virus and an E.R. Doctor’s Suicide
By Corina Knoll, Ali Watkins and Michael Rothfeld

It is impossible to know for sure why someone takes her own life. And Dr. Breen did not leave a note to unravel the why.

Still, when the casualties of the coronavirus are tallied, Dr. Breen’s family believes she should be counted among them. That she was destroyed by the sheer number of people she could not save. That she was devastated by the notion that her professional history was permanently marred and mortified to have cried for help in the first place.

NewYork-Presbyterian said in a statement that it began offering mental health services to its front-line staff in late March to help them cope with their experiences.

“Dr. Breen was a heroic, remarkably skilled, compassionate and dedicated clinical leader who cared deeply for her patients and colleagues,” the statement said.

If Dr. Breen is lionized along with the legions of other health care workers who gave so much — maybe too much — of themselves, then her shattered family also wants her to be saluted for exposing something more difficult to acknowledge: the culture within the medical community that makes suffering easy to overlook or hide; the trauma that doctors comfortably diagnose, but are reluctant to personally reveal, for fear of ruining their careers.

“If the culture had been different, that thought would have never even occurred to her, which is why I need to change the culture,” Ms. Feist said. “We need to change it. Like, as of today.” The family has established a fund to offer mental health support to health care providers.

For Dr. Breen’s friend Ms. Ochoa, their last conversation has become especially crushing. At one point, Dr. Breen had gotten stuck on an idea and kept repeating herself.

Ms. Ochoa had not thought profoundly about it at the time, but now she cannot stop hearing that same relentless refrain: “I couldn’t help anyone. I couldn’t do anything. I just wanted to help people, and I couldn’t do anything.”

A doctor who treated some of Houston’s sickest Covid-19 patients has died
By Harmeet Kaur

Araujo-Preza lived the American Dream, Araujo said.

Born in El Salvador, he came to the US in 1994 to continue his medical education, studying at Staten Island University Hospital in New York and Tulane University in New Orleans. In 2001, he moved to the Houston area and worked as a pulmonologist for nearly two decades.

As a physician who treated respiratory diseases, Araujo-Preza was on the frontlines of the Covid-19 pandemic almost as soon as it hit.

He was appointed the critical care medical director of the ICU, primarily treating coronavirus patients. For nearly all of April, he slept in a room in the hospital, always on call in case of emergency.

Araujo remembers seeing him for only five minutes a week during those days — every Wednesday, he would stop by her house to say hello.

Though his family worried for his safety, he considered the work his calling, Araujo said.

“He was so brave,” she said. “He loved medicine and he loved helping patients. He was so excited to wake up every day and go help people.”

The Pandemic Heroes Who Gave us the Gift of Time and Gift of Information
By Zeynep Tufekci

Moderna’s vaccine was apparently designed in just a few days, over a weekend, after the genetic sequence became available on January 10th, 2020.

Here’s why that date matters: the sequence was published ten days before China acknowledged the severity of the problem by admitting sustained human-to-human transmission and shutting down the city of Wuhan, on January 20th. The sequence was published while China—and the WHO, which depended on China for information—were still downplaying what was going on, in their official statements. The sequence wasn’t published in an official document. Instead, it was published independently in an open-source repository by Yong-Zhen Zhang, a professor at the Shanghai Public Health Clinical Center and School of Public Health.

Zhang had received the virus from Wuhan on January 3rd, around 1:30 p.m., when a metal box containing a test tube packed in dry ice arrived at his office. The researchers on his team worked feverishly to sequence it over the next two days. Just about 40 hours later, on January 5th at 2 a.m., his team was done. Zhang immediately realized the danger the pathogen posed. As he put it in a later interview with Time magazine:

“I realized that this virus is closely related to SARS, probably 80%. So certainly, it was very dangerous.”

He shared the genome with members of his consortium, which included Australian scientist Eddie Holmes. On the morning of January 11th (in China), Holmes called Zhang, who was about to take off for a trip to Beijing for another meeting concerning the outbreak at Wuhan, and asked for permission to release the genome to the world.

It may have taken Zhang only a single minute to decide, but his bravery was real. This was just 10 days after whistleblowers in Wuhan who had attempted to warn others had been detained by the police. The punishment of these doctors for “rumor-mongering” was broadcast on national TV. Tragically, one of the most prominent whistleblowers, Dr. Li Wenliang, would die of the virus just a month later (his son was born to his widow this summer). It was a time of silence, not of speaking out. Between January 5th and January 10th, the Wuhan government would not update the number of infected people. It would be another 10 days before the dam broke and President Xi Jinping made his first public statement, saying “the virus must be taken seriously.”

Sadly, the hammer did come down fast. Zhang’s lab was immediately shut down for “rectification”—an obscure term implying some “malfeasance,” as the South China Morning Post explains.

To many observers, it seemed that furious officials scrambling to snuff out evidence of the outbreak were punishing Zhang simply for sharing the SARS-CoV-2 genome—and, in the process, slowing down the release of this key information.

At the end of February, the South China Morning Post was reporting that Zhang’s lab was still shut down. Things did improve, though. Dr. Zhang continues to carry out important work—and has been recognized with awards.

Professor Zhang’s efforts in sharing the first SARS-CoV-2 genome have already been acknowledged around the world, with Time magazine recognizing him as a “saving grace” and naming him one of the 100 most influential people of 2020, stating: “The Zhang team’s unprecedented speed in sharing data envisions what is possible with a collaborative, connected public-health collective.”

Professor Loman further highlighted the need for sequence data as the only means to get started on truly managing a viral outbreak, saying: “Whilst the generation of a new viral sequence is a technical accomplishment in itself, much more important is the speed of sharing: until this happens the global scientific community cannot get started on a response. The process of designing diagnostic PCR assays and sequencing protocols are critically contingent on that first genome sequence.”

Professor Coin further pointed to how essential having a viral sequence available is to the medical profession, noting: “Early availability of the genome sequence also enabled researchers to start developing vaccines and antiviral therapies even before the virus could be grown in sufficient quantities in cell culture for it to be studied directly.”

The availability of this data within weeks of the first identified COVID-19 patient undoubtedly saved many lives and will be highlighted for many years to come as the perfect example of why we can see further by standing on the shoulders of giants. The GigaScience prize was an acknowledgement of Prof. Zhang’s and his group’s efforts and will likely be one of many recognitions to come.

And in interviews since, Zhang, who still works in China, downplays his role and his bravery.

Still, as the good news from these vaccines rolls in, we should remember and celebrate the gifts Dr. Zhang and his team gave us, perhaps the most important ones for fighting a pandemic: the gift of time and the gift of information. Dr. Zhang acted without being incentivized by the huge amounts of money that the companies will receive—Moderna’s stock has already increased almost 700 percent—and he faced down potentially catastrophic consequences for himself and his lab.

For many years, we will be analyzing the failure of many governments and institutions in their responses to this pandemic. But the successes are real, too. More than anything, we should remember those who bravely stepped up when it counted: the healthcare workers and the researchers around the world—starting with Professor Yong-Zhen Zhang and Doctor Li Wenliang of China.

Coronavirus Response: Hospitals Rated Best, News Media Worst
By Justin McCarthy

Americans are generally positive in their evaluations of how each of nine leaders and institutions has handled the response to the coronavirus situation. Eight of the nine receive majority positive ratings — led by U.S. hospitals, at 88% approval. Only the news media gets a more negative than positive review.

Americans Remain Distrustful of Mass Media
By Megan Brenan

At a time when Americans are relying heavily on the media for information about the coronavirus pandemic, the presidential election and other momentous events, the public remains largely distrustful of the mass media. Four in 10 U.S. adults say they have “a great deal” (9%) or “a fair amount” (31%) of trust and confidence in the media to report the news “fully, accurately, and fairly,” while six in 10 have “not very much” trust (27%) or “none at all” (33%).

Gallup first asked this question in 1972 and has continued to do so nearly every year since 1997. Trust ranged between 68% and 72% in the 1970s, and though it had declined by the late 1990s, it remained at the majority level until 2004, when it dipped to 44%. After hitting 50% in 2005, it has not risen above 47%.

The latest findings, from Gallup’s annual Governance poll conducted Aug. 31-Sept. 13, are consistent with all but one recent trust rating — in 2016, a steep decline in Republicans’ trust in the media led to the lowest reading on record (32%).

Republicans’ trust has not recovered since then, while Democrats’ has risen sharply. In fact, Democrats’ trust over the past four years has been among the highest Gallup has measured for any party in the past two decades. This year, the result is a record 63-percentage-point gap in trust among the political party groups.

Groupthink Has Left the Left Blind
By Bret Stephens

According to the incessant pronouncements of much of the news media (including a few of my own), Donald Trump is the most anti-Black, anti-Hispanic and anti-woman president in modern memory. Yet the CNN exit poll found that Trump won a majority of the vote of white women against both Hillary Clinton and Joe Biden. He also improved his vote share over 2016 with both Latino and Black voters, while losing most of the advantage he previously had with college-educated white males — precisely the demographic his policies had supposedly done most to favor.

If the catechism of today’s left determined reality, none of this would have happened. Racial, ethnic or sexual identity would have trumped every other voting consideration. But as the Texas Democratic Representative Henry Cuellar recently told Axios: “Trump did a much better job at understanding Hispanics. Sometimes, Democrats see Hispanics as monolithic.” Latino voters in his South Texas district were particularly turned off by progressive rhetoric about defunding the police, opposition to fossil fuels and decriminalizing border crossings.

What is true of Cuellar’s constituency is true of everyone: People are rarely reducible to a single animating political consideration. Nor should they be subject to a simple moral judgment. Motives are complicated: It is perfectly possible to see Trump for the reprehensible man he is and still find something to like in his policies, just as it is possible to admire Biden’s character and reject his politics.

The apparent inability of many on the left to entertain the thought that decent human beings might have voted for Trump for sensible reasons — to take one example, the unemployment rate reached record lows before the pandemic hit — amounts to an epic failure to see their fellow Americans with understanding, much less with empathy. It repels the 73 million Trump voters who cannot see anything of themselves in media caricatures of them as fragile, bigoted, greedy and somewhat stupid white people.

It also motivates them. The surest way to fuel the politics of resentment — the politics that gave us the Tea Party, Brexit and Trump, and will continue to furnish more of the same — is to give people something to resent. Jeering moral condescension from entitled elites is among the things most people tend to resent.

Which brings me back to the flight of the contrarians. As the left (and the institutions that represent it) increasingly becomes an intellectual monoculture, it will do more than just drive away talent, as well as significant parts of its audience. It will become more self-certain, more obnoxious to those who don’t share its assumptions, more blinkered and more frequently wrong.

Democrats are misreading the election results, and it will cost them
By Musa al-Gharbi

In a post-election news conference, House Speaker Nancy Pelosi, D-Calif., declared that voters gave Democrats a “tremendous mandate … a bigger mandate than John F. Kennedy when I was in school, and a bigger mandate than others.”

A centerpiece of these narratives is the claim that President-elect Joe Biden and Vice President-elect Kamala Harris won more votes than any other ticket in U.S. history. So far, more than 77.7 million votes have been counted for the Democratic ticket, with some final votes left to count. This does, indeed, surpass the previous record, set by Sen. Barack Obama in 2008, of 69.5 million votes.

However, President Donald Trump and Vice President Mike Pence also beat Obama’s record by a healthy margin this cycle, pulling in 72.4 million votes to date, with their numbers also expected to climb. The electoral results do not show that the public is united behind Biden and Harris. Quite the contrary — they show that voters are deeply divided.

About 31.1 percent of voting-age Americans are expected to have cast their ballots for Biden when all is said and done. This is very far from a majority. Nor is it a historic level of support. Lyndon B. Johnson set the record in 1964, pulling in 38 percent of all voting-age adults. And contrary to Pelosi’s assertions, Kennedy also pulled in a larger share of the electorate than Biden and Harris are projected to win (31.8 percent) — as did Ronald Reagan in 1984.

The record in terms of sheer number of votes that the Democratic presidential ticket got is a product of population growth. When we control for the size of the electorate, it is clear that the incoming administration does not have an unprecedented mandate.

In truth, these results are pretty easy to understand. The public did not embrace Democrats this cycle; it merely evicted Trump. Indeed, although Biden’s victory was not historic, Trump’s defeat was; when a party initially takes control of the White House, it tends to stay in power for at least eight years. Going all the way back to the Civil War and the creation of the Democratic and Republican parties, the only true exception to this rule was the administration of Jimmy Carter. Until now.

Democrats No Longer Have a Coalition
By Musa al-Gharbi

In 2008, Barack Obama was widely described as having built a game-changing political coalition: young people, racial and ethnic minorities, educated professionals, urban and suburban voters. He was held to have built an innovative campaign infrastructure, leveraging big data and social media in an unprecedented way, increasing turnout and Democratic vote share with constituencies that are typically underrepresented at the ballot box.

All of this was thought to not only benefit Obama but also the party writ large. Indeed, in the wake of the 2008 election, Democrats had won the presidency and consolidated their hold over both chambers of Congress. At the state level, they held governorships in 29 states and controlled both chambers in 27 state legislatures. For contrast, Republicans controlled just 14 state legislatures and 21 governorships.

Many went so far as to believe that the Obama coalition heralded the arrival of a long-prophesied enduring Democratic majority in US politics. They were wrong.

In 2010, Democrats lost control of the House in the most sweeping congressional reversal in 62 years. They also saw huge losses in state legislatures, which allowed Republicans to control the decennial post-Census redistricting to an unprecedented degree.

In 2014, Democrats would go on to lose the Senate. And of course, two years later, they would lose the presidency as well. The party saw massive losses in state contests too. As Trump assumed office in 2016, Republicans controlled both chambers of the US Congress, both chambers in 32 state legislatures, and held 33 governorships.

Under Trump, the GOP would come to dominate the courts too. Roughly one-quarter of all active federal judges are Trump appointees. Republicans were also able to place three Supreme Court justices over the course of Trump’s term—leaving a 6-3 conservative majority that is likely to endure for some time.

Fortunately, parties virtually always lose seats in the House during their inaugural midterms. The GOP was no exception in 2018. Although the Republican Party’s losses were almost exactly average for an inaugural midterm, they were enough to flip the House to the Democrats.

In 2020, another key win: Joe Biden managed to unseat Donald Trump and is poised to assume the presidency in January 2021. Yet the Democratic Party finds itself in an overall weaker position than before the election.

Democrats lost seats in the House—putting them on track to lose the chamber outright in 2022. They may fail to take control of the Senate. They lost one governorship. The Democratic Party also saw continued erosion in state legislatures, leaving the GOP in a dominant position once again with respect to post-Census redistricting. According to FiveThirtyEight estimates, Republicans will control redistricting for roughly 43 percent of the seats in the House. Democrats will have comparable control over a mere 17 percent of seats. This is no small loss, as these maps will govern elections through 2030.

In short, although Barack Obama was fond of describing himself and his allies as being on the “right side of history”—and implying that his opponents were consigned to its dustbin—history seems to have had other ideas.

Democrats Seem to Have a Religion Problem
By Musa al-Gharbi

… it is not incidental that Democrats are seeing attrition among people of faith at the same time that Republicans are gaining among people of color. African Americans and Hispanics, for instance, tend to be more religious and socially conservative than whites on average. People lose sight of this because many minority populations have decisively skewed towards Democrats. However, this does not mean they are liberal. In fact, African Americans and Hispanics are among the least ‘culturally left’ constituents within the Democratic coalition. And as the Democratic Party leans ever more into niche ‘cultural’ issues at the expense of bread-and-butter issues, it seems as though many minority voters are growing alienated from the party. Again, this was a trend that preceded Trump, but has continued under his tenure unabated.

The most dramatic shift among religious groups (and for many, perhaps the most surprising) seems to have taken place among Muslims. But in fact, these shifts should perhaps not be surprising as they, too, are likely intimately bound up with shifts along racial lines. While Islam is often associated with Arabs in the American imagination, in fact a plurality of U.S. Muslims are African American (both ‘black’ and of more recent African heritage). Significant shifts among African Americans, therefore, likely translate into shifts among Muslims.

In 2020, of course, Trump also happened to be running against a Catholic, Joe Biden, who quoted Pope Francis on the campaign trail to explain why he was in this race and what he hoped to accomplish in office. Biden is now set to be sworn in as the second Catholic president in U.S. history. Pope Francis recently called Biden to congratulate him on his victory, and to discuss key policy areas and moral leadership. This reciprocal respect between the pontiff and the Democratic nominee likely helped sway some Catholic hearts and minds.

In the end, Democrats’ gains with Catholics were more than offset by their continued attrition with Protestants, Jews, Muslims and other people of faith – resulting in a much closer race than many seemed to be expecting.

Religious people seem to lack faith in the Democratic Party. Critically, this is not just a weakness among Christians, but among believers of all stripes. These dynamics have persisted for more than a decade now, across multiple administrations, sapping the party’s strength among the very groups that were ‘supposed’ to secure their long-term electoral dominance.

Although the share of religiously unaffiliated Americans continues to grow, 74% of Americans do identify with one faith tradition or another. If Democrats can’t find a better way to connect with these voters and speak to their priorities and concerns, the enduring Democratic majority that they’ve been prophesying for more than 50 years now will likely remain elusive for the foreseeable future.

Why Trump is picking up support from Black and Hispanic voters
By Musa al-Gharbi

Perceptions of Trump as racist seem to be a core driving force pushing whites toward the Democrats. Why would the opposite pattern be holding among minority voters — i.e. the very people the president is purportedly being racist against?

It may be that many minority voters simply do not view some of his controversial comments and policies as racist. Too often, scholars try to test whether something is racist by looking exclusively at whether the rhetoric or proposals they disagree with resonate with whites. They frequently don’t even bother to test whether they might appeal to minorities, as well.

Yet when they do, the results tend to be surprising. For instance, one recent study presented white, Black and Hispanic voters with messages the researchers considered to be racial “dog whistles,” or coded language that signals commitment to white supremacy. It turned out that the messages resonated just as strongly with Blacks as they did with whites. Hispanics responded even more warmly to the rhetoric about crime and immigration than other racial groups.

That is, on balance, these “racist” messages seemed to resonate more strongly with minorities than whites! Across racial groups, most did not find the messages to be racist or offensive—despite researchers viewing these examples as clear-cut cases of racial dog whistles.

Academics generally avoid examining bigotry among members of minority groups — focusing nearly exclusively on anti-minority sentiment among whites.

However, as a matter of fact, people from historically marginalized or disadvantaged groups often hold very negative opinions of people from other minority populations — and do not seem to approach social issues in intersectional terms.

For instance, anti-Black sentiment is common within many Arab, Hispanic and Asian communities in the United States.

Anti-Semitism, meanwhile, is significantly more prevalent among Blacks and Hispanics than among whites. Moreover, many Black and Hispanic Christians are highly distrustful of Muslims. Many American Hindus feel the same way. This antipathy is not just a matter of attitudes. According to FBI statistics, roughly a third of all hate crimes seem to be committed by racial and ethnic minorities.

Intergroup tensions are also expressed in terms of policy. Overall, Black Americans are more supportive of limiting immigration than any other bloc of the Democratic coalition. And Hispanics actually tend to be more concerned about illegal immigration than are whites or Blacks.

Hispanics are generally supportive of legal immigration — however, many insist that people come over “the right way” and worry that illegal immigration has a detrimental effect on Hispanics already living in the United States. More than two-thirds view improving border security as a priority with respect to U.S. immigration policy.

In other words, far from alienating minority constituencies, Trump’s messaging on immigration, law and order and cultural conservatism may be an important source of his appeal to many voters of color — even as it leads many whites to distance themselves from him.

Then again, it may be an error to look at Trump to explain these patterns among voters of color, as they could just as much be a product of minorities’ dissatisfaction with the Democratic Party. In fact, Democratic attrition among minority voters predates Trump. The Big Tent party has seen losses with Hispanic and Black voters in virtually every midterm and presidential election since 2008.

White men swung to Biden. Trump made gains with Black and Latino voters. Why?
By Musa al-Gharbi

Contrary to the prevailing narratives, the Republican party saw continued attrition with whites throughout Trump’s tenure in office. Almost all the losses Republicans saw in 2018, for instance, were due to defections by white voters. As compared to 2016, Republicans slightly improved their numbers with Blacks and Hispanic voters during the midterms. However, the margins among whites shifted 10 percentage points in the other direction, helping Trump’s opposition win the House.

In other words, the prevailing discourse around race seems to be flat out wrong. Shifts among minorities were responsible for Trump’s surprising strength this cycle, while shifts among whites are what helped put Biden over the edge in the end.

Unfortunately, the dominant narratives around gender have been just as deficient as those on race.

For instance, men did not support Trump in record-shattering numbers in 2016 – nor did women rally strongly behind Clinton. Instead, Hillary lost because of anemic support among women. She got one of the lowest shares of the female vote of any Democrat in decades – and turnout among women was down as compared to previous cycles. Had female turnout – or Democrats’ female vote share – been as strong for Clinton as it had been for Obama, Hillary would have won.

Consequently, why women exercised their agency the way they did in 2016 becomes an extremely important question. In fact, it is objectively more critical than how men voted: women comprised a larger share of the electorate than men in 2016. Indeed, they’ve comprised a majority of the electorate every cycle since 1976.

Nonetheless, narratives about the 2016 election have overwhelmingly focused on men, sexism, patriarchy, etc. How women voted has been largely ignored.

When discussed at all, Democrats’ surprising weakness with women in 2016 is typically attributed to white women having prioritized their commitment to white supremacy above their commitment to feminism. Yet, there was absolutely nothing special about Trump winning a majority of white women:

Going back to 1972, Democrats have literally never won an outright majority of white women, and have reached a plurality only twice. White women were less supportive of Trump in 2016 than they were of the Republican candidates in 1972, 1984, 1988, 2004 or 2012 (for reference, similar patterns hold for white men).

Nonetheless, white women’s 2016 votes are often described as being uniquely motivated by racism – despite the fact that voters were choosing between two tickets composed entirely of white people.

This time around, spinning such narratives will be much harder. Yes, white women actually did shift in Trump’s direction this time, unlike in 2016. However, Black women and Hispanic women shifted in the exact same direction.

In short, it was shifts among minority voters that helped Trump win the presidency in 2016. This movement among minority voters carried into 2020 – and women across the board shifted towards the Republican party as well. Fortunately, defections among white men overrode the preferences of this growing share of women and minorities, bringing about Trump’s political demise.

The Awokening Will Not Bring an End to the Nightmare
By Musa al-Gharbi

To understand the current political moment, I argued, the ‘real’ action might be among white Democrats rather than Trump voters.

Picking up on this thread, Georgia State University political scientist Zach Goldberg soon demonstrated that, among white liberals, the same patterns also held for a number of other questions about inequality, immigration and related topics, beginning in 2014 and accelerating under Trump. Moreover, he showed, this was not just a phenomenon of the American National Election Studies (the specific dataset in which I initially highlighted this phenomenon) – it was apparent in virtually all major national public opinion surveys over the preceding years, but had gone largely unnoticed.

In early 2019, describing Goldberg’s research, Vox’s Matt Yglesias popularized the “Great Awokening” as a shorthand for describing this most recent sea change in expressed attitudes on identity issues. One particularly helpful aspect of this framing is that it calls attention to the religious overtones of the contemporary discourse around identity issues (evoking, of course, the “Great Awakening” religious revivals of the mid-18th century).

Indeed, for a particular subset of white people, antiracism has itself become something akin to a religious creed. There is a discourse of slavery as America’s original sin, of whiteness as a primeval and malevolent force responsible for, or implicated in, virtually all of the world’s ills.

There is a gnostic element, with adherents believing that they can see the ‘real’ structures of the world, which others are blind to – along with the sense of superiority that accompanies such beliefs. There is an eschatological sense of being on the ‘right side of history,’ and in many circles, an intolerance for doubts or heresy.

There is a fetishization of the destruction of ‘black bodies,’ paired with an obsession with the contents of whites’ hearts and minds – without anyone apparently noticing, or seeing a problem with, this asymmetry.

Trayvon Martin, Michael Brown, Eric Garner, Tamir Rice, Philando Castile, Breonna Taylor et alia are held up as saints, as martyrs for the cause – glossing over the far more disturbing truth that these were ordinary (flawed like the rest of us) people who were needlessly murdered while going about their daily lives. Most do not seem to have been particularly political, and likely would not have agreed with many of the political claims now being made by liberal whites (bolstered by elites of color), ostensibly in their names.

Beyond the Great Awokening
By Adolph Reed Jr.

Ventriloquizing the interests of a fictive, undifferentiated racial population has become an important source of political capital for advancing identitarian agendas skewed to benefit the upper strata and aspirants—a key development that in turn suggests the Great Awokening represents a form of cognitive dissonance within that class. That is to say, the more obviously the premises of race-reductionist politics are at odds with the daily realities of black Americans’ lives and expressed concerns, the more insistently the Woke must double down on the fantasy of monolithic, unchanged race-driven oppression. In this way, the vital contrasts of unequal life outcomes arbitrated by class, or other forces beyond the scope of race reduction, are simply factored out of the equation. This, indeed, may mark the point where wishful thinking approaches pathology. Or it may just show the deep wisdom of Upton Sinclair’s famous dictum: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”

Election Showed a Wider Red-Blue Economic Divide
By Jed Kolko

More educated places, which leaned strongly blue to begin with, voted even more Democratic in 2020 than they did in 2016. Highly educated Republican-leaning counties, like Williamson County near Nashville and Forsyth County near Atlanta, have become rarer with each recent election.

A more educated work force bodes well for future local economic success — and places with brighter future prospects swung toward Joe Biden. Jobs requiring more education are projected to grow faster and be at less risk from automation.

Counties where more jobs are “routine” (in the sense of being at greater risk from automation) voted strongly for Mr. Trump in 2016 and even more so in 2020, while counties with fewer such jobs swung toward Mr. Biden. Similarly, counties with a mix of occupations that are projected to grow faster voted even more strongly for Mr. Biden in 2020 than for Hillary Clinton in 2016.

Not only did places with brighter future economic prospects swing more toward Mr. Biden, but places with a stronger economy during the past four years did, too. Counties with faster job growth and lower unemployment before the pandemic swung more toward Mr. Biden than other counties. And counties with milder job losses and smaller jumps in unemployment during the pandemic also swung more toward Mr. Biden, even though Republican-leaning places suffered less in the pandemic than Democratic-leaning ones.

Many more places swung toward Mr. Biden relative to 2016 than toward Mr. Trump, but the most significant local shifts were toward Mr. Trump. These included heavily Hispanic areas in Miami-Dade County and along the Texas border, and the more heavily Mormon counties of Utah and Idaho (though some of these counties are still below 98 percent reporting).

Our analysis of the election results suggests that 2020 accelerated a long-running trend
By The Economist

Jonathan Rodden, a professor at Stanford University and author of “Why Cities Lose”, a book about geographic polarisation, says that the partitioning of America by density has led to an underrepresentation of Democratic votes. Because seats in the House of Representatives and the Senate are awarded on a winner-take-all basis, rather than in proportion to the popular vote, the system can end up skewing the allocation of legislative seats away from the party whose voters are crammed into just a few states or congressional districts. As Democrats cluster in cities, the system reduces their political clout. It can be thought of as a natural gerrymander.

Geographic polarisation also hurts Democrats’ chances in the electoral college, America’s system of choosing its president. In this year’s election, for example, Mr Biden will win the national popular vote by about five percentage points. But his margin in the “tipping-point” state that ultimately gave him enough votes to win the election, Wisconsin, will be less than one point. That four-point advantage for the Republicans is the biggest in at least four decades. So long as Democrats continue to be the party of the cities, and Republicans the party of small-town and rural America, those biases will persist.

Joe From Scranton Didn’t Win Back the Working Class
By Lisa Lerer

Of the 265 counties most dominated by blue-collar workers — areas where at least 40 percent of employed adults have jobs in construction, the service industry or other nonprofessional fields — Mr. Biden won just 15, according to data from researchers at the Economic Innovation Group, a bipartisan policy research group.

On average, the work force in counties won by Mr. Biden was about 23 percent blue collar. In counties won by President Trump, blue-collar workers made up an average of 31 percent of the work force.

This isn’t a new trend. For decades, Democrats have been trading the support of union members for broader backing from the professional classes. And the G.O.P., once the party of white college-educated voters, has increasingly found support among white working-class voters.

Many Democratic primary voters saw Mr. Biden as uniquely positioned to cut into the Republican advantage with the working class. For decades, he’s built his political brand on being a scrappy kid from Scranton, Pa., who became just another guy riding the train to work. The rallying cry of his campaign in the final weeks was: “This election is Scranton versus Park Avenue.”

But Mr. Biden fared worse than Mrs. Clinton in 2016 and Barack Obama in 2012 and 2008 in counties dominated by blue-collar workers.

That outcome should scare Democratic strategists about their party’s future, said David Shor, a Democratic data scientist, because of structural dynamics like the Electoral College that give rural areas political influence far beyond the size of their population.

David Shor’s Postmortem of the 2020 Election
By Eric Levitz

Education polarization is not only a decades-long trend but also one that spans most of the postindustrial democratic world. The Democratic Party can’t just shrug off its cosmopolitan voting base. It’s always going to be the coalition more associated with cities and liberal professionals. So how much can Democrats really change their party to appeal to those voters?

Yeah, it’s tough. Parties don’t have as much control over who their voters are as many think. One of my favorite examples is Jeremy Corbyn. When he came into the leadership of the Labour Party, I think he legitimately wanted to steer the party in a more working-class direction and move it away from this perceived Blairite shift toward college-educated people. And despite that, he presided over the largest increase in educational polarization in Britain’s history. He ended up flipping one of the richest parliamentary districts in the country and losing the most working-class ones. And then he did it again in 2019.

So there is a real extent to which these parties aren’t always consciously choosing the coalitions they end up with. There isn’t necessarily a guru behind the scenes of the Democratic Party who’s just been turning a giant dial away from the side labeled “Working Class” and toward the one labeled “College Educated.” But that said, I think we shouldn’t be nihilist about this. I think if you look at educational polarization by country over time, there is this general trend where it seems to be increasing almost everywhere. But there are also lots of breaks in the pattern.

Barack Obama, contrary to what a lot of people would expect, actually presided over a depolarization by education. He did better, in relative terms, among non-college-educated whites than John Kerry did. Part of that was the Great Recession. But I think a lot of it was message discipline on Obama’s part. His campaigns focused on economic issues as much as possible and avoided taking hot-button stances on things like foreign policy or immigration. I think he saw that as core to winning.


I think the reality now is that whenever any elected Democrat goes out and says something that’s unpopular, unless the rest of the party very forcefully pushes back — in a way that I think is actually very rare within the Democratic Party currently — every Democrat will face an electoral penalty. And that’s awkward. But I think it’s a natural consequence of polarization and ticket-splitting declining. I think progressives try to get around this awkward reality by saying, “Well, Republicans are going to demonize us no matter what we say or do.” But I don’t think that kind of nihilism is justified. What they say actually does matter. Parties and candidates that say less controversial things, and are associated with less-controversial ideas, win more elections.

I think that the only option that we have is to move toward the median voter. And I think that really comes down to embracing the popular parts of our agenda and making sure that no one in our party is vocally embracing unpopular things. I know that sounds reactionary. But moderates don’t have a monopoly on popular ideas and progressives don’t have one on unpopular ideas. There are a lot of left-wing policies that are both popular and transformational. Worker co-determination. A federal job guarantee. There’s still a lot we can do.

… I think we have to acknowledge that if you pull up a list of Democrats who have outperformed their presidential race the most in the past 20 years, it is, generally speaking, a list of boring, moderate people. There are a few progressive candidates who legitimately have impressive records. Bernie Sanders is a great example. In 1990, he managed to win Vermont’s House seat two years after the state voted for Bush over Dukakis. And he did it as an open socialist. People really don’t appreciate how insane that was. And over the course of all his elections, he did have a track record of outperforming the fundamentals. But, for the most part, the list is overwhelmingly dominated by moderate politicians. To be clear, that isn’t just issue-taking. Part of it is also that moderate politicians are less threatening to capital, and so they’re more likely to get Chamber of Commerce endorsements. But stepping back, if you look at all the polling and all of the evidence, I think there is a story that taking unpopular positions really hurts. You can see it as a time-series story too. If you look at AOC’s initial polling, or Bernie Sanders’s polling in 2016, or Warren’s standing in the Democratic primary, they were all much more popular before they started embracing a bunch of really unpopular issues.

The best example though might be Donald Trump. His approval rating rested in a very narrow band, for basically this entire election, particularly among non-college-educated whites. There was only one time that shifted, in the course of his entire presidency, and that was when they tried to repeal the Affordable Care Act. His approval rating among non-college whites specifically plummeted. We were tracking Obama-Trump voters; this was the only time when they came home in large numbers. And the reason is that a lot of these voters agree with Democrats on Obamacare, and they were very angry about attempts to repeal it. (And then they stopped being angry because it failed and everyone in politics has a short memory.)

If you want to make a case about Bernie Sanders, in his general-election polling before the primary ended, he really did seem to do better than Biden with younger, non-college-educated white people, even as he did a bit worse overall. It’s an interesting question of whether the polls were underrepresenting the non-college-educated group. And I do think, in 2016, Bernie had very, very strong general-election numbers. A lot of people say, “Oh, it was just that people hated Hillary.” But, at the time, in 2016, Bernie Sanders was one of the most popular politicians in America. And over the next four years, that stopped being true. He came into the 2020 primary with lower net favorables than Biden. And I think it’s really worth examining why. I always defined the Sanders model, in 2016, as having a lot of message discipline in terms of talking almost exclusively about economic issues and trying to frame all issues as a series of conflicts between good and evil. I think that kind of politics has a lot of appeal to these low social-trust voters, even if it turns off people who are highly politically engaged. And I think you could potentially build a very interesting coalition around that politics.

But we should learn from what happened over the next four years. Which is that, frankly, the Bernie campaign decided that the reason why they’d lost was that they were overly white and focused too much on economic issues, and that they had to become more woke on a variety of different social issues. And I think as they made this left-wing turn, while also really doubling down on policies that involve very large middle-class tax increases, as that happened, his support declined.

2020 exit polls: As the racial gap closes, the Democrat-Republican education gap widens
By Chris Arnade

How can it be that hard-working people, who are scrambling to pay their bills, many of them new immigrants, chose to vote for an Ivy League billionaire who wants to limit immigration? Because how we think about politics and voting is all wrong.

The details of politics and policy that political pundits on TV fight over are lost on most Americans. Not because they are too stupid to understand them, but because they are too busy to focus on something that rarely affects them.

That doesn’t mean they don’t have views on politics, but it means politics to them is a sport. While they will never be players, they can be fans.

So who they support is more about which social group to join. It is more about whose supporters they would rather hang out with at a bar than about what policies they want. In the last few decades, Democrats have shifted towards being the party of the highly educated. They resemble college professors in how they talk to voters and how they present themselves and, consequently, in how they are viewed by many Americans. Trump, meanwhile, has shifted the GOP towards being a party that maybe, just maybe, might get who they are.

While Democrats can sound too much like wonks, Trump talks their language: simple, often blunt terms that avoid details about policy but get a few big things right — like understanding the frustration with D.C.

Trump also gets smaller things right, things that college professors and journalists in D.C. find unimportant and embarrassing but that really matter in how people view politics — like celebrating a big win with a big spread from McDonald’s, or hugging the United States flag. Based on my experience and reporting, this is especially true of newer immigrants, including those from Mexico, who are proud to be here and love America in an unflinching and emotional way.

They believe in faith, family, the flag, and the American dream, and are not embarrassed about that.

The Duo That Defeated the ‘Diversity Industry’
By Tunku Varadarajan

Ms. Wu came to the U.S. in 2009 from Wuxi, population five million—“a small town by Chinese standards.” She earned a doctorate in international studies from the University of Miami and was “perplexed” when it dawned on her that Asians, “as a group, are being scapegoated in education to fulfill a narrative of very shallow diversity.” It shocked her to discover that America wasn’t living up to its ideals as a “land free for everyone.”

She hears from parents in New York’s Chinatown who fear their children will be squeezed out of the city’s specialized public high schools because of their leftist mayor’s push for “diversity.” They tell her that they worry their kids will be unable to “redeem the American dream.” These are poor parents who don’t speak English and have told their children: “You work hard, you study hard, you’re going to get out of this ethnic enclave. You’re going to get out of Chinatown.”

When politicians and school administrators say there are “too many” Asians in elite classrooms, “I feel minimized,” Ms. Wu says. “I feel stigmatized that I was reduced to a racial box—that my hard work is being blamed for the lack of so-called proportionality in these institutions.”

Liberals Envisioned a Multiracial Coalition. Voters of Color Had Other Ideas.
By Michael Powell

The proposition seemed tailor-made for one of the nation’s most diverse and liberal states. California officials asked voters to overturn a 24-year-old ban on affirmative action in education, employment and contracting.

The state political and cultural establishment worked as one to pass this ballot measure. The governor, a senator, members of Congress, university presidents and civil rights leaders called it a righting of old wrongs.

“Women and people of color are still at a sharp disadvantage by almost every measure,” The Los Angeles Times wrote in an editorial endorsement.

Yet on Election Day, the proposition failed by a wide margin, 57 percent to 43 percent, and Latino and Asian-American voters played a key role in defeating it. The outcome captured the gap between the vision laid out by the liberal establishment in California, which has long imagined the creation of a multiracial, multiethnic coalition that would embrace progressive causes, and the sentiments of many Black, Latino, Asian and Arab voters.

Variations of this puzzle could be found in surprising corners of the nation on Election Day, as slices of ethnic and racial constituencies peeled off and cut against Democratic expectations.

“We should not think of demography as destiny,” said Professor Omar Wasow, who studies politics and voting patterns at Princeton University. “These groups are far more heterogeneous than a monolith and campaigns often end up building their own idiosyncratic coalition.”

The unanswered question is whether the 2020 election will be a one-off, the voting patterns scrambled by an unusually polarizing president who attracted and repelled in near equal measure. If it signals something larger, political scientists noted, some Latino and Asian voters might begin to behave like white voters, who have cleaved along class lines, with more affluent residents in urban areas voting Democratic while a decided majority of rural and exurban residents support Republicans.

“This is the challenge for liberal Democrats,” Professor Wasow said. “In a diverse society, how do you enact politics that may advance racial equality without reinforcing racial divisions that are counterproductive and hurt you politically?”

‘The Far Left Is the Republicans’ Finest Asset’
By Thomas B. Edsall

Dane Strother, a Democratic consultant whose firm has represented candidates in states from New Hampshire to Montana, was more outspoken in his view:

Four years ago, Democrats’ final messaging was “which bathroom one could use.” This year it was Defund the Police. The far left is the Republicans’ finest asset. A.O.C. and the squad are the “cool kids” but their vision in no way represents half of America. And in a representative democracy 50 percent is paramount.

Bernard Grofman, a political scientist at the University of California-Irvine, shares Strother’s assessment but is still more assertive in his belief that the far left has inflicted significant damage on Democratic candidates. He wrote by email:

“Defund the police” is the second stupidest campaign slogan any Democrat has uttered in the twenty-first century. It is second in stupidity only to Hillary Clinton’s 2016 comment that half of Trump’s supporters belong in a “basket of deplorables.”

Moreover, Grofman continued,

the antifa “take back the neighborhood” occupation in Seattle, where a part of the city became a police no-go zone, with the initial complicity of Democratic officeholders, hasn’t helped either, especially after someone was killed within the zone. That allowed the Democrats to be seen as in favor of antifa and, worse yet, to be portrayed as in favor of violence.

Even more damaging, in Grofman’s view,

have been the scenes of rock-throwing demonstrators and boarded-up stores that Republicans have regularly used for campaign fodder and that were a long-running story on Fox News. Every rock thrown, every broken window, is one more Republican vote.

Darren Kew, a professor in the University of Massachusetts-Boston Department of Conflict Resolution, pointed out that the internal tensions within the Democratic Party are exacerbated by polarization between the parties: “Political culture is often that part of the system that is hardest to see — the values, norms, and patterns of behavior that govern our actions within the context of institutions — but it’s the glue that holds it all together,” Kew wrote by email, noting that

20-30 percent of Americans on either end of the political spectrum are getting their information from highly politicized sources and are therefore not agreeing on the basic facts of whether an event has even happened or not.

Eitan Hersh, a political scientist at Tufts and the author of the book “Politics Is for Power,” is not persuaded of the good faith and ultimate commitment of the affluent left. In addition to arguing that “moderate Democrats don’t want their brand tied to progressive policy priorities,” Hersh questioned the depth of conviction of the so-called progressive elite:

Many of the supporters who say they want big liberal policies at the national level don’t really mean it. Take, for example, well-to-do liberals in fancy suburbs who say they prioritize racial equality but do not actually want to level the playing field in educational opportunities between their districts and majority-minority districts.

He cited his own state, Massachusetts:

Here there’s tons of liberal energy and money to support taking big progressive fights to Washington. Meanwhile, our schools are segregated, our transit system is broken, our housing is unaffordable, our police force is a mess of corruption and there’s little pressure being put on the state legislature and governor to fix any of it.

What, Hersh asks, are we “to make of all this?” His answer: “The push for big progressive policy is something of a facade.”

Dani Rodrik, a Harvard economist, suggests that any reconciliation of the Democratic Party’s internal conflicts requires an upheaval in contemporary liberal thinking. In “The Democrats’ Four-Year Reprieve,” an essay published Nov. 9 on Project Syndicate, Rodrik argues that the central question is:

How did Donald Trump manage to retain the support of so many Americans — receiving an even larger number of votes than four years ago — despite his blatant lies, evident corruption, and disastrous handling of the pandemic?

It is clear, Rodrik continued, that "the election does not resolve the perennial debate about how the Democratic Party and other center-left parties should position themselves on cultural and economic issues to maximize their electoral appeal."

What is also apparent, in Rodrik’s view, is that “Political leaders on the left need to fashion both a less elitist identity and a more credible economic policy.”

Parties on the left everywhere, he continued,

have increasingly become the parties of educated metropolitan elites. As their traditional working-class base has eroded, the influence of globalized professionals, the financial industry, and corporate interests has risen. The problem is not just that these elites often favor economic policies that leave middle and lower-middle classes and lagging regions behind. It is also that their cultural, social, and spatial isolation renders them incapable of understanding and empathizing with the worldviews of the less fortunate.

In an email, Rodrik wrote:

The first priority of the Democratic Party ought to be to have a sound program for economic transformation — one that promises to increase the supply of good jobs for all, including the lagging regions of the country.

Unmet Job Expectations Linked to a Rise in Suicide, Deaths of Despair
By Rachel White

The study showed that men who expected to work in jobs that did not require a college degree but later faced declines in the job market were nearly three times as likely to suffer early deaths by suicide and drug poisoning as men who sought work that required a bachelor’s degree.

Early death from self-injury has risen dramatically in recent decades, especially among middle-aged white men, whose deaths by suicide and drug poisoning increased by 9 and 31 per 100,000 (respectively) between 1980 and 2013. At the same time, the labor market also experienced a disconcerting trend: the decline of well-paying jobs that do not require a college degree.

Researchers from UT Austin, the University of Minnesota and the University of Wisconsin-Madison investigated the relationship between the two trends using data from the High School and Beyond cohort, a nationally representative sample of 11,680 men who were surveyed throughout high school in the early 1980s, again in 1992 (when they were 28-30 years old), and again in 2015.

Between 1992 and 2015, less than 6% of the sample had died. The researchers compared suicide and drug poisoning deaths, which are forms of self-injury, to other causes of early adult deaths, such as heart attacks and cancer. They found that the men most likely to suffer a death by suicide or drug poisoning were those who as adolescents expected to earn enough to support a family through some type of semi-skilled labor that later declined when they reached adulthood, such as manufacturing, mechanics and carpentry.

The study showed that neither educational attainment nor the actual job a man held increased the risk of death by self-injury. Furthermore, unmet occupational expectations were not associated with a higher risk of early death from natural or other causes. This comparison further strengthened the researchers’ conclusion that there is a link between the decline of working-class jobs and deaths of despair.

Analysis Finds Geographic Overlap In Opioid Use And Trump Support In 2016
By Paul Chisholm

It’s easy to see similarities between the places hardest hit by the opioid epidemic and a map of Trump strongholds. “When we look at the two maps, there was a clear overlap between counties that had high opioid use … and the vote for Donald Trump,” says Dr. James S. Goodwin, chair of geriatrics at the University of Texas Medical Branch in Galveston and the study’s lead author. “There were blogs from various people saying there was this overlap. But we had national data.”

Goodwin and his team looked at data from the Census Bureau, the 2016 election and Medicare Part D, a prescription drug program that serves the elderly and disabled.

In counties with higher-than-average rates of chronic opioid prescriptions, 60 percent of the voters went for Trump. In the counties with lower-than-average rates, only 39 percent voted for Trump.

A lot of this disparity could be chalked up to social factors and economic woes. Rural, economically depressed counties went strongly for Trump in the 2016 election. These are the same places where opioid use is prevalent. As a result, opioid use and support for Trump might not be directly related, but rather two symptoms of the same problem – a lack of economic opportunity.

To test this theory, Goodwin included other county-level factors in the analysis: unemployment rate, median income, rurality, education level, and religious service attendance, among others.

These socioeconomic variables accounted for about two-thirds of the link between voter support for Trump and opioid rates, the paper’s authors write. However, socioeconomic factors didn’t explain all of the correlation seen in the study.

“It very well may be that if you’re in a county that is dissolving because of opioids, you’re looking around and you’re seeing ruin. That can lead to a sense of despair,” Goodwin says. “You want something different. You want radical change.”

The ZIP Codes That Rule America
By Michael Lind

The donors to American politicians in all 50 states are concentrated in a few ZIP codes. According to OpenSecrets, of the ZIP codes that delivered the most campaign funding for the Democrats in 2020, not counting dark money or soft money for liberal groups, four of the top five were in New York City (10024, 10023, 10022, 10011), followed by Chevy Chase, Maryland (20815), a suburb of Washington, D.C. Other top Democratic ZIPs this year were Silicon Valley (94301 and 94022) and Cambridge, Massachusetts (02138). New York City was also overrepresented among donors to the Republican Party, whose donor base is more geographically diverse, with a lot of money coming from Dallas, Atlanta, Las Vegas, and Palm Beach, Florida.

A former Democratic senator from the Midwest told me a few years back why he got out of politics: “I got tired of the fundraising. They [the Democratic National Committee] give you a list of these rich people in New York, San Francisco, and L.A. No matter what state you’re from, you have to fly out there and grovel before them. They don’t know anything about your state or its people. All they care about is their pet issues.”

And the donors have far more influence on public policy in America than the voters do. In a famous article in Perspectives on Politics in 2014 titled “Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens,” Martin Gilens of Princeton and Benjamin I. Page of Northwestern tested four theories of how democracy works in America: majoritarian electoral democracy, economic-elite domination, majoritarian pluralism, and biased pluralism. They concluded that the data on whose preferences shape policy support the economic-elite domination theory: “When the preferences of economic elites and the stands of organized interest groups are controlled for, the preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy.”

The progressives who think that only the Electoral College and the malapportioned Senate are preventing the United States from adopting Swedish-style social democracy are living in a fool’s paradise. So are Republicans who think that the GOP answers to its voters, rather than its donors. Most big Democratic donors are neoliberals who do not want Medicare for All or strong labor unions, and most big Republican donors are libertarians who do not want restrictions on low-wage immigration or on multinational corporations’ access to cheap labor in China and elsewhere. Most politicians follow the preferences of their donors, not their voters, when there is a conflict. “Who buys my bread, his song I sing,” as the old saying goes. Or, if you prefer another proverb: “The Golden Rule is whoever has the gold rules.”

Tom Steyer: What I Learned While Running for President
By Tom Steyer

Many felt disconnected and left behind by the political establishment and elites in New York and Washington.

Most people I met felt that the government was broken and that their vote didn’t count because of a corporate stranglehold on our democracy.

Meeting Americans has reinforced my sense of deep governmental failure. Whether it’s the warp-speed gentrification of Charleston, the homeless problem in Los Angeles or water pollution in Denmark, S.C., Americans deserve so much better from their government. We can afford it. We know better. And it’s the right thing to do.

And people ache for a democracy they can believe in. During my campaign, I tried to call out the intertwined elites in the media, the Beltway and corporate America, all of whom are thriving at the expense of the American people. They don’t want to change a single thing. Corporate America has truly bought our democracy, and people across this country are suffering every day because of it.

The campaign reinforced my deep misgivings about how the elite media, political insiders and big corporations have an impact on our democracy. I watched as the Democratic National Committee ignored public appeals from me and others to change the debate process to ensure a more diverse group of candidates made the stage, only to change the rules a few weeks later to allow Mike Bloomberg to participate in the debates.

Bloomberg gets lukewarm response from Democrats as he seeks new role in campaign to oust Trump
By Michael Finnegan

It took Michael R. Bloomberg just over 100 days to spend $1 billion on a Democratic presidential campaign that yielded little more than a token victory in American Samoa.

Three months after his campaign’s collapse, the former New York mayor is now playing a lower-profile — but still crucial — role in the party’s drive to oust President Trump in November.

Bloomberg, the party’s top donor for the 2020 elections, plans to spend heavily this summer and fall on anti-Trump advertising in presidential battleground states to support Joe Biden’s candidacy, according to a senior advisor. He also expects to pour large sums into helping Speaker Nancy Pelosi maintain Democrats’ majority in the House.

At the same time, a tech start-up that Bloomberg founded last year is trying to get hired by Biden’s campaign and the party to gather and analyze data to target appeals to potential Democratic voters. But the New York firm, Hawkfish, has drawn staunch opposition from progressives and party technology experts.

Bloomberg paid Hawkfish $69 million for work on his run for president, but its only other experience was on relatively small campaigns in Kentucky and Virginia.

Bloomberg, 78, spent more than $100 million on Democratic campaigns in 2018, including those of 21 candidates who won House seats formerly held by Republicans. His anti-gun group, Everytown for Gun Safety, was instrumental in helping Democrats win control of Virginia’s Legislature last year for the first time in a generation.

It was data management that produced much of Bloomberg’s estimated $60-billion personal fortune. Bloomberg LP, the company he formed in 1981, started off as a business selling financial-data terminals to investment firms around the world and expanded into a major media company.

The Biden team’s recent hiring of top digital staff from the campaigns of Sens. Elizabeth Warren of Massachusetts and Kamala Harris of California has eased concerns that Bloomberg would gain too much control over the party’s data if Hawkfish is retained.

Future candidates seeking an edge in races up and down the ticket will be looking to use tools developed under Biden, along with the data files built out on tens of millions of voters.

Those data files are some of the most valuable currency in politics. On the Republican side, the network of advocacy groups backed by billionaire activists Charles Koch and his late brother David consolidated its power in the party in the pre-Trump era by contracting with down-ballot candidates to build voter data files for their campaigns.

The data collected by the Koch network opened a rift within the party after Trump won the 2016 presidential nomination and the Koch group did not back him. By 2018, the Republican National Committee under Trump was imploring donors and candidates not to work with the Koch network, complaining it had weaponized its influence over voter data, leaving vulnerable candidates who did not adopt the Kochs’ agenda.

Smith, who worked for Buttigieg, said Trump’s campaign has shown “just how powerful big data is in American politics. Whoever has the data is going to be able to dictate the terms for the next few election cycles…. I cannot imagine putting a billionaire between the Democratic Party and the voters will work out well for us.”

Big Donors Spent Heavily on Failed Election Efforts
By Julie Bykowicz and Tarini Parti

Former New York City Mayor Michael Bloomberg followed his $1 billion failed Democratic presidential bid with a $100 million effort to turn Florida blue—only to see the state award President Trump a wider margin of victory than four years ago.

In an election that cost an estimated $14 billion, Mr. Bloomberg is one of dozens of wealthy individuals who saw arguably poor returns on some political investments. That doesn’t mean they’ll stop spending, representatives for many of them said.

The 2020 election also underscored that money alone doesn’t win. Multiple failed candidates set fundraising records, and small-dollar donors also contributed to efforts, some of which were unsuccessful. The 2020 election is on track to cost about twice as much as the 2016 race, according to an estimate from the Center for Responsive Politics.

In Senate races, the candidate who spent more won about 72% of the time, a record low in recent years, another analysis by the center found.

“Money’s important, resources are important,” said Mr. Wolfson. “But, you know, nothing is more valuable than a good candidate running a good campaign.”

Mr. Bloomberg spent more than $1.3 billion on the 2020 election—largely on his own presidential bid and on boosting Mr. Biden in Florida.

Mr. Bloomberg zeroed in on Florida because few political groups can commit the money needed to advertise effectively in a state with costly media markets, Mr. Wolfson said.

The state also was expected to—and did—process ballots more quickly than Midwestern states, so if Mr. Biden had won there, it could have made it clear on election night that the president had lost, Mr. Wolfson said.

“It would have been great to win Florida,” he said. “But, you know, we are absolutely convinced that by investing in Florida in the way that we did, we forced the Trump campaign to throw resources away from the Midwestern battleground states into Florida in a way that hurt Trump’s ability to compete in Pennsylvania, Wisconsin and Michigan.”

Advertising data show that after Mr. Bloomberg announced his plan, the Trump campaign reduced ad spending in some Midwestern states and boosted its Florida buys.

US elections are bought. And the people paying don’t want the same things we do
By Zaid Jilani

Almost US$6.5bn was spent on presidential and congressional campaigns during the 2016 US elections. Contrary to the idea that the party faithful fund their preferred candidate, some of the largest donors during that election cycle were a small handful of billionaires.

Republican casino magnate Sheldon Adelson and his wife spent as much as $82m – eclipsed only by Tom Steyer, a Democratic billionaire who spent $90m.

Steyer later went on to self-fund a 2020 presidential campaign, pouring $342m into his own bid; former New York City mayor Michael Bloomberg spent even more, putting as much as $1bn into his own doomed bid for the Democratic Party’s 2020 presidential nomination.

Though neither Steyer nor Bloomberg was successful in his quest for the top job, the amount of money at their disposal enabled them to be competitive candidates, while others were forced to drop out not for lack of ambition or vision but for lack of funds.

Beyond the billionaire class, only a tiny sliver of US Americans give the lion’s share of political money. Most Americans don’t give to campaigns – just 0.52% of the American population gave a donation of over $200 during the 2016 presidential election.

The sums of cash involved and the small percentage of contributors raise questions about exactly how much influence donors have on the political process. If you’re reliant on a small group of wealthy people to get in office and stay there, you’re probably pliable to their opinions.

New research, published in April 2020, looks at how the views of donors differ from those of the voters who form the base of the two major US political parties.

Its authors, UC Berkeley political scientist David Broockman and Stanford University researcher Neil Malhotra, used an original sample of 1,152 partisan donors – that is, donors who give to only one of the two major parties – and compared it to surveys of partisan voters from both major parties. Collectively, the respondents to their survey have contributed $17.2m to campaigns since 2008.

Broockman and Malhotra discovered that on many major issues, the position of donors stood in stark contrast to that of the voters.

For instance, Democratic donors were considerably more socially liberal than Democratic voters on issues such as abortion, where donors favoured more expansive abortion rights and fewer restrictions on the practice.

On the other hand, on a wide range of economic issues, Republican voters were considerably more in favour of government aid to the poor, higher taxes on the wealthy, and government support for universal healthcare than Republican donors.

Here’s what that looks like in practice: in the spring of 2014, the US Senate – Congress’s upper legislative body – blocked an increase in the minimum wage to $10.10 an hour, with all but one Senate Republican uniting to vote against the bill. However, polling done around the same time revealed that 42% of self-identified Republicans wanted to see the minimum wage increased to that rate.

Staying with the Republican party, the same trend is noticeable on an issue that might surprise you: gun control. When you poll Republican voters, most support universal background checks for gun purchases, a reform that Republican lawmakers have actively blocked in Congress. Broockman and Malhotra’s study did find that Republican donors tended to be much more right wing on gun control issues than Republican voters.

It seems on issues where the American people may very well want the same thing across party lines, their own elected representatives are standing in the way of consensus, siding instead with the elite. “It likely exacerbates polarisation,” Malhotra told me. “Look at how far apart the donors are on the issues.”

The “racial wealth gap” is a class gap
By Matthew Yglesias

For his master’s thesis, Kevin Carney took a detailed look at the evolution of the black/white wealth gap in the United States and among other things came away with this finding — if you lop off the richest quarter of white people, then suddenly Black and white wealth dynamics over time look very similar.

The infamous destruction of African-American wealth during the subprime mortgage crash, for example, also happened for the majority of white households. The reason the racial wealth gap grew during this period is that rich white people own a lot of shares of stock while everyone else’s wealth is in their homes (if it exists at all).

Another way of looking at this is that while most white people are not members of the economic elite, the economic elite is a very white group of people.

But if you’re concerned about the economic disparity between white people and Black people, what you really ought to be concerned with is the disparity between rich people and non-rich people. You obviously don’t want to narrow the gap in an economically destructive way. But if you can find growth-friendly ways to redistribute resources, you mechanically improve the racial gap. And even better, you have a tractable political problem — most voters are white, but most voters are not rich. And white people are overrepresented in the Senate, but rich people are underrepresented. So if you try to build a politics around racial redistribution, you’re just going to lose. But if you try to build a politics around economic redistribution you just might win.

Donald Trump’s ruinous legacy
By Michael Tracey

Compare the final Democratic messaging of the campaign: Warnock and Ossoff stressed that among their first actions would be to approve the $2,000 Covid relief payments, and Biden travelled to Georgia on Monday to underscore his own support for this initiative. Perdue and Loeffler, conversely, couldn’t exactly tout a triumphant pledge to keep McConnell in power so he can carry on withholding cash payments from Americans during a pandemic and protracted economic contraction.

When Politics Isn’t All Personal
By Zeynep Tufekci

Leading up to the 2016 election, there were assertions that Trump supporters were motivated by “economic anxiety.” Critics said that such assertions were a cover for racism, arguing that the unemployment rate was low or the income data didn’t support claims about economic anxiety.

This debate is often presented as a dichotomy: Trump supporters must be either racist or economically anxious—as if, because they were not unemployed at the moment, the reason for their support of Trump must be solely or primarily racism.

But in reality, these two traits are neither always separate nor even neatly separable by any known analytic method. In fact, historically, they’ve long been fused in various versions in complicated ways that we missed going into 2016, and I believe were still missed going into 2020. The idea that we can simply measure either factor with polls or economic indicators is not correct: economic anxiety and status, for example, are not just about the unemployment rate at the moment but how one perceives the trajectory of their—and their families’—life chances. Unemployment can be quite low and economic anxiety also quite high at the same time. Racism also isn’t something that can be assessed based on what people say in response to a poll. And crucially, in people’s minds, loss of economic status can be tied to perception of loss of racial privilege and supremacy.

This column by New York Times writer Jamelle Bouie, for example, highlights something that’s often overlooked: the US government’s response to the pandemic included a sizable financial boost to poorer households. I’ve seen commentators say the aid was pathetic since the one-time payment was just $1200. But that claim overlooks a key part of the government’s response: an increase of unemployment benefits to $600 per week. That’s a staggering $2400 per month per unemployed person. The support was so substantial that some households dramatically improved their financial outlook during the pandemic, when the financial aid was in effect.

Bouie writes:

At the risk of committing the same sin as other observers and getting ahead of the data, I want to propose an alternative explanation for the election results, one that accounts for the president’s relative improvement as well as that of the entire Republican Party.

It’s the money, stupid.

At the end of March, President Trump signed the Cares Act, which distributed more than half a trillion dollars in direct aid to more than 150 million Americans, from stimulus checks ($1,200 per adult and $500 per child for households below a certain income threshold) to $600 per week in additional unemployment benefits. These programs were not perfect — the supplemental unemployment insurance, in particular, depended on ramshackle state systems, forcing many applicants to wait weeks or even months before they received assistance — but they made an impact regardless. Personal income went up and poverty went down, even as the United States reported its steepest ever quarterly drop in economic output.

As Bouie states, it’s perfectly reasonable to assume this economic aid may have bolstered Trump’s support among key groups—especially since he made sure to have the checks appear as if they were personally signed by him—an unprecedented move that may have even delayed the payments, but may also have had the intended impact.

Trump’s Economy Really Was Better Than Obama’s
By Karl W. Smith

Between December 2009 and December 2016, the unemployment rate dropped 5.2 percentage points, from 9.9% to 4.7%. By December 2019, it had fallen another 1.2 percentage points, to 3.5%. A cursory look at those numbers might lead you to believe that the improvement under Trump was at best a continuation of a trend that began nearly a decade earlier.

It’s necessary to place those numbers in context. By 2016, officials in the Treasury Department and at the Federal Reserve had concluded that the economy was at full employment and that further improvement in the labor market was unlikely. This was in line with the Congressional Budget Office’s guidance that further declines in the unemployment rate would push the economy beyond its sustainable capacity.

Once in office, Trump ignored this consensus. He implemented a program of tax cuts, spending increases and unprecedented pressure on the Fed to cut interest rates to zero and keep them there. Trump’s goal of 3% growth was derided as delusional, while a bipartisan chorus of commentators declared his policies reckless and irresponsible.

They were anything but. Not only did the unemployment rate continue to fall, but the percentage of Americans aged 25 to 54 either employed or looking for a job saw its first sustained rise since the late 1980s. This inflection point changed the character of the labor market.

In 2016, real median household income was $62,898, just $257 above its level in 1999. Over the next three years it grew almost $6,000, to $68,703. That’s perhaps why, despite the pandemic, 56% of U.S. voters polled last month said their families were better off today than they were four years ago.

The Biden Popular Front Is Doomed to Unravel
By Christopher Caldwell

Trump didn’t sell out his supporters. In fact, his presidency saw something extraordinary, even if it was all but invisible from the country’s globalized cities: the first egalitarian boom since well back in the twentieth century. In 2019, the last non-Covid year, he presided over an average 3.7 percent unemployment rate and 4.7 percent wage growth among the lowest quartile of earners. All income brackets increased their take. That had happened in the last three Obama years, too. The difference is that in the Obama part of the boom the income of the top decile rose by 20 percent, with tiny gains for other groups. In the Trump economy, the distribution was different. Net worth of the top 10 percent rose only marginally, while that of all other groups vaulted ahead. In 2019, the share of overall earnings going to the bottom 90 percent of earners rose for the first time in a decade.

The reasons for Trump’s success are not yet clear. They may well have involved his unorthodox policy choices: above all, limiting immigration. Whatever the reason, this equalization must be why Trump’s economic approval was over 50 percent at election time, even as his personal scores remained low. We can assume that the great demographic surprise of the election—Trump’s uptick among Black and Latino men—owed more to this wage progress than to Lil Wayne’s endorsement, or to Trump’s musing aloud that he had done more for Blacks in America than any president since Abraham Lincoln.

Considering the Senate, the journalist Ronald Brownstein made a striking observation in the wake of the Biden victory. As recently as the Reagan administration, he pointed out, the Senate hovered above partisanship: the states Reagan won twice had almost as many Democrats as Republicans. But in the 25 states that voted for Trump twice, 47 of the 50 senators are Republicans. In the 20 states that voted against Trump twice, 39 of the 40 senators are Democrats. (The exception is Susan Collins of Maine.)

Brownstein frames the 2020 election as a clash between “the voters who embody the nation’s future” and “those who feel threatened by it.” While Brownstein is correct sociologically, it is worth noting how moralistic this is as a description. The conflict is not between two visions of America but between two peoples, one deserving (in fact, America incarnate), the other undeserving (or anti-American).

What appears to await us is a twenty-first-century version of a historical process familiar from the nineteenth: Longstanding traditions are undermined when only part of a country is able to take advantage of new technological possibilities.

The Silenced Majority
By Rana Dasgupta

Eighteenth-century Britain had two economies. The first was that of domestic production. Agriculture supplied a living for most of the population, three quarters of which was dispersed, on the eve of the Industrial Revolution, across thousands of rural communities. Most urban workers also derived their income from agriculture: food, textiles, and leather made up three quarters of British manufacturing. Commerce and transport constituted 40 percent of the service sector; most of the rest consisted of real estate rental and domestic services. From 1700 to 1780, this economy grew between half a percent and 1 percent per year; beginning in 1780, it approached 2 percent.

The second economy was fueled by Britain’s industrial and imperial expansion. This was largely financialized: investors participated by trading debt, company shares, and commodities on the London exchange. Some investments were domestic, but the most fantastic gains derived from new schemes to privatize the globe. The great majority of British investment capital was bound up in overseas trading monopolies, which used it to occupy territory and turn anything that could be considered an asset (land, industry, tax revenue, luxury goods, human beings) into corporate property. Shares in the East India Company could yield returns of 30 percent or more per year, and Jamaican sugar plantations rapidly produced some of the world’s largest fortunes. The scale of the second economy was so enormous—by the end of the eighteenth century, the East India Company alone generated some 15 percent of British GDP—that it could not remain merely a commercial space. These trading monopolies remade the British state in their own image, forcing it to expand in order to secure their worldwide interests. They transformed politics too—not only for British people, but for everyone.

Two aspects of these trading monopolies are crucial here. First, they served elites almost exclusively. Shareholders in the East India Company numbered fewer than two thousand; there were around two hundred shareholders in the Royal African Company, which controlled the West African trade in gold, silver, timber, and human beings. These were Britain’s landowners and bureaucrats, the same people who controlled the fundaments of the first economy. The British masses did not possess the capital necessary to invest, nor did they have access to the relevant information networks. (Parliament was the most important of these, which is one reason that buying a seat in the House of Commons was such a worthwhile investment.) Second, the trading monopolies made the growth of elite fortunes eerily independent of the British population at large. British workers contributed little to their revenues. A regular supply of maritime and military men was required, to be sure, but their numbers were negligible compared with the profits, and their condition was close to slavery: when they survived, they were often kept from desertion by the constant deferment of payment.

It should be clear why this system endured. The general population was dispersed, victimized, and politically disorganized. Since the oligarchy derived its wealth from outside the country, moreover, it had only limited interest in the political satisfaction of the British masses, and could use aggressive penal measures to keep them in check.

Facebook possesses the greatest potential to restore the political balance of the eighteenth century. Its inflated market capitalization is based not only on its future earnings, but also on its capacity for global political management. In this sense, it is laughable to worry only about Russians or Chinese infiltrating American politics: it is already fully infiltrated by Big Data. We have long envisioned the end of democracy as something out of a Hollywood dystopia; it will not be starkly militaristic, however, but cool and convenient in the Silicon Valley style. Democracy will not be repealed so much as rendered inconsequential and incorporated into a mightier system of social and ideological management—as humans learn, for instance, that they can outsource to machines not just their memories and their friendships, but also their political opinions. Instead of mass rallies and totalitarian cults, society will fragment into mutually incomprehensible bubbles, and only celebrity will possess the transcendental power necessary to deliver electoral numbers. Celebrity, of course, is essentially a Big Data product today. We may soon realize that what Kim Kardashian and others have really been up to is building constituencies.

Tech firms will not just transform the nature of democratic access. Like their eighteenth-century predecessors, they will alter the nation-state itself, placing ever more of its functions under unelected control. It has been clear since what one commentator calls “the first Sino-Google conflict of 2009” that Big Data is quite different from, say, Big Auto or Big Oil, which bully and bend the state but ultimately share its organizing principles. Having quickly driven out other forms of social participation, Silicon Valley offers new political and economic arrangements that are irreconcilable with the old. Antitrust measures by the U.S. government against them are, in this sense, epochal: they represent not just regulation as usual but a battle between competing forms of life. And if the state has the power to discipline the corporations, the reverse is also true. As the state has become increasingly dependent on Silicon Valley for many of its core activities—mapping, law enforcement, immigration control, warfare—it has betrayed many of the principles on which its legitimacy was previously based, such as privacy. Silicon Valley’s global influence is waxing just as America’s conventional imperial power wanes; there will certainly be those in government who would prefer to exploit the power of these monopolies than break them up.

Over time, tech firms will build a more complete social and economic system, competing more aggressively with the state and further diminishing its ability to deliver material assistance to its citizenry. They are already attempting to end the state’s monopoly on issuing currency. In the present moment of state profligacy, cryptocurrency evangelists are urging investors to flee national currencies for the security of what they propagandistically call “decentralized finance” (the aggressive tenor of the abbreviation—“DeFi”—is not accidental). Apple and Google already offer banking services, while Facebook’s embryonic currency, the libra, could provide a means of exchange for two billion Facebook, Instagram, and WhatsApp users. (WeChat Pay and Alipay already perform this function for China’s private online societies.)

The neoliberal revolution aimed to restore the supremacy of capital after its twentieth-century subjugation by nation-states, and it has succeeded to an astonishing degree. As states compete and collude with gargantuan new private powers, a new political world arises.

The Wincott Memorial Lecture
By Yanis Varoufakis

Since the late 1890s, the rise of networked mega-corporations, of the Edisons and the Fords, created big business cartels investing heavily in how to usurp states and replace markets.

In their wake, megabanks were fashioned to finance the megafirms and, in the process, filled the world with fictitious money resting upon mountain ranges of impossible debt. Together, captains of industry and masters of finance accumulated war chests of billions with which to pad campaigns, capture regulators, ration quantities, destroy competitors and, in this manner, control prices. The first time the inevitable crisis hit that audacious superstructure was, of course, in 1929.

John Kenneth Galbraith was once asked how he went about, as FDR’s ‘Price Czar’, fixing countless prices during the War Economy. He answered: “It was pretty easy, considering that they were already fixed!” Through interminable mergers and acquisitions, corporations had replaced markets by a global Technostructure (Galbraith’s term) oozing with the power to shape the future for themselves and in their image.

For too long we lived under the illusion of world capitalism as a small-town, front-porch community rather than the weaponised Soviet-like (or maybe Google-like) planning system that it is. The larger the Technostructure grew the larger the financial sector necessary to conjure up the fictitious capital needed to fund its largesse. Bretton Woods was a remarkable attempt to reclaim political power on behalf of our societies and to stabilise the Technostructure.

When Bretton Woods died, officially on 15th August 1971, and financialisation became a necessity for financing the increasing deficits of the American Hegemon keeping global capitalism quasi-balanced (the Global Minotaur, as I called it), capitalism’s global imbalances were turbocharged. Before we knew it, General Motors turned into a huge hedge fund that also produced some cars on the side while, across the West, the tug-of-war between profits and wages was supplemented by the workers’ struggle for credit.

By the middle of the noughties, out of the one hundred wealthiest entities on Earth, sixty-five were financialised corporations, not states. How could anyone expect them to operate in sync with society’s values and priorities – whatever those might be? Even the prospect of environmental catastrophe cannot convert such a highly concentrated, obscenely powerful power grid into the agent of our collective will.

Then came 2008. It proved that, even when the overheated Technostructure-on-financial-steroids melts down, its stranglehold over society grows in proportion to the black holes in its accounting books. In a fascinating inversion of Darwinism, the larger the corporations’ failure and the steeper their financial losses, the greater their capacity to appropriate society’s surplus via gargantuan bail-outs that their political agents push through neutered parliaments.

Capitalism, thy name has become Bankruptocracy: Rule by the most bankrupt of bankers. Democracy, in this context, resonated like a cross between a fond memory and a cruel joke.

Economists, taking their cue from Adam Smith, believe that at the root of all conflict there is scarcity. But, under the Technostructure which long-ago usurped Adam Smith’s world, the direction of causality is reversed. It is not dearth that necessitates exploitation today. It is exploitation, of humans and nature, that causes dearth.

This reversed causality is why the prevailing price of labour leaves millions under-employed, the destruction of the planet is ‘free’ and the price people pay for money is the loss of their soul.

Liberals demand a strong fence protecting our private sphere from a busy-body external world eager to interfere with our hopes and dreams. They say that our desires are no one’s business but our own. They believe we should all live within a safe haven where we can be sovereign and free to develop as individuals before relating with others, before leasing ourselves to an employer on mutually agreed terms and always on the understanding that the property rights over a person are non-tradeable. In short, inalienable self-ownership.

The first breach of the liberals’ essential fence appeared when industrial products became passé. Richard Branson had captured that moment with a statement that made William Morris spin in his grave: Who produces stuff and how does not matter one bit. Only brands matter now, proclaimed Sir Richard. Before long, branding took a radical new turn, imparting personality to objects, boosting consumer loyalty and, of course, the Technostructure’s profits.

Before they knew it, people felt compelled to re-imagine themselves as brands. The Internet allowed colleagues, employers, clients, detractors and ‘friends’ constantly to survey one’s life, putting pressure on each to evolve into a profile of activities, images and dispositions that amount to an attractive, sellable brand. Our sovereign personal space is now almost gone. The right to a time during the day when we are not for sale has vanished. Our liberty’s wetlands have been drained, its habitat destroyed.

Young women and men lacking a trust fund thus end up in one of two dead-ends. Condemned to working under zero-hour contracts and for wages so low that they must work all hours to make ends meet, rendering ridiculous any talk of personal time, space, or freedom. Or they must invest in their own brand every waking hour of every day, as if in a Panopticon where they cannot hide from the attention of those who might give them a break.

Disposable People
By Musa al-Gharbi

In New York City and throughout the country, the professional-managerial class is hunkered down and making the best of a bad situation: working remotely, enjoying time with their families, making sure their children stay up on their schoolwork, finding ways to work out, exercising self-care, and catching up on all the shows they’ve wanted to binge-watch. This could be told as a story about the wonders of technology and capitalism. Social media, communication platforms, delivery services, and streaming entertainment make life under quarantine more bearable and productive. But such a narrative would miss the main story.

Consider the stay-at-home shopper’s lifeline, Amazon: orders are translated into parcels that arrive at one’s doorstep thanks to tens of thousands of fulfillment center workers—disproportionately people of color—who, even under normal circumstances, work under immense strain and deplorable conditions. The packages are then transported to metropolitan hubs by undercompensated truckers. They ultimately arrive at one’s home by means of overburdened post office workers and delivery contractors.

Or, think about food delivery. To receive prepared food, whether the orders are called in to a restaurant or placed through an app like GrubHub or DoorDash, “back of the house” restaurant workers must be on duty to prepare the meal. These workers are also disproportionately people of color, often undocumented immigrants, and typically paid poorly. Receiving groceries at home through platforms like Instacart similarly requires an army of poorly compensated workers to process orders and gather merchandise. In either case, delivering the food is grueling, high-pressure work; deliverers are exposed daily to road hazards, the elements and, occasionally, predation. Wages tend to be very low and benefits nonexistent. Within urban areas, the people filling these jobs are disproportionately immigrants and minorities.

That is to say, the relative ease and comfort that many in the professional-managerial class are experiencing during the pandemic—ostensibly a result of digital platforms like Amazon, Instacart, and GrubHub—is actually the product of thousands of low-paid “invisible” workers who are paying the costs, and exposing themselves to considerable risk, on behalf of those who are better off.

Of course, in the midst of the pandemic these workers may also want to spend time with their families, or ensure their kids stay up on schoolwork. They may want to indulge in self-care or to stream their favorite shows. But that’s not an option for many of these low-income laborers. They cannot afford to stop working, even if they are concerned about health implications of continuing to show up. Even if they are worried about spreading the illness to elderly relatives who often live with them. Even if they are showing symptoms of infection.

The recently passed Families First Coronavirus Response Act, which became law a week before the $2 trillion stimulus bill was signed by the president last Friday, provides two weeks of paid leave for workers who are sick or whose children are out of school. However, it exempts those who work for large companies (like McDonald’s and Amazon) or for some small businesses (like local restaurants) and many classified as self-employed (like some gig workers)—ensuring that these laborers will continue risking their health to provide necessary services so relatively well-off American citizens can continue to comfortably shelter in place.

In New York City, children of those classified as “essential personnel”—first responders, medical professionals, sanitation and transit workers, and some other civil servants (later expanded to include some grocery store employees and pharmacists)—are being provided supervision and care at city-run “regional enrichment centers.” But there is another group of workers recognized by New York as providing “essential services”—cooks preparing food, fulfillment center workers, and those making deliveries, among others—and their children have largely been left to fend for themselves. The services are classified essential, yet the personnel providing these services are not. Indispensable labor; disposable people.

Yet at least they still have their jobs. Many others cannot afford to stop working but have been laid off nonetheless, or have seen their clientele abandon them. Some of these workers can file for unemployment, but others cannot—including non-citizens, those who work in the informal economy, and many of those who are “self-employed.” The forthcoming financial stimulus checks, similarly, apply to taxpayers only—excluding those whose labor is “off the books.” These people are simply out of luck.

Stories like these may shock the conscience, yet few of these dynamics are unique to the coronavirus pandemic—they are simply made more visible now. Contemporary urban elites’ lifestyles are fundamentally predicated on a pool of vulnerable, desperate people who will do whatever kinds of tasks are required, whenever they are required, for whatever compensation is available. They will cater to the idiosyncratic preferences of the professional-managerial class, tolerate mistreatment with a smile, and show up to work regardless of what is going on in their lives.

What are Uber and Lyft drivers, for instance? They are chauffeurs for people who cannot deign to drive themselves around, take public transportation, or even exert the minimal effort of hailing a yellow cab. They provide members of the professional-managerial class with the experience, formerly available only to the rich, of having someone available at their beck and call to transport them privately wherever they want to go, whenever they want to go there. How has this service been rendered affordable to white-collar professionals? Rideshare companies outsource expenses like insurance, gas, vehicle acquisition, and maintenance to the drivers themselves, offer few benefits and such low compensation that contractors typically have to work well over full-time just to make ends meet if this is their only “gig.” Despite these measures, rideshare companies consistently operate at a loss, propped up by vulture capitalists who quite explicitly plan on radically jacking up fares as soon as taxis have been effectively killed off.

Perhaps the most valuable service these companies provide to clients is that, in serving as a middleman between the drivers and the customers, they eliminate any sense of responsibility for their laborers among those who consume their services.

The Gig Economy Is Coming for Your Job
By E. Tammy Kim

In the microcosm of the hotel, the app economy has expanded choices for some (the guests) and shrunk options for others (the workers).

These currents in hospitality represent a subtle, sneaky form of technological displacement, courtesy of the gig economy. They’re not robots stepping in for humans on a factory floor, but rather smartphone-based independent contractors and supplemental “cobots” (a portmanteau of “co-worker” and “robot”) chipping away at the careers of full-time and in some cases unionized employees.

In the beginning of the gig economy, people most feared one-to-one job loss: An Uber driver comes in, a taxi driver goes out. And taxi drivers have indeed lost their livelihoods — and taken their own lives. Yet many app workers are only part-time, driving or TaskRabbit-ing to supplement their wages in a traditional job. App companies, for their part, deny that even full-timers are employees, perpetuating the fantasy that gig workers are solo entrepreneurs. It’s a business model that reduces everything to a series of app-enabled transactions, and calls it work, leaving what’s left of the welfare state to fill in the rest.

Aaron Benanav, a labor historian at the University of Chicago, explains that this process of “de-skilling” and misclassification is happening all over the world. The gig economy “is being used to replace skilled workers with less skilled, or continuing a process that’s happening all over the world of ‘disguised employment,’ where you bring in independent contractors to replace employees,” he said. “There’s an app for that” means that there’s less steady, reliable work for traditional employees.

How far has this sort of gig work spread? It’s hard to say. In 2017, the United States Bureau of Labor Statistics collected additional data on “contingent workers” for the first time since 2005. It found that only 1.3 percent to 3.8 percent of the work force is involved in independent and so-called gig work, though smaller surveys have shown that 35 percent of Americans do some “freelancing.”

What is clear is that the platform economy has further blurred the lines between who’s employed versus underemployed, unemployed or out of the labor market. And it’s not just a matter of numbers: People fear app-based gig work because it threatens the very concepts of boss and worker, governor and governed.

The conveniences of the app economy need not come with reckless disregard for working people. But only a broad-based fight for fair treatment and lawful classification can dismantle the ideology of labor built into Uber and its ilk: that all workers should be as productive and loyal as lifetime employees, and expect nothing in return.

For too many people today, 40-year tenure, generous health insurance, employer-funded education and pensions are a distant memory. That doesn’t mean, however, that they wish to float about, untethered to all but their smartphones. The robot future may be on its way, but what’s scarier still is the present being shaped by Silicon Valley.

How big tech is dragging us towards the next financial crash
By Rana Foroohar

Like most of the largest and most profitable multinational companies, Apple has loads of cash – around $210bn at last count – as well as plenty of debt (close to $110bn). That is because – like nearly every other large, rich company – it has parked most of its spare cash in offshore bond portfolios over the past 10 years. This is part of a Kafkaesque financial shell game that has played out since the 2008 financial crisis. Back then, interest rates were lowered and central bankers flooded the economy with easy money to try to engineer a recovery. But the main beneficiaries were large companies, which issued lots of cheap debt, and used it to buy back their own shares and pay out dividends, which bolstered corporate share prices and investors, but not the real economy. The Trump corporate tax cuts added fuel to this fire. Apple, for example, was responsible for about a quarter of the $407bn in buy-backs announced in the six months or so after Trump’s tax law was passed in December 2017 – the biggest corporate tax cut in US history.

As a result, the wealth divide has widened, a gap that many economists believe is not only the biggest factor in slower-than-historic trend growth but is also driving the political populism that threatens the market system itself.

That phenomenon has been put on steroids by yet another trend epitomised by Apple: the rise of intangibles such as intellectual property and brands (both of which the company has in spades) relative to tangible goods as a share of the global economy. As Jonathan Haskel and Stian Westlake show in their book Capitalism Without Capital, this shift became noticeable around 2000, but really took off after the introduction of the iPhone in 2007. The digital economy has a tendency to create superstars, since software and internet services are so scalable and enjoy network effects (in essence, they allow a handful of companies to grow quickly and eat everyone else’s lunch). But according to Haskel and Westlake, it also seems to reduce investment across the economy as a whole. This is not only because banks are reluctant to lend to businesses whose intangible assets may simply disappear if they go belly-up, but also because of the winner-takes-all effect that a handful of companies, including Apple (and Amazon and Google), enjoy.

In a low interest rate environment, with billions of dollars in yearly earnings, these high-grade firms were issuing their own cheap debt and using it to buy up the higher-yielding corporate debt of other firms. In the search for both higher returns and for something to do with all their money, they were, in a way, acting like banks, taking large anchor positions in new corporate debt offerings and essentially underwriting them the way that JP Morgan or Goldman Sachs might. But, it is worth noting, since such companies are not regulated like banks, it is difficult to track exactly what they are buying, how much they are buying and what the market implications might be. There simply is not a paper trail the way there is in finance. Still, the idea that cash-rich tech companies might be the new systemically important institutions was compelling.

I began digging for more on the topic, and about two years later, in 2018, I came across a stunning Credit Suisse report that both confirmed and quantified the idea. The economist who wrote it, Zoltan Pozsar, forensically analysed the $1tn in corporate savings parked in offshore accounts, mostly by big tech firms. The largest and most intellectual-property-rich 10% of companies – Apple, Microsoft, Cisco, Oracle and Alphabet (Google’s parent company) among them – controlled 80% of this hoard.

According to Pozsar’s calculations, most of that money was held not in cash but in bonds – half of it in corporate bonds. The much-lauded overseas “cash” pile held by the richest American companies, a treasure that Republicans under Trump had cited as the key reason they passed their ill-advised tax “reform” plan, was actually a giant bond portfolio. And it was owned not by banks or mutual funds, which typically have such large financial holdings, but by the world’s biggest technology firms. In addition to being the most profitable and least regulated industry on the planet, the Silicon Valley giants had also become systemically crucial within the marketplace, holding assets that – if sold or downgraded – could topple the markets themselves. Hiding in plain sight was an amazing new discovery: big tech, not big banks, was the new too-big-to-fail industry.

A trillion is no small sum: that is an 18th of the US’s annual GDP, much of which was garnered from products and services made possible by core government-funded research and innovation. Yet US citizens have not got their fair share of that investment because of tax offshoring. It is worth noting that while the US corporate tax rate was recently lowered from 35% to 21%, most big companies have for years paid only about 20% of their income, thanks to various loopholes. The tech industry pays even less – roughly 11-15% – for this same reason: data and IP can be offshored while a factory or grocery store cannot. This points to yet another neoliberal myth – the idea that if we simply cut US tax rates, then these “American” companies will bring all their money home and invest it in job-creating goods and services in the US. But the nation’s biggest and richest companies have been at the forefront of globalisation since the 1980s. Despite small decreases in overseas revenues for the past couple of years, nearly half of all sales from S&P 500 companies come from abroad.

How, then, can such companies be perceived as being “totally committed” to the US, or, indeed, to any particular country? Their commitment, at least the way American capitalism is practised today, is to customers and investors, and when both of them are increasingly global, then it is hard to argue for any sort of special consideration for American workers or communities in the boardroom.

The Starving State
By Joseph E. Stiglitz, Todd N. Tucker, and Gabriel Zucman

Today, multinationals shift close to 40 percent of their profits to low-tax countries around the world. Over the last 20 years, according to the economist Brad Setser, U.S. firms have reported growth in profits only in a small number of low-tax jurisdictions; their reported profits in most of the world’s major markets have not gone up significantly—a measure of how cleverly these firms shift capital to avoid taxes. Apple, for example, has demonstrated as much inventiveness in tax avoidance as it has in its technical engineering; in Ireland, the technology giant has paid a minuscule annual tax rate as low as 0.005 percent in some years.

It is not just corporations that engage in tax avoidance; among the superrich, dodging taxes is a competitive sport. An estimated eight percent of the world’s household financial wealth is hidden in tax havens. Jurisdictions such as the Cayman Islands, Panama, and Switzerland have structured their economies around the goal of helping the world’s rich hide their assets from their home governments. Even in places that don’t show up on international watch lists—including U.S. states such as Delaware, Florida, and Nevada—banking and corporate secrecy enable people and firms to evade taxes, regulation, and public accountability.

Unchecked, these developments will concentrate wealth among a smaller and smaller number of people, while hollowing out the state institutions that provide public services to all. The result will be not just increased inequality within societies but also a crisis and breakdown in the very structure of capitalism, in the ability of markets to function and distribute their benefits broadly.

Many policymakers, economists, corporate tycoons, and titans of finance insist that taxes are antithetical to growth. Opponents of tax increases claim that firms will reinvest more of their profits when less gets siphoned off by the government. In this view, corporate investment is the engine of growth: business expansion creates jobs and raises wages, to the ultimate benefit of workers. In the real world, however, there is no observable correlation between capital taxation and capital accumulation. From 1913 to the 1980s, saving and investment rates in the United States fluctuated but usually hovered around ten percent of national income. After the tax cuts of the 1980s, under the Reagan administration, capital taxation collapsed, but rates of saving and investment also declined.

The 2017 tax cut illustrates this dynamic. Instead of boosting annual wages by $4,000 per family, encouraging corporate investment, and driving a surge of sustained economic growth, as its proponents promised it would, the cut led to minuscule increases in wages, a couple of quarters of increased growth, and, instead of investment, a $1 trillion boom in stock buybacks, which produced only a windfall for the rich shareholders already at the top of the income pyramid. The public, of course, is paying for the bonanza: the United States is experiencing its first $1 trillion deficit.

Lower taxes on capital have one main consequence: the rich, who derive most of their income from existing capital, get to accumulate more wealth. In the United States, the share of wealth owned by the richest one percent of the adult population has exploded, from 22 percent in the late 1970s to 37 percent in 2018. Conversely, over the same period, the wealth share of the bottom 90 percent of adults declined from 40 percent to 27 percent. Since 1980, what the bottom 90 percent has lost, the top one percent has gained.

Economic elites are almost always the winners of any legislative or regulatory battle in which their interests might conflict with those of the middle class or the poor. The oil magnates the Koch brothers and other right-wing financiers have successfully built political machines to take over state houses and push anti-spending and anti-union laws that exacerbate inequality. Even rich individuals who are seen as more politically moderate—technology executives, for instance—tend to focus their political efforts on narrow technocratic issues rather than the distributional conflicts that define today’s politics.

What Happens When the 1% Go Remote
By Richard Florida

It doesn’t take very many one-percenters changing their address to wreak havoc on cities’ finances.

When the billionaire hedge funder David Tepper left New Jersey for Miami Beach in 2015, he left a crater in New Jersey’s budget that experts estimate was upwards of $100 million annually. (Interestingly enough, Tepper recently moved back home to the Garden State.) A whopping 80% of New York City’s income tax revenue, according to one estimate, comes from the 17% of its residents who earn more than $100,000 per year. If just 5% of those residents decided to move away, it would cost the city almost a billion dollars ($933 million) in lost tax revenue.
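A quick back-of-envelope check of this estimate, sketched under one assumption the article does not state: that the departing 5% pay a proportional slice of their group’s 80% share of revenue. All inputs come from the passage; the proportionality assumption is ours.

```python
# Figures quoted in the passage above.
high_earner_share = 0.80  # share of NYC income tax paid by residents earning >$100k
movers = 0.05             # fraction of that group assumed to move away
loss = 933e6              # lost revenue cited in the text, in dollars

# If the departing 5% paid a proportional slice of the group's 80% share,
# the implied total income-tax take R solves: movers * high_earner_share * R = loss.
implied_total = loss / (movers * high_earner_share)
print(f"implied total income-tax revenue: ${implied_total / 1e9:.1f}bn")  # $23.3bn
```

Under that proportionality assumption, the $933 million figure implies a citywide income-tax base of about $23 billion; if the leavers skew toward the very top of the group, a smaller base produces the same loss. Either way, the direction of the claim holds: a small number of high earners leaving is very costly.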

The large differentials in our current system of state and local taxation enable the mega-rich to save millions, and in some cases tens of millions or hundreds of millions of dollars a year, simply by moving from higher-tax states, most of them blue, to lower-tax states, which are typically red. Homesteading provisions like those in place in Florida do not even require them to spend a minimum number of days in the state. They just have to be careful not to spend too many days in the high-tax states like New York and California where their businesses are. New York’s threshold, for example, is 184 days.

Of course, at least in the short term, there are some cities that are benefiting from these migrations. And you can’t really blame the cities and states that are luring these people away. In a place like Miami Beach, where property taxes amount to 1.5-2% of the assessed valuation of a home, someone who buys a $30 million home will pay half a million dollars or more in annual property taxes. But it leaves the losers with large holes in their budgets.
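The Miami Beach figure is simple to verify; a minimal sketch using only the rate range and home price quoted in the passage:

```python
# Property-tax arithmetic from the passage: 1.5-2% of assessed value
# on a $30 million home.
home_value = 30_000_000
low_rate, high_rate = 0.015, 0.02

annual_tax_low = home_value * low_rate    # $450,000
annual_tax_high = home_value * high_rate  # $600,000
print(f"annual property tax: ${annual_tax_low:,.0f} to ${annual_tax_high:,.0f}")
```

The result, $450,000 to $600,000 a year, matches the passage’s “half a million dollars or more.”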

None of this spells the end of superstar cities like New York and San Francisco. In fact, many, if not most, of the New York hedge funders and Bay Area venture capitalists are moving just themselves, not their core businesses. Many of the companies they invested in — and their employees — will likely stay right where they are. Very little actual work or production is being relocated. What’s really changing are the addresses of those who own and control the capital.

In the long run, cities’ ability to attract new generations of innovative and creative talent will ensure their financial survival. But if policies don’t change, their budgets will suffer in the meantime, and their least-advantaged people and neighborhoods will bear the brunt of it as budget cuts and austerity measures eliminate key services. If things get bad enough — as they did in New York in the 1970s — it could take them quite a while to restore their budgets.

Antifragile states and global cities
By Branko Milanović

In a series of books, and especially in Antifragile, Nassim Taleb has introduced an important concept — that of being antifragile, referring to ‘things that gain from disorder’. ‘Fragile’ is, of course, the opposite: it connotes something that thrives under stable conditions but, being brittle, loses, and at times loses big, amid volatility. In the middle, ‘robust’ indicates resilience against uncertainty and turmoil, without the capacity to profit from it.

The contrast between antifragile and the two other categories relates to that between centralised, top-down formations (such as unitary states) and decentralised, bottom-up and more flexible, federal structures. As an example of the latter Taleb takes Switzerland, with its decentralised cantonal system and grassroots democracy.

But Switzerland is also antifragile in another sense. It has historically been a country that benefited from turmoil and disorder outside its borders — from wars, nationalisations, uncertain property rights and outright plunder. In all these cases, whether Jews were trying to save their property from ‘Aryanisation’, Chinese millionaires feared a revolution or African potentates needed a haven in which to park their loot, Switzerland offered the comfort of safety. It was (and is) the ultimate antifragile state: it thrives on disorder.

Big cities such as London, New York, Miami and Barcelona offer many of the services and amenities we find in small nation-states (asset protection, expert money-laundering) but in addition provide agglomeration externalities (increasing returns to scale thanks to the physical presence in the same place of many companies) and thriving housing markets. They too are antifragile.

This has implications for the political life of the nation-states where such cities are located. Global cities are increasingly linked to other global cities and other countries, less and less to their own hinterland. They are what Fernand Braudel called villes-monde.

They remind us of medieval cities, which were often more powerful than much larger states. The power of cities such as Venice and Genoa ended with the advent of the nation-states which became political, economic and military behemoths, absorbing city-states or relegating them to oblivion.

Globalisation is bringing them back, however. While nation-states politically and economically fragment, and in some cases (as with climate change) show themselves to be not the right loci to address a problem, the villes-monde thrive. Many already vote very differently from the surrounding areas: London had a solid anti-Brexit majority (60 per cent), Budapest, Istanbul and Moscow voted against their countries’ authoritarian leaders and New York is leading the ‘rebellion’ against its own citizen who is currently the president of the United States.

The important political question in the 21st century will be how a modus vivendi between the globalised large cities, and the elites living there, and the rest of their nations can be achieved.

Welcome To The ‘Turbulent Twenties’
By Jack A. Goldstone and Peter Turchin

… across history, what creates the risk of political instability is the behavior of elites, who all too often react to long-term increases in population by committing three cardinal sins. First, faced with a surge of labor that dampens growth in wages and productivity, elites seek to take a larger portion of economic gains for themselves, driving up inequality. Second, facing greater competition for elite wealth and status, they tighten up the path to mobility to favor themselves and their progeny. For example, in an increasingly meritocratic society, elites could keep places at top universities limited and raise the entry requirements and costs in ways that favor the children of those who had already succeeded.

Third, anxious to hold on to their rising fortunes, they do all they can to resist taxation of their wealth and profits, even if that means starving the government of needed revenues, leading to decaying infrastructure, declining public services and fast-rising government debts.

Such selfish elites lead the way to revolutions. They create simmering conditions of greater inequality and declining effectiveness of, and respect for, government. But their actions alone are not sufficient. Urbanization and greater education are needed to create concentrations of aware and organized groups in the populace who can mobilize and act for change.

Top leadership matters. Leaders who aim to be inclusive and solve national problems can manage conflicts and defer a crisis. However, leaders who seek to benefit from and fan political divisions bring the final crisis closer. Typically, tensions build between elites who back a leader seeking to preserve their privileges and reforming elites who seek to rally popular support for major changes to bring a more open and inclusive social order. Each side works to paint the other as a fatal threat to society, creating such deep polarization that little of value can be accomplished, and problems grow worse until a crisis comes along that explodes the fragile social order.

American exceptionalism was founded on cooperation — between the rich and the poor, between the governors and the governed. From the birth of the nation, the unity across economic classes and different regions was a marvel for European observers, such as St. John de Crèvecoeur and Alexis de Tocqueville. This cooperative spirit unraveled in the mid-nineteenth century, leading to the first “Age of Discord” in American history. It was reforged during the New Deal as an unwritten but very real social contract between government, business and workers, leading to another age of prosperity and cooperation in postwar America. But since the 1970s, that contract has unraveled, in favor of a contract between government and business that has underfunded public services but generously rewarded capital gains and corporate profits.

While this new neoliberal contract has, in some periods, produced economic growth and gains in employment, growth has generally been slower and far more unequal than it was in the first three postwar decades. In the last twenty years, real median household income has stagnated, while the loss of high-paying blue-collar jobs to technology and globalization has meant a decline in real wages for many workers, especially less educated men.

As a result, American politics has fallen into a pattern that is characteristic of many developing countries, where one portion of the elite seeks to win support from the working classes not by sharing the wealth or by expanding public services and making sacrifices to increase the common good, but by persuading the working classes that they are beset by enemies who hate them (liberal elites, minorities, illegal immigrants) and want to take away what little they have. This pattern builds polarization and distrust and is strongly associated with civil conflict, violence and democratic decline.

At the same time, many liberal elites neglected or failed to remedy such problems as opiate addiction, declining social mobility, homelessness, urban decay, the collapse of unions and declining real wages, instead promising that globalization, environmental regulations and advocacy for neglected minorities would bring sufficient benefits. They thus contributed to growing distrust of government and “experts,” who were increasingly seen as corrupt or useless, thus perpetuating a cycle of deepening government dysfunction.

Closing the representation gap
By Sheri Berman

During the postwar period, European centre-left parties had relatively clear economic profiles, based on the view that it was the job of democratic governments to protect citizens from the negative consequences of capitalism. Concretely, this entailed championing the welfare state, market regulation, full-employment policies and so on. Although centre-left parties tried to capture additional votes outside the traditional working class, their identities and appeals remained class-based.

In the late 20th century this began to change, as centre-left parties moved to the centre economically, offering a watered-down or ‘kinder, gentler’ version of the policies peddled by their centre-right competitors. By the late 1990s, as one study put it, ‘Social Democracy … had more in common with its main competitors than with its own positions roughly three decades earlier’. As centre-left parties diluted their economic-policy positions, they also began de-emphasising class in their appeals and their leaders increasingly came not from blue-collar ranks but from a highly educated elite.

In a shift that was less pronounced and less universal, at around the same time as centre-left parties moved to the centre economically, many centre-right parties moderated their positions on important social and cultural issues, including ‘traditional’ values, immigration and other concerns related to national identity.

Centre-right parties had generally taken conservative stances on these issues. Christian-democratic parties, for example, had viewed religious values as well as traditional views on gender and sexuality as crucial to their identity. In addition, many of these parties understood national identity in cultural or even ethnic terms and were suspicious of immigration and multiculturalism. But during the late 20th and early 21st centuries many shifted to the centre on national-identity issues, tempering or abandoning the communitarian appeals they had made previously.

Cumulatively, these shifts by centre-left and centre-right parties left many voters, particularly those with left-wing economic preferences and moderate to conservative views on immigration and related issues, without a party representing their interests. Such voters were heavily concentrated among the less well-educated and the working class, comprising about 20-25 per cent of the electorate in Europe (as well as in the United States).

To use categories popularised by Albert Hirschman, when a representation gap emerges and voters are dissatisfied with the political choices offered to them, they have two options: exit and voice. And indeed, over recent decades, less-educated and working-class voters have increasingly exited by abstaining from voting and other forms of political participation or exerted voice by shifting their votes to right-wing populist parties. They did so because these parties shifted their profiles as well, offering a mix of welfare chauvinism, conservative social and cultural policies and a promise to give voice to the ‘voiceless’—precisely to appeal to them.

Trump Won’t Be the Last American Populist
By Daron Acemoglu

The United States was ripe for a populist movement by 2016, and it remains so today. Vast inequalities have opened in the last four decades between the highly educated and the rest and between capital and labor. As a result, median wages have been stagnant for about 40 years, and the real earnings of many groups, especially men with low education levels, have fallen precipitously. Men with less than a college degree, for example, earn significantly less today than their counterparts did in the 1970s. No serious discussion of the political ills that have befallen the United States can ignore these economic trends, which have afflicted the American middle class and contributed to the anger and frustration among some of the voters who turned to Trump.

The root causes of these inequalities have proved surprisingly difficult to pin down. The rise of new, “skill-biased” wonder technologies, such as computers and artificial intelligence, has coincided with a period of singularly low growth in productivity, and analysts have not convincingly explained why these technologies have benefited capital owners rather than workers. Another frequently cited culprit—trade with China—is clearly a contributing factor, but Chinese imports really exploded only once inequality was already rising and American manufacturing was already on the decline. Moreover, European countries with similarly huge trade inflows from China do not show the same extent of inequality as the United States. Nor can deregulation and the demise of unions in the United States account for the disappearance of manufacturing and clerical jobs, for instance, as these losses are common across essentially all advanced economies.

Regardless of its origin, economic inequality has become a source of cultural and political volatility in the United States. Those who have failed to benefit from economic growth have become disillusioned with the political system. In areas where imports from China and automation have led to the loss of American jobs, voters have turned their backs on moderate politicians and have tended to vote for those who are more extreme.

Good policy can begin to redress economic inequality: a higher federal minimum wage, a more redistributive tax system, and a better social safety net would help create a fairer society. Nonetheless, such measures are not enough by themselves. The United States needs to create good—high-paying and stable—jobs for workers without a college degree, and the country is far from a consensus on how this can be done.

Together with economic resentment has come a distrust of all kinds of elites. Much of the American public and many politicians now express a mounting hostility toward policymaking based on expertise. Trust in American institutions, including the judiciary, Congress, the Federal Reserve, and various law enforcement agencies, has collapsed. Neither Trump nor recent party polarization can be held solely to blame for this anti-technocratic shift. The almost complete rejection of scientific facts and competent, objective policymaking among many in the electorate and the Republican Party predates Trump and has parallels in other countries—Brazil, the Philippines, and Turkey to name a few. Without more deeply understanding the root of such suspicion, American policymakers can have little hope of convincing millions of people that better policies, designed by experts, will improve their lives enormously and reverse decades of decline. Nor can policymakers hope to put a lid on the discontent that fueled Trump’s rise.

How and why this unraveling has happened is not self-evident. The first place to look for an answer is in the major, crosscutting economic trends of the present era: globalization and the rise of digital and automation technologies, both of which have induced rapid social changes coupled with unshared gains and economic disruptions. As institutions proved unable or unwilling to protect those suffering from these transformations, they also destroyed public trust in establishment parties, the experts claiming to understand and better the world, and the politicians who appear complicit in the most disruptive changes and in cahoots with those who have stealthily benefited from them.

From this perspective, it isn’t sufficient to decry the collapse of civic behavior or even to defeat toxic populists and authoritarian strongmen. Those who seek to shore up democratic institutions must build new ones that can better regulate globalization and digital technology, altering their direction and rules so that the economic growth they foster benefits more people (and is perhaps faster and of a higher quality overall). Building trust in public institutions and experts requires proving that they work for the people and with the people.

How Did Americans Lose Faith in Everything?
By Yuval Levin

We trust political institutions when they undertake a solemn obligation to the public interest and shape the people who populate them to do the same. We trust a business because it promises quality and reliability and rewards its workers when they deliver those. We trust a profession because it imposes standards and rules on its members intended to make them worthy of confidence. We trust the military because it values courage, honor and duty in carrying out the defense of the nation and forms human beings who do, too.

We lose faith in an institution when we no longer believe that it plays this ethical or formative role of teaching the people within it to be trustworthy. This can happen through simple corruption, when an institution’s attempts to be formative fail to overcome the vices of the people within it, and it instead masks their treachery — as when a bank cheats its customers, or a member of the clergy abuses a child.

That kind of gross abuse of power obviously undermines public trust in institutions. It is common in our time as in every time. But for that very reason, it doesn’t really explain the exceptional collapse of trust in American institutions in recent decades.

What stands out about our era in particular is a distinct kind of institutional dereliction — a failure even to attempt to form trustworthy people, and a tendency to think of institutions not as molds of character and behavior but as platforms for performance and prominence.

In one arena after another, we find people who should be insiders formed by institutions acting like outsiders performing on institutions. Many members of Congress now use their positions not to advance legislation but to express and act out the frustrations of their core constituencies. Rather than work through the institution, they use it as a stage to elevate themselves, raise their profiles and perform for the cameras in the reality show of our unceasing culture war.

President Trump clearly does the same thing. Rather than embodying the presidency and acting from within it, he sees it as the latest, highest stage for his lifelong one-man show. And he frequently uses it as he used some of the stages he commanded before he was elected: to complain about the government, as if he were not its chief executive.

The pattern is rampant in the professional world. Check in on Twitter right now, and you’ll find countless journalists, for instance, leveraging the hard-earned reputations of the institutions they work for to build their personal brands outside of those institutions’ structures of editing and verification — leaving the public unsure of just why professional reporters should be trusted. The same too often happens in the sciences, in law and in other professions meant to offer expertise.

Or consider the academy, which is valued for its emphasis on the pursuit of truth through learning and teaching but which now too often serves as a stage for political morality plays enacted precisely by abjuring both. Look at many prominent establishments of American religion and you’ll find institutions intended to change hearts and save souls frequently used instead as yet more stages for livid political theater — not so much forming those within as giving them an outlet.

All of us have roles to play in some institutions we care about, be they familial or communal, educational or professional, civic, political, cultural or economic. Rebuilding trust in those institutions will require the people within them — that is, each of us — to be more trustworthy. And that must mean in part letting the distinct integrities and purposes of these institutions shape us, rather than just using them as stages from which to be seen and heard.

As a practical matter, this can mean forcing ourselves, in little moments of decision, to ask the great unasked question of our time: “Given my role here, how should I behave?” That’s what people who take an institution they’re involved with seriously would ask. “As a president or a member of Congress, a teacher or a scientist, a lawyer or a doctor, a pastor or a member, a parent or a neighbor, what should I do here?”

The people you most respect these days probably seem to ask that kind of question before they make important judgments. And the people who drive you crazy, who you think are part of the problem, are likely those who clearly fail to ask it when they should.

Americans’ perceptions about unethical behavior shape how they think about people in powerful roles
By Claire Gecewicz and Lee Rainie

The survey found that the more confident people are that group members behave unethically, the less likely they are to have confidence in other aspects of that group’s performance. Conversely, U.S. adults who think group members admit mistakes and take responsibility for them have relatively high levels of confidence in key performance activities of that group.

For example, those who think that members of Congress act unethically “all or most of the time” or “some of the time” are less likely to say the lawmakers care about “the people they represent” than are those who think members of Congress rarely act unethically (47% vs. 66%).

Similar differences show up when the public judges whether members of Congress do a good job promoting laws and policies that serve the public, handle resources responsibly and provide fair and accurate information. Two-thirds (66%) of those who think lawmakers mostly act ethically say they are doing a good job serving the public, but only 43% of those who see members of Congress as mostly unethical say the same.

Just 17% of those who have relatively negative opinions about the ethical behavior of members of Congress think the lawmakers admit mistakes and take responsibility for them, compared with 31% of those who think members of Congress behave ethically at least some of the time. A similar pattern holds when the performance of local elected officials is considered.

There are also wide gaps in judging the performance of journalists. Those who think journalists behave relatively unethically are less likely to say journalists care about people like them than those who think journalists are relatively ethical (45% vs. 69%). The same divides show up when asked if journalists do a good job reporting important news that serves the public (61% vs. 81%), cover all sides of an issue fairly (47% vs. 71%) and admit mistakes and take responsibility for them (37% vs. 62%).

A comparable dynamic plays out when the issue relates to whether those who hold these positions of power and responsibility admit and take responsibility for mistakes. Those who think group members are relatively transparent about mistakes are more likely to think positively about other aspects of a group’s performance than those who think the group members do not regularly admit mistakes and take responsibility for them.

Those who think leaders of technology companies don’t sufficiently admit and take responsibility for mistakes are less likely to judge other performance aspects of their work positively than those who think the leaders do take responsibility for mistakes “all or most of the time” or “some of the time.”

This holds when the performance issue is whether tech leaders “care about people like you” (28% vs. 74%), handle resources responsibly (51% vs. 90%) or provide fair and accurate information to the public (42% vs. 88%).

The gaps between those who have relatively positive views about religious leaders and those who have relatively negative views are particularly striking. There is a 53 percentage point gap between those who think religious leaders take responsibility for their mistakes and those who do not when the issue is whether religious leaders provide fair and accurate information to the public. And there is a 47-point gap when the issue is whether religious leaders handle resources responsibly, a 46-point gap when the issue is whether religious leaders care about “people like you” and a 35-point gap when the issue is whether religious leaders do a good job providing for the spiritual needs of their communities.

America, We Have a Problem
By Thomas B. Edsall

On Oct. 30, a group of 15 eminent scholars (several of whom I also got a chance to talk to) published an essay — “Political Sectarianism in America” — arguing that the antagonism between left and right has become so intense that words and phrases like “affective polarization” and “tribalism” were no longer sufficient to capture the level of partisan hostility.

“The severity of political conflict has grown increasingly divorced from the magnitude of policy disagreement,” the authors write, requiring the development of “a superordinate construct, political sectarianism — the tendency to adopt a moralized identification with one political group and against another.”

Political sectarianism, they argue,

consists of three core ingredients: othering — the tendency to view opposing partisans as essentially different or alien to oneself; aversion — the tendency to dislike and distrust opposing partisans; and moralization — the tendency to view opposing partisans as iniquitous. It is the confluence of these ingredients that makes sectarianism so corrosive in the political sphere.

There are multiple adverse outcomes that result from political sectarianism, according to the authors. It “incentivizes politicians to adopt antidemocratic tactics when pursuing electoral or political victories” since their supporters will justify such norm violation because “the consequences of having the vile opposition win the election are catastrophic.”

Political sectarianism also legitimates

a willingness to inflict collateral damage in pursuit of political goals and to view copartisans who compromise as apostates. As political sectarianism has surged in recent years, so too has support for violent tactics.

Shanto Iyengar, a political scientist at Stanford and another of the paper’s authors, emailed to say:

I would single out the profound transformations in the American media system over the past 50 years. Basically, we’ve moved from an “information commons” in which Americans of all political stripes and walks of life encountered the same news coverage from well-regarded journalists and news organizations to a more fragmented, high choice environment featuring news providers who no longer subscribe to the norms and standards of fact-based journalism. The increased availability of news with a slant coupled with the strengthened motivation to encounter information that depicts opponents as deplorable has led to a complete breakdown in the consensus over facts.

Iyengar noted that research he and Erik Peterson, a political scientist at Texas A&M University, have conducted shows that:

the partisan divide in factual beliefs is genuine, not merely partisans knowingly giving the incorrect answer to factual questions because they realize that to do so is “toeing the party line.”

In the case of views of Covid, he and Peterson found that even though

beliefs about appropriate health practices can have life or death consequences, misinformation over the pandemic is rampant among Republicans and does not dissipate when we offer financial incentives to answer correctly.

Cynthia Shih-Chia Wang, a professor of management and organization at Northwestern’s Kellogg School of Management and also a co-author of the paper, shares Iyengar’s concern over the role of ideologically driven sources of information.

“Media is a big contributor to political sectarianism,” Wang wrote by email, adding that research she and her colleagues have conducted shows that “consuming ideologically homogeneous media produced greater belief in conspiracy theories endorsed by that media.”

In Wang’s view, Trump’s refusal to acknowledge his election loss is dangerous because of “the number of political elite — the 18 attorneys general and 128 members of the House — who are sowing seeds of doubt around the ethicality of the elections,” with the result that

the system is being severely challenged by a president that refuses to concede, by an us-versus-them mentality that contributes to continued congressional gridlock as a pandemic rages, and especially by the doubt cast on the credibility of the American system.

Political sectarianism in America
By Eli J. Finkel, Christopher A. Bail, Mina Cikara, Peter H. Ditto, Shanto Iyengar, Samara Klar, Lilliana Mason, Mary C. McGrath, Brendan Nyhan, David G. Rand, Linda J. Skitka, Joshua A. Tucker, Jay J. Van Bavel, Cynthia S. Wang and James N. Druckman (PDF, 268KB)

America’s response to the coronavirus disease 2019 (COVID-19) pandemic highlights the perils of political sectarianism. An October 2019 report from Johns Hopkins University suggested that America was better prepared for a pandemic than any other nation …, but that report failed to account for the sort of political sectarianism that would, months later, make mask-wearing a partisan symbol, one favored more by Democrats than by Republicans. Democrats were also more likely to prioritize stay-at-home orders despite their massive, immediate economic cost—a pattern that was especially prominent among highly sectarian partisans …. This schism, fomented in part by President Trump, pushed toward a disequilibrium in which too few people engaged sufficiently in commerce to stimulate economic growth while too few social-distanced sufficiently to contain the pandemic. The result has been lethal and expensive for Americans across the political spectrum.

As political sectarianism grows more extreme, pushing strong partisans deeper into congenial media enclaves that reinforce their narratives of moral righteousness, it may also become self-reinforcing, rendering mitigation efforts more difficult. Scholars have long argued that a shared threat can bring people together; indeed, some suggest that rising sectarianism in America is due in part to the loss of the Soviet Union as a unifying arch-nemesis. But such threats may do the opposite when sectarianism is extreme. COVID-19 offered a test case …. By the summer of 2020, 77% of Americans believed that the nation had grown more divided since the pandemic arrived that winter, a response 2.8 standard deviations higher than the mean of the 13 other nations in the study and 1.6 standard deviations higher than the second-highest nation (Spain). Such findings underscore the urgent need to counteract sectarianism before it grows more poisonous.

Even If It’s ‘Bonkers,’ Poll Finds Many Believe QAnon And Other Conspiracy Theories
By Joel Rose

A significant number of Americans believe misinformation about the origins of the coronavirus and the recent presidential election, as well as conspiracy theories like QAnon, according to a new NPR/Ipsos poll.

Forty percent of respondents said they believe the coronavirus was made in a lab in China even though there is no evidence for this. Scientists say the virus was transmitted to humans from another species.

And one-third of Americans believe that voter fraud helped Joe Biden win the 2020 election, despite the fact that courts, election officials and the Justice Department have found no evidence of widespread fraud that could have changed the outcome.

The poll results add to mounting evidence that misinformation is gaining a foothold in American society and that conspiracy theories are going mainstream, especially during the coronavirus pandemic. This has raised concerns about how to get people to believe in a “baseline reality,” said Chris Jackson, a pollster with Ipsos.

“Increasingly people are willing to say and believe stuff that fits in with their view of how the world should be, even if it doesn’t have any basis in reality or fact,” Jackson said.

Blunt 2020 lessons for media, America
By Jim VandeHei

The media filter bubble is getting worse, not better. Look at what’s unfolding in real-time: Trump supporters feel like Fox isn’t pro-Trump enough, while reporters and columnists bolted The New York Times, Vox Media and others because they were not “woke” enough.

  1. This is an urgent sign that we are collectively losing the battle for truth and open debate.
  2. This could still get much worse if Trump supporters choose not just networks but social platforms like Parler and Rumble for consuming and sharing their reality and liberals simply do the same in more traditional places.

Twitter is a mass-reality-distortion field for liberals and reporters. The group-think and liberal high-fiving were as bad as ever and continue to be a massive trap and distraction for journalists.

Facebook is a mass-reality-distortion field for conservatives. Look at the content pages that get the most daily interaction (shares, likes, etc.) and it’s all right-wing catnip. It’s not all fake or conspiratorial, but a lot of it sure is. This is a huge problem.

YouTube is a mass-reality-distortion field for people of all stripes. Videos endorsing election fraud were viewed more than 138 million times on the week of Nov. 3, according to a new report cited by The New York Times.

The bottom line: We are losing the war for truth. There is no bigger crisis for media, politics and society than the growing number of people who do not believe facts and verifiable figures. If we do not collectively solve this, we are all screwed.

  1. Two-thirds of Republicans doubt the election was free and fair, despite the fact that election officials in every state see no evidence of widespread fraud.

The YouTube Ban Is Un-American, Wrong, and Will Backfire
By Matt Taibbi

Unrestrained speculation about the illegitimacy of the 2016 election had a major impact on the public. Surveys showed 50 percent of Clinton voters by December of 2016 believed the Russians actually hacked vote tallies in states, something no official agency ever alleged even at the peak of the Russiagate madness. Two years later, one in three Americans believed a foreign power would change vote tallies in the 2018 midterm elections.

These beliefs were turbo-charged by countless “reputable” news reports and statements by politicians that were either factually incorrect or misleading, from the notion that there was “more than circumstantial” evidence of collusion to false alarms about Russians hacking everything from Vermont’s energy grid to C-SPAN.

What makes the current situation particularly grotesque is that the DNI warning issued this summer stated plainly that a major goal of foreign disruptors was to “undermine the public’s confidence in the Democratic process” by “calling into question the validity of the election results.”

Our own domestic intelligence agencies have been doing exactly that for years now. On nearly a daily basis in the leadup to this past Election Day, they were issuing warnings in the corporate press that you might have reason to mistrust the coming results …

Amazing how those stories vanished after Election Day! If you opened any of those pre-vote reports, you’d find law enforcement and intelligence officials warning that everything from state and local governments to “aviation networks” was under attack.

In fact, go back across the last four years and you’ll find a consistent feature of warnings about foreign or domestic “disinformation”: the stern scare quote from a bona fide All-Star ex-spook or State official, from Clint Watts to Victoria Nuland to Frank Figliuzzi to John Brennan to McMullin’s former boss and buddy, ex-CIA chief Michael Hayden. A great many of these figures are now paid contributors to major corporate news organizations.

What do we think the storylines would be right now if Trump had won? What would those aforementioned figures be saying on channels like MSNBC and CNN, and what would they be speculating about? Does anyone for a moment imagine that YouTube, Twitter, or Facebook would block efforts from those people to raise doubts about that hypothetical election result?

We know the answer to that question, because all of those actors spent the last four years questioning the legitimacy of Trump’s election without any repercussions. The Atlantic, quoting the likes of Hayden, ran a piece weeks after Trump’s election arguing that it was the duty of members of the Electoral College to defy voters and elect Hillary Clinton on national security grounds. Mass protests were held to disrupt the Electoral College vote in late December 2016, and YouTube cheerfully broadcast videos from those events. When Electoral vote tallies were finally read out in Congress, ironically by Joe Biden, House members from at least six states balked, with people like Barbara Lee objecting on the grounds of “overwhelming evidence of Russian interference in our election.”

In sum, it’s okay to stoke public paranoia, encourage voters to protest legal election results, spread conspiracy theories about stolen elections, refuse to endorse legal election tallies, and even to file lawsuits challenging the validity of presidential results, so long as all of this activity is sanctified by officials in the right party, or by intelligence vets, or by friendlies at CNN, NBC, the New York Times, etc.

The Loss of Truth In the Media Is a Threat to Our Democracy
By Ray Dalio

A number of media writers have in private told me that their editors have specifically hired them to write negative, sensationalistic stories because they sell best. They explained that the financial decline of print media and the public’s short attention span have required them to produce such attention-grabbing headlines and stories or face financial decline.

I also see this problem becoming a greater issue — perhaps the greatest issue — of the presidential election. One very senior political strategist explained that in the upcoming presidential election only about 500,000 people — i.e., the swing voters in the swing states — will determine the election and the way these voters will be won is with critical sensationalistic headlines. As a result, this strategist works with those in the media to bring that about. To me that is a clear threat to democracy.

2019: A Year the News Media Would Rather Forget
By Matt Taibbi

In the Bush years, the sudden explosion of ex-military figures as paid or regular contributors on cable news was a conspicuous enough phenomenon that it provoked widespread criticism from media watchdogs. Anti-war voices were scarce in the run-up to the Iraq war, while stars and bars were everywhere.

A decade-plus later, the craze is ex-intelligence officials. The last few years have seen an explosion of hires of prominent intelligence figures as TV talking heads, including many who have been prominent in ongoing news controversies, like former FBI deputy director Andrew McCabe (hired this year), former CIA chief John Brennan, and former Director of National Intelligence James Clapper. Networks, incidentally, have stopped bothering to tell audiences that their on-air contributors may have roles in stories they are commenting upon. Asha Rangappa, Frank Figliuzzi, and James Gagliano of the FBI and more than a dozen others round out an amazing collection of ex-spooks now on the air.

Why Is Christopher Steele Still a Thing?
By Matt Taibbi

Steele first appeared in connection with the Trump story as a “well-placed Western intelligence source” in a 2016 Yahoo News article by Michael Isikoff. The piece claimed a Trump aide named Carter Page was discussing the lifting of sanctions with Igor Sechin, chief of the major Russian oil company Rosneft.

Steele, in fact, was a private opposition researcher hired by the “premium research” firm Fusion-GPS, on behalf of the Hillary Clinton campaign. The Yahoo story came out on September 23rd, 2016; it would be more than a year before Steele’s status as a paid Clinton researcher would be made public.

After Isikoff’s piece came out, the Clinton campaign released a statement about how it was “chilling” to learn that “U.S. intelligence officials” were “conducting a probe into suspected meetings between Trump’s foreign policy adviser Carter Page and members of Putin’s inner circle.”

If the merry-go-round trick of commenting gravely about a story you yourself planted sounds familiar, that’s because it’s the tactic used by Vice President Dick Cheney in the early 2000s, when he went on Meet the Press to comment about “a story in The New York Times this morning” regarding Saddam Hussein’s aluminum tubes. Press figures denounced such chicanery then.

Steele’s report came out in full during the transition, in a sleazy series of maneuvers by outgoing intelligence officials, who presented the incoming president with a synopsis of Steele’s work.

When details of this meeting leaked, news outlets that previously had been sitting on Steele’s report because it was unverifiable suddenly had a “hook” to release news about the briefing: Intelligence chiefs relayed “allegations that Russian operatives claim to have compromising personal and financial information about Mr. Trump.”

The resulting viral furor spurred Buzzfeed to publish the entire dossier, so Americans could “make up their own minds.”

In this way, the dossier was published without ever going through a vetting process. For all the talk of hacking, this was a true Trojan-horse penetration of the American news media system (not that most media companies minded, of course).

Enthusiasts now cling to the idea that the “dossier” was merely a “starting point,” and remains “neither proved nor disproved” (the New York Times translation for “unmentionable until published by someone less reputable”), but the whole shooting match should have ended once the world got a chance to read Steele’s reports. Any sane person’s Malcolm Gladwell-Blink reaction to these memos would be that they were lunatic conspiratorial horseshit on the level of Avril Lavigne dying and being replaced by a clone named “Melissa.”

Dear CNN: What parts of the Steele dossier were corroborated?
By Erik Wemple

There’s a contradiction deep in CNN’s record on the dossier. On the one hand, its journalists talked about its alleged corroboration for years. On the other hand, the network was careful not to parrot particular claims from the dossier, as its own statement notes. That restraint stemmed from the start of the dossier drama, after CNN itself — behind reporting from Perez, Sciutto, Jake Tapper and contributor Carl Bernstein — scooped in January 2017 that then-President-elect Trump had been briefed on the dossier. “At this point, CNN is not reporting on details of the memos, as it has not independently corroborated the specific allegations,” noted the story.

The FBI, as it turned out, didn’t independently corroborate key, specific allegations, either — as the Horowitz report found. Based on a massive document review as well as interviews with more than 100 witnesses, the Justice Department inspector general’s team discovered that the FBI had built a spreadsheet of Steele claims — as John Solomon reported in July — with very few, if any, checkmarks. The 400-plus page Horowitz report brims with credibility-diminishing information on the dossier and the methods used to compile it.

So how did CNN handle the news of the dossier’s non-corroboration? “There was no spying and many parts of that dossier were later corroborated,” said CNN anchor Christine Romans on Dec. 11, two days after the Horowitz report hit the streets.

The problem with such chatter lies in its suggestiveness. The dossier is best known to the public as a set of allegations of conspiracy with Russia by Trump campaign aides. By hyping small-bore “corroboration” — about “meetings” or “communications” or whatever — CNN programming bathed the dossier’s large-bore claims in credibility that they turn out not to deserve.

The Steele dossier just sustained another body blow. What do CNN and MSNBC have to say?
By Erik Wemple

Remember the dossier’s famous allegations that the Russians had kompromat against Trump because of alleged illicit activities in a Russian hotel? A declassified footnote elaborates on the provenance of that story: According to an intelligence community report, a source who spanned Trump’s circles and Russia said that it was false and resulted from Russian intelligence “infiltrat[ing] a source into the network.”

Another footnote, citing a U.S. intelligence report, reveals that two people affiliated with Russian intelligence were aware of Steele’s information-gathering efforts in 2016. An FBI official said, however, that he “had no information as of June 2017 that Steele’s election reporting source network had been penetrated or compromised.” A New York Times story notes the strange adjacencies involved in Steele’s pursuits — getting too close to Russian intelligence created a risk of misinformation, but Steele also sought to know what Russian intelligence was “doing with regard to the Trump campaign.”

Steele told Fusion GPS, the research firm that commissioned his work, that at least 70 percent of the claims in the dossier are accurate. An obvious question arises from that assertion: Which ones? The Erik Wemple Blog asked Fusion GPS’s Glenn Simpson to comment on the possibility of Russian disinformation. He replied with an excerpt from his 2017 Senate Judiciary Committee testimony, in which he outlined how Steele viewed these perils: “I’ve worked on this issue all my life and when you’re trained in Russian intelligence matters, the fundamental problem of your profession is disinformation. It’s the number one issue,” said Simpson, paraphrasing Steele.

Multiple Trump scandals have come along since the dossier was front-page news. The declassified footnotes haven’t preoccupied the coronavirus-obsessed mainstream media, although there have been reports by the likes of CBS News, the Associated Press, the New York Times and CNN. Conservative media organs including Fox News, the Daily Caller and Washington Examiner have covered the developments.

We’ll pause to consider the CNN account, which carries the headline, “GOP seizes on newly declassified material to raise further questions about Steele dossier.” The article’s first sentence reads, “Senate Republicans are touting newly declassified information that suggests Russian disinformation, in two instances, may have been passed on to ex-British intelligence agent Christopher Steele when he compiled an opposition research dossier on Donald Trump and Russia in 2016.”

Factual? Yes. Slanted? Yes, that too. Republicans are “touting” the footnotes in part because media outlets such as CNN, MSNBC and others “touted” the dossier with flimsy corroboration in the early months of the Trump presidency. (The chatter is the focus of previous installments in this series.) One CNN anchor, for instance, went so far as to assert in December 2017 that the U.S. intelligence community had “corroborated all the details” of the dossier.

David Corn and the Steele dossier: Just checking the facts!
By Erik Wemple

Research firm Fusion GPS — co-founded by Glenn Simpson and Peter Fritsch — commissioned the project with funds from the Hillary Clinton campaign and the Democratic National Committee. After submitting his first report in June 2016, Steele told Simpson that he wanted to take it to the FBI. “Steele thought [the intelligence community] needed urgently to know — if it didn’t already — that the next possible U.S. president was potentially under the sway of Russia,” write Simpson and Fritsch in their book, “Crime in Progress: Inside the Steele Dossier and the Fusion GPS Investigation of Donald Trump.”

The timing of Corn’s story was no coincidence. In “Crime in Progress,” the authors express contempt for the decision by then-FBI Director James B. Comey to publicly announce another look at the Clinton email investigation while remaining silent about what the bureau had been doing on the Trump campaign and Russia. “Comey’s bombshell prompted the Fusion partners to decide they needed to do what they could to expose the FBI’s probe of Trump and Russia. It was Hail Mary time,” write Simpson and Fritsch.

That so-called Hail Mary meant working with Corn, whom the Fusion GPS co-founders describe as a “reporter who trusted Fusion and who just might have the aggressiveness to write a story this explosive in the final days of a presidential campaign.” (“Crime in Progress” suggests that Fusion GPS approached Corn about the story, whereas “Russian Roulette: The Inside Story of Putin’s War on America and the Election of Donald Trump,” written by Corn and Michael Isikoff, indicates that Corn “checked in” with Fusion GPS around this time.)

Corn came through with the story, and did not stop there: After publishing his piece, he delivered a copy of the dossier to a friend of his — FBI General Counsel James Baker. It was a nice pickup for the FBI. The report of Justice Department Inspector General Michael Horowitz confirms that Corn’s packet contained information from Steele that the bureau hadn’t yet received. Though Corn told the Hill’s John Solomon that he delivered the dossier after the presidential election, the Horowitz report indicates that the FBI’s Crossfire Hurricane team had secured it by Nov. 6, 2016, two days before the election. Corn told the Daily Caller’s Chuck Ross that he’d misspoken to the Hill about the delivery date.

Timing matters here, considering that Fusion GPS — by its own admission — was looking to pressure the FBI on its Trump-Russia work. Steele himself had a sustained concern about how seriously the FBI was taking his memos, a dynamic that arose as early as summer 2016: “To Steele, this was an emergency that needed to be dealt with swiftly,” write Simpson and Fritsch. And in late November, Steele requested a hand-off of the dossier to David Kramer, an associate of Sen. John McCain (R-Ariz.). The idea, Simpson testified before the Senate Judiciary Committee, was “to give [it] to Senator McCain so that Senator McCain can ask questions about it at the FBI, with the leadership of the FBI. That was essentially — all we sort of wanted was for the government to do its job and we were concerned about whether the information that we provided previously had ever, you know, risen to the leadership level of the FBI,” said Simpson during his August 2017 interview with the committee.

Sharing the dossier with Baker, says Corn, was part of his newsgathering process: “I provided a set of Steele memos to Baker, whom I knew socially, to see if I could get anyone in the FBI to confirm or debunk the allegations in the memo,” Corn tells the Erik Wemple Blog via email. In 2018 comments to the Hill, Corn said he was “merely doing what a journalist does: trying to get more information on a story I was pursuing.” He says he never heard back.

How can we possibly appraise just how authentic Corn’s ambition was to get a thumbs-up or thumbs-down verdict on the dossier’s integrity? Easily. Though the FBI may not have responded directly to Corn, it did, in fact, undertake its very own confirm-or-debunk operation. As Horowitz noted, the bureau created a spreadsheet to plot out the dossier’s claims and the evidence underlying them. It interviewed Steele and tracked down his “Primary Sub-source.” It compared claims in the dossier to publicly available information.

The FBI’s conclusion is abridged with lawyerly lethality in the Horowitz report: “The FBI concluded, among other things, that although consistent with known efforts by Russia to interfere in the 2016 U.S. elections, much of the material in the Steele election reports, including allegations about Donald Trump and members of the Trump campaign relied upon in the Carter Page FISA applications, could not be corroborated; that certain allegations were inaccurate or inconsistent with information gathered by the Crossfire Hurricane team; and that the limited information that was corroborated related to time, location, and title information, much of which was publicly available.”

Newly Declassified Documents Suggest FBI Was Wary by Early 2017 of Steele Dossier
By Alan Cullison

People who circulated the dossier among reporters in Washington said they intended it as a kind of tipsheet whose raw and unverified intelligence could be a starting point for further research.

But the president’s Republican allies have pilloried it and demanded to know why it was allowed to enter the chain of legitimate intelligence, accusing the FBI of relying on its assertions to support its Russia investigation. The FBI did use Mr. Steele’s reporting when it sought court approval to monitor former Trump campaign foreign-policy adviser Carter Page, and the agency has since moved to limit how similar information can be used in the future. Mr. Page, who has denied wrongdoing, was never charged with any crime.

The documents provide a peek into the sources and methods of Mr. Steele, whose report created a political firestorm after it was published by BuzzFeed in January 2017, less than two weeks before Mr. Trump’s inauguration. The dossier was funded by Mr. Trump’s political rivals, including the Democratic National Committee and 2016 Democratic presidential candidate Hillary Clinton’s campaign.

Mr. Steele has roundly defended his work in lawsuits filed against him in the U.S. and U.K., saying it was never meant for publication. He said that he never traveled to Russia personally for the information, and instead relied upon his contractor, whom he paid, to interview subsources in Russia.

Notes of Mr. Strzok, as well as the interview with Mr. Steele’s source, suggest that the FBI knew the limitations in Mr. Steele’s reporting, which alleged that Russia had been cultivating Mr. Trump for years with financial enticements and leverage over the president in the form of sex tapes. The dossier also alleged that Mr. Trump’s personal lawyer, Michael Cohen, held a furtive meeting in Prague during the presidential campaign with Russian officials to discuss hacking operations against Democrats. The special counsel’s report said that investigators found no evidence that the meeting in Prague ever occurred.

When the February 2017 New York Times article reported that Mr. Steele had a credible record and that he had briefed the FBI on some of his findings, Mr. Strzok wrote in the margins of a printed copy of the article that “recent interviews and investigations, however, indicate that Steele may not be in a position to judge the reliability of his subsource network.”

In the interview with the FBI agents, Mr. Steele’s primary contractor described a network of subsources inside Russia, some of whom he knew socially, and whose names were redacted from the document released by the Senate committee. While he attributed salient episodes of the dossier to one or another of his sources, he said he did not recall or didn’t know where some of the episodes came from. The contractor said he couldn’t vouch for some of the information told to him by his sources in Russia, indicating that one might have been “brainstorming” when she told him about one episode that appeared in the dossier.

Twitter, MSNBC’s Scarborough scramble to respond to insane Trump conspiracy theory
By Erik Wemple

A little context on the Trump/”Morning Joe” timeline is in order here. As noted in this blog and many other precincts, “Morning Joe” was overtly charitable toward candidate Donald Trump as he established himself in the early stages of the 2016 presidential campaign. Over several months after Trump declared his candidacy in June 2015, he yukked it up with Scarborough and co-host Mika Brzezinski. After his big victory in the New Hampshire primary, Trump said this: “It was great seeing you, and you guys have been supporters and I really appreciate it.” Scarborough was forced to clarify what Trump meant.

Like many others, however, the “Morning Joe” duo eventually came to grips with Trump’s core awfulness. In return, Trump has chosen to smear Scarborough with the ultimate crime. And what an awful thing it was to resurface the Klausutis situation, creating a whole bunch of news that, under a sane and considerate president, wouldn’t have been news. The Post’s Craig Pittman reported on how the swirl of presidential social media activity had affected the family that Klausutis left behind:

No one in Klausutis’s family would talk about Trump’s tweets for this article, fearing retaliation by online trolls of the type who went after parents of the Sandy Hook massacre victims. Their grief has been disrupted by conspiracy theories before — not only over the past few years from the White House, but from some liberals who at the time of her death sought to portray a then-conservative Republican congressman as a potential villain.
“There’s a lot we would love to say, but we can’t,” said Colin Kelly, who was Klausutis’s brother-in-law.

“What the Klausutises — the entire family — have had to endure for 19 years, it’s unspeakably cruel,” said Scarborough on Tuesday morning’s show. “Whether it’s the president, or whether it’s people following the president, it is unspeakably cruel. These are not public figures nor have they ever been public figures.” That’s an important distinction, though what Trump is alleging about the public figure in this case — Scarborough, that is — is also quite cruel.

Fox News settles suit with parents of Seth Rich for promoting heinous conspiracy theory
By Erik Wemple

Fox News announced Tuesday that it had reached a settlement over one of the most heinous stories of the Trump era. The move ends litigation by Joel and Mary Rich, parents of Seth Rich, the 27-year-old Democratic National Committee staffer slain on a D.C. street in July 2016. Conspiracy theorists posited that Rich’s killing was linked to the spilling of DNC emails to WikiLeaks during the 2016 presidential election campaign.

Though the idea was outlandish on its face, Fox News embraced it in May 2017, publishing a story under the byline of Malia Zimmerman with the headline: “Seth Rich, slain DNC staffer, had contact with WikiLeaks, say multiple sources.” Prime-time opinion host Sean Hannity promoted the story to his audience of millions. “If this is true and Seth Rich gave WikiLeaks the DNC emails which ultimately led to the firing — remember Debbie Wasserman Schultz on the eve of the DNC convention? — this blows the whole Russia collusion narrative completely out of the water,” Hannity said. Another reckless blast came soon after: “Still so many questions, this is getting more mysterious by the day, surrounding the murder of DNC staffer Seth Rich. One thing I will tell you, my opinion, strongly, guess what: It wasn’t a robbery. Was he talking to WikiLeaks? We’ll investigate. And if he did, does this blow the Russian collusion narrative out of the water?” Hannity finally dropped the theorizing “out of respect for the family’s wishes.”

Though Fox News retracted the story, it had an impact on Rich’s parents, as recounted in a March 2018 complaint in a New York federal court. “They published, republished, and publicized the sham story — which they knew would be covered again and again, and republished, here and around the world — painting Joel and Mary’s son as a criminal and a traitor to the United States,” read the lawsuit. The complaint sought damages for intentional infliction of emotional distress, among other claims.

A statement from Fox News addressed the development: “We are pleased with the resolution of the claims and hope this enables Mr. and Mrs. Rich to find a small degree of peace and solace moving forward.” The Erik Wemple Blog has asked a lawyer for the Rich family for a response and will update this post if we receive one.

Through the retraction and the Rich family lawsuit, Zimmerman, the sole byline on the conspiracy story, remained employed at Fox News, though her contributions dried up in September 2017. She pursued the Seth Rich “story” with the assistance of a financial adviser and private investigator — the ins and outs of that fiasco furnish a narrative all its own. Asked Tuesday about Zimmerman’s status, Fox News responded that she is no longer with the network.

Why TV networks may be afraid of investigative stories
By Stephen Battaglio

Ronan Farrow’s bestselling book “Catch and Kill” detailed his frustration with former bosses at NBC News over his failed attempt to break the story on the sexual assault and harassment allegations against movie mogul Harvey Weinstein. A month later, leaked video showed ABC’s “20/20” co-anchor Amy Robach grousing over how the network would not run a 2015 interview with a victim of billionaire pedophile Jeffrey Epstein that implicated Prince Andrew and former President Bill Clinton.

In both cases the networks said the stories never reached the editorial standard they believed was necessary to put them on the air. Robach even publicly backed up ABC’s assertion, saying her private remarks on an open mike were made in “a moment of frustration.”

But the dissatisfaction Farrow and Robach expressed reflects a deepening concern by some veteran journalists and producers that network TV news divisions are avoiding controversial enterprise stories that could pose financial risks from litigation and create aggravation for their corporate owners. Declining ratings, public distrust of the media and the surfeit of news from the Trump White House have added to those pressures.

As networks have become part of sprawling, publicly held media conglomerates — ABC parent the Walt Disney Co. and NBC parent Comcast have grown significantly in size in recent years — risk management is now a major element of running a news division.

“There is no question lawyers are more careful now,” said Rick Kaplan, a veteran TV producer who has worked at ABC, CBS, NBC and CNN. “Why are they careful? The finance people are telling them, ‘If you lose, and we owe millions of dollars on a legal suit, you’re toast.’”

Rich McHugh, who was the NBC News producer on Farrow’s reporting on Weinstein, said since “Catch and Kill” came out, he has become a sounding board for TV journalists who have faced resistance in getting their investigative and enterprise pieces on the air.

“If you speak to any reporter who has chased down a story, whether it be for a month or two to seven months, everybody has a version of their story getting killed,” McHugh told The Times. “I’ve heard from 50 reporters and producers who’ve said, ‘Yeah, I’ve had my story killed, it was infuriating, they said we didn’t have it.’”

The price for getting an investigative story wrong can be high. A phony document that CBS News used in a 2004 report on former President George W. Bush’s military service effectively ended the network TV careers of its longtime anchor Dan Rather and two of the network’s producers.

The erosion of public trust in the media also has created more caution. ABC News parent Walt Disney Co. paid more than $177 million in 2017 to settle a defamation lawsuit filed by Beef Products Inc. over the network’s 2012 story on processed beef trimmings, known as “pink slime,” which are used as low-cost filler. The network never retracted or apologized for the story and had gone to trial to defend it.

One of the considerations in settling the suit was whether ABC News could get a favorable verdict in a conservative red state such as South Dakota, where BPI is based, according to a person familiar with the matter who was not authorized to discuss it publicly. (ABC News declined comment on the matter.) Under South Dakota law, damages could have gone as high as $5.7 billion.

“It’s always easier to report in depth on politicians or public officials, because legally there’s less recourse for them,” said Lowell Bergman, the veteran investigative producer who was played by Al Pacino in the movie “The Insider.” “It’s always been much more difficult to report on those who control private power, the corporate elites.”

McHugh said during his time at NBC News, he heard reporters not on the White House or national security beats complain about the difficulty of getting their stories on the air.

“The president has become a giant target for the media who sucks up a lot of the oxygen on TV and in print,” McHugh said. “So it’s far easier for editors and producers to say, ‘We are going to devote two segments on Trump and the wrongdoings therein versus this corporate malfeasance elsewhere that comes with tremendous risk attached, even though the viewers might benefit.‘”

With Fear and Favor: The Russophobia of ‘The New York Times’
By David S. Foglesong

With the presidential election approaching, partisan politics have played even more important roles in distorted stories concerning Russia. The Times has appeared to favor Biden not only against Trump but also against his earlier Democratic rival Bernie Sanders. In March, when Sanders still seemed likely to win the Democratic presidential nomination, the Times published an outrageous story that claimed Sanders had been a vodka-downing dupe of Soviet propaganda when he visited the Soviet Union to develop a sister-city relationship between Burlington, Vt., and Yaroslavl. The breathless Times correspondent who ballyhooed the story neglected to mention that at the exact same time that Sanders was in Russia, President Ronald Reagan, who strongly supported citizen exchanges, was in Moscow, strolling on Red Square with Mikhail Gorbachev and declaring that because of Gorbachev’s dramatic reforms, he no longer considered the Soviet Union an “evil empire.”

By printing such smears, the Times and other papers not only exacerbate the political polarization and cynicism about the liberal media in the United States. They also shred the reputation of the American press among Russian liberals and journalists who earlier admired its independence and integrity. As Nadezhda Azhgikhina, a veteran Russian journalist and executive director of PEN Moscow, has explained, “Russiagate killed the beautiful dream of the perfection of the US system of government, respect for the law, and the excellence of the US press.”

Although readers of the Times rarely learn about how Dean Baquet and current publisher Arthur G. Sulzberger privately guide the paper’s coverage of Russia, the New York Times Company Records at the New York Public Library allow us to see how an earlier publisher shaped the anti-Russian line of the Times. The man who made it America’s premier newspaper in the middle of the 20th century, publisher Arthur Hays Sulzberger, explained to Editorial Page Editor Charles Merz in April 1958 that “propaganda…is not necessarily untrue.… It is a method of emphasis calling attention to that which it is desired to have known.” Sulzberger encouraged Merz to seize opportunities “to excoriate Russia” and foment “distrust of the Soviet Union” even when it had not done anything recently that warranted indictment.

Merz, who sometimes resisted Sulzberger’s desires to pound away at Russia, had a unique vantage. In 1920, he and Walter Lippmann had published a detailed study of the wildly inaccurate and ideologically distorted coverage of Russia by the Times during the Russian Civil War. The Times, Merz and Lippmann found, repeatedly published as news what the vehemently anti-communist publisher and editors wanted to see: the flight of Lenin and Trotsky; the imprisonment of Lenin; the overthrow of the Soviet regime. “The Russian policy of the editors of the Times,” Lippmann and Merz concluded, “profoundly and crassly influenced their news columns.”

The Times then veered in the opposite direction, posting the eccentric British correspondent Walter Duranty to Moscow, where he would become notorious as “Stalin’s apologist.” Yet Duranty’s tenure as the chief Times reporter in the Soviet Union ended soon after Arthur Hays Sulzberger became the publisher in 1935.

After World War II, it was under Sulzberger that the Times first cultivated a close relationship with the Central Intelligence Agency. In exchange for special briefings of its correspondents by Allen Dulles and others at the CIA, the Times helped to hide and justify covert interventions in Iran, Guatemala, Indonesia, Cuba, and elsewhere that had disastrous consequences. As the young historian David P. Hadley has shown, the CIA did not control the Times; instead, there was a “friendly confluence” of interests (or, one might say, collusion).

Liars Go to Hell
By Francis X. Maier

The Ukrainian Famine, or Holodomor, was a Soviet-enforced mass starvation of Ukraine’s peasants, a deliberate campaign to secure their grain for foreign currency and break resistance to agrarian collectivization. The Third Reich murdered roughly six million Jews in the Shoah as an act of pure race hatred. In contrast, the Holodomor killed a “mere” three million. But as Taylor notes, the Holodomor was simply one part of a much larger Soviet starvation policy applied simultaneously to the peasantry across the northern Caucasus and Kazakhstan. It left nearly ten million dead, dwarfing any previous or subsequent genocide. In Ukraine alone, the breadbasket of the Soviet Union, deaths by starvation exceeded 25,000 per day.

Taylor’s 1990 article was timed to the release of Stalin’s Apologist, her withering biography of journalist Walter Duranty. A Pulitzer Prize winner, celebrated political analyst, and Moscow correspondent for the New York Times during the 1930s, Duranty interviewed Stalin twice. He also played a significant role in securing American diplomatic recognition for the Soviet regime. Less publicly, he was a prodigious womanizer, longtime opium buddy of Satanist Aleister Crowley, compulsive exploiter of friends, a spendthrift, occasional drunk, and an inventive, always-reliable flack for the Soviet regime.

One of Duranty’s lifelong memories involved his religious grandmother who, after catching the adolescent Duranty in a lie, had warned him that “liars go to hell.” He never forgot or forgave the correction. As an adult, he simply erased all family ties and falsely claimed in his autobiography that he’d been orphaned at age ten. Massaging the truth became one of his core skills. Brilliant, engaging, and widely respected at the time, he was, in the words of Malcolm Muggeridge, who also reported from Moscow and saw Duranty in action, “the greatest liar of any journalist I have met in 50 years of journalism.”

Committed to protecting his own influence and to a future “greater good” promised by the Soviet regime, Duranty at first dismissed rumors of the Ukrainian Famine. Then he downplayed them. Then he claimed that Ukraine’s “food shortages” were the result of local mismanagement and the work of “wreckers” and “spoilers” intent on undermining Soviet progress. He repeatedly denied the mass starvation in his reporting. But he did suggest that “you can’t make an omelet without breaking eggs” . . . especially when the omelet is the task of modernization, and the cooks are tough-minded Bolsheviks intent on a better tomorrow.

As Taylor notes in her book, Western powers struggling with the Great Depression and the rise of Hitler in Germany had little interest in rumors from Ukraine that might antagonize Stalin as a potential ally. Muggeridge had arrived in Russia in 1932 to string for the Manchester Guardian. A convinced socialist at the time, he intended to stay in Russia and renounce his British passport for Soviet citizenship. Reality interfered. By March 1933, he was reporting on Ukraine’s famine as “one of the most monstrous crimes in history,” and his disillusionment with the Soviet paradise was complete. But back in England, thanks in part to Duranty’s counter-reporting and Soviet propaganda, Muggeridge’s work was dismissed as “a hysterical tirade.” Muggeridge himself was slandered, vilified, and unable to find employment. And that might have buried the Holodomor story successfully, except for one man.

Welshman Gareth Jones was a young Russian Studies graduate of Cambridge and a former secretary to British Prime Minister Lloyd George. Stringing for the same Manchester Guardian as Muggeridge, he eluded Soviet press controls and spent three weeks on his own, walking through the hellish conditions of a starvation-ravaged Ukraine. Then he wrote about it in the spring of 1933, confirming and compounding the impact of Muggeridge’s recent work. Walter Duranty led the ferocious, Soviet-prodded attack on Jones’s credibility. He also bullied most other Moscow-based Western journalists—to their enduring disgrace—into doing the same, lest they lose their visas. Jones, however, had a spine.

Wednesday’s Other Story
By Matt Taibbi

Just before the madness at the Capitol broke out Wednesday, news came from London. Wikileaks founder Julian Assange, who seemed Monday to be the luckiest man alive when a judge denied an American request to extradite him, was now denied bail on the grounds that he might “fail to surrender to court to face” the inevitable U.S. appeal. He goes back to legal purgatory, possibly a worse outcome than extradition, which might be the idea.

Assange became a celebrity at a time when popular interest in these questions was at its zenith in the United States. Eight years of the Bush administration inspired profound concern about the runaway power of the state, especially a new secret state-within-a-state the Bush administration insisted 9/11 gave them the moral mandate to build.

Our invasion of Iraq had been a spectacular failure that, unlike pictures of returning coffins, couldn’t be completely covered up, and Americans learned about grotesque forms of war profiteering. These included the use of mercenaries, to whom the taxpayer unknowingly paid lavish sums, to commit horrific war crimes like the Nisour Square massacre, also known as “Baghdad’s Bloody Sunday.”

One of Donald Trump’s most indefensible (and bizarrely, least commented-upon) acts was the pardon of the four Blackwater guards who shot and killed those seventeen Iraqi civilians, including women and children. The New York Times story covering the Blackwater pardon spent just four paragraphs on the case, sticking it below apparently more outrageous acts like the pardon of George Papadopoulos.

“Baghdad’s Bloody Sunday” took place in 2007, by which time we were bombing and kidnapping all over the world, disappearing people off streets like the Bogey Man of fairy tales. Detainees were taken to secret prisons where, we later learned, efforts by prisoners to starve themselves out of their misery were thwarted by a diet of raisins, nuts, pasta, and hummus rocketed up the back door through “the widest possible tube.”

Even years later, one Gitmo prisoner would waive his right to appear in court because “rectal damage” made it too painful to sit. We made mistakes in who we selected for this treatment, grabbing people with no connection to anything for torture, as films like Taxi to the Dark Side documented. However, Americans seemed to lose interest in these policies once the Iraq misadventure came to a sort-of end, and a new president was elected.

The rise of Wikileaks introduced an uncontrollable variable into our drift toward authoritarianism. The WMD episode had shown again that our press, the supposed first line of defense against abuses, could not be relied upon. For every exposé like Abu Ghraib, there were a hundred stories that either went uncovered or advanced official deceptions.

Wikileaks anticipated a future in which the press would not only be pliant accomplices to power in this way, but where information itself would be tightly controlled by governments using far-reaching and probably extralegal new technological concepts, deploying misleading excuses for clampdowns.

Journalists Love Nothing More Than Extreme Corporate Censorship
By Michael Tracey

We’re in the midst of what is likely the most extreme corporate censorship crusade in modern history — whereby oligarchic tech officials have moved to simultaneously purge the sitting, democratically elected President from all the internet’s most-used public communications platforms — and the reaction among media activists (a more apt term than “journalists”) is 100% predictable: complete jubilation.

They absolutely love corporate censorship, and their ecstasy at its implementation grows in direct proportion to how ruthless and politically vengeful that censorship is. These people have the foresight of a gnat: they are so consumed by an ideology of petty, vindictive victimhood that they have no capacity to see how such democracy-overruling censorship power poses an unprecedented threat to political speech and civil liberties.

This threat — the coming purge which will be effectuated by those in positions of authority who crave nothing more than to punish their allegedly fascist/white supremacist/MAGA enemies and thereby impose a left/liberal corporate authoritarian monoculture — in the long-run vastly outweighs whatever “threat” was posed by the farcical MAGA goofball intrusion at the Capitol on Wednesday. Which, again, was obviously not a “coup attempt” in any recognizable sense of that term. “MAGA Terrorist Insurrection,” the insane characterization put forth with a straight face on nationally-televised CNN broadcasts by Jake Tapper (to take just one of countless examples) is sheer propaganda — intentionally designed to foment fear and hysteria for profit. CNN enjoyed its highest ratings ever on Wednesday, and of course you can’t just let that go to waste.

It’s darkly hilarious to consider what the supposed “crisis” of those yahoos taking a mob joyride through the Capitol actually achieved, and whether it warrants such a drastic state and corporate crackdown. If they really thought they were going to help their God Emperor Trump triumphantly retain control of the Presidency, what happened was that they hastened his demise. Within about 24 hours, Trump was made to denounce his own pro-Trump mob, and then sheepishly conceded in humiliating fashion. Likewise, the mob’s reward will be not eternal glory, but prison.

Political theorists have been worrying about mob rule for 2,000 years
By The Economist

LIBERALS HAVE become lazy when thinking about the mob. They have celebrated “people power” when it threatens regimes they disapprove of, in the Middle East, say, while turning a blind eye to the excesses of protesters who they deem to be on the right side of history—in Portland, Oregon, for example. In August 2020 a mainstream publisher, Public Affairs, produced “In Defense of Looting: A Riotous History of Uncivil Action” by Vicky Osterweil.

The invasion of America’s Capitol by mobs of President Donald Trump’s supporters on January 6th was a reminder of the danger of playing with fire. It is naive to assume that mobs will be confined to the “nice” side of the political spectrum; the left-wing kind by their nature generate the right-wing sort. It is doubly naive to expect that mobs will set limits; it is in their nature to run out of control.

Democrats Have Been Shameless About Your Presidential Vote Too
By Derek T. Muller

… starting with George W. Bush’s victory in the 2000 presidential election, Democrats contested election results after every Republican win.

In January 2001, Representative Alcee Hastings of Florida objected to counting his state’s electoral votes because of “overwhelming evidence of official misconduct, deliberate fraud, and an attempt to suppress voter turnout.” Representative Sheila Jackson Lee of Texas referred to the “millions of Americans who have been disenfranchised by Florida’s inaccurate vote count.” Representative Maxine Waters of California characterized Florida’s electoral votes as “fraudulent.”

Vice President Al Gore presided over the meeting in 2001. He overruled these objections because no senator joined them. Part of the reason they didn’t join, presumably, was that Mr. Gore had conceded the election a month earlier.

In January 2005, in the wake of Mr. Bush’s re-election, Democrats were more aggressive. Senator Barbara Boxer of California joined Representative Stephanie Tubbs Jones of Ohio to lodge a formal objection to Ohio’s electoral votes. The objection compelled Congress to spend two hours in debate, even though Mr. Bush won Ohio by more than 118,000 votes.

Representative Barbara Lee of California claimed that “the Democratic process was thwarted.” Representative Jerrold Nadler of New York said that the right to vote was “stolen.” Ms. Waters objected too, dedicating her objection to the documentary filmmaker Michael Moore, whose 2004 movie “Fahrenheit 9/11” painted a dark (and at times factually debatable) picture of the Bush presidency.

The motion failed, but not before 31 members of the House, and Ms. Boxer in the Senate, voted to reject Ohio’s electoral votes — effectively voting to disenfranchise the people of Ohio in the Electoral College.

In January 2017, after Donald Trump’s victory, Democrats in Congress once again challenged the election outcome. Representative Jim McGovern of Massachusetts cited “the confirmed and illegal activities engaged by the government of Russia.” Ms. Lee of California argued that Michigan’s electoral votes should be thrown out because “people are horrified by the overwhelming evidence of Russian interference in our elections.” She also cited “the malfunction of 87 voting machines.”

There were objections against the votes in at least nine states. To his credit, Vice President Joe Biden rejected each objection on procedural grounds, stating that “there is no debate” and “it is over.”

More recent efforts by Democrats to throw out electoral votes went nowhere in large part because the losing candidates — Mr. Gore, John Kerry and Hillary Clinton — had conceded the election and did not encourage Congress to reject the vote. This election is different, of course: Mr. Trump continues to argue that the election was “stolen” and “rigged,” and that “ballot stuffing” took place, and Vice President Mike Pence has indicated support for efforts to challenge the election outcome in light of those claims.

But there is no evidence to support those claims. State officials have certified the election results. Without more evidence (and none seems likely to come out), the electoral votes from every state should be accepted by all members of Congress — including all Republicans.

Don’t Burn Down Democracy to Save It
By Zaid Jilani

If we were to equate lawful protected speech with the illegal violent acts of people who act in the name of that speech, it wouldn’t just entail silencing Trump. Take, for instance, the bulk of the political violence that was committed last year. Many of those who rioted in Kenosha, Portland, and elsewhere no doubt held misconceptions and exaggerations about the nature of policing; many of these beliefs were likely echoed by leading Democratic politicians.

As one recent example, Representative Ilhan Omar accused police of “state sanctioned murder” for a case where police were fired upon and returned fire, killing a young man. Omar’s conclusion was brash and irresponsible and indeed followed renewed clashes between Minneapolis activists and police.

But her tweet is also legal and protected speech. It’s not illegal to, for instance, say the phrase “abortion is murder” or “police are racists and killers.” It would be illegal for someone who happened to hold those beliefs to use violence against abortion providers or police. It’s not illegal for Trump to share false and ignorant conspiracy theories, even if it’s unwise for all sorts of reasons.

Removing either Trump or the Democratic politicians who echoed false beliefs about policing from the internet would be antithetical to the very liberal democracy that we all believe in. And it would probably cause those who believe in those conspiracies to point and say, “Ha! They’re suppressing us because we’re right! This is a conspiracy. They won’t even let the person we voted for speak!” (And they’d probably just end up on darker corners of the internet where they’d have even less exposure to people who disagree.)

I say this as someone who has actually believed in election conspiracies in the past. When, for instance, Democrats rose to challenge the 2004 certification—an event the party has basically memory holed in recent days but which I remember vividly—I recall thinking they had a point.

As a teenager who had browsed all manner of Democratic-leaning websites, I was convinced that something had gone amiss in Ohio. Maybe it was changing polling places. Maybe it was rigged voting machines. But Bush had probably stolen the election. Again.

The response from the Republican establishment and government and corporate institutions was not to enact some massive regime of censorship and expel dissenters. They basically made fun of Barbara Boxer, voted down her objections, and moved on.

Eventually, I moved on, too. Nobody stopped me from voicing my concerns on message boards or worked with corporate and governing elites to silence the members of Congress who shared them. Meeting dissent with suppression can breed extremism. Meeting it with argument at least stands a chance of eventually convincing someone that they’re wrong, and eventually I admitted that the reason John Kerry lost the 2004 election was because he was a terrible candidate, not because the Republicans went and pushed the big red “STEAL THE ELECTION” button.

But this time, it looks like things are going in a different direction. Liberals, with their hegemony over basically every governing and cultural institution, are understandably upset over the proliferation of right-wing conspiracy theories and refusal to admit that Biden won. They believe they should use their hegemony to silence dissenting voices and the president they support, either by expelling pro-Trump Republicans on the Hill or censoring them through big tech.

How Silicon Valley, in a Show of Monopolistic Force, Destroyed Parler
By Glenn Greenwald

Critics of Silicon Valley censorship for years heard the same refrain: tech platforms like Facebook, Google and Twitter are private corporations and can host or ban whoever they want. If you don’t like what they are doing, the solution is not to complain or to regulate them. Instead, go create your own social media platform that operates the way you think it should.

The founders of Parler heard that suggestion and tried. In August, 2018, they created a social media platform similar to Twitter but which promised far greater privacy protections, including a refusal to aggregate user data in order to monetize them to advertisers or algorithmically evaluate their interests in order to promote content or products to them. They also promised far greater free speech rights, rejecting the increasingly repressive content policing of Silicon Valley giants.

Over the last year, Parler encountered immense success. Millions of people who objected to increasing repression of speech on the largest platforms or who had themselves been banned signed up for the new social media company.

As Silicon Valley censorship radically escalated over the past several months — banning pre-election reporting by The New York Post about the Biden family, denouncing and deleting multiple posts from the U.S. President and then terminating his access altogether, mass-removal of right-wing accounts — so many people migrated to Parler that it was catapulted to the number one spot on the list of most-downloaded apps on Apple’s App Store, the sole and exclusive means which iPhone users have to download apps. “Overall, the app was the 10th most downloaded social media app in 2020 with 8.1 million new installs,” reported TechCrunch.

It looked as if Parler had proven critics of Silicon Valley monopolistic power wrong. Their success showed that it was possible after all to create a new social media platform to compete with Facebook, Instagram and Twitter. And they did so by doing exactly what Silicon Valley defenders long insisted should be done: if you don’t like the rules imposed by tech giants, go create your own platform with different rules.

But today, if you want to download, sign up for, or use Parler, you will be unable to do so. That is because three Silicon Valley monopolies — Amazon, Google and Apple — abruptly united to remove Parler from the internet, exactly at the moment when it became the most-downloaded app in the country.

It is true that one can find postings on Parler that explicitly advocate violence or are otherwise grotesque. But that is even more true of Facebook, Google-owned YouTube, and Twitter. And contrary to what many have been led to believe, Parler’s Terms of Service includes a ban on explicit advocacy of violence, and they employ a team of paid, trained moderators who delete such postings. Those deletions do not happen perfectly or instantaneously — which is why one can find postings that violate those rules — but the same is true of every major Silicon Valley platform.

And that’s to say nothing of the endless number of hypocrisies with Silicon Valley giants feigning opposition to violent rhetoric or political extremism. Amazon, for instance, is one of the CIA’s most profitable partners, with a $600 million contract to provide services to the agency, and it is constantly bidding for more. On Facebook and Twitter, one finds official accounts from the most repressive and violent regimes on earth, including Saudi Arabia, and pages devoted to propaganda on behalf of the Egyptian regime. Does anyone think these tech giants have a genuine concern about violence and extremism?

So why did Democratic politicians and journalists focus on Parler rather than Facebook and YouTube? Why did Amazon, Google and Apple make a flamboyant showing of removing Parler from the internet while leaving much larger platforms with far more extremism and advocacy of violence flowing on a daily basis?

In part it is because these Silicon Valley giants — Google, Facebook, Amazon, Apple — donate enormous sums of money to the Democratic Party and their leaders, so of course Democrats will cheer them rather than call for punishment or their removal from the internet. Part of it is because Parler is an upstart, a much easier target to try to destroy than Facebook or Google. And in part it is because the Democrats are about to control the Executive Branch and both houses of Congress, leaving Silicon Valley giants eager to please them by silencing their adversaries.

Violence in the Capitol, Dangers in the Aftermath
By Glenn Greenwald

One can condemn a particular act while resisting the attempt to inflate the dangers it poses. One can acknowledge the very real existence of a threat while also warning of the harms, often far greater, from proposed solutions. One can reject maximalist, inflammatory rhetoric about an attack (a War of Civilizations, an attempted coup, an insurrection, sedition) without being fairly accused of indifference toward or sympathy for the attackers.

Indeed, the primary focus of the first decade of my journalism was the U.S. War on Terror — in particular, the relentless erosions of civil liberties and the endless militarization of American society in the name of waging it. To make the case that those trends should be opposed, I frequently argued that the threat posed by Islamic radicalism to U.S. citizens was being deliberately exaggerated, inflated and melodramatized.

I argued that not because I believed the threat was nonexistent or trivial: I lived in New York City on 9/11 and remember to this day the excruciating horror from the smell and smoke emanating throughout Lower Manhattan and the haunting “missing” posters appended by desperate families, unwilling to accept the obvious reality of their loved ones’ deaths, to every lamp post on every street corner. I shared the same disgust and sadness as most other Americans from the Pulse massacre, the subway bombings in London and Madrid, the workplace mass shooting in San Bernardino.

My insistence that we look at the other side of the ledger — the costs and dangers not only from such attacks but also the “solutions” implemented in the name of stopping them — did not come from indifference towards those deaths or a naive view of those responsible for them. It was instead driven by my simultaneous recognition of the dangers from rights-eroding, authoritarian reactions imposed by the state, particularly in the immediate aftermath of a traumatic event. One need not engage in denialism or minimization of a threat to rationally resist fear-driven fanaticism — as Barbara Lee so eloquently insisted on September 14, 2001.

What we owe to Donald J Trump
By Branko Milanovic

American exceptionalism is, as the name says, based on an ideology of American preeminence, held to be earned and deserved on account of the unique virtù of the new republic. The preeminence of the USA clearly implies a structured hierarchical system of countries where the USA is on the top and other countries play subsidiary and inferior roles. The ultimate unspoken objective of that policy is mastery of the world. The US is not the first country to have entertained such dreams: from Egypt, Rome, the Christian Empire of Byzantium, the Muslim Empire, Charlemagne, the Huns, Tamerlane, Napoleon, Hitler, the Communist Empire of the USSR, the list is long. While achieving such an empire is most unlikely, the road to that objective is paved with wars. This is why the ideology of the “indispensable nation” almost by definition calls for, in Gore Vidal’s terms, “perpetual war for perpetual peace”.

Biden says ‘America is back’. But will his team of insiders repeat their old mistakes?
By Samuel Moyn

The chance Biden will end the misbegotten “war on terror” is vanishingly small – and not merely in Afghanistan and Iraq. Antony Blinken, Biden’s pick for secretary of state, will undo much of the damage Trump did to America’s foreign service and international reputation. But as he explained on a recent podcast, the new administration will ratify the shift away from the “large-scale” to the microscopic and visible to invisible strategies that Bush and Obama pioneered, as if the problem were just that Trump used them with even more gusto.

Avril Haines, whom Biden has nominated to direct national intelligence, helped both devise and limit targeted killings in a CIA stint. An eternal campaign of armed drones and special forces isn’t a fulfilment of a promise to “end endless wars”. It merely appropriates a slogan for the sake of continuity.

The continuity of personnel such as Blinken, who was Biden’s top aide when he voted for the invasion of Iraq, is the way restoration really works in practice. Susan Rice, former national security adviser under Obama and nearly Biden’s vice-presidential candidate, also mentioned for high office, has written glowingly that Biden brings “a deep bench of highly qualified, knowledgable experts”. What is less clear is whether these interventionist mainstays have learned enough in their promises to overturn Trump’s legacy while not recognising how much he both capitalised on and continued their own grievous errors.

Infinity War
By Samuel Moyn and Stephen Wertheim

Given World War II, Korea, Vietnam and many smaller conflicts throughout the Western Hemisphere, no one has ever mistaken the United States for Switzerland. Still, the pursuit of peace is an authentic American tradition that has shaped U.S. conduct and the international order. At its founding, the United States resolved to steer clear of the system of war in Europe and build a “new world” free of violent rivalry, as Alexander Hamilton put it.

Indeed, Americans shrank from playing a fully global role until 1941 in part because they saw themselves as emissaries of peace (even as the United States conquered Native American land, policed its hemisphere and took Pacific colonies). U.S. leaders sought either to remake international politics along peaceful lines — as Woodrow Wilson proposed after World War I — or to avoid getting entangled in the squabbles of a fallen world. And when America embraced global leadership after World War II, it felt compelled to establish the United Nations to halt the “scourge of war,” as the U.N. Charter says right at the start. At America’s urging, the organization outlawed the use of force, except where authorized by its Security Council or used in self-defense.

Even when the United States dishonored that ideal in the years that followed, peace remained potent as a guiding principle. Vietnam provoked a broad-based antiwar movement. Congress passed the War Powers Resolution (WPR) to tame the imperial presidency. Such opposition to war is scarcely to be found today. (The Iraq War inspired massive protests, but they are a distant memory.) Consider that the United States has undertaken more armed interventions since the end of the Cold War than during it. According to the Congressional Research Service, more than 80 percent of all of the country’s adventures abroad since 1946 came after 1989. Congress, whether under Democratic or Republican control, has allowed commanders in chief to claim the right to begin wars and continue them in perpetuity.

The World That War Has Made
By Margaret MacMillan

We may wish war away and hope that it only happens somewhere else or in another time, but it remains one of the key driving forces in human history. The long debate over whether making war is inherent in human nature or the result of culture and social institutions may never be settled. What is undeniable is that war and human society are so deeply interwoven that we cannot understand the development of one apart from the other. We must understand the nature of that relationship and the power and dynamics of war itself if we are to have any hope of bringing it under control.

Once humans changed from being nomads, foraging and hunting for what they needed, and settled in farming communities, they had something of value that others could take. Moving away was no longer the easy option, so humans learned to defend themselves with walls, weapons and, over time, distinctive warrior classes.

Defending your own group successfully—and, often, conquering others—was the basis of the first successful states and empires. War was both cause and consequence. A state that has more resources and uses them more effectively is likely to triumph over its poorer, weaker rivals. War went hand in hand with the development of strong government and bigger political units. Many great powers of the past—such as the Egyptian, Tang, Roman and Aztec empires—were created and maintained by their militaries, even if those forces were overlaid by such things as a shared reverence for the ruler or the gods.

Even societies in which democratic norms are deeply embedded will resort to force to defend themselves from internal subversion or external threats. During the two world wars, governments in democracies such as Britain or the U.S. exerted controls over society—over consumption, the right of movement or the allocation of resources—that would seem intolerable in peacetime. And through conscription, democracies forced their young to serve, fight and die in their armed forces.

The growth of strong central states made war more deadly and destructive. By mobilizing and extracting resources from their societies, governments could field larger forces and sustain them in campaigns for longer. The huge increase in productive capacity as a result of the Industrial Revolution, starting in the 19th century, made possible large conscript armies that could remain locked in combat for months and then years on end. The income tax, a rare and light burden in the U.S. before 1914, steadily grew, first as an emergency measure during World War I and then afterward as American governments found that they could tax more heavily without breaking the economy.

What seems too expensive in peacetime becomes a necessity in wartime, and societies somehow find the resources. Near where I live in Oxford, a historical plaque on the wall of what used to be a hospital notes that the first successful use of penicillin to treat an infection occurred there in 1941. The drug, which by now has saved millions of lives, was developed in 1928 but considered too expensive to produce until World War II.

In 1899, the British government and public were horrified when one out of three volunteers for the colonial war in South Africa was rejected as unfit. That made public health a matter of national security and led to such innovations as free school meals for poor children. As war became more mechanized and weapons grew more sophisticated in the 19th century, armed forces needed better-educated men, which stimulated spending on public education. During the Cold War, the U.S. government poured huge resources for defense purposes into American universities; that produced, among much else, the research that made Silicon Valley possible.

Paradoxically, preparing for war or fighting it can increase the bargaining power of citizens in their society and in their relations with their governments. Political elites learned that concessions—extending the franchise or providing social services—rather than coercion were the more effective way to win loyalty and cooperation. As a member of the Russian upper classes said before World War I, “It is impossible to govern against the people when it is necessary to turn to it for the defense of Russia.” (The 1917 Russian Revolution showed what happens when the people withdraw their support.)

Wars have left their imprints on our societies in other equally important ways: in our values, our arts, our language, in the very way we define ourselves. Homer’s “The Iliad,” Tolstoy’s “War and Peace” and Francis Ford Coppola’s “Apocalypse Now” were all created out of the horror and, yes, the fascination of war. For most Germans, memories of World War II are a burden of shame, while for Russians, they are a proud story of fortitude, resistance and triumph. The U.S. War of Independence and the Civil War are part of what it is to be American, even if Americans still disagree over those conflicts’ meanings.

Humans have always fought for causes they see as greater than themselves. The Arab warriors who fanned out from the Arabian Peninsula in the 7th century swept away old regimes, conquered rich and sophisticated cities, and left a lasting imprint on the Middle East and much of southern Europe because they sought a divine reward. The citizen-soldiers of the French Revolution terrified and demoralized their opponents because they swarmed across the battlefields singing revolutionary songs and taking risks that more prudent soldiers would not. Many of the millions who fought in World War I and II were inspired by nationalism, a sense that something called the nation would survive them and was worth dying for.

Individuals and nations also fight out of supposedly cool calculation, to gain something—whether land, loot or slaves—without paying too high a cost. The Spanish adventurers who toppled the Aztec and Inca empires found that the prospect of fabulous wealth outweighed the risk of death. In the 18th and 19th centuries, the European powers waged “cabinet wars” devised for limited ends. The great statesman Otto von Bismarck picked off Prussia’s enemies one by one in short wars to create the modern state of Germany.

It was a dangerous precedent. The leaders and the military who made the plans before World War I thought they could also manage the conflict, but that war, as so often happens, ran out of their control. As Bismarck’s compatriot, Carl von Clausewitz, said, war has its own logic.

Colin Powell Still Wants Answers
By Robert Draper

Some of the core mysteries that still hung over the most consequential American foreign-policy decision in a half-century, I found, remained mysteries even to Powell. At one point during our first conversation in 2018, he paraphrased a line about Iraq’s supposed weapons of mass destruction from the intelligence assessment that had informed his U.N. speech, which intelligence officials had assured him was rock solid: “ ‘We judge that they have 100 to 500 metric tons of chemical weapons, all produced within the last year.’ How could they have known that?” he said with caustic disbelief.

I told Powell I intended to track down the authors of that assessment. Smirking, he replied, “You might tell them I’m curious about it.”

Not long after meeting Powell, I did manage to speak to several analysts who helped produce the classified assessment of Iraq’s supposed weapons program and who had not previously talked with reporters. In fact, I learned, there was exactly zero proof that Hussein had a chemical-weapons stockpile. The C.I.A. analysts knew only that he once had such a stockpile, before the 1991 Persian Gulf war, and that it was thought to be as much as 500 metric tons before the weapons were destroyed. The analysts had noted what seemed to be recent suspicious movement of vehicles around suspected chemical-weapons plants. There also seemed to be signs — though again, no hard proof — that Iraq had an active biological-weapons program, so, they reasoned, the country was probably manufacturing chemical weapons as well. This was, I learned, typical of the prewar intelligence estimates: They amounted to semi-educated guesses built on previous and seldom-challenged guesses that always assumed the worst and imagined deceptiveness in everything the Iraqi regime did. The analysts knew not to present these judgments as facts. But that distinction had become lost by the time Powell spoke before the U.N.

Moreover, a circular reasoning guided the intelligence community’s prewar estimates. As an intelligence official — one of many who spoke to me on the condition of anonymity — said: “We knew where we were headed, and that was war. Which ironically made it that much more difficult to change the analytic line that we’d stuck with for 10 years. For 10 years, it was our pretty strong judgment that Saddam had chemical capability.” Whether or not this was still true, “with American soldiers about to go in, we weren’t going to change our mind and say, ‘Never mind.’”

In early December, word reached the C.I.A. that the White House wanted it to prepare an oral presentation on Iraq’s weapons program that would feature an “Adlai Stevenson moment” — referring to the 1962 episode in which the U.S. ambassador to the U.N. presented open-and-shut photographic evidence of Soviet ballistic-missile installations in Cuba. The timing of the request seemed odd, given that Hans Blix, the U.N.’s chief weapons inspector, and his team were already in Iraq and would presumably be furnishing on-the-ground visual proof of Hussein’s arsenal, if it existed, any day now. The fact that such a presentation was being ordered up was tantamount to a White House vote of no confidence in Blix.

The presentation was referred to internally at the C.I.A. as the Case. That Tenet did not resist the request suggested that the agency had crossed a red line. “The first thing they teach you in C.I.A. 101 is you don’t help them make the case,” said an agency official who was involved in the project. “But we were all infected in the case for war.”

Edge Master Class 2015: A Short Course in Superforecasting, Class I
By Edge

Hillis: There’s another dimension to it, which is that the very act of taking evidence and using that to adjust your prediction demands a framework of explanation. You have an operative story that you think, “Okay, well, this is how Hillary is going to get elected.” If you look at the failure of the intelligence community in the case of WMD, it was a failure of storytelling, not so much a failure of evidence interpretation. Retrospectively, you could tell a story of Saddam deliberately pretending like he had weapons of mass destruction, there’s a story about his generals lying to him; none of those stories were told beforehand.

Had those stories been told beforehand, the evidence could have been interpreted differently. The same evidence could have been interpreted differently. In fact, that’s part of what I was saying about the job of the intelligence community in storytelling is providing the frameworks in which you can interpret evidence and make predictions.

Tetlock: Right, right.

Brand: So you want red team storytellers.

Hillis: Well, yes.

Tetlock: Part of the official operating procedure of the CIA and officials is to have that. In the case of Iraq WMD, that process broke down. You had the situation where the Director of the CIA did say to the President of the United States, “It’s a slam-dunk, Mr. President.”

Axelrod: It’s a slam-dunk to convince the American public.

Tetlock: That’s what George Tenet says.

Axelrod: That’s right. Well, that’s what he did say. We have records on that.

Tetlock: Well, that’s not what everyone says. George Tenet thinks he was blamed by the White House unfairly. The White House leaked it out of context. But it is fair to say that, even to put slam-dunk aside, the U.S. intelligence community was sending out a very decisive affirmation that Iraq did have an active weapons of mass destruction program; that was manifestly true. If the U.S. intelligence community had institutionalized forecasting tournaments, you would have created an organizational culture in which they would be much more reticent about ever using the term slam-dunk, whether at the level of PR or at the level of actuality because slam-dunk means 100 percent. 100 percent means I’m willing to give you infinite odds. That’s quite extraordinary.

Beware ‘faux experts’ who don’t pay for their actions, Nassim Taleb says
By Paul Solman

PAUL SOLMAN: I first interviewed you in 2006. “Black Swan” hadn’t even come out yet. Then came “Black Swan,” the book. Then came the crash of ’08. You became famous for warning people, having warned people, about extreme events and how cataclysmic they could be, right?

NASSIM NICHOLAS TALEB: The reason people paid attention to my work was because I had skin in the game at the time. I was involved. I was taking risks.

PAUL SOLMAN: You were a trader.

NASSIM NICHOLAS TALEB: I was involved. I was eating my risk. Owning my own risk, as I write in the book.

PAUL SOLMAN: Are there black swans on the horizon now? What are you betting on?

NASSIM NICHOLAS TALEB: The point is, the system is fragile because we had a lot of debt. Plus, there are a bunch of things that have been developing that I’m not comfortable with, developing over, say, the last 20 years, but mostly the last 10 years, I’m not comfortable with.

PAUL SOLMAN: They are?

NASSIM NICHOLAS TALEB: It’s that rise of the class, the no-skin-in-the-game class in decision-making.

PAUL SOLMAN: The no-skin-in-the-game class?

NASSIM NICHOLAS TALEB: Exactly. Decision-makers who can drag you into intervention, can drag you into policies that cosmetically feel good, but eventually, somebody pays a price and it’s not them.

There are two levels. The first one, and the most obvious one, is people who intervene in Iraq, thinking, “Hey, we’re going to bring democracy,” or some abstract concept. The thing falls apart, and they walk away from it. They’re not committed with living or owning the toy. They broke it. They don’t own it. Then, the same people make the same mistake with Libya and then now currently with Syria, the warmongers. In the past, historically, warmongers were soldiers. You could not rise in a senate if you didn’t have war experience. [Today if] you have a class of people who inflict risk on others without being affected by the outcome, that class of people is going to disrupt the system, causing some kind of collapse.

PAUL SOLMAN: Do you see some kind of collapse on the horizon?

NASSIM NICHOLAS TALEB: I can see some severe distortions now from that class of people deciding to “fix” things and, effectively, not paying the price.

President Bush Discusses Freedom in Iraq and Middle East
By President George W. Bush

Successful societies limit the power of the state and the power of the military — so that governments respond to the will of the people, and not the will of an elite. Successful societies protect freedom with the consistent and impartial rule of law, instead of selecting applying — selectively applying the law to punish political opponents. Successful societies allow room for healthy civic institutions — for political parties and labor unions and independent newspapers and broadcast media. Successful societies guarantee religious liberty — the right to serve and honor God without fear of persecution. Successful societies privatize their economies, and secure the rights of property. They prohibit and punish official corruption, and invest in the health and education of their people. They recognize the rights of women. And instead of directing hatred and resentment against others, successful societies appeal to the hopes of their own people. (Applause.)

These vital principles are being applied in the nations of Afghanistan and Iraq. With the steady leadership of President Karzai, the people of Afghanistan are building a modern and peaceful government. Next month, 500 delegates will convene a national assembly in Kabul to approve a new Afghan constitution. The proposed draft would establish a bicameral parliament, set national elections next year, and recognize Afghanistan’s Muslim identity, while protecting the rights of all citizens. Afghanistan faces continuing economic and security challenges — it will face those challenges as a free and stable democracy. (Applause.)

In Iraq, the Coalition Provisional Authority and the Iraqi Governing Council are also working together to build a democracy — and after three decades of tyranny, this work is not easy. The former dictator ruled by terror and treachery, and left deeply ingrained habits of fear and distrust. Remnants of his regime, joined by foreign terrorists, continue their battle against order and against civilization. Our coalition is responding to recent attacks with precision raids, guided by intelligence provided by the Iraqis, themselves. And we’re working closely with Iraqi citizens as they prepare a constitution, as they move toward free elections and take increasing responsibility for their own affairs. As in the defense of Greece in 1947, and later in the Berlin Airlift, the strength and will of free peoples are now being tested before a watching world. And we will meet this test. (Applause.)

Securing democracy in Iraq is the work of many hands. American and coalition forces are sacrificing for the peace of Iraq and for the security of free nations. Aid workers from many countries are facing danger to help the Iraqi people. The National Endowment for Democracy is promoting women’s rights, and training Iraqi journalists, and teaching the skills of political participation. Iraqis, themselves — police and border guards and local officials — are joining in the work and they are sharing in the sacrifice.

This is a massive and difficult undertaking — it is worth our effort, it is worth our sacrifice, because we know the stakes. The failure of Iraqi democracy would embolden terrorists around the world, increase dangers to the American people, and extinguish the hopes of millions in the region. Iraqi democracy will succeed — and that success will send forth the news, from Damascus to Teheran — that freedom can be the future of every nation. (Applause.) The establishment of a free Iraq at the heart of the Middle East will be a watershed event in the global democratic revolution. (Applause.)

U.S. Warns Iraq It Risks Losing Access to Key Bank Account if Troops Told to Leave
By Ian Talley and Isabel Coles

The State Department warned that the U.S. could shut down Iraq’s access to the country’s central bank account held at the Federal Reserve Bank of New York, a move that could jolt Iraq’s already shaky economy, the officials said.

Iraq, like other countries, maintains government accounts at the New York Fed as an important part of managing the country’s finances, including revenue from oil sales. Loss of access to the accounts could restrict Iraq’s use of that revenue, creating a cash crunch in Iraq’s financial system and constricting a critical lubricant for the economy.

The prospect of U.S. sanctions against Iraq arose after the Jan. 3 U.S. airstrike that killed Iranian Maj. Gen. Qassem Soleimani at Baghdad International Airport. The Iraqi parliament voted Sunday to urge Prime Minister Adel Abdul-Mahdi to work toward the expulsion of the approximately 5,300 U.S. troops.

In response to the nonbinding resolution, which was backed by the prime minister, President Trump threatened to impose sanctions against Iraq if the U.S. was forced to withdraw its troops.

The financial threat isn’t theoretical: The country’s financial system was squeezed in 2015 when the U.S. suspended access for several weeks to the central bank’s account at the New York Fed over concerns the cash was filtering through a loosely regulated market into Iranian banks and to the Islamic State extremist group.

“The U.S. Fed basically has a stranglehold on the entire [Iraqi] economy,” said Shwan Taha, chairman of Iraqi investment bank Rabee Securities.

The prospect of sanctions has unsettled ordinary Iraqis, for whom memories of living under a United Nations embargo during the 1990s are still fresh. Pro-Iranian and other Shiite factions leading the charge to oust U.S. forces from Iraq have sought to reassure the public by telling them Iraq could pivot to China.

An adviser to the prime minister, Abd al-Hassanein al-Hanein, said that while the threat of sanctions was a concern, he did not expect the U.S. to go through with it. “If the U.S. does that, it will lose Iraq forever,” he said.

Leaked Iranian Intelligence Reports Expose Tehran’s Vast Web of Influence in Iraq
By James Risen, Tim Arango, Farnaz Fassihi, Murtaza Hussain and Ronen Bergman

The leaked cables offer an extraordinary glimpse inside the secretive Iranian regime. They also detail the extent to which Iraq has fallen under Iranian influence since the American invasion in 2003, which transformed Iraq into a gateway for Iranian power, connecting the Islamic Republic’s geography of dominance from the shores of the Persian Gulf to the Mediterranean Sea.

The trove of leaked Iranian intelligence reports largely confirms what was already known about Iran’s firm grip on Iraqi politics. But the reports reveal far more than was previously understood about the extent to which Iran and the United States have used Iraq as a staging area for their spy games. They also shed new light on the complex internal politics of the Iranian government, where competing factions are grappling with many of the same challenges faced by American occupying forces as they struggled to stabilize Iraq after the United States invasion.

And the documents show how Iran, at nearly every turn, has outmaneuvered the United States in the contest for influence.

… by and large, the intelligence ministry operatives portrayed in the documents appear patient, professional, and pragmatic. Their main tasks are to keep Iraq from falling apart; from breeding Sunni militants on the Iranian border; from descending into sectarian warfare that might make Shia Muslims the targets of violence; and from spinning off an independent Kurdistan that would threaten regional stability and Iranian territorial integrity. The Revolutionary Guards and Suleimani have also worked to eradicate the Islamic State, but with a greater focus on maintaining Iraq as a client state of Iran and making sure that political factions loyal to Tehran remain in power.

This portrait is all the more striking at a time of heightened tensions between the United States and Iran. Since 2018, when President Donald Trump pulled out of the Iran nuclear deal and reimposed sanctions, the White House has rushed ships to the Persian Gulf and reviewed military plans for war with Iran. In October, the Trump administration promised to send American troops to Saudi Arabia following attacks on oil facilities there for which Iran was widely blamed.

In a sense, the leaked Iranian cables provide a final accounting of the 2003 U.S. invasion of Iraq. The notion that the Americans handed control of Iraq to Iran when they invaded now enjoys broad support, even within the U.S. military. A recent two-volume history of the Iraq War, published by the U.S. Army, details the campaign’s many missteps and its “staggering cost” in lives and money. Nearly 4,500 American troops were killed, hundreds of thousands of Iraqis died, and American taxpayers spent up to $2 trillion on the war. The study, which totals hundreds of pages and draws on declassified documents, concludes: “An emboldened and expansionist Iran appears to be the only victor.”

Iran’s rise as a power player in Iraq was in many ways a direct consequence of Washington’s lack of any post-invasion plan. The early years following the fall of Saddam were chaotic, both in terms of security and in the lack of basic services like water and electricity. To most observers on the ground, it appeared as if the United States was shaping policy on the go, and in the dark.

Qasem Soleimani: US kills top Iranian general in Baghdad air strike
By BBC News

Iran’s most powerful military commander, Gen Qasem Soleimani, has been killed by a US air strike in Iraq.

The 62-year-old spearheaded Iranian military operations in the Middle East as head of Iran’s elite Quds Force.

He was killed at Baghdad airport, along with other Iran-backed militia figures, early on Friday in a strike ordered by US President Donald Trump.

Mr Trump said the general was “directly and indirectly responsible for the deaths of millions of people”.

Now It Can Be Told: How Neil Sheehan Got the Pentagon Papers
By Janny Scott

The Pentagon Papers, arguably the greatest journalistic catch of a generation, were a secret history of United States decision-making on Vietnam, commissioned in 1967 by the secretary of defense. Their release revealed for the first time the extent to which successive White House administrations had intensified American involvement in the war while hiding their own doubts about the chances of success.

Recounting the steps that led to his breaking the story, Mr. Sheehan told of aliases scribbled into the guest registers of Massachusetts motels; copy-shop machines crashing under the burden of an all-night, purloined-document load; photocopied pages stashed in a bus-station locker; bundles belted into a seat on a flight from Boston; and telltale initials incinerated in a diplomat’s barbecue set.

He also revealed that he had defied the explicit instructions of his confidential source, whom others later identified as Daniel Ellsberg, a former Defense Department analyst who had been a contributor to the secret history while working for the Rand Corporation. In 1969, Mr. Ellsberg had illicitly copied the entire report, hoping that making it public would hasten an end to a war he had come passionately to oppose.

Contrary to what is generally believed, Mr. Ellsberg never “gave” the papers to The Times, Mr. Sheehan emphatically said. Mr. Ellsberg told Mr. Sheehan that he could read them but not make copies. So Mr. Sheehan smuggled the papers out of the apartment in Cambridge, Mass., where Mr. Ellsberg had stashed them; then he copied them illicitly, just as Mr. Ellsberg had done, and took them to The Times.

In the interview in 2015, Mr. Sheehan said he had never revealed Mr. Ellsberg’s identity while the project was underway. To his editors he always spoke only of “the sources.” It was another journalist, outside the paper, who blew Mr. Ellsberg’s cover not long after the Pentagon Papers story broke.

Nor did Mr. Sheehan ever speak about how he had obtained the papers. In 2015, he said he had never wanted to contradict Mr. Ellsberg’s account or embarrass him by describing Mr. Ellsberg’s behavior and state of mind at the time.

There was no contact between the two men for six months. Shortly before Christmas 1971, Mr. Sheehan said, they ran into each other in Manhattan. In a brief conversation, he said, he told Mr. Ellsberg what he had done.

“So you stole it, like I did,” he recalled Mr. Ellsberg saying.

“No, Dan, I didn’t steal it,” Mr. Sheehan said he had answered. “And neither did you. Those papers are the property of the people of the United States. They paid for them with their national treasure and the blood of their sons, and they have a right to it.’”

‘Modern-day Pentagon Papers’: Comparing the Afghanistan Papers to blockbuster Vietnam War study
By Gillian Brockell

… both the Pentagon Papers and the Afghanistan Papers tell a damning story of officials at the highest levels of government, in both Democratic and Republican administrations, misleading the public about the true nature of the foreign wars they waged.

As the Times put it in their first stories, the Pentagon Papers revealed that the White House “concealed from the Congress and the public as much as possible,” that “vigorous, even acrimonious, debate within the Government” was also hidden, and “that the American intelligence community repeatedly provided the policy makers with what proved to be accurate warnings that desired goals were either unattainable or likely to provoke costly reactions from the enemy.”

The Afghanistan Papers are just as damning. John Sopko, the head of SIGAR, told The Post the interviews show “the American people have constantly been lied to” about the war in Afghanistan.

Comparisons of the wars in Vietnam and Afghanistan are not new; after all, Vietnam was once nicknamed “the longest war” — until the conflict in Afghanistan took the title.

But on Monday, as the revelations reverberated across the Internet, the similarities appeared even more uncanny.

“I’ve been wondering when our ‘Pentagon Papers’ would be released,” a Newsweek reporter commented.

“Imagine growing up with a war that became famous for its poor execution and management,” one reader tweeted, “and then imagine making the exact same mistakes once your generation comes into power.”

Another reader tweeted: “My brother was killed in action in Vietnam in 1970. Hearing this is not easy. We haven’t learned anything from history. For all of the Gold Star families from the Afghanistan War, I extend my sincere sorrow.”

‘Our friends didn’t have to die’: Afghanistan Papers surface pain and familiar frustrations
By Alex Horton

For Marine Corps veteran Dustin Kelly, the report reignites the agony of not knowing precisely what comrades gave their lives for.

“The most traumatic experiences of our lives didn’t have to happen, our friends didn’t have to die on the other side of the planet,” Kelly, who served as a mortar man in Helmand province in 2010 to retake the Taliban’s stronghold, told The Post on Monday.

Kelly, the Marine Corps veteran, saw through one common Pentagon talking point — that fierce fighting indicated the desperation of insurgents — once his unit understood their Afghan mission was to essentially become a tourniquet on a war that had gone from bad to worse.

“Nobody could have possibly looked at that and thought, ‘Yeah, we’re winning,’ ” he said.

Defense officials, commanders and government staffers from the start underestimated the complexities of the fight, the documents revealed, and it spiraled out of control when the mission to dislodge al-Qaeda fighters and Taliban militants morphed into something else.

The mission became nation-building on top of a superstructure of “kleptocracy” and mind-bending corruption, even down to the police patrolman level. The result: 43,074 Afghan civilians killed, 2,300 U.S. troops dead and 64,124 Afghan troops and police killed in the insurgency that may be as strong as it ever was.

A central tenet for the Pentagon’s strategy — build the capacity of Afghan forces — was often rejected by commanders more attuned to fighting than training foreign troops. That may have “doomed the whole effort from the start,” said Charles Duncan, a former Army signals intelligence officer who served in Afghanistan in 2013.

“I thought the war’s absurdity was only visible to those of us at the lower levels,” Duncan said. “Now I know that our pessimism was shared by officials all the way up the chain of command, and yet we all acted as though the war were winnable.”

What do veterans do with this now?

Kelly has struggled to explain his combat experience to Americans who have never served and don’t know anyone else who has.

“I keep thinking that if I read enough about war, and rationalize and intellectualize enough, then something will snap, and the knots will untie and I’ll understand it all,” he said.

“I think I’ll spend the rest of my life trying to figure out why the hell any of us were in Afghanistan.”

Column: Pat Tillman’s sacrifice is an important reminder of what Memorial Day is all about
By LZ Granderson

All of which brings me to Pat Tillman, the former Arizona Cardinal who famously left the NFL in May 2002 to enlist in the Army shortly after the Sept. 11 attacks. Tillman and his brother Kevin, who left the Cleveland Indians farm system, both were assigned to 2nd Battalion — 75th Ranger Regiment. In July 2003 the pair were awarded the Arthur Ashe Courage Award by ESPN. On April 8, 2004, they arrived in Afghanistan. On April 22, Pat Tillman was killed by friendly fire in the province of Khost. His memorial service was broadcast on national television.

I never had the honor of meeting Tillman, but I’ve always felt compelled to remember his name and story, especially this time of year.

Not many people would walk away from a three-year, $3.6-million contract and the comforts that come with being a professional athlete to help the country. Hell, we have professional athletes today who are hesitant to play in an empty stadium to help the country. A comparison, mind you, not made to shame the players in a COVID-19 world, but rather to emphasize how special Tillman’s sacrifice was and why it is important to remember his name — even as we struggle to recall what Memorial Day is about.

The Good Soldier
By Dexter Filkins

While most of the facts have been re­ported before, Krakauer performs a valuable service by bringing them all together — particularly those about the cover-up. The details, even five years later, are nauseating to read: After Tillman’s death, Army commanders, aided and abetted by members of the Bush administration, violated many of their own rules, not to mention elementary standards of decency, to turn the killing into a propaganda coup for the American side. Tillman’s clothing and notebooks were burned — a flouting of Army regulations — and he was fast-tracked for a posthumous Silver Star, which, as Krakauer shows, was a fraud. Members of his unit were ordered to stay silent about the manner of his death. Even part of Tillman’s body disappeared. Most important, Army commanders went to great lengths to keep the facts of Tillman’s death a secret and allowed the story that he died at the hands of the Taliban to flourish. The low point came at his memorial service, where he was lionized before television cameras, while officials who knew the truth stayed quiet.

Krakauer doesn’t nail down precisely who gave the initial order to conceal the manner of Tillman’s death, but he demonstrates conclusively that the White House was happy to peddle the story that he’d been killed by enemy fire. It makes sense: at the time of Tillman’s death, the Abu Ghraib scandal in Iraq was dominating the news.

Getting to the truth of Pat Tillman’s death
By Michael Ordoña

Filmmaker Amir Bar-Lev tries to paint a more complete portrait of the fallen soldier, and to chronicle the family’s struggle to uncover the whole truth in the documentary “The Tillman Story,” which opens in L.A. theaters on Friday. It was the football star’s refusal to comment on his motivations for joining up that left them open to so much interpretation, Bar-Lev says. “Nature abhors a vacuum and, in the same way, storytellers — media — abhor a vacuum. Everybody came in and said things for him. And the things they said would have embarrassed the hell out of him. They were actually the opposite of his perspective.”

Russell Baer, a specialist with the Rangers who was close friends with both Tillman and his brother Kevin (who joined the service together in May 2002), says, “It would have been easy to say, ‘There’s an investigation and there’s a possibility of friendly fire.’ But they ran with this pumped-up narrative of this guy running up a hill, blah blah blah. Everything you saw in the media was completely … wrong.

“You also have to understand what was going on at that time: It was the worst month in the war yet, the most casualties; the Abu Ghraib prison scandal was just breaking,” Baer says. “The true story coming out would have damaged public support for the war. He was the most famous soldier and he was killed by the military. Of course they’re going to spin it and pray the family doesn’t do anything about it.”

“There was an effort at the highest levels of government to manipulate the media about Pat’s death,” Bar-Lev says, citing a memo written by then-Lt. Gen. Stanley McChrystal that indicates top decision makers were aware of the true circumstances of Tillman’s death while they wove a very different narrative for the public. “People who do things like that shouldn’t be in charge of our troops,” says the director. “Their disrespect for the soldiers on the ground should make military families and the men and women who serve our country outraged.

“The basic thing we owe the people who put their lives on the line for the country is the truth, and we owe their families the truth too. They can handle the truth.”

At war with the truth
By Craig Whitlock

The Lessons Learned interviews contain few revelations about military operations. But running throughout are torrents of criticism that refute the official narrative of the war, from its earliest days through the start of the Trump administration.

At the outset, for instance, the U.S. invasion of Afghanistan had a clear, stated objective — to retaliate against al-Qaeda and prevent a repeat of the Sept. 11, 2001, attacks.

Yet the interviews show that as the war dragged on, the goals and mission kept changing and a lack of faith in the U.S. strategy took root inside the Pentagon, the White House and the State Department.

Fundamental disagreements went unresolved. Some U.S. officials wanted to use the war to turn Afghanistan into a democracy. Others wanted to transform Afghan culture and elevate women’s rights. Still others wanted to reshape the regional balance of power among Pakistan, India, Iran and Russia.

The Lessons Learned interviews also reveal how U.S. military commanders struggled to articulate who they were fighting, let alone why.

Was al-Qaeda the enemy, or the Taliban? Was Pakistan a friend or an adversary? What about the Islamic State and the bewildering array of foreign jihadists, let alone the warlords on the CIA’s payroll? According to the documents, the U.S. government never settled on an answer.

As a result, in the field, U.S. troops often couldn’t tell friend from foe.

The specter of Vietnam has hovered over Afghanistan from the start.

On Oct. 11, 2001, a few days after the United States started bombing the Taliban, a reporter asked Bush: “Can you avoid being drawn into a Vietnam-like quagmire in Afghanistan?”

“We learned some very important lessons in Vietnam,” Bush replied confidently. “People often ask me, ‘How long will this last?’ This particular battlefront will last as long as it takes to bring al-Qaeda to justice. It may happen tomorrow, it may happen a month from now, it may take a year or two. But we will prevail.”

In those early days, other U.S. leaders mocked the notion that the nightmare of Vietnam might repeat itself in Afghanistan.

“All together now — quagmire!” Rumsfeld joked at a news conference on Nov. 27, 2001.

But throughout the Afghan war, documents show that U.S. military officials have resorted to an old tactic from Vietnam — manipulating public opinion.

In news conferences and other public appearances, those in charge of the war have followed the same talking points for 18 years. No matter how the war is going — and especially when it is going badly — they emphasize how they are making progress.

Even when casualty counts and other figures looked bad, the senior NSC official said, the White House and Pentagon would spin them to the point of absurdity. Suicide bombings in Kabul were portrayed as a sign of the Taliban’s desperation, that the insurgents were too weak to engage in direct combat. Meanwhile, a rise in U.S. troop deaths was cited as proof that American forces were taking the fight to the enemy.

“It was their explanations,” the senior NSC official said. “For example, attacks are getting worse? ‘That’s because there are more targets for them to fire at, so more attacks are a false indicator of instability.’ Then, three months later, attacks are still getting worse? ‘It’s because the Taliban are getting desperate, so it’s actually an indicator that we’re winning.’ ”

“And this went on and on for two reasons,” the senior NSC official said, “to make everyone involved look good, and to make it look like the troops and resources were having the kind of effect where removing them would cause the country to deteriorate.”

Afghan papers reveal US public were misled about unwinnable war
By Peter Beaumont

In one scathing assessment Douglas Lute, a lieutenant general who served as the White House Afghan war tsar during the George W Bush and Barack Obama administrations, told interviewers in 2015: “We were devoid of a fundamental understanding of Afghanistan – we didn’t know what we were doing.”

Speaking frankly, like other interviewees, on the understanding that what he was saying at the time was confidential, he added: “What are we trying to do here? We didn’t have the foggiest notion of what we were undertaking.

“If the American people knew the magnitude of this dysfunction … 2,400 lives lost.”

In another interview Jeffrey Eggers, a retired Navy Seal and White House staffer for Bush and Obama, said: “What did we get for this $1tn effort? Was it worth $1tn?

“After the killing of Osama bin Laden, I said that Osama was probably laughing in his watery grave considering how much we have spent on Afghanistan.”

Some of those interviewed by the Sigar project went even further, suggesting a deliberate effort to alter statistics on the war to suggest to the American public that it was being won.

“Every data point was altered to present the best picture possible,” said Bob Crowley, an army colonel who served as a senior counterinsurgency adviser to US military commanders in 2013 and 2014.

“Surveys, for instance, were totally unreliable but reinforced that everything we were doing was right and we became a self-licking ice-cream cone.”

U.S. officials failed to devise a clear strategy for the war in Afghanistan, confidential documents show
By Craig Whitlock

In unusually candid interviews, officials who served under Presidents George W. Bush and Barack Obama said both leaders failed in their most important task as commanders in chief — to devise a clear strategy with concise, attainable objectives.

Diplomats and military commanders acknowledged they struggled to answer simple questions: Who is the enemy? Whom can we count on as allies? How will we know when we have won?

Their strategies differed, but Bush and Obama both committed early blunders that they never recovered from, according to the interviews.

After a succession of quick military victories in 2001 and early 2002, Bush decided to keep a light force of U.S. troops in Afghanistan indefinitely to hunt suspected terrorists. Soon, however, he made plans to invade another nation — Iraq — and Afghanistan quickly became an afterthought.

James Dobbins, a career diplomat who served as a special envoy for Afghanistan under Bush and Obama, told government interviewers it was a hubristic mistake that should have been obvious from the start.

By the time Obama took office in 2009, al-Qaeda had largely vanished from Afghanistan. But the Taliban had made a comeback.

Obama tore up Bush’s counterterrorism strategy and approved a polar-opposite plan — a massive counterinsurgency campaign, backed by 150,000 U.S. and NATO troops, as well as tons of aid for a weak Afghan government.

In contrast with Bush, Obama imposed strict deadlines and promised to bring home all U.S. troops by the end of his presidency.

But Obama’s strategy was also destined to fail. U.S., NATO and Afghan officials told government interviewers that it tried to accomplish too much, too quickly, and depended on an Afghan government that was corrupt and dysfunctional.

Worse, they said, Obama tried to set artificial dates for ending the war before it was over. All the Taliban had to do was wait him out.

Today, about 13,000 U.S. troops are still in Afghanistan. The U.S. military acknowledges the Taliban is stronger now than at any point since 2001. Yet there has been no comprehensive public reckoning for the strategic failures behind the longest war in American history.

Years after they fought in Afghanistan, US troops watch as their children deploy to the same war
By J.P. Lawrence and Phillip Walter Wellman

Nineteen years ago on Wednesday, a generation of Americans deployed to Afghanistan to root out the terrorists behind the 9/11 attacks, believing that by fighting in the country more than 7,400 miles away, they would spare their children the need to do so too.

But as the U.S. war in Afghanistan begins its 20th year, some of those same service members have watched as their sons and daughters have deployed to continue the fight.

“When we started this, people asked why I was going, and my response was, ‘So my sons don’t have to fight this war,’” said Master Sgt. Trevor deBoer, who has deployed to Afghanistan three times with the 20th Special Forces Group since 2002.

Nearly two decades later, deBoer’s son, Spc. Payton Sluss, also served in Afghanistan — including at Forward Operating Base Fenty, north of the city of Jalalabad, where deBoer had served.

“My feet were walking the same land you were,” Sluss said to his father in a joint phone interview with Stars and Stripes.

Built to fail
By Craig Whitlock

George W. Bush, Barack Obama and Donald Trump all promised the same thing: The United States would not get stuck with the burden of “nation-building” in Afghanistan.

In October 2001, shortly after ordering U.S. forces to invade, Bush said he would push the United Nations to “take over the so-called nation-building.”

Eight years later, Obama insisted his government would not get mired in a long “nation-building project,” either. Eight years after that, Trump made a similar vow: “We’re not nation-building again.”

Instead of bringing stability and peace, they said, the United States inadvertently built a corrupt, dysfunctional Afghan government that remains dependent on U.S. military power for its survival. Assuming it does not collapse, U.S. officials have said it will need billions more dollars in aid annually, for decades.

Speaking candidly on the assumption that most of their remarks would not be made public, those interviewed said Washington foolishly tried to reinvent Afghanistan in its own image by imposing a centralized democracy and a free-market economy on an ancient, tribal society that was unsuited for either.

By some measures, life in Afghanistan has improved markedly since 2001. Infant mortality rates have dropped. The number of children in school has soared. The size of the Afghan economy has nearly quintupled.

But the U.S. nation-building project backfired in so many other ways that even foreign-aid advocates questioned whether Afghanistan, in the abstract, might have been better off without any U.S. help at all, according to the documents.

“I mean, the writing is on the wall now,” Michael Callen, an economist with the University of California at San Diego specializing on the Afghan public sector, told government interviewers. “We spent so much money and there is so little to show for it.”

Callen and others blamed an array of mistakes committed again and again over 18 years — haphazard planning, misguided policies, bureaucratic feuding. Many said the overall nation-building strategy was further undermined by hubris, impatience, ignorance and a belief that money can fix anything.

Much of the money, they said, ended up in the pockets of overpriced contractors or corrupt Afghan officials, while U.S.-financed schools, clinics and roads fell into disrepair, if they were built at all.

Some said the outcome was foreseeable. They cited the U.S. track record of military interventions in other countries — Iraq, Syria, Libya, Yemen, Haiti, Somalia — over the past quarter-century.

“We just don’t have a post-conflict stabilization model that works,” Stephen Hadley, who served as White House national security adviser under Bush, told government interviewers. “Every time we have one of these things, it is a pickup game. I don’t have any confidence that if we did it again, we would do any better.”

In comments echoed by other officials who shaped the war, Lute said the United States lavished money on dams and highways just “to show we could spend it,” fully aware that the Afghans, among the poorest and least educated people in the world, could never maintain such huge infrastructure projects.

“One poignant example of this is a ribbon-cutting ceremony complete with the giant scissors I attended for the district police chief in some God-forsaken province,” Lute said. He recalled how the U.S. Army Corps of Engineers had overseen the design and construction of a police headquarters that featured a glass facade and an atrium.

“The police chief couldn’t even open the door,” Lute said. “He had never seen a doorknob like this. To me, this encapsulates the whole experience in Afghanistan.”

Under the new constitution, the Afghan president wielded far greater authority than the other two branches of government — the parliament and judiciary — and also got to appoint all the provincial governors. In short, power was centralized in the hands of one man.

The rigid, U.S.-designed system conflicted with Afghan tradition, typified by a mix of decentralized power and tribal customs. But with Afghanistan beaten down and broke, the Americans called the shots.

“In hindsight the worst decision was to centralize power,” an unnamed European Union official said in a Lessons Learned interview.

A German official echoed the point: “After the fall of the Taliban, it was thought that we needed a president right away, but that was wrong.”

An unidentified USAID official said he was astounded that the State Department thought an American-style presidency would work. “You’d think they’ve never worked overseas,” he said. “Why did we create centralized government in a place that has never had one?”

A big reason is that U.S. leaders had a potential Afghan ruler in mind. Hamid Karzai, a tribal leader from southern Afghanistan, belonged to the country’s largest ethnic group, the Pashtuns.

Perhaps more importantly, Karzai spoke polished English and was a CIA asset. In 2001, a U.S. spy had saved his life, and the CIA would keep Karzai on its payroll for years to come.

Jeffrey Eggers, a retired Navy SEAL and White House official under Bush and Obama, told government interviewers that such projects failed to achieve their objective — bringing peace and stability — and that U.S. military officials were guilty of “biting off more than they can chew.”

“There is a bigger question here — why does the U.S. undertake actions that are beyond its abilities?” Eggers said. “This question gets at strategy and human psychology, and it is a hard question to answer.”

Even some of the most well-intentioned projects could boomerang.

Tooryalai Wesa, who served as governor of Kandahar province from 2008 to 2014, said U.S. aid workers once insisted on carrying out a public-health project to teach Afghans how to wash their hands.

“It was an insult to the people. Here people wash their hands five times a day for prayers,” Wesa told government interviewers. “Moreover, hand wash project is not needed. Think about employment, and think about enabling people to earn something.”

But that could backfire, too.

For one project in Kandahar, U.S. and Canadian troops paid villagers $90 to $100 a month to clear irrigation canals, according to Thomas Johnson, a specialist on Afghanistan who works as a professor at the Naval Postgraduate School.

It took a while for the troops to figure out their program was indirectly disrupting local schools. Teachers in the area earned much less, only $60 to $80 a month.

“So initially all the school teachers quit their jobs and joined the ditch diggers,” Johnson said in a Lessons Learned interview. He served as a political and counterinsurgency adviser to the Canadians from 2009 to 2010.

A similar problem arose in eastern Afghanistan, where one gung-ho Army brigade was so determined to make a difference that it promised to build 50 schools — but unwittingly ended up helping the Taliban, according to an officer in the brigade.

“There weren’t enough teachers to fill them, so buildings languished,” the unnamed U.S. military officer told government interviewers, “and some of them even became bomb-making factories.”

Overwhelmed by opium
By Craig Whitlock

President Bush asked the United Nations and NATO allies to tackle the problems of opium production and trafficking. Britain agreed to take charge but got off to a disastrous start, according to the interviews.

In the spring of 2002, British officials floated an irresistible offer. They agreed to pay Afghan poppy farmers $700 an acre — a fortune in the impoverished, war-ravaged country — to destroy their crops.

Word of the $30 million program ignited a poppy-growing frenzy. Farmers planted as many poppies as they could, offering part of their yield to the British while selling the rest on the open market. Others harvested the opium sap right before destroying their plants and got paid anyway.

In a Lessons Learned interview, Anthony Fitzherbert, a British agricultural expert, called the cash-for-poppies program “an appalling piece of complete raw naivete,” saying that the people in charge had “no knowledge of nuances and [I] don’t know they really cared.”

U.S. officials said the British wanted to be seen as doing something, even though they had little confidence the program would work. Michael Metrinko, a former U.S. diplomat who served in the embassy in Kabul at the time, said the results were predictable.

“Afghans like most other people are quite willing to accept large sums of money and promise anything knowing that you will go away,” Metrinko said in an oral-history interview. “The British would come and hand out sums of money and the Afghans would say, ‘Yes, yes, yes, we’re going to burn it right now,’ and the Brits would leave. They would then get two sources of income from the same crop.”

Upon taking office, Holbrooke brought eradication to a standstill. The U.S. government shifted its focus to programs that tried to persuade Afghan poppy farmers to switch to other crops or adopt other livelihoods altogether.

But those efforts mostly backfired. In Helmand province, the epicenter of the poppy belt, USAID and the U.S. military paid Afghans to dig or renovate miles of canals and ditches to irrigate fruit trees and other crops. But the canals worked just as well to irrigate poppies — which were much more profitable to grow.

Similarly, USAID invested millions of dollars to entice Helmand farmers to start wheat-growing operations. While wheat production increased, farmers relocated their poppy fields to other parts of the province. Between 2010 and 2014, poppy cultivation across the country nearly doubled, according to U.N. estimates.

Some U.S. officials suggested part of the problem was that Washington fundamentally misunderstood Afghanistan and mistakenly viewed opium as just another crop.

“Afghanistan is not an agricultural country; that’s an optical illusion,” Barnett Rubin, an academic authority on Afghanistan who served as a senior adviser to Holbrooke, said in a Lessons Learned interview. The “largest industry is war, then drugs, then services,” he added. “Agriculture is down in fourth or fifth place.”

Consumed by corruption
By Craig Whitlock

In public, as President Barack Obama escalated the war and Congress approved billions of additional dollars in support, the commander in chief and lawmakers promised to crack down on corruption and hold crooked Afghans accountable.

In reality, U.S. officials backed off, looked away and let the thievery become more entrenched than ever, according to a trove of confidential government interviews obtained by The Washington Post.

In the interviews, key figures in the war said Washington tolerated the worst offenders — warlords, drug traffickers, defense contractors — because they were allies of the United States.

But they said the U.S. government failed to confront a more distressing reality — that it was responsible for fueling the corruption, by doling out vast sums of money with limited foresight or regard for the consequences.

U.S. officials were “so desperate to have the alcoholics to the table, we kept pouring drinks, not knowing [or] considering we were killing them,” an unnamed State Department official told government interviewers.

“The basic assumption was that corruption is an Afghan problem and we are the solution,” Barnett Rubin, a former senior State Department adviser and a New York University professor, told government interviewers. “But there is one indispensable ingredient for corruption — money — and we were the ones who had the money.”

The scale of the corruption was the unintended result of swamping the war zone with far more aid and defense contracts than impoverished Afghanistan could absorb. There was so much excess, financed by American taxpayers, that opportunities for bribery and fraud became almost limitless, according to the interviews.

To purchase loyalty and information, the CIA gave cash to warlords, governors, parliamentarians, even religious leaders, according to the interviews. The U.S. military and other agencies also abetted corruption by doling out payments or contracts to unsavory Afghan power brokers in a misguided quest for stability.

“We had partnerships with all the wrong players,” a senior U.S. diplomat told government interviewers. “The U.S. is still standing shoulder-to-shoulder with these people, even through all these years. It’s a case of security trumping everything else.”

Gert Berthold, a forensic accountant who served on a military task force in Afghanistan during the height of the war, from 2010 to 2012, said he helped analyze 3,000 Defense Department contracts worth $106 billion to see who was benefiting.

The conclusion: About 40 percent of the money ended up in the pockets of insurgents, criminal syndicates or corrupt Afghan officials.

“And it was often a higher percent,” Berthold told government interviewers. “We talked with many former [Afghan] ministers, and they told us, you’re under-estimating it.”

Berthold said the evidence was so damning that few U.S. officials wanted to hear about it.

“No one wanted accountability,” he said. “If you’re going to do anti-corruption, someone has got to own it. From what I’ve seen, no one is willing to own it.”

In 2002 and 2003, when Afghan tribal councils gathered to write a new constitution, the U.S. government gave “nice packages” to delegates who supported Washington’s preferred stance on human rights and women’s rights, according to a U.S. official who served in Kabul at the time.

“The perception that was started in that period: If you were going to vote for a position that [Washington] favored, you’d be stupid to not get a package for doing it,” the unnamed official told government interviewers.

By the time Afghanistan held parliamentary elections in 2005, that perception had hardened. Lawmakers realized their votes could be worth thousands of dollars to the Americans, even for legislation they would have backed anyway, the U.S. official said.

“People would tell each other, so-and-so has just been to the U.S. Embassy and got this money. They said ‘ok now I need to go,’ ” the U.S. official said. “So from the beginning, their experience with democracy was one in which money was deeply embedded.”

By 2006, the Afghan government had “self-organized into a kleptocracy” under which people in power could plunder the economy without restraint, according to Christopher Kolenda, a retired Army colonel who advised several U.S. commanders during the war.

“The kleptocracy got stronger over time, to the point that the priority of the Afghan government became not good governance but sustaining this kleptocracy,” Kolenda told government interviewers. “It was through sheer naivete, and maybe carelessness, that we helped to create the system.”

On Aug. 20, 2009, Afghans went to the polls to choose a president. It was a critical moment. Obama was contemplating whether to send tens of thousands of additional U.S. troops to the war zone. He needed a reliable and credible ally in Kabul.

Right away, reports surfaced of electoral fraud on an epic scale — ghost voting, official miscounting, ballot-box stuffing, plus violence and intimidation at the polls.

Initial results showed Karzai, the incumbent, had won. But his opponents, and many independent observers, accused his side of trying to steal the election. A U.N.-backed panel investigated and determined Karzai had received about 1 million illegal votes, a quarter of all those cast.

The outcome put Obama administration officials in a box. They had said corruption was intolerable but also had promised to respect Afghan sovereignty and not interfere with the election. Moreover, they did not want to completely alienate Karzai. If there was another vote, many saw him as the likely victor anyway.

In the end, the Obama administration brokered a deal in which Karzai was declared the winner after he agreed to share some power with his main rival. But in Lessons Learned interviews, several U.S. officials said the messy result ruined U.S. credibility.

“That was profoundly destructive to a rule-of-law principle,” said Sarah Chayes, who served as a civilian adviser to the U.S. military at the time. “It was devastating that we were willing to patch up the elections. . . . While we had the opportunity to say that corruption is important, explicit instructions were given that it is not.”

Peter Galbraith, a Karzai critic who served as a deputy U.N. envoy to Afghanistan in 2009, was removed from his post after he complained that the United Nations was helping cover up the extent of the election fraud. An American, Galbraith told government interviewers that the U.S. government also stood by when Karzai appointed cronies to election boards and anti-corruption posts.

“There was a broader impact, because of the culture of dishonesty,” Galbraith said. “You cannot separate administrative fraud from the corruption of the system.”

Unguarded nation
By Craig Whitlock

In one interview, Thomas Johnson, a Navy official who served as a counterinsurgency adviser in Kandahar province, said Afghans viewed the police as predatory bandits, calling them “the most hated institution” in Afghanistan. An unnamed Norwegian official told interviewers that he estimated 30 percent of Afghan police recruits deserted with their government-issued weapons so they could “set up their own private checkpoints” and extort payments from travelers.

Ryan Crocker, a former U.S. ambassador to Kabul, told government interviewers that the Afghan police were ineffective “not because they’re out-gunned or out-manned. It’s because they are useless as a security force and they’re useless as a security force because they are corrupt down to the patrol level.”

Victor Glaviano, who worked with the Afghan army as a U.S. combat adviser from 2007 to 2008, called the soldiers “stealing fools” who habitually looted equipment supplied by the Pentagon. He complained to government interviewers that Afghan troops had “beautiful rifles, but didn’t know how to use them,” and were undisciplined fighters, wasting ammunition because they “wanted to fire constantly.”

Since 2002, the United States has allocated more than $83 billion in security assistance to Afghanistan, a sum that dwarfs the defense budgets of other developing nations. In 2011, at the peak of the war, Afghanistan received $11 billion in security aid from Washington — $3 billion more than what neighboring Pakistan, which has a stockpile of nuclear weapons and a far bigger army, spent that year on its military.

Yet after almost two decades of help from Washington, the Afghan army and police are still too weak to fend off the Taliban, the Islamic State and other insurgents without U.S. military backup.

“We got the [Afghan forces] we deserve,” Douglas Lute, an Army lieutenant general who served as the White House’s Afghan war czar under Presidents George W. Bush and Obama, told government interviewers.

If the U.S. government had ramped up training between 2002 and 2006, “when the Taliban was weak and disorganized, things may have been different,” Lute added. “Instead, we went to Iraq. If we committed money deliberately and sooner, we could have a different outcome.”

In the Lessons Learned interviews, U.S. and NATO officials said the glowing progress reports delivered to the public were largely an illusion and glossed over major deficiencies that were visible from the outset.

For starters, only about 2 in 10 Afghan recruits could read or write. U.S. and NATO trainers put them through crash literacy courses, but those lasted only a few weeks.

Other gaps in basic knowledge had to be bridged. One U.S. Special Forces trainer told government interviewers that the Afghans mistook urinals in the barracks for drinking fountains. Another U.S. trainer said he had to teach conscripts basic human anatomy: “They didn’t understand how a tourniquet could help stop bleeding if you’re not even putting it over the wound.”

Questionable motivations and loyalties snaked through the ranks of the army and police. Ethnic and tribal tensions posed a perpetual problem, with the officer corps dominated by warlords who doled out promotions based on patronage, according to the interviews.

Filling specialized billets was especially tough. It took nearly a decade to get the Afghan air force off the ground, because of not just a lack of qualified pilots but also a dearth of mechanics who could read repair manuals.

One U.S. military adviser assigned to the Afghan air force told government interviewers that “Afghans would come to them with ‘pilot wings’ that they found or purchased, claiming to be pilots but having no flight experience.”

The unnamed U.S. adviser said that the air base where he worked was plagued by “shenanigans” and that many Afghans reeked of jet fuel when they left each day because they were smuggling out small containers of it to sell on the black market.

Petty corruption was rampant. In a 2015 Lessons Learned interview, an unnamed U.N. official described how Afghan police recruits would undergo two weeks of training, “get their uniforms, then go back to the province and sell them.” Unworried that they might get in trouble, he said, many would reenlist and “come back to do it again.”

U.S. advisers constantly tried to plug holes in the system to prevent looting and stealing but said they were often stymied by Afghan government officials who did not want things to change.

“The less they behaved, the more money we threw at them,” a former U.S. official told government interviewers in 2015. “There was no real incentive to reform.”

In a 2017 Lessons Learned interview, Shahmahmood Miakhel, a former adviser to the Afghan Interior Ministry, said he once got an earful from district tribal leaders who could not stand either side.

“I asked that why is it possible that a large number of about 500 security forces cannot defeat about 20 or 30 Taliban. The community elders replied that the security people are not there to defend the people and fight Taliban, they are there to make money” by selling their weapons or fuel, recalled Miakhel, who now serves as the governor of Nangarhar province in eastern Afghanistan.

“I asked the elders that ok the government is not protecting you, but you are about 30,000 people in the district. If you don’t like Taliban then you must fight against them.

“Their response was that we don’t want this corrupt government to come and we don’t want Taliban either, so we are waiting to see who is going to win.”

How the Good War Went Bad
By Carter Malkasian

The Taliban exemplified an idea—an idea that runs deep in Afghan culture, that inspired their fighters, that made them powerful in battle, and that, in the eyes of many Afghans, defines an individual’s worth. In simple terms, that idea is resistance to occupation. The very presence of Americans in Afghanistan was an assault on what it meant to be Afghan. It inspired Afghans to defend their honor, their religion, and their homeland. The importance of this cultural factor has been confirmed and reconfirmed by multiple surveys of Taliban fighters since 2007 conducted by a range of researchers.

The Afghan government, tainted by its alignment with foreign occupiers, could not inspire the same devotion. In 2015, a survey of 1,657 police officers in 11 provinces conducted by the Afghan Institute for Strategic Studies found that only 11 percent of respondents had joined the force specifically to fight the Taliban; most of them had joined to serve their country or to earn a salary, motivations that did not necessarily warrant fighting, much less dying. Many interviewees agreed with the claim that police “rank and file are not convinced that they are fighting for a just cause.”

Perhaps the most important lesson is the value of forethought: considering a variety of outcomes rather than focusing on the preferred one. U.S. presidents and generals repeatedly saw their plans fall short when what they expected to happen did not: for Bush, when the Taliban turned out not to be defeated; for McChrystal and Petraeus, when the surge proved unsustainable; for Obama, when the terrorist threat returned; for Trump, when the political costs of leaving proved steeper than he had assumed. If U.S. leaders had thought more about the different ways that things could play out, the United States and Afghanistan might have experienced a less costly, less violent war, or even found peace.

This lack of forethought is not disconnected from the revelation in The Washington Post’s “Afghanistan Papers” that U.S. leaders misled the American people. A single-minded focus on preferred outcomes had the unhealthy side effect of sidelining inconvenient evidence. In most cases, determined U.S. leaders did this inadvertently, or because they truly believed things were going well. At times, however, evidence of failure was purposefully swept under the rug.

The Man Who Told America the Truth About D-Day
By David Chrisinger

During his four years as a war correspondent, Pyle was embraced by enlisted men, officers and a huge civilian public as a voice who spoke for the common infantryman. With his trauma in France, he had become one of them. After sharing so much of their experience, he understood how gravely war can alter the people who have to see it and fight it and live it. He knew that the survivors can come home with damage that is profound, painful and long-lasting. It was a truth that he found hard or even impossible to communicate to the readers back home — and it is a truth that is still difficult and troubling now, 75 years after D-Day.

We accept that our wars are different now — more scattered, seemingly never-ending, against a more diffuse and elusive enemy — but those wars are still presented with the promise that we are fighting for our way of life or the survival of our values, and that we’ll enjoy greater peace and security when those wars are won. War reporting has become more honest and unsparing about tallying the death toll — at least on our side — but politicians making the case for deployments and invasions still don’t invite the public in advance to decide whether the promised benefits will be worth the losses.

Seeing and reporting the vast losses on the beach at Normandy and watching war’s meat grinder in action in the vicious battles that followed, Pyle was evidently forced to recalculate the arithmetic of victories and losses. By the time he was killed, 10 months later and on the opposite side of the world, the lesson seemed to have solidified for him. Not even the war ending, not even victory — which his previous reporting usually kept in sight as the great goal of the war — would be able to bring back all the people killed or counteract the damage done to the survivors. Pyle had written about battles and war in a way that promised hope. By the time victory was actually in sight, he had come to feel that there was no way the war could be a story with a happy ending.

Build a Better Blob
By Emma Ashford

When Ben Rhodes, Deputy National Security Adviser to U.S. President Barack Obama, coined the term “the Blob” a few years ago, it was to describe the hawkish Beltway elites whom he blamed for undermining Obama’s foreign policy vision. Since then, the term has taken on a life of its own, appearing in books and articles and spawning a thousand arguments on Twitter. It has become a shorthand for the D.C. foreign policy community—sometimes as a token of pride, more often as an epithet for those who occupied positions of power during some of the United States’ biggest foreign policy debacles since the end of the Cold War.

To describe the United States’ military interventions as mishandled, for example, is to criminally underplay their impact. Take the 2003 war in Iraq, which scrambled the balance of power in the Middle East for a generation and enabled the emergence of the Islamic State, or ISIS. The same goes for the 2011 “humanitarian” intervention in Libya, which led to a civil war that still rages today and unleashed a tidal wave of small arms across a volatile region. Even the 1999 NATO intervention in Kosovo, relatively uncontroversial by comparison, substantially worsened U.S.-Russian relations and almost brought troops from the two nations to blows.

Some might argue that the Blob has learned its lesson. After all, Washington decided against large-scale ground invasions in Crimea and Syria. But few would have seriously argued that the United States should have gone to war with Russia over Crimea to begin with, while the light-footprint intervention in Syria—often inaccurately described as Obama’s resisting the Blob—was similarly harmful. In one memorable case, Pentagon-trained and CIA-trained rebels ended up fighting each other.

Defenders of the status quo are right to point out that one cannot boil down all of U.S. foreign policy since the Cold War to debacles such as Iraq and Libya. But some less bellicose policies have been just as detrimental. NATO expansion comes to mind. So, too, does American support for the “color revolutions” in eastern Europe and the Caucasus, which did little to promote democracy and much to worsen U.S. relations with Russia and China. Then there are the extraterritorial sanctions that the United States has imposed on Iran, North Korea, Russia, and others and that have pushed even close allies to try to shield their companies or decouple from the dollar. Critics may argue that only Trump’s incompetence has pushed allies to this extreme, but his administration is using tools popularized and perfected during the Bush and Obama years.

The notion that American withdrawal from global leadership is the real culprit behind these failures—that the world would be a better place if the United States just leaned in more—doesn’t pass the sniff test. There are now as many as 80,000 U.S. troops in the Middle East, compared with around 20,000 in the mid-1990s. U.S. forces overseas still number almost 230,000, compared to 300,000 during the last year of the Cold War. American troops are engaged in combat in at least 14 countries, with regular U.S. air or drone strikes in seven others. Defense spending in 2019 was about 3.4 percent of GDP; most other advanced industrialized democracies spend less than 2.0 percent of GDP. To call this disengagement is laughable.

Ultimately, Brands and his co-authors conclude, thanks to liberal internationalism, the “long peace continued” and the world has remained on a “generally positive track.” They may be right that alternative strategies would have done no better; counterfactuals are impossible to falsify. But judged against its own goals—peace, a rules-based order, and the maintenance of American primacy—the project of liberal internationalism has in many ways failed. The United States’ globe-spanning forward military presence has not prevented the emergence of a peer competitor, as China’s growing power shows. The number of global conflicts is at its highest since 1975, while Freedom House’s recently released report concluded that 2019 marked “the 14th consecutive year of deterioration in political rights and civil liberties.” Meanwhile, the rules-based international order is proving to be paper-thin, not least because it was weakened by a continual stream of American violations, such as the invasion of Iraq and the use of drone strikes and extralegal targeted killings. By almost every foreign policy metric, the United States is worse off today than it was at the end of 1991.

How Hypocrisy Became Standard Operating Procedure for the U.S. Government
By Micah Zenko

Officers who are publicly skeptical of current wars are rare and shunned. A prominent example is Army Col. Gian Gentile, who was publicly critical of U.S. counterinsurgency doctrine and practice and was constantly pressured, personally by peers and through his home institution of the U.S. Military Academy, to cease his criticisms. In 2010, a colonel working for Gen. David Petraeus, who was then commander of forces in Afghanistan, told me that Gentile was “public enemy number one” and in some ways more dangerous to the mission than the Taliban. Active-duty officers such as Andrew Krepinevich and H.R. McMaster wrote books that criticized the political decision-making and military actions of the Vietnam War, but they were published, respectively, 11 and 22 years after the disaster ended, when there were limited professional consequences for the authors.

… there are two powerful national psychological factors at play as well. First, an unquestioned respect for the uniform makes it politically difficult to challenge military policy without also challenging the selfless troops who implement it. Second, habitual foreign threat inflation mandates ever more extensive foreign military deployments. These forces combine to create a mindset where wars escape the pervasive scrutiny that is applied to other federal governmental activities.

What the Washington Post revealed was not a conspiracy of insiders who contrived to maintain lies, but rather an information cartel that dispensed confident public assessments about the Afghanistan war that they individually knew were unfounded. What they privately told SIGAR interviewers was what they privately also told many journalists, academics, and think-tank fellows at the time. But they almost never said their piece out loud, because it could have put at risk their political reputations and professional livelihoods. These all-too-human institutional pressures and psychological factors are powerful, undeniable, and will likely remain prevalent in the foreseeable future.

A Long-Forgotten CIA Document From WikiLeaks Sheds Critical Light on Today’s U.S. Politics and Wars
By Glenn Greenwald

What made this document so fascinating, so revealing, is the CIA’s discussion of how to manipulate public opinion to ensure it remains at least tolerant if not supportive of Endless War and, specifically, the vital role President Obama played for the CIA in packaging and selling U.S. wars around the world. In this classified analysis, one learns a great deal about how the “military industrial complex,” also known as the “Blob” or “Deep State,” reasons; how the Agency exploits humanitarian impulses to ensure continuation of its wars; and what the real function is of the U.S. President when it comes to foreign policy.

What prompted the memo was the CIA’s growing fears that the population of Western Europe was rapidly turning against the War on Terror generally and the war in Afghanistan specifically — as evidenced by the fall of the Dutch Government driven in large part by the electorate’s anger over involvement in Afghanistan. The CIA was desperate to figure out how to stem the tide of anti-war sentiment growing throughout that region, particularly to shield France and Germany from it, by manipulating public opinion.

The Agency concluded: its best and only asset for doing that was President Obama and his popularity in Western European cities.

The premise of the CIA memo was that the populations of NATO countries participating in the War in Afghanistan did not support that war. What those allied governments and the CIA relied upon — as the above headline notes — was what the agency called “public apathy”: meaning that the war’s “low public salience has allowed French and German leaders to disregard popular opposition and steadily increase their troop contributions to the International Security Assistance Force (ISAF).”

In other words, as long as the public stayed sufficiently inattentive, their democratically elected leaders were free to ignore their wishes and keep fighting a war that their citizens opposed. But what concerned the CIA most was that simmering dislike for the war in Western Europe would turn into active, concentrated opposition — as had just happened in Holland — forcing the worst of all outcomes: that the governments fighting with the U.S. in Afghanistan for close to a decade would actually have to honor the belief of their citizens that the war was not worth it, and pull out, leaving the U.S. to shoulder the burden alone …

Rep. Ilhan Omar’s Misguided Defense of John Brennan and The Logan Act: a Dangerous and Unconstitutional Law
By Glenn Greenwald

On Friday, reports emerged that, just days after Israeli Prime Minister Benjamin Netanyahu met with Saudi Crown Prince Mohammed bin Salman, a key Iranian nuclear scientist was ambushed and murdered by gunmen. U.S. officials told The New York Times that Israel was behind the assassination — which should be unsurprising given that Israel assassinated several senior Iranian nuclear scientists during the Obama years.

This news provoked indignation from MSNBC’s John Brennan, formerly Obama’s Director of the CIA, an agency heralded worldwide for its righteous opposition to assassinations. Along with condemning the assassination of this Iranian scientist as “a criminal act and highly reckless,” Brennan also used his tweet to send an explicit message to Iranian officials: urging them not to retaliate but instead to wait for the Biden administration to take over, promising the new U.S. administration would “respond against perceived culprits.”

In other words, Brennan, like many people (including myself), is concerned that the Trump administration and Israel are seeking to escalate tensions with Iran during the transition — either because they seek war with Tehran or, more likely, because they want to provoke a cycle of retaliation that would prevent the incoming Biden administration from re-implementing the Iran Deal which Trump nullified and which Israel vehemently opposes.

Thus, Brennan sought to subvert what he perceives as the current foreign policy of the U.S. Government — to provoke and punish Iran — by encouraging Iranian officials to ignore the provocation and therefore not derail efforts by the incoming U.S. administration to establish better relations once Biden is inaugurated …

There are so many amazing ironies to this Brennan statement. To begin with, it’s just stunning to watch Obama’s Chief Assassin — who presided over a global, years-long, due-process-free campaign of targeted assassinations, under which the official “kill list” of who was to live and who was to die was decreed by Judge, Jury and Executioner Brennan in a secret White House meeting that bore the creepy designation “Terror Tuesdays” — now suddenly posture as some kind of moral crusader against assassinations. I have denounced these Israeli assassinations as terrorism — both in the past and yesterday — but I have also denounced with equal vigor the Obama/Brennan global assassination program.

The audacity of Brennan’s moral posturing became even more evident as he tried to explain why his and Obama’s assassination program was noble and legal, while the one that resulted in Friday’s killing in Iran was immoral and criminal. After all, this is the same John Brennan who got caught red-handed lying about how many innocent civilians were killed by Obama’s global assassination program, and who even claimed the right to target American citizens for execution by drone without any transparency let alone due process: a right they not only claimed but exercised.

When you’re reduced to sitting on Twitter trying to distinguish your own global assassination program from the one you’re condemning, that is rather potent evidence that you are among the absolute last persons on earth with the moral credibility to denounce anything. That’s particularly true when you directed your unilateral assassination powers onto your own citizens, ending several of their lives.

Biden’s Choice For Pentagon Chief Further Erodes a Key U.S. Norm: Civilian Control
By Glenn Greenwald

Worse, many in the media and D.C. professional class cheered outright subversion by military brass and the intelligence community of the policies of the elected President — including when they withheld classified information from Trump, “slow walked” his orders, and deceived him about troop positions to prevent him from leaving Syria. In other words, while the liberal establishment feigned concern over “norms,” including the one that demands civilian control, they applauded military and intelligence sabotage of the president’s policies. (Subversion by the military of democratically elected leaders who, in its judgment, pursue unwise policies is a defining element of a Deep State, something supporters of this subversion simultaneously insisted did not exist in the U.S. and that only conspiratorial crazies could believe in.)

The New Ruling Coalition: Opposition to Afghanistan Withdrawal Shows Its Key Factions
By Glenn Greenwald

In July, pro-war Democrats on the House Armed Services Committee, led by their Lockheed-and-Raytheon-funded Chairman Adam Smith, partnered with Congresswoman Liz Cheney and her pro-war GOP allies to block the use of funds for removing troops (not only from Afghanistan but also Germany), as part of a massive increase in military spending. The oppositional left-right coalition of anti-war Democrats such as Ro Khanna and Tulsi Gabbard and America-First Trump supporters such as Matt Gaetz was no match for the bipartisan pro-war coalition, which attempted to block any end to the war.

A crucial weapon which Smith, Cheney and the other anti-withdrawal Committee members wielded was a widely-hyped New York Times scoop published days before the Committee vote, which — in its first paragraph — announced:

American intelligence officials have concluded that a Russian military intelligence unit secretly offered bounties to Taliban-linked militants for killing coalition forces in Afghanistan — including targeting American troops — amid the peace talks to end the long-running war there, according to officials briefed on the matter.

Repeatedly citing this New York Times story, based on the claims of anonymous “intelligence officials,” the bipartisan pro-war wing of the Committee insisted that to leave Afghanistan now would be particularly inappropriate and dangerous in light of this dastardly Russian interference. (Top military officials and the commander in Afghanistan later admitted the bounty program “had not been corroborated by intelligence agencies and that they do not believe any attacks in Afghanistan that resulted in American casualties can be directly tied to it,” but by then, the job was done).

And thus did this union of pro-war Democrats, Cheney-led neocons, the intelligence community and their chosen mainstream media outlets succeed in providing the perfectly crafted tool at the most opportune moment to justify blocking an end to America’s longest war.

The NYT Admits Key Falsehoods That Drove Last Year’s Coup in Bolivia: Falsehoods Peddled by the U.S., its Media, and the NYT
By Glenn Greenwald

The central tools used by both the Bolivian Right and their U.S. Government allies to justify the invalidation of Morales’ 10-point election victory were two election audits by the regional Organization of American States — one a preliminary report issued on November 10, the day before Morales was forced from the country, and then a final report issued the next month — which asserted widespread, deliberate election fraud.

“Given all the irregularities observed, it is impossible to guarantee the integrity of the data and certify the accuracy of the results,” the OAS announced on November 10 as the country was in turmoil over the election. The next day, Morales, under the threat of force to him and his family, boarded a plane to Mexico, where he was granted asylum. The final OAS report in December claimed that “the audit team has detected willful manipulation” of the results based on “incontrovertible evidence of an electoral process marred by grave irregularities.”

But on Sunday, the New York Times published an article strongly suggesting that it was the OAS audit, not the Bolivian election, that was “marred by grave irregularities,” making it “impossible to guarantee the integrity of the data and certify the accuracy of the” OAS’ claims. The paper of record summarized its reporting this way: “A close look at Bolivian election data suggests an initial analysis by the OAS that raised questions of vote-rigging — and helped force out a president — was flawed.”

To cast serious doubt on the integrity of these critical OAS reports, the Times relies upon a new independent study by three scholars at U.S. universities which — in the words of the NYT — examined “data obtained by The New York Times from the Bolivian electoral authorities.” That study, said the NYT, “has found that the Organization of American States’ statistical analysis was itself flawed.”

That study documented that the key “irregularity” cited by OAS “was actually an artifact of the analysts’ error.” It further explained that with regard to “the patterns that the observers deemed ‘inexplicable,’” the new data analysis shows that “we can explain them without invoking fraud.”

While this new study focuses solely on the OAS’s data claims and does not purport to decree the Bolivian election entirely free of fraud — virtually no election, including in the U.S., is entirely free of irregularities and fraud — the NYT explains that “the authors of the new study said they were unable to replicate the OAS’s findings using its likely techniques” and that “the difference is significant” in assessing the overall validity of the OAS’s claims.

“In sum,” the new report concludes, “we offer a different interpretation of the quantitative evidence that led the OAS and other researchers to question the integrity of the Bolivian election.” Specifically, “we find that we do not require fraud in order to explain the quantitative patterns used to help indict Evo Morales.” The scholars’ bottom line: “we cannot replicate the OAS results.”

It is virtually impossible to overstate the importance of the OAS accusations in driving Morales from his own country and, with no democratic mandate, shifting power in lithium-rich Bolivia to the white, Christian, U.S.-subservient Right. While critics had also accused Morales of improperly seeking a fourth term despite constitutional term limits, Bolivia’s duly constituted court had invalidated those term limits (much the way that New York Mayor Michael Bloomberg induced the City Council to overturn a term limit referendum so he could seek a third term), leaving anti-Morales outside agitators, such as the OAS and U.S. officials, to rely instead on claims of election fraud.

… as usual, the two news outlets most influential in disseminating and ratifying false anti-democratic claims from the U.S. government were the Washington Post and — though they neglected to mention it in their article yesterday on the debunked OAS findings — the New York Times itself. The Post, in its article the day after Morales was forced to leave, ratified the election fraud accusation in its headline: “Bolivia’s Morales resigns amid scathing election report, rising protests.” The article heralded the findings of what it called “the multilateral organization,” noting that the OAS found Morales’ victory “was marred by profound irregularities.”

A Post editorial from the same day proclaimed in its headline: “Bolivia is in danger of slipping into anarchy. It’s Evo Morales’s fault.”

The Post editorial decreed: “there could be little doubt who was ultimately responsible for the chaos: newly resigned president Evo Morales.” How could the victim of a coup — who had just been elected President — be at fault for the resulting chaos? Because, explained the Post’s editors, “an audit released by the Organization of American States reported massive irregularities in the vote count and called for a fresh election.”

The New York Times similarly and repeatedly hyped the OAS report as proof that Morales’ victory was illegitimate and the coup therefore democratic. “An independent international audit of Bolivia’s disputed election concluded that former President Evo Morales’s officials resorted to lies, manipulation and forgery to ensure his victory,” its news article claimed, without a syllable of critical pushback until the penultimate paragraph, where it noted that “some economists and statisticians in the United States” had pointed to flaws in the OAS’ data analysis.

But the paper’s Editorial contained no such reservations, pronouncing Morales’ victory the by-product of “a flawed election” and noting that “early suspicions of fraud by the Organization of American States helped fuel the protests and provided cover for the military to ‘suggest’ that Mr. Morales leave office.” The Times’ Editorial then cited the final OAS report — which the paper yesterday called into question — as substantiating those suspicions by proving “a series of malicious operations aimed at altering the will expressed at the polls” on Oct. 20.

In sum, when it came to the 2019 Bolivian coup, the U.S. media played its decades-old, standard role whenever the U.S. wants to depict a military coup against a government it dislikes as a victory for democracy: namely, it blindly and dutifully adopted the State Department’s view and uncritically waved the flag.

The NYT Editors, while conceding in 2014 that “it is easy to see why many Bolivians would want to see Mr. Morales, the country’s first president with indigenous roots, remain at the helm” — namely, “during his tenure, the economy of the country, one of the least developed in the hemisphere, grew at a healthy rate, the level of inequality shrank and the number of people living in poverty dropped significantly” — nonetheless insisted that Morales should be regarded as an enemy of democracy because “the pattern of prolonged terms in power is unhealthy for the region.” (Notably, the NYT would never suggest that Angela Merkel’s “prolonged term in power” as German Chancellor (15 years and counting) or Benjamin Netanyahu’s four terms and counting as Israeli Prime Minister pose a similar threat to democracy. This is a “concern” reserved by the U.S. media only for Latin American leaders disliked by the U.S. State Department.)

At the end of its 2014 editorial on Bolivia and Latin America, the Times inadvertently revealed the real reason it disliked these elected leaders. Concern for democracy is the pretext. The real reason it wants those elected leaders gone was revealed by this candid sentence: “This regional dynamic has been dismal for Washington’s influence in the region.”

In February — with Morales now in exile in Argentina — the Washington Post published an op-ed by two scholars from MIT. They summarized their argument this way:

The media has largely reported the [OAS’s] allegations of fraud as fact. And many commentators have justified the coup as a response to electoral fraud by MAS-IPSP. However, as specialists in election integrity, we find that the statistical evidence does not support the claim of fraud in Bolivia’s October election.

In sum, the authors concluded after setting forth their statistical findings in detail, “there is not any statistical evidence of fraud that we can find — the trends in the preliminary count, the lack of any big jump in support for Morales after the halt, and the size of Morales’s margin all appear legitimate. All in all, the OAS’s statistical analysis and conclusions would appear deeply flawed.”

As CEPR’s Jake Johnston said today in response to the New York Times article:

For those paying close attention to the 2019 election, there was never any doubt that the OAS’ claims of fraud were bogus. Just days after the election, a high-level official inside the OAS privately acknowledged to me that there had been no “inexplicable” change in the trend, yet the organization continued to repeat its false assertions for many months with little to no pushback or accountability.

Yet those reasons for doubting the OAS accusations were barely ever even mentioned, let alone vested with credibility, by the U.S. media or its leading foreign policy commentators. Instead, as the MIT scholars wrote in the Washington Post, “the media largely reported the allegations of fraud as fact.” That’s because whenever it comes to changing a foreign country’s government that is disliked by the U.S., the U.S. media reflexively sides with the U.S. State Department and ceases to report and instead engages in pro-government propaganda.

In this case, Bolivia lost the most successful president in its modern history, and is consequently now ruled by an unelected military junta, all cheered on by the U.S. and its media, relying on an OAS report which even the New York Times is now forced to acknowledge is, at best, deeply flawed. Thus did the U.S. government and its media, yet again, help destroy a thriving Latin American democracy.

The ‘Liberal World Order’ Was Built With Blood
By Vincent Bevins

In 1965 and 1966, the American government assisted in the murder of approximately one million Indonesian civilians. This was one of the most important turning points of the Cold War — Indonesia is the world’s fourth most populous country, and policymakers at the time understood it was a far more valuable prize than Vietnam. But it’s largely forgotten in the English-speaking world precisely because it was such a success. No American soldiers died; little attention was drawn to one more country pulled, seemingly naturally, into the United States’ orbit.

But the process was not natural. The U.S.-backed military used a failed uprising as a pretext to crush the Indonesian left, whose influence Washington had been seeking to counter for a decade, and then took control of the country. Recently declassified State Department documents make it clear that the United States aided and abetted the mass murder in Indonesia, providing material support, encouraging the killings and rewarding the perpetrators.

It was not the first time the United States had done something like this. In 1954, the American ambassador to Guatemala reportedly handed kill lists to that country’s military. And in Iraq, in 1963, the C.I.A. provided lists of suspected communists and leftists to the ruling Baath Party.

How ‘Jakarta’ Became the Codeword for US-Backed Mass Killing
By Vincent Bevins

When the conflict came, and when the opportunity arose, the US government helped spread the propaganda that made the killing possible and engaged in constant conversations with the Army to make sure the military officers had everything they needed, from weapons to kill lists. The US embassy constantly prodded the military to adopt a stronger position and take over the government, knowing full well that the method being employed to make this possible was to round up hundreds of thousands of people around the country, stab or strangle them, and throw their corpses into rivers. The Indonesian military officers understood very well that the more people they killed, the weaker the left would be, and the happier Washington would be.

It wasn’t only US government officials who handed over kill lists to the Army. Managers of US-owned plantations furnished them with the names of “troublesome” communists and union organizers, who were then murdered.

The prime responsibility for the massacres and concentration camps lies with the Indonesian military. We still do not know if the method employed—disappearance and mass extermination—was planned well before October 1965, perhaps inspired by other cases around the world, or planned under foreign direction, or if it emerged as a solution as events unfolded. But Washington shares guilt for every death. The United States was part and parcel of the operation at every stage, starting well before the killing started, until the last body dropped and the last political prisoner emerged from jail, decades later, tortured, scarred, and bewildered. At several points that we know of—and perhaps some we don’t—Washington was the prime mover and provided crucial pressure for the operation to move forward or expand.

And in the end, US officials got what they wanted. It was a huge victory. As historian John Roosa puts it, “Almost overnight the Indonesian government went from being a fierce voice for cold war neutrality and anti-imperialism to a quiet, compliant partner of the US world order.”

The World the Jakarta Method Built: A Conversation with Vincent Bevins

How does this story affect how you view our contemporary world?

One of the most striking things about spending so much time with the Indonesians and Latin Americans who were active politically in the ’50s and ’60s is that, when you saw them speak about the way they understood how the world was going to unfold, you could see this parallel universe open up behind their eyes. You could see that, in the ’50s and ’60s, they had an idea of what the world would become. It was a wondrous and very plausible vision, one that they never lost. And it was a vision that I never grew up knowing existed. Spending time with these people made me realize that the type of globalization that we’ve got was just one of the possible types we could have had. It was certainly not inevitable or natural. It was not that purely the best system won, and that everyone else was bound to fail and eventually got out of the way. The type of globalization that we got, the type of world that we have now, was shaped to a profound extent by the way that the Cold War was fought. And it was profoundly shaped by the fact that mass murder programs were carried out against unarmed civilians in the service of constructing authoritarian capitalist regimes in the Third World, and in the service of constructing a US-led global system. And to recognize that this world was built by violence, I think, might lead each person to question for themselves if we should stick with this world forever, if this is definitely the best possible world that could ever exist — or if, maybe, you could imagine an improvement. Could you imagine something that’s better?

A Cold War legacy many Americans forget
By Ishaan Tharoor

The massacres you write about in the book — particularly what took place in Indonesia — seem to have been memory-holed in the U.S., but also to some extent in Indonesia, too. How was this act of forgetting possible?

In Indonesia the answer is clear — the dictatorship hid the truth of what happened, and to this day it is illegal to defend “communism” in any way, so serious discussion of 1965 can be very risky for Indonesians. The military that took power that year is still very influential, and even under President Jokowi they have screened the gruesome Suharto-era propaganda film blaming the left for the violence. In the U.S., I can only guess. But I think that the hugely consequential flip in 1965 — from a left-leaning anti-imperialist nation to a U.S.-aligned authoritarian nation open for business — was quickly overshadowed by the Vietnam War. The conflicts in Indochina actually affected domestic policy and U.S. citizens, whereas Indonesia was just a quick and painless victory for the West. But it was not painless at all for Indonesians, of course.

That kind of American blind spot seems significant.

I think it’s a gaping hole, and a really serious problem. It stops us from understanding the nature of the global system created by the Cold War, and blocks us from understanding the way that many other nations see us.

Is there a danger that, in framing the ‘nature of the global system’ this way, you perhaps obscure the genuine fear that communist factions inspired in parts of the world?

I don’t think it must, and I hope in the case of my book it does not. Some officials in the U.S. and elsewhere acted cynically, exaggerating the communist threat, but many others genuinely believed they were doing the right thing or saving the world. Now, that is also true for individuals acting in every kind of regime imaginable. What matters for us now is the consequences.

I really don’t think there’s a lot of evidence that authoritarian socialism of a North Korean or Stalinist type would have “won the Cold War” if Jacobo Árbenz was allowed to implement land reform in Guatemala, or the unarmed Indonesian communists, who repeatedly did well in elections, were not rounded up and murdered, or if they had let Allende run Chile without facing right-wing terror. But even if you take that more extreme position, I think recognizing the following is important: The Soviet system is gone. But we still live in a world directly shaped by pro-capitalist violence that is often unrecognized and unresolved.

Personally, I have no desire to minimize the real and well-recognized crimes committed by communist leaders. What I hope is that our side is held to the same standards.

Trial Of Sept. 11 Defendants At Guantánamo Delayed Until August 2021
By Sacha Pfeiffer

A new U.S. military court judge has canceled all hearings in the case until next year and delayed the start of the trial of the five defendants charged in the Sept. 11, 2001, terror attacks until at least August 2021.

Tuesday’s delay order by Judge Keane, the fourth judge to oversee the 9/11 case, is the latest stumbling block at Guantánamo’s problem-plagued military court and prison, which NPR found has cost U.S. taxpayers more than $6 billion since 2002. Other recent complications include:

  1. The previous 9/11 judge, Air Force Col. W. Shane Cohen, left abruptly after nine months on the job, citing family concerns.
  2. The former administrative head of the military court, Christian Reismeier, moved to a different role after being in his position for less than a year.
  3. James P. Harrington, the lead attorney for one of the 9/11 defendants, asked to leave the case, citing health issues and “incompatibility” with his client.
  4. David Bruck, the new lead attorney assigned to represent Harrington’s client, said he needs 2 1/2 years to prepare for trial.

All of those personnel changes cost the court time.

Guantánamo’s prison still holds 40 men, down from nearly 800 people who have been detained there since it opened in 2002. Some of the 40 remaining prisoners have been held for more than 18 years without being charged, and some have been cleared for release but remain incarcerated. Guantánamo prosecutors have finalized only one conviction in the military court’s history.

Architect of C.I.A. Torture Program Testifies Prisoners Acted Well Adjusted
By Carol Rosenberg

After 183 rounds of waterboarding, Khalid Shaikh Mohammed, the man accused of plotting the Sept. 11, 2001, attacks, spent his years in C.I.A. detention as a charming captive who dabbled in Islamic mysticism and engaged in pleasantries with the psychologist who waterboarded him, that psychologist told a war crimes prosecutor on Thursday.

James E. Mitchell, who as a contractor for the C.I.A. helped develop the agency’s interrogation program and handled all the waterboarding, said Mr. Mohammed managed so well in his last three years in the secret prisons after the violent questioning had ended that the two men would sit and hold hands, as Middle Eastern men sometimes do.

He said Mr. Mohammed put on a “charm initiative” and described him as a well-adjusted detainee who never expressed fear of him, gave the psychologist a nickname, “Abu Captain,” and sought his help in improving his conditions as he was moved through the different C.I.A. black sites.

Mr. Mohammed’s lawyer rejected Dr. Mitchell’s account, saying his client was motivated by fear in dealing with the psychologist.

Chains, Shackles and Threats: Testimony on Torture Takes a Dramatic Turn
By Carol Rosenberg

It was the sixth day of testimony by Dr. Mitchell in a pretrial hearing focused on the torture of the defendants during their three and four years of C.I.A. captivity, before they were sent to the military prison at Guantánamo Bay.

For much of last week, lawyers questioned Dr. Mitchell about documents, intelligence and alphanumeric codes used to mask the identities of people who worked at the black sites and obscure the locations of the prisons.

But the tone changed dramatically on Monday, when Dr. Mitchell testified that he threatened to kill one of Mr. Mohammed’s sons if there was another attack on America.

He said he did so after he consulted a lawyer at the agency’s Counterterrorism Center about how to make the threat without violating the Torture Convention.

He said he was advised to make the threat conditional.

So, before telling Mr. Mohammed “I will cut your son’s throat,” Dr. Mitchell said, he added a series of caveats. They included “if there was another catastrophic attack in the United States,” if Mr. Mohammed withheld “information that could have stopped it” and “if another American child was killed.”

Dr. Mitchell said he made the threat in March 2003 as “an emotional flag” as he was transitioning from waterboarding and other violent “enhanced interrogation techniques” to more traditional questioning of Mr. Mohammed.

Psychologist Who Waterboarded for C.I.A. to Testify at Guantánamo
By Carol Rosenberg

In the black sites, the defendants were kept in solitary confinement, often nude, at times confined to a cramped box in the fetal position, hung by their wrists in painful positions and slammed head first into walls. Those techniques, approved by George W. Bush administration lawyers, were part of a desperate effort to force them to divulge Al Qaeda’s secrets — like the location of Osama bin Laden and whether there were terrorist sleeper cells deployed to carry out more attacks.

A subsequent internal study by the C.I.A. found proponents inflated the intelligence value of those interrogations.

Pentagon Moves to Block Exam of Tortured Guantánamo Prisoner
By Carol Rosenberg

Mr. Qahtani, who was diagnosed with schizophrenia, was captured along the Pakistan-Afghanistan border three months after the Sept. 11, 2001, terrorist attacks and subjected to two months of continuous, brutal interrogation at Camp X-Ray at Guantánamo in late 2002 and early 2003.

Leaked documents show that Mr. Qahtani was deprived of sleep and water, kept nude and was menaced by dogs, while under the care of military medics.

“There is no real question that the examination will confirm my client’s severe illnesses and lead to his medical repatriation,” said Mr. Kassem, a law professor whose clinic at City University of New York represents Mr. Qahtani.

Mr. Qahtani has been suspected of being one of several failed, aspirational 20th hijackers in the Sept. 11 attacks. But he has never been charged with a crime, in part because he was tortured.

Bali bombings: US to move ahead with trial of suspects held in Guantánamo
By The Associated Press

The Pentagon has announced plans to move ahead with a military trial for three men held at Guantánamo Bay who are suspected of involvement in the 2002 Bali bombings.

After an unexplained delay, a senior military legal official on Thursday approved non-capital charges that include conspiracy, murder and terrorism for the three men, who have been in US custody for 17 years for their alleged roles in the deadly 2002 bombing of Bali nightclubs and, a year later, of the JW Marriott Hotel in Jakarta.

The bombings on the island of Bali killed 202 people, mostly foreign tourists, including 88 Australians and three New Zealanders.

Military prosecutors filed charges against Encep Nurjaman, an Indonesian known as Hambali, and the other two men in June 2017.

Hambali is alleged to have been the leader of Jemaah Islamiyah, a south-east Asian affiliate of al-Qaida. The Pentagon said in a brief statement on the case that he is accused with Mohammed Nazir bin Lep and Mohammed Farik bin Amin, who are from Malaysia, of planning and aiding the attacks.

All three were captured in Thailand in 2003 and held in CIA custody before they were taken to Guantánamo three years later.

The timing of the charges, which had been submitted under Donald Trump but not finalised, caught lawyers for the men by surprise and would seem to be in conflict with president Joe Biden’s intention to close the detention centre.

Gen Lloyd Austin, Biden’s nominee to be secretary of defence, this week reaffirmed the intention to close Guantánamo to the Senate committee considering his nomination.

“The timing here is obvious, one day after the inauguration,” said Marine Corps Maj James Valentine, the appointed military attorney for the most prominent of the three. “This was done in a state of panic before the new administration could get settled.”

A spokesman for the military commissions, which have been bogged down for years over legal challenges largely centred around the brutal treatment of men during their previous confinement in CIA detention facilities, had no immediate comment.

Guantánamo’s Darkest Secret
By Ben Taub

After the attacks, Cofer Black, the head of the C.I.A.’s Counterterrorism Center, who had served as the agency’s Khartoum station chief while bin Laden was in Sudan, assured President George W. Bush that men like Abu Hafs would soon “have flies walking across their eyeballs.” The next day, he ordered Gary Schroen, the agency’s former Kabul station chief, to gather a team for a paramilitary mission. “I want to see photos of their heads on pikes,” Black said, according to Schroen’s memoir, “First In,” published in 2005. “I want bin Ladin’s head shipped back in a box filled with dry ice. I want to be able to show bin Ladin’s head to the President.” Black added that he and Bush wanted to avoid the spectacle of a courtroom trial. “It was the first time in my thirty-year CIA career that I had ever heard an order to kill someone,” Schroen wrote.

On September 26th, Schroen and six other officers loaded an aging Soviet helicopter with weapons, tactical gear, and three million dollars in used, nonconsecutive bills. They took off from Uzbekistan and flew into northern Afghanistan, over the snow-capped mountains of the Hindu Kush. There, Schroen contacted the leaders of the Northern Alliance, an armed group that had spent years fighting the Taliban, with little external support. Schroen recalled, “When I began to distribute money—two hundred thousand dollars here, two hundred and fifty thousand dollars for this—I think that they were convinced that we were sincere.” In the next few weeks, Schroen’s C.I.A. team and their Afghan counterparts travelled through much of northern Afghanistan, laying the groundwork for the U.S. military invasion.

At sunrise, the plane landed at Bagram Airfield, the largest U.S. military base in Afghanistan. For the first time, Salahi was in the custody of uniformed American soldiers. “Where is Mullah Omar?” they asked. “Where is Osama bin Laden?” They shouted and threw objects against the wall. Salahi had been living in a cell practically since the beginning of the invasion, nine months earlier.

Military personnel took his biometric information, and logged his health problems—including a damaged sciatic nerve—then led him to a cell. The punishment for talking to another detainee was to be hung by the wrists, feet barely touching the ground. Salahi saw a mentally ill old man subjected to this method. “He couldn’t stop talking because he didn’t know where he was, nor why,” Salahi wrote.

During interrogations, an intelligence officer, known among the detainees as William the Torturer, forced Salahi into stress positions that exacerbated his sciatic-nerve issues. “His specialty was in brutalizing detainees who were considered important, but not valuable enough to get them tickets to the secret CIA prisons,” Salahi wrote. Another officer tried to build rapport with Salahi by speaking to him in German. “Wahrheit macht frei,” the officer said—the truth sets you free. “When I heard him say that, I knew the truth wouldn’t set me free, because ‘Arbeit’ didn’t set the Jews free,” Salahi recalled. (The phrase “Work sets you free” appeared on the gates of Auschwitz and other Nazi concentration camps.)

Each detainee was given a number, and, on August 4th, thirty-four of those numbers were called, including Salahi’s. The men were dragged out of their cells. Military police officers put blackout goggles over their eyes and mittens on their hands, then hooded them, lined them up, and tied each detainee to the one in front of him and the one behind him. Then the men were loaded onto an airplane. “When my turn came, two guards grabbed me by the hands and feet and threw me toward the reception team,” Salahi wrote. “I don’t remember whether I hit the floor or was caught by the other guards. I had started to lose feeling and it would have made no difference anyway.”

For some thirty hours, Salahi was strapped to a board. Medical records indicate that he weighed a hundred and nine pounds—around thirty per cent less than his normal weight. He was belted so tightly that he struggled to breathe, but he didn’t have the English vocabulary to tell the guards.

In the minutes before the first detainees set foot on Guantánamo, “you could literally hear a pin drop,” Brandon Neely, a military-police officer, recalled, in an interview with the Guantánamo Testimonials Project, at the University of California, Davis, in 2008. “Everyone, including myself, was very nervous,” he said. It was January 11, 2002. The Bush Administration had decided that the Geneva Conventions did not apply to the war on terror, which meant that the men captured abroad could be deprived of the rights of prisoners of war. That day, Neely’s job was to haul captives from a bus to a holding area for processing, and then to small, outdoor cages, where they would spend nearly four months sleeping on rocks, and relieving themselves in buckets, while soldiers constructed more permanent cellblocks. “I keep thinking, Here it comes—I am fixing to see what a terrorist looks like face to face,” Neely, who was twenty-one at the time, said.

The first man off the bus had only one leg. He wore handcuffs, leg shackles, earmuffs, blackout goggles, a surgical mask, and a bright-orange jumpsuit. As two M.P.s dragged him to the holding area, someone tossed his prosthetic leg out of the bus. All afternoon, guards screamed at the detainees to shut up and walk faster, called them “sand niggers,” and said that their family members and countries had been obliterated by nuclear bombs.

Later that day, Neely and his partner brought an elderly detainee to the holding area and forced him to his knees. When they removed his shackles, the man, who was shaking with fear, suddenly jerked to the left. Neely jumped on top of him, and forced his face into the concrete floor. An officer shouted “Code Red!” into a radio, and the Internal Reaction Force team raced to the scene and hog-tied him. He was left for hours in the Caribbean sun.

Neely later found out that the elderly detainee had jerked because, when he was forced to his knees, he thought he was about to be shot in the back of the head. In his home country, Neely said, “this man had seen some of his friends and family members executed on their knees.” The man’s response was hardly unique; a military document, drafted ten days later for the base commander, noted that “the detainees think they are being taken to be shot.”

Officially, the job of the Internal Reaction Force was to restrain unruly detainees, to prevent them from injuring themselves or the guards. But, in practice, “IRFing” was often done as a form of revenge, initiated liberally—for example, when a detainee was found to have two plastic cups instead of one, or refused to drink a bottle of Ensure, because he thought that he was being given poison. IRFing typically involved a team of six or more men dressed in riot gear: the first man would pepper-spray the detainee, then charge into the cell and, using a heavy shield and his body weight, tackle the detainee; the rest would jump on top, shackling or binding the detainee until he was no longer moving. Although many of the detainees arrived malnourished, with their bodies marked by bullet wounds and broken bones, some IRF teams punched them and slammed their heads into the ground until they were bloody and unconscious. “You could always tell when someone got IRFed, as the detainees throughout the camp would start chanting and screaming,” Neely recalled. Once, he watched an IRF team leader beat a detainee so badly that he had to be sent to the hospital and the floor of his cell was stained with blood; the next time the team leader was in the cellblock, another detainee yelled out, “Sergeant, have you come back to finish him off?”

In Islam, the Quran is considered the transcribed word of God; some Muslims keep the book wrapped in cloth, never letting it touch unclean surfaces. To dispel notions that the United States was at war with Islam, detainees were allowed to have private meetings with a Muslim military chaplain, and were given copies of the Quran. Some guards saw an opportunity to torment the detainees—by tossing the Quran into the toilet, for example, or by breaking the binding under the guise of searching for “weapons.” Desecration of the Quran provoked riots in the cellblocks, which resulted in IRF teams storming into the cells and beating up detainees.

One day, after an interrogator kicked a Quran across the floor, detainees organized a mass suicide attempt. “Once every fifteen minutes, a prisoner tried to hang himself by tying his sheet around his neck and fastening it through the mesh of the cage wall,” James Yee, an Army captain who served as the Muslim chaplain in Guantánamo, recalled in his memoir, “For God and Country,” from 2005. “As soon as the prisoner was taken to the hospital, another detainee would be found—his sheet wound around his neck and tied to his cage wall. The guards would rush in to save him and the chaos would start again. The protest lasted for several days as twenty-three prisoners tried to hang themselves.”

Military-police officers so frequently abused the Quran during cell searches that detainees demanded that the books be kept in the library, where they would be safe. Yee, who had converted to Islam in the early nineties, sent a request up the chain of command, but was rebuffed. “I felt this decision stemmed from the command’s desire to be able to tell the media that we gave all detainees a Quran out of sensitivity to their religious needs,” he wrote. The detainees protested, and so “it was decided that every detainee who refused the Quran would be IRFed.” While the detainees were receiving medical treatment for their post-IRF injuries, the Qurans were placed back in their cells.

In time, Yee came to believe that “Islam was systematically used as a weapon against the prisoners.” Guards mocked the call to prayer, and manipulated Islamic principles of modesty—by having female guards watch naked detainees in the showers, for example—to create tension as an excuse to exact violence. During interrogations, detainees were forced to perform mock satanic rituals, or were draped in the Israeli flag.

Donald Rumsfeld told reporters that the men in Guantánamo were “among the most dangerous, best-trained, vicious killers on the face of the earth.” But after Brandon Neely’s first shift, on the day the detention camp opened, “no one really spoke much,” he recalled. “I went back to my tent and laid down to go to sleep. I was thinking, Those were the worst people the world had to offer?”

In Afghanistan, the U.S. military was inadvertently presiding over a kidnapping-and-ransom industry. Helicopters dropped flyers in remote Afghan villages, offering “wealth and power beyond your dreams” to anyone who turned in a member of Al Qaeda or the Taliban. “You can receive millions of dollars,” one of the flyers said. “This is enough money to take care of your family, your village, your tribe for the rest of your life.” A common bounty was five thousand dollars—far more money than most Afghans earned in a year—and “the result was an explosion of human trafficking” by various armed groups, Mark Fallon, the deputy commander of Guantánamo’s Criminal Investigation Task Force, wrote in his memoir, “Unjustifiable Means,” which was heavily redacted before being published, in 2017. As Michael Lehnert, a Marine Corps major general who briefly served as the detention camp’s first commander, later testified to Congress, “What better way to enrich yourself, while resolving old grudges, than to finger a neighbor who was your enemy, regardless of his support for either Al Qaeda or the Taliban?”

According to Fallon, “The Northern Alliance would jam so many detainees into Conex shipping containers that they started to die of suffocation. Not wanting to lose their bounties, the captors sprayed the tops of the boxes with machine guns to open ventilation holes. A lot of these prisoners were actually looking forward to being handed over to the Americans, figuring it would be pretty obvious they weren’t Al Qaeda.” Yet hundreds of them were sent to Guantánamo Bay, which ended up housing seven hundred and eighty people.

In public, the Bush Administration and its military leadership asserted that Guantánamo was filled with men who would stop at nothing to destroy the U.S. But, on the base, Fallon and his colleagues referred to most detainees as “dirt farmers.” Lehnert lamented, “It takes an Army captain to send someone to Gitmo, and the President of the United States to get them out.”

Bush Administration lawyers had taken the position that “enemy combatants” could be held indefinitely, without trials, and that in order for something to qualify as “torture” it “must be equivalent in intensity to the pain accompanying serious physical injury, such as organ failure, impairment of bodily function, or even death.”

In 1967, Martin Seligman, a twenty-four-year-old Ph.D. student in psychology, conducted an experiment that involved delivering electric shocks to dogs in various states of restraint. The goal was to assess whether inescapable pain could condition an animal into “learned helplessness,” whereby it simply accepts its fate. Thirty-five years later, the United States government drew inspiration from this experiment in its approach to interrogating terror suspects.

The plan, conceived by James Mitchell, a psychologist working on contract for the C.I.A., was to induce learned helplessness in humans by combining an individually tailored regimen of torture techniques with environmental manipulation. The techniques—which government documents identify as “omnipotence tactics,” “degradation tactics,” “debilitation tactics,” and “monopolization of perception tactics”—had been developed by Communist forces during the Korean War, to coerce prisoners into making false confessions, for propaganda purposes. Since then, the U.S. military has exposed some élite soldiers to the techniques, to prepare them for the kinds of abuses they might encounter should they be captured by terrorist groups or governments that don’t abide by the Geneva Conventions. Mitchell argued that, by reverse-engineering this program, interrogators could overwhelm whatever resistance training a detainee might have absorbed from the Manchester manual, an Al Qaeda training document recovered by police in Manchester, England. What followed was a period of experimentation—overseen by psychologists, lawyers, and medical personnel—at C.I.A. black sites and military facilities. In September, 2002, Army officers started referring to Guantánamo as “America’s Battle Lab.”

Early in the afternoon of October 2, 2002, a group of interagency lawyers and psychologists met to come up with a framework that used “psychological stressors” and environmental manipulation to “foster dependence and compliance.” The C.I.A. had been torturing detainees at black sites for several months; now the Guantánamo leadership wanted to understand the legal gymnastics that would be required to implement a program of their own. “Torture has been prohibited by international law, but the language of the statutes is written vaguely,” Jonathan Fredman, a senior C.I.A. lawyer, said, according to the meeting minutes. “It is basically subject to perception. If the detainee dies, you’re doing it wrong.” (Fredman has disputed the accuracy of the meeting minutes.)

Later that month, a military lawyer named Diane Beaver drafted a legal justification—described later by a congressional inquiry on torture as “profoundly in error and legally insufficient”—for a set of abusive interrogation techniques. Among such methods as forced nakedness, dietary manipulation, daily twenty-hour interrogations, waterboarding, exposure to freezing temperatures, and the withholding of medical care, Beaver endorsed “the use of scenarios designed to convince the detainee that death” was “imminent.” (She later expressed surprise that her legal opinion had become “the final word on interrogation policies and practices within the Department of Defense.”) An accompanying memo, drafted by a military psychologist and a psychiatrist, explained that “all aspects of the environment should enhance capture shock, dislocate expectations, foster dependence, and support exploitation to the fullest extent possible.”

In November, 2002, the set of proposed techniques landed on Donald Rumsfeld’s desk. He signed it. “Why is standing limited to 4 hours?” he wrote in the margin, referring to a proposed stress position. “I stand for 8-10.”

Twenty-hour interrogations. “You know, when you just fall asleep and the saliva starts to come out of your mouth?” Salahi said. No prayers, no information about the direction of Mecca. No showers for weeks. Force-feeding during the daylight hours of Ramadan, when Muslims are supposed to fast. “We’re gonna feed you up your ass,” an interrogator said.

Medical personnel had noted that Salahi had sciatic-nerve issues; now interrogators kept him in stress positions that exacerbated them. No chairs, no lying down, no more access to his prescription pain medication. “Stand the fuck up!” an interrogator said. But Salahi was shackled to the floor, so he could do so only hunched over. He stayed that way for hours. The next time the Red Cross delegation visited Guantánamo, a representative reported that “medical files are being used by interrogators to gain information in developing an interrogation plan.”

Female interrogators groped him. They stripped, and rubbed their bodies all over his, and threatened to rape him. “Oh, Allah, help me! Oh, Allah, have mercy on me!” one of them said, mockingly. “Allah! Allah! There is no Allah. He let you down!” An interrogation memo listed plans to shave Salahi’s head and beard, dress him in a burqa, and make him bark and perform dog tricks, “to reduce the detainee’s ego and establish control.”

The interrogators head-butted him, and made degrading remarks about his religion and his family. They kept him in alternately hot and cold cells, blasted him with strobe lights and heavy-metal music, and poured ice water on him. One day they would deprive him of food, and the next they’d force him to drink water until he vomited. According to interrogation memos, they decorated the walls with photos of genitalia, and set up a baby crib, because he was sensitive about the fact that he had no children.

On July 17, 2003, a masked interrogator told Salahi that he had dreamed that he saw other detainees digging a grave and tossing a pine casket with Salahi’s detainee number into it. The interrogator added that, if Salahi didn’t start talking, he would be buried on “Christian, sovereign American soil.”

On August 2nd, military records show, an interrogator told Salahi that he and his colleagues “are sick of hearing the same lies over and over and over and are seriously considering washing their hands of him. Once they do, he will disappear and never be heard from again.” Salahi was told to imagine “the worst possible scenario he could end up in,” and that he would “soon disappear down a very dark hole. His very existence will become erased. His electronic files will be deleted from the computer, his paper files will be packed up. . . . No one will know what happened to him, and eventually, no one will care.”

That day, the leader of Salahi’s interrogation came in. He identified himself as Captain Collins, a Navy officer who had been sent to Guantánamo by the White House. (His name was actually Richard Zuley; he was a Chicago police detective, working as a military contractor, who has an extensive record of abusing suspects until they confessed to crimes that they hadn’t committed. He did not respond to requests for comment.) Zuley read Salahi a letter, later shown to be forged, stating that his mother was in U.S. custody and might soon be transferred to Guantánamo. According to government records, “the letter referred to ‘the administrative and logistical difficulties her presence would present in this previously all-male prison environment,’ ” implying that she would be raped.

On August 13th, Donald Rumsfeld authorized the interrogation plan for Salahi. The document he signed listed one aim of the abuse as to “replicate and exploit the ‘Stockholm Syndrome,’ ” in which kidnapping victims come to trust and feel affection for their captors.

Twelve days later, a group of men charged into Salahi’s cell with a snarling German shepherd. They punched Salahi in the face and the ribs, then covered his eyes with blackout goggles, his ears with earmuffs, and his head with a bag. They tightened the chains on his ankles and wrists, then threw him into the back of a truck, drove to the water, and loaded him into a speedboat. “I thought they were going to execute me,” Salahi wrote.

He was driven around for three hours, to make him think that he was being transported to a different facility. He was forced to swallow salt water, and, every few minutes, the men packed ice cubes between his clothes and his skin. When the ice melted, they punched him, then repacked the ice to freeze him again. By the end of the boat ride, Salahi was bleeding from his ankles, mouth, and wrists. Seven or eight of his ribs were broken.

Back on land, Salahi was carried to Echo Special, a trailer that would be his home for several years. For the next month, he was kept in total darkness; his only way of knowing day from night was to look into the toilet and see if there was brightness at the end of the drain. “To be honest I can report very little about the next couple of weeks,” Salahi wrote, “because I was not in the right state of mind.”

Soon afterward, an interrogator e-mailed Diane Zierhoffer, a military psychologist, with concerns about Salahi’s mental health. “Slahi told me he is ‘hearing voices’ now,” the interrogator wrote. “Is this something that happens to people who have little external stimulus such as daylight, human interaction etc???? Seems a little creepy.”

“Sensory deprivation can cause hallucinations, usually visual rather than auditory, but you never know,” Zierhoffer replied. “In the dark you create things out of what little you have.”

While Salahi was being tortured, James Yee, the Muslim military chaplain, discovered that he and the interpreters at Guantánamo—many of whom were Muslim Americans, with Middle Eastern backgrounds—were being spied on by law-enforcement and intelligence officers. When Yee went on leave, he flew to Jacksonville, Florida, where he was interrogated and arrested, then blindfolded, earmuffed, and driven to a Navy brig in South Carolina. For seventy-six days, he lived in solitary confinement, in a cold cell with surveillance cameras and the lights always on. Government officials suggested that Yee was running an elaborate spy ring—that he and other Muslims had “infiltrated” the military, and represented the gravest insider threat since the Cold War. Based on a misreading of materials in his possession, and the vague aspersions of Islamophobic military officers, prosecutors accused him of treason and “aiding the enemy,” and threatened to pursue the death penalty. (All charges were later dropped, and Yee was honorably discharged.)

In the military hearing, Salahi described the torture program in vivid detail. The transcript omits much of his testimony, noting that, at the moment he started to describe the abuse, “the recording equipment began to malfunction” and that the tapes were “distorted.” The transcript continues, “The Detainee wanted to show the Board his scars and location of injuries, but the board declined the viewing.” (By now, the U.S. government was rolling back authorizations for torture techniques, and the military and the C.I.A. were entering a period of self-reflection; during the next several years, internal and congressional investigations would expose many of the worst abuses that had been inflicted on Salahi and other men in custody.)

“Everything that happened to me—everything I witnessed in Guantánamo Bay—happened in the name of democracy, in the name of security, in the name of the American people,” Salahi told the audience at the Amnesty event. He added that, as the world’s most powerful democracy, the United States had “the means to uphold and pressure other countries to uphold human rights. But instead the United States is stating to the world very clear and loud that democracy does not work—that when you need to get down and dirty, you need a dictatorship. That dictatorship was built in Guantánamo Bay.”

Donald Trump Is a Menace to American Democracy. But He Didn’t Come Out of Nowhere.
By Daniel Denvir

On January 19, 2017, the day before Trump took office, the United States was a country with a fossil-fueled economy charging headlong into climate chaos, a gargantuan archipelago of mass imprisonment, a militarized southern border, First World riches alongside Third World poverty, legalized corruption dictating the path of lawmaking, and permanent global war unhinged from domestic or international law.

Trump is a creature of the social order that preceded his government, not an extraterrestrial.

The CIA and its allies in both parties spent years falsely claiming that torture was necessary to stop terrorism and find Osama bin Laden; then, under Obama stalwart John Brennan, CIA officers spied on their Senate overseers at the very moment that an investigation threatened to expose them. Brennan (alongside fellow national security alumni) took up work as a TV commentator, earning a living by making the implausible case that Trump’s behavior is unprecedented.

Like the wars on crime and immigration, the war on terror exceeded the already capacious limits of legality. Indeed, the new enemy provided fresh license for official impunity. In 2005, the New York Times published revelations that the Bush administration had authorized the NSA to intercept international phone calls of US citizens without a warrant.

The Times, however, had already possessed the story in 2004, ahead of George W. Bush’s reelection. They decided not to publish it under government pressure. Editors only agreed to put it in the paper more than a year later because James Risen, one of the two Times reporters who had written the story, was about to publish the revelations in a book.

“Three years after 9/11, we, as a country, were still under the influence of that trauma, and we, as a newspaper, were not immune,” Bill Keller, the paper’s executive editor, later reflected. “It was not a kind of patriotic rapture. It was an acute sense that the world was a dangerous place.” But Keller’s unquestioning fealty to the national security state is precisely what the long hangover from post-9/11 patriotic rapture looks like.

When the CIA Interferes in Foreign Elections
By David Shimer

The CIA’s method, Panetta went on, was to “acquire media within a country or within a region that could very well be used for being able to deliver” a specific message or work to “influence those that may own elements of the media to be able to cooperate, work with you in delivering that message.” As in Italy in 1948 or Serbia in 2000, the programs that Panetta described complemented overt propaganda campaigns. “Even though we were operating on a covert basis,” he said, “you had to make sure that the overt methods that were being used at least delivered the same message.” Even this type of operation presented risks. “There is no question it’s a gamble,” Panetta continued, which is why it was an option of last resort and why more aggressive tactics had been sidelined.

Every interview pointed to the same conclusion: for the CIA, covert electoral interference has become the exception rather than the rule. Either the agency no longer seeks to influence election outcomes, as Brennan and Petraeus asserted, or it does so in rare cases when, as with Milosevic, a tyrant can be ousted by ballot. The exact truth is unknown. But this general shift marks a dramatic departure from the Cold War, when the CIA was interfering in the elections of “many, many” countries. Of this evolution, Negroponte, a former director of national intelligence, said, “Frankly, political action of that kind is really part of the past. Iraq convinced me of that. It was just zero appetite for [electoral] intervention.”

Skeptics will insist that the United States’ intelligence chiefs are lying. But considering present-day realities, the skeptics may be the ones defying logic. It would be self-defeating for the CIA to manipulate foreign elections in all but the most exceptional of circumstances. One reason why concerns the end of the Cold War, which robbed the CIA of its long-running purpose: to counter the Soviet Union. Milosevic, for one, was a relic of a previous era. In September 2001, the CIA found a new focus in counterterrorism, which called for drone strikes and paramilitary operations, not electoral interference.

The United States’ post–Cold War leaders declared an era of liberal democracy defined by free and fair elections. This transition, from containing communism to promoting democracy, made covert electoral interference a riskier proposition. As Michael Hayden, a former CIA director, explained, “Meddling in an electoral process cuts across the grain of our own fundamental beliefs. You might want to do it to level the playing field, you might want to do it because of just the demands of national security, but it doesn’t feel right.” McLaughlin elaborated upon Washington’s evolving outlook. “If you are interfering in an election and are exposed as doing so,” he said, “you are a lot more hypocritical than you would have appeared in the Cold War, when that sort of thing tended to be excused as part of the cost of doing business.”

Exclusive: Secret Trump order gives CIA more powers to launch cyberattacks
By Zach Dorfman, Kim Zetter, Jenna McLaughlin and Sean D. Naylor

The CIA’s new powers are not about hacking to collect intelligence. Instead, they open the way for the agency to launch offensive cyber operations with the aim of producing disruption — like cutting off electricity or compromising an intelligence operation by dumping documents online — as well as destruction, similar to the U.S.-Israeli 2009 Stuxnet attack, which destroyed centrifuges that Iran used to enrich uranium gas for its nuclear program.

The finding has made it easier for the CIA to damage adversaries’ critical infrastructure, such as petrochemical plants, and to engage in the kind of hack-and-dump operations that Russian hackers and WikiLeaks popularized, in which tranches of stolen documents or data are leaked to journalists or posted on the internet. It has also freed the agency to conduct disruptive operations against organizations that were largely off limits previously, such as banks and other financial institutions.

Another key change with the finding is that it lessened the evidentiary requirements that limited the CIA’s ability to conduct covert cyber operations against entities like media organizations, charities, religious institutions or businesses believed to be working on behalf of adversaries’ foreign intelligence services, as well as individuals affiliated with these organizations, according to former officials.

“Before, you would need years of signals and dozens of pages of intelligence to show that this thing is a de facto arm of the government,” a former official told Yahoo News. Now, “as long as you can show that it vaguely looks like the charity is working on behalf of that government, then you’re good.”

The CIA has wasted no time in exercising the new freedoms won under Trump. Since the finding was signed two years ago, the agency has carried out at least a dozen operations that were on its wish list, according to this former official. “This has been a combination of destructive things — stuff is on fire and exploding — and also public dissemination of data: leaking or things that look like leaking.”

Some CIA officials greeted the new finding as a needed reform that allows the agency to act more nimbly. “People were doing backflips in the hallways [when it was signed],” said another former U.S. official.

This more permissive environment may also intensify concerns about the CIA’s ability to secure its hacking arsenal. In 2017, WikiLeaks published a large cache of CIA hacking tools known as “Vault 7.” The leak, which a partially declassified CIA assessment called “the largest data loss in CIA history,” was made possible by “woefully lax” security practices at the CIA’s top hacker unit, the assessment said.

Eatinger, the former top CIA attorney, who retired in 2015, said it’s unclear to him whether the new cyber finding would be a return to the agency’s more freewheeling days of the 1980s, or something that goes even further. Either way, it’s a “big deal,” he said.

The US has suffered a massive cyberbreach. It’s hard to overstate how bad it is
By Bruce Schneier

We are still learning which US government organizations were breached: the state department, the treasury department, homeland security, the Los Alamos and Sandia National Laboratories (where nuclear weapons are developed), the National Nuclear Security Administration, the National Institutes of Health, and many more. At this point, there’s no indication that any classified networks were penetrated, although that could change easily. It will take years to learn which networks the SVR has penetrated, and where it still has access. Much of that will probably be classified, which means that we, the public, will never know.

While president-elect Biden said he will make this a top priority, it’s unlikely that he will do much to retaliate.

The reason is that, by international norms, Russia did nothing wrong. This is the normal state of affairs. Countries spy on each other all the time. There are no rules or even norms, and it’s basically “buyer beware.” The US regularly fails to retaliate against espionage operations – such as China’s hack of the Office of Personnel Management (OPM) and previous Russian hacks – because we do it, too. Speaking of the OPM hack, then director of national intelligence James Clapper said: “You have to kind of salute the Chinese for what they did. If we had the opportunity to do that, I don’t think we’d hesitate for a minute.”

We don’t, and I’m sure NSA employees are grudgingly impressed with the SVR. The US has by far the most extensive and aggressive intelligence operation in the world. The NSA’s budget is the largest of any intelligence agency. It aggressively leverages the US’s position controlling most of the Internet backbone and most of the major Internet companies. Edward Snowden disclosed many targets of its efforts around 2014, which then included 193 countries, the World Bank, the IMF, and the International Atomic Energy Agency. We are undoubtedly running an offensive operation on the scale of this SVR operation right now, and it’ll probably never be made public. In 2016, President Obama boasted that we have “more capacity than anybody both offensively and defensively.”

He may have been too optimistic about our defensive capability. The US prioritizes and spends many times more on offense than on defensive cybersecurity. In recent years, the NSA has adopted a strategy of “persistent engagement,” sometimes called “defending forward.” The idea is that instead of passively waiting for the enemy to attack our networks and infrastructure, we go on the offensive and disrupt attacks before they get to us. This strategy was credited with foiling a plot by the Russian Internet Research Agency to disrupt the 2018 elections.

If anything, the US’s prioritization of offense over defense makes us less safe. In the interests of surveillance, the NSA has pushed for an insecure cell phone encryption standard and a backdoor in random number generators (important for secure encryption). The DoJ has never relented in its insistence that the world’s popular encryption systems be made insecure through back doors – another hot point where attack and defense are in conflict. In other words, we allow for insecure standards and systems, because we can use them to spy on others.

There’s a lot of attack going on in the world. In 2010, the US and Israel attacked the Iranian nuclear program. In 2012, Iran attacked the Saudi national oil company. North Korea attacked Sony in 2014. Russia attacked the Ukrainian power grid in 2015 and 2016. Russia is hacking the US power grid, and the US is hacking Russia’s power grid – just in case the capability is needed someday. All of these attacks began as a spying operation. Security vulnerabilities have real-world consequences.

Cozy Bears and Hidden Cobras: The hackers targeting COVID-19 vaccine researchers
By James Purtill

Broadly, there are two kinds of hacking groups: those that are state-sponsored and others that focus on attacks for their own financial benefit.

FireEye refers to the first of these as advanced persistent threat (APT) groups and assigns them a number. The other groups, which often work with organised crime, are called FIN groups.

Other security companies have different naming systems. CrowdStrike assigns animal names according to what country they think the group is working for: ‘Maverick Panda’ is linked with China, ‘Fancy Bear’ with Russia, and ‘Charming Kitten’ with Iran.

As a result, one group can go by several names, depending on how widely they’re known: FireEye calls Fancy Bear APT28, while to other companies they’re Pawn Storm or Strontium.

Though these groups have names, relatively little is known about them — or at least made public.

So long as these groups remain anonymous, the nation-states that back them can deny responsibility for their attacks.

Many of these APT groups have been targeting COVID researchers, Mr Wellsmore said.

They include the Vietnamese group APT32 (also known as OceanLotusGroup), which in early 2020 attempted to hack China’s Ministry of Emergency Management in order to learn more about the Wuhan epidemic, according to FireEye.

“We’ve also seen threat groups we attribute to China — APT41 — and some others that we don’t have numbers for,” Mr Wellsmore said.

Other groups accused of targeting COVID researchers — either by cybersecurity companies or government agencies — include Fancy Bear and Cozy Bear (APT28 and APT29).

These groups are well known in security circles. Fancy Bear is accused of hacking the Democratic party computers in the run-up to the 2016 presidential election.

The North Korean group known as Hidden Cobra, Lazarus or APT38 has also been active.

Mr Shevchenko said APT groups sometimes plant “false flags” in order to frame groups linked to other countries.

The Lazarus group, for example, sometimes pretends to be Russian.

“Lazarus are very clumsy at planting Russian flags,” he said.

“They use a number of Russian words in the source code.

“For a native Russian speaker like myself, I can tell they’re from Google Translate.”

In Cyberwar, There are No Rules
By Tarah Wheeler

A major cyberattack against the United States in 2014 was a clear example of how civilians can bear the brunt of such operations. Almost all cybersecurity experts and the FBI believe that the Sony Pictures hack that year originated in North Korea. A hostile country hit a U.S. civilian target with the intention of destabilizing a major corporation, and it succeeded. Sony’s estimated cleanup costs were more than $100 million. The conventional warfare equivalent might look like the physical destruction of a Texas oil field or an Appalachian coal mine. If such a valuable civilian resource had been intentionally destroyed by a foreign adversary, it would be considered an act of war.

In the near future, attacks like the Sony hack will not be exceptional. There are countless vulnerabilities that could result in mass casualties, and there are no agreed norms or rules to define or punish such crimes.

When devastating attacks happen on U.S. soil, people use metonyms to describe them. No one has to describe the specifics of Pearl Harbor or 9/11; we already know what they signify. When the cyberattack that lives in infamy happens, it will be so horrifying that there won’t be a ready comparison. It won’t be the cyber-Pearl Harbor. It will have its own name.

Until that point, however, these attacks will remain nameless. People are frightened of what they can see and understand, not what they cannot imagine and do not comprehend, and, as a result, it’s easy to ignore the twice-removed effects of a quiet but deadly cyberattack. Given that it took more than a decade and a half to successfully prosecute war criminals from the Yugoslav wars of the mid-1990s even with overwhelming photographic evidence and personal testimony, it’s not surprising that the international community has a hard time agreeing on what constitutes a cyberattack deserving of reprisal—especially when countries can’t even settle on a definition for themselves.

In the run-up to the 2017 German parliamentary elections, a string of cyberattacks led to fears of Russian meddling, but according to the Charter of the United Nations, unless armed force has been brought to bear within the borders of a country, no internationally recognized act of aggression has occurred. This definition of war is hopelessly out of date.

Similarly, cyberattacks in the Netherlands in 2017 and 2018 resulted in the denial of government funding and vital services to citizens, but because conventional battlefield weapons weren’t used, the U.N. Charter’s provisions weren’t violated. Countries are beginning to coalesce around the idea that some forms of active countermeasures are justified in self-defense, if not in actual reciprocation, under international law.

Reaching an international consensus on what triggers a country’s right to self-defense in cyberspace requires a coherent, common understanding on where to draw the line between nefarious economic or intelligence activities and true cyberattacks.

In the absence of a binding global accord, the world will remain vulnerable to a motley mix of hackers, warriors, intelligence operatives, criminals, and angry teenagers—none of whom can be distinguished from behind three proxy servers. It would be nearly impossible to identify perpetrators with 100 percent confidence if they take even rudimentary steps to cover their digital tracks after cyberattacks.

Were disaster to strike Southern California tomorrow, scientific tests and forensic analysis would allow us to tell whether it was an earthquake or a bomb—even if both events could destroy approximately the same amount of property. Yet it would be very easy to confuse a distributed denial of service attack on a U.S. government website launched for fun by a few juvenile hackers in St. Petersburg with an attack launched by the Russian military to deliberately deny U.S. citizens the ability to register to vote or collect entitlements. Cyber-enabled disinformation campaigns are equally problematic to attribute and to punish. Despite the consensus among experts and intelligence services that Russia tampered with the 2016 U.S. presidential election, it is proving extremely difficult to gain nonpartisan consensus that Russian-targeted advertising purchases on social media constitute hostile acts by a foreign power.

The international community needs new habits for a new era. Leaders must follow NATO’s tentative footsteps in Tallinn and convene digital Geneva Conventions that produce a few deep, well-enforced rules surrounding the conduct of war in cyberspace. Cyberwar is the continuation of kinetic war by plausibly deniable means. Without a global consensus on what constitutes cyberwar, the world will be left in an anarchic state governed by contradictory laws and norms and vulnerable to the possibility of a devastating war launched by a few anonymous keystrokes.

Drones Are Destabilizing Global Politics
By Jason Lyall

Drones have become all the more alluring as they have succeeded in turning the tide on actual battlefields. Turkey has used drones to particular effect. A Syrian airstrike killed 36 Turkish soldiers operating near Syria’s northern Idlib Province last February, and Ankara retaliated by using TB2 drones to destroy dozens of tanks, air defenses, and armored vehicles, killing hundreds, possibly thousands, of Syrian soldiers. On behalf of a chastened Syria, Russia requested a cease-fire. Turkish TB2s were also decisive in breaking a military stalemate in Libya during Operation Peace Storm: the combined weight of TB2 drones and ground forces drove Libyan National Army forces from Tripoli and their stronghold in Tarhouna.

Tomorrow’s armed drones, propelled by rapid innovations in the commercial sector, will likely prove to be even more effective. Turkey is already working to extend the range of the TB2s used over Nagorno-Karabakh barely four weeks ago. And as unit costs fall, mass-produced drones will soon be able to swarm enemy defenses. In time, drones with individualized capabilities might combine to form hunter-killer teams to exploit an enemy’s battlefield vulnerabilities. With such affordable technology at hand, leaders may be hard-pressed to resist the temptation to restart frozen wars or even instigate new ones, especially if they believe that their advantages are temporary.

Already, some countries are investing in systems to counter drones, but these technologies are in their infancy. The defense is playing catch-up while the offense marches downfield. The gaps in short-range, low-level air defenses will be difficult to plug, at least in the near term. And offensive technology is simply cheaper: a Russian S-400 Triumf missile system costs $300 million and a Pantsir, about $14 million. By contrast, a TB2 costs only $5 million, and its MAM-L missile, used to deadly effect in Nagorno-Karabakh, comes at only $100,000 a pop. Countries that rely on expensive legacy systems for defense might find themselves unable to afford to protect their armies or replace their wartime losses. Until defenses shift to drone-based countermeasures, these costly systems will likely remain vulnerable.

Those countries that invest in armed drones will face a powerful temptation to restart simmering territorial conflicts or seek new advantages in those that have deadlocked. Chillingly, of the next ten countries predicted to acquire armed drones, nine are trapped in long-running territorial disputes or fighting internal wars. The international system may soon face a new round of conflicts propelled by the proliferation of armed drones.

Obama’s Brutal Drone Legacy Will Haunt the Biden Administration
By Emran Feroz

Under the Obama administration, Afghanistan became the world’s most drone-bombed country. President Barack Obama also expanded clandestine wars in countries where the United States was not officially at war, such as Pakistan, Yemen, and Somalia. In most of these countries, Obama’s drone wars fueled more extremism, militancy, and anti-American sentiment.

In 2015, the United Nations reported that Obama’s drones had killed more civilians in Yemen that year than al Qaeda did. In Pakistan’s tribal areas, adjacent to the porous border with Afghanistan, most drone strikes did not kill militants. In 2014, the London-based Bureau of Investigative Journalism found that fewer than 4 percent of all identified drone victims in Pakistan were actually militants affiliated with al Qaeda. The cases of most of Obama’s innocent victims remain unknown. Tariq Aziz, a 16-year-old anti-drone activist from North Waziristan who was killed while driving a car, was one. Momina Bibi, a grandmother torn to pieces by Obama’s drones in her backyard in front of her grandchildren, was another such case.

In Obama’s recently published memoir, A Promised Land, even readers who search extensively will be hard-pressed to find any mention of innocent victims or feelings of remorse. Instead, the former U.S. president prefers to defend and whitewash his drone campaign by solely describing those killed as “dangerous.”

“They were dangerous, these young men, often deliberately and casually cruel. Still, in the aggregate, at least, I wanted somehow to save them—send them to school, give them a trade, drain them of the hate that had been filling their heads. And yet the world they were a part of, and the machinery I commanded, more often had me killing them instead,” Obama wrote.

Obama’s self-righteous words do not reflect on-the-ground realities. Leading human rights organizations from all over the world regularly criticized the effects of drone strikes and the massive civilian casualties they caused. Contrary to the imagination of Obama and many other politicians and military officials, drones are not precise weapons that only kill arbitrarily defined “bad guys.” In fact, the vast majority of designated terrorists, like al Qaeda and Taliban leaders, were not killed.

Taliban supreme leader Mullah Mohammad Omar, the target of the very first drone strike in U.S. history in late 2001, was never killed by any of the Predators and Reapers that hunted him. Many years later, in 2013, he died of natural causes not too far from a U.S. military base in southern Afghanistan, with the rest of the world only finding out two years later. It’s a similar story with other figures like Ayman al-Zawahiri of al Qaeda or Jalaluddin Haqqani of the Taliban.

Yet in his memoir Obama ignores these facts while praising “more targeted, nontraditional warfare” and, “unlike some on the left,” as he writes, embracing parts of his predecessor George W. Bush’s controversial counterterrorism doctrine. Compared to the Bush era, drone strikes in Pakistan, Somalia, and Yemen increased tenfold under Obama, while his “kill list,” which he personally signed off on each “Terror Tuesday,” became notorious.

As a former member of Obama’s administration, President-elect Joe Biden might be keen to carry on the drone program, in a manner not too different from that of his onetime boss. In Afghanistan, for example, Biden might continue outgoing President Donald Trump’s withdrawal plans, but it is hard to imagine that clandestine forces and Predators will stop operating and killing people there. It’s also unrealistic to assume that America’s shadow wars in Africa, which increased heavily under both Obama and Trump, will decrease.

However, Biden should rethink the devastating counterterrorism policies of his predecessors. In the eyes of many people around the world, America’s drone war has become a symbol of injustice, oppression, and impunity that led to both global and local radicalization of Muslim populations who are living with the consequences.

Extrajudicial drone strikes have also led to questioning of the rule of law itself and the notion of a presumption of innocence—something that did not exist for Kareem Aluzai and all the other countless drone victims. The rule of law stands as a crowning achievement on which many Western societies pride themselves, but it is undermined by Western leaders championing the concept to justify their foreign misadventures.

The Overmilitarization of American Foreign Policy
By Robert M. Gates

Presidents must be especially wary of mission creep, the gradual expansion of a military effort to achieve new and more ambitious objectives not originally intended. Often, once they have achieved the established objectives, leaders feel emboldened to pursue broader goals. Such overreach is what happened under Clinton after the United States sent troops into Somalia in 1993 to forestall humanitarian disaster and after it overthrew the military dictatorship in Haiti in 1994, and it is what happened under Bush after the United States toppled the Taliban in Afghanistan in 2001 and Saddam in Iraq in 2003.

The consequences of an insufficiently planned military intervention can be devastating. Take, for example, the U.S. intervention in Libya in 2011, which I opposed. Once President Barack Obama decided to go in, the administration made two strategic mistakes. The first was agreeing to expand the original NATO humanitarian mission from simply protecting the people of eastern Libya against the forces of Libyan President Muammar al-Qaddafi to toppling the regime. NATO could have drawn a proverbial line in the sand somewhere between the capital, Tripoli, and the eastern city of Benghazi; a no-fly zone and attacks on Qaddafi’s ground forces could have protected the rebels in the East without destroying the government in Tripoli. Under those circumstances, perhaps some kind of political accommodation could have been worked out.

As I said at the time, Qaddafi had given up his nuclear program and posed no threat to U.S. interests. There is no question he was a loathsome and vicious dictator, but the total collapse of his government allowed more than 20,000 shoulder-fired surface-to-air missiles and countless other weapons from his arsenal to find their way across both Africa and the Middle East, sparked a civil war in 2014 that plunged Libya into years of turmoil, opened the door to the rise of ISIS in the country, and created the opportunity for Russia to claim a role in determining Libya’s future. The country remains in a shambles. As happened in Somalia, Haiti, Afghanistan, and Iraq, expanding the U.S. military mission in Libya beyond the original objective created nothing but trouble.

The second strategic mistake was the Obama administration’s failure to plan in any way for an international role in reestablishing order and a working government post-Qaddafi. (This is ironic in light of Obama’s earlier criticism of Bush’s alleged failure to plan properly for a post-Saddam Iraq.) Drawing on nonmilitary tools, the government could have taken a number of useful steps, including sending a U.S. training mission to help restructure the Libyan army, increasing the advisory role of the UN Support Mission in Libya, helping design a better electoral system that would not have inflamed social and regional divisions, and restraining Egypt and the Gulf states from their meddling in the lead-up to and after the outbreak of the 2014 civil war.

The United States did provide limited assistance to Libya after Qaddafi fell, much of it for treating victims of the fighting and locating weapons stockpiles. A September 2012 Wilson Center report suggested 30 different nonmilitary U.S. programs to help Libya, focusing on areas such as developing a new constitution, building a transparent judicial system, improving financial governance, promoting economic growth, and improving chemical weapons security and destruction. But the U.S. government never put together sufficient funding for these measures, even though their estimated cost, according to the Wilson Center, for the three years between the intervention in 2011 and the beginning of the civil war in 2014 was $230 million. By comparison, the cost of U.S. military operations in Libya between March and October 2011 was about $1 billion. If ever there was a mismatch between the importance of the nonmilitary mission and its available funding, this was it.

There were a number of nonmilitary ways in which the United States (and its allies) might have been able to stop the fighting and help stabilize Libya in the summer and fall of 2011. But there was no plan, no funding, and no desire. Washington’s use of nonmilitary instruments of power, as so often after the Cold War, was hesitant, inadequately funded, and poorly executed. The NATO-Arab coalition bombed Libya and then just went home, leaving Libyans to fight over the ruins and thus creating another source of instability in the region and a new base for terrorists. Obama himself supplied the harshest judgment about the intervention, characterizing the failure to plan for a post-Qaddafi Libya as the worst mistake of his presidency.

What is so striking about the overmilitarization of the period following the Cold War is just how much U.S. policymakers failed to learn the lessons of the seven previous decades. One of the United States’ greatest victories of the twentieth century relied not on military might but on subtler tools of power. The Cold War took place against the backdrop of the greatest arms race in history, but there was never actually a significant direct military clash between the two superpowers—despite proxy wars in Korea, Vietnam, and elsewhere. Indeed, most historians calculate that fewer than 200 U.S. troops died due to direct Soviet action. Because nuclear weapons would have made any war between the two countries catastrophic for both sides, the U.S.-Soviet contest was waged through surrogates and, crucially, through the use of nonmilitary instruments of power.

Most of those instruments have withered or been abandoned since the end of the Cold War. But as the great powers today expand and modernize their militaries, if the United States is smart, and lucky, the long competition ahead with China, in particular, will play out in the nonmilitary arena. Those nonmilitary instruments must be revived and updated.

Reviving and restructuring U.S. development assistance is all the more urgent in light of China’s Belt and Road Initiative and its other efforts to bring developing countries into its orbit. The establishment, in 2019, of the U.S. International Development Finance Corporation, an independent government agency that helps finance private-sector investment in development projects, was a good start to expanding U.S. efforts to encourage private investment in developing countries. China may be able to loan billions of dollars to countries, but the United States has a vastly more powerful private sector that can not only invest in but also select economically viable projects that will truly serve the long-term interests of the recipient countries. The United States is well practiced in the art of economic punishment, but it needs to get a lot smarter about using economic tools to win over other countries.

In addition, if the United States wants to compete effectively with authoritarian governments, it will have to overhaul its public messaging. The current effort is an embarrassment. Many entities have a hand in strategic communications, including the White House, the State Department, the Defense Department, the Treasury Department, the CIA, and the U.S. Agency for Global Media, but for the most part, each goes its own way. The result is many lost opportunities. The United States has failed to appeal to the nationalist sentiments of people in Europe and elsewhere to resist Chinese and Russian efforts to interfere in the internal affairs of their countries. U.S. policymakers have also done a lousy job communicating to the rest of the world the scale and impact of U.S. development assistance and humanitarian assistance programs, including programs that have benefited people ruled by enemy governments. Who knew, for example, that in 1999, during the North Korean famine, the United States provided more food aid than the rest of the world combined and three times what China offered? The United States needs to trumpet its foreign aid, to act less like a monastic order and more like Madison Avenue.

Strengthening the nonmilitary tools of U.S. foreign policy would advance U.S. national interests and create new, more cost-effective, and less risky ways to exercise American power and leadership internationally. Americans want the strongest military in the world, but they want it used sparingly and only when vital national interests are at stake. Across the political spectrum, there is a belief that post–Cold War presidents have turned too often to the military to resolve challenges abroad. The United States must always be prepared to defend its interests, but in order to revive domestic support for the United States’ global leadership role, U.S. leaders must exercise greater restraint in sending the world’s finest military into combat. It should not be the mission of the U.S. military to try to shape the future of other countries. Not every outrage, every act of aggression, every oppression, or every crisis should elicit a U.S. military response.

Finally, most Americans want their country to stand for something beyond just military strength and economic success. They want it to be seen admiringly by others as the world’s strongest advocate for liberty. In formulating a foreign policy that the American public will support, U.S. leaders should recognize that it is important to use every nonmilitary instrument of power possible to encourage both friends and rivals to embrace freedom and reform, because those objectives serve the U.S. national interest. With restructuring and more resources, Washington’s nonmilitary instruments can contribute to a remarkable symphony of power.

The Can-Do Power
By Samantha Power

Restoring American leadership, accordingly, must include the more basic task of showing that the United States is a capable problem solver once more.

The new administration will rightly give precedence to problem solving at home—ending the pandemic, jump-starting an equitable economic recovery, and reforming fraying democratic institutions. Biden has said he plans to pull the country out of the current crisis by “building back better” in a way that confronts economic inequality, systemic racism, and climate change. Yet major structural changes will take time. The Biden administration should therefore also pursue foreign policy initiatives that can quickly highlight the return of American expertise and competence. Here, Biden should emphasize policies that provide clear, simultaneous benefits at home; meet a critical and felt need abroad; are highly visible; and—the missing ingredient in so many U.S. foreign policy endeavors of late—produce tangible outcomes. This means less rhetorical emphasis on the abstract cause of “the liberal international order” and more practical demonstrations of the United States’ distinctive ability to deliver on issues that matter right now in the lives of hundreds of millions of people.

In 2009, the last time Biden entered the executive branch, those of us who were part of the Obama administration confronted analogous concerns stemming from the disastrous Iraq war and the United States’ responsibility for the global financial crisis. President Barack Obama took steps similar to those Biden has promised: moving to rejoin UN bodies and pay UN dues, banning unethical practices such as torture, repairing the damage to alliances caused by the invasion of Iraq, and proclaiming, “We are ready to lead once more.” But while these moves were necessary and helpful in generating international goodwill, my own impression—as someone who served through all eight years of that administration—is that the United States’ stock didn’t hit its peak until 2014–15, when confidence in U.S. leadership was boosted by a succession of visible results.

In that period, Obama mobilized over 62 countries to eradicate the Ebola virus in West Africa, deploying health-care workers, building Ebola treatment units, and dispatching labs to carry out rapid testing. U.S. nuclear experts negotiated innovative ideas for blocking Iran’s pathways to a nuclear weapon, and U.S. diplomats rallied the support of China, Russia, and other major powers to back a deal built on those ideas. U.S. scientists and diplomats leveraged immense national climate expertise and political capital to secure an agreement in Paris that included commitments to reduce emissions and take other steps to mitigate climate change from almost every country in the world. By the end of 2015, while walking the halls of the UN as the U.S. ambassador and interacting with my counterparts, I encountered a palpably higher level of faith in the United States and eagerness to partner with us than two years prior.

After the 2004 Indian Ocean tsunami, the 2005 earthquake in Pakistan, and the 2011 tsunami in Japan, high-profile U.S. disaster relief almost immediately boosted favorable opinions of the United States in those places. A more enduring example of this impact can be seen in the realm of public health, where the President’s Emergency Plan for AIDS Relief, President George W. Bush’s signature international initiative, has provided antiretroviral treatment to more than 16 million people, including 700,000 children. In addition to its enormous humanitarian benefits, PEPFAR has made a significant and lasting impact on the attitudes of foreign publics toward the United States—producing a “substantial” increase in U.S. standing among nations that have participated in the program, according to a 2014 academic study. Indeed, when Bush left office in 2009, Gallup measured his approval rating at 34 percent in the United States—and 73 percent among African nations.

The New Anti-Americanism
By Richard Wike

In 2007, the median percentage of respondents who said they had confidence in Bush to do the right thing in world affairs was 21 percent across seven European nations regularly surveyed by Pew: France, Germany, Italy, Poland, Spain, Sweden, and the United Kingdom. In the 2019 survey, the same percentage expressed confidence in Trump, compared to 79 percent who said they were confident in Obama in 2016. And the Trump era decline isn’t limited to Europe: across 24 countries surveyed during the final two years of Obama’s presidency, a median of 74 percent of respondents said they had confidence in Obama to do the right thing in world affairs. Looking at these same 24 countries, just 31 percent said the same about Trump in 2019. The median percentage (meaning that half the countries were above this percentage and half were below) with a favorable opinion of the United States dropped from 64 percent to 53 percent over the same time period.

The worries driving negative global attitudes toward the United States are different now than they were during Bush’s presidency. When anti-Americanism reached its high point during the Bush administration, the United States was seen as an unchecked superpower, unilaterally pursuing its interests, and unconstrained by the international norms and institutions it had played the lead role in constructing. In the Trump era, by contrast, critics are less concerned about the exercise of unrivaled U.S. power than they are about a U.S. retreat—from both global leadership and liberal democracy.

Rattled by the lingering effects of the 2007–08 financial crisis, exhausted by wars in Iraq and Afghanistan, and challenged by the “rise of the rest,” the United States is now widely seen as a fading former hegemon, uninterested in global challenges and in danger of being eclipsed by China. In an era of anxiety about the creaking liberal world order, much of the rest of the world wants American engagement and leadership—but sees the United States turning inward instead.

Gone are the days when critics assailed the United States for trying to be the world’s policeman. Now they worry about a disengaged superpower thinking only of “America first.”

Global opposition to American power has generally come at moments when the United States seemed inclined to project military might around the world with little restraint. For example, the Vietnam War and President Ronald Reagan’s deployment of intermediate-range missiles to Germany both generated considerable opposition. More recently, the invasion of Iraq provoked an outcry around the world at a time when U.S. foreign policy seemed to reflect a belief in Washington that the United States could do what it wished and ignore the concerns of even its closest allies.

By 2007, the United States’ image had been severely tarnished in many parts of the world. Comparing polling data from that year and 2002, the year before the invasion of Iraq, the share of the public with a positive opinion of the United States fell in 26 out of 33 countries surveyed in both years by Pew. It rose in only five, and stayed about the same in two. The 2007 survey found a median of just 41 percent across a total of 47 countries expressing support for U.S.-led efforts to fight terrorism. In an earlier Pew Global Attitudes survey, in 2004, about half or more of Jordanians and Pakistanis, as well as 40 percent or more of French and Germans, said the war on terrorism was a smokescreen for a campaign against unfriendly Muslim governments and groups. Majorities in predominantly Muslim nations consistently said the United States could someday be a military threat to their country. And into the Obama presidency, drone strikes against terrorist organizations and leaders generated widespread opposition. Across 44 countries surveyed by Pew in 2014, 74 percent opposed U.S. drone strikes that targeted extremists in countries such as Pakistan, Yemen, and Somalia. (Still, a Pew survey the very next year offered a reminder that hard power isn’t always unwelcome: a 39-nation median of 62 percent supported U.S. military action against the Islamic State.)

Across 25 countries polled in 2018, a median of 51 percent said the U.S. government respects personal freedom, while a median of 37 percent said it does not. But in many nations, the share who believe the United States respects individual liberty has declined in recent years. Ratings on this measure are down significantly in the Trump era, although in many cases the slide began during the Obama presidency, coinciding with revelations about NSA eavesdropping and other similar stories. This decline has been especially steep in Europe. In 2013, for example, 81 percent of Germans said the United States respects personal freedom, compared with just 35 percent in the 2019 poll.

Yet it is worth remembering that the United States’ image has bounced back before. It’s clear today that people haven’t given up on the United States. In 2018, Pew asked respondents in 25 nations whether they would rather live in a world with the United States or China as the top superpower. A median of 63 percent preferred the United States, while just 19 percent preferred China. People recognize that the world is changing, but they still want the United States to have a prominent place in it. Even if the international system is reeling, many elements of it remain quite popular—and people are still looking for leadership from the increasingly complicated superpower that built it.

Rogue Superpower
By Michael Beckley

Thanks to the U.S.-led order, for decades, most countries have not had to fight for market access, guard their supply chains, or even seriously defend their borders. The U.S. Navy has kept international waterways open, the U.S. market has provided reliable consumer demand and capital for dozens of countries, and U.S. security guarantees have covered nearly 70 nations. Such assurances have benefited everyone: not just Washington’s allies and partners but also its adversaries. U.S. security guarantees had the effect of neutering Germany and Japan, the main regional rivals of Russia and China, respectively. In turn, Moscow and Beijing could focus on forging ties with the rest of the world rather than fighting their historical enemies. Without U.S. patronage and protection, countries would have to get back in the business of securing themselves and their economic lifelines.

Such a world would see the return of great-power mercantilism and new forms of imperialism. Powerful countries would once again try to reduce their economic insecurity by establishing exclusive economic zones, where their firms could enjoy cheap and secure access to raw materials and large captive consumer markets. Today, China is already starting to do this with its Belt and Road Initiative, a network of infrastructure projects around the world; its “Made in China 2025” policy, to stimulate domestic production and consumption; and its attempts to create a closed-off, parallel Internet. If the United States follows suit, other countries will have to attach themselves to an American or a Chinese bloc—or forge blocs of their own. France might seek to restore its grip on its former African colonies. Russia might accelerate its efforts to corral former Soviet states into a regional trade union. Germany increasingly would have to look beyond Europe’s shrinking populations to find buyers for its exports—and it would have to develop the military capacity to secure those new far-flung markets and supply lines, too.

As great powers competed for economic spheres, global governance would erode. Geopolitical conflict would paralyze the UN, as was the case during the Cold War. NATO might dissolve as the United States cherry-picked partners. And the unraveling of the U.S. security blanket over Europe could mean the end of the European Union, too, which already suffers from deep divisions. The few arms control treaties that remain in force today might fall by the wayside as countries militarized to defend themselves. Efforts to combat transnational problems—such as climate change, financial crises, or pandemics—would mimic the world’s shambolic response to COVID-19, when countries hoarded supplies, the World Health Organization parroted Chinese misinformation, and the United States withdrew into itself.

The resulting disorder would jeopardize the very survival of some states. Since 1945, the number of countries in the world has tripled, from 46 to nearly 200. Most of these new states, however, are weak and lack energy, resources, food, domestic markets, advanced technology, military power, or defensible borders. According to research by the political scientist Arjun Chowdhury, two-thirds of all countries today cannot provide basic services to their people without international help. In short, most countries depend critically on the postwar order, which has offered historically unprecedented access to international aid, markets, shipping, and protection. Without such support, some countries would collapse or be conquered. Fragile, aid-dependent states such as Afghanistan, Haiti, and Liberia are only some of the most obvious high-risk cases. Less obvious ones are capable but trade-dependent countries such as Saudi Arabia, Singapore, and South Korea, whose economic systems would struggle to function in a world of closed markets and militarized sea-lanes.

The Endangered Asian Century
By Lee Hsien Loong

The United States has always had vital national interests in this region. It expended blood and treasure fighting the Pacific War to defeat Japan, a war in which the United States nearly lost three future presidents. It fought two costly wars in Korea and Vietnam, which bought precious time for noncommunist countries in Asia to consolidate their societies and economies and win the battle of hearts and minds against communism.

The United States’ generous, open policies that have so greatly benefited the Asia-Pacific derived from deep-rooted political ideals and its self-image as “a city upon a hill” and “a light unto the nations,” but they also reflected its enlightened self-interest. A stable and prospering Asia-Pacific was first a bulwark against the communist countries in the Cold War and then an important region of the world comprising many stable and prosperous countries well disposed toward the United States. To U.S. businesses, the Asia-Pacific offered sizable markets and important production bases. Unsurprisingly, several of the United States’ staunchest allies are in Asia, such as Australia, Japan, and South Korea, and so are some of its long-standing partners, such as Singapore.

China has vital interests in the region, too. In Northeast Asia, the Second Sino-Japanese War and the Korean War still cast long shadows. In Southeast Asia, China sees a source of energy and raw materials, economic partners, and important sea lines of communication. It also sees chokepoints in the Strait of Malacca and the South China Sea that must be kept open to protect China’s energy security. But one critical difference with the United States is that China sees the Asia-Pacific as its “near abroad,” to borrow a Russian expression, and thus as essential to its own security.

Chinese President Xi Jinping has said that the Pacific Ocean is big enough to accommodate both the United States and China. But he has also said that Asian security should be left to Asians. A natural question arises: Does Xi think that the Pacific Ocean is big enough for the United States and China to coexist peacefully, with overlapping circles of friends and partners, or that it is big enough to be divided down the middle between the two powers, into rival spheres of influence? Singapore and other Asia-Pacific countries have no doubt which interpretation they prefer. Although they may not have much influence over how things will turn out, they fervently hope not to be forced to choose between the United States and China.

The U.S. security presence remains vital to the Asia-Pacific region. Without it, Japan and South Korea would be compelled to contemplate developing nuclear weapons; both are nuclear threshold states, and the subject already regularly surfaces in their public discourse, especially given North Korea’s growing nuclear weapons capabilities. Such developments are fortunately still hypothetical, but their prospect is conducive neither to stability in Northeast Asia nor to nonproliferation efforts globally.

In Southeast Asia, the U.S. Seventh Fleet has contributed to regional security since World War II, ensuring that sea lines of communication remain safe and open, which has enabled trade and stimulated economic growth. Despite its increasing military strength, China would be unable to take over the United States’ security role. Unlike the United States, China has competing maritime and territorial claims in the South China Sea with several countries in the region, which will always see China’s naval presence as an attempt to advance those claims.

How China Sees the World
By H. R. McMaster

Since the heady days of Deng Xiaoping, in the late 1970s, the assumptions that had governed the American approach to our relationship with China were these: After being welcomed into the international political and economic order, China would play by the rules, open its markets, and privatize its economy. As the country became more prosperous, the Chinese government would respect the rights of its people and liberalize politically. But those assumptions were proving to be wrong.

China has become a threat because its leaders are promoting a closed, authoritarian model as an alternative to democratic governance and free-market economics. The Chinese Communist Party is not only strengthening an internal system that stifles human freedom and extends its authoritarian control; it is also exporting that model and leading the development of new rules and a new international order that would make the world less free and less safe. China’s effort to extend its influence is obvious in the militarization of man-made islands in the South China Sea and the deployment of military capabilities near Taiwan and in the East China Sea. But the integrated nature of the Chinese Communist Party’s military and economic strategies is what makes it particularly dangerous to the United States and other free and open societies.

The Forbidden City was the perfect backdrop for Xi to showcase his determination to “move closer to the center of the world stage and to make a greater contribution to humankind.”

The Forbidden City was built during the Ming dynasty, which ruled China from 1368 to 1644—a period considered to be a golden age in terms of China’s economic might, territorial control, and cultural achievements. It was during this dynasty that Zheng He, an admiral in the Ming fleet, embarked on seven voyages around the Western Pacific and Indian Oceans, more than half a century before Christopher Columbus set sail. His “treasure ships,” among the largest wooden vessels ever built, brought back tribute from all parts of the known world. But despite the success of the seven voyages, the emperor concluded that the world had nothing to offer China. He ordered the treasure ships scuttled and Chinese ports closed. The period that followed—the 19th and 20th centuries in particular—is seen by Xi and others in the leadership as an aberrational period during which European nations and, later, the United States achieved economic and military dominance.

While the images broadcast to China and the rest of the world from the Forbidden City during our visit were meant to project confidence in the Chinese Communist Party, one could also sense a profound insecurity—a lesson of history that went unmentioned. In its very design, the Forbidden City seemed to reflect that contrast between outward confidence and inner apprehension. The three great halls at the city’s center were meant not only to impress, but also to defend against threats that might come from both outside and inside the city’s walls. After the end of the Han dynasty, in AD 220, China’s core provinces were ruled only half the time by a strong central authority. And even then, China was subject to foreign invasion and domestic turmoil. The Yongle emperor, Zhu Di, who built the Forbidden City, was more concerned about internal dangers than he was about the possibility of another Mongol invasion. To identify and eliminate opponents, the emperor set up an elaborate spy network. To preempt opposition from scholars and bureaucrats, he directed the executions of not only those suspected of disloyalty, but also their entire families. The Chinese Communist Party used similar tactics centuries later. Like Xi, the emperors who sat on the elaborate throne in the heart of the Forbidden City practiced a remote and autocratic style of rule vulnerable to corruption and internal threats.

The fears and ambitions are inseparable. They explain why the Chinese Communist Party is obsessed with control—both internally and externally.

The repressive and manipulative policies in Tibet, with its Buddhist majority, are well known. The Catholic Church and, in particular, the fast-growing Protestant churches are of deep concern to Xi and the party. Protestant churches have proved difficult to control, because of their diversity and decentralization, and the party has forcefully removed crosses from the tops of church buildings and even demolished some buildings to set an example. Last year, Beijing’s effort to tighten its grip on Hong Kong sparked sustained protests that continued into 2020—protests that Chinese leaders blamed on foreigners, as they typically do. In Xinjiang, in northwestern China, where ethnic Uighurs mainly practice Islam, the party has forced at least 1 million people into concentration camps. (The government denies this, but last year The New York Times uncovered a cache of incriminating documents, including accounts of closed-door speeches by Xi directing officials to show “absolutely no mercy.”)

The party’s efforts to exert control inside China are far better known than its parallel efforts beyond China’s borders. Here again, insecurity and ambition are mutually reinforcing. Chinese leaders aim to put in place a modern-day version of the tributary system that Chinese emperors used to establish authority over vassal states. Under that system, kingdoms could trade and enjoy peace with the Chinese empire in return for submission. Chinese leaders are not shy about asserting this ambition. In 2010, China’s foreign minister matter-of-factly told his counterparts at a meeting of the Association of Southeast Asian Nations: “China is a big country, and you are small countries.” China intends to establish a new tributary system through a massive effort organized under three overlapping policies, carrying the names “Made in China 2025,” “Belt and Road Initiative,” and “Military-Civil Fusion.”

The Belt and Road Initiative has created a common pattern of economic clientelism. Beijing first offers countries loans from Chinese banks for large-scale infrastructure projects. Once the countries are in debt, the party forces their leaders to align with China’s foreign-policy agenda and the goal of displacing the influence of the United States and its key partners. Although Chinese leaders often depict these deals as win-win, most of them have just one real winner.

For developing countries with fragile economies, Belt and Road sets a ruthless debt trap. When some countries are unable to service their loans, China trades debt for equity to gain control of their ports, airports, dams, power plants, and communications networks. As of 2018, the risk of debt distress was growing in 23 countries with Belt and Road financing. Eight poor countries with Belt and Road financing—Pakistan, Djibouti, the Maldives, Laos, Mongolia, Montenegro, Tajikistan, and Kyrgyzstan—already have unsustainable levels of debt.

China’s tactics vary based on the relative strength or weakness of the target states. Many countries with weak political institutions succumb to corruption when undertaking large-scale investment projects, making them even more vulnerable to Chinese tactics.

In Sri Lanka, the longtime president and current prime minister, Mahinda Rajapaksa, incurred debts far beyond what his nation could bear. He agreed to a series of high-interest loans to finance Chinese construction of a port, though there was no apparent need for one. Despite earlier assurances that the port would not be used for military purposes, a Chinese submarine docked there the same day as Japanese Prime Minister Shinzo Abe’s visit to Sri Lanka in 2014. In 2017, following the commercial failure of the port, Sri Lanka was forced to hand it over to a Chinese state-owned enterprise on a 99-year lease in a debt-for-equity swap.

The new vanguard of the Chinese Communist Party is a delegation of bankers and party officials with duffel bags full of cash. Corruption enables a new form of colonial-like control that extends far beyond strategic shipping routes in the Indian Ocean and the South China Sea.

The Chinese Communist Party has also pursued a broad range of influence efforts in order to manipulate political processes in target nations. Sophisticated Chinese efforts have been uncovered in Australia and New Zealand to buy influence within universities, bribe politicians, and harass the Chinese diaspora community into becoming advocates for Beijing.

Any strategy to reduce the threat of China’s aggressive policies must be based on a realistic appraisal of how much leverage the United States and other outside powers have over the internal evolution of China. The influence of those outside powers has structural limits, because the party will not abandon practices it deems crucial to maintaining control. But we do have important tools, quite apart from military power and trade policy.

For one thing, those “Western liberal” qualities that the Chinese see as weaknesses are actually strengths. The free exchange of information and ideas is an extraordinary competitive advantage, a great engine of innovation and prosperity. (One reason Taiwan is seen as such a threat to the People’s Republic is that it provides a small-scale yet powerful example of a successful political and economic system that is free and open rather than autocratic and closed.) Freedom of the press and freedom of expression, combined with robust application of the rule of law, have exposed China’s predatory business tactics in country after country—and shown China to be an untrustworthy partner. Diversity and tolerance in free and open societies can be unruly, but they reflect our most basic human aspirations—and they make practical sense too. Many Chinese Americans who remained in the United States after the Tiananmen Square massacre were at the forefront of innovation in Silicon Valley.

The Age of Great-Power Competition
By Elbridge A. Colby and A. Wess Mitchell

As China became pivotal to global commerce, it did not so much change its discriminatory economic practices—forced technology transfers, mandatory joint ventures, and outright intellectual property theft—as cement them. It complemented this with a military buildup of historic scale, aimed specifically at dominating Asia and, in the long run, at projecting power throughout the world, and with a massive effort to expand its influence through the Belt and Road Initiative and related projects. Russia, meanwhile, rebuilt its military, invaded Georgia, annexed Crimea, initiated a festering insurgency in eastern Ukraine, and began a systematic campaign to resurrect its military, economic, and diplomatic influence in Africa, Latin America, and the Middle East.

And yet most people in Washington long refused to acknowledge the new reality. Instead, American leaders continued to herald an “era of engagement” with Moscow and talked up Beijing’s potential as a “responsible stakeholder” in the international system. The former found expression in the “reset” with Russia in 2009, just months after Moscow’s invasion of Georgia, and the latter took the form of repeated efforts to deepen relations with Beijing and even an aspiration among some to establish a U.S.-Chinese “G-2” to lead the international community. But China’s brazen militarization of islets in the South China Sea and its increasing assertiveness farther afield eventually forced Washington to reevaluate its assumptions about Beijing, and Russia’s seizure of Crimea in 2014 put to rest what was left of the so-called reset. By the end of the Obama administration, it was clear that the United States’ course was seriously off.

The resulting policy changes were no exercise in American strategic foresight; they were reactive, ex post facto adjustments. Considerable damage had already been done. Prizing the appearance of stability over the pursuit of definable national interests, the United States had for years ignored China’s flagrant theft of U.S. intellectual property—not to mention government secrets—and Beijing’s slow-motion takeover attempt in the South China Sea. In the hopes of recruiting Russia as a partner in upholding an international status quo that Russian President Vladimir Putin manifestly disdained, Washington had courted and unwittingly emboldened the Kremlin on its path of territorial revision while unnerving frontline NATO allies in eastern Europe. The cost for the United States was steep, with allies in East Asia and Europe beginning to doubt that Washington was willing to stand up for itself, let alone for them.

… the United States is entering what is likely to be a protracted struggle over who will decide how the world works in the twenty-first century. The coming era will be less forgiving of hubris and unpreparedness than were the circumstances of the recent past. Recognition of that has prompted a long-overdue reassessment of U.S. military, economic, and diplomatic priorities, which future administrations will need to carry forward.

Doing so will require painful tradeoffs and sacrifices. It will mean relinquishing old dreams of unfettered military dominance and ill-suited weapons platforms and asking greater material contributions of U.S. allies. It will also mean sharpening the U.S. technological edge in strategically relevant sectors without undermining the American commitment to international free trade and focusing much more rigorously on Asia and Europe at the expense of other regions. Returning to the somnolent complacency of years past—when the United States assumed the best intentions of its rivals, maintained economic policies that often undercut its national security, and masked dangerous shortcomings among its allies in the name of superficial political unity—is not an option. Neither is withdrawing in the hopes of sitting out geopolitical competition altogether. As in the past, the United States can guarantee its own security and prosperity as a free society only if it ensures favorable balances of power where they matter most and systematically prepares its society, economy, and allies for a protracted competition against large, capable, and determined rivals that threaten that aim.

The Next Liberal Order
By G. John Ikenberry

For guidance, today’s leaders should look to the example of U.S. President Franklin Roosevelt. The collapse of the world economy and the rapid spread of fascism and totalitarianism in the 1930s showed that the fates of modern societies were tied to one another and that all were vulnerable to what Roosevelt, using a term that seems eerily prescient today, called “contagion.” The United States, Roosevelt and his contemporaries concluded, could not simply hide within its borders; it would need to build a global infrastructure of institutions and partnerships. The liberal order they went on to build was less about the triumphant march of liberal democracy than about pragmatic, cooperative solutions to the global dangers arising from interdependence. Internationalism was not a project of tearing down borders and globalizing the world; it was about managing the growing complexities of economic and security interdependence in the pursuit of national well-being. Today’s liberal democracies are the bankrupt heirs to this project, but with U.S. leadership, they can still turn it around.

In the face of today’s breakdown in world order, the United States and other liberal democracies must reclaim and update Roosevelt’s legacy. As a start, this means learning the right lessons about the failures of the liberal international order in the past three decades. Ironically, it was the success of the U.S.-led order that sowed the seeds of the current crisis. With the collapse of the Soviet Union, the last clear alternative to liberalism disappeared. As the liberal order grew from being one-half of a bipolar system to a truly global order, it began to fragment, in part because it no longer resembled a club. Indeed, today’s liberal international order looks more like a sprawling shopping mall: states can wander in and pick and choose what institutions and regimes they want to join. Security cooperation, economic cooperation, and political cooperation have become unbundled, and their benefits can be obtained without buying into a suite of responsibilities, obligations, and shared values. These circumstances have allowed China and Russia to cooperate with the liberal system on an opportunistic, ad hoc basis. To name just one example, membership in the World Trade Organization has given China access to Western markets on favorable terms, but Beijing has not implemented significant measures to protect intellectual property rights, strengthen the rule of law, or level the playing field for foreign companies in its own economy.

There simply is no other major state—rising, falling, or muddling through—that can galvanize the world around a vision of open, rules-based multilateral cooperation. China will be powerful, but it will tilt the world away from democratic values and the rule of law. The United States, for its part, needed the partnership of other liberal states even in earlier decades, when it was more capable. Now, as rival states grow more powerful, Washington needs these partnerships more than ever. If it continues to disengage from the world or engages in it only as a classic great power, the last vestiges of the liberal order will disappear.

Chained to Globalization
By Henry Farrell and Abraham L. Newman

In 1999, the columnist Thomas Friedman pronounced the Cold War geopolitical system dead. The world, he wrote, had “gone from a system built around walls to a system increasingly built around networks.” As businesses chased efficiency and profits, maneuvering among great powers was falling away. An era of harmony was at hand, in which states’ main worries would be how to manage market forces rather than one another.

Friedman was right that a globalized world had arrived but wrong about what that world would look like. Instead of liberating governments and businesses, globalization entangled them. As digital networks, financial flows, and supply chains stretched across the globe, states—especially the United States—started treating them as webs in which to trap one another. Today, the U.S. National Security Agency lurks at the heart of the Internet, listening in on all kinds of communications. The U.S. Department of the Treasury uses the international financial system to punish rogue states and errant financial institutions. In service of its trade war with China, Washington has tied down massive firms and entire national economies by targeting vulnerable points in global supply chains. Other countries are in on the game, too: Japan has used its control over key industrial chemicals to hold South Korea’s electronics industry for ransom, and Beijing might eventually be able to infiltrate the world’s 5G communications system through its access to the Chinese telecommunications giant Huawei.

Globalization, in short, has proved to be not a force for liberation but a new source of vulnerability, competition, and control; networks have proved to be less paths to freedom than new sets of chains. Governments and societies, however, have come to understand this reality far too late to reverse it. In the past few years, Beijing and Washington have been just the most visible examples of governments recognizing how many dangers come with interdependence and frantically trying to do something about it. But the economies of countries such as China and the United States are too deeply entwined to be separated—or “decoupled”—without causing chaos. States have little or no ability to become economically self-reliant. Hawks in Beijing and Washington may talk about a new Cold War, but there is today no way to split the world into competing blocs. Countries will remain entangled with one another, despite the dangers that their ties produce—bringing a new era of what might be called “chained globalization.” Under chained globalization, states will be bound together by interdependence that will tempt them to strangle their competitors through economic coercion and espionage, even as they try to fight off their rivals’ attempts to do the same.

As the world’s economic and information networks expanded, many of them coalesced around single points of control, and some states learned to wield those hubs as weapons against their competitors.

Among the first networks to undergo such a transformation was the system underpinning international financial transactions. In the 1970s, the Society for Worldwide Interbank Financial Telecommunication (SWIFT) network made it easier to route transactions through banks around the world, and the dollar clearing system allowed those banks to reconcile torrents of payments denominated in U.S. dollars. Once both banks and individuals had accepted this new messaging system, international exchanges became even more dependent on a single currency—the U.S. dollar—granting Washington additional leverage over the global financial system. International supply chains were next. In the 1980s and 1990s, electronics manufacturers began to outsource production to specialized firms such as Foxconn, creating supply chains with tens or even hundreds of suppliers. Then, in the first decade of this century, cloud computing began to centralize key functions of the Internet in systems maintained by a few large firms, such as Amazon and Microsoft. In each case, money, goods, and information passed through essential economic hubs. A few privileged powers ruled over those hubs, gaining the chance to exclude others or to spy on them.

The United States saw those opportunities before most other countries did, thanks to the fact that so many networks lay within its reach. Since the attacks of September 11, 2001, the Treasury Department has used the world’s reliance on the U.S. dollar to turn the global financial system into a machinery of control, freezing out rogue actors such as al Qaeda and North Korea and using the threat of sanctions to terrify banks into advancing its goals. The National Security Agency has transformed the Internet into an apparatus of global surveillance by tapping into the networks of telecommunications providers such as AT&T and Verizon and running clandestine programs that can identify communications chokepoints and exploit them against both adversaries and allies.

Until recently, other states struggled to keep up. China, a latecomer to the globalized economy, could respond to perceived slights only by locking transgressors out of its valuable domestic market. And although the European Union played a significant role in global economic networks, it lacked the kind of centralized institutions, such as the U.S. Treasury Department’s Office of Foreign Assets Control, that Washington had been able to convert into instruments of power.

Driven by both fear and opportunism, however, China is now insulating itself from networked attacks and building networks of its own to turn against its rivals. Take Huawei, which seeks to build the world’s 5G communications network with the tacit support of Beijing. If Huawei comes to dominate global 5G, the Chinese government could exploit its access to the firm to tap into communications around the world, using its new powers over the network against its rivals. Or to put it another way: China could do to the United States what the United States has already been doing to China.

Jack Ma Blasts Global Financial Regulators’ Curbs on Innovation
By Bloomberg News

Alibaba Group founder Jack Ma criticized global financial regulations for stifling innovation and urged China to seek a system that accommodated development.

“After the Asian financial crisis, the risk control highlighted in the Basel Accords has been” the priority for regulators, Ma said at the Bund Summit in Shanghai on Saturday. Now the world “only focuses on risk control, not on development, and rarely do they consider opportunities for young people and developing countries.”

The Basel Accords, which Ma likened to a club for the elderly, are used to solve problems for financial systems that have been operating for decades, he said. China, however, is still a “youth” and needs more innovation to build an ecosystem for the healthy development of the local industry, according to Ma.

Digital currencies may play an important role in building the type of financial system that will be needed in the next 30 years, Ma said.

“Digital currency could create value and we should think about how to establish a new type of financial system through digital currency,” said Ma.

Ma’s fintech giant Ant Group Co. plans initial public offerings in both Shanghai and Hong Kong. Ma said that the firm set the price of its Shanghai listing on Friday, without providing details. The deal is one of the most hotly anticipated IPOs in years, on course to make history by surpassing Saudi Aramco’s record $29 billion share sale in 2019.

Analysis: Xi’s message to Jack Ma, ‘You’re nothing but a cloud’
By Katsuji Nakazawa

As a member of the ruling Chinese Communist Party, Ma is required to show absolute loyalty to the system.

But the 56-year-old entrepreneur made remarks deemed to challenge the intentions of the party’s Central Committee, led by President Xi Jinping.

Speaking at a financial conference in Shanghai on Oct. 24, the outspoken billionaire criticized Chinese regulators, saying the country’s old financial regulations are a drag on technological innovation.

“China does not have a systemic financial risk problem,” Ma said at the Bund Summit. “Chinese finance basically does not carry risk; rather, the risk comes from lacking a system.

“Good innovation is not afraid of regulation but is afraid of outdated regulation. We shouldn’t use the way to manage a train station to regulate an airport.”

Ma also took aim at Chinese banks, saying they operate with a “pawnshop” mentality.

Ma was especially daring given that he was speaking in front of Wang Qishan, the vice president who for years has been at the center of China’s financial administration.

Ma’s criticism was also directed at Vice Premier Liu He, a Xi aide who oversees China’s macroeconomic policy. His remarks, naturally, were reported to the top echelons.

Ma has pioneered e-commerce and cashless commerce. But his remarks were careless, and he ended up stepping on the tiger’s tail.

The day after Xinhua used Higashiyama’s painting to make a point, the planned initial public offering of Alibaba’s financial arm, Ant Group, in Shanghai and Hong Kong was shelved.

China Clampdown on Big Tech Puts More Billionaires on Notice
By Zheping Huang and Coco Liu

“The Party is faced with the conflicting desires to empower domestic tech companies to be internationally competitive, while keeping their market activities firmly under control at home,” said Kendra Schaefer, head of digital research at the Trivium China consultancy in Beijing. “The horizontal spread of Chinese big tech makes anti-monopoly regulation that much more urgent for Chinese regulators.”

One Senator’s Strategy for Containing Chinese Technological Dominance
By Greg Ip

When Mark Warner was in the telecommunications business in the 1980s and 1990s, he didn’t think much about how U.S. rules and standards shaped the global use of technology—it was a given. “I never appreciated how much we set the standards on almost every technology and innovation, even if not invented in America,” Mr. Warner said in an interview this week. “We flooded the zone with engineers. We had the best schools, we had most of the companies. It got built in as an assumed advantage, and we kind of got lazy about it.”

Today, as the top Democrat on the Senate Intelligence Committee, he sees China’s erosion of that technological advantage as an existential threat to American values at home and abroad.

In a speech last week he confessed that he once thought China would liberalize economically and politically by integrating with the world, and that technological innovation could only flourish in free societies. “Like so many, I was wrong,” he bluntly told the National Democratic Institute. “Instead, China has shown that the development and use of cutting-edge technology and economic expansion are, indeed, possible within authoritarian-state capitalism.”

He describes China’s strategy thus: First, allow domestic companies to fight it out until a national champion emerges and nurture it by keeping out foreign competition. Use state resources to expand into foreign markets. Then target the international bodies that set standards that allow differing manufacturers’ equipment to work together, from mobile phones to artificial intelligence and facial recognition. In this way, he said, Chinese instead of American values will become embedded in the way the world uses technology.

Chinese Dissident Artist Ai Weiwei On Masks, Uighurs and Democracy
By Heather Chen

When we talk about the coronavirus, we have to talk about China. What do you think about the Chinese government’s handling of the outbreak that emerged in Wuhan?

China clearly mishandled the medical information and intentionally tried to block the truth at the early stages of the coronavirus outbreak. This crucial period allowed the virus to spread all around the world; these are the facts. What has not been established as fact is how the coronavirus came about. Is it from nature? Or is it man-made? Also, how many people have actually become infected within China, and what is the true death toll? As one of the largest economic powers in the world, China should realize that without transparency and social trust, power becomes meaningless. A powerful state without those values can only pose a danger to human development.

What do you have to say about China’s global standing and its diplomatic disputes? Is the world finally waking up to its actions?

The Covid-19 pandemic has raised questions about the long-held, misleading belief in the West that China would become a more liberal state after becoming rich. China has now reached the highest possible economic status, but it is still developing. At this rapid rate of development, it will soon overtake the United States and become the biggest economic superpower in the world. With that strategic outlook, the U.S. and the West must rethink the shortcomings of classic capitalism and globalization in competition with China’s state capitalism.

China clearly faces strong obstacles, as seen in recent disputes with the U.S. and the West. Business is not going on as usual. Even so, I doubt China will worry or change its behavior. They have been like this for the last 70 years. Only the West has softened and become nervous. China will probably remain the same. I have not seen any crucial tactics employed by the West; their actions have ranged from baffling to threatening. It’s not to say the West has no leverage over China, but Western leaders really need to understand the potential danger of China becoming the most powerful state, instead of continuing to seek short-term political gain for their own agendas and profit-making.

A recent investigation by The New York Times revealed that companies were using Uighur labor through government programs to produce face masks to satisfy global demand. What are your thoughts on this?

The exploitation of minority rights and prisoner rights is a long-standing practice in China, a state without an independent judicial system or an independent press to report or raise questions on human rights, labor rights, or protections for ethnic or religious minorities. You cannot expect what we consider in the West to be just or fair practices in a modern society. Prisoners in China, especially in Xinjiang, have always been used for forced labor and subjected to prison conditions that are inhumane. There is also broad use of child labor.

Xinjiang, meaning New Frontier, is not just that for China, but also for Western industries. Many of the top global companies active in China have some production presence in Xinjiang, such as Volkswagen. Most fashion companies working out of China have Xinjiang in their supply chains; the region is a major source of Chinese-produced cotton. Uighur rights have long been suppressed, but this has only become more extreme in the last decade. Large populations of Uighurs have been sent to re-education camps, primarily for brainwashing but also as a source of forced labor. Besides the Chinese regime, the West is the other primary beneficiary of this practice.

Nike Reviewing China Supply Chain After Report on Uighur Abuse
By Nick Turner and Eben Novy-Williams

China’s foreign ministry earlier this month called the reports about forced labor “simply baseless” and designed to “smear China’s counter-terrorism and de-radicalization measures in Xinjiang.” Late last year the government said it completed what it called de-radicalization training and that the “students” had all “graduated.”

In a statement on its website, Nike said that while it does not “directly source products from the Xinjiang Uighur Autonomous Region,” the company was looking into how suppliers rely on Uighurs elsewhere.

“We have been conducting ongoing diligence with our suppliers in China to identify and assess potential risks related to employment of people from XUAR,” the company said. “Nike is committed to upholding international labor standards and we are continuing to evaluate how to best monitor our compliance standards in light of the complexity of this situation.”

Nike and Coca-Cola Lobby Against Xinjiang Forced Labor Bill
By Ana Swanson

Nike and Coca-Cola are among the major companies and business groups lobbying Congress to weaken a bill that would ban imported goods made with forced labor in China’s Xinjiang region, according to congressional staff members and other people familiar with the matter, as well as lobbying records that show vast spending on the legislation.

The bill, which would prohibit broad categories of certain goods made by persecuted Muslim minorities in an effort to crack down on human rights abuses, has gained bipartisan support, passing the House in September by a margin of 406 to 3. Congressional aides say it has the backing to pass the Senate, and could be signed into law by either the Trump administration or the incoming Biden administration.

But the legislation, called the Uyghur Forced Labor Prevention Act, has become the target of multinational companies including Apple whose supply chains touch the far western Xinjiang region, as well as of business groups including the U.S. Chamber of Commerce. Lobbyists have fought to water down some of its provisions, arguing that while they strongly condemn forced labor and current atrocities in Xinjiang, the act’s ambitious requirements could wreak havoc on supply chains that are deeply embedded in China.

Xinjiang produces vast amounts of raw materials like cotton, coal, sugar, tomatoes and polysilicon, and supplies workers for China’s apparel and footwear factories. Human rights groups and news reports have linked many multinational companies to suppliers there, including tying Coca-Cola to sugar sourced from Xinjiang, and documenting Uighur workers in a factory in Qingdao that makes Nike shoes.

Apple, which has extensive business ties to China, has also lobbied to limit some provisions of the bill, said two congressional staff members and another person familiar with the matter.

Disclosure forms show that Apple paid Fierce Government Relations, a firm led by former staff aides to Senator Mitch McConnell of Kentucky and President George W. Bush, $90,000 to lobby on issues including Xinjiang-related legislation in the third quarter. Apple’s lobbying was previously reported by The Washington Post.

Apple also paid outside firms this year to lobby on another bill, the Uyghur Forced Labor Disclosure Act of 2020.

Apple disputed the claim that it had tried to weaken the legislation, saying it supported efforts to strengthen American regulations and believes the Uyghur Forced Labor Prevention Act should become law.

According to a document viewed by The New York Times, Apple’s suggested edits to the bill included extending some deadlines for compliance, releasing certain information about supply chains to congressional committees rather than to the public, and requiring Chinese entities to be “designated by the United States government” as helping to surveil or detain Muslim minority groups in Xinjiang.

In its March report, the Australian Strategic Policy Institute identified Apple and Nike among 82 companies that potentially benefited, directly or indirectly, from abusive labor transfer programs tied to Xinjiang.

That report said that O-Film Technology, a contractor for Apple, Microsoft, Google and other companies, received at least 700 Uighur workers in a program that was expected to “gradually alter their ideology.” It tied other Apple suppliers, including Foxconn Technology, to similar employment programs.

Apple said in a statement that it had the strongest supplier code of conduct in its industry and that it regularly assessed suppliers, including with surprise audits.

Lobbying disclosures show that companies have spent heavily to sway Congress on Xinjiang-related legislation, though they reveal nothing about their specific requests.

As repression mounts, China under Xi Jinping feels increasingly like North Korea
By Anna Fifield

Over the past four years, the Chinese government has detained more than 1 million Uighurs in reeducation camps designed to strip them of their culture, language and religion. They’ve had to shave their beards and uncover their hair. They’ve been made to pledge allegiance to the Chinese Communist Party. Children have been taken from their parents and put into orphanages.

Returning to Kashgar, I was struck by how, at first glance, it seemed relatively normal. In the Old City, families were out at the night market, eating piles of meat and bread. Kids could be heard laughing through open upstairs windows. There were even young men — prime targets for the detention campaign, which was ostensibly about deradicalization — on the streets again.

Walking around, I was overcome by the same sense of sadness mixed with rage that I felt when reporting in Pyongyang. I knew it was a kind of “Truman Show,” but I couldn’t see the edges of the set. I could see a blankness in people’s eyes and feel a palpable heaviness in the air.

Had any of the people I saw in Kashgar this month been affected by the “reeducation” campaign? Almost certainly. But I couldn’t ask them. Just as in Pyongyang, I didn’t try to interview people on the street or in stores, as I would do anywhere else in the world.

Doing so could place those people in grave danger if it was discovered they had talked to a foreigner, and a foreign journalist no less. I would have loved to talk to someone who had been through the camps — but I was conscious of the risks I posed to people if I tried to discuss sensitive subjects. Or talk to them at all.

Not just in Xinjiang but across China, it has become extremely difficult to have conversations with ordinary folk. People are afraid to speak at all, critically or otherwise. Students and professors, supermarket workers and taxi drivers, parents and motorists have all waved me away this year.

Every now and then I will encounter a brave person who wants to talk, and I am always grateful to them for their honesty. But with that honesty comes a new layer of fear: Will my story result in this person being detained? Those who speak out face severe consequences, including many years in prison.

The invisible line between permissible and potentially treacherous has been shifting rapidly.

Chinese citizens are at most risk because there is no real judicial process and no recourse for them. But China, like North Korea, is increasingly taking foreign hostages and using journalists as pawns in its political and diplomatic disputes with the United States and its allies.

The extent of this trend hit home this month when I was planning a farewell in Beijing. Scrolling through my contacts list on the WeChat messaging app, I realized how many of them had been expelled — like my colleague Gerry Shih and many other American reporters. (The United States has forced dozens of Chinese journalists to leave in retribution.)

I felt a pang of sadness when I got to the “M” section. Canadians Michael Kovrig and Michael Spavor, both of whom I know through my work, have been held by China for more than 650 days, retribution for Canada’s arrest of a Huawei executive. That’s longer than North Korea held Otto Warmbier, the American college student.

More recently, it has emerged that Cheng Lei, an Australian journalist who worked for the Chinese state broadcaster’s English channel, disseminating China’s version of events to the world, was detained last month. She is the single mother of two young children.

As if there was any doubt over what Beijing, locked in a diplomatic clash with Australia, was up to, I woke up to news scarcely a week later that two Australian journalists had made a dramatic exit from China after being summoned for interrogation by state security officials, threatened with exit bans and potential detention.

It’s clear that China now thinks the cost of having foreign correspondents — people who do pesky reporting on human rights abuses — outweighs the benefit of having people to write about what a great destination China is for investment.

How China’s strained relationship with foreign media unravelled
By Helen Davidson

Foreign journalists report what the local press is unable to, and in recent years that has included mass human rights abuses in Xinjiang, corruption allegations in senior ranks of the Communist party, the internationally unlawful intervention in Hong Kong, increasing surveillance on people’s daily lives, and the attempt to cover up the coronavirus outbreak.

Xi has overseen a much tighter hold on Chinese society and crackdowns on potential dissent. Birtles echoed the concerns of many journalists when he said it had become increasingly difficult to get people to go on the record.

The FCCC’s 2019 annual report includes dozens of correspondents’ accounts of intimidation, violence and surveillance in the field, and said 70% of correspondents had reported the cancellation or withdrawal of interviews, for reasons they knew or believed to be related to the authorities.

Interviewees who did speak to correspondents risked significant reprisals, including detention, interrogations and exit bans, the report said.

New York Times journalist Chris Buckley, who was among those expelled this year, said the situation was “pretty bleak”, but he wants to return. “It has become tougher, there’s no doubt about that,” he said. “Without understating the difficulties, it’s important to understand it’s an amazing story and can be a very rewarding place to be reporting as well.”

Foreign correspondents and human rights groups are quick to point out the situation is far more dangerous for Chinese journalists and media staff. China is the leading jailer of journalists in the world, according to the Committee to Protect Journalists, with at least 48 in jail.

China Peddles Falsehoods to Obscure Origin of Covid Pandemic
By Javier C. Hernández

Facing global anger over their initial mishandling of the outbreak, the Chinese authorities are now trying to rewrite the narrative of the pandemic by pushing theories that the virus originated outside China.

In recent days, Chinese officials have said that packaged food from overseas might have initially brought the virus to China. Scientists have released a paper positing that the pandemic could have started in India. The state news media has published false stories misrepresenting foreign experts, including Dr. Kekulé and officials at the World Health Organization, as having said the coronavirus came from elsewhere.

The campaign seems to reflect anxiety within the ruling Communist Party about the continuing damage to China’s international reputation brought by the pandemic. Western officials have criticized Beijing for trying to conceal the outbreak when it first erupted.

The party also appears eager to muddy the waters as the World Health Organization begins an investigation into the question of how the virus jumped from animals to humans, a critical inquiry that experts say is the best hope to avoid another pandemic. China, which has greatly expanded its influence in the W.H.O. in recent years, has tightly controlled the effort by designating Chinese scientists to lead key parts of the investigation.

Independent pandemic review panel critical of China, WHO delays
By Stephanie Nebehay

An independent panel said on Monday that Chinese officials could have applied public health measures more forcefully in January to curb the initial COVID-19 outbreak, and criticised the World Health Organization (WHO) for not declaring an international emergency until Jan. 30.

The experts reviewing the global handling of the pandemic, led by former New Zealand Prime Minister Helen Clark and former Liberian President Ellen Johnson Sirleaf, called for reforms to the Geneva-based United Nations agency. Their interim report was published hours after the WHO’s top emergency expert, Mike Ryan, said that global deaths from COVID-19 were expected to top 100,000 per week “very soon”.

“What is clear to the Panel is that public health measures could have been applied more forcefully by local and national health authorities in China in January,” the report said, referring to the initial outbreak of the new disease in the central city of Wuhan, in Hubei province.

As evidence emerged of human-to-human transmission, “in far too many countries, this signal was ignored”, it added.

Specifically, it questioned why the WHO’s Emergency Committee did not meet until the third week of January and did not declare an international emergency until its second meeting on Jan. 30.

“Although the term pandemic is neither used nor defined in the International Health Regulations (2005), its use does serve to focus attention on the gravity of a health event. It was not until 11 March that WHO used the term,” the report said.

“The global pandemic alert system is not fit for purpose”, it said. “The World Health Organization has been underpowered to do the job.”

No ‘Negative’ News: How China Censored the Coronavirus
By Raymond Zhong, Paul Mozur, Jeff Kao and Aaron Krolik

At a time when digital media is deepening social divides in Western democracies, China is manipulating online discourse to enforce the Communist Party’s consensus. To stage-manage what appeared on the Chinese internet early this year, the authorities issued strict commands on the content and tone of news coverage, directed paid trolls to inundate social media with party-line blather and deployed security forces to muzzle unsanctioned voices.

Though China makes no secret of its belief in rigid internet controls, the documents convey just how much behind-the-scenes effort is involved in maintaining a tight grip. It takes an enormous bureaucracy, armies of people, specialized technology made by private contractors, the constant monitoring of digital news outlets and social media platforms — and, presumably, lots of money.

It is much more than simply flipping a switch to block certain unwelcome ideas, images or pieces of news.

China’s curbs on information about the outbreak started in early January, before the novel coronavirus had even been identified definitively, the documents show. When infections started spreading rapidly a few weeks later, the authorities clamped down on anything that cast China’s response in too “negative” a light.

The United States and other countries have for months accused China of trying to hide the extent of the outbreak in its early stages. It may never be clear whether a freer flow of information from China would have prevented the outbreak from morphing into a raging global health calamity. But the documents indicate that Chinese officials tried to steer the narrative not only to prevent panic and debunk damaging falsehoods domestically. They also wanted to make the virus look less severe — and the authorities more capable — as the rest of the world was watching.

“China has a politically weaponized system of censorship; it is refined, organized, coordinated and supported by the state’s resources,” said Xiao Qiang, a research scientist at the School of Information at the University of California, Berkeley, and the founder of China Digital Times. “It’s not just for deleting something. They also have a powerful apparatus to construct a narrative and aim it at any target with huge scale.”

“This is a huge thing,” he added. “No other country has that.”

Citizen journalist facing jail in China for Wuhan Covid reporting
By Helen Davidson

A Chinese citizen journalist detained since May for reporting on the coronavirus outbreak from Wuhan is facing up to five years in jail after being formally indicted on charges of spreading false information.

Zhang Zhan, a 37-year-old former lawyer, was arrested more than six months ago after reporting on the outbreak. She is being held in a detention facility in Shanghai.

She was accused of “picking quarrels and stirring up trouble”, an accusation frequently used against critics and activists inside China, after reporting on social media and streaming accounts.

Citizen journalist detained over Wuhan reporting ‘restrained and fed by tube’
By Helen Davidson

In a blog post on Wednesday, Zhang’s lawyer, Zhang Keke, said he visited his client on Tuesday afternoon, and found her unwell and exhausted.

“She was wearing thick pyjamas with a girdle around the waist, her left hand pinned in front and right hand pinned behind,” he wrote. “She said she had a stomach tube inserted recently and because she wanted to pull it out, she was restrained.”

Zhang Keke said she was in “constant torment” from 24 hours a day of restraints, and needed assistance to go to the bathroom.

“In addition to headache, dizziness and stomach pain, there was also pain in her mouth and throat. She said this may be inflammation due to the insertion of a gastric tube.”

Zhang Keke said he told Zhang her family, friends, and lawyers had urged her to stop her hunger strike, but she refused. He said Zhang told him she had expected a court hearing in December; now that it appeared there were no plans to hold one, she didn’t know if she would survive.

China’s Leash on Hong Kong Tightens, Choking a Broadcaster
By Austin Ramzy and Ezra Cheung

Hong Kong’s public broadcaster has long been a rare example of a government-funded news organization operating on Chinese soil that fearlessly attempts to hold officials accountable.

The broadcaster, Radio Television Hong Kong, dug into security footage last year to show how the police failed to respond when a mob attacked protesters in a train station, leading to widespread criticism of the authorities. The broadcaster also produced a three-part documentary on China’s crackdown on Muslims in Xinjiang. One RTHK journalist, Nabela Qoser, became famous in Hong Kong for her persistent questioning of top officials.

Now, RTHK’s journalists and hard-hitting investigations appear vulnerable to China’s new national security law, which takes aim at dissent and could rein in the city’s largely freewheeling news organizations. The broadcaster, modeled on the British Broadcasting Corporation, has already been feeling pressure.

An RTHK spokeswoman, Amen Ng, said that RTHK journalists “have been doing their job professionally” but added that the broadcaster was not a “platform to promote Hong Kong independence.”

But there were already signs in RTHK’s newsroom that a chill was setting in.

Kirindi Chan, a top RTHK executive, announced unexpectedly in June that she would resign, citing health reasons. Days later, she met with RTHK reporters who pressed her on whether she was being forced out over their coverage of the antigovernment demonstrations. Ms. Chan denied being ousted, but she sought to deliver some solemn advice.

Ms. Chan reminded the reporters and producers of their role as civil servants, and urged them to comply with the government’s code of conduct, according to two people who attended the meeting and spoke on condition of anonymity to discuss an internal matter.

She did not go into details, but the civil service code calls for impartiality and loyalty to the government, values the authorities have stressed to discourage government employees from joining the protests.

Over an RTHK career of nearly three decades, Ms. Chan earned the respect of her staff for being a staunch defender of the organization’s editorial independence. At the end of the somber half-hour meeting, the reporters gave Ms. Chan a bouquet of red and yellow tulips, but an employees’ union said her departure was an ominous sign.

“We worry that Ms. Chan’s resignation would set the scene for further attacks on RTHK,” the union said in a statement.

RTHK has also found itself caught in geopolitical wrangling between China and Taiwan, the self-governing island that Beijing claims as part of its territory.

In April, the government criticized RTHK over an interview the broadcaster ran with a World Health Organization official, Dr. Bruce Aylward, who was asked whether Taiwan should be allowed to participate in the health body. Taiwan had been shut out by Beijing in recent years.

In an awkward exchange that highlighted the sensitivity of the topic, Dr. Aylward first said he did not hear the question, then asked to move on. When the reporter repeated it, the line went dead; minutes later, asked again, Dr. Aylward replied, “We’ve already talked about China.” The interaction gave further ammunition to critics who say the health body is unduly beholden to Beijing.

Hong Kong journalists harassed, arrested and lose press freedoms under new China law
By Rachel Cheung and David Pierson

Since the arrest this week of media tycoon Jimmy Lai, founder of Apple Daily, reporters and editors have been hiding their notes, protecting contacts on encrypted messaging apps, and contemplating how much jail time they could get for violating Hong Kong’s new national security law.

“It would probably be three to five years, but if I plead guilty in court and have good conduct, I will likely get a sentence reduction,” said a reporter at Apple Daily surnamed Chan, who declined to use his full name for fear of retribution.

Such are the calculations in a disturbing new era. The chilling scene of Lai’s arrest, which was livestreamed across the world, showed police rummaging through reporters’ desks and shouting at the paper’s chief editor after he asked to see a search warrant. Outside Apple Daily’s offices, police limited media access to only those that backed the mainland Chinese government.

“If you have one to two daring outlets still breaking news, then all the others have to follow,” said Francis Lee, director of the School of Journalism and Communication at the Chinese University of Hong Kong. “You can’t pretend a story’s not out there. It creates a dynamic that ensures sensitive stories still get circulated. That’s the reason why the Chinese government is targeting Apple Daily.”

Chan and other journalists at Apple Daily are accustomed to the intensifying battle over telling their stories. He has been pepper-sprayed and struck by tear gas canisters and rubber bullets during street protests.

“In our role, we see a lot of people getting injured and making sacrifices,” he said of the protesters. “They are things they should not have suffered. But at the same time, we have to admit there is nothing we can do to save them and that is where we feel the most powerless.”

A Newsroom at the Edge of Autocracy
By Timothy McLaughlin

The SCMP is not as well read as the international outlets that it would like to compete with, but because of its unique position—as the main English-language outlet in a strategically important city—its coverage plays an outsize role in shaping international understanding of events not just in Hong Kong but across the border in China, as well.

An early draft of an initial story about the incident, according to a version that was read to me, had an opening that detailed “chaotic and shocking scenes” as officers went after “cowering commuters.” That was not the account that was eventually published, though. The SCMP’s edited story (which was subsequently updated) instead recounted how “elite Hong Kong police” had chased “radical protesters” wearing “masks” into the subway station.

The incident at the paper, recounted by two people with knowledge of the event, both of whom spoke on condition of anonymity to avoid retribution, exemplified the type of heavy-handed, slanted editing that became common in the SCMP newsroom as the demonstrations carried on. Journalists who spent hours, sometimes in a haze of tear gas, pepper spray, and rubber bullets, saw their work drastically altered by editors before running in print and online. The police were typically portrayed as heroes, and the protesters as villains, with little explanation or context of each side’s motives and grievances. “That was frustrating,” one current reporter involved in coverage of the demonstrations told me. (This journalist, like others I spoke to, did not want to be identified, fearing a backlash from the SCMP.) With these stories appearing on the front page of the paper, the reporter said, “they’ve given an impression that SCMP is anti protesters. As journalists, we should never be pro or against protesters.”

Yet even before the recent enactment of a far-reaching national-security law in Hong Kong, the city’s media was under strain. Numerous mainstream outlets have been bought by China-backed figures or pro-establishment businesses, shrinking the diversity of voices. In recent years, vigilantes have carried out attacks against senior editors and Beijing has harassed officials from Cantonese newspapers. And since protests began last summer, the government in Hong Kong has also sought to curb journalists’ freedoms. Dissatisfied with honest accounts of official malfeasance, the authorities have sought to stifle some of the city’s most cutting voices. Radio Television Hong Kong, the government-funded broadcaster that operates akin to the BBC, drew an official rebuke when a reporter pressed a World Health Organization adviser over the contentious issue of Taiwan’s inclusion in the global body and after its long-running satirical program took aim at the Hong Kong police. That program, Headliner, has since been suspended. Top newsroom executives have stepped down, and the broadcaster is now under government review. Police continue to harass journalists reporting on protests, which have shrunk dramatically in size and frequency due to a combination of the pandemic, new police tactics, and the national security law.

In 2018, the SCMP faced backlash when it conducted a government-arranged interview of Gui Minhai—a Hong Kong bookseller and Swedish citizen who disappeared in 2015 and then reappeared in Chinese custody a year later—in a detention facility while guards loomed over him. Liu has stood behind the interview and article, arguing that the SCMP agreed to the interview after discussions with editors, that there were no strings attached, and that the newspaper made a point of highlighting that Gui was accompanied by security personnel. But Angela Gui, the bookseller’s daughter, told me she was unhappy with the paper’s decision and its continued defense of the interview, which she says Beijing orchestrated to advance its own misleading narrative about her father’s situation. “My father was, after years of illegal detention and torture, subjected to public humiliation by the Chinese government, and the SCMP was complicit by disseminating and legitimizing it as a ‘news story,’” she said.

The true history of fake news
By Tom Standage

That fake news shifted copies had been known since the earliest days of printing. In the 16th and 17th centuries, printers would crank out pamphlets, or newsbooks, offering detailed accounts of monstrous beasts or unusual occurrences. A newsbook published in Catalonia in 1654 reports the discovery of a monster with “goat’s legs, a human body, seven arms and seven heads”; an English pamphlet from 1611 tells of a Dutch woman who lived for 14 years without eating or drinking. So what if they weren’t true? Printers argued, as internet giants do today, that they were merely providing a means of distribution, and were not responsible for ensuring accuracy.

But newspapers were different. They contained a bundle of different stories, not just one, and appeared regularly under a consistent title. They therefore had reputations to maintain. The Sun, founded in 1833, was the first modern newspaper, funded primarily by advertisers rather than subscriptions, so it initially pursued readership at all costs. At first it prospered from the Moon hoax, even collecting its reports in a bestselling pamphlet. But it was soon exposed by rival papers. Editors also realised that an infinite supply of genuine human drama could be found by sending reporters to the courts and police stations to write true-crime stories – a far more sustainable model. As the 19th century progressed, impartiality and objectivity were increasingly venerated at the most prestigious newspapers.

Thanks to internet distribution, fake news is again a profitable business. This flowering of fabricated stories corrodes trust in the media in general, and makes it easier for unscrupulous politicians to peddle half-truths. Media organisations and technology companies are struggling to determine how best to respond. Perhaps more overt fact-checking or improved media literacy will help. But what is clear is that a mechanism that held fake news in check for nearly two centuries – the bundle of stories from an organisation with a reputation to protect – no longer works. We will need to invent new ones.

Reporters are leaving newsrooms for newsletters, their own ‘mini media empire’
By Jacob Bogage

San Francisco-based Substack was founded by Hamish McKenzie, Chris Best and Jairaj Sethi in 2017, as newsletters were having a renaissance. The three had worked together at Kik, a messaging app co-founded by Best. Put off by the social media algorithms that controlled news distribution, they wanted a platform that would allow each client to build a “mini media empire” around their mailing lists, McKenzie said in an interview.

The platform handles the technical end of newsletter production in exchange for 10 percent of subscription revenue. The creators own all content, plus their mailing lists. Newsletters do not feature ads.

Substack declined to provide detailed statistics on its readership or registrations. Its website says “more than 100,000 people pay to subscribe to writers” across its network and the “top writers are making hundreds of thousands of dollars a year.” The platform has big-name backers in Andreessen Horowitz and Y Combinator, from which it raised $15.3 million in Series A funding last year. It has also attracted a number of prominent writers.

Andrew Sullivan: See You Next Friday
By Andrew Sullivan

What has happened, I think, is relatively simple: A critical mass of the staff and management at New York Magazine and Vox Media no longer want to associate with me, and, in a time of ever tightening budgets, I’m a luxury item they don’t want to afford. And that’s entirely their prerogative. They seem to believe, and this is increasingly the orthodoxy in mainstream media, that any writer not actively committed to critical theory in questions of race, gender, sexual orientation, and gender identity is actively, physically harming co-workers merely by existing in the same virtual space. Actually attacking, and even mocking, critical theory’s ideas and methods, as I have done continually in this space, is therefore out of sync with the values of Vox Media. That, to the best of my understanding, is why I’m out of here.

Two years ago, I wrote that we all live on campus now. That is an understatement. In academia, a tiny fraction of professors and administrators have not yet bent the knee to the woke program — and those few left are being purged. The latest study of Harvard University faculty, for example, finds that only 1.46 percent call themselves conservative. But that’s probably higher than the proportion of journalists who call themselves conservative at the New York Times or CNN or New York Magazine. And maybe it’s worth pointing out that “conservative” in my case means that I have passionately opposed Donald J. Trump and pioneered marriage equality, that I support legalized drugs, criminal-justice reform, more redistribution of wealth, aggressive action against climate change, police reform, a realist foreign policy, and laws to protect transgender people from discrimination. I was one of the first journalists in established media to come out. I was a major and early supporter of Barack Obama. I intend to vote for Biden in November.

It seems to me that if this conservatism is so foul that many of my peers are embarrassed to be working at the same magazine, then I have no idea what version of conservatism could ever be tolerated. And that’s fine. We have freedom of association in this country, and if the mainstream media want to cut ties with even moderate anti-Trump conservatives, because they won’t bend the knee to critical theory’s version of reality, that’s their prerogative. It may even win them more readers, at least temporarily. But this is less of a systemic problem than in the past because the web has massively eroded the power of gatekeepers to suppress and control speech. I was among the first to recognize this potential for individual freedom of speech, and helped pioneer individual online media, specifically blogging, 20 years ago.

And this is where I’m now headed.

“People Are Looking to Latch Onto Something Positive”: As Journalists Flock to Substack, Is There a Limit to the Newsletter Boom?
By Joe Pompeo

Emily Atkin, who writes the must-read climate digest Heated, is another Substack star. Earlier in her career, she got all the right jobs: New York Observer intern, research assistant to late investigative journalist Wayne Barrett, New Republic staff writer. But it wasn’t until she started a Substack newsletter in 2019 that she became one of the most influential climate journalists. In just over a year, Heated racked up 32,527 total subscribers and about 3,500 paying ones. (Her paid subscribers get the full smorgasbord of content.) She made about $230,000 in 2020, before subtracting Substack’s 10 percent cut and other expenses. Prior to this, Atkin, who just turned 31, had never made more than $75,000 in any of her other jobs. “I had basically accepted that I would never make enough money to buy a nice house,” she said. “So this is unlike anything I ever imagined for myself—that I could make a real adult, like, nonprofit lawyer’s salary.”

It takes a certain combination of expertise, prose, and hustle to hit numbers like that. But the point is, it’s attainable, and if Substack’s recruiting spree continues to lure high-profile writers and journalists with big audiences, Best believes they’ll have the market cornered, despite competition from less established rivals like Ghost and Lede. Some consumers, he noted, are more willing to pay to read one of their favorite writers than they are to pay for Netflix. “If we’re successful,” said Best, “yes, Substack will be much bigger.”

Malik, himself a longtime newsletter writer (though not with Substack) and founder of the tech blog GigaOm, is generally optimistic about the company’s chances. “As a former media guy, I would like my industry peers to survive and thrive and do well, and I think Substack shows them the way,” he said. Then came the caveat. “People shouldn’t forget that it is for profit, it is venture backed. They will have to kneel down at the altar of growth sooner or later, and growth does take its toll even on the most noble of causes.”

Substack isn’t a new model for journalism – it’s a very old one
By Michael J. Socolow

Perhaps “I.F. Stone’s Weekly” offers the closest historical antecedent for Substack. Stone was an experienced muckraking journalist who began self-publishing an independent, subscription-based newsletter in the early 1950s.

Yet unlike much of Substack’s most famous names, Stone was more reporter than pundit. He’d pore over government documents, public records, congressional testimony, speeches and other overlooked material to publish news ignored by traditional outlets. He often proved prescient: His skeptical reporting on the 1964 Gulf of Tonkin incident, questioning the idea of an unprovoked North Vietnamese naval attack, for example, challenged the U.S. government’s official story, and was later vindicated as more accurate than comparable reportage produced by larger news organizations.

There are more recent antecedents to Substack’s go-it-yourself ethos. Blogging, which proliferated in the U.S. media ecosystem earlier this century, encouraged profuse and diverse news commentary. Blogs revived the opinionated invective that James Gordon Bennett loved to publish in The Herald, but they also served as a vital fact-checking mechanism for American journalism.

How Much Does the World Trust Journalists?
By Zacc Ritter

Median trust in journalists is notably similar in democratic and non-democratic countries — at roughly 60%. Yet, attitudes vary significantly within these categories of political systems.

Among democratic countries — as designated by 2017 Polity IV democracy rankings — at least four in five respondents in Finland, Myanmar and Norway trust journalists “a lot” or “some,” while fewer than one in four do so in Taiwan, Serbia and Greece. In non-democratic countries, roughly nine in 10 adults in Uzbekistan, Tanzania and Rwanda say they trust journalists, but about one in three say the same in Mauritania, Gabon and Yemen.

Trust in journalists is also not associated with media freedom as measured by Reporters Without Borders. In fact, the relationship between trust in journalists and media freedom is weak and runs in the opposite direction from what might be expected: greater media freedom is associated with less trust.

In the United States, trust in the mass media, broadly, has fallen from 68% in 1972 to 41% today. Similarly, a recent Gallup-Knight report found that trust in U.S. local media declines for people who perceive a large gap between their political ideology and the perceived ideology of local news organizations. While trust in the media and trust in journalists differ, the two concepts are related enough to suspect a relationship between political polarization and trust in journalists may exist around the world.

Political polarization — measured, as part of the Varieties of Democracy project, by country-level experts as the extent to which differences of opinion on major political issues exist in society — is strongly associated with less trust in journalists.

Importantly, political polarization is not strongly associated with trust in other members of society asked about in the Wellcome Global Monitor study, such as neighbors, scientists, doctors or traditional doctors. However, polarization is strongly related to trust in government. Political polarization appears to affect trust in journalists because of the tangible relationship between media and politics.

While the relationship between political polarization and trust in journalists is robust, it remains unclear whether greater political polarization causes less trust in journalists. Journalists may experience reputational damage for covering contentious issues as neutral arbiters, but it is also possible that journalists are active participants in the political arena. Similarly, the media and journalists may contribute to greater political polarization by generating and amplifying contentious narratives supported by policy proponents.

Trust in journalists is complicated. A high level of trust may mean the media and journalists are doing a good job, or it may indicate an acceptance of false narratives by society. In contrast, a trust deficit may mean society is “woke” or it may indicate excessive cynicism that portends an acceptance of a post-truth reality. Regardless, a low level of trust can pose a danger because it erodes the media’s and journalists’ ability to operate as the fourth estate that holds power accountable and promotes civic discussion.

In recent years, observers have noted an apparent swing in the pendulum toward a crisis of trust in formal institutions. This trust deficit in economic and political institutions appears to be related to political polarization. Greater polarization is also associated with less trust in journalists. As political division grows, the news media and journalists willingly or unwillingly become participants in the political fray. Reporting on contentious topics and attempts to hold powerful interests accountable can lead to accusations of media bias. In such deeply polarized societies, citizen skepticism about journalists’ motivations grows, which likely affects general trust in journalists.

In turn, countries with high political polarization and low trust in journalists face an additional challenge to healing societal divisions and challenging wrongdoing. Journalists without social capital are less able to function effectively as credible truthtellers needed for informed public discourse.

Trust in media hits new crisis low
By Felix Salmon

By the numbers: For the first time ever, fewer than half of all Americans have trust in traditional media, according to data from Edelman’s annual trust barometer shared exclusively with Axios. Trust in social media has hit an all-time low of 27%.

  1. 56% of Americans agree with the statement that “Journalists and reporters are purposely trying to mislead people by saying things they know are false or gross exaggerations.”
  2. 58% think that “most news organizations are more concerned with supporting an ideology or political position than with informing the public.”
  3. When Edelman re-polled Americans after the election, the figures had deteriorated even further, with 57% of Democrats trusting the media and only 18% of Republicans.

‘No One Believes Anything’: Voters Worn Out by a Fog of Political News
By Sabrina Tavernise and Aidan Gardiner

… just when information is needed most, to many Americans it feels most elusive. The rise of social media; the proliferation of information online, including news designed to deceive; and a flood of partisan news are leading to a general exhaustion with news itself.

Add to that a president with a documented record of regularly making false statements and the result is a strange new normal: Many people are numb and disoriented, struggling to discern what is real in a sea of slant, fake and fact.

Of course, many Americans have the opposite experience: They turn to sources they trust — whether on the right or left — that tell them exactly what they already believe to be true. But a new poll released last week found that 47 percent of Americans believe it’s difficult to know whether the information they encounter is true. Just 31 percent find it easy. About 60 percent of Americans say they regularly see conflicting reports about the same set of facts from different sources, according to the poll, by The Associated Press-NORC Center for Public Affairs Research and USAFacts.

“Now more than ever, the lines between fact-based reporting and opinionated commentary seem blurred for people,” said Evette Alexander, research director at the Knight Foundation, which funds journalism and research. “That means they trust what they are seeing less. They are feeling less informed.”

They are also tuning out. Mr. Trudell, a registered independent, stopped paying attention to national news about a year ago. He found it toxic and mentally taxing, and it started arguments that had no end. He decided to focus instead on local and state-level politics. As a security manager at a mall, he has to worry about shoplifters, so keeping up with the state’s criminal justice reforms was useful.

National politics, he said, has started to look like eyewitness testimony: “People can see totally different things, standing right next to each other.”

So when he had the day off on Wednesday, which happened to be his 39th birthday, he decided to treat himself to a nap and some “Simpsons” reruns after his kids left for school.

The degree of alienation is new. In the late 1970s, nearly three quarters of Americans trusted newspapers, radio and television. Walter Cronkite read the news every night, and most Americans went to bed with the same set of facts, even if they had different political views. These days, less than half of Americans have confidence in the media, according to Gallup.

The decline in confidence is particularly pronounced by party. Today about 69 percent of Democrats have a great deal of confidence in the media, compared to just 15 percent of Republicans and 36 percent of independents, according to Gallup.

73% see bias in news reporting as “a major problem”
By Fadel Allassan

73% of Americans see bias in news reporting as “a major problem,” according to a study out Tuesday from the Knight Foundation and Gallup.

Why it matters: That’s up from 65% in 2017, indicating “the gap between what Americans expect from the news — and what they think they are getting — is growing,” the Knight Foundation writes.

By the numbers: Views on media bias, like most issues, cut along partisan lines. 71% of Republicans indicated they have a very or somewhat unfavorable opinion of the news media, compared to 22% of Democrats and 52% of independents.

  1. 48% of Americans blamed the media “a great deal” for the country’s political division.
  2. Only a fifth of adults under 30 said they have a “very” or “somewhat” favorable opinion of the news media, versus 44% of those aged 65 and older.

Bias in Others’ News a Greater Concern Than Bias in Own News
By Helen Stubbs

While Americans acknowledge bias in their favored source of news, their greater concern about bias in other people’s media may suggest their discontent with the news media lies less in their own experience with it than in their perceptions of others’ experiences, which may help explain how media distrust plays into political polarization in the U.S. national discourse. One hypothesis to account for this is that people may assume they are better able than others to recognize news bias, and thus believe they are more impervious to it.

Greater concern about bias in others’ news among younger Americans is consistent with other studies showing they are more likely than older Americans to think news is biased and to be able to identify “fake news.” As “digital natives,” they are more media savvy and consult multiple sources and their own personal networks to discern the truth. Similar to young adults, more-educated Americans may have greater confidence in their ability to discern fact from fiction.

Given the differences noted by both ideology and education, more-educated liberals harbor greater concern about others’ media sources — most likely right-leaning news media. While such outlets see themselves as providing an important counternarrative to the left-leaning “mainstream media,” in the general public there appears to be greater concern on the left about the influence of conservative media sources on Americans.

The raging trust crisis and its consequences
By Sara Fischer

The trust deficit has gotten so bad that people don’t know who or what to believe anymore, and they don’t even trust themselves to get facts right.

  1. A majority of people around the world believe that journalists, government leaders and business leaders are all purposely trying to mislead people by spreading misinformation.
  2. Most people have terrible information hygiene: they admit that they don’t actively verify information or avoid echo chambers, and that they share things without first vetting them.

That trust gap has real-world consequences. Only 59% of people in the U.S. say they are willing to take the vaccine if it becomes available to them within a year. Those who are unwilling to take the vaccine tend to have poor information hygiene.

Partisanship and Vaccine Uptake Strategies
By Frank Newport

There is a 25-point difference between Democrats and Republicans in expressed willingness to get the vaccine in Gallup’s latest survey — Democrats being the more positive. Pew Research similarly shows a 19-point difference in willingness to get a vaccine between Democrats and Republicans, and a recent survey from Kaiser Family Foundation found a 30-point partisan difference in those who said they would “definitely” or “probably” get the vaccine.

Republicans’ hesitancy to get vaccinated, I believe, reflects in part their long-standing suspicion of elites and of government mandates (Gallup recently reported that Republicans have been less positive about getting vaccines in several situations, going back to questions Gallup asked about getting Asian flu vaccinations in 1957). It may also reflect agreement with Republican President Donald Trump’s more skeptical positions on the coronavirus over the past nine months.

Gallup has asked respondents who said they would not agree to get the vaccine to explain why in their own words. The results show in part that Republicans are significantly more likely than others to say it is because they don’t trust vaccines in general. Verbatim responses also show that those who exhibit vaccine hesitation (including many Republicans) frequently use the word “trust” in their explanation, as in “I don’t trust the vaccine” or “I don’t trust the system” (read, government). Along these same lines, recent Kaiser Family Foundation research finds that Republicans who say they are hesitant about getting the vaccine are significantly more likely than Democrats to say it is because the risks of getting COVID-19 have been exaggerated.

The smaller number of Democrats who are hesitant are most likely, in Gallup’s research, to say they are worried about the vaccine being rushed too quickly into use, and that they need to confirm it is safe. (The Kaiser poll did not separate out Democrats’ reasons because there were too few of them who were hesitant to get the vaccine to provide reliable estimates.)

Those interested in increasing COVID-19 vaccine uptake will need to invest in further research to understand more precisely what is behind the thinking of groups that are below average in expressed willingness to be vaccinated.

As noted, Republicans may be a particularly important group in this regard because of their initial vaccine hesitancy. Some possible strategies to address reluctant Republicans could therefore include:

  1. focusing on communication from grassroots players such as doctors and nurses and local officials, rather than emphasizing messaging from elites and those at the top
  2. using Republican spokespeople and examples where possible (such as Vice President Mike Pence getting the vaccine on television)
  3. emphasizing direct communication from individuals’ own personal doctors and local healthcare professionals
  4. attempting to be aware of and addressing inaccurate information and messaging on social media

There is, in theory, less payoff in targeting Democrats given their currently high willingness to get the vaccine — but the increase in vaccine hesitancy among Democrats in September underscores the value in a focus on maintenance of their currently positive attitudes. Strategies for Democrats could include continuing to use Democratic-friendly spokespeople to stress the thoroughness of the vaccine development process, even while under an extraordinarily rushed timetable, and a focus on the safety of the vaccine.

Overall, Americans’ expressed willingness to get a COVID-19 vaccine is to a significant degree related to their underlying political orientation, reflecting preexisting assumptions and worldviews as well as political and ideological cues carried through selected exposure to media and political thought leaders whose opinions resonate with these viewpoints. Although one may be excused for thinking that politics should not be a part of the vaccination process, the data clearly show that it is. Americans who are ideologically conservative and politically Republican are the most resistant to getting a vaccine, while Democrats’ currently high degree of willingness to be vaccinated has been shown to be quick to change. These facts of life need to be a part of campaigns designed to increase vaccine uptake.

Key findings about Americans’ declining trust in government and each other
By Lee Rainie and Andrew Perrin

Americans think the public’s trust has been declining in both the federal government and in their fellow citizens. Three-quarters of Americans say that their fellow citizens’ trust in the federal government has been shrinking, and 64% believe that about people’s trust in each other.

When asked a separate question about the reasons why trust has declined in the past 20 years, people offer a host of reasons in their written answers. Those who think there has been a decline of trust in the federal government over these two decades often see the problem tied to the government’s performance: 36% of those who see the decline cite this. Some worry the government is doing too much, others say too little, and others mention the government doing the wrong things or nothing at all. Respondents also cite concerns about how money has corrupted it and how corporations control the political process. President Donald Trump and his administration are mentioned in 14% of answers, and a smaller share lays the blame on Democrats. Additionally, 10% of those who see decline lay fault at the feet of the news media.

Those who think interpersonal trust has declined in the past generation offer a laundry list of societal and political problems, including a sense that Americans on the whole have become more lazy, greedy and dishonest. Some respondents make a connection between what they think is poor government performance – especially gridlock in Washington – and the toll it has taken on their fellow citizens’ hearts. Overall, 49% of adults think interpersonal trust has been tailing off because people are less reliable than they used to be.

Majorities believe the federal government and news media withhold important and useful information. People’s confidence in key institutions is associated with their views about the transparency of institutions. About two-thirds (69%) of Americans say the federal government intentionally withholds important information from the public that it could safely release, and 61% say the news media intentionally ignores stories that are important to the public. Those who hold those skeptical views are more likely than others to have greater concerns about the state of trust.

Some 44% of Americans say “yes” to both questions – that the federal government withholds information and the news media ignores stories. More Republicans and Republican-leaning independents than Democrats and Democratic leaners believe both institutions hold back information (54% vs. 38%).

On a scale of national issues, trust-related issues are not near the top of the “very big” problems Americans see. But people often link distrust to the major problems that worry them. About four-in-ten adults (41%) think the public’s level of confidence in the federal government is a “very big problem,” putting it more than halfway down the list of other problems that were asked about. Confidence in government is roughly on par with problems caused by racism and illegal immigration – and a bit above terrorism and sexism. Some 25% say Americans’ level of confidence in each other is a very big problem, which is quite low in comparison with a broad array of other issues that more Americans perceive as major problems.

It is important to note, though, that some Americans see distrust as a factor inciting or amplifying other issues they consider crucial. For example, in their open-ended written answers to questions, some Americans say they think there are direct connections between rising distrust and other trends they perceived as major problems, such as partisan paralysis in government, the outsize influence of lobbyists and moneyed interests, confusion arising from made-up news and information, declining ethics in government, the intractability of immigration and climate debates, rising health care costs and a widening gap between the rich and the poor.

When we can’t even agree on what is real
By Christina Pazzanese

In a 2018 study, the researchers found that Americans as a whole largely overestimate how likely it is that a person born in the bottom 20 percent income bracket will rise into the top 20 percent.

Both Republicans and Democrats also overestimated the size of the U.S. immigrant population and its dependence on government assistance, and underestimated its level of education. Republicans were almost twice as likely as Democrats, though, to think that the average immigrant gets twice the aid of a nonimmigrant with an identical resume.

Why are perceptions on the left and right so far apart? Several factors seem to contribute, said Stantcheva. First and foremost, Republicans and Democrats tend to seek out very different news sources so they often get very different information. But even within those sources, the information that’s received is understood differently based on variables like a person’s education or life experiences, how much they trust the messenger or principals involved, their prior beliefs about a given issue, and other ideas they associate with an issue.

“How much you’re going to change your belief as a function of that information is going to depend on the weight you put on it, and that weight will depend on what you already think,” she said. “Without interruption, it’s just a cycle that will reinforce itself.”

Democrats and Republicans were starkly divided on the topic of immigration and what to do about it, perhaps because it’s so often in the news and discussed in predominantly negative and emotionally charged terms. Where they were in sync was how misinformed they were.

“Immigration is an area where there’s a very widespread misperception,” said Stantcheva. Even though liberals broadly view immigrants more favorably, they had no better handle on how the newcomers impact the U.S. than conservatives did. “One group is not necessarily more wrong than the other. Everybody’s quite wrong.”

Complicating matters is the fact that simply presenting accurate data to the misinformed doesn’t always work. On matters like social mobility opinions can be moved with statistics, but on especially partisan issues like immigration, facts appear to do little to change viewpoints, the researchers found.

One experiment showed that even when given an opportunity to learn the facts about immigrants in the U.S. for a nominal sum, those holding the most negative and most inaccurate perceptions were the least willing to pay.

“The people who most need the information are going to be the least likely to seek out that information. It seems that either they don’t realize that they’re wrong, or they’re just very entrenched in their beliefs, and do not want their beliefs to be changed,” said Stantcheva.

Families Have Been Torn Apart by Politics. What Happens to Them Now?
By Sabrina Tavernise

The shock of Donald J. Trump’s election in 2016, just before the holiday season, tested many American families who had to confront — or avoid altogether — political disagreements over Thanksgiving dinners. Many Democrats said they were angry at family members who voted for him. Republicans rejected the notion that their votes were referendums on whether they were good people.

But four years later, for some families, those differences have mutated into something deeper — a divide over basic facts and visions for America’s future. That rift feels even harder to mend after the 2020 election, as Mr. Trump stoked conspiracy theories questioning the legitimacy of Mr. Biden’s win.

In interviews during and after the election, Americans talked about the differences that had emerged in their families over politics and how they had changed over the past four years. Some had learned to live with them, and were trying hard to focus on the things they had in common. Others had not spoken since 2016.

The political divisions within families, while widespread, are far from universal. Dr. Joshua Coleman, a psychologist who specializes in estrangement, said that while he now has such cases in his practice, they are still a small share of the business, and, so far, mostly consist of millennials or other younger Americans pulling back from or cutting off their more conservative baby boomer parents.

That was the case in the Ackley family.

Danielle Ackley of North Carolina and her mother have always been different politically. But they agreed to disagree, even after Mr. Trump’s 2016 win, which Ms. Ackley said brought her son to tears.

But during a visit last month, they got into a terrible argument over politics. Ms. Ackley, 37, said she got angry when she heard her mother criticize Mr. Biden’s character. Then it escalated. It ended with her telling her mother to leave.

“This is not even a political divide, it’s a reality divide,” said Ms. Ackley, who added that she felt even more distant after seeing her mother comment approvingly on a Facebook post questioning mail-in ballots.

For Debbie Ackley, who is 59, the experience was painful and a shock. She said she remembers staring down at her phone, trying not to cry. She left the next morning, hours earlier than she had planned, and was so upset on the drive that she worried she might crash.

She said she loved her daughter, and though she did not understand her anger, she knew it came from a good place.

“Danielle has got the biggest heart,” she said. “She’s very sensitive and very loving. She takes things to heart.”

She said she was frustrated by what she saw as a growing intolerance in the country.

“It’s scary that there’s very little tolerance and respect for other people’s views and opinions — that’s what makes me sad,” she said.

Still Alive
By Scott Alexander

… a recent poll found that 62% of people feel afraid to express their political beliefs. This isn’t just conservatives – it’s also moderates (64%), liberals (52%) and even many strong liberals (42%). This is true even among minority groups, with more Latinos (65%) feeling afraid to speak out than whites (64%), and blacks (49%) close behind. 32% of people worry they would be fired if their political views became generally known, including 28% of Democrats and 38% of Republicans. Poor people and Hispanics were more likely to express this concern than rich people and whites, but people with post-graduate degrees have it worse than any other demographic group.

And the kicker is that these numbers are up almost ten percentage points from the last poll three years ago. The biggest decline in feeling safe was among “strong liberals”, who feel an entire 12 percentage points less safe expressing their opinion now than way back in the hoary old days of 2017. What happens in a world where this trend continues? Does everyone eventually feel so unsafe that we completely abandon the public square to professional-opinion-havers, talking heads allowed to pontificate because they have the backing of giant institutions? What biases does that introduce to the discussion?

Who are the real Shy Trumpers?
By Eric Kaufmann

Republican pollster Frank Luntz told Emily Maitlis that Trump voters were over twice as likely as Biden voters — by a 19 to 9 margin — to conceal their intended vote from others. I would expect this ratio to be considerably higher among university graduates, which would, accordingly, skew predictions the most among graduates.

Pollsters claim to have overcome this problem by comparing telephone and online surveys and finding no difference. Since online surveys are anonymous, they reason, a ‘shy Trump’ effect should reveal itself by comparing these two methods, and they find none.

However, we also know that people who internalise social norms often conceal their views in online surveys. The psychologist George Herbert Mead referred to people’s ‘generalized other’, a kind of mental peer group we carry around in our heads that sits in judgement upon us even if no one is actually watching. For instance, in a recent survey of North American academics, I found that just 23% of academics were willing to state they would discriminate against a Trump voter for a job, but the actual share when using a concealed technique called a ‘list experiment’ was 42%. Likewise, a 2010 study found that the share of white Americans willing to endorse zero immigration jumped from 39% to 60% when the question was concealed in a list, rather than asked openly.

There is also a problem of blowback among elite Republicans. Frank Luntz has also said that feedback from Trump-supporting respondents revealed considerable resentment towards pollsters, who were perceived as part of a media establishment out to misrepresent them. Indeed, studies show that using words like ‘racist’ to describe Trump or his policies tends to increase support for them among conservative respondents by provoking what is termed a ‘reactance’ effect. Knowing you are perceived as racist by elites for supporting Trump may make you less likely to answer a call from a survey firm you associate with that chastising elite.

Again, this perception is likely to be stronger among Trump-supporting graduates than Trump voters with lower education levels, who are less likely to circulate in politically-correct social environments. Research confirms that highly-educated white liberals have the most skewed perceptions of the actual views of Trump supporters, in part because their social circles tend to be politically homogeneous. The problem is worst among those most attentive to politics.

The Real Divide in America Is Between Political Junkies and Everyone Else
By Yanna Krupnikov and John Barry Ryan

… most Americans — upward of 80 percent to 85 percent — follow politics casually or not at all. Just 15 percent to 20 percent follow it closely (the people we call “deeply involved”): the group of people who monitor everything from covfefe to the politics of “Cuties.”

At the start of the year (i.e., pre-pandemic), we asked people to name the two most important issues facing the country. As expected, we found some clear partisan divides: For example, Republicans are more likely than Democrats to cite illegal immigration as an important issue.

But on a number of other issues, we found that Americans fall much less neatly into partisan camps. For example, Democrats and Republicans who don’t follow politics closely believe that low hourly wages are one of the most important problems facing the country. But for hard partisans, the issue barely registers.

Partisan Republicans were most likely to say drug abuse was the most important problem facing the country. But less-attentive Republicans ranked it second to last, and they were also concerned about the deficit and divisions between Democrats and Republicans.

Among Democrats, the political junkies think the influence of wealthy donors and interest groups is an urgent problem. But less-attentive Democrats are 25 percentage points more likely to name moral decline as an important problem facing the country — a problem partisan Democrats never even mention.

For partisans, politics is a morality play, a struggle of good versus evil. But most Americans just see two angry groups of people bickering over issues that may not always seem pressing or important.

How can politics better match the opinions of a majority of Americans? The fact is, it’s not an easy problem to solve. We can try to give the hardened partisans less voice in the news. Featuring people who exemplify partisan conflict and extremist ideas elevates their presence in politics (though of course by definition, it is the partisans who are most closely watching the news who are also most likely to give their opinions). This is particularly true of social media: What a vocal minority shares on social media is not the opinion of the public. Yet such political tweets, as the political communication scholar Shannon McGregor finds, are increasingly making their way into news coverage as stand-ins for public opinion.

Americans See Skepticism of News Media as Healthy, Say Public Trust in the Institution Can Improve
By Jeffrey Gottfried, Mason Walker and Amy Mitchell

Many Americans remain skeptical toward the news media, questioning not only the quality of journalists’ work but their intentions behind it. For instance, no more than half of U.S. adults have confidence in journalists to act in the best interests of the public, or think that other Americans have confidence in the institution. And the public is more likely than not to say that news organizations do not care about the people they report on.

While most Americans (61%) expect the news they get to be accurate, nearly seven-in-ten (69%) think news organizations generally try to cover up mistakes when they do happen.

The reasons why Americans think these mistakes happen underscore the distrust that substantial portions of the public feel: Many say that careless reporting (55%) or even a desire to mislead the public (44%) are major factors behind significant mistakes in news stories, although other, less negligent or nefarious reasons such as the rapid pace of breaking news (53%) also are seen as responsible for mistakes.

This raises the question: Where might there be opportunities for the news media to gain more trust? First and foremost, the survey finds that personal connections with news tie strongly to Americans’ views of the media overall, echoing earlier Pew Research Center findings at the local level. Americans who feel connected to news outlets – whether in feeling valued by, understood by or loyal to them – express far more positive views toward the news media. For instance, those who feel that their news sources value them are much more likely to expect their news to be accurate and to think news outlets are transparent with audiences.

According to the findings, there is plenty of room for improvement in this area: While most Americans want to have personal connections with their news sources, many do not experience it (again in line with previous Center findings on local news). More than half of U.S. adults say their news outlets do not particularly value them (57%) or that news organizations do not understand people like them (59%), and nearly two-thirds (63%) say they do not feel particularly loyal to the outlets they get their news from.

Americans’ personal connections with specific news stories also are linked with their attitudes toward the media. When Americans encounter news stories that hit close to home, they generally have good things to say about the media’s coverage. Roughly two-thirds of those who felt personally connected to a story – either because it covered an issue they believe they are an expert on, or because it was about a significant event that they experienced or witnessed – think that story was covered well. And those who feel this way express far more favorable views toward the news media in general than those who think the story was not covered well.

Three-quarters of Americans say it is possible for the public to increase its level of confidence in the news media, compared with about a quarter (24%) who say it is not possible. This view is largely shared by both major parties, as well as across demographic groups.

The view that confidence in the media could increase also is common among both those who say that, in an ideal world, the public would be trusting of the news media (85%) and those who say it is better for society to be skeptical of journalists (69%). This suggests that even if views of the news media become more positive, many still think it is important to maintain some level of healthy skepticism.
