Culture war games: the Midas disease

The Rise and Fall of Andrew Mellon
By Matt Stoller

Andrew Mellon had a quiet demeanor, rail-thin bearing, and beautifully manicured hands. His habit of taking long vacations, his age, his manners, and his soft-spoken shyness might have been mistaken for weakness and frailty in someone else. Mellon may have been born rich, but he was not soft. He was a hard man, a banker, an emperor of money, an owner of several companies later included in the Fortune 500. He would help lead the restoration of rule by private financiers.

President Warren G. Harding formally appointed Mellon under the pretense that a plutocrat so rich couldn’t be bought. The real reason was that a Mellon bank had lent $1.5 million to Harding’s campaign in 1920. Mellon had become bored with being a mere tycoon. As one of his enemies put it, “Mellon needed a change, and the Grand Old Party needed the cash.”

Mellon’s appointment was probably illegal. A statute from 1789 prohibits the treasury secretary from engaging in commerce or trade, an absurd expectation for a man with such industrial power. The founders had also written a law blocking the treasury secretary from holding bank stocks, another absurdity. Mellon overcame these legal restrictions by pretending to sell his assets to his brother. The rules existed for good reason: a man clothed in public power should not use that power for private ends. Yet Mellon did exactly that throughout the 1920s. Even before the election, he had explained to Harding the need to raise tariffs to protect domestic industrial monopolies, and Harding dutifully mentioned tariffs in his inaugural address.

Harding, so healthy at his inauguration, became consumed by corruption scandals and died within three years of taking office. Mellon, by contrast, would remain treasury secretary for eleven years, under three presidents. Or, as progressive senator George Norris put it in a common joke of the era, “three presidents served under him.” The decade might have started with Warren Harding’s presidential victory, but the political economy of the 1920s would be structured by Andrew Mellon.

Mellon was the perfect symbol of an administration hoping to return to the pre-1900 era. His life and career bridged the conservative robber baron politics of the nineteenth century and the increasingly large federal government structures of the twentieth. He was born just before the Civil War to a wealthy, austere father, Thomas Mellon, a judge and real estate developer. The family’s gloomy mansion sat in the tony East End of Pittsburgh, a city so thick with industrial smoke that it was described as “Hell with the lid off.”

Judge Mellon was deeply suspicious of democratic politics and of the lower classes asserting power. During the Civil War, he held no strong views on slavery, but the imposition of high taxes on the wealthy enraged him. Public schools drew his ire; he believed children would study harder if they had to pay. Labor unrest among the lower classes, he believed, needed to be met with violence, and might even “require blood to purify.”

Andrew, the smartest of the boys, inherited his father’s empire in his twenties. For most of his life, he kept a solitary routine. Rising early, he took the train to work, spent the day at the bank, lunched at a private club, and then brought documents home at night for study, “after a silent supper with his parents.” He read little, cared little for music or theater, and played no sports. Mellon grew neurotic, secretive, and soft-spoken, suspicious of taxes and the press.

Unlike other tycoons, he did not specialize in one area. At one point, five Fortune 500 companies owed their lineage directly to Mellon: Alcoa, Gulf Oil, Mellon Bank, Carborundum, and Koppers. He controlled a network of ninety-nine banks. He had interests in coal, steel, chemicals, oil, sleeping cars, railroads, building construction, utilities, magnesium, and airplanes.

The Mellon system was a set of industrial and financial enterprises that aided each other and had interlocking boards of directors and even personnel. Coal unearthed on Mellon lands would find its way into Mellon steel mills, which would help build Mellon ships to carry Mellon oil, all financed by Mellon banks. Being a part of the Mellon system meant customers, credit, financing, and prosperity, but also control. Being outside of it meant a constant battle with the Mellon interests.

World War I generated massive demand not only for aluminum and oil but for the chemicals of warfare, which Mellon companies made. Toluol, naphtha, benzol, and ammonia, as well as ships made by the Mellons’ New York Shipbuilding Company and armor made by Bethlehem Steel, sent rivers of cash back to the Mellon empire. By the end of the war, Andrew Mellon was an officer or director of more than sixty companies.

Mellon was also the “financial angel” of the Pennsylvania Republican Party, so powerful that when his ill-considered marriage fell apart in a scandalous split, he had the state legislature pass a law giving judges the right to deny women a trial by jury in divorce cases. Local newspapers, afraid of or in thrall to the Mellon family, reported little on the matter.

Throughout the 1920s, Mellon ran the Treasury Department, set tax and government debt policy, and sat as chairman of the Federal Reserve Board.

Many of Woodrow Wilson’s achievements offended Mellon, but none more than the income tax on the wealthy. For the eleven years he was at the Treasury, Mellon sought to reduce that tax any way he could. He pestered Congress to lower the top individual rates, to lower rates for corporations, and to end that most odious of taxes, the one on inheritances. That tax would have kept Mellon’s father from bequeathing Andrew the beginnings of an empire. Mellon won substantial reductions in the Republican Congress, but a combination of progressive Republicans and southern Democrats blocked him from a full victory.

When he couldn’t win through Congress, he could win through administration, and through his control of the Bureau of Internal Revenue, the forerunner of the Internal Revenue Service. Under Mellon, the Bureau of Internal Revenue changed the way it calculated tax liabilities incurred during World War I. As a result, billions of dollars of refunds, some to Mellon companies, flowed back to corporate America. The bureau was especially malleable in these years, because it had just started collecting income and corporate taxes. In 1916, Americans filed roughly 450,000 income tax returns. By 1921 the number had jumped to eight million. This surge allowed Mellon to decide a host of policy questions around accounting, as corporations demobilized factories and a suite of nationalized industries returned to private ownership. He would even set up a special tax court to interpret and make tax law.

Virtually every large corporation in the country received large rebates, including forty Mellon-affiliated companies or people. Mellon personally received a $400,000 tax refund, the largest awarded to a single individual. Gulf Oil got $3 million. Mellon even had men from the bureau preparing his own returns. These refunds achieved more than just cash in Mellon’s pocket. William Randolph Hearst, whose newspapers had decried Morgan’s spiderlike control years earlier, received $1.7 million of tax refunds. The Hearst papers were so grateful for Mellon’s financial wizardry that they talked up Mellon for the 1928 Republican nomination.

Mellon was a savvy bureaucratic infighter. In perhaps his most bitter feud of the era, with Republican senator James Couzens, the wealthiest member of the Senate, Mellon had the Bureau of Internal Revenue investigate Couzens and leak information about his tax returns. Few Democratic senators dared support Couzens because of the structure of the developing system for taxing corporate and personal income. Senators often had to ask the Bureau of Internal Revenue for decisions on technical questions, on behalf of constituents or corporations. As reporter Frank Kent put it, “not one of them knows when he will be forced to go there and ask for more. Almost any question can be decided by the bureau in three or four different ways—all legal. One of these ways saves a man or a firm a lot of money, and the other doesn’t.” Couzens later said, “Give me the control of the Internal Revenue Bureau and I will run the whole darned country … The Commissioner of the Bureau has the power to perpetuate a political party in power indefinitely … It is a power that no man should be allowed to exercise in secret.” And yet, Mellon did.

Mellon promoted his philosophy in a 1924 best-selling book called Taxation: The People’s Business. Anything that taxed the wealthy was full of “menace for the future,” threatening the very stability of society. He went further. “Our civilization,” he wrote, “is based on accumulated capital, and that capital is no less vital to our prosperity than is the extraordinary energy which has built up in this country the greatest material civilization the world has ever seen.”

Placing power in the hands of business seemed to work. After the brutal recession of the early 1920s, economic growth soared. The unemployment rate for 1925 dropped to 4 percent, on its way to a peacetime century low of 1.9 percent in 1926. A giant financial bubble was undergirding economic growth, but that was easy to overlook in the haze of prosperity and the continued spread of a new generation of industrial technologies.

“Never before, here or anywhere else,” wrote The Wall Street Journal, “has a government been so completely fused with business.” The Federal Trade Commission, created by Wilson, was in the Mellon years led by W. E. Humphrey, a man who proudly announced it would no longer serve as a “publicity bureau to spread socialist propaganda.”

It seemed like an endless sea of prosperity. Just not for everyone.

Corruption at the Core: Bill Moyers talks with Sarah Chayes
By BILLMOYERS.COM TEAM

SARAH CHAYES: … When you have the Midas disease it’s that you’re so infatuated with money that you convert everything that’s sacred, meaning our earth, the land, what’s on the land, what’s under the land, human creativity, human labor, love, relationships. You monetize all of that. And when a society is struck by the epidemic of Midas disease, what it means is that money becomes the chief source of social standing. And that’s a huge conversion. It’s no longer your honor, or your courage or your wisdom or your ability to play beautiful music. You’re not honored by society for that as much as you are honored for accumulating money. And the danger of that is, it’s a race with no finish line.

Billionaires made $5 trillion in the past year—and their wealth is growing at an ‘unprecedented’ rate
By Nicolas Vega

The world’s billionaires are accumulating wealth at a rate “unprecedented in human history,” according to a recent report from Oxfam International.

The charitable organization’s “Inequality Kills” report found that the planet’s 2,755 billionaires saw their cumulative wealth increase by $5 trillion — a sum greater than the market caps of Apple and Amazon combined — since March 2021, from $8.6 trillion to $13.8 trillion.

“Billionaire wealth has grown more since the pandemic began than it has in the last 14 years,” the report says. That growth has been fueled by government intervention around the world, which has propped up economies and driven up stock prices, the report says.

The growth in wealth has been so great over such a short period of time that “wealth concentration at the very top now surpasses the peak of the Gilded Age of the late 19th century,” Oxfam says.

Indeed, since 1995 the richest 1% of people in the world have seen 19 times more wealth growth than the bottom 50%, the report says.

The 50 Richest Americans Are Worth as Much as the Poorest 165 Million
By Ben Steverman and Alexandre Tanzi

While the top 1% of Americans have a combined net worth of $34.2 trillion, the poorest 50% — about 165 million people — hold just $2.08 trillion, or 1.9% of all household wealth.

The 50 richest people in the country, meanwhile, are worth almost $2 trillion, according to the Bloomberg Billionaires Index, up $339 billion from the beginning of 2020.

The bottom 90%’s exposure to the stock market has been dropping for almost two decades. After peaking at 21.4% in 2002, upper-middle-class Americans’ equity interest in companies has declined by about 10 percentage points. A similar pattern is seen among the bottom half.

The wealthiest 1% own more than 50% of the equity in corporations and in mutual fund shares, the Fed data show. The next 9% of the wealthiest own more than a third of equity positions — meaning that the top 10% of Americans hold more than 88% of shares.

The Fed data also show that the Millennial generation, born between 1981 and 1996, controls just 4.6% of U.S. wealth even though it is the largest generation in the workforce, with 72 million members. And the share of the pie held by Black Americans is the same size it was 30 years ago.

Baby Boomers hold the majority of U.S. wealth, with $59.6 trillion, twice Generation X’s $28.5 trillion and more than 10 times Millennials’ $5.2 trillion.

It’s not unusual for younger age groups to be significantly poorer than their elders. Even so, Millennials remain far behind where previous generations were at the same age. In 1989, when the median Boomer was 34, the generation controlled more than 21% of U.S. wealth. To match that, Millennials, with a median age of 32 now, will need to quadruple their wealth share over the next couple of years.

The Fed estimates the top 10% of U.S. households hold 69% of the country’s wealth, or $77.3 trillion, up from a 60.9% share at the end of the 1980s. The very richest Americans are almost entirely responsible for that gain. The top 1% held 30.5% of U.S. wealth in June, up from 23.7% in late 1989. The bottom half’s share, meanwhile, has fallen from 3.6% to 1.9%.

The billionaire boom: how the super-rich soaked up Covid cash
By Ruchir Sharma

Over the past two decades, as the global population of billionaires rose more than fivefold and the largest fortunes rocketed past $100bn, I started tracking this wealth. Not for the voyeuristic thrill, but for warning signs. Rising inequality was becoming ever more of a political issue, threatening to provoke popular backlashes against capitalism itself.

The pandemic has reinforced this trend. As the virus spread, central banks injected $9tn into economies worldwide, aiming to keep the world economy afloat. Much of that stimulus has gone into financial markets, and from there into the net worth of the ultra-rich. The total wealth of billionaires worldwide rose by $5tn to $13tn in 12 months, the most dramatic surge ever registered on the annual billionaire list compiled by Forbes magazine.

The billionaire population boomed last year as well. On the 2021 Forbes list, which runs to April 6, their numbers rose by nearly 700 to a record total of more than 2,700. The biggest surge came in China, which added 238 billionaires — one every 36 hours — for a total of 626. Next came the US, which added 110 for a total of 724. The top 10 gainers in the US and China each saw already vast fortunes grow in just one year by sums that not long ago would have seemed impossible in a lifetime: from $25bn to more than $150bn for Tesla founder Elon Musk.

Built on the premise that anyone can get fabulously rich, the US has rarely been inclined to target those who realise the dream. Only in periods of extreme inequality, such as the robber baron age of the early 20th century, were magnates such as John D Rockefeller targeted as public enemies. Despite talk of a new gilded age, until recently top tycoons were more likely to be celebrated than vilified. It helped greatly that many were self-made entrepreneurs, philanthropists or — such as Bill Gates and Warren Buffett — both.

Still, the scale of mega-billionaire fortunes, and their growing number, is bringing down political wrath on the entire class, no matter their accomplishments or contributions. Heads and founders of the American tech giants have been hauled before Congress to defend themselves, cast in the role of grasping, all-powerful monopolists. In 2016, Bernie Sanders was willing to concede that there were some “great billionaires”, citing Gates. Four years later he was hammering the fact that the bottom half of American families had a combined net worth lower than that of the top three billionaires — at the time, Gates, Bezos and Buffett. Increasingly, big is synonymous in the American political mind with bad, and it is likely no coincidence that soak-the-wealthy tax proposals from progressives have strong support, even among some Republicans.

The moral calculations of a billionaire
By Eli Saslow

He’d been earning more than his family could spend since about 1975, and in the decades since then he’d come to see the act of making money less as a personal necessity than as a serious game he could play and win. He invested it, traded it, lent it, gave it away and watched each day as the accounts continued to grow beyond his needs, his wants and sometimes even his own comprehension.

“I don’t want to say it’s all play money at this point, but what else could I possibly spend it on?” he sometimes wondered. His wife’s walk-in closet was already bigger than the South Bronx apartment where he’d grown up. Their Florida home had a custom-built infinity pool, and in five years he’d never once gone in for a swim.

The past year had been the best time in history to be one of America’s 745 billionaires, whose cumulative wealth has grown by an estimated 70 percent since the beginning of the pandemic even as tens of millions of low-wage workers have lost their jobs or their homes. Together, those 745 billionaires are now worth more than the bottom 60 percent of American households combined, and each day Cooperman could see that gap widening on his balance sheet — up an average of $4,788 per minute in the stock market, $1.9 million per day and $700 million total in 2021.

… What exactly had he done wrong? What rule had he broken? He’d been born to poor immigrant parents on the losing end of a capitalist economy. He’d attended public schools, taken on debt to become the first in his family to attend college, worked 80-hour weeks, made smart decisions, benefited from some good luck, amassed a fortune for himself and for his clients and paid hundreds of millions in taxes to the government. He had a wife of 57 years, two successful children, and three grandchildren who were helping him decide how to give most of his money away to a long list of charities. “My life is the story of the American Dream,” he’d said while accepting an award at one charity gala, and he’d always imagined himself as the rags-to-riches hero, only to now find himself cast as the greedy villain in a story of economic inequality run amok.

He knew what people imagined when they thought of a billionaire. He’d read the stories of excess and extravagance and witnessed some of it firsthand, but that wasn’t him. He didn’t spend $238 million on a New York penthouse like hedge fund manager Ken Griffin; or vacation at his own private island in Belize like Bill Gates; or throw himself $10 million birthday parties featuring camels and acrobats like investor Stephen Schwarzman; or drop $70,000 a year on hair care like Donald Trump; or buy a preserved 14-foot shark for an estimated $8 million like Steven Cohen; or spend more than $1 billion on art like media mogul David Geffen; or budget $23 million for personal security like Facebook did for Mark Zuckerberg.

He didn’t have his own spaceships like Elon Musk and Jeff Bezos; or a 600-foot flying airship like Sergey Brin; or a decommissioned Soviet fighter jet like Larry Ellison; or a $215 million yacht with a helipad and a pool like Steve Wynn; or a private train with three staterooms like John Paul DeJoria; or a $5 million luxury car collection like Kylie Jenner.

His father left behind an estate worth less than $100,000, but Cooperman also inherited his father’s belief that the economic ladder between poor and rich was short enough to climb with determination and hard work. More than 90 percent of children born in the United States during the 1940s would go on to out-earn their parents; two-thirds of those born into poverty would rise into at least the middle class. Cooperman waited tables during the summers, worked for Xerox while he went to business school at night and then started as an analyst at Goldman Sachs making $12,500 a year. “My PhD is for poor, hungry and driven,” he liked to say. He told colleagues that capitalism was like a battle for survival in the African safari and that the key to success was to adopt the mind-set of a lion or a gazelle during a hunt. “When the sun comes up, you’d better be running,” Cooperman told them. Within nine years, he’d been named a partner. Within a decade, he was a millionaire.

Cooperman eventually left Goldman Sachs to start his own hedge fund, Omega, and for two decades he compounded his millions at an average of 14 percent each year as the stock market soared, until he and Toby were among the wealthiest few hundred billionaires in the United States. They were invited to dinner in 2010 by Gates and Warren Buffett, who had just started a program called the Giving Pledge, asking billionaires to donate at least half of their money to charity, and the Coopermans committed that night.

“I could buy a Picasso for a hundred million, but it doesn’t turn me on, so then what?” Cooperman told them. “We live a very rational lifestyle. What better use is there for our money?”

They’d given away $150 million to a hospital in New Jersey, $50 million for college scholarships to Newark high school students, $40 million to Columbia Business School, $40 million to Hunter College, $30 million to performing arts, $25 million to the Jewish Family Fund, $20 million to skilled nursing, $15 million to food banks, and on and on it went. But no matter how much they gave away, their money continued to make more money even as wages for the middle class remained essentially flat. In the past 50 years, the gap between poor families and the top 0.1 percent had increased more than tenfold. Children now had only a 43 percent chance of out-earning their parents.

He donated to more than 50 organizations each year and also to a number of people who wrote to him in personal distress. “Other than my family, writing checks is the most meaningful thing I do,” Cooperman said, and yet no matter how many zeros he included, it left him wanting to do more. “We’re going in the wrong direction in this country in so many depressing ways,” he said. He believed in the meritocratic ideal of capitalism — “equal opportunity if not equal results,” he said — but it seemed to him that the odds of success remained stacked by race, by gender and increasingly by economic starting position. Rates of intergenerational poverty had gone up in each of the past three decades. The most disadvantaged children were falling further behind. He believed from his own experience that a college education was the best answer, and yet tuition costs were continuing to skyrocket.

“It’s not exactly a fair system until you even up the odds,” he said, and after looking over the list of worthy causes, he and Toby had decided that donating half of their money didn’t feel sufficient. Sixty percent wasn’t enough to meet the country’s needs. Neither was 75. So they’d agreed to set up a family foundation that would eventually give away more than 90 percent of their money, and Cooperman had decided that rather than retiring in earnest, he would continue to manage their account so there would be more to give away.

“He who dies rich dies disgraced,” read a quotation attributed to Andrew Carnegie on Cooperman’s office desk, but on this day he was still rich and getting richer. “What’s enough?” he wondered. “What’s the answer?” He checked the stock graph on his screen — up $2.6 million in the past five hours. His accounts were equal to the average net worth of 23,000 middle-class American families.

He and Toby had spent almost all of their time during the pandemic within the gates of St. Andrews, eating dinner outside at the clubhouse and playing cards with friends, but every few days they liked to go for a drive. Once, early in the pandemic, they’d driven to a quiet, nearby park only to find more than 150 cars lined up in the parking lot as people waited for bags of canned goods at an impromptu food bank. “Depressing and staggering in a country of such wealth,” Cooperman said, and it made him remember a poem his granddaughter had written and published when she was in middle school, called “Seven Miles,” about the physical proximity between the extreme wealth of Short Hills, N.J., where Cooperman had his other home, and the extreme poverty in nearby Newark. “At one end we have too much,” she’d written. “At the other, they have nothing. Spread it all just seven miles.”

A professor said her students think Americans make six figures on average. That’s a long way off.
By Timothy Bella

The question asked by Nina Strohminger to her students at the Wharton School at the University of Pennsylvania was straightforward: What did they think the average American makes in annual wages?

“I asked Wharton students what they thought the average American worker makes per year and 25% of them thought it was over six figures,” she tweeted late Wednesday. “One of them thought it was $800k.”

According to the Social Security Administration, the average U.S. annual wage last year actually was $53,383, with the median wage at $34,612. The Labor Department reported that median weekly earnings in the fourth quarter of last year were $1,010, which comes out to an annual wage of $52,520, according to MarketWatch.

The professor’s tweet comes as the latest surge of coronavirus cases fueled by the highly transmissible omicron variant has exacerbated the country’s persistent labor shortages and potentially complicated the labor market’s push toward pre-pandemic employment levels. Approximately 8.8 million workers reported not working between Dec. 29 and Jan. 10 because they were infected with the coronavirus or caring for someone who had it, according to data from the Census Bureau.

There’s also the concern surrounding rising inflation and the direct impact it has on Americans’ wallets. While wages are rising, unemployment is low, and the stock market is healthy, 2021 was the worst year for inflation since 1982, according to a report released by the Bureau of Labor Statistics last week. Inflation, which is driven in part by supply chain issues and shortages overseas, is wiping out wage gains made by many workers last year, and has caused prices to increase 7 percent over the 12-month period that ended in December.

Meanwhile, the question surrounding what is considered middle class in the United States has popped up in recent years. A majority of Americans consider themselves to be middle class, but many are still having a hard time figuring out what it means. The Post calculated in 2017 that the country’s middle class ranges from $35,000 to $122,500 in annual household income.

“The bottom line is: $100,000 is on the middle-class spectrum, but barely,” The Post’s Heather Long wrote at the time, adding that “75 percent of U.S. households make less than that.”

Among the thousands who responded to the Wharton tweet was Stefanie Stantcheva, a professor of economics at Harvard University. Stantcheva co-authored a paper in 2020 that looked at how well people understand their social position relative to others in society. One of her findings, she tweeted, was that “what you think others make very much depends on your own income.”

“Lower income people think everyone else is lower income too,” she wrote. “Rich people think everyone else is richer too.”

The Wharton School was ranked as the second-best business school in the nation this year by U.S. News and World Report. Tuition for the school is about $80,000 a year. It was also noted on Twitter that the average annual income for those living in West Philadelphia, where Wharton is located, is about $34,000.

The American economy is perilously fragile. Concentration of wealth is to blame
By Robert Reich

Seventy per cent of the US economy depends on consumer spending. But wealthy people, who now own more of the economy than at any time since the 1920s, spend only a small percentage of their incomes. Lower-income people, who were in trouble even before the pandemic, spend whatever they have – which has become very little.

The result has been consumer spending financed by borrowing, creating chronic fragility. After the housing and financial bubbles burst in 2008, we avoided another Great Depression only because the government pumped enough money into the system to maintain demand, and the Fed kept interest rates near zero.

The wealth imbalance is now more extreme than it’s been in over a century. There’s so much wealth at the top that the prices of luxury items of all kinds are soaring; so-called “non-fungible tokens”, ranging from art and music to tacos and toilet paper, are selling like 17th-century exotic Dutch tulips; cryptocurrencies have taken off; and stock market values have continued to rise even through the pandemic.

Corporations don’t know what to do with all their cash. Trillions of dollars are sitting idle on their balance sheets. The biggest firms have been feasting off the Fed’s corporate welfare, as the central bank obligingly holds corporate bonds that the firms issued before the recession in order to finance stock buybacks.

But most people have few if any assets. Even by 2018, when the economy appeared strong, 40% of Americans had negative net incomes and were borrowing money to pay for basic household needs.

America’s Killer Capitalism
By Anne Case and Angus Deaton

A great failure of contemporary American capitalism is that it is not serving everyone. The educated minority – the one-third of the adult population with a four-year college degree – has prospered, but the majority has lost out, not just relatively but absolutely. The facts are increasingly clear and hard to ignore. Less-educated Americans’ prospects are getting worse: they are losing materially, they are enduring more pain and social isolation, and their lives are getting shorter.

After 1970, the engine of American progress began to falter. From the early 1980s onward, economic growth slowed, and what was once a largely equal distribution of gains became increasingly top-heavy. Economists Thomas Piketty and Emmanuel Saez’s important work with US tax records shows just how well those at the very top have done.

While many commentators with alternative calculations have questioned the extent of rising income inequality, none has succeeded in dismissing the trend. Others argue that it is not a cause for concern, provided that everyone is prospering. For them, the evidence of falling material measures is a more serious challenge. Among men without a college degree, real (inflation-adjusted) median wages have undergone a trend decline for more than 50 years – experiencing interruptions during economic booms, but never recovering enough to return to the previous peak. Even at the height of the boom just before the COVID-19 pandemic, median wages were lower than at any point in the 1980s.

The critics argue that these data exclude various worker benefits, such as employer-provided health insurance. Yet the extraordinary increase in the cost of these benefits is itself contributing to the wage decline and to the destruction of jobs for the less skilled. Including these benefits in the analysis is like mugging someone and charging him for the cost of the attack.

Our findings on “deaths of despair” put another dent in the argument that working Americans are flourishing despite material evidence to the contrary. Death is a lot easier to measure than real income. In 1992, life expectancy at age 25 was two and a half years longer for men and women with a college degree than for those without one. By 2019, the gap had grown to six and a half years; from 2010 to 2018, life expectancy at 25 fell every year for those without a degree.

Accidental drug overdoses are an important part of the story. More than half of the increase in deaths of despair since the late 1980s came from overdoses. In the late 1980s, about 60,000 people in the United States died each year from drugs, alcohol, and suicide. Just prior to the pandemic, annual deaths of despair were running at 170,000 – an increase of more than 100,000 per year – with overdose deaths accounting for the largest share, but less than half of the total. With a little more than half of the increase due to drug overdoses, perhaps it is true that the US is suffering an epidemic not of despair but of drug overdoses. That is something that societies throughout history have had to endure, and will no doubt have to endure again.

Drug epidemics are not like plagues of locusts or earthquakes. They afflict societies that are already in trouble. Consider China in the 1840s. Nothing can excuse the depredations of the Scottish opium merchants William Jardine and James Matheson, nor British Prime Minister Lord Melbourne’s decision to send in the navy to support them. But there is little doubt that the advanced disintegration of the Qing Empire was a precondition for the opioid epidemic that followed.

In the US case, the most important previous opioid epidemic came during and after the Civil War. And on a smaller scale, there was widespread use of opium and heroin by American troops in Vietnam. Most of these addictions disappeared when the soldiers returned from being bored out of their minds half a world away to lead well-supported, meaningful lives at home. The fact that the current surge in drug deaths is concentrated almost entirely among those without a college degree tells us that, as in nineteenth-century China, despair and disintegration were preconditions that gave the dealers the foothold they needed. (And if the Sackler family gets to keep $4 billion of its ill-gotten gains from manufacturing OxyContin, and if no one goes to jail, there will surely be a repeat episode.)

Perhaps most telling of all is what has happened to suicide rates. Whereas the fin de siècle French sociologist Émile Durkheim thought that educated people were more likely to kill themselves, suicide rates in the US today are higher among those without a bachelor’s degree.

By contrast, world suicide rates have been falling for the last two decades, including in the European Union and other high-income countries. Even Japan and Finland, rich countries long troubled by suicide, now have lower rates than the US. There have also been particularly rapid declines in Russia – with rates cut by half since 2000 – and in the other countries of the former Soviet Union. While Russia still has a higher suicide rate than the US, the US is coming to resemble erstwhile suicide hot spots.

Rising suicide rates are hardly marks of a flourishing capitalist democracy. There is nothing wrong with capitalism in principle, but there is a great deal wrong with the version prevailing in the US today.

A college degree is now ‘a matter of life and death,’ says this Nobel Prize winner
By Howard Gold

MarketWatch: Over the past decade, you have been studying “deaths of despair” among blue-collar, non-college-educated Americans. That’s a term your wife and collaborator Anne Case coined. Would you please define it and discuss the trends you have seen up to the beginning of the pandemic?

Deaton: We found there were three causes of death that were rising quite rapidly. One of them was drug overdoses. Another one was suicide. And the third one was alcoholic liver disease. And perhaps the most stunning thing about that increase in deaths was that it was almost entirely confined to people without a four-year college degree.

So, the real important cutoff here is between having a bachelor’s degree and not. When we first wrote about it in 2015, it was confined to white non-Hispanics, but after about 2013, it moved into the African-American community, too.

MarketWatch: Mortality among African-Americans is higher in a lot of different ways. But you’ve also seen the gap between African-Americans who have college degrees and those who don’t.

Deaton: There’s a gap between African-Americans and white non-Hispanics wherever you look. And it’s true for people with a BA and for people who don’t have a BA. But it’s been closing at the same time that the gaps between people with and without a BA have been expanding for both Blacks and whites.

So the biggest gaps now are between people who do and do not have a BA. The gaps between whites and Blacks have been narrowing, though that slowed down over the last few years. And it will slow down a good deal more during COVID.

MarketWatch: You said in your paper that these trends are unique to the United States. You don’t really see them in any other developed country, is that correct?

Deaton: That’s right.

MarketWatch: We’ll get back to drugs in a bit. According to the Georgetown Center on Education and the Workforce, a four-year college graduate earns on average 84% more over a lifetime than someone who has only a high school diploma. But you write that a college degree has become “a matter of life and death.” Is it that grim?

Deaton: We already talked about the fact that death rates are going up for people without a BA and going down for people with a BA. That sounds like a matter of life and death to me. But it’s also true that if you fail your BA, they don’t take you out and shoot you.

So I think that statement is correct and also, that the college premium has become enormous. And again, the troublesome thing is not so much the people at the top are doing really well, but the median wage for men without a BA has fallen for 50 years.

MarketWatch: What has changed in America since the 1970s that has led to this horrible situation in so many parts of the country of drug addiction, suicide, and early death?

Deaton: The real bad guys are the people who are pushing drugs, sometimes illegally and sometimes legally. That was one of the great scandals in the U.S., that essentially heroin with an FDA label on (it’s called OxyContin) was pumped out in enormous numbers and individual members of Congress, as we described in “Deaths of Despair,” blocked enforcement efforts to stop this.

These are very well-financed people getting rich by addicting the population. The Sackler family alone made its $13 billion [fortune primarily from OxyContin]. Other countries do not allow that. They use OxyContin in Germany and Britain and France, but it is not pumped out into the general population like that.

But on the other hand, Anne and I would argue that there was just a lot of pain out there, there was a lot of despair, there was a lot of unhappiness, which made fertile ground for evildoers to come in and unleash this wave of addiction and death.

So, what are the underlying things that have made life so difficult over the last 50 years for people without a BA? The obvious candidates are globalization and automation and the lack of good jobs with good corporations for less-educated people. Now, globalization and automation can’t explain that by themselves, because you have those in every country in the world and you don’t get deaths of despair in those other countries.

But the big story, which is what Anne and I write about in our book, is the incredible cost of healthcare coupled with the way it’s funded. One dollar in five in America goes to supporting this obscenely swollen industry. And then we’re financing most of it off less-educated workers. That premium per family is now over $20,000 a year. If you take that over a 2,000-hour work year, it’s $10 an hour that has to be met by the firm. And it either has to come out of profits or out of wages. And that destroys jobs.

MarketWatch: You mentioned in your book the work of William Julius Wilson, who studied what happened to the Black community on the South Side of Chicago when the steel mills shut down. The social breakdown that followed (including declining marriage rates and increased drug use) also happened 20 years later in non-college-educated white working-class communities.

Deaton: Absolutely. When we went to the White House after I won the Nobel Prize, President Obama suggested to us that we explore exactly that parallel, which we did in a whole chapter in the book.

When these jobs go away, it destroys social life and communities come apart and people don’t go to church anymore. It sounds just like what William Julius Wilson described. It is.

MarketWatch: So how is this all connected with politics? You write in so many words that the Democratic Party divorced non-college-educated, blue-collar whites.

Deaton: Absolutely. We’re not the first to say that. After 1968, the Democrats turned themselves into a party of the educated elite and minorities. And Republicans have been more and more representing these less-educated people. Of course, Republicans are also very dependent on financing from corporations and from the rich, and that alliance doesn’t work very well. And so you have maybe half the population just being shut out of politics altogether, whose voice is not heard in Washington.

How Long Should It Take to Give Away Millions?
By Nicholas Kulish

… While the most vulnerable Americans were forced to line up outside food banks, the share prices of publicly traded companies climbed ever higher. Yet the charities and nonprofits that helped care for the children of frontline medical workers and brought clean diapers to the poor were forced to lay off staff.

“Philanthropy is where wealth inequality is playing out in the public realm,” said Ray Madoff, a law professor at Boston College and one of a group of people backing a push to rein in donor-advised funds. “When the super wealthy claim charitable tax benefits, they are supposed to be putting their money to use for the benefit of society at large. The rules we set down about that are incredibly important at a time when there are more and more super wealthy and greater and greater needs of society.”

Ms. Madoff and others pushing for change see a growing gap between reputation-burnishing promises of money and distributions to people who need it. The Giving Pledge, which was started by Bill Gates, Melinda French Gates and their friend and collaborator Warren E. Buffett, gave billionaires a space where they could announce their intention to give away half their fortunes or more, often to great acclaim. But it provides no mechanism to monitor or ensure the giving actually happens.

Charitable giving in the U.S. reaches all-time high in 2020
By Haleluya Hadero

Galvanized by the racial justice protests and the coronavirus pandemic, charitable giving in the United States reached a record $471 billion in 2020, according to a report released Tuesday that offers a comprehensive look at American philanthropy.

The Giving USA report says Americans gave more to charity last year than in 2019, despite an economic downturn that disrupted the paychecks of millions. Faced with greater needs, estates and foundations also opened up their pocketbooks at increased levels — resulting in a 5.1% spike in total giving from the $448 billion recorded for 2019, or a 3.8% jump when adjusted for inflation.

Giving by individuals, which made up a majority of the donations last year, rose by approximately 2%. The biggest uptick came from foundations, which increased their giving by 17%, to an estimated $88.5 billion in contributions. Those donations made up about 19% of total contributions, the largest share ever to come from foundations.

Most nonprofit categories experienced a boost in contributions, with the strongest growth seen among public-society benefit groups, which include civil rights organizations and other charities, like United Way. The nearly 16% increase in contributions also included donations given to donor-advised funds, which are like charitable investment accounts and draw strong opinions across the philanthropic world.

Donor-advised funds are often criticized because they aren’t required to make donations in any given year, but donors are able to take immediate tax deductions before charities get any of the money. A Congressional bill, which follows a plan put forth by billionaire philanthropist John Arnold and others to accelerate the payouts, is now aiming to change that.

Leading Foundations Pledge to Give More, Hoping to Upend Philanthropy
By James B. Stewart and Nicholas Kulish

In 2019, the Ford Foundation handed out $520 million in grants. Darren Walker, the foundation’s president, quickly realized that was not going to be anywhere near enough in this crisis-engulfed year.

His solution: Borrow money, spend it quickly and inspire others to follow Ford’s lead.

The Ford Foundation plans to announce on Thursday that it will borrow $1 billion so that it can dramatically increase the amount of money it distributes. To raise the money, the foundation — one of the country’s iconic and oldest charitable organizations — is preparing to issue a combination of 30- and 50-year bonds, a financial maneuver common among governments and companies but extremely rare among nonprofits.

Four other leading charitable foundations will pledge on Thursday that they will join with Ford and increase their giving by at least $725 million.

The decision by the five influential foundations — major sponsors of social justice organizations, museums and the arts and environmental causes — could shatter the charitable world’s deeply entrenched tradition of fiscal restraint during periods of economic hardship. That conservatism has provoked anger that foundations, which benefit from generous federal tax breaks, are hoarding billions of dollars during a national emergency, more interested in safeguarding their endowments than in helping those in need.

The Ford Foundation, which has a $13.7 billion endowment, plans to distribute the newly raised money over the next two years, effectively increasing the percentage of its endowment that it pays out annually to about 10 percent from nearly 6 percent.

The four other foundations are among America’s most storied: the John D. and Catherine T. MacArthur Foundation; the W.K. Kellogg Foundation; the Andrew W. Mellon Foundation; and the Doris Duke Charitable Foundation. The MacArthur and Doris Duke foundations plan to issue bonds. Mellon and Kellogg are still working out their financing plans.

Major charitable foundations traditionally spend a little more than 5 percent of their assets in a given year — the minimum required under federal law for the tax-exempt organizations. The less they distribute, the larger their endowments grow and the higher their odds of surviving in perpetuity.

The Ford-led plan provides a workaround. By using borrowed money, foundations would go into debt but wouldn’t erode their endowments. While foundations have issued bonds in the past to finance projects like building new headquarters, bankers said it was virtually unheard-of for them to borrow money that they plan to distribute.

With 300 years of experience between them and fortunes forged in insurance and tobacco, automobiles and banking, the five foundations carry the cachet of America’s unofficial aristocracy. They are closely watched trend setters in the philanthropic community.

There are more than 100,000 private foundations in the United States, and they are together sitting on endowments worth nearly $1 trillion, according to Candid, a group that tracks nonprofits and foundations. Some — including the Bill and Melinda Gates Foundation — already pay out a substantial chunk of their endowments every year. But many more hover near the legal floor, on average distributing roughly 7 percent of their funds annually.

An increasingly vocal network of charitable figures, lawmakers and others wants to pry that piggy bank open. The Patriotic Millionaires — a group of about 200 wealthy individuals, including the Disney heiress Abigail Disney, that pushes for higher taxes on the rich — and the left-leaning Institute for Policy Studies have been pressing Congress to force foundations to double their required payout to 10 percent of their assets for the next three years.

That would generate about $200 billion in additional payouts, the Institute for Policy Studies estimated.

“I’ve been appalled for years how many foundations treat the 5 percent federal floor as a ceiling and refuse to spend a penny more than they are required to,” said Scott Wallace of the Wallace Global Fund, whose family foundation plans to give away one-fifth of its $120 million endowment this year.

“If in our hour of greatest need, America’s greatest crisis in generations, philanthropies are planning to spend less, then they need a big kick in the butt,” added Mr. Wallace, a grandson of the progressive Henry A. Wallace, who was President Franklin D. Roosevelt’s vice president from 1941 to 1945. “Only Congress can deliver that kick.”

Thousands of nonprofits — from community theaters to food pantries to small rural hospitals — are fighting for their survival. A study released this week by the Center for Effective Philanthropy found that 90 percent of nonprofits surveyed had to cancel or postpone fund-raising events and 81 percent had to reduce programs or services. At the same time, more than half said that demand for their services had increased.

Many of those nonprofits rely on foundations as a major source of funding.

The Dangers of Relying on Philanthropists During Pandemics
By Rob Reich and Mohit Mookim

Big philanthropy does have a role to play … . Its distinctive and essential function is to serve as the risk capital for a democratic society, directing resources to fund experiments and discover solutions to social problems that neither the market nor government is well-suited to do. The paradigmatic example is Andrew Carnegie’s funding of public libraries, an experiment in cultivating an educated citizenry.

Philanthropy’s unique advantage is its role in improving outcomes in the long-term, and handing off its successes to the government to bring to scale for all citizens. Philanthropy is not a stopgap for the failure to provide basic services and public goods.

Public health is a paradigmatic public good. We should never be dependent on the whims of wealthy donors—as philanthropy is increasingly dominated by the wealthy—for our collective health and well-being.

That would be a betrayal of democracy. Rather than democratic processes determining our collective needs and how to address them, the wealthy would decide for us. We wanted rule by the many; we may get rule by the rich.

The coronavirus pandemic presents us with an immediate need for a response and it reminds us of the importance to invest so that we avoid preventable disasters in the future. At the moment, it’s all hands on deck for the emergency. But this is not what big philanthropy is built for. Or what it can sustain. The richest country in the world must step up to fund public health rather than relying on the richest people in the world to do it piecemeal.

Updated 3/23/2020 11:35 am: This piece has been updated to note that Reich is the faculty codirector of The Stanford Center on Philanthropy and Civil Society, which has received grants from the Gates Foundation.

Woke Inc.
By Michael Lind

Today’s Silicon Valley and Wall Street tycoons who endow foundations bearing their names—Gates, Bloomberg, Milken—are little different from Andrew Carnegie and John D. Rockefeller a century ago.

In his 1889 essay “The Gospel of Wealth,” Andrew Carnegie, the immigrant Scottish American steel magnate who built his fortune on brutal treatment of American workers and the suppression of organized labor, declared that society was better off when the benevolent rich were allowed to spend their fortunes on charitable causes than when they were taxed to pay for public spending directed by politicians. Recently Bill Gates made a nearly identical argument, claiming that governments are so incompetent that large-scale spending decisions are best entrusted to enlightened billionaires like himself: “Philanthropy is there because the government is not very innovative, doesn’t try risky things and particularly people with a private-sector background—in terms of measurement, picking great teams of people to try out new approaches. Philanthropy does that.”

For a century labor leaders and populists have replied to such self-serving arguments by asking arrogant plutocrats like Carnegie and Gates an obvious question: “If you have enough money to endow a foundation, why don’t you pay your employees more?”

Theodore Roosevelt, who despised Carnegie, observed privately: “All the suffering from Spanish war comes far short of the suffering, preventable and non-preventable, among the operators of the Carnegie steel works, and among the small investors, during the time that Carnegie was making his fortune …”

Hate the Donor, Love the Donation
By Cass R. Sunstein

Suppose that a nation, a company or an individual wants to give a lot of money to a university, a nonprofit group or an individual researcher. Suppose that many people think that the potential donor is morally abhorrent, or has done morally abhorrent things.

Is it wrong to take the money?

A lot of real-world cases raise this difficult question.

Loosely analogizing from those cases, here are some scenarios that capture what I have in mind:

  1. A tobacco company proposes to give money to Princeton University, to support scholarships for poor students.
  2. A big oil company proposes to give money to a climate researcher, to study the economic effects of climate change.
  3. China proposes to give money to Johns Hopkins University, to support a new center studying road safety.
  4. Saudi Arabia proposes to give money to a health economist, to finance research focused on obesity, diabetes and related illnesses.

Reasonable people will disagree about how to evaluate the donors in these cases. For example, some people think that big oil companies are pretty terrible; others have no problem with them. No country is all good or all bad. Let’s assume, for discussion, that in all four cases, the donor either is morally abhorrent or at least has done morally abhorrent things.

Consider a simple rule: Just take the money.

The argument would be that if the money would be used for a good cause, it’s a lot better to have it than to decline it. If its source is morally abhorrent, the rule still holds. Isn’t it better if the money is used to help people, even to save lives?

An initial response would point to the possibility that a donor might impose unacceptable conditions. Suppose that an oil company says that it is eager to make a donation to a climate change researcher — but only if it reserves the right to approve publication of any findings before they become public. If the company means to censor materials that do not fit with its interests, the researcher should refuse the money.

Or suppose that China proposes to give money to a university on one condition: The university agrees not to allow members of its faculty to say anything negative about the Chinese government. That’s a deal-breaker.

We should be able to agree that some conditions are unacceptable, even if others are fine (such as an annual reporting requirement, showing how the money has been used) — and even if others can be debated (such as a requirement that any publications be shown to the donor in advance, without any right of approval).

Refusing to accept money is a way of putting a price on wrongdoing. It is an effort to prevent wrongdoers from getting the reputational benefit that they seek. It is a little like the movement to get universities to divest South African assets during the apartheid era — or the current movement to get them to divest from oil companies.

In some circumstances, the objection is convincing. In others, it really isn’t. There’s usually some weighing to be done.

How abhorrent is the donor? How, and by how much, would the donor benefit from the donation? Might it be beneficial to engage with the donor? Would the money be put to important uses? Would it improve public health? Would it save lives? (If so, refusing to accept the money might be a moral wrong.)

Clear principles might eventually emerge from asking questions of this kind. For now, let’s keep it simple. If a donor imposes no strings of any kind, and if the donation could be used to help real people in really significant ways, here’s a rebuttable presumption: Take the money.

Jeffrey Epstein Gave $850,000 to M.I.T., and Administrators Knew
By Tiffany Hsu, David Yaffe-Bellany and Marc Tracy

Over 15 years, the convicted sex offender Jeffrey Epstein repeatedly donated to the Massachusetts Institute of Technology. Top administrators knew about the gifts, felt conflicted about them, and accepted them anyway. The university’s president even signed a thank-you note.

But on Friday, months after the campus was roiled by revelations of Mr. Epstein’s financial ties to the school’s prominent Media Lab program, investigators hired by the school absolved M.I.T.’s leadership of breaking any rules.

The law firm Goodwin Procter spent four months compiling a report on M.I.T.’s dealings with Mr. Epstein, who killed himself in his Manhattan jail cell in August while awaiting trial on federal sex trafficking charges. The investigation found that Mr. Epstein made 10 donations — totaling $850,000, slightly more than M.I.T. had previously disclosed — from 2002 to 2017. Mr. Epstein also visited campus at least nine times from 2013 to 2017, a period that followed his conviction on sex charges involving a minor in Florida.

According to the report, the senior administrators debated whether to accept Mr. Epstein’s money “in the absence of any M.I.T. policy regarding controversial gifts.” They reached a compromise: Under “an informal framework,” the school would accept the donations while insisting that the gifts be small and unpublicized to prevent Mr. Epstein from using them to improve his reputation or gain influence at the university, the report said.

Jeffrey Epstein’s Harvard ties were extensive, new report reveals
By Deirdre Fernandes

Jeffrey Epstein’s millions earned him special treatment at Harvard: He had a personal office among university researchers, a dedicated phone line, an unusual visiting fellowship position, and the backing of several high-level faculty who urged administrators to take the financier’s money despite his record as a registered sex offender.

In a report released on Friday, Harvard outlined these extensive ties with the notorious Epstein, detailing a relationship that spanned more than 25 years.

According to a months-long investigation by Harvard’s general counsel and an outside law firm, the university received $9.2 million from Epstein between 1998 and 2007.

In his application for a second year as a visiting fellow, Epstein wrote Harvard that he planned to study the “derivation of ‘power’ (Why does everybody want it?) in an ecological social system that would include variables for reputation, trust or awe, and the inherent strategically diverse tactics of deception.”

How the 1% Scrubs Its Image Online
By Rachael Levy

Jacob Gottlieb was considering raising money for a hedge fund. One problem: His last one had collapsed in a scandal.

While Mr. Gottlieb wasn’t accused of wrongdoing, googling his name prominently surfaced news articles chronicling the demise of Visium Asset Management LP, which once managed $8 billion. The results also included articles about his top portfolio manager, who died by suicide days after he was indicted for insider trading in 2016, and Mr. Gottlieb’s former brother-in-law, an employee of Visium who was convicted of securities fraud. Searches also found coverage of Mr. Gottlieb’s messy divorce in New York’s tabloids.

So last year Mr. Gottlieb hired Status Labs, an Austin, Texas-based company specializing in so-called reputation management. Its tactic: a favorable news blitz to eclipse the negative stories.

Afterward, articles about him began to appear on websites that are designed to look like independent news outlets but are not. Most contained flattering information about Mr. Gottlieb, praising his investment acumen and philanthropy, and came up high in recent Google searches. Google featured some of the articles on Google News.

His online makeover shows the steps some executives and public figures are taking to influence what comes up on the world’s top search engine. It also illustrates that despite Google’s promises to police misinformation, sites can still masquerade as news outlets and avoid Google’s detection.

Google removed five websites from Google News after The Wall Street Journal inquired about them. Google, owned by parent company Alphabet Inc., said the sites violated its policies around deceptive practices. Google’s news feature forbids “content that conceals or misrepresents sponsored content as independent, editorial content.”

Mr. Gottlieb said in an interview with the Journal that he found his press coverage unfair and wanted to fight back.

“I worked with this company to help me launch my new venture,” he said.

Status Labs interviewed Mr. Gottlieb on his interests and plans, and told him it would get him positive press coverage, according to a person familiar with the matter. He paid between $4,000 and $5,000 per month for the services.

Afterward, a site called Medical Daily Times published an article saying Mr. Gottlieb had made a donation to an initiative at New York University’s medical school.

The phone number for Medical Daily Times on its website rang at a pizzeria in Toronto. Until recently, the article’s author was listed as BJ Hetherington. The author couldn’t be reached for comment.

A black-and-white photo accompanied BJ Hetherington’s author page on Medical Daily Times. The photo is of a Canadian theater actor, Randy Hughson. A publicist for Mr. Hughson said he wasn’t aware his photo was being used with BJ Hetherington’s name until the Journal contacted him.

The information about Mr. Gottlieb’s donation is also inaccurate: An NYU spokeswoman said Mr. Gottlieb didn’t donate to that particular initiative, though he has donated to NYU on other occasions. Medical Daily Times didn’t check the information with Mr. Gottlieb, said a person familiar with the matter.

Medical Daily Times couldn’t be reached for comment.

Former Status Labs employees said that in addition to helping clients bury negative information in Google’s search results by gaming the tech giant’s algorithms, the company has also edited Wikipedia pages without disclosure of its role, something Wikipedia forbids.

Status Labs was founded in 2012. Its co-founders, Darius Fisher and Jordan French, also ran a company called Wiki-PR, which edited Wikipedia pages for clients, according to a former employee. Mr. French left Status Labs in 2015 following a dispute, according to that former employee and a press release from Mr. French.

Wiki-PR edited clients’ Wikipedia pages without disclosure, according to a cease and desist letter sent by a lawyer for the Wikimedia Foundation, which oversees Wikipedia.

“When outside publicity firms and their agents conceal or misrepresent their identity by creating or allowing false, unauthorized or misleading user accounts, Wikipedia’s reputation is harmed,” the letter said. It added that the practice “is expressly prohibited by Wikipedia’s Terms of Use.”

The Wikimedia Foundation banned Wiki-PR and its “employees, contractors, owners and anyone who derives financial benefit from editing the English Wikipedia on behalf of Wiki-PR.com or its founders,” the letter said.

Status Labs continued to stealthily edit clients’ pages, said former employees.

An ethicist explains why philanthropy is no license to do bad stuff
By Patricia Illingworth

Recent news about people who make big charitable gifts acting badly is making me wonder whether philanthropy really does make the world better.

Think about it: Members of the Sackler family, who have given millions to arts institutions, also own Purdue Pharma. That’s the company that patented and aggressively marketed Oxycontin, an approach that helped bring about the opioid crisis. Between 1999 and 2017, close to 218,000 people died from overdoses connected to prescription opioids.

Then there’s Warren Kanders, a major donor to New York City’s Whitney Museum. He stepped down as vice chair of its board in July 2019 over his role as the chief executive of Safariland – a manufacturer of bulletproof vests, bomb-defusing robots and other security products. Artists and activists demanded his ouster after learning that Safariland made the tear gas launched at migrants at the border between Mexico and California.

Perhaps the clearest example of doing good and also doing bad is Jeffrey Epstein. The financier, pedophile and alleged sex trafficker gave lots of money to a number of institutions, including Harvard University and MIT’s Media Lab. He made these donations before and, in some cases, after his conviction for sex crimes involving teens – although some of the charities he claimed to support say they never got the money.

Is it possible that giving money away or being generous makes some people more prone to bad behavior? In a word, yes.

Studies show that people unconsciously attempt to achieve a moral balance. Social psychologists call this tendency the “moral licensing loophole.” In other words, doing good according to one’s own judgment frees people to be bad.

Although it’s not a universal experience, it is a common mental glitch. One good example is how dutiful dieters may reward themselves with a pint of ice cream. And environmentalists who go out of their way to consume less water may later use more electricity.

When people donate money to charity, others view them as generous and caring. But giving also influences how donors see themselves. It reassures them of their virtue. Here lies the trigger for moral licensing. With moral credits in hand, donors can feel entitled to be bad.

Morehouse College Graduates’ Student Loans to Be Paid Off by Billionaire
By Audra D. S. Burch, David Gelles and Emily S. Rueb

Robert F. Smith, the billionaire investor who founded Vista Equity Partners and became the richest black man in America, told the crowd at Morehouse College’s commencement that he and his family would pay off the entire graduating class’s student debt, freeing them to begin their next chapter, whether it was a master’s program, a position with Teach for America or an internship at Goldman Sachs, without loan payments to worry about.

The announcement came at a time of growing calls across the country to do something about the mounting burden of student loan debt, which has more than doubled in the past decade. Presidential candidates like Elizabeth Warren have made debt cancellation a key plank in their campaign platforms, and some states and institutions are moving to make college tuition-free.

“We’re going to put a little fuel in your bus,” Mr. Smith, dressed in academic regalia to receive an honorary doctorate, said near the end of his address on Sunday at the school’s 135th commencement service.

Mr. Smith, known for a range of philanthropic donations including one to Morehouse earlier this year to finance scholarships, told the audience on Sunday that his gift was meant to set an example of paying it forward.

“Let’s make sure every class has the same opportunity going forward, because we are enough to take care of our own community,” he said. “We are enough to ensure we have all of the opportunities of the American dream, and we will show it to each other through our actions and through our words and through our deeds.”

Forbes estimates Mr. Smith’s fortune at about $5 billion, built mainly through Vista Equity Partners, a private equity firm that focuses on buying and selling software firms. The firm has about $46 billion in assets under management, according to Forbes. It is privately held and does not publicly report its results, but it is believed to be one of the best-performing firms in the country, with annualized returns of more than 20 percent since its founding.

How to Actually Prosecute the Financial Crimes of the Very Rich
By Paul E. Pelletier

It was tax season 1999. I was a federal economic-crimes prosecutor in Miami, and this was the time of year my colleagues and I brought cases to deter would-be tax cheats. My target was a tax-return preparer operating out of Liberty City’s “Pork & Beans” projects, made famous in the movie Moonlight. This tax preparer had been manufacturing false W-2s and Social Security numbers so that her clients would receive an earned-income tax credit to which they weren’t entitled—amounting to more than $100,000 in bogus refunds. She eventually pleaded guilty and spent nearly three years in prison, which at the time I considered a broadly just result. She had committed a real crime against the United States.

That crime was not even one-hundredth as harmful as what Robert Smith, the billionaire private-equity investor, did. His tax fraud, which he admitted to, was massive—possibly the largest in history—and took place over at least a decade and a half. He went to elaborate lengths to hide more than $200 million from the IRS, which meant that he avoided at least $43 million in taxes.

Unlike the Pork & Beans case, though, Smith faced no prosecution and served no time. He hired expensive lawyers who successfully negotiated a “non-prosecution agreement,” in which Smith paid some taxes and a fine—though leaving the bulk of his billions intact—and did not have to bear the stigma of a criminal conviction.

How Did the Sacklers Pull This Off?
By Patrick Radden Keefe

In 2016, a small-time drug dealer in Leesburg, Va., named Darnell Washington sold a customer a batch of what he thought was heroin. It turned out to be fentanyl. The customer shared it with a friend, and the friend died from an overdose.

To combat the opioid crisis, prosecutors have begun treating overdose deaths not as accidents but as crimes, using tough statutes to charge the dealers who sold the drugs. Mr. Washington had never met the person who overdosed. But, facing a mandatory minimum prison sentence of 20 years for distribution resulting in death, he pleaded guilty to the lesser charge of distribution and is now serving a 15-year sentence in federal prison.

I thought about this the other day when it became clear that members of the billionaire Sackler family will most likely soon receive a sweeping grant of immunity from all litigation relating to their role in helping to precipitate the opioid crisis. Through their control of Purdue Pharma, the families of Raymond and Mortimer Sackler made a vast fortune selling OxyContin, a powerful prescription opioid painkiller that, like fentanyl, is a chemical cousin of heroin.

Though they are widely reviled for profiting from a public health crisis that has resulted in the death of half a million Americans, they have used their money and influence to play our system like a harp. It is hardly news that our society treats people like Mr. Washington with sledgehammer vengeance and people like the Sacklers with velvet gloves.

As Purdue Pharma Sought Controversial Bankruptcy Settlement, It Spent Over $1.2 Million on Lobbying
By Matthew Cunningham-Cook

In the year and a half leading up to the bankruptcy trial, Purdue spent at least $1.2 million on federal lobbying expenses as it worked toward the settlement, an Intercept review of lobbying records shows.

There were 93,000 opioid deaths in 2020, a 29 percent increase from the prior year. (For comparison, there were 20,000 murders in the U.S. in 2020.) While there are many other opioids besides OxyContin, researchers found that Purdue’s aggressive marketing tactics around OxyContin played a major role in the opioid crisis.

Unlike most major corporations in the U.S., Purdue Pharma is privately held; members of the Sackler family were the majority shareholders. Some of them, notably 75-year-old former Purdue chair and president Richard Sackler, had a great deal of operational control over the company. Another family member, former board member Kathe Sackler, bragged in a 2019 deposition about coming up with the idea for OxyContin, according to journalist Patrick Radden Keefe’s book “Empire of Pain.” David Sackler, who joined the Purdue board in 2012, wrote in a 2007 email that an investment banker had told him, “Your family is already rich, the one thing you don’t want to do is become poor.” David Sackler continued: “My thought is to lever up where we can, and try to generate some additional income. We may well need it. . . . Even if we have to keep it in cash, it’s better to have the leverage now while we can get it than thinking it will be there for us when we get sued.”

Alexis Pleus — whose son overdosed in 2014, a decade after he was prescribed OxyContin for a minor injury — said that Purdue’s use of lobbyists highlights the disparity between Big Pharma and victims of the opioid crisis: One side can pour millions into influence campaigns, while the other cannot.

“We can’t hire lobbyists, we work full-time jobs,” said Pleus. “It’s hard to even take a day off to advocate. My son spent 10 months in jail for stealing $130 worth of stuff from Walmart. We have a system for corporations and billionaires and then we have a system for the rest of us. It seems like they’re absolutely untouchable.”

Chief executives are the new monarchs
By The Economist

In the early 15th century many of the Portuguese voyages of discovery around Africa and into Asia were financed by Prince Henry of Portugal, whom historians dubbed “Henry the Navigator”. When Christopher Columbus sought finance for his planned westward voyage to the “Indies”, he first turned to the king of Portugal before achieving success with Ferdinand and Isabella of Spain. Monarchs financed explorations because they believed such trips would boost their power and their treasuries.

In the 21st century corporate executives have become deeply involved in adventure and exploration. Sir Richard Branson of Virgin and Jeff Bezos of Amazon have just travelled to the edge of space. Elon Musk of Tesla has developed the SpaceX programme and is talking of the eventual colonisation of Mars. Messrs Musk and Bezos competed for the contract to operate future Moon landings. Mr Bezos even offered to part-finance the project.

In itself, this is a remarkable development. Sixty years ago, when the space race was between America and the Soviet Union, few could have imagined that individual businessmen would ever have the resources to enter the fray. The shift says something about the extremes of wealth in the 21st century.

“All These Rich People Can’t Stop Themselves”: The Luxe Quarantine Lives of Silicon Valley’s Elite
By Nick Bilton

I’ve spoken to numerous people who’ve described countless billionaires hitting the road, flying around the country to wherever case numbers are lowest. One investor worth several billion who has several homes told a friend—who then relayed the information to me in tones of shock and awe and more than a tinge of jealousy—that he was in Miami when the numbers were lowest at the start of the pandemic; hopped over to Los Angeles when Florida got a bit dicey; and now that California is a hotbed, is in New York enjoying the season’s outdoor dining. Another billionaire in Los Angeles has been hosting lavish dinner parties (no social media allowed) where an on-site nurse administers 15-minute coronavirus tests outside as guests drink cocktails, and allows them in to dine once their test comes back negative. And yet another investor told me about some of his colleagues who chipped in for a massive $50,000-a-month compound in Palm Springs that’s being used as a group party house. (I’ve heard about similar setups in Los Angeles and Silicon Valley.)

Charles Koch funded eviction push while investing in real estate companies
By Andrew Perez and David Sirota

Billionaire Charles Koch’s foundation has bankrolled three conservative legal groups leading the court battle to eliminate prohibitions against tenant evictions during the Covid-19 pandemic in America.

At the same time, Koch’s corporate empire has suddenly stepped up its real estate purchases during the pandemic – including making large investments in real estate companies with a potential financial interest in eliminating eviction restrictions.

In the last few months, the Texas Public Policy Foundation, the Pacific Legal Foundation and the New Civil Rights Alliance have been pushing federal courts to strike down the Centers for Disease Control and Prevention (CDC) eviction moratorium, which is designed to protect millions of Americans from being thrown out of their homes during the pandemic. The groups have so far won two rulings.

Between 2017 and 2019, the Charles Koch Foundation contributed almost $7.7m to those three conservative organizations, according to the foundation’s tax returns reviewed by the Daily Poster.

Last April, a month into the pandemic, Koch Real Estate Investments made a “$200m preferred-equity investment in Amherst Holdings LLC’s single-family rental business”, according to the corporate law firm Jones Day, which said it advised the Koch Industries subsidiary on the deal. Amherst says that since 2012, “its affiliated funds have acquired and operated more than 30,000 homes”.

Amherst owns Main Street Renewal, which operates 20,000 homes, according to Fortune magazine. Main Street Renewal’s website lists home rental locations in 17 states, including Ohio, Georgia, Tennessee, Texas and Florida.

In March last year, Amherst Capital researchers warned investors that the pandemic could disproportionately crush renters.

“While no economic class will be untouched by Covid-19’s silent wrath, lower middle income households – mostly hourly workers – are likely to bear the brunt,” they wrote. “A disproportionate share of these affected hourly wage workers are renters who will need payment support in these trying times.”

A tech billionaire has fought for a decade to block access to a public beach. Now California is suing.
By Brittany Shammas

For generations, Californians flocked to Martins Beach, a scenic stretch in San Mateo County. They came to fish, swim, sunbathe and surf at the beach known for a rocky, shark tooth-shaped outcrop that juts out of the water and makes for an iconic backdrop.

Then, in 2008, Silicon Valley billionaire and venture capitalist Vinod Khosla bought a huge swath of land abutting the coast, which included the only access road to the popular beach. Within a year, he decided he didn’t want to share, closing a gate to the beach and painting over billboards advertising it as open to the public.

In his long-running campaign to keep the public off the beach, Khosla, the co-founder of Sun Microsystems, has become what the New York Times described as the beach villain of the current California era and “the sandy antagonist of the digital age.” In a rare interview about the issue, he told technology reporter Nellie Bowles he bought the property on a whim and regrets it deeply.

He also hadn’t spent a single night there.

Khosla claims that he doesn’t even want to win and is waging the battle on principle alone. It’s his contention, the New York Times reported, that by directing him to either keep open the road and charge no more than $2 a car for parking or apply for a permit to change access, the government was forcing him to run an unprofitable parking business.

“If I were to ever win in the Supreme Court, I’d be depressed about it,” Khosla said, according to the paper. “I support the Coastal Act; I don’t want to weaken it by winning. But property rights are even more important.”

Bill Gates is the biggest private owner of farmland in the United States. Why?
By Nick Estes

Like wealth, land ownership is becoming concentrated into fewer and fewer hands, resulting in a greater push for monocultures and more intensive industrial farming techniques to generate greater returns. One per cent of the world’s farms control 70 per cent of the world’s farmland, one report found. The biggest shift in recent years from small to big farms was in the US.

The principal danger of private farmland owners like Bill Gates is not their professed support of sustainable agriculture often found in philanthropic work – it’s the monopolistic role they play in determining our food systems and land use patterns.

Here’s why the ultra-wealthy like Bill Gates and Thomas Peterffy are investing in U.S. farmland
By Nathaniel Lee

“It’s an asset with increasing value,” American Farmland Trust CEO John Piotti said. “It has great intrinsic value and beyond that, it is a limited resource.”

The U.S. Department of Agriculture estimates that 30% of all farmland is owned by landlords who don’t farm themselves. Buyers often purchase land from farmers who have owned it for decades, many of whom may be asset rich but cash poor.

“The economic realities for them are typically that they’ve spent their life farming,” said Holly Rippon-Butler, land campaign director at the National Young Farmers Coalition. “Their retirement, their equity is all in the land and tied up in selling land.”

Google co-founder Larry Page gets New Zealand residency, raising questions
By Nick Perry

Google co-founder Larry Page has gained New Zealand residency, officials confirmed Friday, stoking debate over whether extremely wealthy people can essentially buy access to the South Pacific country.

Immigration New Zealand said Page first applied for residency in November under a special visa open to people with at least 10 million New Zealand dollars ($7 million) to invest.

“As he was offshore at the time, his application was not able to be processed because of COVID-19 restrictions,” the agency said in a statement. “Once Mr. Page entered New Zealand, his application was able to be processed and it was approved on 4 February 2021.”

Page had been allowed to enter the country because his child needed emergency medical care. “The day after the application was received, a New Zealand air ambulance staffed by a New Zealand ICU nurse-escort [evacuated] the child and an adult family member from Fiji to New Zealand,” Health Minister Andrew Little told lawmakers in parliament.

Opposition lawmakers said the episode raised questions about why Page was approved so quickly at a time when many skilled workers or separated family members who were desperate to enter New Zealand were being turned away.

“The government is sending a message that money is more important than doctors, fruit pickers and families who are separated from their children,” ACT deputy leader Brooke van Velden said in a statement.

In 2017, it emerged that Silicon Valley billionaire Peter Thiel had been able to gain New Zealand citizenship six years earlier, despite never having lived in the country. Thiel was approved after a top lawmaker decided his entrepreneurial skills and philanthropy were valuable to the nation.

Thiel didn’t even have to leave California for the ceremony — he was granted citizenship during a private ceremony held at the New Zealand Consulate in Santa Monica.

New Zealand’s housing crisis is worsening
By The Economist

The average home in Auckland, the commercial capital, now costs NZ$1.4m ($935,000), 35 times the median income.

Auckland is at the centre of a house-price boom that is roiling the country. Low interest rates and lashings of fiscal stimulus have sent prices soaring everywhere. But even by those standards New Zealand’s recent gains look stratospheric. Last year its house prices rose by more than a quarter, according to CoreLogic, a business which tracks them. Relative to incomes, New Zealand has the world’s sixth-most expensive houses. House prices are “unsustainable”, warns the central bank, which acted in November to restrict lending to people with smaller deposits.

Peter Thiel files plans to build luxury lodge, private home and meditation pod on New Zealand estate
By Sam Shead

Peter Thiel, the tech billionaire who co-founded PayPal and Palantir and backed Donald Trump’s 2016 presidential campaign, has filed plans to build a large complex on a 193-hectare (477-acre) estate that he owns on the South Island of New Zealand.

Home to around 5 million people, the country has become synonymous with “preppers” — those who try to prepare for catastrophic events that may pose a threat to humanity. Today, there’s even a website dedicated to people wanting to prep their families for “survival” in New Zealand.

The prepper craze was first put under the spotlight in January 2017, when an article in The New Yorker titled “Doomsday prep for the super-rich” revealed how New Zealand is essentially like a mecca for wealthy preppers. It’s remote, geopolitically stable, and sparsely populated. Importantly, it could also become completely self-sufficient in terms of water, food, and energy if it ever needed to.

A year earlier, Sam Altman, former president of the prestigious Y Combinator accelerator program, told the magazine that he planned to fly to Thiel’s house in New Zealand in the event of a pandemic.

It’s unclear which house Altman was referring to, as Thiel has owned a number of properties in New Zealand over the years, albeit on significantly smaller plots. That includes a four-bedroom home in neighboring Queenstown, which he bought in 2011 for $4.8 million and subsequently built out with a panic room after a fire. He has also owned property in Auckland.

When it comes to prepping, Thiel said he’s been influenced by a 1997 book “The Sovereign Individual: How to Survive and Thrive During the Collapse of the Welfare State,” according to The Guardian.

In 2020 the ultra-rich got richer. Now they’re bracing for the backlash
By Brenna Hughes Neghaiwi and Simon Jessop

Henley & Partners, a global citizenship and residence advisory firm based in London, said inquiries from high-net-worth individuals seeking to relocate had jumped during the pandemic. The number of calls from U.S.-based clients surged 206% in 2020 from the prior year, for example, while calls from Brazil rose 156%.

For many in emerging countries, fears that strains on public services could lead to civil unrest have prompted younger generations of wealthy families particularly to seek opportunities abroad.

“COVID just basically took the clothes off the Emperor, and all of a sudden, people started to realize: our healthcare system is not strong, our social safety net is really not available,” said Beatriz Sanchez, head of Latin America at global wealth manager Julius Baer.

The Right’s Would-Be Kingmaker
By Ryan Mac and Lisa Lerer

The candidates Mr. Thiel has funded offer a window into his ideology. While the investor has been something of a cipher, he is currently driven by a worldview that the establishment and globalization have failed, that current immigration policy pillages the middle class and that the country must dismantle federal institutions.

Mr. Thiel has started articulating his thinking publicly, recently headlining at least six conservative and libertarian gatherings where he criticized the Chinese Communist Party and big tech companies and questioned climate science. He has taken issue with what he calls the “extreme dogmatism” within establishment institutions, which he said had sent the country backward.

At an October dinner at Stanford University for the Federalist Society, he spoke about the “deranged society” that “a completely deranged government” had created, according to a recording of the event obtained by The New York Times. The United States was on the verge of a momentous correction, he said.

What happens when elites abandon their homeland
By Simon Kuper

In every country, the elite’s time horizon determines how it governs. Does the elite treat its country as a sort of luxury watch that it’s only looking after for future generations or as a stolen wallet full of cash?

There are only a few lucky countries whose elites reliably think decades ahead. This is an unfairly self-perpetuating system. Typically, these are old democracies, where members of the elite live in old houses. Even there, elites have varying time horizons. Politicians tend to think in electoral cycles, whereas policymakers and heads of venerable, nationally based companies take a longer view.

There are moments when elite timeframes suddenly shrink. A country’s portents of doom can come from above (dictator goes off the rails), from below (peasants with pitchforks), from civil war or invasion.

Elites today are mobile like never before. Rich people now live in a borderless world. Contrast them with, say, the Soviet elite, which presumed it would live and die in the USSR. When elites leave their homelands, they typically go to stable countries with long-term horizons. In good times, the rich want a weak state with low tax and little regulation, but in bad times they prefer strong states. That’s why the countries with the highest influx of foreign brains per capita are Australia, Sweden, Norway, Switzerland and Canada, according to the Fund for Peace. The safe haven of the American super-rich in case the US collapses is social democratic New Zealand.

Now another factor is shortening elite time horizons: climate change. Local elites won’t invest long-term in cities like Dhaka or Jakarta that are going to become uninhabitable. The novelty of billionaire space flights suggests something even scarier: shortening elite time horizons for the planet.

“Kochland” Examines the Koch Brothers’ Early, Crucial Role in Climate-Change Denial
By Jane Mayer

The Kochs’ key role in stopping congressional action on climate change is well-known, but longtime environmental activists, such as Kert Davies, the director of the Climate Investigation Center, credit Leonard with discovering that the Kochs played an earlier and even more central role in climate-change denial than was previously understood. In 2010, Davies authored a report, for Greenpeace, that labelled the Kochs “The Kingpins of Denial,” but he told me that he hadn’t realized that their role went as far back as 1991. (A copy of a flyer for the Cato conference can be seen at Koch Docs, a new digital collaborative-research project, directed by the liberal corporate watchdog Lisa Graves, which tracks the Kochs’ influence.)

According to “Kochland,” the 1991 conference was called “Global Environmental Crisis: Science or Politics?” It featured many of the same characters who have spread doubt about the reality of climate change and continue to challenge the advisability of acting against it. Among the speakers was Richard S. Lindzen, a professor of meteorology at M.I.T., who is quoted in the brochure as saying there was “very little evidence at all” that climate change would be “catastrophic.”

“Kochland” is important, Davies said, because it makes it clear that “you’d have a carbon tax, or something better, today, if not for the Kochs. They stopped anything from happening back when there was still time.” The book also documents how, in 2010, the company’s lobbyists spent gobs of cash and swarmed Congress as part of a multi-pronged effort to kill the first, and so far the last, serious effort to place a price on carbon pollution—the proposed “cap and trade” bill. Magnifying the Kochs’ power was their network of allied donors, anonymously funded shell groups, think tanks, academic centers, and nonprofit advocacy groups, which Koch insiders referred to as their “echo chamber.” Leonard also reports that the centrist think tank Third Way quietly worked with the Kochs to push back against efforts to renegotiate the North American Free Trade Agreement, which could have affected their business importing oil from Canada. Frequently, and by design, the Koch brothers’ involvement was all but invisible.

Others have chronicled the cap-and-trade fight well, but Leonard penetrates the inner sanctum of the Kochs’ lobbying machine, showing that, from the start, even when other parts of the company could have benefitted from an embrace of alternative energy, Koch Industries regarded any compromise that might reduce fossil-fuel consumption as unacceptable. Protecting its fossil-fuel profits was, and remains, the company’s top political priority. Leonard shows that the Kochs, to achieve this end, worked to hijack the Tea Party movement and, eventually, the Republican Party itself.

Scientists who worked for Koch Industries adopted the company line; Leonard quotes a former company scientist, who embraced the conspiracy theory that élites invented a global-warming “hoax” as a way to unite Americans against a common enemy after the Cold War. Leonard also quotes Philip Ellender, Koch Industries’ top lobbyist, as claiming, in 2014, that the Earth had gotten cooler in the previous eighteen years. In fact, according to NASA, eighteen of the nineteen hottest years on record have occurred in the past two decades. Yet the Koch machine bought its way into Congress and turned climate-change denial into an unchallengeable Republican talking point. Meanwhile, after the cap-and-trade bill died, the planet continued heating, and the Kochs’ net worth doubled.

Inside the Koch-Backed Effort to Block the Largest Election-Reform Bill in Half a Century
By Jane Mayer

A recording obtained by The New Yorker of a private conference call on January 8th, between a policy adviser to Senator Mitch McConnell and the leaders of several prominent conservative groups—including one run by the Koch brothers’ network—reveals the participants’ worry that the proposed election reforms garner wide support not just from liberals but from conservative voters, too. The speakers on the call expressed alarm at the broad popularity of the bill’s provision calling for more public disclosure about secret political donors. The participants conceded that the bill, which would stem the flow of dark money from such political donors as the billionaire oil magnate Charles Koch, was so popular that it wasn’t worth trying to mount a public-advocacy campaign to shift opinion. Instead, a senior Koch operative said that opponents would be better off ignoring the will of American voters and trying to kill the bill in Congress.

Kyle McKenzie, the research director for the Koch-run advocacy group Stand Together, told fellow-conservatives and Republican congressional staffers on the call that he had a “spoiler.” “When presented with a very neutral description” of the bill, “people were generally supportive,” McKenzie said, adding that “the most worrisome part . . . is that conservatives were actually as supportive as the general public was when they read the neutral description.” In fact, he warned, “there’s a large, very large, chunk of conservatives who are supportive of these types of efforts.”

As a result, McKenzie conceded, the legislation’s opponents would likely have to rely on Republicans in the Senate, where the bill is now under debate, to use “under-the-dome-type strategies”—meaning legislative maneuvers beneath Congress’s roof, such as the filibuster—to stop the bill, because turning public opinion against it would be “incredibly difficult.” He warned that the worst thing conservatives could do would be to try to “engage with the other side” on the argument that the legislation “stops billionaires from buying elections.” McKenzie admitted, “Unfortunately, we’ve found that that is a winning message, for both the general public and also conservatives.” He said that when his group tested “tons of other” arguments in support of the bill, the one condemning billionaires buying elections was the most persuasive—people “found that to be most convincing, and it riled them up the most.”

McKenzie explained that the Koch-founded group had invested substantial resources “to see if we could find any message that would activate and persuade conservatives on this issue.” He related that “an A.O.C. message we tested”—one claiming that the bill might help Congresswoman Alexandria Ocasio-Cortez achieve her goal of holding “people in the Trump Administration accountable” by identifying big donors—helped somewhat with conservatives. But McKenzie admitted that the link was tenuous, since “what she means by this is unclear.” “Sadly,” he added, not even attaching the phrase “cancel culture” to the bill, by portraying it as silencing conservative voices, had worked. “It really ranked at the bottom,” McKenzie said to the group. “That was definitely a little concerning for us.”

The Supreme Court’s Citizens United decision, from 2010, opened up scores of loopholes that have enabled wealthy donors and businesses to covertly buy political influence. Money is often donated through nonprofit corporations, described as “social welfare” organizations, which don’t publicly disclose their donors. These dark-money groups can spend a limited percentage of their funds directly on electoral politics. They also can contribute funds to political-action committees, creating a daisy chain of groups giving to one another. This makes it virtually impossible to identify the original source of funding. The result has been a cascade of anonymous cash flooding into American elections.

It’s Kochs vs. Mercers in the Right’s Big Tech Brawl
By Naomi Nix and Joe Light

Not long ago, conservative lawmakers, think tanks, and trade groups were unified in their calls for a light regulatory approach to business. More recently, they’ve been waging a bitter war over whether and how to rein in Silicon Valley’s biggest companies. The fight has split the right wing into an anti-Big Tech camp made up mostly of social conservatives and a faction composed of more traditional pro-business groups that supports the internet giants.

Defending the internet companies is a network of groups supported by libertarian billionaire Charles Koch, who’s broken with Trump and increasingly shifted his focus to Silicon Valley.

Allied with the Koch network are the pro-free-market Mercatus Center at George Mason University and mainstream conservative think tanks such as the Heritage Foundation and the Competitive Enterprise Institute. Some of these groups criticize the tech companies’ practices but believe regulators should keep their distance.

The Koch family, which owns the industrial conglomerate Koch Industries Inc., has over the past decade created or supported numerous libertarian organizations to advocate a hands-off approach to regulation. Federal tax forms filed by the Charles Koch Foundation and the Charles Koch Institute show they gave more than $500,000 to Berin Szóka’s TechFreedom from 2014 to 2018, the latest data available.

The Kochs have also given millions to the Heritage Foundation; the R Street Institute, which favors limited government; universities; and other think tanks whose scholars have advocated for a laissez-faire approach to regulation.

Google likewise contributes to the Lincoln Network, TechFreedom, the R Street Institute, and other libertarian think tanks, though it won’t disclose amounts.

On the other side, the Mercer Family Foundation gave almost $8 million between 2013 and 2018 to the Government Accountability Institute, which seeks to uncover government corruption. Peter Schweizer, who heads the institute, co-wrote The Creepy Line, a 2018 documentary that argues Google and Facebook Inc. stifle conservative speech. Schweizer has been a conservative hero since his 2015 book, Clinton Cash, which investigated the finances of Bill and Hillary Clinton.

The Mercer Family Foundation gave almost $3.5 million between 2013 and 2017 to the Heartland Institute, a think tank that advocates for free markets and whose website features anti-tech opinion pieces. The foundation also granted almost $13 million between 2013 and 2017 to the Media Research Center, which polices what it believes is liberal bias by mainstream media. The foundation’s latest filing, for 2018, shows no money going to the Media Research Center and several other groups critical of Big Tech, a sign that the family may be rethinking its donor strategy.

A report that Fox News host Tucker Carlson cited in a segment came from the Google Transparency Project, a group that often criticizes the search giant. Oracle Corp., a longtime Google antagonist, has claimed to be one of its donors.

Some groups, such as the Lincoln Network, straddle the divide. The Kochs gave it $1.6 million from 2014 to 2018. But the group also got $100,000 from the Mercer family in 2016. A Lincoln Network spokesperson pointed to a statement on its website that says the group is “careful not to allow donors to predetermine or unduly influence our work.”

Late last year, Americans for Prosperity began a Facebook advertising campaign criticizing the state-level investigation of Google led by Texas Attorney General Ken Paxton. But those in the social conservative camp said the hands-off view ignores the power that tech platforms have over individuals’ personal information, markets, and political speech. Investigating Big Tech “involves both the central tenets of our country’s free-market economy and the guaranteed freedom of speech,” a Paxton spokesperson said in 2018.

“If 50 state AGs agree there is an issue here that is totally appropriate for them to look into, I don’t think that’s conservative or liberal,” says Rachel Bovard, a senior adviser for the Internet Accountability Project, a new advocacy group in the social-conservative orbit. She wouldn’t disclose its donors.

Democrats Decried Dark Money. Then They Won With It in 2020.
By Kenneth P. Vogel and Shane Goldmacher

Spurred by opposition to then-President Trump, donors and operatives allied with the Democratic Party embraced dark money with fresh zeal, pulling even with and, by some measures, surpassing Republicans in 2020 spending, according to a New York Times analysis of tax filings and other data.

The analysis shows that 15 of the most politically active nonprofit organizations that generally align with the Democratic Party spent more than $1.5 billion in 2020 — compared to roughly $900 million spent by a comparable sample of 15 of the most politically active groups aligned with the G.O.P.

The findings reveal the growth and ascendancy of a shadow political infrastructure that is reshaping American politics, as megadonors to these nonprofits take advantage of loose disclosure laws to make multimillion-dollar outlays in total secrecy. Some good-government activists worry that the exploding role of undisclosed cash threatens to accelerate the erosion of trust in the country’s political system.

A single, cryptically named entity that has served as a clearinghouse of undisclosed cash for the left, the Sixteen Thirty Fund, received mystery donations as large as $50 million and disseminated grants to more than 200 groups, while spending a total of $410 million in 2020 — more than the Democratic National Committee itself.

But nonprofits do not abide by the same transparency rules or donation limits as parties or campaigns — though they can underwrite many similar activities: advertising, polling, research, voter registration and mobilization and legal fights over voting rules.

The scale of secret spending is such that, even as small donors have become a potent force in politics, undisclosed money dwarfed the 2020 campaign fund-raising of President Biden (who raised a record $1 billion) and Mr. Trump (who raised more than $810 million).

There is no legal definition of “dark money,” but it generally has been understood to mean funds spent to influence politics by nonprofits that do not disclose their donors. These groups are usually incorporated under the tax code as social-welfare and advocacy groups or business leagues. Legally, these groups are allowed to spend money on partisan politics, but it is not supposed to be their primary purpose.

Back in 2005, the Democratic strategist Rob Stein helped start the Democracy Alliance, which would grow into an influential club of some of the wealthiest donors on the left. Warning of the superiority of conservative infrastructure, he urged affluent liberals to create counterweights. They responded, seeding institutions like the turnout group America Votes, the Media Matters watchdog group and the Center for American Progress think tank.

But Democrats’ concerns about losing the big-money race spiked again after the Supreme Court’s 2010 Citizens United decision. It expanded the kinds of permissible political spending by nonprofits and unleashed a torrent of dark money into elections, particularly on the right, where the industrialists Charles G. and David H. Koch oversaw a political operation that came to outstrip the Republican Party financially.

Democrats publicly assailed the Koch operation as epitomizing a corrupting dark-money takeover of American politics. Privately, they plotted ways to compete.

‘Dark Money’ Helped Pave Joe Biden’s Path to the White House
By Bill Allison

President Joe Biden benefited from a record-breaking amount of donations from anonymous donors to outside groups backing him, meaning the public will never have a full accounting of who helped him win the White House.

Biden’s winning campaign was backed by $145 million in so-called dark money donations, a type of fundraising Democrats have decried for years. Those fundraising streams augmented Biden’s $1.5 billion haul, in itself a record for a challenger to an incumbent president.

That amount of dark money dwarfs the $28.4 million spent on behalf of his rival, former President Donald Trump. And it tops the previous record of $113 million in anonymous donations backing Republican presidential nominee Mitt Romney in 2012.

Democrats have said they want to ban dark money as uniquely corrupting, since it allows supporters to quietly back a candidate without scrutiny. Yet in their effort to defeat Trump in 2020, they embraced it.

Big donors — individuals or corporations — who contributed anonymously will have the same access to decision makers as those whose names were disclosed, but without public awareness of who they are or what influence they might wield.

“The whole point of dark money is to avoid public disclosure while getting private credit,” said Meredith McGehee, executive director of Issue One, which advocates for reducing the influence of money on politics. “It’s only dark money to the public.”

Dark money group launches $2 million pressure campaign on moderate lawmakers to pass parts of Biden’s agenda
By Brian Schwartz

A group funded by anonymous donors is launching a big-money influence campaign this week that will pressure moderate lawmakers to pass key pieces of President Joe Biden’s agenda.

WorkMoney, a 501(c)(4) nonprofit that launched last year at the height of the coronavirus pandemic, is moving ahead with a new $2 million effort that will include digital ads both on Facebook and Google. The campaign will also include the group’s nearly 2 million members engaging through emails and other means with over a dozen lawmakers on the need to pass Biden’s infrastructure and family plans through Congress.

The nonprofit is known as a dark money group because it does not publicly disclose its donors. Founder CJ Grimes told CNBC in an email Monday that the group has raised over $20 million since its launch. Facebook’s ad archive shows that the group has already spent over $5 million on ads in the buildup to this new campaign.

Are Democrats Done Fighting Big Money in Politics?
By Walter Shapiro

Not too long ago, super PACs were associated with money-bag Republican donors like Sheldon Adelson and the Koch brothers. When the Barack Obama team encouraged wealthy liberals to fund a super PAC to work against Mitt Romney in 2012, the campaign used the pious explanation that to do otherwise would be “unilateral disarmament.” That Cold War trope became a standard Democratic excuse, as they leaned heavily on super PACs in presidential and congressional races against the GOP over the last decade.

Good-government reformers have long believed that campaign-finance legislation alone can cleanse politics of special-interest money. But New York City has the right laws—and it still cannot escape the scourge of super PACs, even in a Democrats-only primary. Unless the win-at-any-cost political culture among Democrats changes, all the earnest sermons about Citizens United will be about as sincere as an alcoholic preacher advocating for temperance.

Billionaires and Big Checks Shape Battle for Congress
By Shane Goldmacher and Rachel Shorey

Billionaires cut giant checks to super PACs. Small donors gave online in mass quantities. Multimillionaires poured money into their own campaigns. And both political parties announced record-setting hauls in 2021.

The 2022 midterm elections were awash in political money even before the year began, according to new Federal Election Commission campaign disclosures made on Monday.

With control of both chambers up for grabs — the Senate is knotted at 50-50 and Democrats are clinging to a narrow majority in the House — the two parties were almost equally matched when it came to fund-raising last year. The Democratic and Republican national committees, as well as the main House and Senate committees, pulled in nearly identical sums — about $400 million each.

Big money also poured into the campaigns of some politicians who are not even on the ballot this year, reflecting the high stakes of the legislative battles that have raged on Capitol Hill over President Biden’s agenda.

Two moderate Democratic senators, Joe Manchin of West Virginia and Kyrsten Sinema of Arizona, who have not committed to supporting Mr. Biden’s signature domestic bill, raised bigger sums than some facing competitive contests, even though neither is up for election again until 2024.

Time to change the Constitution to fix the problem of money in politics
By Jim Rubens

Conservatives are waking up to a troubling new reality: We are now being vastly outspent by a torrent of campaign money from liberal donors living in deep blue states.

Overall for federal races in 2020, Democrats outpaced Republicans by a massive margin of $8.4 billion to $5.3 billion, reversing the trends of the last several presidential cycles. In the shell game for secret money, Joe Biden outdid Donald Trump by six to one. And for the first time in recent memory, business interests — from Wall Street to the health care industry and beyond — are now spending more on Democrats than Republicans.

While in 2020 Republicans did remarkably well in down-ballot races, over the past eleven cycles, House and Senate candidates with the most money won 92% and 80% of the time. In almost every swing election, the new kingmaker is concentrated, out-of-state money coming from a small number of exceedingly wealthy individuals and power brokers who, in most cases, control the flow of small-donor dollars. Rather than in-state voters and party activists, this new aristocracy is picking our candidates for us. Grassroots, locally-funded candidates are unviable and invisible. Candidates have lost control over their own campaigns. Local and state priorities are submerged. Most critically for what remains of federalism, every swing election is nationalized.

Just look to the two Georgia Senate races that tipped the Senate to Democrat control. Those races were the most expensive ever, with nearly $900 million disgorged on the voters of Georgia.

The race between Democrat Jon Ossoff and Republican incumbent David Perdue alone saw an unbelievable $510 million in total spending, with national outside group spending exceeding what the candidates spent themselves. In comparison, before 2018 the most expensive Senate race ever was Pennsylvania’s at $180 million.

And if liberal billionaires bankrolling such races aren’t bad enough, the shell game of innocuous-sounding organizations giving to one another and then to SuperPACs, with the original donors of such money often remaining hidden, means there’s no way to know if the original source of contributions even comes from this country. Two years ago, entrepreneur Imaad Zuberi pled guilty to funneling illegal donations from a Saudi national to Barack Obama’s 2012 reelection campaign. With malign foreign interests taking a more active role in American elections, it is likely that significant illegal foreign contributions are being laundered into U.S. SuperPACs.

US charges Jho Low and ex-Fugee with 1MDB back-channel lobbying
By Stefania Palma

The US has charged a Malaysian financier and a former member of the hip hop trio Fugees with back-channel lobbying to drop a probe linked to the 1MDB Malaysian state investment fund embezzlement scandal and to extradite a Chinese dissident based in the US.

Jho Low, the alleged mastermind behind the fraud at 1MDB, and Prakazrel “Pras” Michel are accused of launching undisclosed lobbying campaigns in 2017 targeting the Donald Trump administration, according to a DoJ statement released on Friday.

Directed by Low and China’s vice minister of public security, Michel’s and the Malaysian financier’s alleged lobbying efforts sought to convince the US administration to quash the investigation of Low and others in relation to 1MDB and have a dissident sent back to China, the DoJ said.

Michel and Low were accused of conspiring with individuals including Elliott Broidy, a former Trump fundraiser, and Nickie Lum Davis, an American businesswoman, who both pled guilty in 2020 to lobbying senior US government officials to throw out the 1MDB probe and deport the Chinese dissident. The DoJ said these efforts were ultimately unsuccessful.

A war over Russia has erupted at the Atlantic Council
By Daniel Lippman

Emma Ashford and Mathew Burrows, two senior experts at the Atlantic Council, on Friday published an article that said the U.S. should not focus on human rights in its dealings with Russia and wrote that “democratization in Russia would not necessarily be good for US foreign policy interests.”

The article led 22 of the think tank’s staffers and fellows to issue a statement on Tuesday distancing themselves from Ashford and Burrows, saying the article “misses the mark.” Among the signers were three former ambassadors: John E. Herbst, Alexander “Sandy” Vershbow and Daniel Fried, who worked at the State Department for almost 40 years.

“Their article is premised on a false assumption that human rights and national interests are wholly separate and that US policy toward Russia was and remains driven by human rights concerns principally,” the statement reads. “In fact, previous administrations and the current one have sought to integrate our values and other national interests. We the undersigned disagree with its arguments and values and we disassociate ourselves from the report.”

One person who signed the statement told POLITICO that they worried the article was, or might be viewed as, a shoddy work product influenced by a $4.5 million donation over five years to the Atlantic Council from Charles Koch, who advocates for less American intervention abroad.

Koch’s donation was used to set up the New American Engagement Initiative, which aims to study new ways to address foreign policy issues. Ashford, who had been at the Koch-funded libertarian think tank the Cato Institute, joined the Atlantic Council’s NAEI in September; Chris Preble, another prominent former Cato foreign policy scholar, had started at the think tank a few months earlier.

“The Koch industry operates as a Trojan horse operation trying to destroy good institutions and they have pretty much the same views as the Russians,” said the person who signed the statement. The person argued that many other think tanks hold themselves to higher standards and refuse to take such money, and they were disappointed that the Atlantic Council had accepted it.

Fred Kempe, CEO of the Atlantic Council, said in an interview that such a claim was “an outrageous charge. We don’t take any funding at the Atlantic Council that doesn’t include intellectual independence for the Atlantic Council, and we need new voices in the foreign policy debate in Washington. It’s not as if we’ve got everything right over the last 20 years.” Will Ruger, the vice president for research and policy at the Charles Koch Institute, also told POLITICO last year that the Koch Institute respected the intellectual freedom of the think tanks it funds.

This is not the first time the Atlantic Council has drawn negative attention over whom it solicits as donors. In 2014, The New York Times reported that the Atlantic Council had received money from at least 25 countries since 2008. Ukrainian energy company Burisma, which played a star role in President Donald Trump’s first impeachment, has donated to the Eurasia Center and funded a program on energy security.

… Dylan Myles-Primakoff, a nonresident senior fellow at the Atlantic Council who signed the letter, also wrote a 1,841-word article on Tuesday criticizing his colleagues’ work.

He picked apart specific parts of the article that he said were just factually wrong. Ashford and Burrows had called Russian opposition leader Alexei Navalny “an open nationalist who is widely known to agree with Putin on many foreign policy questions; he backed the Russian seizure of Crimea.”

The State Department recently sanctioned Russia over Navalny’s poisoning and imprisonment, and he is widely seen as the leading force for democracy in Russia.

“You can make a case for bringing in alternative points of view, but if you’re going to do that, you have to have standards, man,” one of the dissenters said.

Foreign Powers Buy Influence at Think Tanks
By Eric Lipton, Brooke Williams and Nicholas Confessore

More than a dozen prominent Washington research groups have received tens of millions of dollars from foreign governments in recent years while pushing United States government officials to adopt policies that often reflect the donors’ priorities, an investigation by The New York Times has found.

The money is increasingly transforming the once-staid think-tank world into a muscular arm of foreign governments’ lobbying in Washington. And it has set off troubling questions about intellectual freedom: Some scholars say they have been pressured to reach conclusions friendly to the government financing the research.

The think tanks do not disclose the terms of the agreements they have reached with foreign governments. And they have not registered with the United States government as representatives of the donor countries, an omission that appears, in some cases, to be a violation of federal law, according to several legal specialists who examined the agreements at the request of The Times.

As a result, policy makers who rely on think tanks are often unaware of the role of foreign governments in funding the research.

Joseph Sandler, a lawyer and expert on the statute that governs Americans lobbying for foreign governments, said the arrangements between the countries and think tanks “opened a whole new window into an aspect of the influence-buying in Washington that has not previously been exposed.”

“It is particularly egregious because with a law firm or lobbying firm, you expect them to be an advocate,” Mr. Sandler added. “Think tanks have this patina of academic neutrality and objectivity, and that is being compromised.”

Foreign officials describe these relationships as pivotal to winning influence on the cluttered Washington stage, where hundreds of nations jockey for attention from the United States government. The arrangements vary: Some countries work directly with think tanks, drawing contracts that define the scope and direction of research. Others donate money to the think tanks, and then pay teams of lobbyists and public relations consultants to push the think tanks to promote the country’s agenda.

Why Everyone Hates Think Tanks
By Matthew Rojansky and Jeremy Shapiro

In the last century, think tanks were conceived as a mechanism to bring scientific principles and rigor to the making of policy. Like universities, they would over time accumulate knowledge and serve as centers of expertise and incubators of potentially revolutionary ideas. In their diversity, they would challenge each other and create a competitive, even adversarial, but still fact-based deliberative process for developing policy ideas, through which truth—or at least best practice—would eventually emerge. And they would act like intellectual venture capitalists, investing in people who might eventually emerge to take on important government positions.

As the industry has expanded, as society around it has become more polarized, and as the competition for funding has grown ever fiercer, some think tanks have become advocacy groups, or even lobbyists, by another name. Political parties want loyal propagandists, not niggling, equivocating academic hangers-on. And potential donors want veteran sharpshooters to fire their policy bullets into exactly the right target at precisely the right moment.

As a series of New York Times investigations in 2014-2017 revealed, the think tank business model has drifted disturbingly toward selling access and influence. For some, the point is no longer to generate new ideas or inform a deliberative process but rather to sell ideas that promote the interests of funders. It’s a straightforward transaction and hardly a new one. The Washington lobbying business has exploded over the past three decades, as the private sector, wealthy individuals, and even foreign governments turn to Washington’s revolving-door power brokers to help them purchase influence on issues that matter to them. Funders are essential for think tanks’ survival and success, but they could just as easily take their money elsewhere, so think tanks are under real pressure to give them what they want. Some funders have even cut out the middleman, so to speak, and created their own purpose-built think tanks.

If the funding chase is the why of the problem, Washington’s deeply tribal partisan political culture, and the echo chamber around it, is the how. In the jungle of proliferating new media, social media, and outright disinformation, think tankers battle one another to control the message. A lonely voice in the wilderness will be drowned out, so policy experts band together in teams, like political parties. Some groups, like the Heritage Foundation on the right and the Center for American Progress on the left, have even explicitly created lobbying arms to carry out the advocacy work their public charity arms would be barred from doing. This legal fiction makes for an inherently porous boundary between what purports to be nonpartisan research in the public interest and lobbying advocacy that is, for legal purposes, no different from K Street.

The more think tanks there are, the more they compete for donor dollars, which in turn shifts the balance of power from the experts working at think tanks to the donors who might pay their keep. As political polarization increases, donors demand proofs of fealty to their side of the partisan divide. To survive, let alone to compete and prosper, think tanks are under real pressure to put partisan and donor interests first. No amount of transparency will deal with this fundamentally fiscal problem.

Basic economics suggests that if one side of the donor-expert relationship has all the power, supply and demand are out of whack. If there were fewer think tanks, donors might be less able to force their partisan or special interest views on experts.

… both the U.S. government and think tanks themselves need to seriously reconsider what is appropriate when it comes to foreign funding. FARA registration is ultimately only a disclosure requirement. The harder question is why and whether foreign governments and corporations should be allowed to pay influential Americans, some of whom will someday hold high government offices, to promote their viewpoints in front of once and future colleagues. No other modern democracy offers such an efficient mechanism for foreigners to influence policy decisions.

The Robber Barons of Beijing
By Yuen Yuen Ang

It seemed like a typical story of Chinese corruption. Stuffing suitcases full of company shares, the businessman lavished bribes on influential officials in exchange for cheap loans to subsidize his railroad projects. The targets of his largess, those in charge of public infrastructure and budgets, were his friends and business associates. Their family members ran firms in the steel industry, which stood to benefit from the construction of new track. Over time, as the ties between the officials and the businessman grew closer, the officials doubled their financial support for his ventures, indulging his inflated costs and ignoring the risk of losses. Slowly but surely, however, a financial crisis brewed.

Stories like this are endemic to China: business leaders colluding with officials to exploit development projects for personal enrichment, graft infecting all levels of government, and politicians encouraging capitalists to take on outsize risks. No wonder some observers have insisted since the 1990s that the Chinese economy will soon collapse under the weight of its own excesses, and bring down the regime with it. But here’s the twist: the businessman is not Chinese but American, and the tale took place in the United States, not China. It describes Leland Stanford, a nineteenth-century railroad tycoon who helped catapult the United States’ modernization but whose path to immense fortune was paved with corrupt deals.

Conventional metrics of corruption ignore the different varieties it comes in. The most popular metric, the Corruption Perceptions Index, released by Transparency International every year, measures corruption as a one-dimensional problem that ranges on a universal scale from zero to 100. In 2020, China scored 42, ranking it as more corrupt than Cuba, Namibia, and South Africa. Conversely, high-income democracies consistently rank among the cleanest countries in the world, reinforcing the popular belief that corruption is a malaise that is exclusive to poor countries.

Take lobbying, which is a legitimate means of political representation in the United States and other democracies. In exchange for influence over laws and policy, powerful groups fund political campaigns and promise politicians plush positions after they leave office.

In 2012, Xi took on the mantle of leadership under ominous circumstances. The party was facing its biggest political scandal in a generation: Bo Xilai, a Politburo member once seen as a contender for the top position, had been dismissed from his posts and would soon be arrested on charges of graft and abuse of power. This wasn’t just any corruption scandal. Bo, the son of a prominent Chinese Communist Party leader, was also implicated in the murder of a British businessman, and he was rumored to have been plotting a coup against Xi.

This dramatic episode surely helped form Xi’s worldview, imprinting in him a deep sense of insecurity not only about the party’s future but also about his own survival. For Xi, Bo’s brazenness revealed that access money in a supersized economy had created elite factions far more powerful than those any previous leader had had to contend with. And for the Chinese public, Bo’s downfall offered a rare peek into the world of state-business collusion and the lavish lifestyles of the political elite.

China does not exist in a vacuum, of course. Across the Pacific, its chief rival is also experiencing a repeat of the Gilded Age. This time, the new technology the United States is grappling with is not steam power but algorithms, digital platforms, and financial innovations. Like China, the United States is beset by sharp inequality. Its government, too, fears the populist backlash from the losers of globalization, and the country is similarly struggling to reconcile the tensions between capitalism and its political system. In that sense, the world is witnessing a curious form of great-power competition today: not a clash of civilizations but a clash of two Gilded Ages. Both China and the United States are struggling to end the excesses of crony capitalism.

Book review: ‘Corruption in America,’ by Zephyr Teachout
By Max Ehrenfreund

Economists never ask whether people have good reasons for spending their money on, say, pet accessories. Likewise, the court does not ask whether people have good reasons for making campaign donations. Patriotism, ambition, faith, avarice — all are equally valid reasons for contributing, according to Citizens United.

The quid pro quo standard, which leaves motives out of the discussion, appears to lend the abstract rigor of economics to corruption law. It does not. Legislators and judges are not economists, and while it can be useful for economic analysis to ignore what drives people and focus on how they act, analyzing a problem is not the same as solving it. Policymakers do need to worry about motives, especially the ones that are dangerous to society if left unchecked.

The Fight Against Corruption Needs Economists
By Josh Rudolph

Combating corruption and kleptocracy has traditionally been an afterthought in U.S. foreign policy: a goal that most policymakers considered laudable but hardly a priority. That attitude is no longer acceptable. In recent years, countries such as China and Russia have “weaponized” corruption, as Philip Zelikow, Eric Edelman, Kristofer Harrison, and Celeste Ward Gventer argued in these pages last year. For the ruling regimes in those countries, they wrote, bribery and graft have “become core instruments of national strategy” through which authoritarian rulers seek to exploit “the relative openness and freedom of democratic countries [that] make them particularly vulnerable to this kind of malign influence.”

Strikingly, one particular form of financial aggression—covert foreign money funneled directly into the political processes of democracies—has increased by a factor of ten since 2014. Over roughly the same period of time, American voters have become highly receptive to narratives about corruption, and politicians across the ideological spectrum now routinely allege that the economy is rigged and deride their opponents as crooked and corrupt. Thus, the needs of U.S. foreign policy and domestic politics have neatly aligned to offer a historic opportunity for a sweeping anticorruption campaign that would institutionalize transparency, resilience, and accountability throughout the United States and in the international financial, diplomatic, and legal systems.

Instead of trying to win over the hearts and minds of the masses with communist ideology, the countries that threaten U.S. power today are organized as kleptocracies, stealing from their own people to buy the loyalty of cronies. They hide their ill-gotten gains in Western markets, which presents an Achilles’ heel if financial authorities can manage to find their dirty money.

The U.S. Midwest Is Foreign Oligarchs’ New Playground
By Casey Michel and Paul Massaro

For many in the West, the notion of kleptocracy—of transnational money laundering tied to oligarchs and authoritarians bent on washing billions of dollars in dirty money—remains a foreign concept. It conjures images of oligarchs purchasing penthouses in Manhattan or regime insiders floating aboard yachts along the French Riviera or maybe even the children of despots racing luxury cars down the streets of Paris. With pockets bulging with billions of dollars in illicit wealth, it makes a certain sense that these kleptocrats would gravitate toward other deep-pocketed areas.

But these kleptocrats are no longer just laundering and parking their dirty money in places like Miami, Malibu, and Monaco. Instead, they’ve begun targeting new areas for their laundering sprees, places few would suspect: from declining, second-tier cities like Cleveland, Ohio, to small factory and steel towns across the American Midwest. In so doing, these kleptocratic figures are no longer simply keeping luxury condos on standby or collecting fleets of private jets and high-end automobiles. Instead, they’re increasingly leaving a trail of destruction in their wake, demolishing the economies of working-class towns and leaving behind empty, sagging downtowns as relics of better times.

Take, for instance, the ongoing story of Ukrainian billionaire Ihor Kolomoisky. Recently sanctioned by the United States for his rank corruption, Kolomoisky stands accused by Ukrainian and U.S. authorities of overseeing one of the greatest Ponzi schemes the world has ever seen. For years, while running PrivatBank, one of Ukraine’s leading retail banks, Kolomoisky crafted an image of a successful entrepreneur devoted to Ukraine’s growing middle class. However, not long after Ukraine’s successful anti-authoritarian revolution in 2014, Ukrainian authorities began poking around the ledgers of Kolomoisky’s bank. Their findings were staggering. Ukrainian investigators—led by Valeria Gontareva, the reformist then-head of Ukraine’s banking governing body—discovered a $5.5-billion hole in the middle of PrivatBank’s books.

The hole forced Kyiv to nationalize the bank, plugging an institution that was too big to fail and sending Kolomoisky on the run. When it came to Ukrainian banks transforming into money laundering machines, “PrivatBank wasn’t an exception,” Gontareva told Foreign Policy. “The problem was that it was the biggest one.”

The immediate question was an obvious one: Where had the money gone? As journalists discovered, and as the U.S. Justice Department has alleged in a series of filings in recent months, Kolomoisky didn’t direct the missing billions of dollars into London flats or mansions on the Italian coastline. Instead, as U.S. and Ukrainian investigators discovered, Kolomoisky and a network of enablers plowed much of the money into commercial real estate in places like Cleveland and Louisville, Kentucky—and into small towns reliant on manufacturing plants and steel factories in Illinois, West Virginia, and Michigan. Rather than use the illicit money to play alongside the world’s elite, Kolomoisky and his network allegedly buried their money in the heart of Middle America, using a series of shell companies and cash purchases to obscure their trail.

Why would a foreign oligarch decide to hide hundreds of millions of dollars (and potentially more) across overlooked pockets of the United States? Kolomoisky’s example offers three possible motivations. The first reason lies in the obscurity of small towns like Warren, Ohio, and Harvard, Illinois. Few investigators, journalists, or authorities would have paid any attention to these purchases, let alone asked questions about the source of funds. Unlike Seattle, Dallas, or New York City, where the United States now effectively bars anonymous real estate purchases, much of the rest of the country remains perfectly open to the kinds of anonymous purchases at the heart of kleptocratic networks.

The second reason appears directly linked to the economic decline of many of these overlooked regions, especially following the Great Recession. For many of these assets, the only buyers are often kleptocrats with deep pockets. In Cleveland, for instance, Kolomoisky’s network of enablers swooped into town when no one else appeared interested, snapping up numerous massive downtown buildings in the post-2008 world. According to a local Cleveland journalist who requested to speak on background, Kolomoisky’s network simply “showed up in Cleveland and started buying when no one else was buying.” Eventually, the oligarch and his team became the biggest commercial real estate holders in the entire city. And that dynamic—with kleptocratic money the only game in town—meant those on the receiving end had no incentive to look this foreign gift horse in the mouth, even when the signs of money laundering were clear.

And the ease of entering these markets meant Kolomoisky and his network could do whatever they wanted with these assets—even running them into the ground as they did time and again. Indeed, Kolomoisky never appeared interested in turning a profit on any of these U.S. assets, instead using them simply as something of a kleptocratic nest egg, far away from Ukrainian authorities. According to court documents, Kolomoisky used his U.S. investments simply as nodes in his laundering network, allowing them to slowly fall apart—but not before, in some cases, these assets’ slow-motion collapse sent Americans to the hospital with debilitating injuries.

This happened time and again across the American Rust Belt and Midwest. The steel plant in Warren, now shuttered, looks like something out of a dystopian landscape, with cavernous holes gouged in the siding and walls covered in rust—and with all of its former employees now without jobs. A hulking manufacturing plant in the town of Harvard, Illinois—a plant that should have been the economic lifeblood for the town—has been left to rot, with the cash-strapped city left to pick up the tab. (“The building is fucking cursed,” Michael Kelly, the town’s mayor, told us.) And rather than investments and the dreamed-of revitalization, Cleveland has been left with, as one local paper said, a “gaping hole” in its downtown, courtesy of the investments Kolomoisky and his network let effectively implode. As the local journalist familiar with the Kolomoisky-linked purchases added, “They pretty much ruined everything they touched.”

How to mess with an oligarch
By Ryan Heath

Whether it’s in Kazakhstan or the United States, a dictator or a stock trader, there’s a common financial thread: secrecy. Paul Massaro, adviser to the bipartisan Helsinki Commission in Congress, told POLITICO’s Global Insider podcast there’s no way to defeat corruption without exposing money flows.

“It’s really important that we not lose sight of the fact that our domestic and foreign crises are inextricably linked. It all comes back to the fact that we have these global corrupt networks that traverse multiple countries that are being abused, exploited by dictators, in many cases built by dictators, who are looking to undermine democracy. They require enablers in the west, to help them to do so. And these include lobbyists, lawyers, real estate professionals, accountants, it’s really just so many people.”

“Ten percent of world GDP is held offshore: I want people to understand the massive, massive magnitude of this issue. But it’s not just money: It’s also loss of democracy, it’s abuse of courts, it’s our entire system.”

“Americans are extremely upset about corruption, on both sides of the aisle, on all sides of the aisle: the extremes, the middle, everywhere. They intuitively feel that our system is corrupt. Now, it’s hard for them to pinpoint it. And that’s by design, right? I’ve been doing this for almost a decade, and I still can’t explain to you the exact manner of all these anonymous shell companies.”

“The people that end up bearing the brunt of this [corruption] are working Americans and middle-class Americans, just because there’s no ability [at the IRS] to go after the wealthy and the corrupt. You’d see a lot less of that, if you abolished secrecy jurisdictions.”

“America is one of the biggest secrecy jurisdictions in the world, and so is the U.K. They are not tax havens, they are secrecy jurisdictions. If these two jurisdictions cleaned up their acts, you have knocked out 70 percent of the problem right there: all the top 200 global law firms in the world are Anglo-American, every single one.”

“If you’re only attacking the minimum corporate tax rate, tax havens, you’re missing the forest for the trees. Ireland is perhaps the most important corporate tax haven in the world, but has no role to play almost whatsoever in the secrecy jurisdiction business. You need to be attacking both.”

“If you don’t attack both, what ends up happening is the secrecy jurisdiction is used to obfuscate your ownership of things, move things around, and basically avoid whatever new tax codes come down the line. You can create these chains of anonymous shell companies, hold 10 passports, go through webs of financial opacity to ensure that you can never be held responsible for your crimes, that your taxes can never be tracked down.”

“… No one has more disdain for their own people than kleptocrats: They don’t even want to live in the countries they run! They want to be in Miami and they want to be on the Riviera, in nice London neighborhoods, and so on and so forth. … ”

“My modest proposal is to sanction all the oligarchs. If you actually force these guys to live in the system they’ve created, they will demand reform.”

Sanctions Threaten U.K.’s Position as Playground for Russian Oligarchs
By Max Colchester and Alistair MacDonald

London’s allure for oligarchs has faded in recent years amid increased hostility between the West and Moscow. However, those oligarchs have left a long trail of property and family that still makes them a potential pressure point for the Russian government.

It is unclear how far the U.K. government will go to squeeze them amid fears it will damage the attractiveness of its financial center.

To have any effect, the U.K. would have to be prepared to sanction around 100 powerful Russians, including their wider families, says Thomas Mayne, an expert in corruption studies at the think tank Chatham House. “What kind of pressure that exerts on Putin who knows, perhaps nothing, but it is something that should be tried,” he says.

Transparency International UK, a group that campaigns against corruption, says £1.5 billion, equivalent to $2 billion, worth of property has been bought since 2016 by Russians who are accused of corruption or have links to the Kremlin.

Biden Ramps Up Fight Against Corruption
By Dylan Tokar

The Biden administration in April said it would seek to boost the budget of the Treasury’s Financial Crimes Enforcement Network, or FinCEN, which is building the new database. The administration has requested an additional $64 million for FinCEN, a 50% increase to its current budget.

The Treasury also will take additional steps to stop the purchase of real estate in the U.S. with proceeds from corruption, the Biden administration official said.

The U.S. for decades has ramped up enforcement of the Foreign Corrupt Practices Act, which bars companies from paying bribes to foreign government officials. The Biden administration’s new anticorruption push could lead to a broader focus among federal prosecutors on corruption and bribery. The memo calls for the Justice Department and other federal agencies to establish new programs, where appropriate, and to increase staffing and resources aimed at tackling corruption both in the U.S. and abroad.

The FCPA specifically focuses on supply-side bribery, meaning the companies and individuals who provide bribes. But prosecutors in recent years have used money-laundering laws and other federal statutes to prosecute foreign public officials who accept bribes from companies and launder the proceeds through the U.S.—sometimes referred to as the demand side of bribery.

“What we may be seeing is a shift away from supply-side corruption enforcement, which is the FCPA, to the broader demand-side,” said Nathaniel Edmonds, a former prosecutor and a partner at the law firm Paul Hastings LLP. “Not only corrupt public officials but the banks, real-estate agents, lawyers and accountants who are facilitating these transactions—maybe not knowingly.”

Revealed: Credit Suisse leak unmasks criminals, fraudsters and corrupt politicians
By David Pegg, Kalyeena Makortoff, Martin Chulov, Paul Lewis and Luke Harding

The Suisse secrets project sheds a rare light on one of the world’s largest financial centres, which has grown used to operating in the shadows. It identifies the convicts and money launderers who were able to open bank accounts, or keep them open for years after their crimes emerged. And it reveals how Switzerland’s famed banking secrecy laws helped facilitate the looting of countries in the developing world.

Swiss banks have cultivated their trusted reputation since as far back as 1713, when the Great Council of Geneva prohibited bankers from revealing details about the fortunes being deposited by European aristocrats. Switzerland soon became a tax haven for many of the world’s elites and its bankers nurtured a “duty of absolute silence” about their clients’ affairs.

The custom was enshrined in statute in 1934 with the introduction of Switzerland’s banking secrecy law, which criminalised the disclosure of client banking information to foreign authorities. Within decades, wealthy clients from all over the world were flocking to Swiss banks. Sometimes, that meant clients with something to hide.

One of the most notorious cases in Credit Suisse’s history involved the corrupt Philippine dictator Ferdinand Marcos and his wife, Imelda. The couple are estimated to have siphoned as much as $10bn from the Philippines during the three terms Ferdinand was president, which ended in 1986.

It has long been known that Credit Suisse was one of the first banks to help the Marcoses ravage their own country and in one infamous episode even helped them open Swiss accounts under the fake names “William Saunders” and “Jane Ryan”. In 1995, a Zurich court ordered Credit Suisse and another bank to return $500m of stolen funds to the Philippines.

Arab Rulers and Spy Chiefs Stashed Millions in Swiss Bank
By Ben Hubbard

“What you have is a very sophisticated, corrupt elite that is very integrated into the global financial system,” said Nadim Houry, the executive director of the Arab Reform Initiative.

Enabling the politically connected to enrich themselves, he said, is the failure of many states to create boundaries between the rulers’ and the state’s assets.

“It looks like a state, it sounds like a state, but ultimately when it comes to the assets of the country,” he said, many of the potentates “act like absolute monarchs disposing of personal property.”

Anti-corruption efforts have stalled around the world, including in the U.S., study finds
By Geir Moulson

Transparency International’s 2021 Corruption Perceptions Index, which measures the perception of public-sector malfeasance according to experts and businesspeople, found that “increasingly, rights and checks and balances are being undermined not only in countries with systemic corruption and weak institutions, but also among established democracies.”

Transparency said the control of corruption had stagnated or worsened in 86% of the countries it surveyed in the last 10 years.

This Is How Kleptocracies Work
By Sarah Chayes

A long, carefully designed, and meticulously executed investigation culminated in the arrest of a palace aide on charges of extorting a bribe. But the man did not spend a single night in jail. Afghan President Hamid Karzai made a call; the aide was released; and the case was dropped.

Corruption, I realized with a start, is not simply a matter of individual greed. It is more like a sophisticated operating system, employed by networks whose objective is to maximize their members’ riches. And a bargain holds that system together: Money and favors flow upward (from aides to presidents, for instance) and downward in return.

Subordinates owe fealty to their network higher-ups, and a cut of whatever revenue streams they manage to access. In Afghanistan and other notorious kleptocracies I’ve studied, such cash transfers might include a percentage of extorted bribes, or kickbacks on contracts for development projects. Many government jobs carry an explicit purchase price, payable in a lump sum or indefinite monthly installments.

In return for this torrent of cash and favors and subservience, those at the top of kleptocratic networks owe something precious downwards. They owe their subordinates impunity from legal repercussions. That is the other half of the bargain, without which the whole system collapses.

Thanks to a decades-long concerted campaign and several Supreme Court decisions (most of them unanimous), the definition of public corruption and bribery has been narrowed to the point that an official would almost have to make an effort to commit the infractions. Prosecutions of white-collar crime have been plummeting for decades.

Ongoing campaigns for sentencing reform may inadvertently add to this leniency in their rush to lift the burdens on nonviolent offenders. Reformers should beware of blurring the distinction between nonviolent crime that poses little broad danger and nonviolent crime, such as fraud and corruption, that menaces the very fundamentals of a democracy.

Monopoly Versus Democracy
By Zephyr Teachout

By 1900, one percent of the U.S. population owned more than half of the country’s land; nearly 50 percent of the population owned just one percent of it. Multimillionaires, who made up 0.33 percent of the population, owned 17 percent of the country’s wealth; 40 percent of Americans had no wealth at all.

Like their forebears in the early twentieth century, today’s Americans have experienced decades of growing inequality and increasing concentrations of wealth and power. The last decade alone witnessed nearly 500,000 corporate mergers worldwide. Ten percent of Americans now control 97 percent of all capital income in the country. Nearly half of the new income generated since the global financial crisis of 2008 has gone to the wealthiest one percent of U.S. citizens. The richest three Americans collectively have more wealth than the poorest 160 million Americans. In most industries, a few companies control the field, dictating terms, squeezing out competitors, and using differential pricing to extract cash and power. Three companies control digital advertising, four companies dominate beef packing, and an ever-shrinking number own the country’s hospitals. To turn back this monopolistic tide, today’s populists and progressives should focus on the priorities that drove their forebears: breaking up companies that have become too big (or reclassifying them as public utilities) and making it harder for wealth to buy political influence by strictly limiting campaign contributions.

During the Gilded Age, populists, progressives, socialists, and even some corporatists clamored for campaign finance reform—and they made a good deal of progress. In 1907, the Tillman Act banned corporate donations to election campaigns. Three years later, Congress passed the Federal Corrupt Practices Act, which created the first requirement for federal-level candidates to disclose their sources of funding. That was followed by further limits on contributions. The impact of these steps was immediate and long-lasting: on a per capita basis, campaign contributions to candidates for federal office plummeted. Only in recent decades, in fact, have corporate campaign expenditures reached the levels that characterized the Gilded Age. These steps on campaign finance exemplified the populist-progressive nexus: populists in the South and the West supported them because they loathed the grinding power of big corporations; urban progressives backed them because they opposed sinful, wasteful, and corrupt behavior; and elected officials had no choice but to embrace them thanks to that bottom-up support.

In the decades that followed, however, organized money found ways to hollow out these limits. Today, money floods American politics like never before: according to the Center for Responsive Politics, political campaigns spent a total of $14 billion in the 2020 U.S. election.

Big Tech increases funding to US foreign policy think-tanks
By Kiran Stacey and Caitlin Gilbert

The world’s largest technology companies are pouring money into the biggest foreign policy think-tanks in the US, as they seek to advance the argument that stricter competition rules will benefit China.

Google, Amazon, Facebook and Apple are behind an increase in funding to four of Washington’s most prestigious research groups: the Center for Strategic and International Studies, the Center for a New American Security, Brookings and the Hudson Institute.

Total donations from Big Tech companies to the four think-tanks have risen from at least $625,000 in 2017-18 to at least $1.2mn in 2019-20, according to a Financial Times analysis of financial disclosures. These figures could be as high as $1.2mn in 2017-18 and $2.7mn in 2019-20.

While the money spent is relatively small for the companies involved — Google, Amazon, Facebook and Apple each have market capitalisations of more than $1tn — the funding has ramped up as tech groups join the oil and gas industry among the top donors to the think-tanks.

Technology companies have been building their presence in Washington for several years, as they fight bipartisan attempts to regulate them more strictly on everything from privacy to content moderation.

Silicon Valley’s biggest companies have become focused on influencing the US foreign policy elite, at a time when measures are being considered to tackle their market power in a way that tech executives have argued risks letting Chinese tech giants become more powerful.

Some of the people affiliated with the foreign policy think-tanks have been vociferous defenders of Big Tech recently, including 12 former top national security officials who signed a letter last September urging Congress to stop working on bills designed to toughen antitrust enforcement.

People close to the Biden administration said lobbyists were increasingly focused on similar arguments during private meetings.

“They are throwing spaghetti at the wall and seeing what sticks,” said one Democratic official. “At the moment, they seem to believe that the national security arguments are sticking.”

The think-tanks themselves have denied being influenced by corporate donations from the technology industry.

Bruce Freed, president of the Washington-based Center for Political Accountability, which argues for stricter tech regulations, said: “Funding think-tanks is a great way to influence experts and help shape the political conversation in a way that helps you.

“This is not traditional lobbying, but these companies are seeking to shape the conversation and shape the political climate. If they are spending this money, they are expecting a return.”

The techlash is the first step to restoring a fair US economy
By Susan Holmberg

The four dominant tech companies, Amazon, Google, Facebook (now Meta) and Apple, have several methods for blocking innovative upstarts that might pose a threat.

First there is the “buy and integrate” strategy, by which they swallow up competitors rather than develop their own breakthrough technologies. After Amazon bought warehouse robotics company Kiva in 2012, it terminated Kiva’s contracts with other retailers and pulled its groundbreaking technologies off the open market, opening a huge lead in ecommerce logistics for itself.

Facebook’s acquisition of Instagram in 2012 is another example. As smartphones soared in popularity, Facebook was outrun by platforms that had been designed to run on apps. To protect its dominance, it opted to buy up the innovators, according to a Federal Trade Commission lawsuit. “It’s better to buy than compete,” Facebook’s founder Mark Zuckerberg explained in an email.

Then there is the “buy and bury” strategy, where small companies are acquired and snuffed out. Facebook bought and killed several social networking sites, including Nextstop, Gowalla, Beluga and Lightbox. Between 2007 and 2019, the company shut down at least 39 companies. The big four plus Microsoft killed approximately half of all apps they bought from 2015 to 2019.

While there can be legitimate reasons for such “killer acquisitions” (to acquire staff, for example) they have resulted in whole market areas devoid of potential competitors because investors are scared to fund them. Between 2012 and 2017 the number of tech start-ups that received initial funding fell 22 per cent. As one investor put it, “90 per cent of the start-ups I see are built for sale, not for scale”. Many of those businesses end up in the tech giants’ waste bin.

The big groups also use their platform power to bleed competitors dry. Apple, for example, which reportedly buys a company every three to four weeks, dictates that its iOS users can only install apps through its own app store, and typically charges developers a 30 per cent commission for many of its paid downloads.

As my organisation’s research shows, because of Amazon’s dominance in online shopping, hundreds of thousands of independent businesses have little choice but to sell on its platform. Amazon wields its gatekeeper power not only to track sellers’ data to develop competing products, but to pocket an increasing cut of what independent businesses earn. We found that in 2019, Amazon extracted $60bn in seller fees. Last year that rose to more than $120bn, further entrenching Amazon’s monopoly.

Amazon and the Breaking of Baltimore
By Alec MacGillis

Over the past four decades, deindustrialization, the rise of the tech economy and the weakening of antitrust enforcement have sorted the country into a small number of winner-take-all cities and a much larger number of left-behind cities and towns.

Once, wealth and prosperity were spread out across the country. In the 1960s, the 25 cities with the highest median incomes included Cleveland, Des Moines, Milwaukee and Rockford, Ill. By the middle of the last decade, virtually all of the top 25 were on the coasts.

Meanwhile, the gap between wealthy and poorer regions has grown much wider. In 1980, only a few sections of the country had median incomes that were more than 20 percent above or below the national average. Today, big chunks of the country fall into those extremes.

The imbalance is unhealthy at both ends of the spectrum. In one set of places, it produces congestion and unaffordability; in the other, blight and stagnation.

What The Rise Of Amazon Has To Do With The Rise Of Trump
By Danielle Kurtzleben

DK: You talked about this regional economic inequality contributing to how people vote, the rise of Donald Trump. I’m curious about your perspective on the whole ongoing economic-anxiety-versus-racial-resentment conversation that there always has been around Trumpism.

AM: I spent a lot of 2016 in Ohio — pretty early on about the rise of Trump and sort of how it was happening, trying to make sense of it. And I’ve always believed that the economic issues were closely tied together with his racist and xenophobic appeals, that the economic realities [and] decline made people more vulnerable to those appeals. As I put it in the book, the economic decline and resentment does not excuse racism and xenophobia, but rather weaponized it. It made it more powerful.

The fact is that it’s a big challenge for the Democratic Party that then a lot of these places that used to be real bastions of the party, as these communities have declined, as they’ve really seen their fortunes fall, [residents] look to the cities, the winner-take-all-cities that have become now the base of the Democratic Party — to San Francisco and Boston, to New York City and Seattle.

And they not only feel resentment of their extraordinary wealth and prosperity, but they also look to the Democrats in those cities and feel alienation. They feel this sense of, “That is not me. In no way am I those people. Those people cannot possibly be in the same party as me, because they’re utterly removed from my experience.” And that is sort of how you get to the Obama-Trump voter.

The Democratic Party is now increasingly becoming what I think of as sort of the Amazon Coalition. It’s a coalition of, on the one hand, the highly-educated middle and upper-middle class, big blue metro residents who do a lot of shopping on Amazon.

And then on the other hand, the party still has quite a lot of working-class black and brown voters, many of whom work at Amazon now as the warehouse workers, as the drivers. And so you have this coalition of the party that is essentially the people who are buying all this stuff online, highly educated, mostly white professionals, and then the people packing and delivering it to them, which is a pretty awkward coalition and perhaps not the most sustainable one.

Inside Jeff Bezos’ Obsessions
By Shira Ovide

The company likes to say that everything at Amazon begins with what the customer wants and works backward. But one inescapable conclusion from reading “Amazon Unbound” is how much Amazon is a product of Bezos’s will and his responses to competitive challenges or criticisms.

I posed two questions to Stone: When have Bezos’s ideas and his relentlessness to pull them off been helpful, and when have those same qualities led Amazon astray? And has it been good or bad for Amazon to be guided by one person and his obsessions?

Stone told me that Bezos believes Amazon is in a unique position to do difficult, expensive and big things, and he wants to push against employees’ natural resistance to hard changes. His instincts aren’t infallible, but Bezos has been right a lot, he said.

“The countervailing force,” Stone said, is that the world’s richest person “doesn’t really live among us anymore. His personal taste in burgers and technology don’t always represent the common taste.”

Bezos has often said that failures are inevitable and even welcome. They show that Amazon isn’t afraid to try bold things.

But while reading Stone’s book, I wondered if Amazon’s failures weren’t always the result of noble swings at big ideas but sometimes the product of blind spots: a lack of self-reflection and a corporate culture that resists standing up to Bezos.

Stone writes that many employees who worked on the Fire phone had serious doubts about it, but it seemed that no one was willing to fight the boss. Stone’s book recounts how numerous executives were driven out of Amazon, including some who challenged Bezos or the ways in which the company operated.

Robber Barons in the New Gilded Age
By Michael Lind

Today, Consolidated Edison (Con Ed) is one of the largest private but publicly regulated electrical utility companies in the country. Yet you’re unlikely ever to have heard of its chairman, president, and CEO, Timothy P. Cawley. In 2020 Cawley’s total compensation, in cash and equity, was $2.8 million.

In a well-governed American economy, the celebrity tech tycoons whose banal pronouncements, marital lives, and vanity trips into outer space are covered by the media, and whose firms increasingly determine what Americans can and cannot read, view, or listen to, would be CEOs of regulated utilities like Con Ed. They would be as relatively obscure, modestly paid (in comparison), and lacking in discretionary power over other companies and individuals as Timothy Cawley.

As the example of Con Ed suggests, we Americans can find models for regulating today’s essential intangible infrastructure utilities in the way that we regulate railroads, electric, and water utilities.

The Politics of Tollbooth Capitalism
By Michael Lind

The term “robber baron” is used nowadays simply as another term for “rich person.” But its original meaning referred to German aristocrats with castles along the Rhine who used the threat of violence to extort tolls from passing ships. In that sense, Microsoft’s Bill Gates, Amazon’s Jeff Bezos, the founders and CEOs of Google/Alphabet, and Facebook’s Mark Zuckerberg are indeed “robber barons.” Their business model consists of controlling essential choke points in the economy.

Asked once what his ideal company was, the billionaire Warren Buffett, one of the world’s richest individuals, replied: “High pricing power, a monopoly.” Economist Michael Hudson, among others, has referred to the multiplication of such arrangements as the “tollbooth economy.” In their reliance on this tollbooth economy, the lords of today’s digital-era infrastructure—“green” and “progressive” though they may be—have more in common with railroad tycoons than with the CEOs of mid-century manufacturing companies.

There are other parallels between the United States in the late nineteenth century and the present. One is the subservience of both national parties to a single infrastructure industry or set of infrastructure industries. For decades following the Civil War, Democratic and Republican politicians alike could be found to do the bidding of the railroad companies. Today servility toward Silicon Valley as well as Wall Street characterizes both national parties, apart from the occasional leftist politician or right-wing populist.

Today’s tech tycoons and their companies are as hostile to organized labor as the railroad-era robber barons were, even if they don’t hire Pinkerton detectives to beat up union activists. In 2010, the Justice Department brought a lawsuit against Apple, Google, Intel, Intuit, Pixar, Adobe, eBay, and Lucasfilm, which illegally promised not to hire each other’s employees in order to suppress the bargaining power and wages of their workers. Apple has outsourced most of its manufacturing to unfree labor in China, while Uber and Lyft have reclassified people who are clearly employees as contractors without full labor rights or benefits. This may be changing, with the recently formed Alphabet Workers Union made up of employees and contractors at Google’s parent company. But for now, private sector unionization in the United States is lower than it was under Herbert Hoover, and its absence from the increasingly powerful tech sector is striking.

The railroad industry was the first national industry to be regulated, and the price-and-entry regulatory model was applied to all sorts of transportation industries in the mid-twentieth century. In the same New Deal/postwar era, finance was heavily regulated and treated as a public utility.

The Progressives and New Dealers may have applied the public utility model too mechanically. It is not clear, for example, that it made sense to regulate trucking or airlines using methods similar to those used for railroads. But the pendulum swung too far in the other direction after neoliberal Democrats and libertarian Republicans competed with each other in the late twentieth century to deregulate one industry after another along with finance.

Indeed, it is precisely because twentieth-century public utility regulation turned infrastructure industries like electricity, water, and gas into dull, low-profit private or public firms, which could not exact predatory tolls, that there were no equivalents to the railroad robber barons of old or the tech robber barons of today. Absent such utility regulation, American history from the 1930s to the 1990s might include the names of celebrity tycoons like the Natural Gas Tsar or the Northeastern Electricity Baron or the Emperor of the Telephones—the equivalents of Gates, Jobs, Zuckerberg, and Dorsey today.

Each successive American republic has been dominated politically as well as economically by a new set of powerful industries. In FDR’s Third Republic, the railroad industry that dominated the Second Republic was in decline, compared to ascendant manufacturing and oil and gas. Today, American manufacturing has been sacrificed to the interests of Silicon Valley and Wall Street, whose major firms are consolidating their control over everything from retail and lending to the media and nonprofit sectors.

Will the next American regime be dominated by digital infrastructure robber barons who control essential choke points, rather than by innovative industrial capitalists whose firms invent and make new products?

Google, Amazon, Meta and Microsoft Weave a Fiber-Optic Web of Power
By Christopher Mims

Fiber-optic cable, which carries 95% of the world’s international internet traffic, links up pretty much all of the world’s data centers, those vast server warehouses where the computing happens that transforms all those 1s and 0s into our experience of the internet.

Where those fiber-optic connections link up countries across the oceans, they consist almost entirely of cables running underwater—some 1.3 million kilometers (or more than 800,000 miles) of bundled glass threads that make up the actual, physical international internet. And until recently, the overwhelming majority of the undersea fiber-optic cable being installed was controlled and used by telecommunications companies and governments. Today, that’s no longer the case.

In less than a decade, four tech giants—Microsoft, Google parent Alphabet, Meta (formerly Facebook) and Amazon—have become by far the dominant users of undersea-cable capacity. Before 2012, the share of the world’s undersea fiber-optic capacity being used by those companies was less than 10%. Today, that figure is about 66%.

And these four are just getting started, say analysts, submarine cable engineers and the companies themselves. In the next three years, they are on track to become primary financiers and owners of the web of undersea internet cables connecting the richest and most bandwidth-hungry countries on the shores of both the Atlantic and the Pacific, according to subsea cable analysis firm TeleGeography.

Industry analysts have raised concerns about whether we want the world’s most powerful providers of internet services and marketplaces to also own the infrastructure on which they are all delivered. This concern is understandable. Imagine if Amazon owned the roads on which it delivers packages.

But the involvement of these companies in the cable-laying industry also has driven down the cost of transmitting data across oceans for everyone, even their competitors, and helped the world increase capacity to transmit data internationally by 41% in 2020 alone, according to TeleGeography’s annual report on submarine cable infrastructure.

Sharing bandwidth among competitors helps ensure that each company has capacity on more cables, redundancy that is essential for keeping the world’s internet humming when a cable is severed or damaged. That happens around 200 times a year, according to the International Cable Protection Committee, a nonprofit group. (Repairing damaged cables can be a huge effort requiring the same ships that laid the cable, and can take weeks.)

But the structure of these deals also serves another purpose. Reserving some capacity for telecom carriers like Telxius is also a way to keep regulators from getting the idea that these American tech companies are themselves telecoms, says Mr. Stronge. Tech companies have spent decades arguing in the press and in court that they are not “common carriers” like telcos—if they were, it would expose them to thousands of pages of regulations particular to that status.

There is an exception to big tech companies collaborating with rivals on the underwater infrastructure of the internet. Google, alone among big tech companies, is already the sole owner of three different undersea cables, and that total is projected by TeleGeography to reach six by 2023.

Google declined to disclose whether it has shared, or will share, capacity on any of those cables with any other company.

The ability of these companies to vertically integrate all the way down to the level of the physical infrastructure of the internet itself reduces their cost for delivering everything from Google Search and Facebook’s social networking services to Amazon and Microsoft’s cloud services. It also widens the moat between themselves and any potential competitors.

Amazon Web Services outage affects sites and services worldwide
By Associated Press Staff

Amazon’s cloud-service network suffered a major outage Tuesday, the company said, disrupting access to many popular sites. The service provides remote computing services to many governments, universities and companies, including The Associated Press.

Roughly five hours after numerous companies and other organizations began reporting issues with Amazon Web Services, the company said in a post on the AWS status page that it had “mitigated” the underlying problem responsible for the outage. Shortly thereafter, it reported that “many services have already recovered” but noted that others were still working toward full recovery.

The issue primarily affected Amazon web services in the eastern U.S., it said. The company has not disclosed any additional details about the problem besides noting that it had also affected its ability to publish status updates.

Technologist and public data access activist Carl Malamud said the outage highlights just how badly the internet’s original design goal — to be a distributed network with no central point of failure, making it resilient to mass disasters such as nuclear attack — has been warped by Big Tech.

“When we put everything in one place, be it Amazon’s cloud or Facebook’s monolith, we’re violating that fundamental principle,” said Malamud, who developed the internet’s first radio station and later put a vital U.S. Securities and Exchange Commission database online.

Antimonopoly Power
By Barry C. Lynn

In March, images of the container ship Ever Given aground in the Suez Canal became an emblem of the fragility of today’s international systems. The stranding triggered a series of disruptions to global transport, demonstrating how the world’s maritime cargo systems depend on a handful of chokepoints. Yet such disruptions are among the lesser threats posed by the concentration of capacity and control in global industrial systems.

Consider how the avoidable shortages of masks, testing gear, and vaccines have shattered trust among even the closest and most integrated of neighbors during the COVID-19 pandemic. Or take the semiconductor industry, which provides crucial components for almost every major industrial product in the world today. A single chipmaker, Taiwan Semiconductor Manufacturing Company, has nearly monopolized the production of high-end semiconductors, producing almost all of the world’s supply of certain types of essential chips. This year, shortfalls at TSMC disrupted production in industries from automobiles to telecommunications: Ford, for instance, projected in April that it would lose half of its second-quarter output. Making matters worse, TSMC has concentrated a substantial amount of its operations on a single island—Taiwan—that bestrides two kinds of fault lines, one physical and the other political. An earthquake or a conflict with China could suddenly shut down all of the corporation’s production, with catastrophic effects.

There are similar threats in the communications sector. Amazon, Facebook, and Google have concentrated control over how people exchange information with one another. Their business models—which rely on manipulating what citizens read and buy and which seek to monopolize online advertising revenue—have undermined the free press and the public square in democracies around the world. Just as dangerous, Facebook and Google have sought to intimidate or pay off influential publishers, including News Corp and The New York Times, and have punished countries, such as Australia, that have dared to try to regulate them. At the same time, digital interconnection has given foreign states and groups new ways to disrupt everyday life in the United States, as demonstrated by Russia’s alleged massive hack of U.S. government computers last year, the ransomware attack on Colonial Pipeline in May, and China’s and Russia’s routine exploitation of Facebook, Google, and Twitter to influence U.S. political debates.

Then there are more old-fashioned threats. China, for instance, has used its power over essential components and profit flows to coerce Western corporations such as Apple, Disney, and Nike into promoting Chinese propaganda and reinforcing the power of the Chinese state. The country has used that same power to pressure U.S. allies to agree to special deals. In December 2020, less than a month before Joe Biden took office as U.S. president, Germany pushed the EU into a wide-ranging investment agreement with China, a key source of supplies and profits for German car-makers and manufacturers.

The most immediate danger is the way that monopolists and mercantilists have stripped out many of the physical redundancies that once helped ensure the stability and resilience of international systems. There are many examples of how shocks in one place have swiftly become global problems. The Taiwan earthquake in September 1999, the financial crisis beginning in September 2008, and the Tohoku earthquake in Japan in March 2011 all saw local shocks trigger cascading shutdowns of industrial production around the world. Even more terrifying is the prospect of a political or military action that cuts off access to a key industrial zone. A Chinese move on Taiwan, for instance, would risk not only a major-power war but also the shutdown of much of the world’s industrial production.

F.D.R. Took Down Giants. Biden Can, Too.
By Mark Pryor

Often overlooked in comparison to other aspects of his presidency, President Roosevelt’s push to revive languishing antitrust enforcement helped set the United States back on the right track, creating job and wealth opportunities for Americans at one of the lowest points in the nation’s history.

The reinvigoration of antitrust enforcement helped usher in an era of entrepreneurship and small-business growth. The United States was able to assert itself as a global economic leader, establishing a model of corporate decentralization that would be adopted by democratic nations across the world. But reinvigorating antitrust did not come without substantial opposition from business interests as well as judicial and enforcement bodies that lost their way. Indeed, Roosevelt’s plans were only as strong as the people he appointed to turn his vision of an open market economy into reality.

F.D.R. himself addressed the issue directly in a resounding speech accepting his renomination before the Democratic National Convention in 1936, referring to the monopolies of the day as “royalists” with “concentration of control over material things” and lamenting that “the whole structure of modern life was impressed into this royal service.”

The Man Who Loved Presidents
By Thomas Frank

Just glance at FDR’s combative 1936 State of the Union address, in which he declared that he had “earned the hatred of entrenched greed” and that the country’s business elite was planning its return to power: “Autocrats in smaller things, they seek autocracy in bigger things.”

Those remarks call to mind an important truth about the Depression years and the broader history of reform. FDR’s most dangerous enemies weren’t five-and-dime führers playing Nazi in homemade uniforms; they were the wealthiest people in the land. They were America’s great industrialists, its leading lawyers and economists, the publishers of its newspapers. These were the men who dabbled in fascism and hired strikebreakers and believed in scientific racism. And many of these fine, affluent, tasteful people came together in the mid-Thirties as the American Liberty League, the great-granddaddy of all the right-wing front groups to come. … Roosevelt fought against them not by respecting norms or by refusing high-mindedly to cater to people’s economic fears; he did the opposite, and in 1936 he trounced what he called “organized money” in one of the greatest victories for liberalism of all times.

Maj. Gen. Smedley Butler: A Marine hero who saw himself as a ‘racketeer for capitalism’
By Eric Rauchway

In 1934, a man representing wealthy American businessmen tried to recruit Butler to lead a revolt against Roosevelt. Inspired by a right-wing group of veterans that unsuccessfully stormed the French legislature, the man proposed that Butler could lead a similar, but successful, group in the United States; maybe half a million veterans would join. The businessmen would pay for this plan to halt the New Deal. Butler replied, “If you get these five hundred thousand soldiers advocating anything smelling of fascism, I am going to get five hundred thousand more and lick the hell out of you.” A congressional committee confirmed Butler’s testimony but did not pursue the matter, leaving Butler to wage his battle against the misdeeds of American businessmen without their help. He died in 1940, before he could see the Marines fight fascism in the field.

Biden Launches Sweeping Action on “Big Tech, Big Pharma, and Big Ag.” Can It Be Real?
By Matt Stoller

In 1938, Franklin Delano Roosevelt gave a speech to Congress on curbing monopolies. With the looming threat of Nazi Germany’s growing power, Roosevelt warned Americans of the relationship between concentration and authoritarianism. “The liberty of a democracy is not safe if the people tolerate the growth of private power to a point where it becomes stronger than their democratic state itself,” he said. Roosevelt called for the entire government to take on the problem of monopoly, encouraging stronger action on everything from antitrust to bank regulation to the misuse of patents.

And it worked – over the next few years, the Department of Justice brought more antitrust cases than had been brought from 1890 to that point. Congress passed laws regulating investment trusts, regulators cracked down on large banking houses, the Army and Navy kept contractors competitive and prevented price gouging in the build-up to war, and policymakers ended the misuse of patents that let monopolists dictate the roll-out of technology. With the Alcoa decision in 1945, the courts finally outlawed monopolies, and by the 1950s, powerful business leaders treated rivals, suppliers and workers reasonably, for fear of antitrust enforcement, setting the stage for the rise of Silicon Valley and the electronic century.

On Friday, Joe Biden reached back to that moment, and gave the most significant speech on monopolies by an American President since then. “Capitalism without competition isn’t capitalism,” he said. “It’s exploitation.” The speech very much paralleled how FDR framed his talk, emphasizing the importance of small business, workers, and consumers. Biden talked of the need to take on Big Tech, Big Pharma, and Big Ag, and even cited FDR’s call for an economic bill of rights, quoting Roosevelt’s goal of ensuring the “right of every businessman, large and small, to trade in an atmosphere of freedom from unfair competition and domination by monopolies at home or abroad.”

But far more important was Biden’s explicit criticism of the Chicago School, by name. “Forty years ago we chose the wrong path,” said Biden. “Following the misguided philosophy of people like Robert Bork, we pulled back on enforcing laws to promote competition. We are now forty years into the experiment of letting giant corporations accumulate more and more power.” The President of the United States does not typically wade into esoteric legal debates involving competition lawyers. But the policy he was introducing in this speech required it. Biden was giving a speech about an executive order mandating that the policy of the Federal government be to promote fair competition, not just through the antitrust laws, but through every agency with authority to structure markets.

Biden’s speech is as important an ideological turnaround as we ever see in politics, as big a deal as Ronald Reagan’s statement that “Government is not the solution to our problem, government is the problem.” Biden explicitly called out lax controls on corporate power as the causal factor behind American stagnation. “What have we gotten from it?” Biden asked. “Less growth, weakened investment, fewer small businesses. Too many Americans who felt left behind, too many people who are poorer than our parents. I believe the experiment failed.”

Economists Pin More Blame on Tech for Rising Inequality
By Steve Lohr

Daron Acemoglu, an influential economist at the Massachusetts Institute of Technology, has been making the case against what he describes as “excessive automation.”

The economywide payoff of investing in machines and software has been stubbornly elusive. But he says the rising inequality resulting from those investments, and from the public policy that encourages them, is crystal clear.

Half or more of the increasing gap in wages among American workers over the last 40 years is attributable to the automation of tasks formerly done by human workers, especially men without college degrees, according to some of his recent research.

Globalization and the weakening of unions have played roles. “But the most important factor is automation,” Mr. Acemoglu said. And automation-fueled inequality is “not an act of God or nature,” he added. “It’s the result of choices corporations and we as a society have made about how to use technology.”

Economists point to the postwar years, from 1950 to 1980, as a golden age when technology forged ahead and workers enjoyed rising incomes.

But afterward, many workers started falling behind. There was a steady advance of crucial automating technologies — robots and computerized machines on factory floors, and specialized software in offices. To stay ahead, workers required new skills.

Yet the technological shift evolved as growth in postsecondary education slowed and companies began spending less on training their workers. “When technology, education and training move together, you get shared prosperity,” said Lawrence Katz, a labor economist at Harvard. “Otherwise, you don’t.”

Increasing international trade tended to encourage companies to adopt automation strategies. For example, companies worried by low-cost competition from Japan and later China invested in machines to replace workers.

Today, the next wave of technology is artificial intelligence. And Mr. Acemoglu and others say it can be used mainly to assist workers, making them more productive, or to supplant them.

So-so technologies replace workers but do not yield big gains in productivity. As examples, Mr. Acemoglu cites self-checkout kiosks in grocery stores and automated customer service over the phone.

Today, he sees too much investment in such so-so technologies, which helps explain the sluggish productivity growth in the economy. By contrast, truly significant technologies create new jobs elsewhere, lifting employment and wages.

The rise of the auto industry, for example, generated jobs in car dealerships, advertising, accounting and financial services.

Market forces have produced technologies that help people do their work rather than replace them. In computing, the examples include databases, spreadsheets, search engines and digital assistants.

But Mr. Acemoglu insists that a hands-off, free-market approach is a recipe for widening inequality, with all its attendant social ills. One important policy step, he recommends, is fair tax treatment for human labor. The tax rate on labor, including payroll and federal income tax, is 25 percent. After a series of tax breaks, the current rate on the costs of equipment and software is near zero.
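A rough sketch of the incentive gap Acemoglu is pointing to. Only the 25 percent labor rate and the near-zero equipment rate come from the passage above; the wage, the machine cost, and the simplification of treating the whole labor-tax wedge as a cost to the firm are hypothetical choices made for illustration.

```python
# Hypothetical illustration of the tax wedge between labor and automation.
# Only the 25% labor rate and ~0% equipment rate are from the article;
# the dollar figures and the firm-side treatment of the wedge are assumptions.

LABOR_TAX = 0.25      # combined payroll and income tax burden on labor (cited above)
CAPITAL_TAX = 0.0     # effective rate on equipment and software after tax breaks (approx.)

annual_wage = 50_000      # hypothetical pre-tax cost of one worker-year
machine_cost = 50_000     # hypothetical annualized cost of a machine doing the same tasks

labor_cost_after_tax = annual_wage * (1 + LABOR_TAX)        # 62,500
machine_cost_after_tax = machine_cost * (1 + CAPITAL_TAX)   # 50,000

print(f"Effective cost of the worker:  ${labor_cost_after_tax:,.0f}")
print(f"Effective cost of the machine: ${machine_cost_after_tax:,.0f}")

# A "so-so" machine that delivers only 80% of the worker's output already breaks
# even here, which is the direction of the distortion Acemoglu describes.
break_even = machine_cost_after_tax / labor_cost_after_tax
print(f"Machine breaks even at {break_even:.0%} of the worker's productivity.")
```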

Well-designed education and training programs for the jobs of the future, Mr. Acemoglu said, are essential. But he also believes that technology development should be steered in a more “human-friendly direction.”

Q&A: David Autor on the long afterlife of the “China shock”
By Peter Dizikes

Q: The “China shock” was devastating to some local and regional U.S. economies, from 2001 through 2011. What did you find about subsequent years?

A: The China shock, when we first wrote about it, was ongoing, and China continued to gain market share in the U.S. [We have now found] the China shock plateaued around 2010-2012. In the decade since then, did places rebound? Unfortunately, places that lost manufacturing employment have seen a persistently deteriorated level of overall employment-to-population ratio and of earnings, while rates of dependency on transfer benefits have risen. Economists have made the phrase “creative destruction” famous. We’ve seen the destruction but not the creative rebound yet.

Q: Why did the China trade shock persist so much in some particular places?

A: One of the things that was puzzling about the China trade shock is that when the going got tough, very few people got going. We didn’t see people picking up and moving to better opportunities, as in historical narratives of U.S. resiliency. The one group that tended to move out more was young adults, which is logical since they are generally more mobile and arguably have the most to gain. We also know as a background fact that education levels of metro areas are a very strong predictor of economic resiliency. The more-educated places tend to be able to reinvent themselves, though often not with the same beneficiaries. Pittsburgh has reinvented itself — it used to be a steel town, now it’s a center for health care and tech. But it’s probably not former steel workers, primarily, who are doing that work. Overall, the picture is discouraging.

Q: You point out in this paper that this is not the only large-scale shock we’ve seen, and we should be prepared for other economic shocks. Isn’t another implication of your work that the U.S. economy, at least to some extent, moves from shock to shock, and we should think about what happens in those conditions?

A: Yes. For example, because of the move to cleaner energy sources, employment in U.S. coal production has fallen by 80 percent since 1979. We find that places that are more severely adversely affected don’t tend to come back quickly. With coal in West Virginia, you could say, “Look, there are so few people affected any more. It’s less than a hundred thousand. Why are we so sentimental about this? What’s the big deal?” And the answer is, those people are in just a few places, and they’re really hurting. There’s not something equally good that miners can do to get equal pay or equal esteem in the community. This holds a lesson for what’s ahead. The [renewable] energy transition will create a lot of new work and will be highly investment-intensive, but with different technologies in different places, and it does mean again there will be concentrated losses.

The geographic concentration is what makes these things particularly pernicious — the fact that it all happens in one place at one time. U.S. domestic furniture manufacturing was basically run over by China trade. Not high-end bespoke furniture, but the commodity items you get at Walmart or Target are now made in China or Vietnam.

By contrast, office computing over the last couple of decades has hollowed out the ranks of administrative support positions, but these are not really similar [problems] because that’s one occupation in many businesses which have not gone belly-up. A category of worker is in lower demand, but it’s not like we say, “Oh my God, Topeka used to be the administrative support capital of America.” There’s no such thing. Whereas China trade made American furniture manufacturing businesses not viable. And it’s not just the woodworkers who lose their jobs, it’s office support people, it’s financial people, it’s the effects on transportation, everything.

Q: All right, then what are the best policy steps for helping the people and places affected by these kinds of shocks?

A: I think there are different levels of policy. Knowing what we know now, I would have done China trade policy more gradually. Also, we ought to have much better economic adjustment assistance in place. The U.S. spends an order of magnitude smaller share of GDP on what we call active labor market policies. Denmark spends about 3 percent of GDP on that. We spend about 0.3 percent of GDP on it. Denmark has very fluid labor markets. You can terminate most workers in Denmark for almost any reason, and people don’t expect to keep lifetime jobs. But the state is heavily involved in retraining and reactivating.

A second angle of attack is place-based policy, but we’re not very good at this. There are enterprise zones, which give money to wealthy developers to do things they would do anyway. Subsidized training, and [policies] that stimulate employers to create local jobs sometimes work. It’s not that places can’t be reinvented. But we don’t have an off-the-shelf tool kit for that.

Another approach is targeting investments and interventions on people who need a leg up in the labor market. I’m working on a number of experiments about this: One is on reducing overuse of criminal background checks, another is on a high-intensity STEM training program for people without college degrees, another is attempting to change the quality of jobs in home health care delivery. Fewer than four in 10 American adults have a four-year college degree, and there’s a budding interest in trying to get employers to reduce credentialism, as a way of improving access to family-supporting jobs.

So, one level is about trade and policy. One is about adjustment systems. A third is about place-based policy. And a fourth is interventions to improve labor market opportunity for people without college degrees.

America Is on a Road to a Better Economy. But Better for Whom?
By Ben Casselman

The first two decades of the 21st century were a parade of economic disappointments: The bursting of the dot-com bubble was followed by a recession; which was followed by a “jobless recovery”; which was followed by another burst bubble, this time in housing; and another, even worse, recession; and another, even weaker, recovery.

Officially, the Great Recession ended in June 2009, but it took two years for U.S. gross domestic product to return to its pre-recession level, and six years for unemployment to do so. Long-term joblessness didn’t even stop rising until the recession had been over for nearly a year, and it didn’t get back to its pre-2008 normal until well into 2018. Year after year, forecasters predicted that this was the year that growth would finally pick up and wages would rise and prosperity would be widely shared. And year after year they were wrong. The pessimism became so ingrained that by 2019, when things were, finally, actually pretty good, the dominant economic narrative was about what would inevitably cause the next recession. (“Global pandemic” did not tend to make the list.)

The last crisis, like most recessions, was caused by a fundamental imbalance — the housing bubble — that had to be resolved before the economy could start growing again. Construction workers and mortgage brokers had to find jobs in other industries. Households had to get out from under unsustainable debt loads. Banks and other financial institutions had to write off hundreds of billions in bad loans.

This time, there was no imbalance. Things were basically going fine, and then an outside force, what economists call an “exogenous shock,” turned the world upside-down. If we could somehow have pressed “pause” until the pandemic ended, there would have been no reason for a recession at all.

Of course, there is no “pause” button. That’s why everyone was so worried about the ripple effects last year. Restaurants can’t pay waiters when they have no customers. Waiters can’t pay rent when they have no jobs. Landlords can’t pay their mortgages when their tenants don’t pay rent. Banks can’t make new loans when borrowers stop making payments. And so on and so on, until what began as an isolated crisis caused by a specific set of circumstances has turned into a general pullback in activity across the economy.

Except that never really happened this time. Evictions, foreclosures and bankruptcies all fell last year. The financial system, as anyone who has checked their 401(k) balance lately can attest, did not collapse. Perhaps the most shocking statistic in a year of shocking economic statistics is this one: In what was, by many measures, the worst year since at least World War II, Americans’ income, in aggregate, actually rose.

How is this possible? Because of the other reason this recovery is different: the federal government. Counting all the various Covid relief packages passed under two presidents, the United States has now pumped more than $5 trillion into the economy. That dwarfs not just what the U.S. has spent in any previous recession, but also the aid provided in almost any other large country.

Here’s what that money meant in the real world: When the economy shut down last spring, the federal government stepped in to ban most evictions and made it easy for borrowers to delay payments on their mortgages and student loans. It expanded access to nutrition benefits, school-lunch programs and other emergency relief programs. The Federal Reserve bought hundreds of billions of dollars’ worth of bonds to keep credit flowing and avoid a repeat of the 2008 crisis.

Most important, the government gave people money. Lots of money. By April of this year, the typical middle-class family of four had received more than $11,000 through successive rounds of direct payments. That doesn’t include the expanded child tax credit that was part of the latest aid bill, which is worth up to $3,600 per child.

The CARES Act, which Congress passed in late March 2020, also provided $600 a week in extra unemployment benefits to laid-off workers, and created a whole new program — Pandemic Unemployment Assistance — to cover freelancers, gig workers and other people who ordinarily don’t qualify for benefits. And it created the Paycheck Protection Program, which gave out more than half a trillion dollars in low-interest — and in many cases, forgivable — loans to small businesses, most likely preventing thousands of employers from going under entirely.

Low-income families are starting in a much different place from where they were in the last recovery. Indeed, American households are, on average, in the best financial shape in decades. Debt levels, excluding home mortgages, are lower than before the pandemic. Delinquencies and defaults are down, too. And Americans in aggregate are sitting on a mountain of cash: $6 trillion in savings as of March, more than four times as much as before the pandemic.

Averages, of course, don’t tell the full story. The wealthy, and even the merely affluent, have done exceedingly well during the pandemic. They have, by and large, kept their jobs. They have seen the value of their stock portfolios soar. And they have spent less on vacations, restaurant meals and other services. For those at the other end of the economic spectrum, the picture looks very different: Many of them lost their jobs, had no investments to start with and needed every penny of the aid they received to meet basic living expenses, if they managed to get that aid at all.

Those diverging fortunes are what commentators have called the “K-shaped recovery” — rapid gains for some, collapse for others. But that narrative is incomplete. Millions of people have been financially devastated, but many more have not been.

The End of Free-Lunch Economics
By Raghuram G. Rajan

Most would agree that the pandemic created a need for targeted spending (through extended, generous unemployment benefits, for example) to shield the hardest-hit households. But, in the event, the spending was anything but targeted. The US Congress passed multi-trillion-dollar bills offering something for everyone.

The Paycheck Protection Program (PPP), for example, provided $800 billion in grants (effectively) for small businesses across the board. A new study from MIT’s David Autor and his colleagues estimates that the program helped preserve 2-3 million job-years of employment over 14 months, at a stupendous cost of $170,000-$257,000 per job-year. Worse, only 23-34% of this money went directly to workers who would otherwise have lost their jobs. The balance went to creditors, business owners, and shareholders. All told, an estimated three-quarters of PPP benefits went to the top one-fifth of earners.
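The dollar amounts implied by those percentages are worth spelling out; a minimal arithmetic sketch using only the $800 billion total and the 23-34% worker share quoted above:

```python
# Back-of-the-envelope arithmetic on the PPP figures cited above.
ppp_total = 800e9                                   # total grants, per the article
worker_share_low, worker_share_high = 0.23, 0.34    # share reaching otherwise-laid-off workers

to_workers = (ppp_total * worker_share_low, ppp_total * worker_share_high)
to_others = (ppp_total - to_workers[1], ppp_total - to_workers[0])

print(f"To at-risk workers: ${to_workers[0]/1e9:.0f}bn to ${to_workers[1]/1e9:.0f}bn")
print(f"To creditors, owners and shareholders: ${to_others[0]/1e9:.0f}bn to ${to_others[1]/1e9:.0f}bn")
```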

What Can Replace Free Markets? Groups Pledge $41 Million to Find Out.
By Steve Lohr

Wages have been stagnant for most Americans for decades. Inequality has increased sharply. Globalization and technology have enriched some, but also fueled job losses and impoverished communities.

Those problems, many economists argue, are partly byproducts of government policies and corporate practices shaped by a set of ideas that championed free markets, free trade and a hands-off role for government. Its most common label is neoliberalism.

The free-market worldview was most forcefully promoted in the 1960s and 1970s by a group of economists at the University of Chicago, led by Milton Friedman, and became known as the Chicago school.

In the 1980s, the Reagan administration and Margaret Thatcher’s government in Britain wholeheartedly embraced the neoliberal model. It was also the Clinton administration’s guiding mind-set for policies like free-trade agreements and financial deregulation. That was generally true for the Obama administration as well in areas like trade, the bank bailout and antitrust enforcement.

In recent years, many prominent economists have questioned the wisdom of leaving so many human outcomes to the whims of markets. Economists are increasingly investigating inequality, and that is a focus of the universities receiving the grants.

“Reducing inequality has to be a goal of economic progress,” said Dani Rodrik, an economist at Harvard’s Kennedy School and a leader of its project on reimagining the economy. “We have all this new technology, but it is not encompassing large enough parts of the work force and not enough parts of the country.”

The grantees are qualified market enthusiasts. “Markets are terrific, but we have to overcome this notion that ‘markets are autonomous — so just leave it to the market,’” said David Autor, a labor economist at M.I.T. “That fatalism is a decision.”

Dr. Autor is a leader of the M.I.T. program on shaping the future of work. “We’re calling it ‘shaping’ because it is interventionist,” he said.

The M.I.T. project will research the challenges faced by workers without four-year college degrees — nearly two-thirds of the nation’s work force — and steps that could improve their jobs or lift them into higher-paying occupations.

The M.I.T. group will also explore policies and incentives to steer technological development in ways that enhance the productivity of workers instead of replacing them.

Meager Rewards for Workers, Exceptionally Rich Pay for C.E.O.s
By Peter Eavis

A comprehensive survey of the 200 highest-paid chief executives at public companies conducted for The New York Times by Equilar, an executive compensation consulting firm, revealed some of the biggest pay packages on record, and showed that the gap between C.E.O.s and everybody else widened during the pandemic.

Six of the biggest earners made it onto Equilar’s ranking of people with the 10 largest pay packages of the last decade, topped by Elon Musk of Tesla, who was awarded $2.3 billion in 2018.

The class of 2020 crashed another elite club: Eight of the top-earning executives got compensation last year worth more than $100 million. In 2019, only one earned more than that; in 2018, five did.

The gap between the C-suite and the rest grew bigger, too. C.E.O.s in the survey received 274 times the pay of the median employee at their companies, compared with 245 times in the previous year. And C.E.O. pay jumped 14.1 percent last year compared with 2019, while median workers got only a 1.9 percent raise.

“While Americans were cheering on the workers who were keeping our economy going, corporate boards were busy coming up with ways to justify pumping up C.E.O. pay,” said Sarah Anderson, global economy director at the Institute for Policy Studies, a progressive think tank.

Why the US right wants to put workers in the boardroom
By Oren Cass

Giving workers a seat at the boardroom table also represents a sharp philosophical departure from the American centre-right’s previous dogmatic commitment to “shareholder primacy” and the idea that corporations should be managed solely for the benefit of shareholders. Progressives will no doubt complain that the proposal does not go further, but, when it comes to the principles of corporate governance, the step from zero worker representatives to one is surely significant.

Board members rarely exercise meaningful influence through their individual votes; far more often it comes from their presence and their ability to raise issues and gather information. Studies of works councils and board-level employee representation have found that they not only increase trust and co-operation, but also enhance productivity, capital formation, market value and resilience.

US CEO pay: shareholders have woken up to excess
By Lex

Shareholders increasingly vote against management. This year, shareholders at 25 companies have approved executive pay packages with 75 per cent or less of the vote in favour, up from 16 companies last year, according to data from Farient Advisors. Such high-profile rebukes have occurred at prominent companies such as General Electric, IBM and Starbucks.

Institutional shareholders are increasingly resistant to large CEO pay packages. These have edged up through benchmarking against peer group averages that ensures everybody below the mean can claim a pay rise. Even shareholders who have profited handsomely are losing their previous deference to management.

Shareholders are causing seismic changes at massive companies like AT&T and Exxon
By Felix Salmon

Driving the news: Exxon shareholders delivered a major blow against the company’s management this week, installing new board members who take climate change seriously.

  - What they’re saying: “This is doubtless a day Exxon’s management would rather forget,” writes Bloomberg’s Liam Denning, “but it is a good day for the company.”
  - Of note: Chevron shareholders, too, voted against management and in favor of greater emissions reductions.
  - Our thought bubble, from Axios’ Ben Geman: The question of whether mainstream finance wants more climate action from oil majors has been definitively answered.

Context: Shareholders aren’t just embracing the global green-investing thesis; they also have the world’s governments and courts at their backs. A Dutch court ordered Shell this week to get significantly more aggressive about cutting emissions, in a move that presages further such rulings both inside and outside the EU.

  - The long-term vision here belongs to the shareholders who see both the necessity and the inevitability of the coming energy transition. That’s worth remembering, the next time a CEO tries to paint activists as being interested only in short-term gains.

Capital for the people — an idea whose time has come
By Rana Foroohar

Unlike traditional methods of redistribution, in which the state taxes existing wealth and then uses it to bolster various projects and constituents, pre-distribution is all about harnessing capital the same way investors do, and then using the proceeds of the capital growth (which as we know far outpaces income growth) to fund the public sector.

The idea of allowing more people to become owners of capital has actually been in operation for some time. The CalSavers programme, created in 2016, allows individuals such as gig workers or independent contractors who do not have access to private sector retirement accounts to contribute to professionally managed funds in a system run by the state.

Likewise, Proposition 24, the California Privacy Rights Act, was passed last year and will go into effect in 2023. This actually creates a kind of stealth sovereign wealth fund, in which 93 cents of every dollar garnered by the fees paid by companies for violations of privacy (which, given the nature of surveillance capitalism, are likely to be substantial) can be invested by the Treasury, and the proceeds of any gains used to pay for government operations. “It’s a way of helping us not have to raise taxes,” says California Senate majority leader Robert Hertzberg, a Democrat.

He, along with some very rich Californians like former Google chief executive Eric Schmidt and Snap founder Evan Spiegel, has proposed that the concept be broadened into something called “universal basic capital”. The idea is that seed contributions of equity from companies or philanthropists could be invested into a fund that would then be used by individual Californians for things like retirement security, healthcare and so on.

Already, in the 2021-2022 budget, Gavin Newsom, the governor of California, has proposed using some of the state’s tax surplus this year — which along with federal Covid relief has put an extra $100bn into public coffers — to start college accounts for every low-income first grader in the state.

One can imagine going further and having the state take a small equity position, perhaps 3-5 per cent, in start-ups, as countries like Israel or Finland already do. Given that the current value of publicly traded companies in California is roughly $13tn, that’s not chump change. If the state had been able to take even a small stake in top firms a few decades ago, there might be far less of the “Occupy” Silicon Valley vibe in California right now.

Last Sane Man on Wall Street
By Andrew Rice

One day, a man with dreams of riches placed a truck on top of a hill. The vehicle was a big white tractor-trailer, a prototype built by an automotive start-up called Nikola. The company’s boastful founder, Trevor Milton, claimed it was the “holy grail” of the commercial-trucking industry, a semi that ran on hydrogen and was both green and powerful, capable of doing thousand-mile hauls with zero carbon emissions. In reality, the truck had no engine. It was towed up a straight two-lane road. Its driver released the brakes, and it rolled down the hill under the force of gravity, like a child’s wagon. The road had a 3 percent grade, gentle enough that with some creative camerawork, the prototype would appear to be barreling across a flat desert landscape.

On January 25, 2018, Nikola’s official Twitter account posted a swooshing 39-second video of the demonstration. “Behold,” it declared, “the Nikola One in motion.”

Four years and one federal criminal indictment later, the story of the engineless truck can be seen in many ways: as the high point of a scandal at an automaker that briefly had a market cap larger than Ford’s; as a manifestation of this era’s fake-it-till-you-make-it, flack-it-till-you-SPAC-it business ethos; as a cautionary tale of social media’s power to intoxicate the stock-trading masses; as yet another indicator that the market has become detached from reality; and maybe even as a big honking metaphor for an entire economy that is rolling down a hill, inflating, going deranged as crypto wizards conjure imaginary fortunes, companies without a hint of revenue reach multibillion-dollar valuations, and our richest men blast off into outer space.

… you have to be a little crazy to bet against a market that has proved impervious to inflation, supply-chain instability, and a plague that has killed millions of people.

“A lot of investors prefer the market to be sort of this mass hallucination,” Anderson said. On the screen of his laptop, a ticker showed that bitcoin was trading at $63,682.60, heading toward an all-time high. “The market is designed to be a place where these scarce resources of society — capital, labor, materials — are allocated to their most efficient use,” he said. “But it has just become this otherworldly casino, which is disconnected from the real world.”

On February 9, 2020, as the S&P 500 index vaulted to an all-time high, Anderson was following the spread of a new coronavirus in China. “I said we’d leave New York when there were ten confirmed cases,” he recalls. “I think we left when there were seven.” He retreated to an Airbnb — a renovated barn — near his parents’ house in Connecticut and waited for the long-predicted market correction. Anderson says he was anticipating it would put a lot of overvalued companies out of business, “a healthy but painful process,” which he felt would be cleansing for the economy. On March 12, as the world locked down, the stock market suffered its worst one-day decline since the crash of 1987.

The federal government opened its macroeconomic sluices, printing trillions of dollars. The stock indexes stabilized and before long were ascending past their previous highs. Newbie investors, killing time during the lockdown, started playing with stocks on apps like Robinhood. “Getting the impression that a large portion of those stimulus checks went straight into Robinhood accounts to buy YOLO calls,” Anderson tweeted incredulously that April. “The Fed just turned around and reinflated the biggest asset bubble of all time,” he says now. “At that point, it was just a question of how crazy it could get.”

Meanwhile, he has continued to watch Nikola’s stock, which remains one of his largest short positions. Milton resigned in the aftermath of Hindenburg’s report and an ensuing Me Too scandal — CNBC reported that two women had accused him of sexually abusing them as minors. GM scuttled the partnership deal, and the SEC began a fraud investigation of Nikola, which recently concluded with the company’s agreeing to pay a $125 million settlement, of which the whistleblowers in Utah are expecting a significant cut. (Nikola’s management and board members, including Ubben, declined to comment for this article, as did Milton, who has pleaded not guilty to the fraud charges. He has also denied the abuse allegations.) Milton’s indictment produced further damaging revelations about the state of the company’s technology, such as the allegation that, contrary to his claims to have discovered a way to produce hydrogen fuel at a quarter of the market price, “Nikola had never obtained a permit for, let alone constructed, a hydrogen production station, nor had it produced any hydrogen.”

Still, despite all that, Nikola’s stock had not gone to zero. Instead, it’s been hovering at around $10 a share, giving Nikola a capitalization of more than $4 billion. Milton has cashed out millions’ worth of stock, but he still owns enough of the company that he remains a billionaire, or close to it, on paper.

“I view that more as a reflection of the complete market insanity that we’re in now,” Anderson says. “Where pictures of digital tulips are trading for hundreds of thousands of dollars.”

Buy GameStop, Fight Injustice. Just Don’t Sell.
By Tara Siegel Bernard, Emily Flitter and Anupreeta Das

It has been a year since Mat Bowen, who was the pastor of a small church in Gibson City, Ill., had the dream — the one where Elon Musk, the head of Tesla, urged him to buy Dogecoin.

Mr. Bowen had just begun to dabble in investing. He soon discovered WallStreetBets, the online forum on Reddit where throngs of small investors were plotting to buy shares of GameStop, the troubled video game retailer, in a bid to teach Wall Street a lesson. Some hedge funds had bet that shares of GameStop would fall. Instead, they took off, as the investors banded together last January to drive the price up more than 1,700 percent.

Caught up in the frenzy, Mr. Bowen bought GameStop, too. In July, he quit the church to become a full-time trader, convinced he was joining a fight against financial injustice.

The beliefs underpinning last year’s meme stock phenomenon are stronger than ever. For a large number of individual investors, the stock market has become the battleground on which they join forces to right perceived wrongs and fight the powerful. So much so that when the stock market seesawed this past week, many small investors were undeterred. Falling prices were another opportunity to buy more shares of their favorite companies.

“The reason I am still in this, and the reason I am willing to ride these stocks to zero, is for my fellow citizens,” said Mr. Bowen, who received his master’s degree in divinity at the Princeton Theological Seminary. He cast the so-called meme stock fight in moral terms. “The battle of good versus evil is not just limited to the walls of a church or a synagogue or a mosque,” he said.

GameStop was the coming-out party of sorts for a sweeping change in stock market investing that had been bubbling since the start of the pandemic. Stuck at home, flush with government stimulus checks, known as “stimmies,” and watching the stock market rise to nosebleed levels even as the economy teetered, millions of small investors began to take tentative steps into trading in the spring of 2020.

Who Really Got Rich From the GameStop Revolution?
By Spencer Jakab

The rookies who transformed videogame retailer GameStop into the hottest stock on the planet thought they had a twofer: sticking it to Wall Street while making a bundle themselves. But the revolutionaries didn’t do a good job on either count.

The GameStop surge was often portrayed as a triumph of amateurs over professionals, fueled by social media. Tips posted on Reddit’s WallStreetBets forum were quickly amplified over TikTok, Twitter and messaging platform Discord, allowing hordes of individual traders around the world to act in concert and even drive markets. They circulated memes showing images of their favorite companies superimposed on the surface of the moon, representing how high they wanted the stock to climb. They said they wanted to defy the hedge funds that bet against these stocks.

Some of those small-time investors did make money as WallStreetBets ballooned from fewer than 2 million members at the start of January 2021 to more than 11 million. One was Keith Gill, aka “Roaring Kitty,” who became a hero to Reddit users while personally earning tens of millions of dollars. And some on Wall Street fared poorly. Gabe Plotkin’s hedge fund Melvin Capital lost a whopping $6.8 billion in January 2021, The Wall Street Journal reported, and some other funds suffered heavy losses too.

When the smoke cleared, though, the popular image of tables being turned on America’s financiers wasn’t entirely accurate. Individual investors who bought GameStop this time a year ago and never sold are sitting on hefty losses.

Firms that stood in the middle of all this buying and selling benefited while taking less risk than some other winners. One trading app, Robinhood, laid the foundation for a craze that nearly overwhelmed it by attracting millions of mostly young people drawn by no-commission trading and lottery-like stock giveaways.

It almost worked too well; Robinhood briefly halted purchases of several meme stocks and had to quickly raise more than $1 billion in January 2021 to help meet rising demands for cash stemming from all the frenzied trading. But it survived and made almost twice as much revenue that quarter as it had in all of 2019. The broker went public months later, making co-founders Vlad Tenev and Baiju Bhatt multibillionaires.

“We’re proud to have made investing more affordable for a new generation, paving the way for our commission-free model to be adopted industrywide,” said a Robinhood spokeswoman. “Gifting people stock for sharing Robinhood with their friends isn’t just a reward, it helps first-time investors see themselves as owners. We reject the idea that it’s ‘investing’ if you’re wealthy, but ‘gambling’ if you aren’t.”

Robinhood, in turn, created opportunities for other financial giants such as Citadel Securities, the electronic-trading firm largely owned by hedge-fund billionaire Ken Griffin. Citadel Securities executes many of the orders submitted by small investors through Robinhood—as well as other online brokerages such as TD Ameritrade. It pays Robinhood for the right to execute Robinhood customer orders, and Mr. Griffin’s massive computers fill them in nanoseconds by making many small, calculated bets that can add up to billions in profits.

Mr. Griffin, in testimony before a U.S. House committee last February, said he played no role in the controversial decision by Robinhood to curb trading in GameStop at the height of the stock’s rally. And why would he have? On Jan. 27, at the height of the GameStop rally in 2021, Citadel Securities executed 7.4 billion shares of trades for retail investors. That was more than the average daily volume of the entire U.S. stock market in 2019, Mr. Griffin said.
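To make the “many small, calculated bets” point concrete, here is a minimal sketch; the 7.4 billion-share figure is from Mr. Griffin’s testimony above, while the per-share capture is a purely hypothetical round number, not a reported Citadel Securities margin.

```python
# Hypothetical illustration of how tiny per-share margins scale for a market maker.
shares_executed = 7_400_000_000   # retail shares executed on Jan. 27, 2021 (from testimony above)
capture_per_share = 0.001         # hypothetical net capture of a tenth of a cent per share

one_day_gross = shares_executed * capture_per_share
print(f"Hypothetical one-day gross capture: ${one_day_gross:,.0f}")
# -> $7,400,000 on a single, exceptionally busy day, before hedging costs and payments
#    for order flow; typical days see far lower retail volume.
```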

Some of the biggest banks on Wall Street were also part of the frenzy. Morgan Stanley, the broker for “Roaring Kitty” and millions of other smaller investors after buying E*Trade in 2020, doubled its net profit in the first quarter of 2021 to $4.1 billion.

“More clients, more engagement, more activity, more cash,” Chief Financial Officer Jonathan Pruzan said at a virtual conference in February 2021. The number of trades made by E*Trade clients, he added, was “off the charts.”

Morgan Stanley rival Goldman Sachs Group Inc. rode the same wave. It earned $6.8 billion during the period for its highest return on equity in 12 years.

“I’d say the first quarter was an extraordinary quarter,” Goldman Chief Executive Officer David Solomon said at the time. Goldman couldn’t be reached for additional comment.

A year later certain GameStop insiders are gone, and some shared in the same meme-stock bounty. GameStop’s senior executives who left the company in the months following the stock surge reaped exit packages worth $290 million at the time, the Journal reported. That is more than the entire company had been worth before it was noticed by WallStreetBets.

Reddit, the company that was the staging ground for the revolt of amateur traders, also did well. It went from a private valuation of $3 billion in 2019 to $10 billion last summer after WallStreetBets helped to raise its profile. It has filed for an initial public offering.

Reddit co-founder and CEO Steve Huffman told the Journal that he would like to see retail investors, normally shut out of juicy deals, participate in the IPO.

Wall Street is still expected to be involved. The offering is being lead-managed by Goldman Sachs and Morgan Stanley.

Millions of Americans are fixated on stock prices. They shouldn’t pay such close attention
By Michael Hiltzik

To begin with, most Americans have little or no direct exposure to the stock market.

Only about 15% of all families own shares of stock directly, and even that figure is skewed by significant holdings by the top 10% of households (those with net worth of $1.2 million or more); about 44% of those households own stocks directly.

Among households with net worth around the national median of $121,700, only about 11% own shares of stock and among the poorest 20%, with net worth of $6,400 or less, only about 5% own stocks. (The figures are from the Federal Reserve’s latest triennial Survey of Consumer Finances, conducted in 2019.)

Even within the top 10%, direct stockholdings represent a fairly small proportion of their assets. “Though 70% of the top 10% directly own stocks,” economist Teresa Ghilarducci of the New School told the Senate Banking Committee last year, “it is only 13% of their wealth.” Businesses they own, real estate and retirement accounts and pension plans account for more than half their net worth, on average.

Over the last year or two, the narrative of stock trading as a pathway to wealth has gained in prominence. As Ghilarducci asserts, that’s a distinctly unwholesome development, especially for younger people who may just now be entering their prime earning years.

A surge in interest in short-term stock trading has been spurred by zero-commission brokerages such as Robinhood, which plies young investors with the impression of stock trading as a fun game and the notion that stock trading needed to be “democratized” — that is, removed from the control of big Wall Street players and placed in the hands of the retail trader.

“Robinhood has pushed financial literacy in reverse rather than advancing it,” Ghilarducci told me. “Places like Robinhood are making lots of money out of promoting this ideology of ‘democratization.’ They’re really hitting people who are used to electronic games, like young men. That’s distorting the decisions of some people, putting them at great risk.”

Robinhood is not only making a game of stock trading, but it is also luring customers into riskier investments such as options and nonsensical investments such as cryptocurrencies.

“We have to regulate these financial predators who traffic in the idea that wealth-building will happen if people have access to these risky products and services,” Ghilarducci says.

Why bitcoin is worse than a Madoff-style Ponzi scheme
By Robert McCauley

In 1920, Ponzi promised 50 per cent on a 45-day investment and managed to pay this to a number of investors. He suffered and managed to survive investor runs, until eventually the scheme collapsed less than a year into it.

In the largest and probably the longest-running Ponzi scheme in history, Bernie Madoff paid returns of around one per cent a month. He offered to cash out his scheme’s participants, repaying both the original sum “invested” and the “return” thereon. As a result, the scheme could and did suffer a run; the Great Financial Crisis of 2008 led to a cascade of redemptions by participants and the scheme’s collapse.

But the resolution of Madoff’s scheme has extended beyond its collapse on account of the remarkable and ongoing legal proceedings. These have outlived Madoff himself, who died in early 2021.

Many are unaware that a bankruptcy trustee, Irving H. Picard, has doggedly and successfully pursued those who took more money out of the scheme than they put in. He even managed to follow the money into offshore dollar accounts, litigating a controversial extraterritorial reach of US law all the way to the US Supreme Court. Of the $20bn in recognised original investments in the scheme (which the victims had been told had reached a value more than three times that sum), some $14bn, a striking 70 per cent, has been recovered and distributed. Claims of up to $1.6m are being fully repaid.

By contrast to investments with Madoff, Bitcoin is bought not as an income-earning asset but rather as a zero-coupon perpetual. In other words, it promises nothing as a running yield and never matures with a required terminal payment. It follows that it cannot suffer a run. The only way a holder of bitcoin can cash out is by a sale to someone else.

Bitcoin’s ‘One Percent’ Controls Lion’s Share of the Cryptocurrency’s Wealth
By Paul Vigna

It’s good to be the bitcoin 1%. The top bitcoin holders control a greater share of the cryptocurrency than the most affluent American households control in dollars, according to a study by the National Bureau of Economic Research.

The study showed that the top 10,000 bitcoin accounts hold 5 million bitcoins, equivalent to approximately $232 billion.

With an estimated 114 million people globally holding the cryptocurrency, according to crypto.com, that means that approximately 0.01% of bitcoin holders control 27% of the 19 million bitcoin in circulation.

By comparison, in the U.S., where wealth inequality is at its most extreme in decades, the top 1% of households hold about a third of all wealth, according to the Federal Reserve.
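The concentration figures quoted above are easy to recompute from the article’s own numbers. A minimal sketch in Python, using the estimates cited (not precise on-chain data):

```python
# Recomputing the concentration shares quoted above. All inputs are the
# article's estimates, not exact blockchain figures.
top_accounts = 10_000
top_holdings_btc = 5_000_000        # BTC held by the top 10,000 accounts
circulating_btc = 19_000_000        # approximate circulating supply
holders_worldwide = 114_000_000     # crypto.com estimate of bitcoin holders

share_of_holders = top_accounts / holders_worldwide   # ~0.009%, i.e. roughly 0.01%
share_of_supply = top_holdings_btc / circulating_btc  # ~26.3%, i.e. roughly 27%

print(f"Top accounts as a share of holders: {share_of_holders:.4%}")
print(f"Share of circulating supply they control: {share_of_supply:.1%}")
```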

The study, conducted by finance professors Antoinette Schoar at MIT Sloan School of Management and Igor Makarov at the London School of Economics, for the first time mapped and analyzed every transaction in bitcoin’s more than 13-year history.

The ramifications of that centralization are mainly twofold, the paper argues. First, it makes the entire bitcoin network more susceptible to systemic risk. Second, it means the majority of the gains from the rising price and increased adoption go to a disproportionately small group of investors.

“Despite having been around for 14 years and the hype it has ratcheted up, it’s still the case that it’s a very concentrated ecosystem,” Ms. Schoar said about Bitcoin.

Bitcoin was unveiled in 2008 as an open-source software project intended to be an electronic form of physical cash without gatekeepers. Anybody could download the software, become a “node” on the network, and “mine” for bitcoin.

In practice, though, bitcoin has become highly centralized. Most people who trade do so through exchanges. The costs of mining have become so high that only a small group of enterprise-level firms can afford to do it.

Bitcoin traders using up to 100-to-1 leverage are driving the wild swings in cryptocurrencies
By Kate Rooney and Maggie Fitzgerald

Traders who took excessive risk in the unregulated cryptocurrency market and were then forced to sell as prices fell were largely responsible for last week’s 30% drop in prices and for outages at major exchanges, according to analysts. A burgeoning bitcoin lending market is also adding to the volatility.

The price of cryptocurrencies tanked last week, with bitcoin losing roughly a third of its value in a matter of hours. Bitcoin popped back to nearly $40,000 on Monday but is still down about 33% from its high.

When traders use margin, they essentially borrow from their brokerage firm to take a bigger position in bitcoin. If prices fall, the brokerage demands more collateral in what’s known as a “margin call.” As part of that, there’s often a set price that triggers forced selling in order to make sure traders can pay the exchange back.
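That “set price” follows from simple arithmetic. A minimal sketch, assuming a long position, no fees or funding payments, and a hypothetical exchange that liquidates only when the trader’s margin is fully exhausted (real exchanges act earlier, at a maintenance-margin threshold):

```python
def liquidation_price(entry_price: float, leverage: float) -> float:
    """Price at which a leveraged long position's margin is exhausted.

    Simplified: ignores fees, funding payments and the maintenance-margin
    buffers real exchanges apply before full liquidation.
    """
    # With N-to-1 leverage, a price drop of 1/N wipes out the margin.
    return entry_price * (1 - 1 / leverage)

# A long opened at $40,000 per bitcoin:
for lev in (2, 10, 100):
    print(f"{lev:>3}x leverage -> forced selling near ${liquidation_price(40_000, lev):,.0f}")
# 2x  -> ~$20,000 (the price must halve)
# 10x -> ~$36,000 (a 10% drop)
# 100x -> ~$39,600 (a mere 1% drop)
```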

Brian Kelly, CEO of BKCM, pointed to firms in Asia such as BitMEX allowing 100-to-1 leverage for cryptocurrency trades.

The fact that bitcoin is not regulated by a central bank is part of what makes it so valuable to its investors.

The Rise and Fall of Bitcoin Billionaire Arthur Hayes
By Adam Ciralsky

Cryptocurrency, it bears repeating, is a digital form of payment and a method for storing value. It relies on a secure, decentralized ledger—called a blockchain—to record transactions, manage the issuance of new “coins” or “tokens,” and prevent fraud and counterfeiting. Though there are thousands of such currencies out there, Bitcoin is by far the most durable, despite having a dubious backstory involving an enigmatic creator named Satoshi Nakamoto, whose existence and identity have never been established. Bitcoin’s blockchain was designed so that only 21 million “virtual coins” would ever be “mined.” That kind of verifiable scarcity—in contrast with the tendency of the world’s central bankers to print money, whether in a pandemic or whenever it is politically expedient—has contributed to the currency’s meteoric rise in price, from less than a penny in 2009 to over $41,000 in January 2021. In 2020 alone the coin rose over 300% in value.

Arthur Hayes started small, with arbitrage: buying Bitcoin in one market and then selling it at a premium in another. Things were humming along until October 2013, when he had problems accessing coins he had sent to Mt. Gox, a Tokyo-based Bitcoin exchange that helped patrons convert their holdings into “fiat money”—traditional legal tender such as the dollar, euro, pound, or yuan. In early 2014, Mt. Gox declared that hackers had stolen nearly $500 million from its coffers. Unlike most other depositors—some 24,000—Hayes managed to get his money out and in the process learned an important lesson: Exchanges constitute a single point of failure in the otherwise secure Bitcoin ecosystem. Mt. Gox might have been the most infamous such hack, but dozens of exchanges have been hit, and untold billions—in Bitcoin and other cryptocurrencies—have vanished.

Hayes, however, decided to take his money elsewhere. When he heard Bitcoin was trading significantly higher on the Chinese mainland, he bought a bundle, transferred the coins to an exchange in China, and swapped them for yuan—literally lugging around a backpack containing stacks of banknotes. “Over a period of days,” he recounted, “I physically crossed the border by bus to Shenzhen with some friends, had lunch, and came back over the border carrying legal amounts [of cash].” It was a neat trick and relatively lucrative. But the real-world hazards of schlepping real money across international borders got him thinking: Why not build an online exchange where people could really profit off of their Bitcoin by using derivatives? (A derivative is a financial contract whose value is based on the performance of an agreed-upon underlying asset—in this case, cryptocurrency.)

In January 2014, Hayes arranged a meeting at a swanky rooftop watering hole with Ben Delo, a brainy British mathematician and programmer whose classmates at Oxford reportedly voted him the most likely to become a millionaire—and the second most likely to wind up in prison. After graduating in 2005, he worked for IBM, two hedge funds, and, after moving to Hong Kong, JPMorgan.

As the pair mapped out what it would take to turn Hayes’s vision into reality, Delo—an expert in the back-office work of designing complex algorithms and high-speed trading systems—said they needed a front-end web developer to handle the consumer-facing side of things. Hayes knew just the guy, a young American coder and tech evangelist named Sam Reed, whom Hayes had met after a speech Reed had given in which he’d warned his aspiring-techie audience not to join start-ups, whose owners often exploited and stiffed their coders. When Hayes pitched Reed on his idea for a Bitcoin-derivatives exchange, Reed, disregarding his own advice, signed on immediately.

Reed was much younger than Hayes and Delo, yet he had been at the crypto game the longest. By 2009, his senior year at Washington and Lee, the self-described “Bitcoin hipster” was mining Bitcoin on his laptop at a time when the currency was next to worthless. Reed racked up roughly 100 Bitcoins along the way, but in the process of reformatting a hard drive, accidentally erased the private keys required to access them, rendering his cache untouchable. (Today those coins would be worth $3.1 million.)

In an online career forum with his alma mater—taped while sitting in a hut in Thailand—Reed shared crypto-business tips. Among his insights: “In a gold rush, you don’t want to mine the gold. You want to sell the shovels.” At one point Reed remarked that he’d been toying with the idea of building an online exchange to trade cryptocurrencies, explaining his rationale: “If you can cut the banks out, you cut most of the complexity out. You cut out a lot of where U.S. law kind of gets involved with [anti-money-laundering], know-your-customer, KYC, kind of stuff, and you get rid of a lot of the fraud because all this, you know, internet money is actually verifiable, you know, by design.”

Hayes, Delo, and Reed began working in earnest on what they termed the Bitcoin Mercantile Exchange (BitMEX).

BitMEX was billed as “a peer-to-peer trading platform that offers leveraged contracts that are bought and sold in Bitcoin.” It allowed users to effectively bet on the currency’s future price with leverage of up to a dizzying 100 to one. Translation: A customer with $10,000 in his or her BitMEX account could seamlessly execute a trade worth a cool $1 million. The lure of the exchange lay in the fact that people could make big money by putting in relatively modest crypto seed money.
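The appeal and the danger of that 100-to-1 translation come from the same arithmetic: leverage multiplies both gains and losses on the trader’s own stake. A rough illustration with hypothetical numbers (real contracts add fees, funding payments and maintenance margin):

```python
# Sketch of how returns scale on 100x leverage. Purely illustrative.
margin = 10_000                  # trader's own money
leverage = 100
position = margin * leverage     # $1,000,000 of notional exposure

for price_move in (+0.01, -0.01):   # a 1% move in bitcoin's price
    pnl = position * price_move
    print(f"{price_move:+.0%} move -> P&L {pnl:+,.0f} on {margin:,} margin "
          f"({pnl / margin:+.0%} return)")
# A +1% move doubles the stake; a -1% move wipes it out entirely.
```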

In a blog entry on the BitMEX site, Hayes mused, “Trading without leverage is like driving a Lamborghini in first gear: you know it’s safer, but that’s not why you bought it.” His friend Jehan Chu compared BitMEX to the NASDAQ— “if the NASDAQ was located in Las Vegas.” When pressed about the potentially catastrophic downside of letting people trade so much on margin, Chu insisted that personal responsibility has always been central to the crypto ethos. “You put on 100x? Make sure you read the fine print. Mommy’s not here to make sure you don’t fall off the skateboard.”

“This stuff is happening very, very fast—it didn’t exist 10 years ago,” explained J. Christopher Giancarlo, who served at the powerful Commodity Futures Trading Commission (CFTC) under President Obama and later as the CFTC chairman under President Trump. “Regulation always follows innovation, and sometimes, in democracies, it follows a little further behind other jurisdictions.”

Understanding what BitMEX was selling is perhaps less important than whom the company was selling to. In our early conversations Hayes insisted that BitMEX was careful to have “no American customers” and that technological barriers, such as blocking U.S. I.P. addresses, kept American clients off the platform—and stateside regulators at bay.

But U.S. officials said that wasn’t the case. It did not escape their attention that BitMEX had plenty of American depositors, many of whom disguised their location by using virtual private network (VPN) software. They were flocking to BitMEX by the thousands. And even though Hayes is a product of the banking establishment, where whole departments are dedicated to enforcing anti-money-laundering (AML) and know-your-customer (KYC) requirements, his immersion into the deeply libertarian world of crypto seems to have blinded him to certain realities. Among them: U.S. authorities have wide reach, long memories, and an affinity for knocking people down to size—especially brash upstarts.

Hayes’s original sin might be that he refused to play the game. “He didn’t care about the charade and the optics and the bullshit and Silicon Valley and the think tanks—all the stupid shit you do for prestige. He just didn’t care…. Sometimes people’s greatest qualities are also their biggest downfall.”

By 2018, BitMEX had become a high-stakes bazaar, moving billions every day. During one of our meetings, Hayes commented, “We are the biggest trading platform in the world, by volume. That’s anyone who trades a crypto product.” BitMEX, he said, was one of the “most liquid exchange[s] in the world, regardless of asset class.” By that measure it was in the same league as the NASDAQ as well as the New York, London, and Tokyo stock exchanges. Within four short years Hayes’s scrappy casino had become, in gambling terms, the house. (Since the indictment was unsealed in October, BitMEX has taken a huge hit; its market share and trading volume have dropped precipitously.)

With BitMEX, Rimon argued, U.S. authorities trained their sights on the founders of the biggest, flashiest player in the digital-assets-derivatives space to send a message to the entire crypto community: “We’re going to make sure you understand this industry is subject to our jurisdiction.”

By charging BitMEX’s founders—personally—with serious crimes carrying serious time, officials have angered many in the wider crypto community. Some feel strongly that the game is rigged. “Show me a bank that doesn’t have money-laundering violations and I’ll show you a piggy bank,” Jehan Chu told me. “It’s a double standard. Who went to jail from HSBC for their money laundering and, you know, their Iran deals and all these kinds of sanctions violations? They got fined.” He’s not wrong. After HSBC admitted to laundering nearly a billion dollars for the Sinaloa cartel and moving money for sanctioned customers in Cuba, Iran, Libya, Sudan, and Myanmar, the Justice Department elected not to indict the bank or its officials, instead having it pay a $1.92 billion fine and install a court-appointed compliance monitor.

That was hardly an aberration. Barclays, BNP Paribas, Credit Suisse, Deutsche Bank, ING, Lloyds Banking Group, Royal Bank of Scotland, and Standard Chartered have all paid fines for conduct that has included money laundering, sanctions violations, and massive tax fraud. In the world of high finance, charging corporate officers in their individual capacity is rare. “You can Google ‘JPMorgan’ and ‘fraud’ and look at what comes up,” Hartej Singh Sawhney suggested. “Wells Fargo, JPMorgan, Goldman Sachs—they have pleaded guilty to fraud. And yet none of their sentences or fines are nearly as bad as what we’re looking at for Arthur.”

In fact, 48 hours before the charges against Hayes and his partners were announced, JPMorgan Chase “entered into a resolution”—as it was euphemistically termed—with the DOJ, the CFTC, and the SEC in which the bank agreed to pay close to a billion dollars in connection with two distinct schemes to defraud: one involving precious metal futures, the other Treasury notes and bonds. The FBI’s Sweeney was among those who announced the deal: “For nearly a decade, a significant number of JPMorgan traders and sales personnel openly disregarded U.S. laws that serve to protect against illegal activity in the marketplace…Today’s deferred prosecution agreement…is a stark reminder to others that allegations of this nature will be aggressively investigated and pursued.”

Really? Since 2000, JPMorgan Chase, America’s largest bank, has paid tens of billions in fines, including over $2 billion for anti-money-laundering deficiencies alone. Yet its CEO and chairman, Jamie Dimon, and his top lieutenants have not been pursued criminally. Instead, Dimon, who had toyed with a 2020 presidential run, collected $31.5 million last year in salary and incentives.

Archegos poses hard questions for Wall Street
By Robin Wigglesworth

… What on earth were some of the world’s biggest investment banks thinking when they enabled an opaque family office whose founder had a history of regulatory issues to rack up multibillion dollars worth of leverage? Hwang paid $44m in fines to settle US illegal trading charges in 2012, and in 2014 he was banned from trading in Hong Kong.

True, Archegos’ status as a family office means that it was exempt from a lot of the standard regulatory disclosures demanded of hedge funds. But banks’ prime brokerage desks — which service hedge funds with research, trade structuring and leverage — appear to have failed basic “know your customer” processes.

Each bank may have felt comfortable with its own exposure to Archegos, assuming it could always ditch the positions to cover itself. But they failed to appreciate that if everyone has to dump tens of billions of dollars’ worth of equities at once, the collateral cushions embedded in their contracts will be wholly inadequate.

In LTCM’s infamous blow-up in 1998, the fund adeptly took advantage of Wall Street’s hunger for fees to play banks up against each other and get access to hefty leverage from each of them — with each often unaware of their rivals’ true exposure.

But at least LTCM was at the time the biggest hedge fund in the world, founded by storied Salomon Brothers traders and advised by Nobel laureates. Aside from the under-appreciated size of Archegos — and the fat fees it probably paid to prime brokers — the fund and Hwang were essentially non-entities on Wall Street.

Which leads us to another question: What is Archegos Capital exactly? The size and leverage of its positions would be extreme even for one of the more aggressive members of the hedge fund industry, let alone a family office. In truth, it seems more like a Reddit day trader got access to a Goldman Sachs credit card and went bananas.

These Invisible Whales Could Sink the Economy
By Alexis Goldstein

From regulators to the financial press, everyone seemed mystified by the implosion of Archegos. In part, this is because when Congress passed the 2010 Dodd-Frank Act, which brought new measures of oversight to private money managers, they exempted family funds like Mr. Hwang’s. Hedge funds must publicly report certain stock and option positions every quarter, filing a Form 13F with the Securities and Exchange Commission. But family funds don’t need to file a 13F, so their portfolio positions remain hidden.

In 2021 Archegos bet that the prices of roughly nine hot stocks, including ViacomCBS, Discovery and Baidu, would keep climbing. It seemed to have good reason: Discovery and ViacomCBS were investing in streaming services, a booming sector. But Archegos wasn’t just buying and holding. It used borrowed money and underregulated derivatives to make bigger, riskier bets.

Just as you can borrow money against the value of your home, you can borrow money against a portfolio of stocks. Regulators limit nonprofessional investors to two-to-one leverage — brokers will let them purchase $200 in stock for every $100 in assets. But private money managers like Mr. Hwang face no hard upper limit: Many private funds are able to borrow five or 10 times the value of their portfolio. Archegos may have gotten as high as 20-to-one.
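The difference those leverage ceilings make is easy to see with a simple sketch of how a market decline eats into an investor’s own capital (ignoring margin calls made along the way; the leverage levels mirror the ones described above):

```python
# Fraction of the investor's own equity lost for a given market drop,
# at the leverage levels described above. A simplified sketch only.
def equity_loss(drop: float, leverage: float) -> float:
    """Share of the investor's own capital lost when the portfolio falls by `drop`."""
    return min(drop * leverage, 1.0)   # losses capped at total equity here

for leverage in (2, 5, 10, 20):
    print(f"{leverage:>2}x: a 5% drop costs {equity_loss(0.05, leverage):.0%} of equity")
# At 20x, a 5% decline in the underlying stocks erases all of the equity.
```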

Once Mr. Hwang’s investments started soaring, he wanted more. Rather than cashing out, he kept borrowing from the world’s biggest banks, driven to make even bigger bets. To do so, he used what’s known as a total return swap, a type of derivative that provides all the economic benefits of owning a stock without requiring Archegos to spend the money to actually buy it.

Here’s how it worked: Archegos paid a bank like UBS a preset fee. In exchange, the bank bought ViacomCBS stock and then paid Archegos any income and capital gains the stock generated. When the stock price dropped, Archegos would pay the bank the amount by which the stock fell. All Mr. Hwang had to do was find a bank that believed Archegos was creditworthy enough — and he found plenty. But the same leverage that powered Archegos’s success also proved to be its undoing.
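The cash flows described above can be reduced to a single-period settlement: the fund pays a financing fee, the bank holds the stock, and the stock’s total return is passed through to the fund, with losses flowing back the other way. A minimal sketch with purely illustrative numbers (the function name and figures are hypothetical, not terms from any actual contract):

```python
# Stripped-down, single-period view of a total return swap.
def swap_settlement(notional: float, financing_rate: float,
                    price_return: float, dividend_yield: float) -> float:
    """Net payment to the fund (negative means the fund pays the bank)."""
    total_return = notional * (price_return + dividend_yield)   # passed to the fund
    financing_fee = notional * financing_rate                    # paid by the fund
    return total_return - financing_fee

# $1bn of exposure to a stock that falls 50%, with a 1% fee and no dividend:
owed = -swap_settlement(1_000_000_000, 0.01, -0.50, 0.0)
print(f"Fund owes the bank ${owed:,.0f}")   # roughly $510 million due from the fund
```

At Archegos’s scale, a payment of that size on several positions at once is exactly the default the article goes on to describe.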

At the end of March, ViacomCBS sought to capitalize on a blistering four-month stretch that saw its stock triple, issuing new shares. ViacomCBS didn’t know that its growth was mostly a byproduct of Mr. Hwang’s giant swaps bets. Mr. Hwang didn’t participate in the ViacomCBS issuance, even after initially expressing interest: Reporting suggests he may have been so leveraged that he simply didn’t have money left to buy any shares.

As a result, ViacomCBS’s offering finished well short of its target. In just a week, the stock lost half its value. Because of how total return swaps are structured, Archegos had agreed to pay for any losses on the stock. When it couldn’t pay, it defaulted.

Without Mr. Hwang propping up these stocks, their recovery would be tough. Meanwhile, the banks needed to liquidate the stocks the swaps were based on and discussed creating a plan to do so without disrupting the market.

But Goldman Sachs and Morgan Stanley didn’t wait for a plan to be finalized and jumped first. The stocks cratered, and the other banks had little choice as the bottom fell out. Credit Suisse bore the worst loss: $5.4 billion, thanks to one bad trade.

If the banks had known how big Archegos’s positions were, they might have realized that other banks were supplying it with the same leverage — and reconsidered the trade. But a set of worrisome regulatory loopholes kept them from detecting this lurking whale.

Morgan Stanley’s Archegos Unwinding Sped Up Trading Probe
By Katherine Burton and Sridhar Natarajan

U.S. authorities ramped up their investigation into whether big banks and their hedge fund clients broke rules when privately negotiating large stock sales after the blowup at Archegos Capital Management, according to people with knowledge of the matter.

Regulators had already been scrutinizing block trades for years when Archegos shined a fresh spotlight on the market: Stocks Hwang had made massive bets on started tanking last March, prompting banks to unload tens of billions of dollars of his holdings through a spree of huge sales. Morgan Stanley had amassed one of the largest exposures, and was one of the first to dump positions that eventually led to the flame-out of the family office. With a new hook, authorities soon began poring through the carnage.

The spectacular rise and sudden collapse of Hwang’s fund has set off a number of inquiries into issues from market manipulation to collusion among banks.

The radical reshaping of the American economy, told through a Kansas City dissident
By David Hudnall

Q: So as not to scare off readers, I am trying to avoid saying “quantitative easing.” As you write in the book, it’s an intentionally opaque name for a program that’s not actually that complex. Can you give a quick overview of what it is?

A: Yes. The Federal Reserve has one superpower. It is the only institution on earth that can create new dollars out of thin air. That’s why it exists – to create and manage our currency, the dollar. QE at its root is just an experimental program that the Federal Reserve used to create money at a scale and pace at which it had never done before.

Why do that?

Well, at the same time that the Fed began doing this, after the global financial crisis, it also set interest rates at zero, which is extraordinary. The Fed kept interest rates at zero for seven years during the 2010s. When interest rates are at zero, a bank can’t earn any money by saving. So the idea was, this combination of low interest rates and pumping all this money into the banks would force banks to lend money to people and businesses that would stimulate growth in the economy and create new jobs. And that would heat up the economy and get things back on track.

What are some examples in the economy where you see the effect of this policy?

Asset prices: things people invest in, like tech stocks, commercial real estate, corporate bonds. Because you have all these people saying, “Well, I can’t make any returns by saving my money, I might as well take a risk on this Tesla stock at a super-high price because it beats the low yield I get on super-safe savings.” So this search for yield effectively pushes all this money out into the system to fund riskier and riskier assets and debt instruments.

Take the example of CALPERS, which is the huge pension fund for public retirees in California. It has billions of dollars in assets and the fund’s managers have to figure out where to invest it to earn money. Before, it could have met all its obligations by simply investing in super-safe Treasury bills. But when rates are at zero, that won’t cut it. So now we are seeing huge pension funds that used to be very conservative finding themselves having to invest in things like corporate junk debt and fracking wells – stuff that used to be the domain of risky speculators on Wall Street.

The Fed’s Doomsday Prophet Has a Dire Warning About Where We’re Headed
By Christopher Leonard

Between 2008 and 2014, the Federal Reserve printed more than $3.5 trillion in new bills. To put that in perspective, it’s roughly triple the amount of money that the Fed created in its first 95 years of existence. Three centuries’ worth of growth in the money supply was crammed into a few short years. The money poured through the veins of the financial system and stoked demand for assets like stocks, corporate debt and commercial real estate bonds, driving up prices across markets.

The Fed is now in a vise. Inflation is rising faster than the Fed believed it would even a few months ago, with higher prices for gas, goods and automobiles being fueled by the Fed’s unprecedented money printing programs. This comes after years of the Fed steadily pumping up the price of assets like stocks and bonds through its zero-percent interest rates and quantitative easing during and after Hoenig’s time on the FOMC. To respond to rising inflation, the Fed has signaled that it will start hiking interest rates next year. But if that happens, there is every reason to expect that it will cause stock and bond markets to fall, perhaps precipitously, or even cause a recession.

“There is no painless solution,” Hoenig said in a recent interview. “It’s going to be difficult. And the longer you wait the more painful it will end up being.”

To be clear, the kind of pain that Hoenig is talking about involves high unemployment, social instability and potentially years of economic malaise. Hoenig knows this because he has seen it before. He saw it during his long career at the Fed, and he saw it most acutely during the Great Inflation of the 1970s.

As a bank examiner, Hoenig spent the 1970s watching as the Fed’s policies helped pile on the inflationary tinder that would later ignite. These policies are known as “easy money” policies, meaning that the Fed was keeping interest rates so low that borrowing was cheap and easy. The Fed had kept rates so low through the 1960s and 1970s that, once inflation was accounted for, they were effectively negative by the late 1970s. When rates are effectively negative, that might be called a super-easy money policy. This kind of environment fuels inflation because all that easy money is looking for a place to go. Economists call this phenomenon “too many dollars chasing too few goods,” meaning that everybody is spending the easy money, which drives up the prices of the things they are buying because demand is high.
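The “effectively negative” point is just the simplest form of the real-rate relation: the real rate is roughly the nominal rate minus inflation. A tiny worked example with illustrative late-1970s-style numbers (not a precise historical series):

```python
# Real rate ~= nominal rate - inflation. Illustrative figures only.
nominal_rate = 0.065   # e.g. a 6.5% policy rate
inflation = 0.09       # e.g. 9% annual inflation

real_rate = nominal_rate - inflation
print(f"Real rate ~ {real_rate:.1%}")   # about -2.5%: savers lose purchasing power
```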

Importantly, the Fed creates these conditions by creating more and more dollars, or increasing the monetary supply, as the economists say.

As a bank examiner, Hoenig realized another very important thing. Easy money policies don’t just drive up the price of consumer goods, like bread and cars. The money also drives up the price of assets like stocks, bonds and real estate.

It all came to an end in 1979, with a severity that has never been repeated. Paul Volcker became chair of the Federal Reserve and he was intent on beating inflation by hiking interest rates. Under Volcker, the Fed raised short-term interest rates from 10 percent in 1979 to 20 percent in 1981, the highest they have ever been. This unleashed massive economic havoc, pushing the unemployment rate to 10 percent and forcing homeowners to take out mortgages with 17 percent interest rates or higher. Volcker recognized that when he was fighting inflation, he was actually fighting two kinds: asset inflation and price inflation. He called them “cousins,” and acknowledged that they had been created by the Fed.

“The real danger comes from [the Fed] encouraging or inadvertently tolerating rising inflation and its close cousin of extreme speculation and risk taking, in effect standing by while bubbles and excesses threaten financial markets,” Volcker later wrote in his memoir.

When the Fed doubled the cost of borrowing, the demand for loans slowed down, which in turn depressed the demand for assets like farmland and oil wells. The price of assets collapsed, with farmland prices falling by 27 percent in the early 1980s and oil prices falling from more than $120 to $25 by 1986. This, in turn, created a cascading effect within the banking system. Assets like farmland and oil reserves had been used to underpin the value of bank loans, and those loans were themselves considered “assets” on the banks’ balance sheets. When the loans started failing, the banks had to write down the value of those loans, which made some banks appear insolvent because they suddenly didn’t have enough assets on hand to cover their liabilities. When land and oil prices fell, the entire system fell apart.
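The cascading mechanism in that paragraph can be seen on a stylized bank balance sheet: when collateral-backed loans are written down, liabilities do not shrink with them, so equity can turn negative. All figures below are hypothetical:

```python
# Stylized balance sheet showing how loan writedowns can make a bank
# appear insolvent, as described above. Figures are hypothetical.
assets = {"loans_backed_by_farmland_and_oil": 90, "cash_and_securities": 10}
liabilities = {"deposits_and_borrowings": 92}

def equity(assets: dict, liabilities: dict) -> float:
    return sum(assets.values()) - sum(liabilities.values())

print("Equity before the collapse:", equity(assets, liabilities))   # 8

# Collateral values fall ~27%, forcing the loan book to be written down:
assets["loans_backed_by_farmland_and_oil"] *= 0.73
print("Equity after the writedown:", round(equity(assets, liabilities), 1))  # negative -> insolvent
```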

“You could see that no one anticipated that adjustment, even after Volcker began to address inflation. They didn’t think it would happen to them,” Hoenig recalled. Overall, more than 1,600 banks failed between 1980 and 1994, the worst failure rate since the Depression.

The Minsky Moment
By John Cassidy

Twenty-five years ago, when most economists were extolling the virtues of financial deregulation and innovation, a maverick named Hyman P. Minsky maintained a more negative view of Wall Street; in fact, he noted that bankers, traders, and other financiers periodically played the role of arsonists, setting the entire economy ablaze. Wall Street encouraged businesses and individuals to take on too much risk, he believed, generating ruinous boom-and-bust cycles.

There are basically five stages in Minsky’s model of the credit cycle: displacement, boom, euphoria, profit taking, and panic. A displacement occurs when investors get excited about something—an invention, such as the Internet, or a war, or an abrupt change of economic policy.

With the cost of borrowing—mortgage rates, in particular—at historic lows, a speculative real-estate boom quickly developed that was much bigger, in terms of over-all valuation, than the previous bubble in technology stocks.

As a boom leads to euphoria, Minsky said, banks and other commercial lenders extend credit to ever more dubious borrowers, often creating new financial instruments to do the job. During the nineteen-eighties, junk bonds played that role. More recently, it was the securitization of mortgages, which enabled banks to provide home loans without worrying if they would ever be repaid. (Investors who bought the newfangled securities would be left to deal with any defaults.) Then, at the top of the market (in this case, mid-2006), some smart traders start to cash in their profits.

The onset of panic is usually heralded by a dramatic effect: in July, two Bear Stearns hedge funds that had invested heavily in mortgage securities collapsed.

You might think that the best solution is to prevent manias from developing at all, but that requires vigilance. Since the nineteen-eighties, Congress and the executive branch have been conspiring to weaken federal supervision of Wall Street. Perhaps the most fateful step came when, during the Clinton Administration, Greenspan and Robert Rubin, then the Treasury Secretary, championed the abolition of the Glass-Steagall Act of 1933, which was meant to prevent a recurrence of the rampant speculation that preceded the Depression.

The greatest need is for intellectual reappraisal, and a good place to begin is with a statement from a paper co-authored by Minsky that “apt intervention and institutional structures are necessary for market economies to be successful.” Rather than waging old debates about tax cuts versus spending increases, policymakers ought to be discussing how to reform the financial system so that it serves the rest of the economy, instead of feeding off it and destabilizing it.

Business leaders have to play a better political role
By Martin Wolf

Business operates within a system: market capitalism. This system is now globally dominant, at least in the economic domain. That is true even in today’s China. The essence of capitalism is competition. That has profound implications: competitive profit-seeking entities are essentially amoral, even if they are law-abiding. They will not readily do things that are unprofitable, however socially desirable, or refuse to do things that are profitable, however socially undesirable. If some try to do either of these things, others will outcompete them. Their shareholders may also revolt. Being or pretending to be virtuous may bring benefits to a company. But others may do well just by being cheaper. Society — at local, national and global levels — has to create the framework in which business works. This applies in all dimensions — labour law, social insurance, regional policy, financial regulation, competition policy, innovation policy, support for fundamental research, responses to emergencies, the environment and so forth.

What this might mean is the theme of a recent issue of the Oxford Review of Economic Policy on capitalism, which contains essays that undertake a challenging examination of the economics of contemporary capitalism. Crucially, the assumptions under which capitalism has evolved in recent decades are questionable and have had some highly perverse results. This is a really important volume (in which I was also involved).

Particularly important are essays by Anat Admati of Stanford and Martin Hellwig of the Max Planck Institute. Both consider the role of business leaders as influential but self-interested voices in setting public policy in company law, competition law, taxation, financial regulation, environmental regulation and many other areas. The outcome, they and other authors suggest, has been the emergence of a system of opportunistic rent extraction that creates uninsurable risks for the majority and vast rewards for a few. This has in turn played a big role in undermining confidence in democracy and increasing support for populists.

Business has used its influence to set the rules of the game under which it then can play. It is not the only voice, of course, but it is a well-resourced and influential one. It is particularly influential in the US, much the most important western country.

The results are a form of capitalism that, for all its undoubted economic superiority over alternative systems, creates a highly unequal distribution of rewards and shifts unmanageable risks on to ordinary people. The result has been today’s politics of anxiety and anger. The financial crisis of 2007-12 played a big role in fomenting that anxiety and anger, as many tens of millions of innocent people suffered while the institutions whose behaviour caused the implosion were rescued. This is surely why rightwing populists, notably Donald Trump, ended up replacing more traditional conservatives.

Now, however, the pandemic has created an opportunity for a politics of competence and shared purpose.

BlackRock CEO Larry Fink to chief executives: Treat your workers well
By Emily Peck

The old work world is gone, writes Larry Fink, the biggest investor on the planet, in his closely read letter to CEOs, released on Monday night.

  1. Driving the news: Used to be that companies could expect employees to come to the office five days a week, neglect workers’ mental health and keep wages low for those at the lower end of the income scale, writes Fink, who is the chief executive of BlackRock. Not any longer.

What he’s saying: “No relationship has been changed more by the pandemic than the one between employers and employees. CEOs face a profoundly different paradigm than we are used to.”

The big picture: Since 2018, Fink’s letters to CEOs have highlighted the buzzy term “stakeholder capitalism,” the notion that public firms shouldn’t simply be looking to maximize profits but should do more for their workers and society, too.

  1. The term is not “woke,” he writes in this year’s letter, perhaps as a rebuke to Republican senators like Pat Toomey, who have decried efforts by CEOs to address racial discrimination and climate change.
  2. “It is capitalism, driven by mutually beneficial relationships between you and the employees, customers, suppliers, and communities your company relies on to prosper.” (emphasis Fink’s).

Flashback: For the past couple years, Fink’s used his annual letter to push companies to respond to the risks of climate change.

  1. In his 2021 missive he said he’d already begun to vote against managers and directors who don’t show progress in this area, as Axios’ Felix Salmon wrote.

I Wouldn’t Bet on the Kind of Democracy Big Business Is Selling Us
By Kim Phillips-Fein

Historically, some private companies have occasionally supported goals like health and safety legislation to protect workers or the expansion of the welfare state through unemployment insurance and Social Security. But such support has very often followed from focused, tireless efforts by unions and other social movements to get companies to take these positions — and has come only when the disruptions have become so powerful that there appears to be no real choice and adoption offers companies a measure of control.

Mr. Fink’s vision is not new. At the close of the 19th century, after years of intense conflict with workers who sought to organize unions to secure higher wages, shorter work hours and basic safety provisions, a few business executives founded the National Civic Federation. It promoted negotiation between executives and representatives of workers deemed politically reliable. But the federation became increasingly focused on anti-socialist and anti-communist agitation, while the post-World War I labor movement collapsed.

The disaster of the Great Depression prompted some companies to back greater government regulation of economic life. Executives at firms like General Electric supported programs like Social Security in part because they recognized that the mass movements of the era — marches of the unemployed, sit-down strikes and other work disruptions — were potentially explosive if concessions were not made to working-class Americans.

But this alliance proved fleeting: Companies that supported reforms in the depths of the Depression no longer did once they saw other options. General Electric turned on its unions after a strike wave in 1946, relocating plants to the Southwest, where laws favored employers, and adopting hard-line bargaining strategies to weaken the electrical workers’ union.

In the late 1960s, after the legal victories of the civil rights movement in the South, the rise of Black power and uprisings of the decade in cities across the country, some corporate leaders suggested that they would do more to try to fight urban poverty — committing to hiring and training the long-term unemployed and to making philanthropic contributions. At the high point of protest against the Vietnam War, some business leaders tried to rally support for ending the war; one organization, Business Executives Move for Vietnam Peace, included almost 1,000 business leaders.

Today, some of the wealthiest Americans may be growing uncomfortable with the political destabilization that can accompany extreme inequality, and some may be anxious about the impact of climate change on their ability to generate profits. But this does not mean that they are eager to do the kinds of things that might actually address inequality or provide a meaningful way forward to a world less in danger of destroying itself. Mr. Fink’s letter may be partly born of good intentions, but the opposition of the Business Roundtable (whose members lead many of the economy’s largest corporations, including BlackRock) to potential corporate tax increases in Build Back Better played a key role in squashing it — and its relatively limited climate proposals.

The social responsibility trend in general undermines the idea of citizenship and of a public sphere as the place where decisions and arguments over economic and social policy play out. The commitment of business to democratic norms is pretty shallow, or at least it emphasizes a narrow understanding of what those are.

You can see this dynamic even at BlackRock. Environmental activists have protested BlackRock for years, calling on it to withdraw its investments in oil, gas and coal companies. But for all Mr. Fink’s talk about the long-term problem of climate change, his company has been unwilling to divest from these firms, despite his company’s 2020 pledge to halt investment in firms that earned more than 25 percent of their revenue from thermal coal. (Businesses, he wrote in defending his approach, “cannot be the climate police,” and he called on governments to increase their efforts.)

The reality is that there is no way to bypass the arduous, contentious work of building a politics that can sustain a more democratic culture. The only thing that brought elites to support such causes in the past — however tentative such support may be — is the pressure of political and social movements.

The political CEO
By The Economist

We believe that companies operating in competitive markets advance social progress. Nonetheless, as classical liberals, we also believe that concentrations of power are dangerous. Businesspeople will always lobby for their own advantage, but the closer they get to the government, the more harm they threaten to both the economy and politics.

As more citizens want firms to support causes they hold dear, CEOs who remain silent risk being accused of complicity. Fund managers seek to evaluate firms’ “environmental, social and governance” scores, in response to demand from their clients—and to charge juicier fees. Tech firms exercise influence over political speech. Many Americans think the government in Washington is broken and may hope businesses can fill the vacuum. Donald Trump bullied and enticed business. President Joe Biden has a big-government agenda that is founded on an alliance with business to bring about national renewal, to fight climate change and to gird America against the rise of China.

Even if those goals are individually laudable, all this amounts to a shift in the role of business that brings underappreciated risks. One is of a display of hypocrisy that discredits everyone. Many socially conscious investment funds are stuffed with the shares of tech giants accused of antitrust violations. Members of the Business Roundtable who took the pledge to look after all their stakeholders went on to cut hundreds of thousands of jobs last year, and are busy campaigning against tax rises to pay for the social cost of the pandemic. To want to defend voting rights, which are central to democracy, is only natural. But that leads ineluctably to the next test—over support for, say, new federal voting laws, reform of the Supreme Court and boycotts of China over human-rights abuses in Xinjiang. If CEOs claim that their companies are moral actors, will they be consistent?

Consider Delta Air Lines, which lobbied in private to amend the voting legislation in Georgia. It is part of an oligopoly that hurts consumers, has just received $8.5bn of government cash, cut its workforce by 19% during the pandemic and is an important polluter.

… make no mistake, companies are not a substitute for effective government. It is the state that ensures markets are competitive and not skewed by monopolies or corruption. Only governments can tax externalities such as pollution and build a social safety-net. And the only legitimate way to mediate America’s bitter divisions and protect its fundamental rights is through the political process and the courts—not the executive suite.

The Price of Woke Corporate Politics
By The Editorial Board

… when CEOs take sides in political fights unrelated to their business interests or regulation, they have to expect to be treated like politicians themselves.

CEOs may have different justifications for their new progressive politics. Some may be speaking out of conviction, though most seem ill-informed about the political fights they’ve joined, such as state voting laws. Nike and Mr. Donahoe flaunt their leftist positions as part of their brand identification. Others may be hoping to buy cheap grace from the Biden Administration. Still others may be bowing to politically charged employees. But CEOs get paid to lead and protect their brands, not to be led by people who think business should embrace their politics.

The woke business trend isn’t healthy for the free-market cause. CEOs risk undermining support on the grass-roots and Congressional right for business, mirroring the anti-corporate sentiment that dominates the left. CEOs can never buy enough absolution from the left as long as they believe in profit.

Corporate America is suffering from Bloombergism
By Henry Olsen

Bloombergism is my way of describing that peculiar combination of mildly woke social liberalism and devotion to rapacious capitalism typified by Michael Bloomberg. The former New York mayor made billions of dollars on Wall Street with his eponymous data terminal, the profits from which he then spun into the Bloomberg media empire. Few if any Gilded Age robber barons compare favorably with how ruthlessly he exploited a genuine innovation to accumulate his massive fortune.

Bloomberg differs from 19th-century industrial giants, though, in his approach to social issues. Bloomberg is fashionably pro-choice on abortion and has tipped his hat toward LGBTQ rights and funded campaigns for gun control, all while generally aiding causes deemed great and good by lettered liberals who wanted some of his cash. The waves of wokeness move quickly, and he did have to apologize for policing policies during his time as mayor that had fallen out of favor when he ran for president in 2020. But in general, the adroit entrepreneur stayed up with the market and mirrored his audience’s social liberalism.

This combination of views is stunningly common among educated elites, and especially among the particularly well-to-do. It’s the leitmotif of the annual World Economic Forum in Davos, Switzerland, where devotion to the pursuit of global lucre is coupled with 21st-century socially liberal noblesse oblige to salve the capitalists’ guilty souls. It’s so common that most who profess faith in Bloombergism think it’s just common sense.

It is not, at least not for most Americans. A 2019 poll by Echelon Insights found that 12 percent of Americans, when asked to select from five types of parties typical in a European multi-party system, favored “the Acela Party” — the hypothetical party devoted to the socially liberal, pro-capitalist ideas of Bloombergism. This was the second smallest of the five parties; only a Green-Left party advocating the policies popularized by Rep. Alexandria Ocasio-Cortez (D-N.Y.) attracted fewer voters. This estimate might be on the high side. A 2017 report by Lee Drutman found only 4 percent of voters were liberal on identity issues and conservative on economics.

This will give Bloombergists a particularly hard choice in the United States. If they prioritize their socially liberal views, they will fall into the Democratic Party coalition. That means they will always be pulled to their left on economic issues, even as they often find they are not woke enough for their new bedfellows. But if they prioritize their economic interests and join the Republicans, they must share the party with religious conservatives and MAGA devotees whom Bloombergists consider troglodytes at best. Many remain in denial of this basic fact, hoping that President Biden can save their bacon within the Democratic Party while a champion such as Nikki Haley can retake the GOP. But those are simply fever dreams; the old politics are not returning.

Social Consequences of the Pandemic: “The Super-Rich in the West Are Evading Their Responsibility”
By Martin Hesse und Michael Sauga

DER SPIEGEL: The big tech companies like Amazon and Google are making billions in the crisis, and employees in other industries are losing their jobs. Has the virus also made the rich richer and the poor poorer within countries?

Milanović: Not in general, although it is still too early to draw strong conclusions. Recent studies show that inequality is falling in many Western countries thanks to huge government aid programs. On the other hand, it can be observed that the super-rich in the Western world are evading their responsibility. They did not live up to their own claims during the pandemic.

DER SPIEGEL: What do you mean by that?

Milanović: I can still remember how the global elite promised in large public appearances at the World Economic Forum meeting in Davos that they would use part of their wealth to fight emergencies. Unfortunately, little of this has been seen in the past few months. Words and actions have been far apart. The inaction of big money in the U.S. was particularly noticeable.

DER SPIEGEL: The wealthy in industrialized countries have become even more affluent in recent years.

Milanović: The income elite in the West today typically benefit from high investment income and growing income from work at the same time. This applies to chief physicians, star lawyers or software developers, for example. This is a new phenomenon. In classical capitalism of the 19th and 20th centuries, only entrepreneurs and capitalists were at the top of the income pyramid. The change we are witnessing, though, is a long-term trend that is exacerbated by the pandemic – for example, by the boom on the stock markets.

DER SPIEGEL: Which deepens the gap between rich and poor.

Milanović: That’s right. But this change also has positive sides, because it levels out the old opposition between capital and labor. At the same time, it protects the elite, say, the top 10 percent, from losses because their prosperity comes from several sources, both from high labor and from capital incomes. That makes the privileged classes even stronger economically – and politically more influential.

DER SPIEGEL: The traditional middle class, on the other hand, has suffered economic losses over the past few decades. Will this development worsen?

Milanović: There are a number of signs of that. The pandemic has shown that you can work from practically anywhere in the world with the help of modern means of communication. As a result, something like a global job market is emerging for the first time in many professions. This is bad news for numerous professional groups in the West who have not previously had to fear global competition.

DER SPIEGEL: You believe that salaries may go down?

Milanović: Not only that. Those who live in Delhi and email their reports to Düsseldorf have a much lower cost of living than their local colleagues. This can lead to the paradoxical result that equality grows globally, but the middle class in the West suffers.

DER SPIEGEL: How should governments react to this?

Milanović: It would make sense to make it easier for lower-paid workers to access capital income. For example, through tax advantages or a promotion of worker shareholding. Education is even more important. Today the elites of the West use their wealth to buy their children access to the best schools and universities. This isolates the top 10 percent even more from the rest of the population. It reduces the chances of advancing socially and leads to a new form of rule by the money nobility. I call it homoplutia because the elite is wealthy (plutia) in terms of both financial and human capital.

End welfare as we know it — for the upper middle class
By Richard V. Reeves and Christopher Pulliam

The most formidable political class in the United States now is not the oligarchic 1 percent. It is not the struggling middle class. It is not the suffering poor. It is the upper middle class, strong in number, loud of opinion and fiercely determined to protect its interests.

It’s the upper middle class that candidate Joe Biden was attempting to appease when he promised not to raise taxes on anybody making less than $400,000 a year. Upper-middle-class Americans — those with household income in the top 20 percent — may not have incomes reaching quite those heights, but they are comfortably into the six figures. They also have solid retirement accounts, kids for whom four-year college is a given, good health care and valuable homes in good neighborhoods.

But they don’t come by all these things solely through their own hard work and brilliance (though they really do like to think that). Every step of the way, the government is there to help. Their affluence — perhaps your affluence — is plumped up with generous tax breaks and restrictive zoning laws. This is welfare for the prosperous.

Start with the tax code. Dozens of special federal tax breaks overwhelmingly benefit the affluent. Exhibit A is the deduction for state and local taxes, or SALT. President Donald Trump capped this regressive benefit at $10,000 a year. Now some Democrats, especially those representing upscale liberal districts, are fighting to get the whole deduction back. But if the cap is repealed, at the cost of more than $70 billion annually, 96 percent of the benefit will flow to households in the top fifth of the income distribution.

This reform would in fact be far more skewed to the rich than the Republican tax law that Democrats rightly denounced as regressive. Unsurprisingly, the Biden administration is cool on the idea. The outcome of this battle will show just how far the Democrats have gone toward becoming a party of the liberal upper middle class.
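The arithmetic behind the SALT figures above is simple: the tax value of the deduction is the deducted amount times the filer’s marginal rate, so capping the deduction barely touches anyone whose state and local taxes are already under the cap. A rough sketch with illustrative filers (the names, rates and tax bills are hypothetical, and the choice between itemizing and the standard deduction is ignored):

```python
from typing import Optional

# Rough sketch of why the SALT cap bites mainly at the top. Illustrative only.
def salt_benefit(state_local_taxes: float, marginal_rate: float,
                 cap: Optional[float]) -> float:
    deductible = state_local_taxes if cap is None else min(state_local_taxes, cap)
    return deductible * marginal_rate

high_earner = dict(state_local_taxes=60_000, marginal_rate=0.37)
middle_earner = dict(state_local_taxes=8_000, marginal_rate=0.22)

for name, filer in (("high earner", high_earner), ("middle earner", middle_earner)):
    capped = salt_benefit(cap=10_000, **filer)
    uncapped = salt_benefit(cap=None, **filer)
    print(f"{name}: repealing the cap is worth ${uncapped - capped:,.0f} a year")
# high earner: $18,500; middle earner: $0 (already under the cap).
```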

There’s also the mortgage-interest deduction, which costs $24 billion a year. This similarly lines the pockets of the affluent, with nearly 80 percent of the benefit going to the top fifth, but also distorts the housing market — and does nothing to broaden home ownership.

Or how about tax-advantaged 529 educational savings accounts? These are a product of George W. Bush-era tax cuts, and they, too, offer real financial benefits only to the affluent. We wouldn’t expect movement here anytime soon, however. When President Barack Obama tried to reform them, he was forced into a U-turn by liberal members of his own party, who were feeling the heat from their noisily entitled constituents.

As Paul Waldman wrote for The Post then, the proposal was “targeted at what may be the single most dangerous constituency to anger: the upper middle class. That’s because they’re wealthy enough to have influence, and numerous enough to be a significant voting bloc.” In the intervening years, the upper-middle-class grip on the party seems if anything to have tightened.

There are some moves afoot from the Biden administration to undo at least some upper-middle-class welfare benefits — but the White House had better get ready for a fight. Affluent Americans have come to see these benefits as vital for sustaining their lifestyle. You might even call it welfare dependency. Upper-middle-class Americans are also gaining political muscle — and seem ready to use it.

The Bush Restoration
By Michael Lind

The heat and noise of cultural warfare over issues like race and wokeness conceals a broad consensus within the narrow class of people whose needs and opinions are actually represented within the American party system. Business-class Republicans quietly support Planned Parenthood, while business-class Democrats are hostile to unionization (though they may tolerate harmless company unions that act in concert with corporate HR to enforce woke corporate norms). Republican neoliberals quietly, and Democratic neoliberals loudly, favor gay marriage and abortion rights—and also free markets, deregulation, cheap-labor immigration, and offshoring of industry to low-wage countries. They recognize the need for a safety net but prefer to subsidize low-wage workers through programs like the earned income tax credit (EITC)—as opposed to raising the minimum wage or cutting off the supply of low-wage immigrant labor that deprives the least educated American workers of real bargaining power.

It is unsurprising that both left neoliberals and right neoliberals oppose significant increases in taxation on the affluent and rich. During the 2020 presidential campaign, Biden promised not to raise taxes on any households that make less than $400,000. Restoring the state and local tax (SALT) deduction, a taxpayer subsidy to the metropolitan rich who overwhelmingly vote Democratic, is a goal of many neoliberal Democrats in Congress.

The working class itself is best defined by the level of education: the two-thirds of Americans who lack a B.A. If there were a working-class party, to judge from polls, it would be left on economics—supportive of higher Social Security and Medicare benefits and organized labor—and moderately traditional on social issues. But the multiracial American working-class majority has no party, even though it makes up two-thirds or more of the population.

The effective disenfranchisement of two-thirds of the American population by the party system is a recent development. Portions—not all—of the working class used to be represented by private sector trade unions, local political machines, and activist churches within the Democratic Party. But with the exceptions of the Black evangelical churches that are important in Democratic politics, and the white evangelical churches that are a pillar of electoral support for Republicans, these intermediaries that once magnified working-class power have largely ceased to exist or exert much influence within party structures.

Electorally speaking, both progressive populism and conservative populism are instrumental frauds. While the educated professionals who define “progressivism” claim to be pro-union, they would be horrified by the return of powerful private sector trade unions led by the equivalents of Jimmy Hoffa or George Meany—few of whose followers would be likely to share progressive views on late-term abortion or transgender issues or bans on suburban construction and fossil fuels. For them, “union organizing” is a fun activity for liberal arts college graduates looking for a taste of “authenticity” complete with retro Shepard Fairey graphics that will help credential them in their future pursuits as foundation grant-makers, political operatives, or leveraged buyout specialists.

Republican populism is equally fraudulent. So-called Republican populists are mostly just culture-war grifters in the tradition of Patrick Buchanan. They may talk about limiting low-wage immigration or onshoring industry, but when it comes to any policies that might actually raise the wages and bargaining power of American workers—a higher minimum wage, universal health and retirement benefits, new kinds of organized labor—the fake populists of the GOP suddenly go silent or change the subject to noneconomic culture war issues, mostly involving race and sex.

As I’ve noted, the actual base of the pseudo-populist right consists of self-employed Americans—who make up no more than 10% of all working-age Americans—not the vast majority of Americans who work for others, least of all the majority of private sector workers who work for large firms with more than 500 employees. The business model of many Republican conservative small proprietors—owners of hair salons, family farmers dependent on foreign guest workers—relies on cheap and pliant workers. Laws that raise the wages and improve the working conditions of their workers threaten the small proprietors of the right, many of whom do not make all that much money anyway. Today as in the past, the small business lobby is the natural enemy of the underpaid, overworked American worker, of any race or religion.

For their part, genuine working-class Americans tend to pay little attention to politics and view it as something alien that they cannot influence. During elections, Democrats and Republicans try to overcome this entirely reasonable aversion through cynical campaigns to make their potential supporters terrified of the other side. Democrats try to win the votes of working-class Black, Hispanic, and Asian Americans by claiming that Republicans are white nationalists seeking to overthrow democracy and jail or deport them. In 2012, then-Vice President Biden said of Republicans to a multiracial audience in an affected Black idiom, “They’re gonna put y’all back in chains.” Republicans win the votes of their working-class white base by claiming (more plausibly) that progressives hate their traditional religious and patriotic values and are bent on desecrating their symbols and reeducating their children.

Of the two coalitions—the Democratic coalition of neoliberals and progressives and the Republican coalition of neoliberals and pseudo-populist petty bourgeois conservatives—the Republican coalition seems likely to continue its winning ways in the near future. The reason is that progressives frighten more Americans than conservatives do.

Progressives are cultural aggressors. They favor radical social engineering to change America and the world for the better: racial quotas in all jobs and school curricula; diversity training and indoctrination for all; massive increases in immigration; taking down statues of historic figures who do not meet today’s woke standards and changing the names of institutions named for them (goodbye, Woodrow Wilson School of Public and International Affairs); hostility to nonprogressive religious traditions; banning fossil fuels; forcing people out of their cars into mass transit and replacing suburbs with high-density apartment blocks; discouraging meat consumption. In every area of American social life, no matter how minor, there are progressive NGOs and university professors and their donors pushing some radical transformation that most Americans either don’t care about or strongly oppose.

Rule by Decree
By Michael Lind

From the beginning, America’s elite technocratic progressives sought to use the state and other managerial institutions for top-down social engineering. Many of them, like much of the early-20th-century intelligentsia, believed in “scientific” racism and eugenic sterilization of “the unfit.” Among these was Margaret Sanger, the founder of Planned Parenthood, which grew out of the eugenics movement. The flagship publication of the American foreign policy establishment for the past century, Foreign Affairs, resulted from a 1922 merger with the Journal of Race Development. Other causes of Progressive Era social engineers included urban and regional planning and resource conservation (later known as environmentalism).

Although most early Progressives were northern Republicans, many of them were disillusioned by the party’s domination by business, and some joined the New Deal Democrats. The crisis of the Great Depression initially excited many technocratic progressives, who saw it as a chance to realize their dream of remaking America from above as a planned society and a planned economy. Stuart Chase, the progressive intellectual who coined the term “New Deal,” was enthusiastic about Soviet-style multiyear economic planning.

But as Ellis Hawley showed in The New Deal and the Problem of Monopoly (1966), what he called “the planners” were soon disappointed to learn that Franklin Roosevelt and the Southern Democrats who dominated Congress had no interest in reconstructing the United States as a technocratic utopia run by Ivy League eggheads. The New Deal coalition included some progressive technocrats, but politically it was an alliance of urban machine politicians with working-class constituents and small-town politicians from the South and West. The unions and the farm lobby got much of what they wanted, but the planners got little out of the New Deal except for conservation policy and national parks.

Flash forward to 2021. Today’s Democratic coalition would have been unrecognizable to FDR or even to Lyndon Johnson.

As the white working class has streamed out of the Democratic Party, college-educated white professionals and managers who would have been moderate Republicans a few decades ago have streamed in. These affluent, white, suburban, former Republican voters handed Biden his victory in 2020, even as the Democrats lost substantial numbers of working-class Hispanics, Blacks, and Asian Americans.

In the post-farmer, post-labor Democratic Party, the two most powerful groups are the neoliberals backed by America’s corporations and banks, and the technocratic progressives, who are more interested in top-down social engineering than in empowering working-class people of all races and views.

Two historic events have empowered technocratic progressives since the Reagan and Clinton years. The first is a Niagara of money pouring into nonprofits and university programs funded from the outside. Old-timey egalitarians on the left may lament the upward redistribution of income to Silicon Valley and Wall Street billionaires and their wives and ex-wives, but technocratic progressives with noneconomic causes have been among the major beneficiaries. Technocratic crusades that struggled for funding in the 1950s and 1970s, like urban densification and environmentalism, or which did not take shape until recently, like gender activism, today need only hold buckets out the window to catch falling donations from major corporations, banks, and IPO plutocrats.

The other unanticipated event that has empowered the old and new progressive social engineering cults has been the success of neoliberal Democrats and Republicans in deregulating and privatizing telecommunications, finance, and other essential industries, transferring power from democratically elected or supervised public officials to self-regulating corporate and financial managers. This transfer of power from public to private was not carried out in the 1990s and 2000s to empower progressive intellectuals and activists in the nonprofit sector and university campuses. Indeed, both Clinton and Obama tried to distance themselves from the cultural left. The neoliberal program of shifting social power from the democratic state to private managerial elites was done to enrich the individual and corporate donors of neoliberal Democrats and Republicans, and also to enrich politicians themselves like Clinton, Gore, and Obama, who were rewarded for their service to the business and financial communities with help in building their own personal fortunes after they left office. (Fewer than 100 days after leaving office, Obama made a speech to the Wall Street bank Cantor Fitzgerald for $400,000.)

But the transfer of wealth and power to a small number of large corporations and banks created opportunities that technocratic progressives based in the vastly expanded NGO and academic sectors have learned to exploit.

Large commercial institutions have always been risk-averse and subject to political pressure campaigns, whether from the left or the right. To avert bad publicity, such as the protests mounted by the Rev. Jesse Jackson’s Operation PUSH (People United to Save Humanity), many large corporations adopted affirmative action programs following the civil rights revolution. Other companies in the late 20th century responded to more or less blatant shakedowns by green NGOs by making donations to approved environmental organizations and causes.

Today this has gone into overdrive, as we see from the corporate largesse showered on the Black Lives Matter movement and corporate promotion of transgender ideology—a cause unknown even to most liberals half a decade ago. As Joel Kotkin argues in The Coming of Neo-Feudalism (2020), pro-business neoliberals and social engineering progressives—rival factions in the Democratic Party as recently as the 1990s—have settled into a comfortable arrangement, like the aristocracy and clergy in medieval Europe. As long as the aristocracy donates to the clerisy and submits to its moral direction, the clerisy will not question the aristocratic order and will look the other way as the lords pillage the peasants.

This neofeudal deal between the corporate and financial managerial elite and the NGO-academic intelligentsia has become clear in the past few years. A decade or two ago, many Democrats would have been shocked that a social media firm like Twitter would censor a president, no matter how obnoxious. But many of the ascendant technocratic progressives in the Democratic coalition are eager to supply major U.S. corporations with blacklists of individuals and organizations to censor on YouTube and Amazon and to suggest Republican states for corporations and banks to boycott. The authoritarian DNA in America’s technocratic progressive subculture, inherited from the long-ago marriage of America’s mandarin progressives and the monarchical bureaucrats of Bismarckian Germany, is more evident than ever.

American technocratic progressives have always preferred to impose their social engineering plans without having to persuade ignorant voters and their grubby, corrupt representatives. In the first half of the 20th century, progressives looked to a European-style high civil service as a method for realizing their goals while circumventing Congress. When a Congress dominated by rural representatives and urban immigrant machines refused to create an all-powerful federal bureaucracy, America’s disappointed technocratic progressives put their hopes instead in activist federal and state judges, leading to the age of judicial lawmaking on made-up pretexts that began in earnest with the Warren Court and continues today. More recently, both Democratic and Republican presidents have made increasing use of executive orders to ram through partisan policies that cannot get through Congress or state legislatures.

In the last decade, in addition to judicial rulings and executive orders, progressive technocrats have found a new tool for authoritarian, undemocratic policymaking: the policies of corporations and banks. Why bother trying to get Congress or a state legislature to pass a bill requiring corporate boards to have rigid quotas for women, minorities, nonheterosexuals, immigrants, or whoever—which is arguably illegal—if you can persuade major banks not to fund any businesses that don’t have board quotas? You can skip all the tedious stuff—you know, making your case to the public, getting your proposal into a party platform, and assembling a legislative coalition. Just win over a few major bank executives, and you’ve imposed your pet social engineering policy on much of America without any public debate or votes by elected officials.

This kind of nondemocratic technocracy is really plutocracy in disguise. After all, who funds the NGOs of technocratic progressivism? Rich people, foundations endowed by rich people (living and dead), and corporations and banks, for the most part. So whenever you read the phrase “public interest group” or “social justice organization,” you should substitute “billionaire-or-corporate-funded social engineering bureaucracy.”

All of this raises the question: Why bother with elections at all? Why not just let progressive NGOs funded by the rich set the political agenda, and then have it imposed by corporate and bank policies on the American people?

Consider the following thought experiment. Suppose there is a movement in the next decade to legalize polyamory. Under our present system, this can be done by entirely nondemocratic methods. Let me explain.

… let us set aside the merits of polyamory in any form and focus on the task of imposing it on the American people without legislation or democratic debate. The process would begin with billionaire donors and foundation program officers at private conclaves agreeing to make polyamory the next big reform. Once the money faucet was turned on, various nonprofit groups hiring underemployed college graduates desperate for office cubicle jobs would spring up like mushrooms after rain—People United for Marriage Plurality (PUMP). Consultants would be paid to coin slogans and acronyms that can be repeated and retweeted: Plural Marriage Now (PMN).

Next would come the mainstream media articles and TV debates: “Has the Time Come for Plural Marriage in America?” Historians of Mormon polygamy and the Hebrew patriarchs would dash toward TV cameras and fight for op-ed space. Children’s book authors would try to catch the wave with My Three Mommies.

Bills in favor of legalizing plural marriage would be introduced in progressive Democratic states like Massachusetts and California, if not Mormon Utah. The democratic debate about plural marriage would not last long. Indeed, after a few weeks of controversy the progressive publication Vox might announce in its typical fashion that “The Debate About Plural Marriage Is Over.”

Federal legislation to legalize plural marriage nationwide would stall in Congress, however, and only a third or half of the states would have enacted their own legislation. At this point, the Supreme Court would step in. The federal judiciary is nominally divided between Democrats and Republicans, but its judges are almost all “liberaltarians”—socially liberal, pro-business, and anti-organized labor—and plural marriage would be coded as a liberal social issue of the sort popular in their social circles. So even with a “conservative” Republican bench, a majority of Supreme Court justices would find that federal and state laws against the recognition of plural marriages violate—well, they violate something. The 14th Amendment, maybe, or the right to bear arms. Maybe the clerks can find a constitutional pretext or two.

Once the Supreme Court made plural marriage the law of the land—on the basis of a constitutional right or natural right that nobody except the Mormon Prophet Joseph Smith and some stoned hippies in communes even knew existed—the brief window of democratic debate and legislation would be over. A day after the Supreme Court decision, corporate HR departments would make questioning the sanctity of plural marriage a firing offense. On university campuses, professors and students who questioned plural marriage would be disciplined for “hate crimes.” Books and podcasts critical of plural marriage might vanish from YouTube and Amazon, and critics of plural marriage might find their bank accounts suddenly closed and their online payment incomes demonetized. If state legislatures or city councils tried to resist plural marriage, major corporations would announce boycotts against them until the citizens and their elected representatives gave up and voted the way the top 100 corporations told them to vote.

For technocratic progressives, democracy is not a way to solve social problems. Democracy is an obstacle to social justice that should be circumvented whenever possible.

‘The Culture-War Stuff Just Rots the Brain’
By Len Gutkin

Tell me about the book you’re working on.

It’s called We Have Never Been Woke: Social Justice Discourse, Inequality, and the Rise of a New Elite, forthcoming with Princeton University Press.

At its core, the book is about changes in society that occurred as a result of the shift to what you could call the “knowledge economy”; it’s about the people who make their living by manipulating data, symbols, and information, who have become increasingly prominent and influential in recent decades. They have certain legitimizing narratives that justify their role: why they should be the ones who are entrusted with power, why other people who have power are terrible and should be overthrown. The book traces the rise of that group, looks at the narratives they use to legitimize themselves, and then explores the extent to which those narratives correspond to reality.

And in many instances, they don’t. When you look at who benefits the most from systemic inequality, it’s not the people in Trump country, for the most part. It’s people who live in places like New York or D.C., especially those affiliated with the knowledge professions. They’re the ones who are the winners in the prevailing order — and by “they” I mean we.

To give a concrete example: At a different stage of my life, I subscribed to the banal liberal understanding, one that a lot of people I’m surrounded by still believe, of who the “bad guys” are: Those damn Republicans! If only people here in podunk Arizona could be more like the people of enlightened New York, then God, what a beautiful country this could be!

And I had already shed a lot of this in previous years — but the vestiges that were left got destroyed when I arrived in Manhattan. One of the first things that stood out to me is that there’s a racialized caste system here that everyone takes for granted. You have disposable servants who will clean your house, watch your kids, walk your dogs, deliver food to you. Someone goes shopping for you and delivers it to your house. There’s this army of disposable, vulnerable, desperate people — mostly minorities and immigrants and disproportionately women — from particular racial and ethnic groups, while people from other racial and ethnic groups are the ones being served. And this is basically taken for granted in New York, that this is the way society operates.

And yet, the way things are in liberal bastions like L.A. or Chicago — this is not how things are in many other parts of the country. Most other places, the person buying a pair of shoes and the person selling them are likely to be the same race — white — and the gaps between the buyer and the seller are likely to be much smaller. Even the most sexist or bigoted rich white person in many other contexts wouldn’t be able to exploit women and minorities the same way as the typical liberal professional in a city like Seattle or New York; the infrastructure simply isn’t there. It’s these progressive bastions associated with the knowledge economy that have these well-oiled machines for casually exploiting the vulnerable, desperate and disadvantaged. And it’s largely Democratic-voting professionals who take advantage of them.

Where does Trumpism come in?

A few months after I arrived at Columbia, Trump won. I expected this to happen, but for most people, that was not the expectation. So here at Columbia, the day after Trump won, a lot of the students claimed to be so traumatized that they couldn’t do tests or homework. They needed time off. Now there are two things striking about that to me.

First, these are students at an Ivy League school, overwhelmingly people from wealthy backgrounds — and even if they don’t come from wealth, they’re likely to leave well-positioned. These are elite schools — schools designed to cultivate elites.

So even in their own self-narratives about what the impact of the election would be — the poor and vulnerable would be crushed underfoot while elites became even more well-off — guess what: We’re the elites! Realistically speaking, we’re the type of people who stand to benefit from someone like Trump. We certainly shouldn’t be thinking of ourselves as the little guy. But there seemed to be strikingly little recognition of these realities. Instead, many students seemed to view themselves as somehow uniquely vulnerable to Trump and his regime, as being especially threatened or victimized. And so they demanded all of these accommodations for themselves.

Meanwhile, there was this whole other constellation of people around them who seemed to be literally invisible to them.

The people doing all the work on the campus.

Yes! The landscapers, the people serving food, the security guards, the janitors. There was not one thought for them. And these were the people, according to the prevailing narratives, who stood to lose the most from Trump’s victory. They are disproportionately immigrants and minorities. Yet the students didn’t begin by demanding that those people need a day off, higher pay, better benefits or protections, etc. They were focused on themselves.

Nor were these ignored laborers — the people with the most at stake in this election — saying they needed time off because they were too traumatized. They showed up to work the next day and did their jobs. They weren’t making a scene, sobbing as they scrubbed rich kids’ mess out of the toilets. The juxtaposition was sobering. And I want to be clear, I’m not picking on Columbia students here. When I left campus, walking around the Upper West Side, or other affluent parts of Manhattan, similar scenes were playing out. Nor was New York City unique in this regard. Other knowledge economy hubs had similar scenes playing out. And the same drama that was playing out in Columbia was unfolding at colleges and universities across the country.

The Woke Meritocracy
By Blake Smith

What is new about education’s turn to woke identity politics is not the fact that administrators and faculty are influencing students’ sense of self, but rather the sort of values that the new ideal personality is supposed to uphold. The contemporary ideal, increasingly, is no longer someone so charmingly personable that others forget he is in fact a ruthless competitor, but a person who so convincingly narrates her having overcome some kind of social injustice that others forget she is in fact a beneficiary of systems of privilege.

Many students in my class received tremendous amounts of help on their admissions essays from dedicated tutors at their high schools as well as private writing coaches. Their letters are a collective output, a kind of shared fantasy of the ruling class. They should not be read for their insight into what students are really like, but for the purposes they serve their supposed authors and the society that has trained them to speak of themselves in these terms.

No students get to my classroom without having had a lot of good luck, or without learning how to occlude that luck through narratives of merit and identity. Nearly all of them have been born into wealthy, stable families, and attended excellent primary and secondary schools. Parents, teachers, and classmates pushed them to make the most of their cognitive abilities (another stroke of genetic and environmental fortune) and to develop the sort of personality most congenial to teachers and future employers. None of this was their own doing.

They were all, of course, trained to work hard, and eventually internalized a commitment to hard work to such an extent that they may believe themselves to have earned, by their own efforts and choice, access to a university degree that will in turn grant them entry into the highest levels of such lucrative professions as consulting and finance. They are able to tell themselves, through stories of personal merit and victory over oppressive structures, that they were the heroes of their own lives.

There is much that could be said about the pernicious role of elite universities and their admissions process in this crisis—but we must also be careful not to let urgently necessary critiques of economic inequality blind us to the fact that we cannot avoid having some kind of elite.

Even in an economy organized around meeting the needs of ordinary citizens rather than offering rewards to those imagined to be especially meritorious, there would still be an important and irreducible form of inequality at the heart of our democracy. Only a small number of people at any given time can make the important decisions that shape our collective life. As citizens, we have the right, and the obligation, to determine who our political elites will be—not only by deciding which person will receive our vote, but what kind of person the institutions that train our elites should produce. We must ask ourselves toward what ideal personality, or moral character, we expect the efforts of our educators to aim, and we must confront—and indeed lament—the ideal that we have tacitly accepted in both meritocratic and woke pedagogy.

Elites whose character has been shaped by the apparent conflict, and inner coherence, of meritocracy and wokeness may not be immoral or incompetent. I can offer no evidence that elites with a different sort of education would be better people or more effective leaders. I can only observe that every system of education aims, whether anyone acknowledges it or not, toward producing and privileging a certain human type, and that every society has an elite. Beyond the noisy conflict between defenders of meritocracy and their woke opponents, our society has chosen, and continues to choose, to educate its children with the apparent aim of making a class of leaders who are disconnected from any real solidarity with others yet unable to think for themselves, combining the worst qualities of individualism and conformism.

The Authority Blob
By David Samuels

Michael Lind: We must be careful not to attribute too much Machiavellian cunning to the American managerial elite, which rewards conformity over intelligence and is grossly incompetent. It has presided over one domestic and foreign policy disaster after another for the last generation.

Few of the members of America’s oligarchy are sincere leftists. They are more likely than the working class to have traditional, intact families, and the most successful among them enjoy plutocratic lifestyles. They pay lip service to transgender rights and racial “equity” but in general do not like labor unions, taxes, or economic regulations, and they send their children to private schools if no high-quality public schools are available.

As a rule, American institutional elites deploy identity politics cynically, to co-opt and bribe the leaders of groups that might either challenge their wealth and privilege or offer reliable blocs of voters. Elite Democrats kneel while wearing kente cloth and back race-based affirmative action in college admissions, business set-asides, and quotas on corporate boards, not to help working-class Black Americans but to co-opt college-educated Black and Hispanic professionals and business owners as intermediaries between the Democratic Party and non-white voting blocs. For decades, Reagan-Bush Republicans have similarly bought off leaders of evangelical Protestant populists who are hostile to big business and big banks with talk of “Judeo-Christianity” and conservative foundation grants for religious intellectuals. “People of faith” is the right-managerial version of BIPOC (Black, Indigenous, people of color).

The strategic replacement of autonomous grassroots leaders by a group of elite-designated astroturf leaders who speak for abstract race and gender categories and are wholly dependent clients of the managerial oligarchy needs an alibi. That alibi is provided by the establishment’s ongoing campaign to downgrade representative democracy in favor of representative demography. A. Philip Randolph and Cesar Chavez were union leaders, and Martin Luther King Jr. and other members of the Southern Christian Leadership Conference derived their authority from their churches. In contrast, today’s alleged racial spokespersons are designated by mostly white magazine editors, book publishers, prize committees, and foundation program officers.

Is Musa Al-Gharbi the Last Academic Who Can Tell the Truth?
By B. Duncan Moench

“There’s this big division between elites of color and other people of color,” Al-Gharbi said. “It’s especially true at universities, in particular prestigious ones. They usually support diversity, equity, and inclusion because it gives them an advantage in intra-elite competitions. It helps them get a leg up on other aspiring elites who happen to be white, or men, or whatever. So, there’s a very direct material advantage to them. Similarly, you see that the Black writers or Latino writers doing work for The New York Times are onboard with all of this, but it breaks down to what Chomsky has argued about elite education: It serves as a tool to filter out people who aren’t willing to say the ‘right things’ or don’t know how to avoid saying things which anger elites. Critics at The New York Times have clapped back at Chomsky’s argument and said, ‘Well, no one tells me what to write.’ To which Chomsky replies, ‘If you didn’t already know what the right things are to say, you wouldn’t be at the Times at all.’”

What happened with New York Times reporter Donald McNeil?
By Erik Wemple

In the summer of 2019, McNeil joined a group of students on a Times-sponsored educational excursion to Peru. In the wake of the trip, the Times received a number of complaints about the longtime science reporter’s conduct. Some of them cited allegedly racist remarks and behavior, including that McNeil had stereotyped African American youths. After an investigation, McNeil received a reprimand in September 2019.

Then Times management pivoted after receiving a letter from 150 Times staffers. That letter, sent on Feb. 3, said, “Our community is outraged and in pain. Despite The Times’s seeming commitment to diversity and inclusion, we have given a prominent platform — a critical beat covering a pandemic disproportionately affecting people of color — to someone who chose to use language that is offensive and unacceptable by any newsroom’s standards.” It also asserted that since the controversy became public, “current and former employees have suggested that he also has shown bias against people of color in his work and in interactions with colleagues over a period of years.” The letter furnished no specific examples of that, but it requested a “renewed investigation” into the 2019 controversy.

A Friday email from Executive Editor Dean Baquet and Managing Editor Joseph Kahn announced McNeil’s resignation and laid out the standard: “We do not tolerate racist language regardless of intent.”

With those last three words, the Times lost its foothold for criticizing a politician for flip-flopping. In his initial assessment of the McNeil case, Baquet wrote, “It did not appear to me that his intentions were hateful or malicious.” Now, suddenly, intent means nothing.

The legacy of the McNeil controversy will be to blockade intelligent discussion. In December, Nikole Hannah-Jones posted a tweet using the n-word in the context of exploring mores regarding the use of the epithet. There was no national controversy because the intent behind the tweet wasn’t vindictive or hurtful; it was journalistic. If intent no longer matters at the Times, however, wasn’t this a problematic posting?

Not in Hannah-Jones’s view. “As far as I know, we haven’t rewritten the employee handbook. I think context matters and I think the very smart people who run the New York Times understand that,” she says.

Slate clarifies guidelines on use of racial slurs following suspension of podcaster Mike Pesca
By Erik Wemple

Slate is clarifying internal guidelines surrounding the use of the n-word following the suspension of Mike Pesca, host of the daily podcast “The Gist.” Pesca, who is White, rankled colleagues in an internal Slack chat about the controversy surrounding former New York Times science reporter Donald G. McNeil Jr., who had used the n-word during a discussion about racist language with students on a Times-sponsored trip to Peru in 2019. He was disciplined at the time, and resigned from the Times earlier this month after details of the trip surfaced in the Daily Beast.

“My points are [McNeil’s] internal conduct was in a grey area, you guys don’t think it was,” wrote Pesca in the Slack chat, according to an extensive account in Defector. That account pointed out that Pesca had used the n-word twice in 2019 — once in an interview with a Slate reporter and a second time while recording his podcast. In both cases, the remarks were not ultimately aired. Those instances appear to have figured in Pesca’s suspension, which was initially for a week before being made indefinite. “The Gist” is also on suspension while an outside firm conducts an investigation, according to two sources.

Pesca used the slur in a 2019 taping of an episode of “The Gist” relating to how the media dealt with a news story involving the n-word: A security guard at a school in Wisconsin had been fired for telling a student not to call him the n-word.

The New York Times Buys Wordle
By Marc Tracy

The sudden hit Wordle, in which once a day players get six chances to guess a five-letter word, has been acquired by The New York Times Company.

The purchase, announced by The Times on Monday, reflects the growing importance of games, like crosswords and Spelling Bee, in the company’s quest to grow digital subscriptions to 10 million by 2025.

Wordle was purchased from its creator, Josh Wardle, a software engineer in Brooklyn, for a price “in the low seven figures,” The Times said.

The Times hits its goal of 10 million subscriptions with the addition of The Athletic.
By Marc Tracy

The New York Times Company reached its goal of 10 million subscriptions ahead of schedule, the company said Wednesday, aided substantially by the 1.2 million it gained by buying the sports news website The Athletic.

The $550 million deal for The Athletic, which was announced last month, was completed on Tuesday, the company said.

The Times also announced a new goal on Wednesday: It will aim, it said, to have at least 15 million subscribers by the end of 2027.

Meredith Kopit Levien, the company’s president and chief executive, said in a statement that The Times’s executives believed there were “at least 135 million” potential subscribers in the United States and around the world — adults “paying or willing to pay for one or more subscriptions to English-language news, sports coverage, puzzles, recipes or expert shopping advice.”

For the fourth quarter of 2021, the company reported adjusted operating profit of $109.3 million, a 12 percent increase from a year earlier, and revenue of $594.2 million, a 16.7 percent rise. Operating costs rose at virtually the same rate, to $500.1 million. Subscription revenue rose 11.1 percent, to $351.2 million.

For the year, revenue grew 16.3 percent, to $2.1 billion — making 2021 The Times’s first $2 billion year since 2012.

New York Times Confronts Labor Strife as Tech Workers Push to Organize
By Alexandra Bruell and Allison Prang

New York Times Co. is resisting a unionization effort by its technology workers and is battling the newsroom’s main union over the creation of new holidays as labor strife at the publisher intensifies.

Roughly 600 tech workers at the Times—including employees in engineering, product management, project management, design and data analysis—are currently voting on whether to unionize in an election held by the National Labor Relations Board. Ballots are due in mid-February, with results expected in March.

The Times’s management has declined to voluntarily recognize the new union, which is seeking to organize as part of the NewsGuild of New York.

The company has recently sent pamphlets to tech staffers that say it has made progress on “career growth and development, company culture and [diversity, equity and inclusion] and compensation and benefits,” according to a copy of the document viewed by The Wall Street Journal. It has encouraged them to vote against the union.

“It’s Chaos”: Behind the Scenes of Donald McNeil’s New York Times Exit
By Joe Pompeo

The Times was already under fire on two fronts. One was the recent termination of freelance editor Lauren Wolfe, some of whose tweets—most notably, one in which she professed to having “chills” watching Joe Biden’s plane land—the Times determined had crossed into political territory, a ruling that was criticized by many in the journalism community as unfair and heavy-handed. The other was the ongoing fallout from the Times’ now infamous Caliphate podcast. Crucial elements of Caliphate had fallen apart under scrutiny, and the podcast’s primary producer, Andy Mills, was now in the spotlight for allegations of past inappropriate interactions with female colleagues. A separate review of Mills had been unfolding in the background. It was prompted by new complaints about his behavior that began to pour forth after he guest-hosted The Daily in December, less than a week after the Times acknowledged its failures with Caliphate and reassigned the podcast’s star reporter, Rukmini Callimachi. (Times managers have privately acknowledged that allowing Mills to guest-host The Daily was a “severe error in judgment,” as one put it.)

‘I don’t see what the solution is here’: New York Times editor speaks truth about social media
By Erik Wemple

There’s a tension in news organizations around social media, argued Elisabeth Bumiller, The Times’s Washington bureau chief: Management and “corporate communications” support tweeting because it circulates the company’s content. She cited the example of reporter Maggie Haberman, who can boost a story to her 1.7 million followers. But there’s no editing for tweets, noted Bumiller:

[R]eporters, despite how many times they are told, “Do not tweet anything out that you would not see in the New York Times,” they’re human beings. They don’t have editors. It’s late at night. They’re angry. They’re upset. They’re excited. They tweet out things that are not appropriate.

And inevitably, I get the call. I mean, I can’t possibly have the time to police it, but I get a call from New York saying, Would you please talk to so-and-so? You’ve got to talk to so-and-so. There’s about three or four people. And I talk to them and they say, We promise never to do it again.

Times managers spent an untold number of hours two years ago on the case of Jonathan Weisman, the Washington-based editor whose insensitive tweets led to his demotion, as the Times itself reported. As the Times account notes, Weisman had discussed his Twitter conduct with Phil Corbett, the paper’s standards editor, and met with Executive Editor Dean Baquet. “I accept Dean’s judgment,” Weisman said. “I think he’s right to do what he’s doing. I embarrassed the newspaper, and he had to act.”

That’s a pretty incisive quote that captures the predicament of news executives vis-a-vis social media enforcement and discipline. Is there a single editor in this business who starts the day eager to scold a reporter who ran afoul of the company’s social media guidelines? Bumiller described in detail how such an exchange frequently goes:

And you often think, What were you thinking? And the answer is, Well, she said something worse about me. And I would say, You are a reporter for the New York Times. Why do you care about what she said? But I go through this all the time. You are above this. This is beneath you to get involved. Why are you doing this?

As a manager, you make like 400 decisions. And every one of them are judgment calls based on the situation. So that’s a lot of what I do … a lot of psychology, too.

Boldface added to highlight the sea of subjective muck through which managers must wade when social media crises come their way. Every errant tweet represents a new challenge: Is it a bald-faced violation of company rules, or does it lie in one of those many gray areas? Does it reveal some bias in the news organization’s coverage? How does it compare with so-and-so’s awful tweets from last month? Much worse? About the same?

New York Times fires ‘Wirecutter’ editor accused of leaving profane voicemails for gun rights group
By Jeremy Barr

The New York Times has fired an editor for Wirecutter, its popular product recommendation service, who was accused of leaving profane voice mails for a gun rights advocacy group, the newspaper said Friday morning.

On Dec. 2, two days after a high school student shot and killed four classmates in Michigan, the editor, Marquis, posted a message on Twitter criticizing a group called Great Lakes Gun Rights for urging supporters to oppose gun control legislation proposed in the aftermath of the shooting.

“Just got a news release from the Great Lakes Gun Rights organization about protecting gun rights from democrats in Michigan and I am literally shaking with rage,” wrote Marquis, who has since deleted her Twitter account following online criticism that she had violated journalism standards by promoting a political viewpoint. “I hope there is a God and they meet that God someday.” She also tweeted out a phone number and email address for the group, which is the Michigan state affiliate of the National Association for Gun Rights.

The national organization reacted by publishing an audio recording that it said contains voice-mail messages that Marquis had left at its offices, expressing her anger at the group. In the messages, which could not be independently verified as coming from Marquis, the speaker identifies herself as “a journalist at the New York Times” and asks: “How do you sleep at night? And aren’t you just, like, a little bit worried that there might be a hell, and when you meet God, he will send you there?” The speaker then says she is “letting everyone at the New York Times know” what she thinks of the organization.

After the voice mails were published, a Times spokesperson said the company would review the matter and suspended the employee.

“The employee has been terminated from Wirecutter following our investigation related to inappropriate behavior,” a spokesperson told The Washington Post on Friday morning. “We expect our employees to behave in a way that is consistent with our values and commitment to the highest ethical standards. Repeatedly invoking the New York Times’s name in an unprofessional way that imperils the reputation of Wirecutter, The Times, and all of our journalists is a clear violation of our policies and cannot be tolerated.”

How the AP wronged Emily Wilder
By Erik Wemple

The 22-year-old Wilder received her dismissal notice following a successful attempt by conservatives to promote outrage over her activist work while attending Stanford University, where she served as a leader of Students for Justice in Palestine. The episode points to two emerging facts of life in contemporary mainstream media — one, that editors at large news organizations quake when right-wing actors target their colleagues; and two, publishers’ concerns over ethical appearances and perceptions are reaching irrationality.

According to Wilder’s dismissal letter, the rumblings from Stanford — and inquiries from the Washington Free Beacon, Fox News and others — prompted a deeper look into the AP rookie’s social media history. “As discussed, over the last few days some of your social media posts made prior to joining AP surfaced,” reads the dismissal letter. “Those posts prompted a review of your social media activity since you began with the AP, May 3, 2021. In that review, it was found that some tweets violated AP’s News Values and Principles.”

Social-media brushback talks, say the former staffers, occur regularly at the AP, which has 1,400-plus journalists — but the guild says that a “great majority” of these situations are resolved “quietly” and “discipline-free.”

Meaning, Wilder was wronged by her editors.

… Having opinions on foreign crises — or having participated in college activism on such issues — is not a conflict. It is, rather, a set of beliefs, or a personal intellectual history.

Billionaire investor Chamath Palihapitiya says ‘nobody cares’ about Uyghur genocide in China
By Amanda Macias

Billionaire investor Chamath Palihapitiya triggered a backlash on social media after saying during a recent episode of his podcast that “nobody cares” about the ongoing human rights abuses against the Uyghurs in China.

About 15 minutes into the podcast, his co-host Jason Calacanis pointed to the Biden administration’s steps to curb and address China’s sweeping human rights abuses, and the following conversation ensued:

Calacanis: His [President Biden’s] China policy, the fact that he came out with a statement on the Uyghurs, I thought it was very strong.

You know, it’s one of the stronger things he did, but it’s not coming up in the polls.

Palihapitiya: Let’s be honest, nobody, nobody cares about what’s happening to the Uyghurs, okay? You bring it up because you really care. And I think that’s really nice that you care but …

Calacanis: What? What do you mean nobody cares?

Palihapitiya: The rest of us don’t care. I’m just telling you a very hard truth.

Calacanis: Wait, you personally don’t care?

Palihapitiya: I’m telling you a very hard truth, okay? Of all the things that I care about. Yes, it is below my line. Okay, of all the things that I care about it is below my line.

Calacanis: Disappointing.

Palihapitiya went on to say that he cared about supply chain issues, climate change, America’s crippled health-care system as well as the potential economic fallout of a Chinese invasion of Taiwan.

He later clarified his remarks in a Monday evening tweet, saying he recognizes that he came across as “lacking empathy.”

How to Become an Intellectual in Silicon Valley
By Aaron Timms

On your path to becoming an intellectual in Silicon Valley, understanding these two lessons—the Peter Principles, we’ll call them, since that adds nothing to the conversation but sounds sophisticated—will be key to your success. First, the point of your interventions in the public sphere is not to “win” any “argument,” nor to attract new adherents or convince neutrals of the righteousness of your cause. It is to avoid competition. When competition seeks you out, as it invariably will, your task will be to lose the debate and propose ideas that “seem” (and often are) “shit,” since popular discourse is a test of conventional mindedness; to be truly radical, you must be wrong. Second, there is no absolute moral evil that cannot be playfully reframed on irrelevant grounds as a net historical good. Take, for instance, poverty: what looks to most people like a recipe for social inequality, resentment, division, and violence will be, in your spritely retelling, the most powerful mechanism for income mobility in the history of human civilization. Or consider, say, Pol Pot’s killing fields: bad for the people who got stuck in them, but good for Cambodia’s startup ecosystem? Nazis did bad things to the world in the middle of the twentieth century, but there’s no reason to think they won’t do wonders for agency culture at the Food and Drug Administration in the early 2020s. Your success as a Silicon Valley intellectual will depend on your ability to insert difficult but necessary conversations like these into the public domain. A couple of half-decent ratioed tweets about the beauty of population control or the necessity of transphobia, and you’ll be well on your way to securing your status among the Silicon Valley elite.

These are golden days for the philosopher-VC. Moore’s Law states that the power of computing doubles every two years. Paul’s Law states that the population of VC intellectuals increases at a similarly exponential rate. This law takes its name, of course, from Paul Graham, the man who has done more than anyone else to make the venture capitalist not simply an investor of other people’s money but a critical thinker on politics and society. From his pithy Twitter feed to his legendary blog, the cofounder of Y Combinator has exhibited an unrivaled capacity to be consistently wrong on all the great questions of the age—or so his critics would have you believe. And this is the man’s special strength: the more the online mob criticizes him, the more right he’s convinced he is, and the stronger he grows. “Counterintuitive as it sounds,” he recently tweeted, “I’m always encouraged when I publish a new essay and get showered with insults. Showered = lots of people saw it, and insults = there were no big mistakes in it for them to refute.” Studying the great man’s Twitter feed—understanding both his obsessions and his style—will offer a sturdy template for you to follow as you embark on your career as a VC intellectual. (Obsessions = the things he is interested in. Style = the way he writes.) But he is not the only peer whose work you should become familiar with. Consider also Sam Altman, who once argued that “we have to allow people to say disparaging things about gay people if we want them to be able to say novel things about physics”; or Vinod Khosla’s pioneering contributions to the fight against public beach access (sample wisdom from Khosla Ventures website: “We believe change depends on unreasonable people”); or even Keith Rabois, who joined Khosla’s investment firm on the strength of its whiteboards and now knows more about residential real estate than probably anyone in the United States, according to Keith Rabois.

Jason Calacanis has leveraged the global pandemic and a $2000 UberEats gift card to completely change the way we think about early childhood education. Marc Andreessen’s work making the difficult case in favor of colonialism continues to astonish and impress. In a world where mainstream thinkers often present climate change, the incompatibility of growth-driven economics with our planet’s ecological limits, or conflict between China and the United States as the greatest global threats, Balaji Srinivasan instead spent much of 2020 zeroing in on the real danger to humanity’s future: New York Times journalist Taylor Lorenz. To some, this looks like harassment. To those in the know, it’s a heroic stance against sensitivity to people’s feelings, which is orthogonal to the protection of free speech—a hall-of-fame contrarian move that you, as an aspiring VC intellectual, would do well to study in detail.

The Rise of the Thielists
By Benjamin Wallace-Wells

Before Peter Thiel was a billionaire, he had the biographical points of a pretty conventional Gen X young Republican. He was born in 1967, in Frankfurt, to a German family that followed his chemical-engineer father to jobs around the globe before settling in Northern California. As a teen-ager, Thiel was a mathematics prodigy who says he was comfortable taking contrarian positions early, supporting Ronald Reagan and opposing drug legalization in middle school. As an undergraduate, he founded the combative, conservative Stanford Review, and, after law school and a stint as an appellate clerk for a Reagan appointee, Thiel co-authored “The Diversity Myth,” in 1995, a book decrying mounting political correctness on campus. His career had scarcely begun—after a stint at a New York law firm, he’d founded a small tech-investment company. But Thiel already had a fully formed political identity, and his résumé wasn’t far from what you get from many Republican congressional candidates.

Silicon Valley was booming during the late nineties, and it did not take Thiel very long to have a huge hit, when he founded PayPal with a half-dozen friends and acquaintances. Thiel’s friends, George Packer wrote, in 2011, “are, for the most part, like him and one another: male, conservative, and super-smart in the fields of math and logical reasoning.” Thiel reportedly came out as gay to his friends in 2003 (he would be outed publicly by Gawker some years later, and went on to sponsor a lawsuit against the company). Thiel co-founded the defense-and-intelligence firm Palantir Technologies, in 2004; that same year, he became Facebook’s first outside investor. Thiel donated to John McCain’s 2008 Presidential campaign after supporting Ron Paul in the primary, but his Republicanism received less attention than the fanciful, long-arc libertarian projects in which he invested: the Seasteading Institute (which aimed to build politically autonomous cities on platforms in international waters), the Machine Intelligence Research Institute (which wanted to insure that artificial intelligence was friendly to humans), and the Thiel Fellowship (which supported exceptionally talented young people in creating startup companies if they skipped, dropped out of, or took time off from college).

In 2012, Blake Masters, then a Stanford Law student, took a course on startups that Thiel taught, and published the notes on his Tumblr page, where they became a phenomenon. By 2014, Thiel and Masters had published the notes as a book, “Zero to One,” offering theories on startups and advice for founders. Reviewing it, Derek Thompson, of The Atlantic, said he thought it “might be the best business book I’ve read.” Thiel and Masters emphasize the breadth of forces arrayed against any founder: “In a world of gigantic administrative bureaucracies both public and private, searching for a new path might seem like hoping for a miracle. Actually, if American business is going to succeed, we are going to need hundreds, or even thousands of miracles. This would be depressing but for one crucial fact: humans are distinguished from other species by our ability to work miracles. We call these miracles technology.”

The deepest quality of Thiel and Masters’s book is its outsized vision of what a heroic individual—a founder—can do. In a late chapter, they argue that successful founders tend to have the opposite qualities of those seen in the general population—that they are, in some basic ways, different—and compare them to kings and figures of ancient mythology. In a section on Steve Jobs, Thiel and Masters write:

Apple’s value crucially depended on the singular vision of a particular person. This hints at the strange way in which the companies that create new technology often resemble feudal monarchies rather than organizations that are supposedly more “modern.” A unique founder can make authoritative decisions, inspire strong personal loyalty, and plan ahead for decades. Paradoxically, impersonal bureaucracies staffed by trained professionals can last longer than any lifetime, but they usually act with short time horizons. The lesson for business is that we need founders. If anything, we should be more tolerant of founders who seem strange or extreme; we need unusual individuals to lead companies beyond mere incrementalism.

The heightened vision of what a single leader can do, the veneration for more ancient and direct forms of leadership, the praise for authoritative decision-making and disdain for bureaucracies—it’s a short hop from here to the Donald Trump of “I alone can fix it.”

Silicon Valley’s Safe Space
By Cade Metz

… Silicon Valley, a community of iconoclasts, is struggling to decide what’s off limits for all of us.

In 2013, Mr. Thiel invested in a technology company founded by Curtis Yarvin, a prominent blogger of the neoreactionary movement. So did the venture capital firm Andreessen Horowitz, led in the investment by Balaji Srinivasan, who was then a general partner.

That year, when the tech news site TechCrunch published an article exploring the links between the neoreactionaries, the Rationalists and Silicon Valley, Mr. Yarvin and Mr. Srinivasan traded emails. Mr. Srinivasan said they could not let that kind of story gain traction. It was a preview of an attitude that I would see unfold when I approached Mr. Siskind in the summer of 2020. (Mr. Srinivasan could not be reached for comment.)

“If things get hot, it may be interesting to sic the Dark Enlightenment audience on a single vulnerable hostile reporter to dox them and turn them inside out with hostile reporting sent to *their* advertisers/friends/contacts,” Mr. Srinivasan said in an email viewed by The New York Times, using a term, “Dark Enlightenment,” that was synonymous with the neoreactionary movement.

But others, like Mr. Thiel, urged their colleagues to keep quiet, saying in emails that they were confident the press would stay away. They were right.

What Happened to Jane Mayer When She Wrote About the Koch Brothers
By Jim Dwyer

Out of the blue in the fall of 2010, a blogger asked Jane Mayer, a writer with The New Yorker, how she felt about the private investigator who was digging into her background. Ms. Mayer thought the idea was a joke, she said this week. At a Christmas party a few months later, she ran into a former reporter who had been asked about helping with an investigation into another reporter on behalf of two conservative billionaires.

“The reporter had written a story they disliked,” Ms. Mayer recounts in “Dark Money: The Hidden History of the Billionaires Behind the Rise of the Radical Right,” out this month from Doubleday. Her acquaintance told her, “‘It occurred to me afterward that the reporter they wanted to investigate might be you.’”

As it happened, Ms. Mayer had published a major story in the magazine that August about the brothers David and Charles Koch, and their role in cultivating the power of the Tea Party movement in 2010. Using a network of nonprofits and other donors, they had provided essential financial support for the political voices that have held sway in Republican politics since 2011. “Dark Money” chronicles the vast sums of money from the Koch brothers and other wealthy conservatives that have helped shape public dialogue in opposition to Democratic positions on climate change, the Affordable Care Act and tax policy.

Ms. Mayer began to take the rumored investigation seriously when she heard from her New Yorker editor that she was going to be accused — falsely — of plagiarism, stealing the work of other writers. A dossier of her supposed plagiarism had been provided to reporters at The New York Post and The Daily Caller, but the smears collapsed when the writers who were the purported victims made statements saying that it was nonsense, and that there had been no plagiarism whatsoever. Indeed, as one noted, Ms. Mayer had plainly credited his writing — though this was not mentioned in the bill of particulars that was passed around.

There was more. Ms. Mayer would learn that these same dark forces had dug into a friend from her college years, with some notion of using the friend’s later problems against her. “I’m 60,” Ms. Mayer noted. “That was a long time ago.”

(Ms. Mayer’s husband, William Hamilton, is an editor for The New York Times in Washington.)

Who was behind this?

Figuring that out took three years, Ms. Mayer said, and she writes that she traced it to a “boiler room” operation involving several people who have worked closely with Koch business concerns.

Tech vs. Journalism
By Benjamin Wallace

Then there are the journalists who hold themselves out as a priestly caste motivated by nothing beyond the public good and who write their articles in a stentorian institutional voice yet run wild on Twitter slagging this VC for that offhand remark. Tech Twitter (and right-wing media) went bonkers after Times reporter Taylor Lorenz (who has 236,000 Twitter followers) mistakenly tsk-tsked Andreessen for saying “retard revolution” in a Clubhouse discussion of the GameStop-Reddit stock frenzy, faulting her for misidentifying the slur-utterer — who was not Andreessen but his partner, Ben Horowitz — and accusing her of being a woke scold because Horowitz had merely been referring to a WallStreetBets subgroup that called itself Retard Revolution. Lorenz quickly deleted her tweet and corrected her error.

Stung By Media Coverage, Silicon Valley Starts Its Own Publications
By Bobby Allyn

As tech reporting has shifted from being dazzled by the latest gadgets and apps to concerned over its impact on people and institutions, Silicon Valley’s elite have searched for a way to bypass the critical eye of journalists altogether.

Now companies and investors are trying to regain control of the narrative by launching their own media publications, with rah-rah stories that they hope will compete directly with news coverage of technology.

“If you’re celebrating cutting out the media, then you’re giving powerful people aircover to thumb their nose at impertinent questions that you, too, would probably like the answer to,” wrote independent tech journalist Eric Newcomer in his newsletter, noting that Silicon Valley’s latest publication “does make it easier for Andreessen to get his message out without facing questions from prying reporters.”

It follows Andreessen Horowitz’s investing heavily in two other efforts, the live-audio app Clubhouse and newsletter startup Substack. Both offer a “go direct” approach that lets speakers and writers get their messages to the public, while circumventing the media.

Longtime tech journalist Timothy Lee said the tech industry may have grander plans to disrupt journalism.

“They just see media as another potential industry where they might be able to come along and build something better than what was there before, the same way they did with taxis, video streaming and lots of other stuff,” Lee said.

Yet online publishing platforms like Medium have long featured the kind of cheery takes on tech that Future now delivers. Watson says she does not see an existential threat to the journalism industry in the new publication, as much as an opportunity to snub the media.

“What I am more worried about is the way they are wielding access as a tool of power,” she said.

In other words, the site could give itself exclusive interviews or insider access, making big announcements through its own publication. While companies issue press releases all the time to highlight their achievements, Watson says stories that appear to be news articles can trick readers into believing they are written from an unbiased perspective.

The New York Times Op-Ed Page Is Not Your Safe Space
By Jack Shafer

In recent months, New York Times readers brave enough to open the paper’s op-ed pages have been staggered to find opinions there. Yes, opinions!

The initial shock came in April, when Bret Stephens, a refugee from the Wall Street Journal opinion pages, took up residence at the Times, bringing with him his contrarian views on climate change. The Guardian spoke for injured readers everywhere with a piece headlined “NY Times Hired a Hippie Puncher to Give Climate Obstructionists Cover.” The inevitable Change.org petition calling for Stephens’ sacking went up, gathering tens of thousands of signatures, and Times reporters and editors savaged him on Twitter. Compounding the grievance was Michael Kinsley’s opinion series “Say Something Nice About Donald Trump,” which elicited angry letters to the editor and ridicule from Newsweek, Fusion (now Splinter) and, of course, Twitter. (Disclosure: Kinsley was my boss more than a decade ago.)

Over the summer, umbrage fever spread to conservatives exposed to the “Red Century” series, the paper’s continuing op-ed exploration of “the history and legacy of Communism, 100 years after the Russian Revolution.” Why is the Times rehabilitating Communism, one Federalist writer asked, instead of dealing “forthrightly with the horrors of the Soviet regime”? Another Federalist writer called “Red Century” a “bizarre nostalgia series about communist dictatorships.” Among the stories the Federalist took exception to were “Why Women Had Better Sex Under Socialism,” “Thanks to Mom, the Marxist Revolutionary” and “When Communism Inspired Americans.” “The ghost of Walter Duranty still lives at the New York Times,” the Federalist’s Inez Feltscher wrote.

It was the left’s turn to cry this week, as the page published an op-ed by Erik Prince, the founder of the Blackwater mercenary army and Betsy DeVos’ brother. Prince proposed a potentially self-dealing solution to the Afghanistan stalemate: A mercenary army of about 6,000 to assist an American Special Operations force of 2,000. The New Republic disparaged the piece as a Prince sales pitch. Huffington Post Editor in Chief Lydia Polgreen tweeted, “Unbowed, a newspaper continues to flog its lousy opinion piece.” Also taking exception to the Times’ editorial judgment were writers at Slate and GQ. New York magazine took the easy route, scraping Twitter for the inevitable parody headlines.

As susceptible as the next guy to hating the Times for running pieces I disagree with, I used as kindling to start a trash fire the recent Times page in which K-Sue Park informed the ACLU it needed to rethink free speech. Then, guided by a 2010 article about the history of the Times op-ed page written by University of Maine professor of journalism Michael Socolow, I returned to my senses. His deeply researched piece, “A Profitable Public Sphere: The Creation of the New York Times Op-Ed Page,” demonstrates that from the time its top editors started thinking about adding an op-ed section in the early 1960s, the whole idea was to trigger reader insurrections with outrageous views.

A Profitable Public Sphere: The Creation of the New York Times Op-Ed Page
By Michael J. Socolow (PDF, 435KB)

On September 21, 1970, the New York Times began publishing its op-ed page. “We hope,” the editors wrote, “that a contribution may be made toward stimulating new thought and provoking new discussion on public problems.” … This new forum of opinion and commentary was soon imitated by other newspapers.

In the summer of 1956, John B. Oakes, a member of the editorial board of the New York Times, received a letter from his friend Ed Barrett. Barrett, who had been an assistant secretary of state specializing in propaganda and would later become dean of the Columbia School of Journalism, was working in public relations. His client, the Suez Canal Company, wanted to publish an article detailing its position on the Egyptian government’s seizure of the canal. Barrett had drafted a short piece and submitted it to Oakes for publication. Oakes liked the essay but was forced to inform Barrett that the Times could not publish it. “We just didn’t have a place for that kind of fairly short piece,” he later remembered. Barrett then turned to the New York Herald Tribune, which printed the commentary (under the name of Francois Charles-Roux, chairman of the board of the Suez Canal Company) in a spot occasionally reserved for outside contributors on its editorial page. …

Oakes later regarded that experience as the origin of the New York Times’ op-ed page. He also acknowledged that his idea was not particularly novel; it was modeled on the page of commentary appearing opposite the editorial page of the old New York World in the 1920s. …

Oakes had long argued that newspapers needed more analytical depth and complexity. “The function of newspapers and newspapermen,” he concluded, was to “interpret [the] age to the general public.” … Yet this interpretative responsibility did not mean ostentatious representations of authority; rather, a good journalist needed to be wary of appearing too professorial, elite, and inflexible. When asked to vet a book about the daily operation of the New York Times in 1966, Oakes regretted that the author saw fit “to add to the ‘Ivory Tower’ image of the editorial page which I have been trying very hard to dispel.” … The “deepest responsibility” of the newspaper, Oakes wrote, was “the same responsibility … that the college has for its students – the responsibility of making them think.” Thus, a fundamental purpose of the editorial page was “to question, to debunk.” … “Diversity of opinion is the lifeblood of democracy,” Oakes contended in a 1954 speech. “The minute we begin to insist that every one think the same way we think, our democratic way of life is in danger.” … The apparent rise of mass conformity in the United States during the 1950s particularly troubled Oakes. In a May 1963 commencement address, he complained about “mass thought” and called for “more iconoclasm” in the media, politics, and academia. …

Oakes had always opposed advertising on the page, but he was particularly incensed that corporations like Mobil Oil could buy “their way onto the op-ed page.” Everything published there, Oakes argued, should undergo the same editorial review. “We ran innumerable pieces by presidents of every damn oil company in the world,” he remembered years later. “So there’s no question of not wanting that view appearing, but the idea of paid advertising on that page burnt me up.” … Oakes later remembered only one case – when the Republic of China (Taiwan) purchased the space – in which he and Salisbury were able to convince Veit to remove an advertisement on the grounds of editorial autonomy.

A review completed in mid-1971 revealed that the first six months of op-ed operation produced a net profit of $112,000 on $264,900 of revenues. The editors had spent $21,800 on art, $54,300 on articles, and $2,300 on photographs. … The results were remarkable for a recessionary period. A comparison of the first nine months of 1969 to the same period in 1970 showed net operating income of the New York Times Company declining from $10.2 million to $6.8 million. Classified advertising fell 18.3%. … The Op Ed page’s premium advertisements, usually written like the adjacent editorials and essays, accounted for $244,400 in revenue in the first six months. ….

Lou Silverstein remembered that the late 1960s were “a bad time for the country but a good time to start the op-ed page.” Similarly, Harrison Salisbury called the feature a “child of its times.” He called the period an “age of skepticism,” in which “not one institution in American society escaped reexamination.” During this tumultuous time, the number of daily newspapers in New York City declined. The Times lost its chief competitor when the Herald Tribune failed. “We knew we had to attract readers of the old Herald Tribune,” Salisbury noted in discussing the op-ed innovations at the Times.

The Real Reason the Times Has Quit Using the Term ‘Op-Ed’
By Jack Shafer

By corralling opinion inside the “Guest Essay” definition and serving it via newsletter as a pure opinion product for subscribers, the paper hopes to drive subscription sales as it has with its Cooking and Games apps. Digiday reported that in the first six months of last year the Times brought in $25.1 million from its standalone subscription products, including cooking, games and audio. That was up from $15.7 million the year before.

Consult internal Times studies, such as the January 2017 “Journalism That Stands Apart” report, or listen to Times executives talk, and you’ll hear the constant refrain that the paper is now a digital-first, subscriber-first enterprise. “We are not trying to win a pageviews arms race,” the 2017 report stated. “We believe that the more sound business strategy for the Times is to provide journalism so strong that several million people around the world are willing to pay for it.” Separating the opinion section from its old Op-Ed moorings and rechristening it as a subscriber-only digital product places it in a flow with the paper’s other successful digital products. “If you go back to the idea of the habitual reader, opinion columnists are precisely the kinds of writers who attract repeat visits and drive habitual behavior,” a former Times editor and VP for product and technology told CJR’s Piore.

How Substack Revealed the Real Value of Writers’ Unfiltered Thoughts
By Jack Shafer

The rise of Substack—and of platforms of its competitors—signals a new juncture in journalism, one that combines the power and mystique of the byline with the editorial independence afforded by the blog. After being lectured forever about how information wants to be free, Substack is teaching us that not only will readers pay for top-drawer copy, but that the work of some writers was actually undervalued in the market before readers were given the opportunity to purchase journalism a la carte instead of from a prix fixe menu.

A Substack gives a writer the lever and the fulcrum that says, “Here, see if you can move the world all by yourself.” Most appealing to the heterodox writers who cover race, ethnicity, and gender, a Substack puts an end to a writer’s worries that a publication’s diversity committee or its activist-rich Slack channel might sabotage the piece with internal protests if it doesn’t conform to the greater staff’s tastes and politics.

The big money Substack has thrown around serves as a market signal that the country’s best writers have been exploited by their publications for eons. To pinch a phrase from Marx, Substack has proved that the best writers create “surplus value” that their employers have absconded with. Substack allows writers to reclaim something closer to their total value. This isn’t the first writer goldrush in recent decades, though it may well prove to be the largest and longest lasting.

The growing influence of Substack shouldn’t move us to a choice of either-or. You can enjoy conventionally edited publications and unfettered newsletters at the same time without being ridiculed for wanting it both ways. But there might also be a lesson for the big institutional media. Nobody expects newspaper and magazine publishers to replace conventional journalism with nothing but newsletter-style copy. But Substack has shown that boosting the best writers’ salaries, creating publishing platforms free of meddlesome editors and allowing readers—not editors—to decide what to publish can be a winning business model.

Substack Is a Scam in the Same Way That All Media Is
By Eric Levitz

Between 2008 and 2019, the number of newsroom jobs in the United States fell by 26,000, according to the Pew Research Center. Over that same period, roughly 15,000 journalism majors were graduating into the U.S. labor market every year. In addition to making the competition for writerly employment exceptionally brutal, these developments also raised the barriers to merely entering that competition: Since regional newspapers have collapsed faster than national outlets, what jobs remain are now (even more) heavily concentrated in a handful of extremely high-cost cities.

Faced with a superabundant supply of underemployed writers, and increasingly thin to nonexistent profit margins, all manner of media companies in such cities have made a common practice of paying poverty wages for entry-level work. Applicants accepted these terms because the outlets offered (potentially, eventually monetizable) “prestige,” and/or because they sought to emulate the success of that publication’s star writers, and/or because they had no other options, and/or because class privilege shielded them from the worst consequences of their underpayment.

Like the vast majority of the writers who create Substacks, the vast majority of the interns who take unpaid to barely paid positions in journalism will never attain the financial security of their publications’ big-name writers. And those big-name writers — and the interns who are able to approximate their success — are typically beneficiaries of an uneven playing field tilted in favor of the upper-middle class. My own path to a decent job in journalism was eased by parental subsidies, which made it possible for me to accept $8-an-hour internships in New York City without suffering malnutrition. The “advances” that most consequentially bias who gets to write for a living and who does not derive from accidents of birth.

The resurgence of labor organizing in media has mitigated the industry’s exploitative treatment of entry-level workers and the class bias inherent to it. And this is one of the many reasons why unionizing newsrooms is a vital project. But labor unions alone cannot solve the underlying problem of mass underemployment within the industry. America does not have more competent journalists than it needs. But it does have far more of them than media firms are capable of profitably employing, amid the erosion of the ad-supported business model.

Which is one major reason why there are so many writers willing to provide Substack with content free of charge.

There may be something distasteful about the fact that Substack benefits from journalists’ financial desperation. But ultimately the core problem here is not that a newsletter platform is helping cash-strapped writers squeeze some tips out of their Twitter followings. The problem is that legions of talented journalists are going underemployed, even as statehouses across the country are going under-covered.

Journalism’s two Americas
By Sara Fischer and Nicholas Johnston

Why it matters: The disparate fortunes skew what gets covered, elevating big national political stories at the expense of local, community-focused news.

At the local level, newspapers continue to be gobbled up by hedge funds eager to slash jobs for profits. News veterans with more experience are often the first to go.

  1. Dozens of journalists at the Chicago Tribune have recently taken buyouts in response to the purchase of its parent company by a hedge fund known for slashing local newspaper jobs for profit.
  2. A recent New York Times opinion piece about Julie Brown, the Pulitzer Prize-winning journalist at the Miami Herald who broke the story about Jeffrey Epstein’s abuses but often had to pay her own expenses, also underscores that discrepancy.

Journalism Is Broken and I Alone Can Fix It!
By Jack Shafer

If it’s a crime to proclaim the obvious, then Justin Smith and Ben Smith — whose yet-to-be-named global news organization just entered startup mode — should be sentenced and jailed immediately. Defector writer Albert Burneko rightly ridiculed Smith and Smith for their plans to target their new operation at the 200 million college-educated English speakers on the planet they think are underserved by the current press. You could make a case that the 200 million are underserved, Burneko notes, but only if you ignore the output of the New York Times, the Washington Post, the Wall Street Journal, the Los Angeles Times, the Atlantic, the New Yorker, New York magazine, Harpers, TIME, the National Review, the New Republic, Insider, the Intercept, ProPublica, the Columbia Journalism Review, Vanity Fair, Mother Jones, the Federalist, the Nation, Jacobin, the Washington Examiner, the Hill, Reason, Bloomberg and the Daily Beast.

How Place and Privilege Came to Define American News
By Asher Schechter

Q: This touches on a very big question that is often under-discussed: Who is journalism for? The assumption is that it is for everyone, to maintain an informed public and ultimately serve democracy. But the book suggests that is increasingly not the case.

On one hand, news is more widely available than it ever has been, and there’s lots of free news available to people who want it. The question is what about that is quality journalism. I think that you saw this really break down in Covid: if you compare the kind of Covid information you’re getting on your local TV news, you would get a two-minute soundbite about the number of cases across a whole bunch of different counties, and that might be it. Whereas if you went to your local newspaper, you would find highly detailed coverage about restrictions and trends, and what important people are thinking. The quality of that coverage was just so much better.

It’s really telling that in the beginning, in the pandemic, those news organizations realized that everybody needed to have access and dropped their paywalls. And then those paywalls went back up.

Q: You write in the book about the placelessness of global elites, pointing to The New York Times as an example of a newspaper that really thrives by targeting this interconnected global class of elites. Can you elaborate on the role that this placelessness has in our current media ecosystem?

I have this metaphor of the “placeless guy,” this person who has so much cultural capital and actual capital that their physical experience in the world really doesn’t change that much as they go from one capital to another, drinking the same delicious Manhattan made with the same delicious bourbon. Their circumstances don’t change wherever they are, and their interests are so above any one geographic location, because they care about the flow of financial markets and capital and the changing balances of larger geopolitical power dynamics.

The New York Times has always served an elite. The Times also recognizes they’re not going to get digital advertising abroad—the restrictions are growing even stronger in terms of collecting private data—so the only strategy for them is to continue to expand their subscriber base; there are only so many people in the United States that will subscribe to The New York Times—ultimately, it’s a very specific, limited market. And so you have to go beyond borders, to get somebody who fits that profile but doesn’t live in the States.

More Money and Fewer Readers: The Paradox of Subscriber Journalism
By Jack Shafer

The rise of the subscriber has diminished the power of advertisers to dictate or subtly influence news coverage, a headache that has plagued every advertiser-supported publication since the beginning of time. Yesterday’s publisher had to worry about advertiser boycotts or offending the powers that be. While that’s a thing of the past for subscriber-based publications, they’re still not in the clear. They have to worry about boycotts by readers whose sensibilities and politics they’ve offended. Social media makes it easy for readers to build and prosecute a case against a publication. Last year, New York Times staffers engaged the social media millions to protest the publication of Senator Tom Cotton’s op-ed about sending federal troops to quell riots. The newspaper backed down, apologized for publishing the piece from the Arkansas Republican, and the editor of the opinion pages resigned. One lesson learned here was that a club should think twice before it publishes something that might insult and anger its members.

In her fine new book, News for the Rich, White, and Blue, media scholar Nikki Usher warns of the downside of journalism designed primarily for those who pay full ticket. High subscription prices have made today’s newspaper, whether in print or online, something of a luxury good. In 2019, Nieman Lab’s Joshua Benton reported that the cost of the average print newspaper subscription had doubled in just a decade. Readers now provide more revenue to newspapers than do advertisers, according to a recent Pew report. That fact is a little illusory. The new ratio is not so much a matter of reader revenue increasing as of advertising revenue vanishing.

The looming danger, Usher writes, is a future in which newspapers shape their coverage to appeal to the group that has demonstrated the greatest willingness to pay for quality news: the rich, white and liberal elites.

Muckraker George Seldes, one of America’s most accomplished independent journalists, built his paper-and-postage newsletter, In fact, into a 176,000-circulation phenomenon in the late 1940s, outpacing the circulation of both the New Republic and the Nation. Every bit as edgy in his journalistic approach as such Substack all-stars as Glenn Greenwald, Matt Yglesias or Matt Taibbi, Seldes should have gloried in his triumph. Instead, he was depressed by his lack of mass reach. No matter how hard he tried, he lamented in his final issue that he couldn’t connect very far beyond the “$5 liberals” who subscribed to In fact. He wanted a mass audience that would allow him to keep pace with the big newspapers that reached a million subscribers. He wanted his words to land as hard as theirs. Instead, the elite audience he’d cultivated became too much of an echo chamber.

As the philosophy of clickbait has given way to a business model based on subscriberbait, paywalls have nursed the New York Times back to health, given marginal voices a bully pulpit, reinvigorated journalistic entrepreneurship, blunted the ad man’s power and provided a regular stream of revenue that allows editors to plan long-term coverage. But the cost, one that we’ve not yet been billed for or contemplated, appears to be the erosion of a common frame of reference that old media provided and the arrival of what media scholar Stephen Bates calls gated communities of information that produce “income rather than geographic news deserts.” Every boon, it seems, must bring a bane.

‘Time to reactivate MySpace’: the day Australia woke up to a Facebook news blackout
By Calla Wahlquist

At 5.30am Australian east coast time, after months of threats and failed attempts to lobby the government over proposed new media laws, Facebook banned the sharing of news in Australia.

The main page of the national broadcaster, ABC, was down. Guardian Australia’s page was also down. Australians trying to post links to news publishers on their personal Facebook pages received an error message.

The Australian Bureau of Meteorology, which uses its Facebook page to deliver climate updates and severe weather warnings, was blocked. So too was the Western Australian Department of Fire and Emergency Services, which earlier this month was issuing evacuation warnings for a bushfire that destroyed 86 homes in the Perth hills. In a statement, DFES said it had contacted Facebook “and they have assured us they will restore the page as a priority”.

State health departments, where daily coronavirus figures and information about potential exposure sites are listed, were deleted, as were the official pages for the governments of the Australian Capital Territory, South Australia and Tasmania.

So too was St Vincent’s Health, a hospital in Melbourne that is soon to begin distributing the first coronavirus vaccines in Australia. On Twitter, the organisation said it was “extremely concerning” to find its page had been blocked “during a pandemic and on the eve of a crucial Covid vaccine distribution”.

1800 Respect, a family violence service, was blocked, as were homelessness services, other crisis centres and women’s shelters. Aboriginal and Torres Strait Islander media pages servicing small, often remote communities were gone.

It is not clear whether Facebook, which has 2.89bn monthly active users and a net worth of US$780bn, is concerned about any reputational damage that may arise from blocking potentially lifesaving information for 11.23 million Australians.

Inside the Making of Facebook’s Supreme Court
By Kate Klonick

In 2009, shortly after the company was criticized for quietly changing its terms of service to allow it to keep users’ data even after they deleted their accounts, it released a video of Zuckerberg, clad in an uncharacteristic button-up shirt and a tie, announcing a “new approach to site governance.” People would be able to vote on Facebook’s policies; the company called it “a bold step toward transparency.” In the first referendum, on whether to change the terms of service, only 0.32 per cent of users voted. “In its own eyes, Facebook has become more than merely a recreational website where users share photos and wish each other a happy birthday,” a columnist for the Los Angeles Times wrote. “It is now a global body of citizens that should be united and protected under a popularly ratified constitution. But it’s hard to have a democracy, a constitution or a government if nobody shows up.” In 2012, the project was quietly shuttered, and, as with Crystal Pepsi, Google Wave, and the Microsoft Zune, no one remembers that it existed.

Why Are Tech Companies Pretending to Be Governments?
By Binyamin Appelbaum

… big tech companies putting on shows of government-style decision-making about government-scale issues. Recent examples include Facebook’s reliance on an ersatz judiciary to decide whether Donald Trump may resume posting and Uber’s efforts to create different labor standards for that special group of workers known as Uber drivers.

Public outrage tends to focus on the poor quality of these pantomimes. The real injustice runs deeper. In a representative democracy, the process confers legitimacy on the result. A piece of legislation or a court ruling commands compliance because the decision is made by duly empowered representatives acting under the law.

Corporations behave like governments because they want to invest their decisions with that sense of procedural legitimacy. But they do it for the purpose of warding off the government.

The show is a sham, a mockery of democracy. Corporations may be people, but they’re not polities. Their executives are not our representatives. The rules they choose to follow are not laws. And legitimacy cannot be borrowed to justify decisions contrary to the public interest.

Gatekeepers: These tech firms control what’s allowed online
By Geoffrey A. Fowler and Chris Alcantara

A bunch of other companies, stacked on top of each other like layers in a cake, operate the pipes and services that keep the Internet running.

Typically unseen parts of the stack flexed their power after the Capitol riot. Amazon Web Services — which provides the cloud-computing power that keeps many apps and websites running — ended its contract with social network Parler for having too many violent posts and insufficient moderation. App stores run by Apple and Google kicked out Parler, too, effectively kneecapping the platform. Parler, supported financially by major Trump backer Rebekah Mercer, called the moves part of a “coordinated effort” to silence Trump and his supporters. (Amazon CEO Jeff Bezos owns The Washington Post.)

It’s a lot of power to put into the hands of tech executives who aren’t elected or don’t necessarily have experience weighing what’s right for society.

In some slices of the stack, a small number of companies have inordinate influence. Apple’s App Store, for example, is the only way you can buy apps for iPhones and iPads.

Critics, like Jillian York, the author of “Silicon Values: The Future of Free Speech Under Surveillance Capitalism,” say when companies become unaccountable censors, it sets a precedent that endangers political and personal expression of all kinds.

“The sex worker, the Palestinian or Burmese or Egyptian activist/dissident, the LGBTQ+ rights activist, the person speaking out against terrorism in their community — they often have nowhere else to turn, especially if they live in a country without a free media,” she said.

Do companies have a responsibility to moderate content because they have the technical ability? Or does the fact that they could make the wrong calls mean they should hold back?

The conversation about keeping society safe online only gets more complicated from here — up and down the stack.

The Secret History of the Shadow Campaign That Saved the 2020 Election
By Molly Ball

Bad actors spreading false information is nothing new. For decades, campaigns have grappled with everything from anonymous calls claiming the election has been rescheduled to fliers spreading nasty smears about candidates’ families. But Trump’s lies and conspiracy theories, the viral force of social media and the involvement of foreign meddlers made disinformation a broader, deeper threat to the 2020 vote.

Laura Quinn, a veteran progressive operative who co-founded Catalist, began studying this problem a few years ago. She piloted a nameless, secret project, which she has never before publicly discussed, that tracked disinformation online and tried to figure out how to combat it. One component was tracking dangerous lies that might otherwise spread unnoticed. Researchers then provided information to campaigners or the media to track down the sources and expose them.

The most important takeaway from Quinn’s research, however, was that engaging with toxic content only made it worse. “When you get attacked, the instinct is to push back, call it out, say, ‘This isn’t true,’” Quinn says. “But the more engagement something gets, the more the platforms boost it. The algorithm reads that as, ‘Oh, this is popular; people want more of it.’”

The solution, she concluded, was to pressure platforms to enforce their rules, both by removing content or accounts that spread disinformation and by more aggressively policing it in the first place. “The platforms have policies against certain types of malign behavior, but they haven’t been enforcing them,” she says.

Quinn’s research gave ammunition to advocates pushing social media platforms to take a harder line. In November 2019, Mark Zuckerberg invited nine civil rights leaders to dinner at his home, where they warned him about the danger of the election-related falsehoods that were already spreading unchecked. “It took pushing, urging, conversations, brainstorming, all of that to get to a place where we ended up with more rigorous rules and enforcement,” says Vanita Gupta, president and CEO of the Leadership Conference on Civil and Human Rights, who attended the dinner and also met with Twitter CEO Jack Dorsey and others. (Gupta has been nominated for Associate Attorney General by President Biden.) “It was a struggle, but we got to the point where they understood the problem. Was it enough? Probably not. Was it later than we wanted? Yes. But it was really important, given the level of official disinformation, that they had those rules in place and were tagging things and taking them down.”

Biden administration ‘flagging problematic posts for Facebook,’ Psaki says
By Lawrence Richard

The Biden administration is playing an active role in flagging Facebook posts it considers to be “problematic” or “disinformation,” according to White House press secretary Jen Psaki.

During a Thursday press conference, Psaki said White House senior staff were engaging with “social media platforms” to combat the spread of “misinformation specifically on the pandemic.”

“In terms of actions we are taking or that we’re working to take, I should say, from the federal government, we’ve increased disinformation research and tracking within the surgeon general’s office. We’re flagging problematic posts for Facebook that spread disinformation,” she said.

Censorship Coordination Deepens
By The Editorial Board

It’s been clear for some time that the tech giants look to government to determine what coronavirus-related speech to allow. YouTube’s misinformation policy bans content that contradicts the evolving guidance of “health authorities.” Facebook stopped blocking some commentary on the lab-leak theory of the virus’s origins only after President Biden ordered an investigation into the possibility.

Public-private coordination is not in itself sinister. The government can be an important source of information, and most people would agree that it’s not an abuse of Facebook’s authority to suppress, say, fraudulent medical advice that goes viral.

Yet as the acute crisis phase of coronavirus passes, a government arrangement with private firms to control speech about the pandemic looks less salutary.

‘Dangerous Ideas’ Review: The Follies of Censorship
By Jonathan Rose

A commitment to open expression has always defined liberalism, which gradually expanded our First Amendment protections. But now we see many liberals abandoning that principle, perhaps because they are no longer liberals in any meaningful sense of the term. How could they be, if they want tech barons to police our online reading? Facebook recently decided to stop blocking posts that suggested a “lab-leak” origin of Covid, but at the same time the company has been boasting of its efforts to downrank or “shadow-ban” accounts that share “misinformation” (in other words, they make it difficult for readers to find those accounts, without telling the account owners).

We sorely need a reminder of the follies and crimes of censorship. In “Dangerous Ideas,” Eric Berkowitz, a journalist and lawyer, offers a global history that identifies some recurring patterns in the suppression of free thinking. For starters, crackdowns almost inevitably happen when societies confront overwhelming crises. Philosophy flourished in ancient Athens, where free males (at least) enjoyed intellectual liberty, but after the Athenians suffered military defeat and a devastating pandemic, they canceled Socrates. Then Plato’s “Republic,” putting words into Socrates’ mouth, laid out a program for absolute control of speech and thought, anticipating in detail modern totalitarianism. Reading Plato, Mr. Berkowitz recognizes Mao’s Cultural Revolution.

What emerges from “Dangerous Ideas” is that ideological terms like blasphemy, subversion and hate speech are impossible to define. Thus there are never clear guidelines for censorship, which is inevitably inconsistent and often absurd. “We really do not know what is demanded of us,” protested a czarist censor jailed for making a wrong call. Facebook moderators can only be fired, not jailed, but they face a similar quandary.

Mr. Berkowitz recognizes that “censorship doesn’t work. The ideas animating suppressed speech remain in circulation and, in the end, can become more effective for being forbidden.” Everyone in prerevolutionary France read Voltaire, who knew that outlawed books sold phenomenally. But censorship can still exact an enormous price. Without a free press, the French people could not know what was going on at Versailles. Rumors and paranoia rushed into that information vacuum and corroded faith in royal institutions until they collapsed.

Given that censorship is used by the powerful to control the masses, it inevitably enforces class biases. In Victorian Britain, plebeian atheists were prosecuted for blasphemy. Gentlemen agnostics like T.H. Huxley and Leslie Stephen weren’t. A similar condescension largely explains current anxieties about social media. A.J. Liebling once regretted that “Freedom of the press is guaranteed only to those who own one.” Now, thanks to the internet, everyone owns one, and for elites in media, technology and government, that prospect is frighteningly democratic.

Even Mr. Berkowitz’s faith in free expression is sorely tried by social media. It galls him that (until January 2021) Donald Trump could tweet that the 2020 election was stolen and the 2016 election wasn’t. He is appalled by the online “dismissal of scientific findings—even those with urgent, widespread public health implications.” As we have seen repeatedly with research and reporting related to Covid-19, however, we cannot assume that those findings will prove to be conclusive.

Vindication Over Hunter’s Emails
By The Editorial Board

The New York Post is claiming vindication over its scoop in October 2020 about Hunter Biden’s emails, and deservedly so. But that shouldn’t be the end of the story for everyone who attacked the Post or ignored the story to cover for Joe Biden.

A writer for Politico has published a book about President Biden that confirms that some of the emails on a laptop belonging to Hunter Biden are authentic. This is barely a scoop, since neither Hunter Biden nor Joe Biden’s campaign denied last year that the laptop provided to the Post by Rudy Giuliani was Hunter’s. Both men counted instead on the rest of the media to serve as a cordon sanitaire, and did they ever. Twitter barred the Post’s feed for a time lest Americans be able to read about the emails and their content.

A press that was interested in telling the truth about both candidates would have pressed to confirm the Post’s story and examined the emails for themselves. Instead they rose nearly as one to denounce the Post and claim without evidence that the emails might have been Russian disinformation.

The Media Stonewalls on the Steele Dossier
By Eric Dezenhall

‘Why don’t they just fess up and say they’re sorry?’ That is the question journalists have asked about the corporate and institutional clients of my crisis-management business. It’s a question media companies should be asking themselves amid the implosion of the Steele dossier. Here we are, a few weeks after the dossier was discredited, and no one has paid a price.

Having had media companies as clients, I’ve found that when they’re under fire, they behave no differently from chemical or drug companies. Why? Because they don’t see coming clean as being in their self-interest.

Among other things, the truth can tarnish the brand and jam them up in court. So they often deny, stonewall, close ranks, and attack their critics. Two things media companies have that other businesses don’t are the ability to deliver news instantly and the mantle of moral authority.

Let’s Not Consign Journalistic Transparency to the Memory Hole
By Jack Shafer

Newspaper proprietors, especially former Washington Post President and Publisher Philip L. Graham, have long subscribed to the idea that their newspaper articles constitute “the first rough draft of history.” Some first rough drafts are more accurate than others, as every journalist will concede. So when reporters uncover new information that undermines earlier copy, they write new stories, updating the record. What they don’t do is go back and erase the original, flawed version. But that’s what the Washington Post did last week.

As Post journalist Paul Farhi reported last Friday, the newspaper removed from its archives two stories from 2017 and 2019 related to the controversial Steele dossier and replaced them with new articles that added and deleted whole sections and also added explanatory text at the top, alerting readers to the changes. Post Executive Editor Sally Buzbee told Farhi that the changes had been made because the paper could no longer stand by the accuracy of elements of the story after new information surfaced, in part from an ongoing criminal investigation. Without getting too deeply into the dossier weeds, the original stories identified businessman Sergei Millian as former spy Christopher Steele’s “Source D.” The heavily reworked versions don’t. The Post also briefly updated and amended a dozen other stories on the topic, according to Farhi.

We can’t very well accuse the Post of hiding its miscues: In fact, by running the Farhi news article, the Post has tied a bright pink bow to its altered dossier coverage. So let’s salute the Post for correcting the first rough draft based on new findings. Let’s also commend the paper’s media critic, Erik Wemple, for his investigations of where the press erred in its dossier coverage. Other outlets should be as aggressive in correcting the record.

What’s peculiar about the Post’s method of error correction is its decision to vaporize the two original stories. The original stories can’t be retrieved from LexisNexis, as the Post left that database in late 2020. Post spokesperson Kristine Coratti Kelly tells me the deleted pages can be found on Factiva, a Dow Jones subscription database, but Factiva costs about $249 a month, which puts it out of reach for most readers who want to determine precisely what the paper’s first rough draft got wrong and how it was amended. Such heavy reworking of years-old copy is so rare it approaches the unprecedented, as American University media history professor W. Joseph Campbell told Farhi. Stephen Bates, a professor of journalism at the University of Nevada at Las Vegas, concurs. “It’s hard to have a paper of record if the record keeps changing,” Bates says.

Our main beef isn’t that the Post flubbed a story. Lots of outlets flub stories. Many gave too much credence to the Steele dossier story, as Bill Grueskin just detailed. The issue is how the paper should handle its flubs in the light of new information. Ordinarily, when outlets make mistakes, they note it in a corrections column and append the correction to the original in the web archives to render transparent both the error and the revision. Often, such corrections require moderate bits of rewriting, but rarely to the extent of the Post’s two dossier stories. (The Post changes are already sending out ripples into the mediasphere. Yesterday, POLITICO rewrote two paragraphs in a 2019 story and added an editor’s note to reflect the Post’s reworking of its copy.)

Accountability requires journalists to show how their work was flawed if they choose to correct or retract. (The Post did not retract its piece, Farhi reported.) In 1981, when the Washington Post was scandalized by the fabulism of Janet Cooke’s “Jimmy’s World” story, purportedly about a child addict, Post ombudsman Bill Green filed a 14,600-word account of how the paper had been duped, an account the Post still hosts on its website. When Jayson Blair’s fabulism rocked the New York Times in 2003, the paper kept the scores of Blair stories in which he lied or plagiarized on its site with editor’s notes detailing his deceptions, and it published a lengthy investigation of his misdeeds. (The Times’ Blair page’s links are broken now, but the stories can be easily Googled by title.)

This sort of transparency is superior to the rewrite and erase strategy the Post just deployed. Readers shouldn’t have to purchase pricey news databases to determine what newspapers originally published. But not all is lost. Thanks to the Internet Archive, the enterprising can retrieve the Post’s vanquished pages. To save you the time of using the Archive — it took me a while to locate the pages — here are the URLs for the Post’s original March 29, 2017, and Feb. 7, 2019, pieces. The rewritten pages can be found on the Post’s site (here and here, respectively). To save you the bother of comparing, here are text comparisons of the March 29 and the Feb. 7 pieces.

Back in pre-web days, the best way to keep tabs on a newspaper’s honesty quotient (short of stealing somebody’s LexisNexis account) was to clip stories or check microfilm. Then came the web, and it became an easy matter to dial up a newspaper’s back pages. But no more. At some publications, the written record can be expunged if it contains embarrassing information. Now the Post is tossing old, flawed stories down the memory hole. Is this how journalism dies … in darkness?

The Internet Is Rotting
By Jonathan Zittrain

People tend to overlook the decay of the modern web, when in fact these numbers are extraordinary—they represent a comprehensive breakdown in the chain of custody for facts. Libraries exist, and they still have books in them, but they aren’t stewarding a huge percentage of the information that people are linking to, including within formal, legal documents. No one is. The flexibility of the web—the very feature that makes it work, that had it eclipse CompuServe and other centrally organized networks—diffuses responsibility for this core societal function.

The problem isn’t just for academic articles and judicial opinions. With John Bowers and Clare Stanton, and the kind cooperation of The New York Times, I was able to analyze approximately 2 million externally facing links found in articles at nytimes.com since its inception in 1996. We found that 25 percent of deep links have rotted. (Deep links are links to specific content—think theatlantic.com/article, as opposed to just theatlantic.com.) The older the article, the less likely it is that the links work. If you go back to 1998, 72 percent of the links are dead. Overall, more than half of all articles in The New York Times that contain deep links have at least one rotted link.

Our studies are in line with others. As far back as 2001, a team at Princeton University studied the persistence of web references in scientific articles, finding that the raw numbers of URLs contained in academic articles were increasing but that the links were often broken, including 53 percent of the articles they had collected from 1994. Thirteen years later, six researchers created a data set of more than 3.5 million scholarly articles about science, technology, and medicine, and determined that one in five no longer points to its originally intended source. In 2016, an analysis with the same data set found that 75 percent of all references had drifted.

Of course, there’s a keenly related problem of permanency for much of what’s online. People communicate in ways that feel ephemeral and let their guard down commensurately, only to find that a Facebook comment can stick around forever. The upshot is the worst of both worlds: Some information sticks around when it shouldn’t, while other information vanishes when it should remain.
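The deep-link figures above invite a concrete reading. Below is a minimal, illustrative sketch of how one might flag a deep link as rotted, assuming a deliberately crude criterion: the request fails outright, or the server redirects the URL back to the site’s homepage. This is not the methodology of the study Zittrain describes, and the URLs in the example are hypothetical.

```python
# A minimal link-rot check: a deep link (one with a path beyond the site root)
# is counted as rotted if the request fails or collapses back to the homepage.
# This is an illustrative sketch, not the study's actual methodology.
import urllib.error
import urllib.parse
import urllib.request


def is_deep_link(url: str) -> bool:
    """A deep link points at specific content, not just a site's front page."""
    return urllib.parse.urlparse(url).path not in ("", "/")


def is_rotted(url: str, timeout: float = 10.0) -> bool:
    """Return True if the deep link no longer resolves to specific content."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "linkrot-sketch"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            # A redirect that lands on the site root is treated as rot.
            return urllib.parse.urlparse(resp.geturl()).path in ("", "/")
    except (OSError, ValueError):
        # Covers HTTP errors, DNS failures, timeouts, and malformed URLs.
        return True


if __name__ == "__main__":
    sample = [
        "https://www.example.com/2001/some-archived-article",  # hypothetical
        "https://www.example.com/",
    ]
    deep = [u for u in sample if is_deep_link(u)]
    rotted = [u for u in deep if is_rotted(u)]
    print(f"{len(rotted)} of {len(deep)} deep links appear rotted")
```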

The trouble with the past
By The Economist

History “is not the past”, Hilary Mantel once said. “It’s what’s left in the sieve when the centuries have run through it.” In the case of the classical world, what is left is sparse. Even very basic things are unknown. “Almost anything with a number on, we don’t know,” says Mary Beard of the University of Cambridge. “We don’t know the date of the eruption of Pompeii…We don’t know the population of the Roman Empire.” Evidence regarding women and minorities is especially thin. There are almost no first-person testimonies from Greek or Roman women. Tantalisingly, Nero’s mother is known to have written a memoir, but it was lost (like almost all classical literature). The poetry of Sappho survives only in scraps.

Such silences speak volumes. They do not, alas, fill them. That makes the job of scrupulous modern historians painstakingly hard. It took around 12,000 tiny tiles to make a square metre of fine Roman mosaic; to produce a scholarly yet arresting paragraph on classical history, the writer must, like a mosaicist, combine a little colour from a geographer, a sample from a poet and a snippet from an ancient chronicler.

Keeping the Wrong Secrets
By Oona A. Hathaway

In the national security world, there is a concept known as “the mosaic theory.” It holds that disparate, seemingly innocuous pieces of information can become significant when combined with other pieces of information. This theory is one reason why the vast majority of individuals with access to classified information are told that they cannot judge what information should be classified. A document that appears meaningless might, when put together with other information, give away an important piece of the mosaic to an adversary.

Historically, intelligence analysts have pieced together bits of information to complete the mosaic. As specialists in their fields, good analysts come to know when a seemingly inconsequential piece of information may be significant in context. The advent of big data, combined with artificial intelligence, promises to upend this traditional approach. To understand why, consider the breakthrough made by the retail giant Target almost a decade ago. Like most companies, Target assigns its customers ID numbers tied to their in-store cards and to their credit cards, names, and email addresses. When a customer makes a purchase, that information is collected and aggregated. In 2012, a statistician working at Target figured out that he could use this information, together with purchase information from women who had set up baby registries, to determine who was likely pregnant. Women who were pregnant started buying unscented lotion, for instance, and they were more likely to purchase calcium, magnesium, and zinc supplements. Using this information, Target was able to create a “pregnancy prediction score,” calculate where women probably were in the course of their pregnancies, and send women coupons for products they may need. This technology only came to public attention after an angry customer complained to a manager at Target that the company was sending mailers to his daughter that clearly targeted pregnant women. Later, he called to apologize: “It turns out there’s been some activities in my house I haven’t been completely aware of. She’s due in August. I owe you an apology.”

Conceptual biology, hypothesis discovery, and text mining: Swanson’s legacy
By Tanja Bekhuis

‘Undiscovered public knowledge’ is a phrase coined by Swanson … . It refers to published knowledge effectively buried in disjoint topical domains – ‘disjoint’ because researchers working in disparate fields are unaware of one another. Hence, truly disjoint literatures have no articles in common. Swanson suggested in a series of creative papers that novel information might be unearthed by systematically studying seemingly unrelated and non-interactive research literatures, which he called “complementary but disjoint” … . To demonstrate the feasibility of his ideas, he found evidence for previously overlooked relationships between fish oil and Raynaud’s syndrome …, magnesium and migraine …, somatomedin C and arginine …, and viruses as weapons … . This is quite remarkable given that Swanson is a mathematician and an information scientist, not a physician.

Rediscovering Don Swanson: the Past, Present and Future of Literature-Based Discovery
By Neil R. Smalheiser

Undiscovered public knowledge encompasses several distinct scenarios:

For example, one may ask: How many articles are published that no one reads – no one at all besides the author and (we hope) the reviewers? Information contained in such articles is, indeed, public yet undiscovered.

How much information is contained in articles that few can find, because the article is poorly indexed by Web of Science or by online search engines? Such articles may have been published without a digital presence, or placed in a journal that has limited circulation or low visibility.

A related type of information loss occurs when someone publishes an important article in an obscure or topically inappropriate journal, so that no one will take the finding seriously even if they see it.

In a series of articles in the 1980s, Don analyzed two classic examples of medical literatures that were not (or only slightly) connected, yet contained multiple links of the form “A affects B” in one literature and “B affects C” in the other, such that, when brought together and assembled, they created a persuasive, novel hypothesis. These have become widely analyzed benchmarks for nearly all subsequent studies of literature-based discovery.
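The “A affects B” / “B affects C” pattern can be sketched in a few lines: given relationship pairs mined from two disjoint literatures, the candidate hypotheses are the A-C pairs joined through a shared intermediate B term. The toy example below uses invented term pairs purely for illustration; it is not Swanson's actual software or corpora.

    # Toy sketch of Swanson-style A-B-C literature-based discovery.
    # The term pairs are invented for illustration only.

    # "A affects B" links extracted from one literature (e.g., on fish oil).
    a_to_b = {
        ("fish oil", "blood viscosity"),
        ("fish oil", "platelet aggregation"),
    }

    # "B affects C" links extracted from a disjoint literature (e.g., on Raynaud's).
    b_to_c = {
        ("blood viscosity", "Raynaud's syndrome"),
        ("platelet aggregation", "Raynaud's syndrome"),
        ("vasoconstriction", "Raynaud's syndrome"),
    }

    # Candidate A-C hypotheses: A and C are connected through a shared B term,
    # even though no single article mentions A and C together.
    candidates = {(a, c) for a, b1 in a_to_b for b2, c in b_to_c if b1 == b2}

    for a, c in sorted(candidates):
        print(f"hypothesis: {a} may affect {c}")

Swanson's actual work involved far more careful selection and reading of the intermediate terms than this transitive join, which is only the skeleton of the idea.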

Don made further analyses of complementary unconnected literatures, both by himself … and in collaboration with me … . It is noteworthy that late in his career, Don proposed a link between atrial fibrillation and running … . Exercise is known to be a risk factor for atrial fibrillation, and he proposed that this may be mediated by gastroesophageal reflux, which in turn may be alleviated by taking proton pump inhibitors. Besides being another masterful, insightful example of putting together separate pieces of evidence to form a new whole, it is worth mentioning that these analyses were all based on conditions he experienced himself. He had Raynaud syndrome, and he had migraine headaches. And his chronic atrial fibrillation eventually caused his strokes and led to his withdrawal from active life.

The Code Breaker by Walter Isaacson review – a science page-turner
By Laura Spinney

The Crispr story is made for the movies. It features a nail-biting race, more than its fair share of renegades, the highest prize in chemistry, a gigantic battle over patents, designer babies and acres of ethical quicksand. It presents a challenge to a biographer, however, who has to pick one character from a cast of many to carry that story. Isaacson chose Doudna, and you can understand why. Having helped to elucidate the basic science of Crispr, she remains implicated in its clinical applications and in the ethical debate it stimulated – unlike Charpentier, who has said that she doesn’t want to be defined by Crispr and is now pursuing other science questions. Doudna is the thread that holds the story together.

Still, you can’t help wondering how that story might have read if it had been told from the point of view of Francisco Mojica, the Spanish scientist who first spotted Crispr in bacteria inhabiting salt ponds in the 1990s. He intuited that it did something important, then doggedly pursued this line of research despite a lack of funding and the fact that everyone told him he was wasting his time. A different story again might have been told via the two French food scientists who realised in 2007 that Crispr could be harnessed to vaccinate bacteria against viruses, thus securing the future of the global yoghurt industry, or the Lithuanian biochemist Virginijus Šikšnys, who moved the story on again, but whose work was rejected by top journals.

Each one made an essential contribution, and it’s difficult to say whose, if any, was the most important. A similar dilemma preoccupied Carl Djerassi and Roald Hoffmann in their 2001 play Oxygen, which asked who should receive a “Retro-Nobel” for the discovery of the eponymous gas. Should it go to the scientist who discovered oxygen but didn’t publish his discovery, the one who published but failed to understand the discovery’s significance, or the one who grasped its significance but only thanks to the insights of the other two?

The Present and Future of Journalism: How the News Media Lost Its Purpose
By Martin Gurri

The word media means “in the middle.” Journalists aspired to stand between the public and the facts of the world. Today, however, we know that the facts are virtually infinite in number and incoherent in pattern. The digital dispensation resembles the Tower of Babel: an uproar of mutually unintelligible voices. Mainstream media brands like the New York Times or CNN just add to the din.

The public, in such a chaotic environment, craves comprehension, not news reports. One of the institutions I see sprouting up is that of the new mediator: people who stand between the public and the information madness, and try to make sense of the latter. These mediators are sometimes connected to news organizations but they are more likely to be independent actors, often working on platforms like Substack and depending on their own audience to make a living.

Is this the wave of the future? Who knows? My point is that the need for mediation has pivoted from reporting to interpretation. Let’s think back to what Walter Lippmann wrote about the function of truth: “to bring to light the hidden facts, to set them into relation with each other, and make a picture of reality on which men can act.” That may be the next business model for journalism.

Trust in America: Do Americans trust the news media?
By Katerina Eva Matsa, Lee Rainie, Shannon Greenwood and Kim Arias

People have lots of news sources that they trust, but they don’t think that the institution of the news media and the industry of news organizations as a whole is trustworthy. So people tend to go to sources of information that map with their point of view. And we see in our data Americans don’t trust each other the way they used to. They don’t think Americans share the same facts that they used to. And so, the charge to people who are in the thick of this new environment is to figure out how to help people find their way to the truth and not make it a hard job. And Americans couldn’t be clearer about that. They want to know what’s going on, and they want help doing it, and they are looking to journalists to help solve these problems.

Distrust in political, media and business leaders sweeps the globe
By Sara Fischer

Trust in government is collapsing, especially in democracies, according to a new global survey.

Why it matters: People also don’t think media or business leaders are telling them the truth, and this suspicion of multiple societal institutions is pushing people into smaller, more insular circles of trust.

Details: Government leaders and journalists are considered the least trustworthy societal leaders, according to Edelman’s new 2022 global “Trust Barometer,” a survey of 35,000 respondents across 28 countries.

  1. A majority of people globally believe journalists (67%), government leaders (66%) and business executives (63%) are “purposely trying to mislead people by saying things they know are false or gross exaggerations.”
  2. Around the world, people fear the media is becoming more sensational for commercial gain and that government leaders continue to exploit divisions for political gain.

Between the lines: People who live in democracies are quickly losing trust in those democracies, while trust in authoritarian regimes — in China, the UAE and Saudi Arabia, for example — is increasing among the people who live under them.

  1. As trust in democratic institutions wanes, there are also growing doubts about capitalism. Developed democracies specifically lack economic optimism, per the survey.
  2. A trust gap has also increased between wealthy and low-income populations.

Liberals want to blame rightwing ‘misinformation’ for our problems. Get real
By Thomas Frank

American political culture is and always has been a matter of myth and idealism and selective memory. Selling, not studying, is our peculiar national talent. Hollywood, not historians, is who writes our sacred national epics. There were liars-for-hire in this country long before Roger Stone came along. Our politics has been a bath in bullshit since forever. People pitching the dumbest of ideas prosper fantastically in this country if their ideas happen to be what the ruling class would prefer to believe.

“Debunking” was how the literary left used to respond to America’s Niagara of nonsense. Criticism, analysis, mockery and protest: these were our weapons. We were rational-minded skeptics, and we had a grand old time deflating creationists, faith healers, puffed-up militarists and corporate liars of every description.

Censorship and blacklisting were, with important exceptions, the weapons of the puritanical right: those were their means of lashing out against rap music or suggestive plays or leftwingers who were gainfully employed.

What explains the clampdown mania among liberals? The most obvious answer is because they need an excuse. Consider the history: the right has enjoyed tremendous success over the last few decades, and it is true that conservatives’ capacity for hallucinatory fake-populist appeals has helped them to succeed. But that success has also happened because the Democrats, determined to make themselves the party of the affluent and the highly educated, have allowed the right to get away with it.

There have been countless times over the years where Democrats might have reappraised this dumb strategy and changed course. But again and again they chose not to, blaming their failure on everything but their glorious postindustrial vision. In 2016, for example, liberals chose to blame Russia for their loss rather than look in the mirror. On other occasions they assured one another that they had no problems with white blue-collar workers – until it became undeniable that they did, whereupon liberals chose to blame such people for rejecting them.

This is a party that has courted professional-managerial elites for decades, and now they have succeeded in winning them over, along with most of the wealthy areas where such people live. Liberals scold and supervise like an offended ruling class because to a certain extent that’s who they are. More and more, they represent the well-credentialed people who monitor us in the workplace, and more and more do they act like it.

America, the panic room
By Thomas Frank

Liberal leaders may have given up talking about the middle class, but they have become absolutely adamant about their own goodness; about their contempt for their less refined inferiors. The liberalism of scolding is the result, and it is everywhere in Covidtime, playing constantly on a social media outlet near you. As I write this there is a video making the rounds in which a throng of protesters for Black Lives Matter (a cause I happen to believe in) corner a woman eating at a sidewalk café; they shriek at her, demanding she raise her fist in conformity with them. Watching it, one starts to understand what living in the McCarthy era must have felt like … .

Similar but larger episodes — society-wide paroxysms of accusation and denunciation — seem to sweep over social media every single day. Three acquaintances of mine — all of them well to the left of liberal — have seen their reputations attacked in episodes of this kind, and in each of them the judicial process by which they were declared guilty was outrageously unfair, more like a political show trial than a judicious weighing of arguments. I would hazard a guess that millions of other Americans know of similar stories.

That liberalism has become a politics of upper-class bullying and of character assassination is an impression that daily becomes more and more difficult to avoid. To say that people regard this form of politics with hate and fear would be a vast understatement.

Assabiya Wins Every Time
By Lee Smith

The medieval Arab historian Ibn Khaldun explains the dynamic in his 14th-century masterwork, Al Muqaddima. History, he shows, is a repetition of the same pattern seen throughout the ages—a group of nomadic tribesmen overturn an existing sedentary culture, a civilization that has become weak and luxurious. What drives the success of the rising tribe is its group solidarity, or assabiya. Its awareness of itself as a coherent people with a drive for primacy is frequently augmented by religious ideology. The stronger the tribe’s assabiya, the stronger the group. Assimilating the conquered by imposing its will and worldview on them, the victor lays the foundations of a new civilization. But since, as Ibn Khaldun writes, “the goal of civilization is sedentary culture and luxury,” all groups carry the seeds of their own demise.

Ibn Khaldun showed that every ruling establishment, what he called “royal authority,” will eventually bring the house down on its own head. The luxury and corruption that are the inevitable consequences of civilization replace the stern ways that forged the tribe’s assabiya. And setting out to destroy group solidarity intentionally raises the stakes considerably for the ruling power.

‘Evil Geniuses,’ by Kurt Andersen: An Excerpt
By Kurt Andersen

In 2002, when several spectacular corporate financial frauds were exposed and their CEO perpetrators prosecuted, I published a long screed in The New York Times blaming Wall Street. “If infectious greed is the virus, New York is the center of the outbreak,” I wrote, because it is also, inarguably, the money center of America and the world, the capital of capitalism. . . . It was New York investment bankers who drove the mergers-and-acquisitions deal culture of the 80’s and 90’s and who most aggressively oversold the myth of synergy that justified it. . . . It was they . . . who invented the novel financial architectures of Enron and WorldCom. It was the example of New York investment bankers, earning gigantic salaries for doing essentially nothing—knowing the right people, talking smoothly, showing up at closings—that encouraged businesspeople out in the rest of America to feel entitled to smoke-and-mirrors cash bonanzas of their own.

… Ronald Reagan didn’t cheerfully announce in 1980 that if Americans elected him, private profit and market values would override all other American values; that as the economy grew nobody but the well-to-do would share in the additional bounty; that many millions of middle-class jobs and careers would vanish, along with fixed private pensions and reliable healthcare; that a college degree would simultaneously become unaffordable and almost essential to earning a good income; that enforcement of antimonopoly laws would end; that meaningful control of political contributions by big business and the rich would be declared unconstitutional; that Washington lobbying would increase by 1,000 percent; that our revived and practically religious deference to business would enable a bizarre American denial of climate science and absolute refusal to treat the climate crisis as a crisis; that after doubling the share of the nation’s income that it took for itself, a deregulated Wall Street would nearly bring down the financial system, ravage the economy, and pay no price for its recklessness; and that the federal government he’d committed to discrediting and undermining would thus be especially ill-equipped to deal with a pandemic and its consequences.

How This All Happened
By Morgan Housel

Ronald Reagan’s 1984 Morning in America ad declared:

It’s morning again in America. Today more men and women will go to work than ever before in our country’s history. With interest rates at about half the record highs of 1980, nearly 2,000 families today will buy new homes, more than at any time in the past four years. This afternoon 6,500 young men and women will be married, and with inflation at less than half of what it was just four years ago, they can look forward with confidence to the future.

That wasn’t hyperbole. GDP growth was the highest it had been since the 1950s. By 1989 there were 6 million fewer unemployed Americans than there were seven years before. The S&P 500 rose almost four-fold between 1982 and 1990. Total real GDP growth in the 1990s was roughly equal to that of the 1950s – 40% vs. 42%.

President Clinton boasted in his 2000 State of the Union speech:

We begin the new century with over 20 million new jobs; the fastest economic growth in more than 30 years; the lowest unemployment rates in 30 years; the lowest poverty rates in 20 years; the lowest African-American and Hispanic unemployment rates on record; the first back-to-back surpluses in 42 years; and next month, America will achieve the longest period of economic growth in our entire history. We have built a new economy.

His last sentence was important. It was a new economy. The biggest difference between the economy of the 1945-1973 period and that of the 1982-2000 period was that the same amount of growth found its way into totally different pockets.

You’ve probably heard these numbers but they’re worth rehashing. The Atlantic writes:

Between 1993 and 2012, the top 1 percent saw their incomes grow 86.1 percent, while the bottom 99 percent saw just 6.6 percent growth.

Joseph Stiglitz in 2011:

While the top 1 percent have seen their incomes rise 18 percent over the past decade, those in the middle have actually seen their incomes fall. For men with only high-school degrees, the decline has been precipitous—12 percent in the last quarter-century alone.

It was nearly the opposite of the flattening of incomes that occurred after the Second World War.

Why this happened is one of the nastiest debates in economics, topped only by the debate over what we should do about it. Lucky for this article neither matters.

All that matters is that sharp inequality became a force over the last 35 years, and it happened during a period where, culturally, Americans held onto two ideas rooted in the post-WW2 economy: That you should live a lifestyle similar to most other Americans, and that taking on debt to finance that lifestyle is acceptable.

A lot of debt was shed after 2008. And then interest rates plunged. Household debt payments as a percentage of income are now at the lowest levels in 35 years.

But the response to 2008, necessary as it may have been, perpetuated some of the trends that got us here.

Quantitative easing both prevented economic collapse and boosted asset prices, a boon for those who owned them – mostly rich people.

The Fed backstopped corporate debt in 2008. That helped those who owned that debt – mostly rich people.

Tax cuts over the last 20 years have predominantly gone to those with higher incomes. People with higher incomes send their kids to the best colleges. Those kids can go on to earn higher incomes and invest in corporate debt that will be backstopped by the Fed, own stocks that will be supported by various government policies, and so on. Economist Bhashkar Mazumder has shown that incomes among brothers are more correlated than height or weight. If you are rich and tall, your brother is more likely to also be rich than he is tall.

None of these things are problems in and of themselves, which is why they stay in place.

But they’re symptomatic of the bigger thing that’s happened since the early 1980s: The economy works better for some people than others. Success isn’t as meritocratic as it used to be and, when success is granted, is rewarded with higher gains than in previous eras.

You can scoff at linking the rise of Trump to income inequality alone. And you should. These things are always layers of complexity deep. But it’s a key part of what drives people to think, “I don’t live in the world I expected. That pisses me off. So screw this. And screw you! I’m going to fight for something totally different, because this – whatever it is – isn’t working.”

The Jan. 6 Insurrectionists Aren’t Who You Think They Are
By Robert A. Pape

For decades, Americans have become used to thinking of right-wing extremism—or really extremism of any kind—as emanating from the awful edges of society. Extremists make up just a tiny fraction of the country, far less than 1 percent of the population—so the logic goes—and they are economically destitute, often unemployed, and come from the rural parts of the United States.

That picture has changed—at least when it comes to the insurrection of Jan. 6, 2021 and the insurrectionist movement today. An in-depth look at who broke into the U.S. Capitol, the size of the insurrection movement in the United States today, the ideas motivating the movement, and their media consumption habits shows that the old patterns in right-wing extremism no longer apply.

Our new analysis at the University of Chicago Project on Security and Threats of the demographics of those who stormed the Capitol on Jan. 6 and multiple nationally representative surveys paint a new, startling reality: The insurrectionist movement is mainstream, not simply confined to the political fringe.

Consider the economic profile of the 716 people arrested or charged, as of Jan. 1, 2022, for storming the Capitol. Of the 501 for which we have employment data, more than half are business owners, including CEOs, or from white-collar occupations, including doctors, lawyers, architects, and accountants.

Only 7 percent were unemployed at the time, almost the national average, compared with the usual 25 percent or more of violent right-wing perpetrators arrested by the FBI and other U.S. law enforcement from 2015 to mid-2020.

Furthermore, only 14 percent of those who broke into the Capitol on Jan. 6 were members of militias such as the Oath Keepers or extremist groups such as the Proud Boys; 86 percent had no affiliation.

In other words, these were people who had something to lose when they went to Washington and carried out this violence.

A majority of the people arrested for Capitol riot had a history of financial trouble
By Todd C. Frankel

Nearly 60 percent of the people facing charges related to the Capitol riot showed signs of prior money troubles, including bankruptcies, notices of eviction or foreclosure, bad debts, or unpaid taxes over the past two decades, according to a Washington Post analysis of public records for 125 defendants with sufficient information to detail their financial histories.

The group’s bankruptcy rate — 18 percent — was nearly twice as high as that of the American public, The Post found. A quarter of them had been sued for money owed to a creditor. And 1 in 5 of them faced losing their home at one point, according to court filings.

The financial problems are revealing because they offer potential clues for understanding why so many Trump supporters — many with professional careers and few with violent criminal histories — were willing to participate in an attack egged on by the president’s rhetoric painting him and his supporters as undeserving victims.

While no single factor explains why someone decided to join in, experts say, Donald Trump and his brand of grievance politics tapped into something that resonated with the hundreds of people who descended on the Capitol in a historic burst of violence.

“I think what you’re finding is more than just economic insecurity but a deep-seated feeling of precarity about their personal situation,” said Cynthia Miller-Idriss, a political science professor who helps run the Polarization and Extremism Research Innovation Lab at American University, reacting to The Post’s findings. “And that precarity — combined with a sense of betrayal or anger that someone is taking something away — mobilized a lot of people that day.”

The financial missteps by defendants in the insurrection ranged from small debts of a few thousand dollars more than a decade ago to unpaid tax bills of $400,000 and homes facing foreclosure in recent years.

In the Capitol attack, business owners and white-collar workers made up 40 percent of the people accused of taking part, according to a study by the Chicago Project on Security and Threats at the University of Chicago. Only 9 percent appeared to be unemployed.

How America Fractured Into Four Parts
By George Packer

The majority of Americans who elected Reagan president weren’t told that Free America would break unions and starve social programs, or that it would change antitrust policy to bring a new age of monopoly, making Walmart, Citigroup, Google, and Amazon the J.P. Morgan and Standard Oil of a second Gilded Age. They had never heard of Charles and David Koch—heirs to a family oil business, libertarian billionaires who would pour money into the lobbies and propaganda machines and political campaigns of Free America on behalf of corporate power and fossil fuels. Freedom sealed a deal between elected officials and business executives: campaign contributions in exchange for tax cuts and corporate welfare. The numerous scandals of the 1980s exposed the crony capitalism that lay at the heart of Free America.

The new knowledge economy created a new class of Americans: men and women with college degrees, skilled with symbols and numbers—salaried professionals in information technology, computer engineering, scientific research, design, management consulting, the upper civil service, financial analysis, law, journalism, the arts, higher education. They go to college with one another, intermarry, gravitate to desirable neighborhoods in large metropolitan areas, and do all they can to pass on their advantages to their children. They are not 1 percenters—those are mainly executives and investors—but they dominate the top 10 percent of American incomes, with outsize economic and cultural influence.

Educated professionals pass on their money, connections, ambitions, and work ethic to their children, while less educated families fall further behind, with less and less chance of seeing their children move up. By kindergarten, the children of professionals are already a full two years ahead of their lower-class counterparts, and the achievement gap is almost unbridgeable. After seven decades of meritocracy, a lower-class child is nearly as unlikely to be admitted to one of the top three Ivy League universities as they would have been in 1954.

A common refrain, in places like southeastern Ohio and southern Virginia and central Pennsylvania, is that the middle class no longer exists. I once heard a woman in her 60s, a retired municipal employee in Tampa, Florida, who had made and then lost money in real estate, describe herself as a member of “the formerly middle class.” She meant that she no longer lived with any security. Her term could apply to a nonunion electrician making $52,000 a year and to a home health aide making $12 an hour. The first still belongs financially to the middle class, while the second is working-class—in fact, working-poor. What they share is a high-school degree and a precarious prospect. Neither of them can look with confidence on their future, less still on their children’s. The dream of leaving their children better educated and better off has lost its conviction, and therefore its inspiration. They can’t possibly attain the shiny, well-ordered lives they see in the houses of the elite professionals for whom they work.

After the 2016 election, a great deal of journalism and social science was devoted to finding out whether Trump’s voters were mainly motivated by economic anxiety or racial resentment. There was evidence for both answers.

Progressives, shocked by the readiness of half the country to support this hateful man, seized on racism as the single cause and set out to disprove every alternative. But this answer was far too satisfying. Racism is such an irreducible evil that it gave progressives commanding moral heights and relieved them of the burden to understand the grievances of their compatriots down in the lowlands, let alone do something about them. It put Trump voters beyond the pale. But racism alone couldn’t explain why white men were much more likely to vote for Trump than white women, or why the same was true of Black and Latino men and women. Or why the most reliable predictor for who was a Trump voter wasn’t race but the combination of race and education. Among white people, 38 percent of college graduates voted for Trump, compared with 64 percent without college degrees. This margin—the great gap between Smart America and Real America—was the decisive one. It made 2016 different from previous elections, and the trend only intensified in 2020.

In the first decade of the new century, the bipartisan ruling class discredited itself—first overseas, then at home. The invasion of Iraq squandered the national unity and international sympathy that had followed the attacks of September 11. The decision itself was a strategic folly enabled by lies and self-deception; the botched execution compounded the disaster for years afterward. The price was never paid by the war’s leaders. As an Army officer in Iraq wrote in 2007, “A private who loses a rifle suffers far greater consequences than a general who loses a war.” The cost for Americans fell on the bodies and minds of young men and women from small towns and inner cities. Meeting anyone in uniform in Iraq who came from a family of educated professionals was uncommon, and vanishingly rare in the enlisted ranks. After troops began to leave Iraq, the pattern continued in Afghanistan. The inequality of sacrifice in the global War on Terror was almost too normal to bear comment. But this grand elite failure seeded cynicism in the downscale young.

The financial crisis of 2008, and the Great Recession that followed, had a similar effect on the home front. The guilty parties were elites—bankers, traders, regulators, and policy makers. Alan Greenspan, the Federal Reserve chairman and an Ayn Rand fan, admitted that the crisis undermined his faith in the narrative of Free America. But those who suffered were lower down the class structure: middle-class Americans whose wealth was sunk in a house that lost half its value and a retirement fund that melted away, working-class Americans thrown into poverty by a pink slip. The banks received bailouts, and the bankers kept their jobs.

The conclusion was obvious: The system was rigged for insiders. The economic recovery took years; the recovery of trust never came.

… Just America assails the complacent meritocracy of Smart America. It does the hard, essential thing that the other three narratives avoid, that white Americans have avoided throughout history. It forces us to see the straight line that runs from slavery and segregation to the second-class life so many Black Americans live today—the betrayal of equality that has always been the country’s great moral shame, the heart of its social problems.

But Just America has a dissonant sound, for in its narrative, justice and America never rhyme. A more accurate name would be Unjust America, in a spirit of attack rather than aspiration. For Just Americans, the country is less a project of self-government to be improved than a site of continuous wrong to be battled. In some versions of the narrative, the country has no positive value at all—it can never be made better.

There are too many things that Just America can’t talk about for the narrative to get at the hardest problems. It can’t talk about the complex causes of poverty. Structural racism—ongoing disadvantages that Black people suffer as a result of policies and institutions over the centuries—is real. But so is individual agency, and in the Just America narrative, it doesn’t exist. The narrative can’t talk about the main source of violence in Black neighborhoods, which is young Black men, not police. The push to “defund the police” during the protests over George Floyd’s murder was resisted by many local Black citizens, who wanted better, not less, policing. Just America can’t deal with the stubborn divide between Black and white students in academic assessments. The mild phrase achievement gap has been banished, not only because it implies that Black parents and children have some responsibility, but also because, according to anti-racist ideology, any disparity is by definition racist. Get rid of assessments, and you’ll end the racism along with the gap.

But most Just Americans still belong to the meritocracy and have no desire to give up its advantages. They can’t escape its status anxieties—they’ve only transferred them to the new narrative. They want to be the first to adopt its expert terminology. In the summer of 2020, people suddenly began saying “BIPOC” as if they’d been doing it all their lives. (Black, Indigenous, and people of color was a way to uncouple groups that had been aggregated under people of color and give them their rightful place in the moral order, with people from Bogotá and Karachi and Seoul bringing up the rear.) The whole atmosphere of Just America at its most constricted—the fear of failing to say the right thing, the urge to level withering fire on minor faults—is a variation on the fierce competitive spirit of Smart America. Only the terms of accreditation have changed. And because achievement is a fragile basis for moral identity, when meritocrats are accused of racism, they have no solid faith in their own worth to stand on.

The rules in Just America are different, and they have been quickly learned by older liberals following a long series of defenestrations at The New York Times, Poetry magazine, Georgetown University, the Guggenheim Museum, and other leading institutions. The parameters of acceptable expression are a lot narrower than they used to be. A written thought can be a form of violence. The loudest public voices in a controversy will prevail. Offending them can cost your career. Justice is power. These new rules are not based on liberal values; they are post-liberal.

All four of the narratives I’ve described emerged from America’s failure to sustain and enlarge the middle-class democracy of the postwar years. They all respond to real problems. Each offers a value that the others need and lacks ones that the others have. Free America celebrates the energy of the unencumbered individual. Smart America respects intelligence and welcomes change. Real America commits itself to a place and has a sense of limits. Just America demands a confrontation with what the others want to avoid. They rise from a single society, and even in one as polarized as ours they continually shape, absorb, and morph into one another. But their tendency is also to divide us, pitting tribe against tribe. These divisions impoverish each narrative into a cramped and ever more extreme version of itself.

Michael Lind Wants to Put America Back Together
By Christopher Hooks

TM: At the beginning of the book you say that the populist surge of the last few years is “a revolution, not a revolt.” I think everyone has wondered over the last couple of years whether 2016 was a revolution or a revolt. Trump won relatively slim margins in Midwestern states. He won a smaller share of the national popular vote than Romney did, and he won Wisconsin in 2016 with fewer votes than Romney lost with in 2012. And the Republican party performed horribly in the 2018 midterms. What happened in 2016, when Brexit passed in the United Kingdom and Trump was elected? Was the last presidential election the start of a new political realignment, or a fluke?

ML: My thesis is not that the populists are going to win all the elections. In fact, they will usually lose. And in the event that these outsider demagogues win, they will be neutralized because most of the establishment is against them.

My argument is that we’re now in a new world on both sides of the Atlantic. The late-twentieth-century political order is gone. It’s collapsed. We’re in this new system where the mainstream parties are just labels and power kind of circulates among factions of this technocratic insider neoliberal establishment. Politics becomes this kind of spectator sport of musical chairs among people who all went to the same Ivy League schools and all know one another and vacation together in the Hamptons and so on, and they really don’t disagree on that much. And then you get a large portion of the population that feels alienated and doesn’t vote much of the time. And when they do, it’s to cast a protest vote. It’s an anti-system vote. And that’s a political system ripe for demagogues.

Does Inequality Provoke Populism?
By Paolo Gerbaudo

In their 2009 book The Spirit Level, Kate Pickett and Richard Wilkinson had already highlighted that economic inequality and low social mobility are intertwined. Relying on a large dataset of Western countries’ social mobility data, they argued that “greater income inequality reduces social mobility.” This is due to the social and geographic segregation of the rich and poor that high income inequality fosters. In other words, high income inequality raises the obstacles to social mobility, excluding the poor from social networks and information that are necessary to advance economically, while shielding the rich in gated communities. Hence, there is no way to unleash social mobility without tackling inequality.

Protzer and Summerville’s argument about the centrality of fairness makes more sense from the standpoint of social psychology than political economy. The authors are correct to highlight that public perceptions are often a distorted mirror of structural conditions and that if the public displays concern about inequality this does not automatically mean that it espouses policies aimed at reducing it.

This paradox is highlighted by the 2021 “Unequal Britain” report showing that while around 80 percent of British citizens are worried about inequality, only about 50 percent support redistribution from the rich to the poor. This seems to vindicate Protzer and Summerville’s point that policies that appear to contradict the contribution-rewards link, such as direct transfers of money to poor citizens, may be seen as unfair or even as promoting laziness. Yet, interestingly, the same study finds strong support for indirect redistributive policies such as public services, education, health, and other forms of public consumption. Funding these services would necessarily involve growing taxation of the rich, which is the real sticking point in contemporary politics.

As Joe Guinan and Martin O’Neill argued in the Guardian last year, Third Way policies, which attempted to marry low taxation of the rich and social welfare spending, cannot be replicated in the present, as they relied on the exceptionally favorable economic conditions of “the ‘long boom’ of the 1990s and early 2000s.” Given that—except for the current rebound of GDP, after the dip at the beginning of the COVID-19 pandemic—the 2020s are likely to continue on the trajectory of low growth, redistribution is destined to be a zero-sum game in which there is little choice but to target wealth hoarded during the upward swing of the business cycle and to confront monopolistic positions and rent-seeking behavior that fuels its continued growth.

After the devastation of World War II, redistributive measures by social-democratically minded governments acted as a “great equalizer,” as Piketty has argued. Today, the enormous wealth amassed by the likes of Bezos and Elon Musk indicates that economic power has become ever more concentrated. By dint of higher historical returns on capital than the growth rate of wages, the accumulated wealth of these individuals tends to get bigger. Further, the economic power of the wealthy translates into political power, with corporate lobbyists acting as the necessary link.

While on the one hand, taxation on the rich is the only realistic way forward to fund public services and provide health care and pensions for an aging population, on the other hand, the political power of the wealthiest means that redistributive measures are likely to be met with stubborn resistance.

From the 1950s until the 1970s, what today would be considered punitive levels of taxation were common in the United States and Europe, even under right-wing politicians, with the famous example of the top income tax rate of 91 percent under U.S. President Dwight D. Eisenhower. Yet these days, the United States and many European countries have some of the lowest tax rates on corporate profits, wealth, and high incomes in recent history. While a return to those earlier levels of taxation may not be realistic in a globalized world, it is evident that the present record-low levels of taxation on the rich and corporations are not fiscally sustainable either.

Populism has often been described as the mirror of democracy—and of its failings. Protzer and Summerville interestingly use low trust in political institutions as a proxy for populist sentiment. Citizens don’t turn to populist movements and leaders just because they are concerned about the effects of inequality or “unfairness”; they do so because they see this inequality as the net result of political decisions taken by representatives against the best interests of those they represent.

The Real Threat to American Democracy
By Michael Lind

The real threat to American democracy is the disconnect between what the bipartisan US political establishment promises and what it delivers. This problem predates Trump by decades and helps to explain his rise.

Contrary to what was promised, globalization did not produce new and better jobs for most Americans in the high-tech “knowledge economy.” Instead, the US has become dependent on China and other countries for basic manufactured goods, including drugs and medical equipment. Most US job creation over the last three decades has been in low-wage positions with few or no benefits.

Similarly, in the 2000s, America’s financial elites claimed that the “Great Moderation” in global macroeconomic volatility represented a permanent “new normal.” But it turned out to be dependent on asset bubbles that burst, causing the 2008 global financial crisis and the subsequent Great Recession. And US interventions in Afghanistan, Iraq, Syria, and Libya produced only outright failures or continuing quagmires.

Many of the architects of these colossal disasters have gone on to establish lucrative careers as respected experts. Few have suffered financial or reputational losses. When a national establishment fails so often and at such cost, and when mainstream media sources remain complicit in those failures, no one should be surprised if citizens look to alternative media sources, including crazy ones, or turn to outsider politicians, including narcissistic demagogues like Trump.

Americans must be on their guard to prevent corrupt politicians’ illegal and immoral attempts to alter electoral results. But the real long-term threat to American democracy is the lack of popular trust in conventional politicians whose policies have repeatedly failed. And for that lack of public trust, American elites have nobody to blame but themselves.

The Biggest Threat to America Is America Itself
By Nicholas Kristof

We Americans repeat the mantra that “we’re No. 1” even though the latest Social Progress Index, a measure of health, safety and well-being around the world, ranked the United States No. 28. Even worse, the United States was one of only three countries, out of 163, that went backward in well-being over the last decade.

Another assessment this month, the I.M.D. World Competitiveness Ranking 2021, put the United States No. 10 out of 64 economies. A similar forward-looking study from the World Bank ranks the United States No. 35 out of 174 countries.

… the “American dream” of upward mobility (which drew my refugee father to these shores in 1952) is increasingly chimerical. “The American dream is evidently more likely to be found on the other side of the Atlantic, indeed most notably in Denmark,” a Stanford study concluded.

“These things hold us back as an economy and as a country,” Jerome Powell, the chair of the Federal Reserve, said Tuesday.

More broadly, the United States has lost its lead in education overall and in investments in children. The World Bank Human Capital Project estimates that today’s American children will achieve only 70 percent of their potential productivity. That hurts them; it also hurts our nation.

Nixon Warned About U.S. Decline
By Tom Switzer

The U.S. has been through dark times before—and in living memory. Fifty years ago this week, President Richard Nixon spoke frankly about America’s doldrums in a speech that didn’t get enough attention from the media at the time or historians since. On July 6, 1971, the 37th president addressed senior midwestern media executives in Kansas City, Mo., amid racial unrest, campus agitation and antiwar protests.

The columns of the National Archives Building in Washington reminded Nixon of ancient fallen empires. “I think of what happened to Greece and Rome, and you see what is left—only the pillars. What has happened, of course, is that the great civilizations of the past, as they have become wealthy, as they have lost their will to live, to improve, they then have become subject to decadence that eventually destroys the civilization.” Nixon’s lament: “The United States is now reaching that period.”

America, he said, needed to find the “moral and spiritual strength” to shape the emerging post-Vietnam era.

The real lessons from 9/11
By The Economist

The fall of Saigon did not lead to the West losing the cold war. And for all America’s flaws—its divisions, debts and decrepit infrastructure—many facets of its power are intact. Its share of global GDP, at 25%, is roughly what it was in the 1990s. It is still technologically and militarily pre-eminent. Although public opinion has turned inwards, America’s interests are far more global than during its isolationist phase in the 1930s. With 9m citizens abroad, 39m jobs supported by trade and $33trn of foreign assets, it has a strong interest in an open world.

Foreign policy is guided by events as much as by strategy: Mr Bush ran on a platform of compassionate conservatism, not a war on terror. Mr Biden must improvise in response to an unruly age.

What Putin Really Wants
By Lilia Shevtsova

Russia’s rebellion threatens to turn geopolitics into a battle of threats — force on one side, sanctions on the other. Mr. Putin’s method is tried and tested: He ratchets up the tensions and then demands “binding agreements,” which he does not take seriously. The aim, really, is a Hobbesian world order, built on disruption and readiness for surprise breakthroughs.

Europe Thinks Putin Is Planning Something Even Worse Than War
By Ivan Krastev

A Russian incursion into Ukraine could, in a perverse way, save the current European order. NATO would have no choice but to respond assertively, bringing in stiff sanctions and acting in decisive unity. By hardening the conflict, Mr. Putin could cohere his opponents.

Today, geopolitical strength is determined not by how much economic power you can wield, but by how much pain you can endure. Your enemy, unlike during the Cold War, is not somebody behind an iron curtain, but somebody with whom you trade, from whom you get gas and to whom you export high-tech goods. Soft power has given way to resilience.

What Putin Learned From the Soviet Collapse
By Richard Connolly and Michael Kofman

Russian policymakers have drawn lessons from the tumult of the late Soviet experience, as well as the economic disruptions of the 1990s. Oil market crashes in 1986 and 1997 inflicted enormous budgetary shocks upon the Soviet Union and the fledgling Russian Federation. Among policymakers in Moscow, these shocks generated deep-seated fears of the impact that resource market volatility can exert on the financial stability of export-dependent economies.

Russia’s leadership has also absorbed the lesson that financial weakness curtails a country’s freedom of action on the international stage. In the late 1980s, Gorbachev faced limited options when confronted by tumult in the Warsaw Pact and the prospect of German unification. Leading states in the Warsaw Pact were heavily indebted to the West, while Moscow was constrained in its ability to prop up the faltering economies of these satellite communist regimes. Attaining German financial support was also a factor in Soviet acquiescence to German unification.

Subsequently, Russia was, in the eyes of most in Moscow, ignored on foreign policy matters throughout the 1990s. It was a great power in name only. Once Russia’s leadership paid back the country’s debts and reduced the state’s dependence on external finance, it began to restore the country’s global position.

Russia proved adaptable and resilient during the 2008 financial crisis, the more recent 2014–15 recession, and again during 2020’s COVID-19 pandemic–induced global recession. For all its faults, Russia’s policy elite has built a system that is able to weather oil price shocks, recessions, and sanctions better than at any time in the past. When oil prices collapsed in 1986, the Soviet leadership was forced to run huge budget deficits, print money (which caused inflation), and borrow huge sums from international creditors. In 2020, Russia ran a budget deficit of 3.5 percent (half that of European countries) financed almost entirely from its own considerable resources. These domestic resources have also helped Russia adapt to many of the challenges it has faced since Western sanctions were imposed in 2014.

How Putin took Europe to the brink of war
By Gideon Rachman

The longer-term question about how we got here goes back to the 2007 Munich conference. It was there that Putin made an angry speech denouncing the post-cold war order and the use of American power in Iraq and around the world.

The following year, Russia went to war in neighbouring Georgia. In 2014, the first attack on Ukraine and the annexation of Crimea took place. Western leaders threatened Russia with what Merkel called “massive damage . . . economically and politically”. But Russia weathered the sanctions and by 2018 was staging a successful World Cup, which ended with Putin hosting the presidents of two EU countries, France and Croatia, in the VIP box.

If Putin is now willing to shrug off the threats of western sanctions, it may be because he has literally heard it all before. A full-scale assault on Ukraine would, however, represent a massive escalation in his willingness to use force and accept confrontation with the west.

After more than 20 years in power, at the age of 69, he may now be in the legacy business. He has expressed a deep desire to “reunify” the Russkiy Mir, or Russian world, which, as he sees it, is now divided. Bringing Ukraine back into Moscow’s orbit could be seen as the completion of a historic task.

If he does indeed think like a 19th-century figure, Putin will believe that massive bloodshed is justified to unite the nation. After all, Abraham Lincoln’s army burnt down Atlanta in 1864 to preserve American unity. Otto von Bismarck fought three wars to unify Germany — and is still the subject of admiring biographies.

Many analysts, however, including Russian liberals, believe Putin is dangerously deluded if he believes those kinds of wars are still possible in Europe. Today, every atrocity committed by Russian forces will be recorded on somebody’s smartphone and broadcast around the world. Russia’s young people enjoy the same technological and social freedoms as their contemporaries in western Europe. Will they really accept the dangers, deprivations and moral opprobrium that Putin could bring down?

Can he, in short, continue to use 19th-century methods in the 21st century?

History will judge Vladimir Putin harshly for his war
By The Economist

After 22 years at the top, even a dictator with an overdeveloped sense of his own destiny has a nose for survival and the ebb and flow of power. Many Russians, unclear about a crisis that has come from nowhere, may be unenthusiastic about waging a deadly war against their brothers and sisters in Ukraine. That is something the West can exploit.

Accommodating Mr Putin in the hope that he will start to behave nicely would be more dangerous still. Even China should see that a man who rampages across frontiers is a threat to the stability it seeks.

Weakness at Home Drives Putin to Invade Ukraine
By David Satter

Many believe Mr. Putin wants to re-create the Soviet Union, but it is much more likely that in threatening Kyiv, he wants to re-create the “Crimea effect,” in which, according to Nikolai Petrov of Moscow’s Higher School of Economics, Russians “forgot their worries and felt everything was allowed and anything was possible.”

Consolidating Russians around the regime is almost certainly a priority because, despite appearances, Mr. Putin has created an unstable structure of power. There are signs that the population’s patience with massive corruption is waning. On Jan. 23, 2021, protests broke out all over Russia in reaction to the arrest of anticorruption blogger Alexei Navalny and the contents of his film “A Palace for Putin,” which describes the ruler’s 190,000-square-foot residence on the Black Sea, built in part with funds taken from a $1 billion program to improve Russia’s healthcare system.

Popular discontent with corruption has been building for years. A 2011 survey by the Russian Academy of Sciences found that 34% of Russians “always” wished “they could shoot down all bribe-takers and speculators,” while 38% said they “sometimes” did. Much of this corruption is associated with Mr. Putin. In 2017, after accusations that the then-Prime Minister Dmitri Medvedev had embezzled $1.2 billion, demonstrators in Moscow blamed the president as well, shouting: “Putin is a thief.”

Piece by Piece, Russia’s Rationale for a Ukraine Invasion Is Put in Place
By Anton Troianovski

Until recently, it appeared that many Russians had tuned out talk of an impending war. Pollsters say that while the possibility of war is one of Russians’ greatest fears, no antiwar movement has emerged in recent weeks because many simply can’t imagine it — or see how they can influence any decisions.

Russians “feel they can’t influence the process at all,” said Aleksandra Arkhipova, a Moscow social anthropologist, who found that there had been relatively little discussion of a possible war with Ukraine online before the propaganda barrage of recent days. “So they try to avoid it.”

On Sunday, a handful of activists unfurled antiwar posters in central Moscow’s Pushkin Square and were promptly arrested. One of the protesters, Lev Ponomarev, a Soviet-era human-rights activist, insisted that while for the moment many still could not imagine a war, most Russians would oppose it if it actually happened.

“There will be no support for this war,” Mr. Ponomarev said in an interview on Monday. “It will be the collapse of this regime.”

Get Ready for a Spike in Global Unrest
By Elise Labott

The global risk firm Verisk Maplecroft has warned that as many as 37 countries could face large protest movements for up to three years. A new study by Mercy Corps examining the intersection of COVID-19 and conflict found concerning trends that warn of potential for new conflict, deepening existing conflict, and worsening insecurity and instability shaped by the pandemic response.

The group found a collapse of public confidence in governments and institutions was a key driver of instability. People in fragile states, already suffering from diminished trust in their government, have felt further abandoned as they face disruptions in public services, rising food prices, and massive economic hardships, such as unemployment and reduced wages. Supply chains disrupted during the pandemic have seen food prices skyrocket, while in the global recession humanitarian aid budgets are being slashed, bringing many countries to the brink of famine. For the first time in 22 years, extreme poverty—people living on less than $1.90 a day—was on the rise last year. Oxfam International estimates that “it could take more than a decade for the world’s poorest to recover from the economic impacts of the pandemic.”

The shocks caused by the pandemic have also eroded social cohesion, further fraying relations between communities and deepening polarization. That is especially true in the United States, where social and political pressures both deepened the health crisis and were themselves worsened by it. All of this should serve as a clarion call to countries that they can’t prepare for, or respond to, future health crises in a vacuum—but must anticipate an economic, political, and social crisis. This is true for any severe shock, which brings the potential for a breakdown in public order.

The Fund for Peace’s Fragile States Index, which tracks social, economic, and political trends across 179 countries, found COVID-19 was the “first domino in a chain of events that ignited more longstanding and deep-seated grievances,” with impacts that will reverberate for years. The results show that fragility—whether in the social, economic, political, or security dimension—can develop anywhere, even in the wealthiest and most powerful countries in the world. In the event of a shock, even rich societies unable to pull together may be as vulnerable as the poorest country in the world.

US spies peer into the future – and it doesn’t look good
By Gordon Corera

The latest attempt by the US National Intelligence Council to understand what may happen within and between countries points to uncertainty and instability. It focuses first on the key factors driving change.

One is political volatility.

“In many countries, people are pessimistic about the future and growing more distrustful of leaders and institutions that they see as unable or unwilling to deal with disruptive economic, technological, and demographic trends,” the report warns.

It argues people are gravitating to like-minded groups and making greater and more varied demands of governments at a time when those governments are increasingly constrained in what they can do.

“This mismatch between governments’ abilities and publics’ expectations is likely to expand and lead to more political volatility, including growing polarisation and populism within political systems, waves of activism and protest movements, and, in the most extreme cases, violence, internal conflict, or even state collapse.”

Unmet expectations, fuelled by social media and technology, could lead to risks for democracy.

“Looking forward, many democracies are likely to be vulnerable to further erosion and even collapse,” the report warns, although adding that these pressures will also affect authoritarian regimes.

There are some optimistic scenarios for 2040 – one called “the renaissance of democracies”.

This involves the US and its allies harnessing technology and economic growth to deal with domestic and international challenges, while China and Russia’s crackdowns (including in Hong Kong) stifle innovation and strengthen the appeal of democracy.

Debt and Inflation Threaten U.S. Security
By Jeb Hensarling

The national debt this month reached $30 trillion. Not only is this the largest debt in U.S. history in dollar terms, but the ratio of debt to gross domestic product is 119%—the largest it’s ever been. And things are only getting worse. The Congressional Budget Office predicts that the mammoth debt-to-GDP ratio will double over the next three decades. The Highway Trust Fund will become insolvent this year. Medicare Part A will run out of money in fiscal year 2026 and Social Security will go bust in fiscal 2033.

The strength of the American economy was one of the main reasons the U.S. won the Cold War. It supported a defense buildup that the Soviet Union couldn’t match. But today, Russia’s economy supports a debt-to-GDP ratio of only 19%; China’s ratio is 68%.

Those who believe there can never be a debt crisis should look no further than the last financial crisis. For years, policy makers told us that Fannie Mae and Freddie Mac, the largest players in the housing market, would always remain solvent and strong. They were wrong. Fannie and Freddie failed, the housing market crashed, and the economy was brought to its knees. Fannie and Freddie weren’t problems until they were. Just as the national debt may never be a problem until it is.
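As a rough illustration of the arithmetic behind the figures Hensarling cites, here is a minimal back-of-envelope sketch in Python. The inputs are simply the numbers quoted above (the $30 trillion debt, the 119% debt-to-GDP ratio, and the CBO’s projection of a doubling over three decades), so the outputs are illustrative arithmetic rather than official estimates.

```python
# Back-of-envelope check of the figures cited above; inputs are the numbers
# quoted in the excerpt, not official data pulled from any source.

debt = 30e12        # national debt in dollars, as cited
ratio = 1.19        # debt-to-GDP ratio (119%), as cited

gdp = debt / ratio  # implied GDP consistent with those two figures
print(f"Implied GDP: ${gdp / 1e12:.1f} trillion")

years = 30
# Annual growth in the ratio needed for it to double over three decades
annual_growth = 2 ** (1 / years) - 1
print(f"Doubling over {years} years implies about {annual_growth:.1%} growth per year")

# The cited comparison ratios
for country, r in [("U.S.", 1.19), ("China", 0.68), ("Russia", 0.19)]:
    print(f"{country}: debt-to-GDP {r:.0%}")
```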

Competition Can Be Good for the Developing World
By Branko Milanovic

The approach that the United States followed during the Cold War can be broadly divided into two phases. During the first phase, just after the end of World War II, the United States concentrated its efforts on rebuilding the economies of Western Europe, Japan, and South Korea through direct aid and open markets. Washington sought to create in these countries an affluent middle class, on the assumption that such a demographic would be disinclined to vote for communist parties that challenged the private ownership of capital. This postwar period is associated with the Marshall Plan and is often considered the high-water mark of American hegemonic benevolence.

The American record during the second phase of the Cold War is much more checkered. In the 1960s, during the decolonization of much of the developing world, Washington at times supported regimes that were socially reactionary and uninterested in economic development, such as those in Congo, pre-revolutionary Cuba, the Dominican Republic, and South Vietnam. But at other times, it urged land reform and broadly based economic growth, for instance in Colombia, South Korea, and Taiwan. Gradually, a consistent Western view of economic development emerged—one based on modernization theory, which held that by creating a strong middle class, economic development would lead to democracy.

U.S. development advice to many countries during the mid– to late–Cold War period combined these two relatively simple but powerful theories of growth and distribution: countries should focus on economic growth, and growth would eventually take care of inequality. Higher and more equally distributed incomes would lead citizens to demand democracy. The same conditions would make democracy sustainable. Notably, the United States did not seek to impose democracy by insisting on institutional change before the economic conditions were ripe. Rather, democratization was to be reached indirectly, through economic growth and the fairer distribution of resources. The model worked most clearly in South Korea and Taiwan but also in Botswana, Costa Rica, Mauritius, and southern European countries such as Portugal and Spain.

Nonetheless, the 1980s saw the advent of a neoliberal economics that overtook this model. The new one called for reducing the role of the state and opening institutions and the “investment climate” to the private sector. Growth was supposed to follow. The end of the Cold War and the collapse of the Eastern bloc hastened the turn toward neoliberalism. U.S. advisers and the international organizations in which Washington played a leading role, such as the World Bank and other development banks, ceased to stress growth and redistribution. Instead, they promoted institutional reforms. When countries faced balance-of-payment crises, development banks provided loans that were contingent on “structural adjustment”: governments were expected to reduce spending, lower taxes, deregulate, and privatize.

These policies, together known as the Washington consensus, reflected ideological developments within rich countries themselves during the 1980s and 1990s. U.S. President Ronald Reagan and British Prime Minister Margaret Thatcher deemphasized the role of the state as a matter of principle. Moreover, as the competition between the Soviet Union and the United States waned and ended, neither the public nor the political class sustained much interest in the fate of developing countries. There was no longer any pressing need to convince other countries of the superiority of American arrangements, which now seemed obvious. The United States could dispense with special efforts to woo developing countries and no longer even needed to pay much attention to them.

The effects of the neoliberal model are difficult to disentangle from those of globalization, but the record of both together was mixed. China obviously benefited a great deal, but its domestic policy approach was often the very opposite of what neoliberalism advocated. India, after 1991 and even more recently under the government of Prime Minister Narendra Modi, came closer to following neoliberal recommendations. But for many countries in Africa, the 1990s and the first decade of the twenty-first century were marked by low, often negative per capita growth that made them fall even further behind the rest of the world than before.

If Western countries and the United States, in particular, plan to compete with China in the developing world, they must move away from an approach based on arguing for quixotic institutional reforms while denigrating the role of the state in economic development. Instead, the United States needs to fashion a more appealing approach that delivers tangible goods to the populations of developing nations. Some of these goods could take the form of old-fashioned dams, electric grids (less than one-half of the African population has access to power), water and sewage systems, and even productive investment in processing or manufacturing. Other investments could support education, health, urban development, wireless networks, or direct cash transfers to eligible populations. What is essential is that U.S. projects result in visible improvements in the daily lives of ordinary citizens.

Does America Really Support Democracy—or Just Other Rich Democracies?
By Jake Werner

The claim that the United States is at odds with most democracies may feel jarring, but that is only because U.S. leaders and media so often conflate the “world’s democracies” with the handful of rich countries, including former colonial powers in Europe (and Japan) and states that began as settler colonies, such as Australia and Canada. A 2020 New York Times article, for example, headlined the findings of a Pew Research Center poll this way: “Distrust of China Jumps to New Highs in Democratic Nations.” The poll was not, however, about “democratic nations.” Most of the world’s largest democracies—countries such as Brazil, India, Indonesia, Mexico, and South Africa—were not included, nor were many smaller democracies such as Botswana, Papua New Guinea, and Sri Lanka. It was instead a poll of people in (as Pew itself put it) “advanced economies.”

According to the Economist Intelligence Unit’s Democracy Index, democratic developing countries are home to twice as many people as rich democracies—three times as many, if one counts semidemocratic “hybrid regimes” such as those in Bangladesh, Nigeria, and Turkey. Yet the world’s many poor democracies remain largely peripheral to the worldview of U.S. policymakers. They enter into Beltway conversations only when they threaten regional stability or become useful in wider geopolitical conflicts.

This invisibility is understandable. Precisely because they are poor, the democracies of the global South exert far less influence over world politics and the global economy than their wealthy counterparts. The rich democracies account for about 15 percent of world population but enjoy 43 percent of global GDP as measured by purchasing power (59 percent in dollar terms), and their military budgets amount to nearly two-thirds of the world’s war spending. Many Americans also share a feeling of cultural or ethnic affinity with the rich democracies that does not extend to the poor democracies.

An international order responsive to the needs of the global South would have begun organizing a system for global vaccine production and distribution in May 2020, when the first promising vaccine candidates emerged. Instead, although billions of dollars of public funds enabled vaccine development, the production of vaccines (and their enormous profits) was left entirely to private pharmaceutical companies, creating devastating shortages. As for distribution, although the COVAX initiative promised a minimal level of global vaccine equality, it was hobbled when the rich democracies bought up most of the vaccine supply.

Under considerable pressure from a transnational coalition of public health, fair-trade, and global justice groups, the Biden administration has finally begun to act. In May, Biden agreed to support a waiver of the World Trade Organization’s intellectual property restrictions on COVID-19 vaccines. The rich democracies of the G-7 recently announced they intend to donate 870 million vaccine doses over the next year. These efforts, although welcome, fall far short of the eight billion doses needed to end the pandemic in developing countries. Even incorporating the new measures, Biden expects the pandemic in the global South to continue into 2023.

Focusing on donations is neither the fastest nor the best way to bring the pandemic under control. Far more effective would be expanding production in the global South itself and helping to establish a permanent public health infrastructure to prevent future disasters. The G-7 issued a vague promise to support such a program, but even if the intellectual property waiver eventually goes through (despite the emergency, discussions are expected to take many months, and Germany continues to block it), the rich democracies’ refusal to share technology and know-how with the rest of the world casts doubt on their intentions.

The disastrous delay in formulating a global pandemic strategy and the deep flaws in what is now emerging also presage a grim future as the climate crisis deepens. Here, too, the United States is at odds with most democracies. The small fraction of the world’s population that lives in today’s rich democracies or their predecessor states has produced around half of all greenhouse-gas emissions since 1751. In recognition of this historical responsibility and the disproportionate wealth the rich countries gained from all that resource usage, developing countries have demanded that wealthy countries bear most of the burden of resolving the climate crisis. The rich countries say they will support poor countries in their transitions to sustainable energy, but few significant investments have materialized.

Disputes over the pandemic and climate change are connected to a third constellation of issues splitting the rich and developing countries: industrial policy and intellectual property. Because the one poor country that has successfully defied the rich democracies on these issues is authoritarian China, analysts in Washington regularly exploit the “democracy versus autocracy” framework to legitimize their grievances. For example, an Atlantic Council report titled “Countering China’s Challenge to the Free World” asserts: “China engages in unfair economic practices that violate international standards, including: intellectual-property theft, subsidizing state-owned companies to pursue geopolitical goals, and restricting market access to foreign firms.”

Such practices certainly do challenge the power of the rich countries, but most of the “free world” would very much like to emulate them. The rules in question were set in the negotiations that established the World Trade Organization in 1995, when rich countries, at the behest of some of the world’s most powerful corporations, strong-armed poor countries into prohibiting development practices that previously were widely accepted. The refusal of democracies such as Brazil and India to make further concessions was a central reason the subsequent Doha Round of negotiations broke down in 2008, but the rich countries have pushed these principles further through bilateral trade and investment agreements.

The new rules banned practices that all the rich democracies employed in the past. The wealth of rich countries owes much to the theft of intellectual property—the industrialization of the United States, for example, would have been impossible if Americans had not stolen advanced British production techniques—not to mention more violent forms of theft, such as mass enslavement or the plunder of colonies. As for industrial policy, all the rich democracies employ its techniques. China’s much-vilified Made in China 2025 plan was modeled on Germany’s Industrie 4.0 strategy and the U.S. National Network for Manufacturing Innovation (also known as Manufacturing USA). Biden’s economic agenda aims to use the power of the state to secure U.S. control over high-value sectors. China’s success in vaccinating its people and grappling with the climate transition is founded not on Beijing’s hostility to democracy but on its ability to emulate the rich countries by breaking rules when convenient.

Several barriers stand between U.S. policymakers and a better pro-democracy agenda. First, U.S. economic growth has become highly dependent on the concentrated profits of corporations in the technology, pharmaceutical, entertainment, consumer brand, and financial sectors—the same businesses that present the biggest obstacles to raising global labor standards and liberalizing intellectual property rules. Investment in the United States and around the world is flowing toward opportunities to extract economic rents rather than toward creating jobs, building infrastructure, and raising productivity.

Is the role of democracy merely to provide a neutral framework within which individuals can freely exchange goods and ideas by reducing threats to liberty and property—that is, by providing “negative” public goods? Or should democracy also ensure the substantive provision of “positive” public goods such as health care, education, high-quality jobs, and capital investment? U.S. foreign policy has energetically acted in favor of negative public goods and dismissed positive ones, with American officials and experts of both parties frequently warning of the risks posed by state intervention in the economy. Washington has focused on market liberalization, individual rights, the rule of law, and defending the security of property and the freedom of navigation against a procession of villains: transnational criminals, “rogue states,” terrorists, and now China.

Yet negative public goods lose efficacy and legitimacy when they are divorced from positive public ones. U.S. aid efforts often fail for this reason. Consider, for example, a $31 million program in Guatemala that the U.S. Agency for International Development funded in recent years to create a smartphone app that would allow residents to track local government spending. The impoverished residents, more concerned about jobs than good governance, could not afford smartphones in the first place.

Another equally important public good is an enforceable regime of global labor rights, which would help reduce the desperate competition among workers that drives so much racism and nationalism and help boost consumer demand and popular support. Nearly every country aside from the United States has already committed to protect essential labor rights under the fundamental conventions of the International Labor Organization. The United States, despite its avowed embrace of a rules-based international order, is the major exception.

The recently concluded agreement for a global minimum corporate income tax, largely decided among the rich democracies, shows that it is possible for multilateral coordination to pave the way for a more equitable global economy. It also reinforces the division between rich and poor, however, by ignoring the desire of developing countries to raise revenue. The next step for reform could be an increase in the global corporate tax rate to establish a stable source of funding for development in the global South. Private foreign investment in developing countries has been piecemeal and volatile. What such places need are not quick returns but long-term, transformative investment to permanently increase their capacity to generate wealth—something that would not only put an end to the appalling poverty afflicting billions but also create enormous new opportunities for U.S. businesses and workers.

It’s not just the economy, stupid
By Rana Foroohar

We know now that markets are not perfect and that consumers and large multinational companies aren’t the only economic stakeholders. In this new world, trade-offs will have to be made among a larger and more diverse group of stakeholders, from businesses of all sizes to workers and the environment. Most people seem to understand that ever-cheaper goods have raised wages in some parts of Asia and created incredible profits for big companies, but they haven’t led to a healthier and more sustainable form of market capitalism. Liberal democracy hasn’t fared well, either.

The new world is, admittedly, messier and it will come with some downsides in the short term. Inflation, for example. Let’s face it, products made by robots, big companies and autocratic states that suppress wages are cheaper. We need to be honest about the inflationary implications of moving from a highly globalised economy to one in which production and consumption are more tightly geographically connected, and in which stakeholders, not just shareholders, have a voice.

But that doesn’t mean we should go back to the old, unsustainable paradigm, which led to environmental degradation, labour abuses, rising inequality and toxic politics.

January 6 and the Paradoxes of America’s Democracy Agenda
By Larry Diamond

For some two decades, political scientists have been worrying about the growing polarization of American politics, as evidenced in rising congressional gridlock, an unwillingness to compromise, and the maximalist, take-no-prisoners tone of cable news, talk radio, and social media. Well before Trump began to use the power and prestige of the presidency to trample on democratic norms, ratings agencies noted a decline in the quality of U.S. democracy. Analysts at Freedom House have shown the decline unfolding steadily between 2010 and 2020, dropping the country’s “freedom score” by 11 points—from 94 to 83—on a 100-point scale. Due in part to eroding public trust in democratic institutions, the Economist Intelligence Unit downgraded the United States to a “flawed democracy” in 2017. And in 2020, International IDEA (Institute for Democracy and Electoral Assistance), a widely respected international think tank focused on democratic development, classified the United States as a “backsliding democracy.”

Among Washington’s democratic allies in Europe and the Indo-Pacific region, as well as many emerging democracies worldwide, there is mounting concern, even alarm, at the deeply troubled state of democracy in the United States. As Americans tear their own country apart, fragile democracies are retreating before a tide of illiberal populism, dictatorships in China and Russia are surging in power and ambition, and the norms and restraints of the post–World War II liberal order—including the indispensable norm against territorial aggression—are crumbling. Last month, the Biden administration finally held its long-awaited Summit for Democracy to rally international resolve and push back against the illiberal tide. It was an important symbolic step, because despite the collective weight of the European Union and the geopolitical courage and generous assistance of small European democracies such as the Czech Republic, Lithuania, Norway, and Sweden—which refuse to be cowed by China and Russia—the United States remains the world’s most important democratic bulwark. A global democratic countermovement will find its energy and conviction challenged, however, as long as it depends for leadership on a democracy as troubled as the United States. This is the paradox of global democracy today: the fate of freedom still rests on a deeply flawed and unstable democratic superpower.

People do not generally cast their votes for or against democracy; the abuse of power has to get very bad and typically remain bad for a long time before it will become the dominant issue. So political forces seeking to defend or renew democracy must speak to other issues, in particular the economy, and they must craft the broadest possible coalitions in doing so. This requires going against the trend of polarization by showing respect for the concerns of people who previously backed illiberal options.

Although successful pushbacks against illiberalism must bridge partisan polarization, they often triumph by condemning corruption and crony capitalism and by mounting appeals to economic fairness and inclusion—promises also made by aspiring autocrats, who abandon them once in power and divert attention from their policy failures and limitations through appeals to identity and cultural grievance. Democracy’s defenders need to avoid the brutal divisiveness, contempt for institutions, intolerance of pluralism, and exaltation of the leader that define illiberal populism. But they should try to energize voters by expressing moral outrage and empathy for people’s insecurity and loss and, when possible, by putting forward charismatic candidates who embody a message of change.

Broadening the base of support for democratic reform is crucial, because authoritarians and illiberal democrats always seek to tilt the political playing field to a degree that requires the opposition to win a larger than normal victory. Oppositions that dismiss this danger—and fail to transcend their own divisions and alternative identity claims—typically falter.

In 2022 and 2024, elections must be squarely focused on the question of which party offers the people a fairer economic deal. Shifting away from identity politics would go against the grain of the moment, which is defined by demographic change and social media passions. It would require a disciplined focus on job creation, childcare support, early childhood education, health-care expansion, infrastructure investment, the new green economy, and bringing manufacturing jobs back to the United States. The three Democratic presidents who managed to serve two full terms in the last century—Franklin Roosevelt, Bill Clinton, and Barack Obama—all understood the need for a message of hope and optimism focused on bread-and-butter economic issues. Democratic success might also require a presidential candidate who can craft an outsider, anti-elite image more authentic and persuasive than the one Trump has perfected—but shorn of illiberal tendencies. In the near term, that might be the hardest paradox for American democracy to overcome.
