The Illusion of Free Economic Choices

In October 2024, Treasury Secretary Janet Yellen spoke at the Council on Foreign Relations, promoting the Biden administration’s “modern supply-side economics.” She claimed that increased government spending and capital directed toward favored industries could enhance growth and productivity. She contrasted this with the Republican supply-side economics of the 1980s, which relied on tax cuts rather than increased spending to achieve similar outcomes.

The new supply-side approach assumes that economic decisions come without trade-offs, a belief fueled by the low interest rates that persisted from the 2008 financial crisis until inflation surged in 2021. This environment created an illusion that budget constraints were no longer relevant and that any initiative boosting economic growth would be self-financing.

However, trade-offs are central to economic decision-making. Whether it’s taxation, government spending, or private investment, each choice affects different groups, creating winners and losers. Economists typically evaluate policies based on their costs and benefits, seeking optimal solutions within budget limits. While some infrastructure projects in the 20th century paid for themselves over time, most government spending redistributes resources from taxpayers to others. Similarly, private-sector investments involve risks and trade-offs.

Despite these realities, some economic theories, like Modern Monetary Theory (MMT), argue that governments need not worry about debt or resource scarcity. Although mainstream economists reject MMT, its ideas have gained traction among politicians and media outlets.

Ignoring trade-offs can lead to serious consequences. After the 2008 financial crisis, an $800 billion stimulus package was passed, though some advocated for even larger spending. The belief that low interest rates meant no cost for increased spending persisted through the 2010s. In 2020, the Federal Reserve adopted a more relaxed stance on inflation, and minimum wage hikes were implemented despite economic concerns.

During the pandemic, massive government spending was initially necessary to prevent economic collapse. However, as the economy recovered, some economists warned that the American Rescue Plan could trigger inflation. These concerns were largely ignored, as policymakers continued to promote industrial policies favoring specific industries.

Trade-off denialism also affected the private sector. Corporate leaders made decisions under the assumption that constraints didn’t matter. Initiatives like ESG (Environmental, Social, and Governance) investing and DEI (Diversity, Equity, and Inclusion) programs were promoted as ways to boost profitability, despite economic principles suggesting otherwise.

While ESG funds initially appeared to outperform, their performance during volatile markets revealed limitations. Similarly, DEI programs introduced new hiring restrictions, with studies showing little evidence of improved business outcomes.

The belief in a “free lunch” has historical roots, but skepticism usually acts as a counterbalance. Over the past decade, that skepticism diminished, partly due to the perception that free-market policies had failed. However, the period of expanding trade and market-driven policies since the 1980s was economically successful, despite challenges like job displacement.

Low interest rates contributed to the illusion of unlimited resources. MIT economist Olivier Blanchard suggested increased spending if real interest rates remained below economic growth rates. This advice overlooked key risks, such as rising interest rates and potential debt spirals.

As interest rates have risen, the private sector has begun to reassess ESG and DEI initiatives. Financial firms are retreating from climate-related investment mandates, and companies are dismantling DEI programs. This shift reflects a return to economic realism, where businesses must focus on profitability.

Public policy has been slower to adjust. Government officials continue to act as if borrowing is cost-free. Rising interest costs will increasingly burden the economy, requiring more spending on debt service and potentially leading to spending cuts or tax increases.

The era of assuming economic decisions come without trade-offs is ending. As former Treasury Secretary Lawrence Summers noted, “As long as you are given a credit card with a reasonably low rate, you’ll borrow and spend for as long as they’ll let you do that.” Now, the costs of this approach are becoming apparent.
— news from City Journal

— News Original —

The Danger of Ignoring Economic Trade-Offs

In October 2024, Treasury secretary Janet Yellen addressed the Council on Foreign Relations to tout the Biden administration’s “modern supply-side economics.” She argued that more government spending and capital directed toward politically favored industries would boost growth and productivity—a win-win. She was quick to distinguish this approach from Republican supply-side economics of the 1980s, which had promised the same benefits—not from more spending but from tax cuts.

What’s striking about this new supply-side revolution is its underlying assumption that economic choices come without trade-offs—that they represent a free lunch. This idea has become more prevalent in both the public and private sectors, fueled by the near-zero interest-rate environment that prevailed for much of the period beginning with the financial crisis in 2008 and ending only in 2021, when inflation began to spike. Cheap money enabled the illusion that budgetary limitations were gone, that opportunity costs no longer mattered, and that any initiative that grew the economy would pay for itself.

Yet trade-offs are central to economic thinking. Whether to tax or not, how much the government should spend, how an individual or a company invests—all create winners and losers. Economists assess policies based on costs and benefits, calculating optimal solutions within budget constraints. Yes, rare exceptions exist. Some twentieth-century infrastructure projects, for example, undertaken when the U.S. had little industrial capacity, generated enough growth to pay for themselves over time. But in most cases, government spending takes money away from some taxpayers and redistributes it to others. Likewise, private-sector investments come with costs and risks. CEOs face trade-offs in hiring, financing expansion, or making acquisitions. If resources were unlimited, no opportunity costs would exist, and no one would have to give up anything. But the world isn’t like that.

Lately, though, dissenters have challenged these basic realities, even in the economics profession. Modern Monetary Theory contends that government needn’t worry about debt or resource scarcity because resources aren’t so scarce after all. Though mainstream economists reject this approach for its denial of fundamental principles, MMT got buy-in from politicians and the media, among whom trade-off denialism is most prevalent.

Trade-offs are unavoidable, and ignoring them leads to consequences—often dangerous ones. Unfortunately, even as interest rates have risen, free-lunch thinking continues to show up in policy discussions. Though circumstances will likely dictate the return of economic sanity, the price of our illusions has already been high.

After the financial crisis, policymakers passed an $800 billion stimulus package—a huge outlay, though some had wanted a much larger one. At the time, concerns about the size of the debt and the risk of inflation from loose monetary policy tempered the appetite for even greater spending. But a narrative soon took hold that the sluggish recovery owed not to the severity of the recession, which had devastated household balance sheets, but to unresponsive policy. And since interest rates and inflation stayed low, the argument went, an even larger stimulus could have been financed at no cost.

Throughout the 2010s, policymakers from both parties maintained that inflation was licked and that no amount of spending, debt, or expansionary monetary policy risked bringing it back. An advocacy group even emerged to push the idea that the Federal Reserve should prioritize lowering unemployment as much as possible, as if there were no trade-off between prices, on the one hand, and a booming economy with a tight labor market, on the other. This position ignored decades of economic wisdom. In 2020, the Fed adopted a more relaxed stance on inflation. Legislators also began to hike minimum wages significantly—sometimes even doubling them—while dismissing the economic literature predicting that this would lead to job losses.

It all culminated in a pandemic-era spending bonanza. Some of the vast government expenditure was necessary to keep the economy from imploding when it was first shut down. After the recovery got under way, though, sober-minded economists—some of whom had previously supported aggressive spending—warned that the Biden administration’s nearly $2 trillion American Rescue Plan would ignite inflation. Their objections were ignored; the notion that spending could stimulate growth without potential downsides had taken hold among policymakers and the public.

Even after inflation began rising, the Biden administration didn’t back off, aiming to reshape the economy through industrial policy by steering capital toward its preferred projects. Some of these initiatives offered legitimate national security benefits, such as boosting domestic semiconductor production. But costs rarely figured in the discussion—either to taxpayers themselves or in the form of market distortions, higher interest rates, and lost private-sector investment. Instead, the administration’s efforts were framed as an “abundance agenda.”

The pitch went beyond semiconductors and clean energy. It also sought to deliver well-paid union jobs, expand child care, and fund a host of other progressive priorities—a vision that journalist Ezra Klein criticized as “everything-bagel liberalism,” where every policy goal could seemingly be achieved at once: we could build chip factories with a mandated racially diverse and unionized work force, and nothing but growth and efficiency would result.

The problem wasn’t confined to policymakers. Trade-off denialism seeped into the private sector as well. Corporate leaders began making crucial business and staffing decisions under the same belief that limitations didn’t matter. Until recently, a remarkable corporate delusion had become entrenched: that imposing binding mandates—whether concerning ESG or DEI—would somehow boost profitability. Yet, basic economics makes clear that a constrained optimization yields less than an unconstrained one.

Consider the ESG sales pitch: limiting your investment to firms that made only the “right” things, worked with the “right” people, and were overseen by the “right” boards would offer a higher return. A Royal Bank of Canada Asset Management report captured this spirit: “The bottom line: An overall strong ESG track record may help to increase shareholder value.” Or consider a 2019 McKinsey Quarterly survey on ESG: “The acceleration has been driven by heightened social, governmental, and consumer attention on the broader impact of corporations, as well as by the investors and executives who realize that a strong ESG proposition can safeguard a company’s long-term success. The magnitude of investment flow suggests that ESG is much more than a fad or a feel-good exercise.” Even those who should have known better—including fiduciaries—bought into it. By 2024, 38 percent of American institutional investors were in ESG funds, joined by a staggering 94 percent of European institutional investors.

True, for the first several years, ESG funds appeared to outperform unconstrained funds—largely because they heavily invested in tech. But the real test of any market strategy is how it holds up under different conditions, and ESG struggled in the more volatile pandemic-era market. This was entirely predictable. Individuals have every right to invest in companies that align with their values, but the notion that they could do so without any downside was always illogical, a belief that could seem plausible only in an era that rejected the reality of budget limitations and opportunity costs, and where capital was nearly free.

Another example of trade-off denialism was the growing adoption of Diversity, Equity, and Inclusion programs, which took many forms, including diversifying hiring. In theory, this might seem beneficial for business. After all, if hiring were restricted to any one group—privileged whites, as DEI proponents allege that it was—the talent pool would be reduced, leading to a less capable workforce. The DEI approach was reinforced by top consulting firms, which claimed that more ethnically diverse leadership would boost earnings and profitability. The premise was that people who look different necessarily think differently, offering perspectives otherwise not available. Many CEOs embraced this view, including Microsoft’s Satya Nadella. “Don’t think of this as a quota,” he said in an interview: “This is, in fact again, necessary for business success. I mean, think about what a diverse team can do in a multi-constituent world.”

Yet DEI amounted to introducing new and rigid hiring restrictions, replacing the supposed limitations on hiring—largely illusory—that excluded disfavored groups. It’s no surprise that more rigorous studies found no support for the idea that DEI initiatives make companies more successful and profitable. They may even pit employees against one another and render them less productive.

One can offer political justifications for initiatives like ESG or DEI. They could serve as marketing tools, say, or prove lucrative for consulting firms that profit from implementing these programs. But their widespread adoption in the corporate world—especially by Fortune 500 companies—defied economic logic.

The promise of a free lunch is nothing new. Politicians have always claimed that they can deliver something for nothing. But typically, sufficient skepticism acts as a counterweight in the public debate. Over the past decade, that skepticism waned.

One reason is the growing belief that so-called neoliberal, or free-market, policies have failed. As Bloomberg’s Clive Crook argues, neoliberalism at its core is a framework for evaluating policies through cost-benefit analysis. Rejecting free markets is, by extension, a rejection of trade-offs. It requires believing that more government intervention and less trade will lead to greater growth without risk, steady prosperity without setbacks.

The supposed demise of neoliberalism, however, is revisionist history. From an economic perspective, the period of expanding trade and market-driven policies since the 1980s has been remarkably successful. Over the last 40 years, inflation remained low, global poverty plummeted, and living standards rose for nearly all Americans. Of course, costs came with it—especially job displacement caused by trade and technology, and uneven growth across regions. Many societies, it’s fair to say, have not handled these downsides effectively.

But the argument from commentators on the left (and, increasingly, on the right) that free markets have failed supposes that the alternative—a government that could perfectly manage trade, more aggressively regulate capital, and efficiently direct resources to certain industries and places—would have been somehow better, or that it would not impose even bigger costs. That assumption is far from proven.

Why did this idea gain so much traction? It’s unlikely that it would have been so seductive if interest rates had been higher. When borrowing is cheap or even free, budget constraints seem to vanish. Even some leading economists got drawn in. In a 2019 speech at the American Economic Association, MIT economist Olivier Blanchard argued that policymakers should spend much more as long as real interest rates remained lower than the rate of economic growth. This advice ignored key risks—such as the possibility that the government was taking on long-term liabilities and that real interest rates could rise. It also failed to consider that accumulating more debt could itself push rates higher, leading to a potential debt spiral.
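To make the arithmetic behind that warning concrete, here is a standard textbook sketch of debt dynamics (the notation is illustrative, not Blanchard's own): let $d_t$ be debt as a share of GDP, $r$ the real interest rate, $g$ the real growth rate, and $p_t$ the primary deficit as a share of GDP. Then, approximately,

$$ d_{t+1} = \frac{1+r}{1+g}\, d_t + p_t. $$

When $r < g$, the coefficient on existing debt is less than one, so even persistent primary deficits leave the debt ratio converging toward a finite level; when $r > g$, outstanding debt compounds faster than the economy grows, and only primary surpluses can stabilize the ratio. The risk the skeptics flagged is that heavy borrowing can itself push $r$ above $g$, flipping the first case into the second: the debt spiral.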

Real interest rates don’t just determine how much policymakers pay for debt; they also shape private-sector decision-making. The risk-free rate serves as the foundation for valuing risk and pricing capital. When it drops to zero, it lowers the cost of corporate borrowing, once again conjuring the illusion of a free lunch. Companies can borrow endlessly and invest in anything with a positive return; it all seems to cost nothing. In such an environment, markets become more tolerant of wasteful spending and questionable business decisions.

A sensible response to low rates in the 2010s would have been to recognize their unpredictability and avoid assuming that they would stay low forever. Policymakers failed to use this period to put the American fiscal house in order—reforming entitlements, issuing long-term bonds to lock in low rates, and preparing for an eventual tightening of monetary policy.

By the time the pandemic hit, a decade of cheap borrowing had made such sober analysis increasingly rare. Such sloppy thinking showed up not only in public-health decisions—where policymakers failed to weigh the costs and benefits of school closures, business shutdowns, and child masking—but also in fiscal and monetary policy, which responded with an unprecedented wave of spending and accommodation. Even as the economy recovered, stimulus measures stayed in force. The result? Surging inflation and rising interest rates—the forgotten trade-offs.

Higher rates have already reached the private sector, where financial pressures are mounting. ESG and DEI have suddenly lost favor, with financial firms retreating from climate-related investment mandates and several prominent companies dismantling their DEI programs. Some of this shift stems from cultural and political backlash—alienating large portions of the workforce and customer base comes with trade-offs, too—but it also reflects a realization that ESG’s lower returns were inevitable. Unsentimental economics is reasserting itself: in a higher-interest-rate, inflationary environment, businesses must focus on their bottom lines.

Policy, however, has yet to adjust. Public officials continue to act as though borrowing is free. Even without a full-blown debt crisis, rising interest costs will weigh on the economy, forcing governments to divert more spending toward debt service. This, in turn, will require issuing even more debt at higher rates, leading to a cycle of rising costs and, eventually, the need for substantial spending cuts or tax hikes.

The free-lunch era is ending, whether policymakers want to acknowledge it or not. As former Treasury secretary Lawrence Summers put it: “As long as you are given a credit card with a reasonably low rate, you’ll borrow and spend for as long as they’ll let you do that.” Now the bill is coming due.
