The Politics and Economics Of a U.S. Debt Default


With federal spending in the United States rising to record levels in nominal terms, not to mention as a percentage of GDP, there's a growing amount of commentary suggesting that the U.S. government is bankrupt, and that it will eventually default on its growing debt. Although federal debt has not reached post-World War II levels, unfunded future liabilities that some suggest are in the $100 trillion range have analysts on edge.

In their 2009 book This Time Is Different, economists Carmen Reinhart and Ken Rogoff note that defaults are most likely when the debt/GDP ratio (not including unfunded liabilities) rises above 100%. The fact that we have yet to breach 100% has given optimists comfort, along with the fact that the cost of capital for the U.S. Treasury remains low. Mind you, Reinhart and Rogoff add that both Mexico and Argentina have defaulted when their debt/GDP ratios were in the 50% range. At the very least that tells us we're already in the danger zone.
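The thresholds cited above can be expressed as a toy calculation. The debt and GDP figures below are placeholders chosen for illustration, not actual U.S. data:

```python
# A toy check of the Reinhart-Rogoff thresholds cited above.
# The debt and GDP inputs are placeholder figures, not real U.S. data.

def debt_risk_zone(debt, gdp):
    """Return the debt/GDP ratio and the risk zone it falls in."""
    ratio = debt / gdp
    if ratio >= 1.00:
        return ratio, "above 100%: where defaults are most likely"
    if ratio >= 0.50:
        return ratio, "50-100%: Mexico and Argentina defaulted in this range"
    return ratio, "below 50%"

ratio, zone = debt_risk_zone(debt=13.5e12, gdp=14.5e12)  # placeholder figures
print(f"debt/GDP = {ratio:.0%} -> {zone}")
```

On these assumed inputs the ratio lands in the 50-100% band, which is the "danger zone" the paragraph above describes.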

One argument that supports optimism concerning our budget deficits has to do with the historical fact that the U.S. has never defaulted. The problem here is that default as it's traditionally understood is too narrowly defined. If calculated on a currency-value adjusted basis, Washington has short-changed its creditors before and it continues to do so. That has serious economic consequences, because capital flows away from economies where it is penalized rather than rewarded.

If default is thought of in traditional terms whereby investors are simply given a haircut on monies owed, default is less scary, less economically harmful, and internationally commonplace. In such circumstances the currency can remain sound and the damage to lending is not spread to those who are funding the growth of the private economy. As it applies to the United States, there's therefore an argument that an honest default would be better than what we are doing by stealth.

A brief history of defaults. The late Citibank chairman Walter Wriston is famous for saying "Countries don't go bust." The comment earned him a great deal of ridicule in the '80s, and still does, but looked at realistically, what he said was correct.

Indeed, it's rare that a country runs out of resources altogether, and that's no surprise considering the power of taxation that all governments possess. The greater truth is that governments short on funds have produced sophisticated cost/benefit analyses of potential defaults and have historically figured out how to pay back less than what is owed.

As Reinhart and Rogoff note, "most defaults end up being partial, not complete." Indeed, they found that in most cases, "partial repayment is significant and not a token."

Some might believe that militarily powerful countries are best able to secure monies lent to others, but even this is a mistake. Force is rarely necessary: countries have a strong incentive to make good on a large portion of what they owe, given their desire to avoid being shut out of capital markets altogether.

Not surprisingly, war considerations have historically served as a strong incentive among countries not to stiff their creditors. Specifically, Reinhart and Rogoff cite England's move to a gold standard in the aftermath of the Glorious Revolution as significantly facilitating frequent accession to the debt markets to fund war with France.

So while serial default makes marshaling the resources of war mobilization difficult, and is therefore a poor strategy, wars themselves have often signaled a looming default of another kind: devaluation. To Gustav Cassel, devaluation was the only way for governments to pay for wars not supported by taxpayers. He went on to note that governments in need of a greater share of resources during wartime used devaluation as a tool to make those resources less accessible to the average individual.

Looking at defaults globally, Greece has spent more than half its years since 1800 in arrears to creditors, Argentina has defaulted three times since 1980 alone, and Venezuela has defaulted ten times since 1830. As Reinhart and Rogoff put it in their book, "repeated sovereign default is the norm throughout every region of the world, including Asia and Europe."

Sovereign versus domestic default. It is here that things get a little more interesting. While debt is surely debt, there is most assuredly a distinction between monies owed by a country to its citizens and those which are owed to foreign creditors. To put it simply, governments are far more comfortable defaulting domestically than they are internationally, where global capital markets are a major constraint.

Evidence supporting the above claim is provided by Reinhart and Rogoff. While they are not explicit about how they measure inflation, in their research they found that inflation during external defaults averages 33 percent, while during domestic defaults the number spikes to 170 percent. Most notably, though not very surprisingly, they find that external defaults are highly correlated with banking crises.

This also explains why defaults on external debt are such big news. When countries fail to make good on monies owed to foreign creditors, banking crises frequently follow, and the likelihood of "contagion" among banks holding the debt increases.

Specifically, Rogoff and Reinhart point to Mexico's 1994 "tesobono" crisis as one that would have gone largely unreported had it not been for global exposure to Mexican debt. Turning to Argentina, though its government defaulted in 1982, 1989 and 2001, the '89 default merited little international mention because it was the country's citizens - as opposed to foreign banks - who suffered the haircut.

Rogoff and Reinhart found that the creation of the IMF has coincided with "more frequent episodes of sovereign default," and this seems logical due to the IMF's role in making government profligacy somewhat less painful. According to their calculations, investors are compensated for this growing risk with risk premiums "sometimes exceeding 5 or 10 percent per annum."

Of possible interest to those who follow gold, Reinhart and Rogoff found that there was relative banking calm from the 1940s to 1970s. This is hardly surprising when we consider the generally stable money values that prevailed during the years of Bretton Woods. The authors note that banking crises have increased since 1970, but oddly point to a "reduction and removal of barriers to investment inside and outside a country" as the likely culprit.

Perhaps not stressed enough is Cassel's view that the public tenaciously holds to the belief that a "krona is a krona" - and by extension, that "a dollar is a dollar" - despite the great significance of changes in a currency's value relative to gold. Global money since 1971 has possessed nothing in the way of intrinsic backing.

With that in mind, it should in no way shock us that banking crises have grown in number. Devaluation of money is as much a default as any other, and it has significantly changed the value of debt securities on the balance sheets of banks the world over, with surely negative consequences.

Consider the decline in the value of the dollar versus gold since 2001 alone: not only did it promote a flight to hard assets least vulnerable to currency debasement, such as housing, but it arguably debased the debt instruments held by the very banks that financed the rush into housing. Interest rates on home loans were then (and remain now) low, but the connection between the dollar's decline and the resulting crisis among financial institutions is not easy to ignore.

The U.S. has never defaulted? Reinhart and Rogoff observe that the U.S., Canada, New Zealand and Australia have never defaulted in the traditional sense, but they admit that at least the U.S. has defaulted in the form of currency devaluation. Specifically, they mention the abrogation of the gold clause in 1933, which meant that debt paid to American creditors (from 1928 to 1946 the US had no external debt) was repaid in paper currency rather than gold.

But if 1933 can be counted as a default episode, then it's easy to argue that the U.S. has been giving investors haircuts ever since, especially beginning in 1971. During that time the value of the dollar in terms of gold has collapsed from the $35 Bretton Woods fix, all the way to recent lows beyond $1,300/ounce.

So while the U.S. Treasury has never reduced the dollar amount of what it owes to creditors foreign or domestic since 1933, even partially, it has most certainly debased the dollar value of what it owes. To put it in simple terms, the U.S. has been a serial defaulter ever since President Nixon severed the dollar's link to gold in 1971. The French called it our "exorbitant privilege" to issue what remains the world's currency, and if gold is telling a realistic tale, we've used this privilege to stiff creditors.
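The scale of this stealth haircut can be sketched with a simple calculation, using gold as the benchmark the way the text does. The loan amount below is a hypothetical figure; the gold prices are the ones cited above ($35/oz Bretton Woods fix versus roughly $1,300/oz at recent lows):

```python
# Illustration of devaluation-as-default: what a creditor repaid in
# nominal dollars actually receives when measured in gold, the
# article's benchmark. Gold prices are the figures cited in the text;
# the loan amount is a hypothetical placeholder.

def gold_value_of_repayment(nominal_dollars, gold_price_at_lending, gold_price_at_repayment):
    """Fraction of the gold value originally lent that the repayment buys back."""
    ounces_lent = nominal_dollars / gold_price_at_lending
    ounces_repaid = nominal_dollars / gold_price_at_repayment
    return ounces_repaid / ounces_lent

fraction = gold_value_of_repayment(1_000_000, 35.0, 1_300.0)
print(f"Creditor recovers {fraction:.1%} of the gold value lent")  # about 2.7%
```

In gold terms, full nominal repayment at these prices returns less than 3 percent of what was lent, which is the sense in which the text calls the U.S. a serial defaulter.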

Not only did gold-defined money eliminate inflation, but it also kept governments honest. And while the dollar remains the dollar, in gold terms the greenback has changed profoundly in this decade alone. Using gold as the benchmark, the default that some say looms has long been underway, particularly in the '70s and the decade just passed.

What's so scary about a US default? The answer is easy if we expand the definition of debt default to include currency devaluation. In that sense, default is problematic because all entrepreneurial activity and all jobs are the direct result of savings meant to fund them.

If Treasury continues to reduce the real value of its debt by allowing the dollar to fall, then we all suffer as limited capital moves into unproductive hard assets least vulnerable to devaluation. Our ability to find work with innovative companies is compromised if money is so debased that investors go on strike. As savers we also suffer from currency devaluation, since debasement reduces the value of the funds whose consumption we've chosen to delay. Inflation is a tax, and when Treasury policy errs in favor of currency depreciation, its defaults are paid for by us, the savers.
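The "inflation is a tax" point can be made concrete with a minimal sketch. The 4% annual depreciation rate and the $10,000 balance below are assumed figures for illustration, not numbers from the text:

```python
# A minimal sketch of inflation as a tax on savers: the real purchasing
# power of a fixed nominal balance after years of currency depreciation.
# The 4% rate and $10,000 balance are assumed, illustrative figures.

def real_value(savings, annual_inflation, years):
    """Purchasing power of a fixed nominal balance after compounding inflation."""
    return savings / (1 + annual_inflation) ** years

balance = 10_000.0
after_decade = real_value(balance, 0.04, 10)
implicit_tax = balance - after_decade

print(f"Real value after 10 years: ${after_decade:,.2f}")
print(f"Implicit 'default' paid by the saver: ${implicit_tax:,.2f}")
```

At an assumed 4% a year, roughly a third of the saver's purchasing power is quietly transferred away over a decade, without any nominal dollar ever being withheld.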

But assuming a more traditional default, it bears asking whether this wouldn't be a very good thing. Milton Friedman famously argued that government spending itself is the truest tax, as governments vacuum up limited capital to fund their profligacy. So if the U.S. Treasury were to default, it would be the federal government, not the average American, that would suffer the consequences.

To see why, we might consider the California-based technology firm Oracle. Though Oracle is based in a state with nosebleed levels of debt, it would be naïve to think that if California were to default, Oracle's cost of capital would rise. More realistically, the California government's reduced attractiveness as a place to invest money would redound to the credit of a blue-chip firm such as Oracle, along with any other company known to treat the capital entrusted to it well.

At present, the U.S. economy surely suffers from the eagerness of federal and state governments to run up all manner of debt for activities that in no way stimulate real economic growth. So if a federal default makes it more difficult for Treasury to raise money in the capital markets, that money would have to go somewhere. It's easy to argue that it would find more productive uses outside Washington.

For supporting evidence we need only look to the global economy in the aftermath of World War II. According to Reinhart and Rogoff, the years after the war were the greatest era for default in modern world history, with countries representing 40% of world GDP either in default or rescheduling debt.

Yet the economies of formerly war-ravaged countries such as Japan, Germany and France grew with great gusto in the post-war years. Assuming another period of capital market skepticism when it comes to government debt, much the same could occur again.

Conclusion. For as long as there have existed deep capital markets able to fund the needs of governments, there have been defaults on the debt raised by governments. During that time, investors have been compensated to varying degrees for the risks they've taken.

Sadly, inflation in the post-Bretton Woods era has given governments another way to reduce their debts, and savers around the world have suffered, not to mention entrepreneurs and those eager to work for them. It should be stressed that devaluation is merely a more subtle form of default, and the U.S. has used it with great vigor over the last four decades.

In contrast, government default in the traditional sense is not something we should fear. Governments can only spend what they've taxed or borrowed from the private sector first, so while default would surely singe investors reliant on steady income paid for on the backs of taxpayers, it would serve as a boost to the private economy thanks to investors starving the government in favor of growth-enhancing initiatives in the private sector.

At this point U.S. politicians have shown no spending discipline no matter the party in power. If market discipline proves the only cure for the spending disease, then we should embrace a U.S. debt default without reservation.

John Tamny is editor of RealClearMarkets, Political Economy editor at Forbes, a Senior Fellow in Economics at Reason Foundation, and a senior economic adviser to Toreador Research and Trading (www.trtadvisors.com). He's the author of Who Needs the Fed?: What Taylor Swift, Uber and Robots Tell Us About Money, Credit, and Why We Should Abolish America's Central Bank (Encounter Books, 2016), along with Popular Economics: What the Rolling Stones, Downton Abbey, and LeBron James Can Teach You About Economics (Regnery, 2015). 
