China: America’s Greatest Threat

The United States has gotten China wrong for the better part of four decades. Politicians, policy makers, academics, businessmen, and others naïvely assumed that China’s communist totalitarian system would evolve toward democracy and freedom. These elites did not understand that engaging with the 94 million-member Chinese Communist Party (CCP) is like having unprotected sex.

It is only recently that those who control the commanding heights in the United States have given due regard to the reality that CCP-controlled China, which regards democracy as an existential threat, is a menace to America’s security and prosperity. The reasons why are apparent from China’s activities in the South China Sea.

If there is to be a great power military conflict in the future, it will most likely involve a rising China challenging a predominant America. The list of China’s strategic initiatives is lengthy; everything from becoming a world leader in science and technology to economics and business to military might. The U.S. now faces a rising power, a confident, ambitious country that wants to supplant America’s role as the current global hegemon.

This goal is demonstrated by China’s actions in the South China Sea, which is strategically important to China’s goals and is one of the battlefields on which the competition between China and the United States will play out.

The South China Sea is a part of the western Pacific Ocean and borders southern China, Taiwan, Vietnam, the Philippines, Malaysia, Indonesia, and Brunei. More than $5 trillion in trade flows through it, roughly 30 percent of all global maritime trade. A major shipping route, the sea also accounts for about 10 percent of the world’s fisheries and holds potentially significant oil and natural gas deposits.

As the region’s link between the Indian and Pacific Oceans, the South China Sea is a vital trading and military route for the countries that surround it as well as for larger Asian economic powers, including Japan and South Korea. The country that controls the South China Sea has a strategic advantage in the region and a huge influence over global seaborne trade.

Xi Jinping, general secretary of the CCP, claims all of the South China Sea — lock, stock, and oil barrel — as sovereign territory. He backs up his claims by aggressively building military installations on existing islands, dredging new islands out of the sea itself, and constructing airfields, “missile defense systems” and harbors that are essentially naval bases.

China bases its claims to the South China Sea on historical records from the Xia and Han dynasties that are thousands of years old. It is unlikely that Japan, Vietnam, and South Korea will stand by while China exploits them. The United States, as an ally of Japan, South Korea, and the Philippines, could be drawn into disputes surrounding these claims. Actions by China’s maritime forces against the Japanese-administered Senkaku Islands in the East China Sea are another area of concern.

Following an appeal by the Philippines that China’s actions violated the United Nations Convention on the Law of the Sea, the Permanent Court of Arbitration ruled in 2016 that there was no legal basis for China to claim historic rights, while also finding that there had been several violations of the obligations set out in the Convention.

China refused to accept the court’s ruling and has continued militarization of the artificial islands with impunity. This is an expression of China’s newfound military and political power and its might-makes-right approach to international affairs. China’s expansion in the South China Sea is equivalent to Russia’s annexation of the Crimea in 2014.

America should not try to contain China unilaterally, but rather assemble a broad coalition with nations including India, South Korea, Japan, and the Philippines to confront, resist, and sanction China in the same way as it partnered with NATO and others to contain the Soviet Union during the Cold War.

Poker And Risk Management

Gambling – the willingness to take actions whose outcomes cannot be known for certain – is a basic human instinct. The riskier you perceive a particular action to be, the higher its potential payoff should be to justify your taking the action.

As it happens, the risk inherent in many actions can be roughly quantified. You can rank actions by their estimated riskiness, compare them to each other and to their potential payoffs, and make intelligent judgments about which (if any) actions to take.

This is known as managing risk. And the widespread failure to manage risk sensibly was a major reason why the financial industry melted down so catastrophically in the fall of 2008. To their peril, Wall Street firms relied on oversimplified models for managing their risk.

Many people insist that financial markets are simply a large collection of gambling casinos that offer investors a variety of “games” to bet on. This is almost right – but the almost is significant.

When you walk into a casino, you face an immediate choice. Are you going to play slot machines and table games like roulette and craps, or seek out the poker rooms?

If you choose slots or table games, you are likely to lose because you are playing against the house. The payoffs of these games are structured (with the blessings of state gaming commissions) to give the house an edge that assures you will lose in the long run. This is unlike the situation in the financial industry.

But if you choose the poker rooms, you have a chance of winning, because you are playing against other gamblers like yourself. The house simply hosts the games (i.e. provides the space, tables, and chairs, decks of cards, professional dealers and so on) and takes a modest cut of the pot for doing so. This is a lot more like the situation in the financial industry.

You can sit down at a table and become a “player” (which is like being a “professional investor” in the financial industry). But you have another option.

You can engage in side betting. People who visit poker rooms simply to watch the games can place side bets among themselves about the winner of the next hand. But since these spectators are unable to influence the hand’s outcome, their betting decisions simply reflect their estimates of the raw probabilities.

But there are ways to refine your initial assumption about the win/lose probabilities.

One way is to simply watch half a dozen or so hands and see which player or two seems to be dominating, then make a subjective judgment about that player’s probability of winning. Another is to look at the chip stacks in front of each player. If one player’s stack is twice as large as anybody else’s, it may be evidence of that player’s superior poker skills.

But suppose you recognize at the outset that one of the players is Jennifer Harman or some other highly regarded poker maven who tends to win a significant percentage of the hands he or she plays. You reflect this by assigning the maven a higher win probability. You place most of your bets on the maven winning, possibly adjusting the size of each bet based on how well the maven is doing as the game progresses and what kind of payoff odds you’re getting from the other spectators.
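That last step — weighing your estimated win probability against the payoff odds the other spectators offer — is at bottom an expected-value calculation. A minimal sketch in Python, with a hypothetical helper function and made-up numbers:

```python
def side_bet_edge(win_prob, payoff_odds):
    """Expected profit per $1 staked on the maven winning a hand.

    win_prob    -- your subjective estimate that the maven takes the hand
    payoff_odds -- what other spectators pay per $1 staked if you win (b-to-1)
    """
    return win_prob * payoff_odds - (1 - win_prob)

# Suppose you judge the maven a 40% shot, but the other spectators
# price her like a 25% shot and offer 3-to-1 odds:
edge = side_bet_edge(0.40, 3.0)  # 0.40*3 - 0.60 = +0.60 per dollar staked

# At fair odds the edge vanishes: a 25% shot at 3-to-1 breaks even.
fair = side_bet_edge(0.25, 3.0)  # 0.0
```

The bet is attractive only when your probability estimate beats the one implied by the odds — which is why refining that estimate, by watching hands or spotting a maven, is where the spectator’s real work lies.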

An important point stands out about this poker example: You have a relatively large number of variables to keep track of, and their interrelationships and relative impacts are constantly changing.

This is especially true in financial markets. During the years leading up to 2008, many firms bowed to the temptation to oversimplify their risk models. Those models turned out to be less than worthless when the proverbial expletive hit the fan: they did not merely fail to warn of the explosion, they helped light the fuse.

Politicians’ Contempt For The Truth

Russell Baker, the Pulitzer Prize-winning American journalist, said in his memoir that covering Washington was just a matter of sitting in grand marble halls waiting for someone ever more important to come out and lie to you. One could easily make the case that this happens with considerable regularity in state houses and city halls throughout America.

If people think politicians can get ahead without untruths, they’re lying to themselves. Politicians have always had a distant relationship with the truth. They have always lied, are constantly lying, and will always lie. It no longer matters if statements have any basis in reality. Get over it.

Some political lies can lead to unnecessary war. Still others conceal illegal behavior. To politicians, what matters is firing up supporters and getting reelected. They promise heaven on earth, and when they can’t deliver, they spin, evade, manipulate the numbers and knowingly engage in falsehoods.

President Trump’s self-serving whoppers are overwhelming and are memorably labeled as B.S. The President’s body of falsehoods is singular in its multiplicity. He may be an outlier, but he is hardly unique in deliberately saying something untrue. The truth about lies is that politicians have always told them. The exception, of course, is America’s first President, George Washington, who could not tell a lie, unlike most politicians, who cannot tell the truth.

Trump is not the only one lying. Recall a number of prominent presidential lies. Some were just as egregious, such as when President Obama told the American public over and over that “if you like your health care plan you can keep it.” Worse still were the many falsehoods President George W. Bush told in the run-up to the Iraq war, which were very damaging to the United States. Or when President Clinton shamelessly said in 1998, “I did not have sexual relations with that woman, Miss Lewinsky.”

Then there was President George H. W. Bush’s “Read my lips, no new taxes.” And of course, “People have got to know whether or not their President’s a crook. Well, I’m not a crook” by President Richard Nixon. Truth tellers in politics are an endangered species. Polling data shows politicians among the least trusted actors in society.

But do the American people care about the veracity of what politicians say? Or do they simply want to hear “their truth”? People have a tendency to view information familiar to them as the truth and to search for other information that reinforces their beliefs. Daniel Kahneman, psychologist and winner of the Nobel Memorial Prize in Economics, calls this “cognitive bias”: we tend to avoid facts that force our brains to work harder.

People live in their own social network bubble in the digital world. They go on the internet to search for information that confirms their convictions. They know more but understand less, dividing into hostile tribes. They see the world as a battle between left and right, each living in separate worlds.

They fish in different information streams. Politicized media outlets and online social networks put out completely different representations of the truth. Extreme partisanship is not a new problem. George Washington warned about the dangers of it in his Farewell Address in September 1796.

With social media, lies have the capacity to spread faster than ever before. It is a cheap and easy way to disrupt political discourse. After all, birds of a feather flock together. These days, anybody with a Twitter account can throw spaghetti at the wall and see if it sticks and for how long.

You would be right to conclude that Machiavelli would, with a few exceptions, have a lot to learn from public figures in the age of post-truth politics. The country is beset with tribalism, having forgotten the forefathers’ motto “E pluribus unum,” imprinted on every coin as a reminder that a nation of immigrants should not divide into warring tribes.

In Praise Of Negative Interest Rates?

Negative interest rates are widely discussed these days as a monetary policy tool to support economic growth. President Donald Trump is a huge fan of low or negative rates and has been browbeating Federal Reserve Chairman Jerome Powell, whom he appointed in 2017, to cut interest rates to zero or even lower. Powell and his colleagues should think long and hard before capitulating.

The apparent goal is to keep the economy percolating until after next year’s presidential election. The Fed appeased the President last month with a modest quarter-point cut and, egged on by Trump, seems poised to lower rates further this year.

Negative interest rates have become commonplace in Europe and Japan. Central banks in Denmark, Switzerland, Sweden, Japan, and the European Central Bank have slashed rates below zero to shore up weak economies or strengthen their currencies. The notion is that weakening a country’s currency makes it a less attractive investment than other currencies, giving the country’s exports a competitive advantage. Worldwide, there is more than $17 trillion in debt with negative yields, almost half of it in euros. The majority of the balance is in Japanese yen. Almost all of it is sovereign debt.

Central banks usually pay commercial banks interest on the reserves they keep at the central bank. Under a negative rate policy, the commercial institutions are required to pay interest on any surplus cash beyond what regulators say banks must keep on hand. This penalty is designed to incentivize commercial banks to lend more money. The view is that low or negative interest rates encourage businesses to invest and consumers to spend rather than pay a fee to keep their money safe. Loans put money into circulation and generate economic activity.
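The penalty mechanics can be made concrete with a small sketch. The figures below (a $2 billion surplus reserve balance and a deposit rate of plus or minus 0.5 percent) are illustrative assumptions, not any particular central bank’s policy:

```python
def annual_reserve_flow(excess_reserves, deposit_rate):
    """Interest a commercial bank earns (or pays, if the rate is
    negative) on surplus cash parked at the central bank for a year."""
    return excess_reserves * deposit_rate

# At a positive rate, the central bank pays the commercial bank:
earned = annual_reserve_flow(2_000_000_000, 0.005)   # +$10 million a year

# At a negative rate the flow reverses: idle cash becomes a cost,
# nudging the bank to lend the money out instead of parking it.
paid = annual_reserve_flow(2_000_000_000, -0.005)    # -$10 million a year
```

The sign flip is the whole policy: a $10 million annual charge on idle reserves is meant to make almost any loan look better than doing nothing.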

Lower or negative interest rates present both costs and benefits for consumers.

Imagine if you go to the bank for a loan and are told the bank will pay you for taking it. Who in their right mind rejects such an offer? Conversely, if you make a deposit, under a negative interest rate scenario you are actually paying the bank to hold your money.

A big concern, which has yet to be explained, is the impact of negative interest rates on money market funds, which are a foundational investment for many households. Negative interest rates reward borrowers at the expense of lenders or savers. The goal is to bring future consumption into the present.

One potential danger of this approach is the liquidity trap that occurs when interest rates are so low that they reduce the flow of money to the Main Street economy. Instead, it goes into investments that don’t generate economic activity, such as the stock market, as people desperately chase higher yields and push up stock prices.

Interest rate cuts tend to stimulate the stock market by making real returns on bonds less competitive. The President seems to think that makes for good economic policy. Negative interest rates might actually lead to lower interest costs on government debt. Debt service is one of the fastest growing drivers of federal spending.

Low interest rates are old hat. Even during the Obama administration, when the economy rarely topped 2 percent annual growth, business did not pick up when money was cheap. For the last decade, the low interest rate scenario has been a secret tax on savers, who are not generally speculators in the stock market.

Millions of Americans are either behind in the race to save for retirement or living off their interest income. They may spend less in a negative interest rate environment, which would reduce economic activity.

How using this unconventional monetary policy will work in the United States is a mystery. It could leave the Fed without any ammunition when an actual recession hits and could increase the likelihood that the President is reelected. One can only hope that Powell and company make the right economic call.

Stakeholder Capitalism. Really?

A recent announcement by nearly 200 CEOs that corporations should serve more than the bottom line may be great public relations, but don’t hold your breath waiting for big changes in the way corporate America operates.

For four decades the popular conception has been that a corporation exists to maximize returns to shareholders. This conceit is the work of economists. Milton Friedman, who was awarded the Nobel Prize in Economics in 1976, made the case in a famous 1970 New York Times Magazine article that the social responsibility of business is to increase profits. It laid the intellectual foundations for the shareholder value revolution of the 1980s.

As he put it “there is one and only one social responsibility of business – to use its resources and engage in activities designed to increase its profits.” His former students popularized the idea that the great challenge of corporate governance is getting executives (agents) to act in the interest of the shareholders (principals).

This view caught on and became conventional wisdom, as universally accepted as the idea that the sun revolves around the earth once was. Over time the U.S. stock market has focused strongly on corporations’ quarterly earnings to the point that a penny up or down from expected earnings per share can cause the stock price to fluctuate.

This has created a number of potential problems. For starters the short-term focus of the stock market dictates a short-term approach by management, at the expense of long-term shareholder value. For example, management might decide to shower cash on shareholders and not invest in research and development on projects that would only pay off down the road. Also, market pressures could tempt managers to cheat or manage earnings to meet investor expectations, especially since the compensation of CEOs and other executives is linked to stock performance.

This August, nearly 200 chief executives of major American corporations including Apple, Amazon, General Motors, and Walmart – all members of the powerful U.S. Business Roundtable – announced that corporations should no longer just maximize profits for shareholders but also benefit other stakeholders including employees, customers, and citizens.

Is this all just rosy rhetoric or a real change in mission? Will these corporations, who are people, really nice people, now use house money to support expansive social goals that are irrelevant to maximizing shareholder returns? This rhetoric about the purpose of a corporation won’t even rise to the level of the inconsequential unless executives address basic questions.

Will they argue for changes in how they are paid, how corporations are taxed and regulated and focus instead on the long-term health of their companies? What metrics will these executives use to measure stakeholder returns? How will corporations pivot away from the needs of activist short-term investors? How will they balance the needs of multiple stakeholders to create value for all these shared interests? How will executives resolve stakeholder conflicts? What tradeoffs have to be made? There are more unasked and unanswered questions than positions in the Kama Sutra.

New York Times Columnist Farhad Manjoo believes the new mission statement is all foam and no beer. He cynically says these CEOs want you to know how much they care, but they will continue to eat your lunch while virtue signaling. Many others are quite skeptical that corporations will change the way they behave.

Former General Electric CEO Jack Welch said in a 2009 interview with the Financial Times that: “On the face of it, shareholder value is the dumbest idea in the world.” This comment may be the height of irony given that when he ran GE, the firm consistently met or beat analysts’ quarterly earnings forecasts.

One thing is for certain. The time is long overdue to shift the focus of corporations away from maximizing shareholder value and stock-based executive compensation. But don’t hold your breath. This is like asking business executives to perform surgery on themselves.

The Merrill Lynch story

The weekend of Sept. 13 and 14, 2008 was one of the worst ever on Wall Street. And when Lehman Brothers went bankrupt on Sept. 15, it triggered a global financial panic.

Also over that weekend, Bank of America and Merrill Lynch hammered out one of the biggest deals in Wall Street history in less than 36 hours. The feds pushed for a deal to prevent Merrill from becoming the next domino to fall. With Lehman preparing to file for bankruptcy after failing to find a buyer, executives at both Bank of America and Merrill knew they needed to act quickly as Merrill’s liquidity was evaporating.

Merrill Lynch was founded in 1914 by Charles Merrill and his friend Edmund Lynch. During the next 30 years, it grew by a series of mergers and acquisitions into the nation’s largest and best-known retail brokerage firm. Just as Lehman Brothers had epitomized the “aristocratic German-Jewish culture” in the financial industry, Merrill Lynch became a symbol of “working-class Irish Catholic culture” (like New York City’s police and fire departments). Not that it mattered much when push came to shove in September 2008.

In 1971, Merrill Lynch became a publicly traded corporation. And in 1978, it acquired the small but prestigious investment bank White Weld & Company to expand its underwriting activities and take advantage of the ability of its huge retail brokerage arm to place new common stock issues with investors directly rather than through syndicates composed of other firms.

But by 2000, Merrill (like Lehman and Bear Stearns) was becoming increasingly dependent on its collateralized mortgage obligations business to grow profits. Merrill goosed this growth by more than doubling its leverage ratio, from 19 to 1 in 2003 to 39 to 1 in 2007, which allowed it to provide its common stock holders with a 13 percent increase in investment returns during this period.
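The arithmetic behind that trade-off is worth spelling out. Here is a simplified sketch with hypothetical round numbers; it ignores the interest cost of the borrowed funds, which real leverage accounting would include:

```python
def return_on_equity(asset_return, leverage_ratio):
    """Equity return when every dollar of equity controls
    `leverage_ratio` dollars of assets (funding costs ignored)."""
    return asset_return * leverage_ratio

# A 1% gain on assets, at 19-to-1 versus 39-to-1 leverage:
roe_19 = return_on_equity(0.01, 19)   # 19 cents per dollar of equity
roe_39 = return_on_equity(0.01, 39)   # 39 cents per dollar of equity

# The sword cuts both ways: at 39-to-1, a loss of roughly 2.6% on
# assets (0.026 * 39 ≈ 1.0) is enough to wipe out the equity entirely.
```

Leverage multiplies whatever the assets do, which is why a thin sliver of equity under a mountain of mortgage securities proved so fragile.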

By 2006, Merrill had leaped to the top spot in the nation’s collateralized mortgage obligations business, underwriting $35 billion in these securities, 40 percent of which were backed by sub-prime mortgages. To help secure its position, Merrill spent $1.3 billion to acquire First Franklin, one of the nation’s largest originators of sub-prime residential mortgages. This gave it a major in-house mortgage originator and reduced its dependence on buying mortgages from numerous banks and home loan firms to back new underwritings of collateralized mortgage obligations.

Concerns about Merrill’s viability increased during the summer of 2007, when two Bear Stearns hedge funds defaulted. As a short-term lender to these funds, Merrill seized $800 million of Bear’s mortgage assets and proceeded to auction them off in the secondary markets. But the auctions failed to generate reasonable bids for the sub-prime mortgages and highlighted Merrill’s exposure to these “toxic waste securities”. For the last quarter of 2007 and the first three quarters of 2008 combined, Merrill wrote down more than $46 billion on bad bets on real estate and other mortgage-related instruments.

These write downs had severe consequences for Merrill: the firm’s stock price fell significantly, Moody’s Investors Service placed Merrill’s long-term debt “on review for a possible downgrade”, traders in other firms lost confidence in the firm’s ability to meet its trading obligations, and the firm had to increase its equity capital by selling off assets such as its 20 percent stake in Bloomberg for a much-needed $4.4 billion.

Additionally, between May 2007 and September 2008, Merrill laid off over 7 percent of its employees. Its board ousted CEO Stan O’Neal in October 2007, though he retained $30 million in retirement benefits and $129 million in stocks and options.

Merrill’s continued write-downs of toxic mortgage assets, increasing operating losses, and difficulty refinancing its short-term borrowings made it clear that its days as an independent firm were numbered. On Sept. 14, 2008, Merrill agreed to sell itself to Bank of America.

Financial markets are prone to instability. But when paired with excessive financial leverage, the result can be severe economic pain.

The Lehman Brothers story

Next month is the 11th anniversary of the fall of the famed investment banking firm Lehman Brothers (“Lehman”), which froze up the nation’s credit system when it collapsed on Sept. 15.

The firm was founded as a dry goods business in 1850 in Alabama by brothers Henry, Emanuel, and Mayer Lehman. Lehman began focusing on cotton trading and moved to New York during the late 1850s. That office eventually became its headquarters.

By 1900, Lehman had begun moving into underwriting new issues of common stocks for corporate clients, as well as bond trading and financial advisory services. During the ensuing decades, it underwrote issues for corporations like Sears Roebuck, RCA, and Macy’s. In 1984, Lehman was acquired by American Express and merged with Shearson, the company’s brokerage subsidiary. This lasted until 1994, when American Express decided to get out of the brokerage business and spun off Lehman.

The company saw considerable success in the years that followed, as it increased its net revenues more than six-fold, to $19.2 billion. By the end of 2007, it was the fourth largest investment bank in the United States and seemed poised to continue its stellar growth.

But Lehman had become increasingly reliant on the subprime and commercial real estate markets. This went hand-in-hand with a 46 percent increase in its leverage ratio, from 24 to 1 in 2003 to 35 to 1 in 2007. Much of this leverage took the form of short-term debt with maturities as short as a single day. So Lehman had to continuously sweet talk its lenders about the “solid value” of the assets it had pledged as collateral for these “here-today-gone-tomorrow” loans.

The sweet talk was undercut by continued erosion of the housing and mortgage markets during the summer of 2007. Lehman’s common stock price fell 37 percent from June to August, as the firm closed its sub-prime mortgage arm, wrote off $3.5 billion in mortgage related assets, and laid off more than 6,000 employees by year’s end.

Things got even worse in 2008. In January, Lehman closed its wholesale mortgage lending unit and laid off another 1,300 employees in a vain attempt to stem further hemorrhaging from its sub-prime mortgage operations. Then Standard & Poor’s credit rating agency downgraded its outlook on Lehman from “Stable” to “Negative” on the expectation that its revenue would decline by at least another 20 percent, which caused Lehman’s stock price to plunge an additional 48 percent.

Lehman attempted to counter by selling $4 billion in convertible preferred stock, but the fresh cash was quickly soaked up by more write-offs. Rumors flew that other firms were refusing to trade with Lehman.

The company contemplated “taking itself private,” but financing wasn’t available. Lehman’s next move was to try and locate buyers for $30 billion of its commercial mortgages, whose actual market value couldn’t be determined because their trading activity was virtually non-existent. Talks with the Korea Development Bank, China’s CITIC Securities, and the Royal Bank of Canada went nowhere.

The time had come for the federal government to step in if Lehman was to be saved. But public backlash against the earlier Bear Stearns bailout made such a rescue politically untenable, with voices from all sides of the political spectrum screaming at the feds for using taxpayer funds to bail out big Wall Street firms that had caused this mess while refusing to lift a finger to help American families in danger of losing their homes.

On Sept. 15, 2008, Lehman had to file for Chapter 11 bankruptcy, leaving its viable businesses to be snapped up at fire-sale prices by sharp-eyed bottom fishers.

At the time it was the largest Chapter 11 bankruptcy in American history. In retrospect, it’s generally regarded as the most disastrous decision by the feds since the early 1930s, when the Federal Reserve allowed the nation’s money supply to shrink by one-third, which shattered the American economy for the rest of the decade.

Let them eat credit

The Federal Reserve cut interest rates by a quarter of a point on July 31, the first reduction in more than a decade. The 25-basis-point reduction was an effort to stimulate the economy and counteract the escalating tit-for-tat trade war with China, which is seen as impeding global growth.

The Federal Reserve, the world’s most powerful central bank, is again bearing the burden of keeping the economy growing and minimizing financial instability. What’s more, it is pursuing pro-growth policies without any fiscal policy support from elected officials.

Just last month, President Trump reached a bipartisan two-year budget agreement with Democratic leadership in the House of Representatives and Republican leaders in the U.S. Senate that raises discretionary spending caps by $320 billion and suspends the debt ceiling until July 31, 2021. The legislation will add $1.7 trillion to projected debt. It is a really bad idea to assume the future will look after itself. The good news, supposedly, is that the two parties got along well enough to pass the legislation. If you believe that is good news, you can’t be helped.

This budget deal avoids the risk of another partial federal government shutdown and a potentially catastrophic default on the nation’s debt. The Republicans voting for it touted the increase in military spending while the Democrats talked up the additional domestic spending it includes. The federal debt has grown from about $19 trillion in January 2017 to more than $22 trillion now. Fear of debt and its potentially dangerous implications is nowhere to be found in 2019.

But it’s not just the ruling class in Washington that has become addicted to debt; the whole country is waist deep in it. Taken together, all segments of U.S. debt – federal, state, local, corporate, and household – are at 350 percent of the gross domestic product. American household debt continues to climb to record levels, reaching $13.54 trillion in the fourth quarter of 2018, $869 billion above 2008’s $12.68 trillion peak, according to the Federal Reserve Bank of New York.

The Federal Reserve also claims to be tweaking the benchmark federal funds rate because it is worried that inflation is running below its target of 2 percent. According to the Fed, prices rose just 1.6 percent in the year through June, not counting volatile food and fuel prices.

Inflation as defined and measured by the Fed may be running pretty low right now, but bear in mind that the typical family’s living costs may be nothing like the official stats. It is also fair to say that Americans want a bigger paycheck, not higher prices resulting from a 2 percent inflation target. On a daily basis they experience the Dickensian nightmare of the accumulated high cost of several decades of low wages.

Don’t forget that even modest inflation for a prolonged period can seriously erode purchasing power. For instance, inflation averaged 2.46 percent annually between 1990 and 2018. Sounds low, but you would need just about $2,000 today to buy what $1,000 would have bought in 1990. You don’t have to be a socialist or an economist to understand that despite the strong labor market, today’s wages provide about the same purchasing power as years ago – if you are lucky.
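The compounding arithmetic behind that $1,000-to-$2,000 figure is easy to verify:

```python
def price_level_multiple(annual_inflation, years):
    """How many of today's dollars it takes to match one dollar's
    purchasing power after `years` of compound inflation."""
    return (1 + annual_inflation) ** years

# 2.46% average annual inflation over the 28 years from 1990 to 2018:
multiple = price_level_multiple(0.0246, 28)
# multiple ≈ 1.97, so roughly $1,970 today buys what $1,000 bought in 1990.
```

Even a "low" rate, left to compound for a generation, roughly halves the purchasing power of a dollar — which is the quiet squeeze on anyone whose wages fail to keep pace.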

To compensate, households turn to debt. The average American now has about $38,000 in personal debt, excluding home mortgages. The average household carries about $137,000 in debt and the median household income is about $59,000. So when the cost of living rises faster than household income, more Americans use credit to cover basic needs like food, clothing, and medical expenses. When wage growth does not keep up with the cost of living, government promotes cheap credit to grease the economic wheel, especially in an on-demand society that values the immediate over the remote.

Put differently, U.S. economic policy has for decades been, to paraphrase the misquoted Marie Antoinette, “Let them eat credit.”

Demography Is Destiny

The world is undergoing a dramatic transition due to the confluence of disruptive forces such as accelerating technological change and globalization. But another important factor that often gets overlooked will shape society and the global economy over the coming decades: The life expectancy of humans is increasing. Fertility rates are falling, and the world’s population is growing gray.

This unprecedented demographic shift has major implications for U.S. fiscal policy. Entitlement programs will be increasingly strapped as the number of beneficiaries increases and the number of working people who pay for the benefits shrinks.

Due to advances in medical science and technology, people – especially the well to do – expect to live longer, better lives than they might have imagined even three decades ago. According to the Census Bureau, the average American born today can expect to live to about 80, up dramatically from the average of 68 in 1950.

Additionally, the Census Bureau notes that whereas the average American woman in 1950 had 3.5 children during her lifetime, the figure today has fallen below two. The causes of declining fertility include the rising social status of women, widespread availability of birth control, and the growing cost of raising children.

French sociologist and philosopher Auguste Comte is credited with coining the aphorism “demography is destiny” almost 200 years ago. But demography does not make destiny immutable or inevitable. Just as aging individuals must adjust their lifestyles to maintain personal vitality, societies with aging populations must adjust policies to preserve and promote their economic prosperity.

The shift from a predominantly young to a predominantly older population has both broad macroeconomic implications and important financial consequences. Consider that many U.S. entitlement programs were created on the assumption that there would be a relatively small group of old people and a large number of working-age people, followed by an even bigger cohort of children.

According to the Census Bureau, 47.8 million Americans are 65 and over. This figure is projected to nearly double to 83.7 million by 2050. Just 10 years ago, 12.5 percent of the population was 65 and over. Today, it is 15 percent, and is projected to reach 21 percent in just 20 years. By 2030, one in every five U.S. residents will be over 65. For decades this was the age when people were expected to end their careers and embrace a life of leisure, following Andrew Carnegie’s advice to spend the first third of life getting educated, the second third getting rich, and the last third giving money away.

As the baby boomer generation retires, fertility rates keep falling, and life expectancy continues to increase, there will be too many beneficiaries and too few taxpayers. In 1950, the American economy had 8.1 people of working age for each person of retirement age. Recent figures indicate that this “dependency ratio,” as demographers call it, has shrunk to just over 5-to-1. By 2030, the Census folks estimate it will have fallen to 3-to-1.

Caring for large numbers of elderly people will put severe pressure on government finances. More specifically and painfully, the U.S. may be facing major tax increases, significant budget cuts, or most likely some combination of the two to secure the future stability of old age entitlement programs. In particular, spending on Social Security and on Medicare, which provides health insurance to the aged, will rise as a share of gross domestic product as baby boomers retire.

With the retirement of baby boomers and the rising number of elderly in the population, the nation will face a slow-motion train wreck absent changes in government fiscal policy. The good news is the slow motion part, which gives Americans enough time to take on the challenge of real entitlement reforms that will allow the country to successfully navigate this demographic transition.

Deficits and Debt

As chairman of the Joint Chiefs of Staff from 2007 through 2011, Admiral Michael Mullen repeatedly warned that the greatest threat to U.S. national security was budget deficits. He pointed out that interest on the debt would soon rival the defense budget and jeopardize the nation’s ability to properly resource the military.

In economic terms, the national debt – the sum total of annual budget deficits – now exceeds $22 trillion. The nonpartisan Congressional Budget Office (CBO) projects a deficit of $896 billion for 2019, about a 15 percent increase over the $779 billion deficit in 2018. The CBO predicts deficits will keep rising in the next few years, topping $1 trillion in 2020 and never dropping beneath that amount through 2029.

Federal debt held by the public is projected to grow from 78 percent of gross domestic product in 2019 to 92 percent in 2029 and 144 percent in 2049, which would be the most in American history. The prospect of such large deficits and debt poses substantial risks, sayeth the CBO.

In case you missed it, neither Democrats nor Republicans seem to care much. Putting the federal government’s fiscal house in order currently commands the attention of few national politicos. They behave like Scarlett O’Hara in “Gone with the Wind,” who reacted to adverse circumstances by saying, “I can’t think of this now… I’ll think about it tomorrow.”

Democratic presidential candidates have presented plans such as Medicare for All, Free College, and the Green Leap Forward. They advocate increasing taxes on the rich to address wealth and income inequality, social problems and any number of other things, but not to reduce deficits and debt. They don’t appear to be worried by deficits and accumulating debt, and seem to think a magic money tree will fund their spending initiatives.

Republicans also usher the idea of taming deficits out of the room rather quickly, accepting bigger deficits in exchange for tax cuts they argue will promote economic growth and fill budget shortfalls over the long term. The theory is that the debt is manageable so long as the economy grows at a faster pace than the feds’ borrowing costs.

President Trump campaigned in 2016 on eliminating the then-$19 trillion national debt in eight years, but the White House spending plan for the next decade calls for adding another $10.5 trillion to the $22 trillion federal debt – and that assumes continued economic growth.

Doing nothing about government red ink shifts the burden to future generations, violating the principle that it is wrong for the current generation to enjoy the benefits of government spending without paying for them.

The CBO estimates the federal government will spend more on servicing outstanding debt in 2020 than on Medicaid, and more than on national defense by 2025. Many Democrats and Republicans deny this is a problem, arguing that the U.S. can simply borrow more to fund unrestrained spending. They appear unconcerned that the government’s debt payments may crowd out a good portion of the spending they want.

The Treasury Department’s Office of Debt Management forecasts that starting in 2024, all new U.S. debt issuance will be used to fund the government’s net interest expense, which will run anywhere between $700 billion and $1.2 trillion or more. If that happens, the U.S. will be engaging in the ultimate Ponzi scheme: issuing new debt solely to pay interest on existing debt.

Out of control spending will haunt the taxpayers for years to come. Obviously, there is no political gain in being a good fiscal steward.

Nota bene what Edmund Burke wrote in Reflections on the Revolution in France in 1790: “Society is indeed a contract. It is a partnership… not only between those who are living, but between those who are dead, and those who are to be born.”