The Debt Dilemma

The U.S. debt is more than $23 trillion, by far the largest in the world. During the fiscal year that ended on Sept. 30, 2019, Uncle Sam laid out nearly $4.4 trillion while taking in just $3.5 trillion in revenue. That adds up to a $984 billion deficit, 26 percent higher than the year before and equal to 4.6 percent of the country’s gross domestic product. In the first two months of the current fiscal year (October and November), the feds ran a $343 billion deficit.
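
Those figures hang together arithmetically. Here is a quick back-of-the-envelope check (a minimal sketch; the roughly $21.4 trillion GDP figure is an assumption inferred from the column’s percentage, not stated in it):

```python
# Back-of-the-envelope check of the column's FY2019 figures (in trillions).
outlays = 4.4    # federal spending, "nearly $4.4 trillion"
revenue = 3.5    # federal receipts
deficit = 0.984  # the column's precise deficit figure
gdp = 21.4       # assumed GDP; the column gives only the 4.6 percent ratio

print(f"Outlays minus revenue: ~${outlays - revenue:.1f} trillion")  # ~$0.9T, i.e., $984B
print(f"Deficit as a share of GDP: {deficit / gdp:.1%}")             # ~4.6%, as the column says
```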

If not corrected, the fallout from exploding debt will be felt for generations.

The largest share of the federal budget goes toward entitlement programs such as Social Security, Medicare, and Medicaid, which together account for about 47 percent of all spending. Those costs are expected to increase as the population ages and health care spending rises with it.

Unlike discretionary spending, which Congress must appropriate each year, entitlement spending occurs automatically unless Congress alters the underlying legislation. In the past fiscal year, only 31 percent of federal spending went toward discretionary programs, with defense spending taking up roughly half of that.

Going forward, the nation is looking at trillion-dollar deficits. According to the Congressional Budget Office, federal budget deficits are projected to average $1.2 trillion annually over the next decade, thanks to recent tax cuts and spending increases along with continued growth in entitlement programs.

Federal Reserve Chair Jerome Powell recently testified that the country’s current fiscal situation is unsustainable, noting that high and rising debt threatens to slow economic growth and increase federal interest payments, leaving the country vulnerable when the next recession occurs and putting an undue burden on future generations. The national debt is growing faster than the economy, and its rapid growth threatens the nation’s economic health.

So far, none of the leading presidential candidates, including the incumbent, has seriously engaged the debt issue. Stoned on their own virtues, invincible in the belief that they are right, the candidates make incandescent, seductive promises, promises, and more promises to a laity looking for a free lunch and shortcuts to economic growth. These promises, such as Medicare for All, are to be financed by cooking up a raiding party on the 1 percent, on corporate America, and on an Amazon forest of magical money trees, postponing the day the ledger has to be squared.

The reason the debt issue and its attendant risks are studiously avoided is obvious enough: politics trumps (if we can use that word) economics. There is no incentive to put principle before career ambitions. Better to live in the eternal present and just spend, spend, spend. The public is okay with that. The fault, with apologies to Shakespeare, is in ourselves.

There are a number of risks attendant to the rising debt. For example, if interest rates rise, servicing the federal debt will consume resources that could be spent on infrastructure, education, and research. As Chairman Powell noted, increased federal interest payments could leave the country less prepared for the next recession and put undue burdens on future generations.

Correcting the debt trajectory will require politically difficult decisions. Basic economics suggests three options for balancing the books: cut benefits, raise taxes, or do both. There are no easy answers. Whether you seek to increase taxes or cut spending, you are likely to face headwinds. Benefits once given are hard to cut, much less freeze.

The path of least resistance is just to print money. Simply monetize the debt, make asset prices rise and ignore the consequences of currency depreciation. Sound familiar?

One thing is certain: the country needs to get debt under control before the fiscal house of cards collapses and we find ourselves in another financial crisis. It would be wise to recall what Herbert Stein, an economist and member of the Council of Economic Advisers under Presidents Nixon, Ford, and Reagan, wrote: “If something cannot go on forever, it will stop.”

Poker And Risk Management

Gambling – the willingness to take actions whose outcomes cannot be known for certain – is a basic human instinct. The riskier you perceive a particular action to be, the higher its potential payoff should be to justify your taking the action.

As it happens, the risk inherent in many actions can be roughly quantified. You can rank actions by their estimated riskiness, compare them to each other and to their potential payoffs, and make intelligent judgments about which (if any) actions to take.

This is known as managing risk. And the widespread failure to manage risk sensibly was a major reason why the financial industry melted down so catastrophically in the fall of 2008. To their peril, Wall Street firms relied on oversimplified models for managing their risk.

Many people insist that financial markets are simply a large collection of gambling casinos that offer investors a variety of “games” to bet on. This is almost right – but the “almost” is significant.

When you walk into a casino, you face an immediate choice. Are you going to play slot machines and table games like roulette and craps, or seek out the poker rooms?

If you choose slots or table games, you are likely to lose because you are playing against the house. The payoffs of these games are structured (with the blessings of state gaming commissions) to give the house an edge that assures you will lose in the long run. This is unlike the situation in the financial industry.

But if you choose the poker rooms, you have a chance of winning, because you are playing against other gamblers like yourself. The house simply hosts the games (i.e., provides the space, tables and chairs, decks of cards, professional dealers, and so on) and takes a modest cut of the pot for doing so. This is a lot more like the situation in the financial industry.

You can sit down at a table and become a “player” (which is like being a “professional investor” in the financial industry). But you have another option.

You can engage in side betting. People who visit poker rooms simply to watch the games can place side bets among themselves about the winner of the next hand. Since these spectators cannot influence the hand’s outcome, their betting decisions simply reflect their estimates of the raw probabilities.

But there are ways to refine your initial estimates of the win/lose probabilities.

One way is simply to watch half a dozen or so hands, see which player or two seem to be dominating, and then make a subjective judgment about those players’ probability of winning. Another is to look at the chip stacks in front of each player. If one player’s stack is twice as large as anybody else’s, it may be evidence of that player’s superior poker skills.

But suppose you recognize at the outset that one of the players is Jennifer Harman or some other highly regarded poker maven who tends to win a significant percentage of the hands they play. You reflect this by assigning the maven a higher win probability. You place most of your bets on the maven winning, possibly adjusting the size of each bet based on how well the maven is doing as the game progresses and what kind of payoff odds you’re getting from the other spectators.
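
That intuition is easy to formalize. The sketch below (an illustration, not something the column prescribes) computes the expected value of a side bet from your subjective win probability and the payoff odds on offer, and sizes the bet using the Kelly criterion, a standard bankroll-management formula the column doesn’t mention:

```python
def expected_value(p_win: float, odds: float) -> float:
    """Expected profit per $1 staked; odds = 2.0 means a $2 win per $1 bet."""
    return p_win * odds - (1 - p_win)

def kelly_fraction(p_win: float, odds: float) -> float:
    """Kelly criterion: the bankroll fraction to stake (zero if the bet is unfavorable)."""
    return max(p_win - (1 - p_win) / odds, 0.0)

# Suppose watching a few hands convinces you the maven wins 40 percent of the
# time, while the other spectators offer 2-to-1 odds (implying roughly 33 percent).
p, b = 0.40, 2.0
print(f"EV per $1 staked: {expected_value(p, b):+.2f}")        # +0.20, a favorable bet
print(f"Kelly stake: {kelly_fraction(p, b):.0%} of bankroll")  # 10%
```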

An important point stands out about this poker example: You have a relatively large number of variables to keep track of, and their interrelationships and relative impacts are constantly changing.

This is especially true in financial markets. During the years leading up to 2008, many firms bowed to the temptation to oversimplify their models. Many of those models turned out to be less than worthless when the proverbial expletive hit the fan, blowing up the world’s financial system or at least lighting the fuse.

In Praise Of Negative Interest Rates?

Negative interest rates are widely discussed these days as a monetary policy tool to support economic growth. President Donald Trump is a huge fan of low or negative rates and has been browbeating Federal Reserve Chairman Jerome Powell, whom he appointed in 2017, to cut interest rates to zero or even lower. Powell and his colleagues should think long and hard before capitulating.

The apparent goal is to keep the economy percolating until after next year’s presidential election. The Fed appeased the President last month with a modest quarter-point cut and, egged on by Trump, seems poised to lower rates further this year.

Negative interest rates have become commonplace in Europe and Japan. The central banks of Denmark, Switzerland, Sweden, and Japan, along with the European Central Bank, have slashed rates below zero to shore up weak economies or hold down their currencies. The notion is that weakening a country’s currency makes it a less attractive investment than other currencies, giving the country’s exports a competitive advantage. Worldwide, there is more than $17 trillion in debt with negative yields, almost half of it in euros and most of the balance in Japanese yen. Almost all of it is sovereign debt.

Central banks usually pay commercial banks interest on the reserves they keep at the central bank. Under a negative rate policy, the commercial institutions are required to pay interest on any surplus cash beyond what regulators say banks must keep on hand. This penalty is designed to incentivize commercial banks to lend more money. The view is that low or negative interest rates encourage businesses to invest and consumers to spend rather than pay a fee to keep their money safe. Loans put money into circulation and generate economic activity.

Lower or negative interest rates present both costs and benefits for consumers.

Imagine going to the bank for a loan and being told the bank will pay you for taking it. Who in their right mind rejects such an offer? Conversely, under a negative interest rate scenario, when you make a deposit you are actually paying the bank to hold your money.
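
The arithmetic for a saver is simple and sobering. A minimal sketch (the minus 0.5 percent rate is an illustrative assumption, roughly in line with the European Central Bank’s deposit rate, not a U.S. figure):

```python
# A $10,000 deposit compounding at an assumed -0.5% annual rate.
deposit = 10_000.0
rate = -0.005

for year in range(1, 6):
    deposit *= 1 + rate
    print(f"Year {year}: ${deposit:,.2f}")
# After five years the balance is about $9,752: the saver has paid the
# bank roughly $248 for the privilege of keeping the money there.
```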

A big concern, which has yet to be addressed, is the impact of negative interest rates on money market funds, which are a foundational investment for many households. Negative interest rates reward borrowers at the expense of lenders and savers. The goal is to pull future consumption into the present.

One potential danger of this approach is the liquidity trap that occurs when interest rates are so low that they reduce the flow of money to the Main Street economy. Instead, it goes into investments that don’t generate economic activity, such as the stock market, as people desperately chase higher yields and push up stock prices.

Interest rate cuts tend to stimulate the stock market by making real returns on bonds less competitive. The President seems to think that makes for good economic policy. Negative interest rates might actually lead to lower interest costs on government debt. Debt service is one of the fastest growing drivers of federal spending.

Low interest rates are old hat. Even during the Obama administration, when the economy rarely topped 2 percent annual growth, business did not pick up when money was cheap. For the last decade, the low interest rate scenario has been a secret tax on savers, who are not generally speculators in the stock market.

Millions of Americans are either behind in the race to save for retirement or living off their interest income. They may spend less in a negative interest rate environment, which would reduce economic activity.

How using this unconventional monetary policy will work in the United States is a mystery. It could leave the Fed without any ammunition when an actual recession hits and could increase the likelihood that the President is reelected. One can only hope that Powell and company make the right economic call.

Stakeholder Capitalism. Really?

A recent announcement by nearly 200 CEOs that corporations should serve more than the bottom line may be great public relations, but don’t hold your breath waiting for big changes in the way corporate America operates.

For four decades the popular conception has been that a corporation exists to maximize returns to shareholders. This conceit is the work of economists. Milton Friedman, who was awarded the Nobel Prize in Economics in 1976, made the case in a famous 1970 New York Times Magazine article that the social responsibility of business is to increase profits. The article laid the intellectual foundations for the shareholder value revolution of the 1980s.

As he put it, “there is one and only one social responsibility of business – to use its resources and engage in activities designed to increase its profits.” His former students popularized the idea that the great challenge of corporate governance is getting executives (agents) to act in the interest of the shareholders (principals).

This view caught on and became conventional wisdom, as universally accepted as the idea that the sun revolves around the earth once was. Over time the U.S. stock market has focused strongly on corporations’ quarterly earnings to the point that a penny up or down from expected earnings per share can cause the stock price to fluctuate.

This has created a number of potential problems. For starters, the short-term focus of the stock market dictates a short-term approach by management, at the expense of long-term shareholder value. For example, management might decide to shower cash on shareholders rather than invest in research and development projects that would only pay off down the road. Market pressures can also tempt managers to cheat or manage earnings to meet investor expectations, especially since the compensation of CEOs and other executives is linked to stock performance.

This August, nearly 200 chief executives of major American corporations including Apple, Amazon, General Motors, and Walmart – all members of the powerful U.S. Business Roundtable – announced that corporations should no longer just maximize profits for shareholders but also benefit other stakeholders including employees, customers, and citizens.

Is this all just rosy rhetoric or a real change in mission? Will these corporations, who are people, really nice people, now use house money to support expansive social goals that are irrelevant to maximizing shareholder returns? This rhetoric about the purpose of a corporation won’t rise even to the level of the inconsequential unless executives address some basic questions.

Will they argue for changes in how they are paid, how corporations are taxed and regulated and focus instead on the long-term health of their companies? What metrics will these executives use to measure stakeholder returns? How will corporations pivot away from the needs of activist short-term investors? How will they balance the needs of multiple stakeholders to create value for all these shared interests? How will executives resolve stakeholder conflicts? What tradeoffs have to be made? There are more unasked and unanswered questions than positions in the Kama Sutra.

New York Times Columnist Farhad Manjoo believes the new mission statement is all foam and no beer. He cynically says these CEOs want you to know how much they care, but they will continue to eat your lunch while virtue signaling. Many others are quite skeptical that corporations will change the way they behave.

Former General Electric CEO Jack Welch said in a 2009 interview with the Financial Times that: “On the face of it, shareholder value is the dumbest idea in the world.” This comment may be the height of irony given that when he ran GE, the firm consistently met or beat analysts’ quarterly earnings forecasts.

One thing is for certain. The time is long overdue to shift the focus of corporations away from maximizing shareholder value and stock-based executive compensation. But don’t hold your breath. This is like asking business executives to perform surgery on themselves.

The Merrill Lynch story

The weekend of Sept. 13 and 14, 2008 was one of the worst ever on Wall Street. And when Lehman Brothers went bankrupt on Sept. 15, it triggered a global financial panic.

Also over that weekend, Bank of America and Merrill Lynch hammered out one of the biggest deals in Wall Street history in less than 36 hours. The feds pushed for a deal to prevent Merrill from becoming the next domino to fall. With Lehman preparing to file for bankruptcy after failing to find a buyer, executives at both Bank of America and Merrill knew they needed to act quickly as Merrill’s liquidity was evaporating.

Merrill Lynch was founded in 1914 by Charles Merrill and his friend Edmund Lynch. During the next 30 years, it grew by a series of mergers and acquisitions into the nation’s largest and best-known retail brokerage firm. Just as Lehman Brothers had epitomized the “aristocratic German-Jewish culture” in the financial industry, Merrill Lynch became a symbol of “working-class Irish Catholic culture” (like New York City’s police and fire departments). Not that it mattered much when push came to shove in September 2008.

In 1971, Merrill Lynch became a publicly traded corporation. And in 1978, it acquired the small but prestigious investment bank White Weld & Company to expand its underwriting activities and take advantage of the ability of its huge retail brokerage arm to place new common stock issues with investors directly rather than through syndicates composed of other firms.

But by 2000, Merrill (like Lehman and Bear Stearns) was becoming increasingly dependent on its collateralized mortgage obligations business to grow profits. It goosed this growth by more than doubling its leverage ratio, from 19 to 1 in 2003 to 39 to 1 in 2007, which enabled it to deliver a 13 percent increase in investment returns to its common stockholders over this period.
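
Leverage amplifies returns on equity mechanically, which is what made the strategy so tempting, and so dangerous. A simplified sketch (the 1 percent return on assets is an illustrative assumption and funding costs are ignored; these are not Merrill’s actual figures):

```python
def roe(roa: float, leverage: float) -> float:
    """Return on equity, simplified to ROE = ROA x leverage (funding costs ignored)."""
    return roa * leverage

roa = 0.01  # assume the asset portfolio earns 1% in a good year
for lev in (19, 39):
    print(f"{lev}-to-1 leverage: ROE = {roe(roa, lev):.0%}")  # 19% vs. 39%

# The multiplier cuts both ways: at 39-to-1, a fall in asset values of
# just 1/39, about 2.6%, is enough to wipe out the firm's entire equity.
print(f"Loss that erases equity at 39-to-1: {1 / 39:.1%}")
```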

By 2006, Merrill had leaped to the top spot in the nation’s collateralized mortgage obligations business, underwriting $35 billion in these securities, 40 percent of which were backed by sub-prime mortgages. To help secure its position, Merrill spent $1.3 billion to acquire First Franklin, one of the nation’s largest originators of sub-prime residential mortgages. This gave it a major in-house mortgage originator and reduced its dependence on buying mortgages from numerous banks and home loan firms to back new underwritings of collateralized mortgage obligations.

Concerns about Merrill’s viability increased during the summer of 2007, when two Bear Stearns hedge funds defaulted. As a short-term lender to these funds, Merrill seized $800 million of Bear’s mortgage assets and proceeded to auction them off in the secondary markets. But the auctions failed to generate reasonable bids for the sub-prime mortgages and highlighted Merrill’s exposure to these “toxic waste securities.” For the last quarter of 2007 and the first three quarters of 2008 combined, Merrill wrote down more than $46 billion on bad bets on real estate and other mortgage-related instruments.

These write-downs had severe consequences for Merrill: the firm’s stock price fell significantly, Moody’s Investors Service placed Merrill’s long-term debt “on review for a possible downgrade,” traders at other firms lost confidence in Merrill’s ability to meet its trading obligations, and the firm had to increase its equity capital by selling off assets such as its 20 percent stake in Bloomberg, which brought in a much-needed $4.4 billion.

Additionally, between May 2007 and September 2008, Merrill laid off over 7 percent of its employees. Its board ousted CEO Stan O’Neal in October 2007, though he retained $30 million in retirement benefits and $129 million in stocks and options.

Merrill’s continued write-downs of toxic mortgage assets, increasing operating losses, and difficulty refinancing its short-term borrowings made it clear that its days as an independent firm were numbered. On Sept. 14, 2008, Merrill agreed to sell itself to Bank of America.

Financial markets are prone to instability. But when paired with excessive financial leverage, the result can be severe economic pain.

The Lehman Brothers story

Next month is the 11th anniversary of the fall of the famed investment banking firm Lehman Brothers (“Lehman”), which froze up the nation’s credit system when it collapsed on Sept. 15, 2008.

The firm was founded as a dry goods business in 1850 in Alabama by brothers Henry, Emanuel, and Mayer Lehman. It began focusing on cotton trading and moved to New York during the late 1850s. That office eventually became its headquarters.

By 1900, Lehman had begun moving into underwriting new issues of common stock for corporate clients, as well as bond trading and financial advisory services. During the ensuing decades, it underwrote issues for corporations like Sears Roebuck, RCA, and Macy’s. In 1984, Lehman was acquired by American Express and merged with Shearson, the company’s brokerage subsidiary. This lasted until 1994, when American Express decided to get out of the brokerage business and spun off Lehman.

The company saw considerable success in the years that followed, as it increased its net revenues more than six-fold, to $19.2 billion. By the end of 2007, it was the fourth largest investment bank in the United States and seemed poised to continue its stellar growth.

But Lehman had become increasingly reliant on the subprime and commercial real estate markets. This went hand in hand with a 46 percent increase in its leverage ratio, from 24 to 1 in 2003 to 35 to 1 in 2007. Much of this leverage took the form of short-term debt with maturities as short as a single day, so Lehman had to continuously sweet-talk its lenders about the “solid value” of the assets it had pledged as collateral for these here-today-gone-tomorrow loans.

The sweet talk was undercut by continued erosion of the housing and mortgage markets during the summer of 2007. Lehman’s common stock price fell 37 percent from June to August, as the firm closed its sub-prime mortgage arm, wrote off $3.5 billion in mortgage related assets, and laid off more than 6,000 employees by year’s end.

Things got even worse in 2008. In January, Lehman closed its wholesale mortgage lending unit and laid off another 1,300 employees in a vain attempt to stem further hemorrhaging from its sub-prime mortgage operations. Then Standard & Poor’s credit rating agency downgraded its outlook on Lehman from “Stable” to “Negative” on the expectation that its revenue would decline by at least another 20 percent, which caused Lehman’s stock price to plunge an additional 48 percent.

Lehman attempted to counter by selling $4 billion in convertible preferred stock, but the fresh cash was quickly soaked up by more write-offs. Rumors flew that other firms were refusing to trade with Lehman.

The company contemplated taking itself private, but financing wasn’t available. Lehman’s next move was to try to locate buyers for $30 billion of its commercial mortgages, whose actual market value couldn’t be determined because trading in them was virtually non-existent. Talks with the Korea Development Bank, China’s CITIC Securities, and the Royal Bank of Canada went nowhere.

The time had come for the federal government to step in if Lehman was to be saved. But public backlash against the earlier Bear Stearns bailout made such a rescue politically untenable. Voices from all sides of the political spectrum were screaming at the feds for using taxpayer funds to bail out big Wall Street firms that had caused the mess while refusing to lift a finger to help American families in danger of losing their homes.

On Sept. 15, 2008, Lehman had to file for Chapter 11 bankruptcy, leaving its viable businesses to be snapped up at fire-sale prices by sharp-eyed bottom fishers.

At the time it was the largest Chapter 11 bankruptcy in American history. In retrospect, it’s generally regarded as the most disastrous decision by the feds since the early 1930s, when the Federal Reserve chose to shrink the nation’s money supply by one-third, which shattered the American economy for the rest of the decade.

Let them eat credit

The Federal Reserve cut interest rates by a quarter of a point on July 31, the first reduction in more than a decade. The 25-basis-point cut was seen as an effort to stimulate the economy and counteract the escalating tit-for-tat trade war with China, which is widely seen as impeding global growth.

The Federal Reserve, the world’s most powerful central bank, is again bearing the burden of keeping the economy growing and minimizing financial instability. What’s more, it is pursuing pro-growth policies without any fiscal policy support from elected officials.

Just last month, President Trump reached a bipartisan two-year budget agreement with Democratic leadership in the House of Representatives and Republican leaders in the U.S. Senate that raises discretionary spending caps by $320 billion and suspends the debt ceiling until July 31, 2021. The legislation will add $1.7 trillion to projected debt. It is a really bad idea to assume the future will look after itself. The good news, supposedly, is that the two sides got on well together to pass the legislation. If you believe that, you can’t be helped.

This budget deal avoids the risk of another partial federal government shutdown and a potentially catastrophic default on the nation’s debt. The Republicans voting for it touted the increase in military spending while the Democrats talked up the additional domestic spending it includes. The federal debt has grown from about $19 trillion in January 2017 to more than $22 trillion now. Fear of debt and its potentially dangerous implications are nowhere to be found in 2019.

But it’s not just the ruling class in Washington that has become addicted to debt; the whole country is waist-deep in it. Taken together, all segments of U.S. debt – federal, state, local, corporate, and household – amount to 350 percent of gross domestic product. American household debt continues to climb to record levels, reaching $13.54 trillion in the fourth quarter of 2018, $869 billion above 2008’s $12.68 trillion peak, according to the Federal Reserve Bank of New York.

The Federal Reserve also claims to be tweaking the benchmark federal funds rate because it is worried that inflation is running below its target of 2 percent. According to the Fed, prices rose just 1.6 percent in the year through June, not counting volatile food and fuel prices.

Inflation as defined and measured by the Fed may be running pretty low right now, but bear in mind that the typical family’s living costs may look nothing like the official stats. It is also fair to say that Americans want a bigger paycheck, not higher prices resulting from a 2 percent inflation target. Every day, they experience the Dickensian nightmare of the accumulated high cost of several decades of low wages.

Don’t forget that even modest inflation for a prolonged period can seriously erode purchasing power. For instance, inflation averaged 2.46 percent annually between 1990 and 2018. Sounds low, but you would need just about $2,000 today to buy what $1,000 would have bought in 1990. You don’t have to be a socialist or an economist to understand that despite the strong labor market, today’s wages provide about the same purchasing power as years ago – if you are lucky.
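
The compounding behind that claim is easy to verify with the column’s own figures:

```python
# How 2.46% average annual inflation compounds from 1990 to 2018.
rate = 0.0246
years = 2018 - 1990  # 28 years

cost_today = 1_000 * (1 + rate) ** years
print(f"${cost_today:,.0f}")  # ~$1,975: you need roughly $2,000 today
                              # to buy what $1,000 bought in 1990
```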

To compensate, households turn to debt. The average American now has about $38,000 in personal debt, excluding home mortgages. The average household carries about $137,000 in debt and the median household income is about $59,000. So when the cost of living rises faster than household income, more Americans use credit to cover basic needs like food, clothing, and medical expenses. When wage growth does not keep up with the cost of living, government promotes cheap credit to grease the economic wheel, especially in an on-demand society that values the immediate over the remote.

Put differently, U.S. economic policy has for decades been, to paraphrase the misquoted Marie Antoinette, “Let them eat credit.”

Deficits And Debt

As chairman of the Joint Chiefs of Staff from 2007 through 2011, Admiral Michael Mullen repeatedly warned that the greatest threat to U.S. national security was budget deficits. He pointed out that interest on the debt would soon nearly equal the defense budget and jeopardize the ability to properly resource the military.

In economic terms, the national debt – the sum total of annual budget deficits – now exceeds $22 trillion. The nonpartisan Congressional Budget Office (CBO) projects a deficit of $896 billion for 2019, about a 15 percent increase over the $779 billion deficit in 2018. The CBO predicts deficits will keep rising in the next few years, topping $1 trillion in 2020 and never dropping beneath that amount through 2029.

Federal debt held by the public is projected to grow from 78 percent of gross domestic product in 2019 to 92 percent in 2029 and 144 percent in 2049, which would be the most in American history. The prospect of such large deficits and debt poses substantial risks, sayeth the CBO.

In case you missed it, neither Democrats nor Republicans seem to care much. Putting the federal government’s fiscal house in order currently commands the attention of few national politicos. They behave like Scarlett O’Hara in “Gone with the Wind,” who reacted to adverse circumstances by saying, “I can’t think of this now… I’ll think about it tomorrow.”

Democratic presidential candidates have presented plans such as Medicare for All, Free College, and the Green Leap Forward. They advocate increasing taxes on the rich to address wealth and income inequality, social problems and any number of other things, but not to reduce deficits and debt. They don’t appear to be worried by deficits and accumulating debt, and seem to think a magic money tree will fund their spending initiatives.

Republicans also usher the idea of taming deficits out of the room rather quickly, accepting bigger deficits in exchange for tax cuts they argue will promote economic growth and fill budget shortfalls over the long term. The theory is that the debt is manageable so long as the economy grows at a faster pace than the feds’ borrowing costs.

President Trump campaigned in 2016 on eliminating the then-$19 trillion national debt in eight years, but the White House spending plan for the next decade calls for adding another $10.5 trillion to the $22 trillion federal debt – and that assumes continued economic growth.

Doing nothing about government red ink shifts the burden to future generations. The underlying principle is that it is wrong for the current generation to enjoy the benefits of government spending without paying for them.

The CBO estimates the federal government will spend more on servicing outstanding debt in 2020 than on Medicaid, and more than on national defense by 2025. Many Democrats and Republicans deny this is a problem, arguing that the U.S. can simply borrow more to fund unrestrained spending. They appear unconcerned that the government’s debt payments may crowd out a good portion of the spending they want.

The Treasury Department’s Office of Debt Management forecasts that starting in 2024, all new U.S. debt issuance will go to funding the government’s net interest expense, which by then will be anywhere between $700 billion and $1.2 trillion or more. If that happens, the U.S. will be engaging in the ultimate Ponzi scheme, issuing new debt exclusively to pay the interest on the old.
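
To see why that dynamic snowballs, consider a stripped-down simulation in which every dollar of new borrowing goes to interest (the $22 trillion starting debt comes from the column; the 4 percent average interest rate is an illustrative assumption):

```python
# Debt growth when new issuance is used only to pay interest on old debt.
debt = 22.0   # trillions, the column's figure for the current federal debt
rate = 0.04   # assumed average interest rate on the debt

for year in range(1, 11):
    interest = debt * rate
    debt += interest  # new bonds are sold just to cover the interest bill
    print(f"Year {year:2d}: interest ${interest:.2f}T, debt ${debt:.2f}T")
# With nothing ever repaid, the debt compounds at the interest rate itself,
# growing about 48 percent, to roughly $32.6 trillion, in a decade.
```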

Out of control spending will haunt the taxpayers for years to come. Obviously, there is no political gain in being a good fiscal steward.

Nota bene what Edmund Burke wrote in Reflections on the Revolution in France in 1790: “Society is indeed a contract. It is a partnership… not only between those who are living, but between those who are dead, and those who are to be born.”

Labor Unions And Inequality

In the wake of the Great Recession, economic inequality – the extent to which income and wealth are distributed unevenly across a population – has emerged as a major issue in the United States.

Since the late 1970s, there has been enormous change in the distribution of income and wealth in the U.S. The gap between the “haves” and the “have-nots” has widened, with a small portion of the population reaping an increasingly larger share of the country’s economic rewards. Warren Buffett got it right when he said, “There’s been class warfare going on for the last 20 years and my class has won.”

The average American has lost. Since the mid-1970s, wages have remained stagnant and middle-class earnings have lagged the cost of living.

A number of factors contribute to economic inequality, to downward mobility among working-class Americans, and to the dangerous fissures they have opened in American society. These include government tax and regulatory policies, the acceleration of finance capitalism, culture, immigration, globalization, and the rate of technological change.

Frequently overlooked is the declining strength of private-sector labor unions. In 1979, unions represented 24 percent of the private sector labor force; today only 6.5 percent of private-sector workers are unionized.

The effects of this decline are fiercely debated. Conservatives argue that labor unions decrease competitiveness and business profitability. Progressives say that in an era of globalization, companies threaten to ship jobs to factories offshore to extract concessions from unions with impunity. For sure, unions raise wages, but that doesn’t necessarily mean they reduce profitability or diminish competitiveness. Consider the success of unionized firms such as Southwest Airlines and UPS.

American manufacturing and wages suffered as U.S. companies engaged in extensive offshore outsourcing of decent-paying domestic jobs to China and other low-wage countries under the banner of free trade, prioritizing short-term profits over long-term investments and the public interest. For example, from 2000 to 2016, the U.S. shed five million manufacturing jobs, a fact that supporters of free trade and globalization rarely mention.

The loss of traditional manufacturing jobs has contributed to income inequality and declining union membership. According to a report by the Washington-based think tank the Economic Policy Institute, if unions had the same presence in the private sector today as in 1979, both union members and non-members would be making about $2,500 more each year.

Many companies have built their business models around offshoring manufacturing to reduce costs without passing the savings on to consumers. They view the wages and benefits that once underpinned a middle-class lifestyle as obscenely excessive. That’s why they support free trade and use their political power to garner the support of both major political parties, helping accelerate the demise of labor unions. Government turned a blind eye as corporations packed up good jobs and sent them overseas, weakening private-sector unions.

The American public has repeatedly been told that policies restraining foreign competition are a form of protectionism that subsidizes inefficient domestic industries and raises prices. The issue of job losses is ignored. The benefits of free trade allegedly exceed the costs of lost jobs, costs borne disproportionately by those who work with their hands. Assumed consumer benefits should be considered when it comes to trade policy, but so should giving working-class people a fair shot at the American Dream. Americans need a more balanced way of thinking about free trade and the offshoring of American jobs.

Is it any wonder that President Trump’s campaign slogan – “Make America Great Again” – resonated with ordinary Americans? This rhetoric is reminiscent of 1988 Democratic presidential candidate Rep. Richard Gephardt’s slogan “Let’s Make America First Again.”

Writing some 2,400 years ago, the Greek philosopher Aristotle captured the dangers of inequality when he wrote, “A polity with extremes of wealth and poverty is a city not of free persons but of slaves and masters, the ones consumed by envy, the others by contempt.”

Financial Sector Is Driving The Economy

The contemporary rise of finance, promoted by both Republican and Democratic administrations, has changed America from an economy focused on sustainable growth to one dominated by the financial sector itself – a broad range of industries that includes banks, investment firms, insurance companies, and real estate firms that provide financial services to commercial and retail customers.

Since the 1980s, the financial sector has expanded to take up an extremely large slice of the U.S. economy, a trend referred to as “financialization.” Financial institutions have significantly increased in scale and profitability relative to what most see as the “real” economy – businesses that produce tangible goods – which has left the United States increasingly reliant on the financial sector to generate economic growth.

The growth is apparent when measuring the size of the financial sector as a percentage of gross domestic product. Finance and insurance alone represent about 7 percent of the U.S. GDP. Profits tell a similar story. The sector now takes around a quarter of all corporate profits, yet creates only 4 percent of jobs, according to the Bureau of Economic Analysis. In short, the financial sector has captured a larger and larger piece of the national economic pie.

Many say this has contributed to widening income inequality between a small pool of high earners and the rest of society, giving the financial elite ample resources to sway government policy in their favor. That political influence stands in stark contrast to the lot of the 99 percent, whose choices are increasingly limited to making ends meet. Several Democratic presidential candidates have criticized the United States’ reliance on the financial sector and lax government regulation. Sen. Bernie Sanders has made “breaking up the banks” a key plank in his presidential platform.

One factor that has contributed to financial sector growth is deregulation.

Before the Great Depression, the status and influence of financiers was so great that when President Theodore Roosevelt filed the first major antitrust lawsuit against J.P. Morgan’s Northern Securities railroad holding company, Morgan, the fabled Wall Street titan, told the President at a February 1902 White House meeting, “If we have done anything wrong, send your man to my man, and they can fix it up.”

Four years after the stock market crash of 1929, the United States passed the Glass-Steagall Act of 1933 and other legislation to rigorously regulate the financial sector and make it more stable and transparent. Glass-Steagall separated investment banking from commercial banking, forcing banks to choose one or the other. Little by little, beginning in the 1980s, those laws that served America so well were rolled back.

The financialization of the economy was jacked up in the 1980s, fueled by Reagan-era laissez-faire policies. For example, the 1982 tax reform lowered the capital gains tax. Deregulation from the 1980s onward encouraged banks to move away from their traditional role of supporting businesses and individuals and providing the liquidity and credit needed to lubricate the economy.

For instance, the repeal of the Glass-Steagall Act in 1999, a seismic moment in the story of financialization, triggered high-risk deals and trading by financial institutions by enabling them to use deposits collected through their commercial banking arms. Lusting for quick, short-term profits that kept money within the financial sector rather than investing it in the real economy, banks began to focus on high-risk “financial engineering” like sub-prime loans, collateralized debt obligations, structured investment vehicles, and derivatives, which Warren Buffett famously called “financial weapons of mass destruction.”

Such activities are remote from the production of tangible goods and services. Finance has become a business unto itself, all about making money from money rather than making things and being a facilitator for real business. Populists from the left and the right say Wall Street has done better than Main Street – and that may be the truth of it.