The forgotten tribe: America’s working class

Countless working-class Americans of all races and ethnicities, who work hard and play by the rules, are fed up with the extreme partisanship that permeates the country and with senseless acts of violence, including the storming of the Capitol. These people are the forgotten tribe in America.

In general, working-class people are those with a high school diploma but less than a four-year college degree who live in households with annual incomes roughly between $30,000 and $70,000 for two adults and one child. They are somewhere between the poor and the middle class.

Americans by some measures are more deeply divided politically and culturally than ever before. We live in a period of competing moral certitudes, of people who are sure they are right and prepared to engage in violence to make their point.

For many years now, political correctness; cancel culture; social justice; multiculturalism; the all-pervasive claim to victimhood; judging people on their ethnicity, gender, and race rather than the merits of their work; and the politicization of just about everything have generated more heat and fumes than light. For all their rosy rhetoric on the subject, the ruling elites have less experience with ethnic and racial diversity than the working class.

These factors, and probably dozens of others, are contributing to the breakdown in the American genius for reaching compromises that meet the real social and economic needs of the working class.

Both the extreme right and the extreme left are corroded by ideology. Extremists on the right label their counterparts on the left socialists, and the left calls the right fascists. Each faction takes the law into its own hands while politicians see which way the wind is blowing and refuse to intervene. The growing divisions help explain why the nation’s political center is shrinking.

At the same time, the media, both traditional and social, have accelerated the fragmentation of cultural and political identities. Conservative and liberal TV networks highlight only information that confirms their audiences’ biases, creating ideological echo chambers.

The worst of the fallout from this polarization will be felt by the forgotten tribe. These issues have done little to help them make ends meet and keep their families safe from COVID. Is it any wonder that when they walk past a statue of that schnorrer Thomas Jefferson they don’t experience any trauma? Working people, after all, have to work.

America’s working class doesn’t have the luxury of engaging in ideological pursuits; they have to take care of their families, paying for groceries and medical bills and making mortgage or rent payments. The pampered and self-consciously fortunate regard the working class as “deplorables,” half of whom believe Elvis is still alive. Their understanding of diversity is the comic-book version. They live in white neighborhoods, send their kids to private schools, and summer in the Hamptons.

These ruling elites don’t have to live with the unintended consequences of their decisions. The working class are the ones who have to work. As long as they do, it hardly matters what color their skin is or what accent they have. All the while, the economic system directs food, shelter, and energy away from those who need it most and toward those who need it least.

The causes of the forgotten tribe’s problems have been well documented: the pace of technological change, growing monopoly power and concentration, and globalization. Is it any wonder that the working class is losing hope in a better future (get real, they are not Bill Clinton)? They are an endangered species, living paycheck to paycheck.

Despite copious amounts of cash provided to families and unemployed workers, COVID-19 rescue plans don’t provide long-term solutions for making work pay, giving the working class the education and skills needed to get better work, and strengthening families and communities to support work. These omissions only exacerbate the fraying of America’s social and political fabric.

The Fed and inflation

Life has changed substantially for ordinary working-class Americans in the first two decades of the 21st century. The deification of technology, the growth of globalization, the harrowing financial events of 2008 followed by the Great Recession, and the COVID-19 pandemic have left them struggling psychologically, physically, socially, and economically.

Growing income and wealth inequality were on the radar screen long before the coronavirus pandemic, but the pandemic has made the problem more obvious and urgent. The actions of the Federal Reserve (Fed) have widened the gap. Quite apart from persistently low interest rates, there is the issue of inflation.

Last August, Fed Chair Jerome Powell introduced a policy that not only allows but welcomes an inflation level above 2 percent. The Fed assumes it will be able to snap its fingers and stop inflation at whatever point it likes, which is the pinnacle of hubris.

Inflation matters. It tends to redistribute income and wealth toward groups that are better able to hedge against inflation by sheltering their assets in ways that earn decent returns.

But for the ordinary American, prices that rise faster than wages mean a decline in real income, less purchasing power and lower living standards. Inflation coupled with wage stagnation is eating away at the working class.
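To make the arithmetic concrete, here is a minimal sketch in Python, using hypothetical numbers rather than official statistics, of how prices outrunning wages translate into a real pay cut:

```python
# Hypothetical illustration: a 2 percent raise during 4 percent inflation
# is a real pay cut.
nominal_wage_growth = 0.02   # assumed raise
inflation = 0.04             # assumed rise in consumer prices

# Real growth divides the two growth factors (roughly their difference).
real_wage_growth = (1 + nominal_wage_growth) / (1 + inflation) - 1
print(f"Real wage growth: {real_wage_growth:.2%}")  # about -1.92%
```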

While the cost of many discretionary goods has fallen during the pandemic, basic necessities such as housing, healthcare, education, and food are absorbing an ever-larger portion of the incomes of ordinary Americans.

The cost of groceries has been rising at the fastest pace in decades since the pandemic seized the economy. It’s as if working-class Americans are involuntarily observing Lent all year round. They experience life at the sharp end.

In the United States, the Consumer Price Index (CPI), which reflects retail prices of goods and services, including housing costs, transportation, and healthcare, is the most widely followed indicator of inflation. Food inflation is a major part of the CPI.

But the Fed generally focuses on “core inflation” or “core CPI,” which strips out food and energy prices because they are volatile. Yet these are non-discretionary items, and excluding them can give a misleading picture of inflation trends. In the real world, people can’t exclude food from their weekly budget.
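A rough sketch, with invented category weights and price changes standing in for actual BLS data, shows how stripping out food and energy can mute the inflation a household actually feels:

```python
# Hypothetical CPI basket: category -> (weight, year-over-year price change).
basket = {
    "food":    (0.14, 0.040),
    "energy":  (0.07, 0.060),
    "shelter": (0.33, 0.020),
    "other":   (0.46, 0.015),
}

def weighted_inflation(items):
    # Weighted average of price changes, renormalized to the included weights.
    total_weight = sum(w for w, _ in items.values())
    return sum(w * change for w, change in items.values()) / total_weight

headline = weighted_inflation(basket)
core = weighted_inflation({k: v for k, v in basket.items()
                           if k not in ("food", "energy")})
print(f"Headline CPI: {headline:.2%}, core CPI: {core:.2%}")
# On these invented numbers: headline about 2.33%, core about 1.71%.
```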

According to the latest inflation data published by the U.S. Labor Department’s Bureau of Labor Statistics, another light, or as they now say, lite, read, food prices have increased by nearly 4 percent in the last year, higher than at any point since the 1970s.

The increases are even more dramatic for some food items, with beef and veal prices up 25 percent year-over-year, egg prices up 12 percent, potatoes up 13 percent, and tomato prices up 8 percent.

The report is broken into price changes for “food away from home” and “food at home”. In November, the categories registered year-over-year increases of 3.8 percent and 3.6 percent, respectively.

Rising food prices impact everybody, but they are always top of mind for ordinary working Americans. Even more affected are the poor and the unemployed because they are unable to afford basic necessities. Cutting back on food budgets is one of the first things people do to make ends meet.

Central bankers suffer from a pre-Copernican complex: the belief that the sun and planets revolve around them. Real world experience and history demonstrate that inflation can’t be controlled like a thermostat. But one thing you can be certain of is that inflation has a painful effect on working class Americans.

As the COVID-19 pandemic recedes, the national goal should be to Make America’s Working Class Great Again (MAWCGA). If you believe the intellectual gratin and shekel dispensers in D.C. will internalize that notion, perhaps you would be interested in some prime real estate – something deep in the Everglades.

The downside of low interest rates

The Federal Reserve loves low interest rates.  With rates stuck at low levels since the 2008 financial crisis, they have become the rule rather than the exception.

When the coronavirus pandemic plunged the economy into a sudden freeze, the Fed lowered its benchmark borrowing rate to near zero and purchased corporate and government securities like there is no tomorrow to curb unemployment and to stimulate the economy.

The federal funds rate is what banks charge one another for overnight loans of reserves held at the Fed, and it serves as the benchmark interest rate for the economy. While low interest rates may be great for driving up sales of homes and automobiles, artificially low rates punish savers. Money market and certificate of deposit rates head to near zero when the Fed sets the federal funds rate at near zero.

This action disproportionately hurts senior citizens, retirees, savers, and those folks who prefer less risk. In accepting the lower yield, those people get less income, less ability to consume, and a lower quality of life, or they take on stock-market risk for which they are not prepared. Nasty choices.

Low interest rates force savers into riskier investments in the hunt for yield. Ten-year Treasurys offer a laughable yield of less than 1 percent, making stocks look attractive. Thank the Fed for the stock market’s run. The rise in stocks benefits the wealthiest 1 percent or 10 percent or wherever you want to draw the line, who own more than $11 trillion of stock and mutual fund shares.

The Fed’s fundamental imperative is to strong-arm ordinary Americans to spend, spend, spend, or to invest. The notion is that if, for example, a savings account provides an interest rate that rounds to zero percent, saving makes no sense, especially when inflation is rising faster than the interest earned. Low-risk investments don’t keep up with inflation, and your money loses purchasing power.

The situation for savers isn’t likely to get better soon. The Fed chair has said rates will remain near zero at least through 2023, though the Fed insists it won’t take interest rates negative. The reality is that when inflation is factored in, people are already experiencing negative interest rates.
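A minimal sketch of that reality, assuming an illustrative 0.5 percent savings yield against 2 percent inflation:

```python
# A savings account can lose purchasing power even while the balance grows.
nominal_rate = 0.005   # assumed 0.5% APY on savings
inflation = 0.02       # assumed 2% inflation

real_rate = (1 + nominal_rate) / (1 + inflation) - 1
balance = 10_000
years = 10
future_purchasing_power = balance * (1 + real_rate) ** years
print(f"Real rate: {real_rate:.2%}")                       # about -1.47%
print(f"$10,000 after {years} years, in today's dollars: "
      f"${future_purchasing_power:,.0f}")                  # about $8,623
```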

When more people spend and invest, the economy expands. Of course, every dollar consumers spend instead of saving is a dollar, plus the interest it would have compounded, that will not be available in the future. As low rates discourage saving, people must become more and more reliant on government entitlements in old age.

To put the worst construction on it, a policy of constant low interest rates is an idea that deserves to be put on a stretcher and carried back to the leisure of the theory class where it was born.  You don’t have to be Philip Marlowe to know these policymakers have more than they can say grace over and are permanently out of the financial wars.

Low interest rates add to the Iliad of woes faced by ordinary Americans. The working class was in chronic crisis, alliteration aside, even before the pandemic. They work hard to make ends meet and stay out of the grasp of poverty, play by the rules, and do everything asked of them but kick extra points.

What is the right interest rate? Here’s a crazy idea: the free-market interest rate.  Cut out the middleman.  This is the rate you get when the Fed does not interfere in financial markets.

Don’t bet on it; the Fed wants to preserve the status quo, which is to say, the wealth of the One Percent and all that.

But not to worry, money isn’t everything – as long as you have enough of it.

Powell manifesto addressed American economic system under attack

History often has a hidden beginning. Since the 1970s, people who are already well off have enjoyed a rising percentage of income and wealth. Meanwhile, ordinary Americans face declining social mobility, a shrinking middle class, widening income inequality and crumbling infrastructure. There is plenty to be mad about and plenty of blame to go around.

The economic struggles of the American working class since the late 1970s were not just the result of globalization and technology changes. A long series of public policy changes favored the wealthy. Some argue these changes were the result of sophisticated efforts by the corporate and financial sectors to change government policy, from tax laws to deregulation, to favor the wealthy.

In August 1971, less than two months before he was nominated to serve as an associate justice of the Supreme Court, Lewis F. Powell Jr. sent a confidential memorandum to his neighbor and friend Eugene B. Sydnor Jr., chair of the Education Committee of the U.S. Chamber of Commerce. Powell was a leading Virginia corporate lawyer, a former president of the American Bar Association and served on 11 corporate boards.

The 34-page memo was titled “Attack on American Free Enterprise System.” It presented a bold strategy for how business should counter the “broad attack” from “disquieting voices.” The memo, also known as the Powell manifesto, did not become available to the public until after he was confirmed.

He began the memo this way: “No thoughtful person can question that the American economic system is under broad attack.” He went on to write that the assault was coming from “perfectly respectable elements of society: the college campus, the pulpit, the media, the intellectual and literary journals, the arts and sciences, and from politicians.” American business believed it was under attack in a hostile political environment during the late 1960s, with the growth of government authority under the Great Society and an increase in regulations ranging from the environment to occupational safety to consumer protection.

The memo outlined a bold strategy and blueprint for corporations to take a much more aggressive and direct role in politics. Powell was following the Milton Friedman argument that it was time for big business to focus on the bottom line; it was time to fight for capitalism. Powell proposed waging the war on four fronts: academia, the media, the legal system, and politics.

The memo influenced, for example, the creation of new think tanks such as the Heritage Foundation, the Manhattan Institute, and other powerful organizations. As Jane Mayer wrote, the Powell Memo “electrified the Right, prompting a new breed of wealthy ultraconservatives to weaponize their philanthropic giving in order to fight a multifront war of influence over American political thought.”

The venerable National Association of Manufacturers moved its offices from New York City to Washington. Its CEO noted: “The relationship of business with business is no longer so important as the interrelationship of business with government.” The number of corporations with public offices in Washington grew from 100 in 1968 to over 500 in 1978. In 1971, only 175 firms had registered lobbyists in Washington; by 1982, nearly 2,500 did.

When it comes to lobbying, money is the weapon of choice. It looms over the political landscape like the Matterhorn.  The number of corporate political action committees (PACs) increased from under 300 in 1971 to over 1,400 by the middle of 1980.  The money they spread around gave lobbyists the clout they needed.  The growth of super PACs and lobbyists ensured that any piece of relevant regulation would be watered down, first in Congress and then during implementation.

The Powell memo galvanized Corporate America and enlarged the influence of big business over the political landscape.  It encouraged business to play a more active role in American politics. Corporate America and the 1 percenters got the memo.

Revisiting the tragedy of the commons

During the 1990s, the term paradigm became increasingly fashionable as an intellectually upscale replacement for the traditional and somewhat shopworn term model. But decanting this old wine into new bottles can still leave a bad taste in our mouths if we define a paradigm in too simplistic a manner.

Dictionaries define “paradigm” as a model or intellectual framework that seeks to explain some phenomenon in a clear and simple manner. A relevant example for our times is Garrett Hardin’s Tragedy of the Commons. In this paradigm there is a common pasture on which local farmers can freely graze their cattle. Needless to say, each farmer will want to graze as many cattle as possible on the commons, because each cow they add provides a marginal economic benefit at no additional cost. So all the farmers continue adding more cows.

This works only so long as the total number of grazing cows remains within the carrying capacity of the commons. Once that limit is exceeded, the viability of the commons for grazing begins to break down as the grass wears out and provides less nourishment per cow.

So each farmer finds that his or her herd is producing less milk to sell. Under the circumstances, the only rational response is to increase the size of the herd, which means adding still more cows to the overutilized commons. When all the local farmers keep doing this, the result can only be an increasingly dysfunctional commons.

In Hardin’s words: “Each man is locked into a system that compels him to increase his herd without limit—in a world that is limited. Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons.”
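Hardin’s dynamic is simple enough to simulate. The sketch below uses invented numbers for the carrying capacity and for how quickly overgrazing degrades the pasture, purely to illustrate how individually rational herd growth ruins the collective yield:

```python
# Toy tragedy of the commons: each farmer keeps adding cows because the
# marginal cow costs him nothing, but once the herd exceeds the pasture's
# carrying capacity, overgrazing cuts every cow's yield and total milk falls.
FARMERS = 10
CAPACITY = 100  # assumed carrying capacity of the commons, in cows

def yield_per_cow(total_cows):
    if total_cows <= CAPACITY:
        return 1.0                        # full yield within capacity
    return (CAPACITY / total_cows) ** 2   # assumed degradation from overgrazing

herd = 8  # cows per farmer
for season in range(1, 6):
    total = FARMERS * herd
    milk = total * yield_per_cow(total)
    print(f"Season {season}: {total} cows grazing, total milk {milk:.0f}")
    herd += 2  # each farmer's 'rational' response: add more cows
```

Past the second season of this toy run, every additional cow makes every farmer, including its owner, worse off.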

By way of a solution, some people may propose expanding the commons if it is no longer large enough to support existing herds, and paying for the expansion out of tax revenues so users of the commons can continue to obtain its benefits without directly paying for them. Such people believe that because the purpose of the commons is to serve the community’s economy, its size should be tailored to the demands of that economy as it grows.

Others insist that the real problem is not too little grass, but too much demand. They argue that the time has come to “think green” about the future of public commons in the context of the overall environment. People should begin shifting to more sustainable ways of managing their communities so they can phase down grazing and turn the commons into public parks.

Then there are those enamored of the stained-glass verities of undergraduate microeconomic theory. They suggest that the time has come to start charging farmers user fees—so much per hour of grazing time for each cow. In this way, each user will pay for the benefits received from the public facility in accordance with how much they use it.

By using a sensible pricing system to ration the use of these scarce resources, each farmer will be motivated to make the most efficient use of them. Meanwhile, the revenue from user fees can cover the cost of expanding the public commons when necessary, rather than the government taxing everyone to pay for it.

Hardin’s grazing pasture paradigm appears to go a long way towards answering socio-economic questions about the inevitable tendency towards over-use of public goods when they are perceived to be “free”. It explains why this tendency leads to a condition where supply can never really catch up with demand. It describes how the widespread availability of free public goods can significantly influence the underlying economics of many private activities. And it demonstrates the ease with which an entire society can become locked into behavioral patterns that may turn out to be “anti-social” in the long run.

It’s your call. After all, Rorschach tests are not graded.

The debt bomb

This year, the federal debt is on track to exceed the size of the entire U.S. economy.

The United States’ debt-to-GDP ratio rose sharply during the Great Recession of 2008-2009 and has continued to rise, reaching 106 percent in 2019. Last year, GDP was $21.4 trillion, but it is expected to shrink this year. U.S. debt is projected to exceed $20 trillion and is growing like kudzu.

While the subject of debt and deficits may be dishwater dull to the average American living unemployment check to unemployment check, consider that the Congressional Budget Office (CBO) has warned that the Social Security Trust Fund will run out of money by 2031. Closely related, Medicare’s hospital insurance trust fund is now on track to run out of money in 2024.

The debt-to-GDP ratio compares a country’s public debt to its gross domestic product. By comparing what a country owes with what it produces, the ratio indicates that country’s ability to pay back its debts.
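The arithmetic behind the ratio is one line; here it is with the round figures cited above:

```python
# Debt-to-GDP using the figures cited in the text, in trillions of dollars.
debt = 20.0    # projected federal debt
gdp = 21.4     # last year's GDP, expected to shrink this year

# About 93% on these round figures, and climbing as debt grows and GDP shrinks.
print(f"Debt-to-GDP: {debt / gdp:.0%}")
```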

Debt is eating away at the American economy like a swarm of termites invisibly consuming a house. The fiscal follies continue, with the only certainty being that the accumulated debt will be passed on to future generations and jeopardize their chance to live a prosperous life.

It may be time for Washington to consider a new financing instrument to address America’s debt bomb so future generations have a chance to enjoy greater prosperity once the pandemic is behind us. The issuance of 100-year Treasury bonds to fund ballooning deficits, with the interest income indexed to the CPI as a hedge against inflation, may be an idea whose time has come. It would give the next generation, which has to pay down the debt, a break by locking in rock-bottom interest rates. These bonds may appeal to long-term investors, such as pension funds and insurers, and could be used to fund infrastructure projects.
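A sketch of how such a bond might pay, assuming a hypothetical $1,000 face value, a 1.5 percent real coupon, and steady 2 percent inflation, with the principal indexed to the CPI the way TIPS are:

```python
# Hypothetical 100-year Treasury with CPI-indexed principal (TIPS-style):
# the principal grows with inflation and the fixed real coupon is paid on
# the adjusted principal, so interest income keeps pace with prices.
face = 1_000.0
real_coupon = 0.015   # assumed 1.5% real coupon
inflation = 0.02      # assumed steady 2% CPI inflation

for year in (1, 10, 50, 100):
    adjusted_principal = face * (1 + inflation) ** year
    coupon = adjusted_principal * real_coupon
    print(f"Year {year:3d}: principal ${adjusted_principal:>8,.0f}, "
          f"coupon ${coupon:,.2f}")
```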

Long-term bonds are not unusual. Disney issued 100-year bonds in 1993; Norfolk Southern did so in 2010; and Coca-Cola, IBM, Ford, and other companies have done the same, as have Oxford University, Ohio State, Yale, and other universities. Fourteen Organization for Economic Co-operation and Development countries have issued debt with maturities ranging from 40 to 100 years. Austria, Belgium, and Ireland have all issued century bonds within the last two years.

With COVID-19 and the economic contraction, the CBO has estimated that the deficit for fiscal year 2020, which ends this month, will exceed $3 trillion. According to the Committee for a Responsible Budget, this amounts to around 18 percent of GDP for the year. As things stand, the federal debt is expected to reach 108 percent of GDP by next year.

To put these figures into perspective, the U.S.’s highest debt-to-GDP ratio was 112 percent at the end of World War II. The war was financed with a combination of roughly 40 percent taxes and 60 percent debt.

If the great and the good in Washington don’t address how to reduce the deficit-to-GDP ratio and find a fiscally sustainable path after COVID-19, the large debt burden will slow economic growth, raise interest rates, and cause interest on the debt to consume an ever-larger share of the federal budget, crowding out spending on other priorities. But there is a trust deficit when it comes to the faith sentient Americans have in Washington’s ability to deal with the issue intelligently.

The only approach politicians can agree on to manage the debt and deficits is to steal from future generations by passing on to them the accumulated debt burden. So much for intergenerational fairness. As Admiral Mike Mullen, the former Chairman of the Joint Chiefs of Staff said: “Our national debt is our biggest national security threat.”

Extraordinary situations call for extraordinary measures, and the issuance of 100-year bonds might be one way to deal with intergenerational equity.

Financialization of the economy

Financialization refers to the increase in size and importance of the financial sector relative to the overall U.S. economy. Simply put, it is the wonky term used to describe the growing scale, profitability, and influence of the financial sector over the rest of the economy. Combine it with deregulation, less antitrust enforcement, and easy monetary policy from the 1980s onward and you get financial institutions that were too big and too speculative in the years leading up to the financial crisis in 2008.

Today, Wall Street buccaneers don’t just exert great influence over the economy; they are also a major influence in politics and government policy. The financial industry spends millions annually in Washington promoting the Panglossian view that the financial markets promote economic growth and contribute to economic well-being. It would be more accurate to say they contribute to economic inequality and the decline of U.S. manufacturing.

According to data from the Center for Responsive Politics, seven banks spent over $13 million on campaign contributions in the 2018 election cycle and over $38 million on lobbying during the 2017-2018 Congress. Not surprisingly, the top five campaign donors were Bank of America, Goldman Sachs, Morgan Stanley, JPMorgan Chase, and Citigroup.

Any wonder why the Washington crowd favors Wall Street over Main Street? Only the health care industry spends more.

For many Americans, the stock market acts as a barometer for the economy. U.S. financial markets are the largest and most liquid in the world. In 2018, the finance and insurance industries (excluding real estate) represented 7.4 percent or $1.5 trillion of the U.S. gross domestic product. In 1970 the finance and insurance industries accounted for 4.2 percent of GDP, up from 2.8 percent of GDP in 1950. In contrast, manufacturing fell from 30 percent of GDP in 1950 to 11 percent in 2019.

Prior to COVID-19, finance and insurance industry profits were equal to a quarter of the profits of all other sectors combined, even though the industry accounted for just 4 percent of jobs. These data are evidence of the industry’s growing weight in the American economy.

The figures do not reflect the extent to which non-financial firms derive revenues from financial activities, as opposed to productive investments in real assets. For instance, prior to the 2008 market crash and meltdown, GE Capital generated about half of General Electric’s total earnings. GE became an example of the financialization of American business. In the years leading up to the financial crisis, it became one of the world’s largest non-bank financial services companies, meaning it avoided the level of regulatory scrutiny that regulated players like Wall Street banks face. After it crashed and burned in 2008, GE Capital got a whopping $139 billion taxpayer bailout.

Another example of corporate America moving to the rhythm of Wall Street is Boeing’s 787 Dreamliner, which famously encountered delays and massive cost overruns due to its incredibly complex supply chain, which involved outsourcing 70 percent of the airplane’s component parts to multiple tiers of suppliers scattered around the world. The Dreamliner supply chain reflected the pressure to maximize return on net assets and was consistent with Wall Street’s approach.

Return on net assets is a key measure financial analysts use to evaluate how effectively management is deploying assets. The goal is to make the most money with the fewest possible assets. In the end, the Dreamliner became an embarrassing failure that cost billions more than it should have. In such instances, financialization reduces corporate America’s dependence on domestic workforces, leading to the offshoring of manufacturing jobs.
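For reference, the metric reduces to a single line, and it explains the outsourcing logic: shrink the asset base and the ratio improves even if profit doesn’t. The figures below are invented for illustration:

```python
# Return on net assets (RONA): profit per dollar of assets employed.
def return_on_net_assets(net_income, fixed_assets, net_working_capital):
    return net_income / (fixed_assets + net_working_capital)

# Hypothetical manufacturer, figures in $ billions: outsourcing moves half
# the fixed assets onto suppliers' books, so RONA rises on identical profit.
before = return_on_net_assets(2.0, 18.0, 4.0)
after = return_on_net_assets(2.0, 9.0, 4.0)
print(f"RONA before outsourcing: {before:.1%}")  # about 9.1%
print(f"RONA after outsourcing:  {after:.1%}")   # about 15.4%
```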

The financial sector has amassed great power since the 1980s and contributed to the decline of U.S. manufacturing as well as income and wealth inequality. As Supreme Court Justice Brandeis allegedly said in 1941 with great foresight: “We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can’t have both.”

Executive compensation and economic inequality

Oceans of ink have been consumed writing about the subject of widening economic inequality, declining social mobility, and a shrinking middle class in the United States over the last 40 years. More recently, the subject has emerged as a social and political flash point.

The most commonly cited reasons for this phenomenon are globalization and technology adoption. Improvements in technology, such as more powerful computers and industrial robots, increase the incentive to substitute capital for labor. Increased trade competition from imports made in lower-cost countries, and the threat of exporting jobs to those countries, put pressure on wages and employment. Others point to excessive monopoly power, market consolidation, and the hollowing out of labor unions.

For the ordinary working-class American there is plenty to be mad about. While wage growth has remained relatively stagnant for decades, an Economic Policy Institute study reports that extravagant chief executive officer (CEO) pay is a major driver of rising inequality, contributing to the growth of top 1 percent and top 0.1 percent incomes.

The report found that the CEOs of the top 350 U.S. companies by sales raked in an average of $21.3 million last year, up from about $18.7 million in 2018. That means the average CEO made 320 times as much as the average worker earns in wages and benefits. CEO pay went crazy in the 1990s: in 1976 it was 36 times what an average worker earned, 61 times in 1989, and 131 times in 1993.

The authors of the report argue that this “growing earning power at the top has been driving the growth of inequality in our country.” The report attributes the increase to the rapid growth in vested stock awards and exercised stock options tied to stock market growth. Stock-based compensation accounted for about three-fourths of the median CEO’s compensation.

Executive compensation linked to stock prices has been the mantra of America, Inc. over the past several decades. In 1982, the Securities and Exchange Commission adopted Rule 10b-18, allowing companies to buy back their own stock without being charged with stock manipulation. Starting in the 1990s, many companies introduced stock option grants as a major component of executive compensation. The idea was to better align management interests with those of shareholders. A small circle of highly influential pay consultants, academics, and activist shareholders argued that American firms must pay top dollar for top candidates because they compete in a global market for talent.

While beneficial in some ways, this new form of compensation also created problems quite apart from resentment and lower morale among rank-and-file workers. For example, the incentive for executives to manage earnings through any means, fair or foul, and to focus on the short-term earnings game became strong. Making matters worse, a favorite corporate America trick is to use stock buybacks to manipulate the company’s stock price. By increasing demand for a company’s shares, open-market buybacks lift the stock price and help the company hit quarterly earnings targets. It makes sense: stock buybacks enrich investors, including company executives, who receive most of their compensation in company stock.
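A stripped-down example, with invented figures, shows the mechanics: nothing about the business improves, yet earnings per share rise:

```python
# A buyback lifts earnings per share without changing the business:
# the same profit is simply divided among fewer shares.
net_income = 1_000_000_000       # assumed annual profit
shares_outstanding = 500_000_000
shares_repurchased = 25_000_000  # a 5% buyback

eps_before = net_income / shares_outstanding
eps_after = net_income / (shares_outstanding - shares_repurchased)
print(f"EPS before buyback: ${eps_before:.2f}")  # $2.00
print(f"EPS after buyback:  ${eps_after:.2f}")   # about $2.11
```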

There are many ideas for curbing extravagant executive compensation, ranging from higher marginal income tax rates for those at the top, to banning stock buybacks, to greater use of “say on pay,” which allows a firm’s shareholders to express dissatisfaction with excessive pay.

While ideas have influence, they are rarely implemented on their singular force alone. Instead, there has to be a confluence between the ideas themselves, the zeitgeist of the times, and the interests of “the great and the good” who find the ideas congenial. The pandemic may serve as a wake-up call for boards of directors and institutional investors to circumscribe executive pay.

Closing the carried interest tax preference

Those who can often be found at the very top of the earnings scale – people who manage private investment funds such as hedge funds or private equity and venture funds – enjoy a tax loophole that allows the money they make by investing money for others (their “carried interest”) to be taxed as capital gains rather than earned income, even though they earn the money from work, not as a return on investing their own money.

In plain terms, they reap a benefit even though they don’t put their own capital at risk. It’s a loophole that allows the rich to get richer; it is why some of the wealthiest Americans pay lower tax rates than their secretaries, and its demise is long overdue. Defenders of the preference argue that taxing those who run these funds at the same rate everyone else pays on earned income would drive away trillions of investment dollars.

These are the same folks, the 1-percenters, who can indulge in any of the 40 items on the Forbes Cost of Living Extremely Well Index (CLEWI). The list, which should not be shared with progressive friends, includes such items as a Learjet, 45 minutes with a shrink on the Upper East Side of Manhattan, a Russian sable fur coat, a Har-Tru crushed-stone tennis court, and more. Forbes says the CLEWI is to the very rich what the CPI is to “ordinary people.”

The term carried interest goes back to medieval merchants in Genoa, Pisa, Florence, and Venice. These traders carried cargo on their ships belonging to other people and earned 20 percent of the ultimate profits on the “carried product.”

Today, those who manage investments in private equity funds are typically compensated in two ways: a 2 percent fee on funds under management and a 20 percent cut of the gains they produce for investors. The 20 percent share of profits these managers pocket, known as carried interest, is currently treated as a long-term capital gain and taxed at 23.8 percent: the 20 percent capital gains rate plus the 3.8 percent Obamacare surcharge on investment income. The 2 percent management fee is taxed at the higher ordinary income tax rate.
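A back-of-the-envelope sketch, using a hypothetical fund, of what the preference is worth to a manager:

```python
# Hypothetical fund: the manager's 20% carry taxed under current law
# (long-term capital gains) versus as ordinary income.
fund_gain = 500_000_000        # assumed gain produced for investors
carry = 0.20 * fund_gain       # manager pockets $100 million

cap_gains_rate = 0.20 + 0.038  # 20% capital gains + 3.8% surcharge = 23.8%
ordinary_rate = 0.37           # current top ordinary income rate

tax_as_carry = carry * cap_gains_rate
tax_as_wages = carry * ordinary_rate
print(f"Tax at capital gains rates: ${tax_as_carry:,.0f}")  # $23,800,000
print(f"Tax at ordinary rates:      ${tax_as_wages:,.0f}")  # $37,000,000
print(f"Value of the loophole:      ${tax_as_wages - tax_as_carry:,.0f}")
```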

Presumptive Democratic presidential nominee Joe Biden has put forward an economic policy platform under which he would repeal many of the tax cuts that went into effect on Jan. 1, 2018. The proposals include increasing the federal corporate tax rate from 21 percent to 28 percent; restoring the top individual tax rate to 39.6 percent for taxable incomes above $400,000, up from the current 37 percent; taxing capital gains as ordinary income for individuals and couples with over $1 million in annual income; and increasing the Social Security earnings cap by applying the 12.4 percent payroll tax to earnings above $400,000.

While these sweeping tax proposals do not specifically address carried interest, it might reasonably be inferred that carried interest would be taxed at ordinary income rates. In the past, Biden has said he’d like to eliminate the carried interest giveaway. Both Presidents Obama and Trump campaigned on closing the carried interest dodge, yet it’s still there. Their proposals to abolish the preference were met with pregnant and deadening silence in Congress.

Eliminating the carried interest provision that allows fund managers to get away with bargain-basement tax rates should be low-hanging fruit given the inequality of wealth and income in the United States. Yet despite its unpopularity, this is the tax break that just won’t die. Well-connected lobbyists and trade groups for private equity, hedge funds, and others have mobilized their resources and fought successfully to keep carried interest as is. The nine lives of carried interest are more evidence, if any more is needed, that big money gets its way in Congress. Here’s hoping the idea of closing the carried interest loophole gains traction, but for sure it’s a long shot.

Models aren’t crystal balls

Every day, while folks are stuck at home, politicians, public health officials, and slick talking heads point to charts showing the latest statistics on the coronavirus pandemic as they attempt to predict what might happen next in your neck of the woods. Underlying these graphics are various forecasting models, which you should approach with a healthy dose of skepticism.

It is tempting to view the models as oracles that will predict how the disease will spread and tell you what to do and when to do it. But these models are simplified versions of reality. Reality is reality. Models should be read with the greatest care. They are not a substitute for controlled scientific experiments that generate relevant data.

Models certainly provide information that can create a framework for understanding a situation. But models, including those used to predict COVID-19′s trajectory, aren’t crystal balls. A model is simply a tool. It consists of raw data, along with assumptions based on our best guesses at the time, that together shape an overall forecast.

A model is only as good as its underlying data, which is in short supply. For example, there is still plenty of uncertainty about how many COVID-19 deaths may occur over the next six months under various social distancing and mask wearing scenarios. Also, a model’s accuracy is constrained by uncertainty about how many people are or have been infected.

Assumptions aren’t facts. Put another way, models are constrained by what is known and what is assumed. Understanding these underlying assumptions helps explain why some forecasts have a sunny disposition, while others can’t be pessimistic enough.
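A toy projection makes the point. The sketch below runs the same trivial model on the same starting data under three assumed daily growth rates; only the assumption changes, yet the forecasts diverge wildly:

```python
# Identical model, identical data, three different assumptions.
cases_today = 10_000
assumed_daily_growth = {"optimistic": 1.01, "baseline": 1.03, "pessimistic": 1.05}

for label, rate in assumed_daily_growth.items():
    projected = cases_today * rate ** 60  # naive 60-day projection
    print(f"{label:>11}: {projected:,.0f} cases in 60 days")
# Roughly 18,000 vs. 59,000 vs. 187,000 cases from the same starting point.
```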

There are also economic models. Financial mavens develop them to take stock of how the pandemic has impacted the economy and where they see it and markets heading. With so many countries experiencing sharp declines in gross domestic product, there is a lot of forecasting about what shape the recovery will take. Will there be a quick V-shaped recovery or will it be U-shaped? Or maybe a little bit of both?

These models also have their limitations. Recall how Long-Term Capital Management, an industry-leading hedge fund run by a renowned team of mathematical experts that included two Nobel Prize winners, developed complex quantitative models to analyze markets and placed huge bets on the assumption, among others, that Russia would never default on its bonds. The firm did a lousy job of stress-testing its assumptions and bet wrong. In September 1998, it had to be bailed out by a consortium of Wall Street banks to prevent the bottom from dropping out of the financial system.

This episode was a coming attraction for the harrowing financial crisis a decade later in September 2008, which was perhaps the biggest event of the 21st century until COVID-19. Prior to the 2008 crisis, a key assumption in many models was that housing prices would always go up. Indeed, one cause of the meltdown was the quant movement: the proliferation of quantitative models for designing and analyzing financial products as well as for risk management. Many finance professionals mistakenly believed that quantitative tools had allowed them to conquer risk. Products such as derivatives, subprime mortgage-backed securities and activities that relied heavily on quantitative models were at the heart of how financial firms expanded their activities to take more and greater risks.

And of course, with the presidential election just months away, Americans still remember how 2016 election models forecast Hillary Clinton waltzing into the White House. Between now and Nov. 3, many people will take election forecasts with an extra grain of salt or three.

Given the events of the last several months, people should keep a simple fact in mind: models should not be asked to carry more than they can bear. So when you hear about models, put on your hmmm face.