Pearl Harbor Day is a day that should live in infamy

Early in 1941, the government of resource-poor Japan realized that it needed to seize control of the petroleum and other raw material sources in the Dutch East Indies, French Indochina and the Malay Peninsula. Doing that would require neutralizing the threat posed by the U.S. Navy’s Pacific Fleet based at Pearl Harbor in Hawaii.

The government assigned this task to the Imperial Navy, whose Combined Fleet was headed by Admiral Isoroku Yamamoto. The Imperial Navy had two strategic alternatives for neutralizing the U.S. Pacific Fleet. One was to cripple the fleet itself through a direct attack on its warships; the other was to cripple Pearl Harbor’s ability to function as the fleet’s forward base in the Pacific.

Crippling the U.S. fleet would require disabling the eight battleships that made up the fleet’s traditional battle line. It was quite a tall order.

The most effective way to cripple Pearl Harbor’s ability to function as a naval base would be to destroy its fuel storage and ship repair facilities. Without them, the Pacific Fleet would have to return to the U.S., where it could no longer deter Japanese military expansion in the region during the year or so it would take to rebuild Pearl Harbor.

It soon became apparent that the basics of either strategy could be carried out through a surprise air raid launched from the Imperial Navy’s six first-line aircraft carriers. Admiral Yamamoto had a reputation as an expert poker player, gained during his years of study at Harvard and as a naval attaché in Washington. He decided to attack the U.S. warships that were moored each weekend in Pearl Harbor. But in this case the expert poker player picked the wrong target.

The Imperial Navy’s model for everything it did was the British Royal Navy. Standard histories of the Royal Navy emphasized its victories in spectacular naval battles.

Lost in the shuffle was any serious consideration of trying to cripple Pearl Harbor’s ability to function as a forward naval base. So it was that, in one of history’s finest displays of tactical management, six of the world’s best aircraft carriers furtively approached the Hawaiian Islands from the north just before dawn that fateful Sunday, Dec. 7, 1941, launched their planes into the rising sun, caught the U.S. Pacific Fleet with its pants down and wrought havoc in spectacular fashion. On paper at least, this rivaled the British Royal Navy’s triumph at Trafalgar.

But so what?

The American battleships at Pearl Harbor were slow-moving antiques from the World War I era. As we know, the U.S. Navy already had two brand-new battleships in its Atlantic Fleet that could run rings around them. And the eight new battleships the Navy was building were even better.

More importantly, the Pacific Fleet’s three aircraft carriers weren’t at Pearl Harbor. American shipyards were already building 10 modern carriers whose planes would later devastate Imperial Navy forces in the air/sea battles of the Philippine Sea and Leyte Gulf.

Most importantly, as the sun set on Dec. 7 and the U.S. Navy gathered the bodies of its 2,117 sailors and Marines killed that day, the all-important fuel storage and ship repair facilities remained untouched by Japanese bombs, allowing Pearl Harbor to continue as a forward base for American naval power in the Pacific.

So in reality, Dec. 7 marked the sunset of Japan’s extravagant ambitions to dominate Asia. Admiral Yamamoto and the Imperial Navy’s other tradition-bound leaders chose the wrong targets at Pearl Harbor.

The dictates of tradition are usually the worst guides to follow when it comes to doing anything really important. After all, if they have survived long enough to be venerated, they’re probably obsolete.

Powell manifesto addressed American economic system under attack

History often has a hidden beginning. Since the 1970s, people who are already well off have enjoyed a rising percentage of income and wealth. Meanwhile, ordinary Americans face declining social mobility, a shrinking middle class, widening income inequality and crumbling infrastructure. There is plenty to be mad about and plenty of blame to go around.

The economic struggles of the American working class since the late 1970s were not just the result of globalization and technology changes. A long series of public policy changes favored the wealthy. Some argue these changes were the result of sophisticated efforts by the corporate and financial sectors to change government policy, from tax laws to deregulation, to favor the wealthy.

In August 1971, less than two months before he was nominated to serve as an associate justice of the Supreme Court, Lewis F. Powell Jr. sent a confidential memorandum to his neighbor and friend Eugene B. Sydnor Jr., chair of the Education Committee of the U.S. Chamber of Commerce. Powell was a leading Virginia corporate lawyer, a former president of the American Bar Association and a member of 11 corporate boards.

The 34-page memo was titled “Attack on American Free Enterprise System.” It presented a bold strategy for how business should counter the “broad attack” from “disquieting voices.” The memo, also known as the Powell manifesto, did not become available to the public until after he was confirmed.

He began the memo this way: “No thoughtful person can question that the American economic system is under broad attack.” He went on to write that the assault was coming from “perfectly respectable elements of society: the college campus, the pulpit, the media, the intellectual and literary journals, the arts and sciences, and from politicians.” American business believed it had been facing a hostile political environment since the late 1960s, under attack from the growth of government authority under the Great Society and an expansion of regulations covering everything from the environment to occupational safety to consumer protection.

The memo outlined a bold strategy and blueprint for corporations to take a much more aggressive and direct role in politics. Powell was following the Milton Friedman argument that it was time for big business to focus on the bottom line; it was time to fight for capitalism. Powell proposed waging the war on four fronts: academia, the media, the legal system, and politics.

The memo influenced, for example, the creation of new think tanks such as the Heritage Foundation, the Manhattan Institute, and other powerful organizations. As Jane Mayer wrote, the Powell Memo “electrified the Right, prompting a new breed of wealthy ultraconservatives to weaponize their philanthropic giving in order to fight a multifront war of influence over American political thought.”

The venerable National Association of Manufacturers moved its offices from New York City to Washington. Its CEO noted: “The relationship of business with business is no longer so important as the interrelationship of business with government.” The number of corporations with public offices in Washington grew from 100 in 1968 to over 500 in 1978. In 1971, only 175 firms had registered lobbyists in Washington; by 1982, nearly 2,500 did.

When it comes to lobbying, money is the weapon of choice. It looms over the political landscape like the Matterhorn.  The number of corporate political action committees (PACs) increased from under 300 in 1971 to over 1,400 by the middle of 1980.  The money they spread around gave lobbyists the clout they needed.  The growth of super PACs and lobbyists ensured that any piece of relevant regulation would be watered down, first in Congress and then during implementation.

The Powell memo galvanized Corporate America and enlarged the influence of big business over the political landscape.  It encouraged business to play a more active role in American politics. Corporate America and the 1 percenters got the memo.

Revisiting the tragedy of the commons

During the 1990s, the term paradigm became increasingly fashionable as an intellectually upscale replacement for the traditional and somewhat shopworn term model. But decanting this old wine into new bottles can still leave a bad taste in our mouths if we define a paradigm in too simplistic a manner.

Dictionaries define “paradigm” as a model or intellectual framework that seeks to explain some phenomenon in a clear and simple manner. A relevant example for our times is Garrett Hardin’s Tragedy of the Commons. In this paradigm there is a common pasture in which local farmers can freely graze their cattle. Needless to say, each farmer will want to graze as many cattle as possible on the commons, because each cow they add will provide a marginal economic benefit at no additional cost. So, all the farmers continue adding more cows.

This works only so long as the total number of grazing cows remains within the carrying capacity of the commons. Once that limit is exceeded, the viability of the commons for grazing begins to break down as the grass wears out and provides less nourishment per cow.

So, each farmer finds that his or her herd is producing less milk to sell. Under the circumstances, the only rational response is to increase the size of the herd, which means adding still more cows to the overused commons. When all the local farmers keep doing this, the result can only be an increasingly dysfunctional commons.

In Hardin’s words: “Each man is locked into a system that compels him to increase his herd without limit—in a world that is limited. Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons.”
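
The lock-in Hardin describes can be captured in a few lines of code. What follows is a toy simulation with invented numbers (pasture capacity, herd sizes, milk yields), meant only to show that each farmer's private gain from one more cow stays positive even as the shared yield collapses.

# A toy simulation of Hardin's commons; all numbers are illustrative assumptions.
# Each farmer keeps the full benefit of an added cow, while the cost of
# overgrazing is shared by everyone who uses the commons.

CAPACITY = 100   # cows the pasture supports before yields decline
FARMERS = 10
ROUNDS = 15

def milk_per_cow(total_cows):
    """Yield per cow: constant up to capacity, then declining as the grass wears out."""
    if total_cows <= CAPACITY:
        return 1.0
    return CAPACITY / total_cows   # total milk is capped once the commons is overgrazed

herds = [8] * FARMERS   # everyone starts comfortably below the shared limit
for year in range(ROUNDS):
    total = sum(herds)
    for i in range(FARMERS):
        # The extra cow's milk accrues entirely to its owner, so the marginal
        # private gain stays positive even while the shared yield falls.
        income_now = herds[i] * milk_per_cow(total)
        income_if_add = (herds[i] + 1) * milk_per_cow(total + 1)
        if income_if_add > income_now:
            herds[i] += 1
    print(f"year {year + 1:2d}: cows = {sum(herds):3d}, milk per cow = {milk_per_cow(sum(herds)):.2f}")

Run it and the herd grows every year while, once the carrying capacity is crossed, the milk per cow steadily falls: ruin, one individually rational decision at a time.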

By way of a solution, some people may propose expanding the commons if it is no longer large enough to support existing herds, and paying for the expansion out of tax revenues so users of the commons can continue to obtain its benefits without directly paying for them. Such people believe that because the purpose of the commons is to serve the community’s economy, its size should be tailored to the demands of that economy as it grows.

Others insist that the real problem is not too little grass, but too much demand. They argue that the time has come to “think green” about the future of public commons in the context of the overall environment. People should begin shifting to more sustainable ways of managing their communities so they can phase down grazing and turn the commons into public parks.

Then there are those enamored of the stained-glass verities of undergraduate microeconomic theory. They suggest that the time has come to start charging farmers user fees—so much per hour of grazing time for each cow. In this way, each user will pay for the benefits received from the public facility in accordance with how much they use it.

By using a sensible pricing system to ration the use of this scarce resource, each farmer will be motivated to make the most efficient use of it. Meanwhile, the revenue from user fees can cover the cost of expanding the public commons when necessary, rather than the government taxing everyone to pay for it.
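
Extending the toy simulation above, a per-cow grazing fee changes the private calculus: an extra cow pays off only if its milk is worth more than the fee charged for it. The fee level here is an arbitrary, illustrative number, not a policy recommendation.

# A hypothetical user fee added to the toy commons sketched earlier.
CAPACITY = 100
FEE = 0.9   # charge per cow per grazing season (illustrative)

def milk_per_cow(total_cows):
    return 1.0 if total_cows <= CAPACITY else CAPACITY / total_cows

def worth_adding_a_cow(my_herd, total_cows):
    """An extra cow pays off only if its marginal milk beats the fee charged for it."""
    gain = (my_herd + 1) * milk_per_cow(total_cows + 1) - my_herd * milk_per_cow(total_cows)
    return gain > FEE

print(worth_adding_a_cow(my_herd=8, total_cows=80))     # True: the commons is under capacity
print(worth_adding_a_cow(my_herd=12, total_cows=120))   # False: the fee exceeds the marginal milk

Once the fee exceeds the marginal milk, the privately rational choice and the collectively sensible one finally line up.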

Hardin’s grazing pasture paradigm appears to go a long way towards answering socio-economic questions about the inevitable tendency towards over-use of public goods when they are perceived to be “free”. It explains why this tendency leads to a condition where supply can never really catch up with demand. It describes how the widespread availability of free public goods can significantly influence the underlying economics of many private activities. And it demonstrates the ease with which an entire society can become locked into behavioral patterns that may turn out to be “anti-social” in the long run.

It’s your call. After all, Rorschach tests are not graded.

Anti-Catholic bigotry

When President Trump nominated Seventh Circuit Judge Amy Coney Barrett to the Supreme Court last month, some media outlets and politicians suggested she would bring her Catholic faith onto the bench when deciding matters of law.

The roughly 51 million Catholic adults in the United States are racially and ethnically diverse. Politically, registered Catholic voters are evenly split between those who lean toward the Democratic Party (47 percent) and those who favor the Republicans (46 percent).

For a long time, many Americans have seen Catholics as taking their cues from Rome and not the U.S. Constitution. In the mid-19th century, nativist groups combined to form the Supreme Order of the Star-Spangled Banner, which was obsessed with a hatred of Catholics; its members were ultimately labelled the “Know-Nothings.” Among their demands was a ban on Catholics holding public office, and they feared that the growing Irish population was making the church a force in American government.

For years, American politics remained plagued by widespread anti-Catholic sentiment, especially in the South. The Irish bore the brunt of tensions that sometimes erupted into violence between Catholics and the Protestant majority. It was another instance of where white privilege was not equally distributed.

Today Catholics are fully assimilated into society. They inhabit an increasingly secular world in which theological dictates from the church carry far less weight than in earlier generations. Catholics, like members of any faith, pick and choose which teachings to follow. For instance, many U.S. Catholics want the church to allow priests to marry, allow women to become priests and come down hard on child abuse by priests and the church’s shameful cover-up of it.

The nomination of then-Professor Amy Coney Barrett to the Seventh Circuit Court of Appeals in 2017 stirred up an awakening of anti-Catholicism. California Senator Dianne Feinstein, who has the kind of voice that makes you wish you had a remote control, tried to undermine the candidate’s legitimacy because she was a Catholic. Feinstein, the top Democrat on the Senate Judiciary Committee, told Professor Barrett, “When you read your speeches, the conclusion one draws is that the dogma lives loudly within you. And that’s of concern.”

Senator Feinstein’s brand of bigotry is less like old-fashioned anti-Catholicism and more a complaint that Catholicism refuses to distinguish between public and private moral duties, as when the Little Sisters of the Poor fought the Affordable Care Act’s contraception mandate all the way to the Supreme Court and won, or when the church opposes capital punishment. Of course, there was no mention of how some governors and mayors are keeping houses of worship closed because of the coronavirus while opening schools, businesses, and even athletic events. Hypocrisy is alive and well.

Ironically, it was because of the questioning of Judge Barrett during her previous confirmation hearing three years ago, and the subsequent blowback, that Senate Judiciary Committee members avoided obsessive and nauseating spritzing about Judge Barrett’s Catholicism this time around. Republican senators were smart to cast questions about the judge’s religious beliefs as bigotry, hoping the Democrats would alienate Catholic voters just before the Nov. 3 election.

Democrats avoided the trap. While arguing that a Justice Barrett would jeopardize Roe vs. Wade and the Affordable Care Act, they bent over backward to make clear that they did not oppose the nomination because of her Catholicism. Other senators asked such probing questions as whether she supports white supremacy, whether she has ever committed a sexual assault, who does the laundry in the Barrett household, and whether she plays a musical instrument.

The rest, as they say, is pure commentary.

The rise of the new left

Much has been said and written about our divided society, in which there appears to be more tension than ever. The nation is angry, and America’s polarized discourse leaves many Americans rightfully fearing for the future.

Some claim the contemporary ideology underlying this division derives from cultural Marxism, a contentious term that refers to the strategy propounded by new left-wing theorists in the last century to use the institutions of a society’s culture to bring about revolution.

Cultural Marxism had its roots in the political philosophy propounded by far-left thinkers known as the Frankfurt School. Founded in Germany in 1923, the “Institute for Social Research” was the official name for a group of intellectuals who would play an important role in Europe and the U.S. Among their ideas was to dismantle and undermine the totality of a capitalist society.

Fleeing Hitler in the 1930s, these German academics first set up shop at Columbia University in New York City and then, beginning in 1940, in California. They identified popular culture as wielding a pervasive influence that conditioned the masses into acceptance of capitalist society.

From the 1960s onwards, the strategy was to infiltrate and eventually dominate social and cultural institutions, and thereby achieve cultural hegemony. Rather than the class warfare and the plight of workers, which was the focus of classical Marxist thinkers, they concentrated on areas such as racial, ethnic, and gender warfare, and identity politics.

The Frankfurt School’s new-left intellectuals realized that a Soviet-style revolution was not attractive to democratic Western societies and was unlikely to succeed. Conditions for the working class were improving due to trade union representation and an expanding franchise, among other things. Communism held little appeal to the industrial working class in whose name it had been invented.

Rather than expecting workers to seize control of the levers of political and economic life, they believed the way to bring about revolutionary change was to seed radical ideas within core institutions of society such as the media, arts, and universities.

They understood that culture mass produces consent for the West’s political system, and political revolution would be impossible without a cultural revolution. A successful revolution requires not just seizing political and economic power, but also conquest of the culture, broadly defined as everything from art and entertainment to social and sexual norms. The 1960s radical left-wing German student leader Rudi Dutschke described the strategy of capturing society’s commanding heights as the “long march through the institutions”: a cultural revolution to be achieved by using existing institutions, not overthrowing them.

The outcome of the culture war, like all wars, is wholly uncertain. But what is certain is that the late great Sen. Daniel Patrick Moynihan was right when he said “The central conservative truth is that it is culture, not politics, that determines the success of a society. The central liberal truth is that politics can change a culture and save it from itself.”

In plain terms, if you capture culture, politics will surely follow.

The debt bomb

This year, the federal debt is on track to exceed the size of the entire U.S. economy.

The United States’ debt-to-GDP ratio rose sharply during the Great Recession of 2008-2009 and has continued to rise, reaching 106 percent in 2019. Last year, GDP was $21.4 trillion, but it is expected to shrink this year. U.S. debt is projected to exceed $20 trillion and is growing like kudzu.

While the subject of debt and deficits may be dishwater dull to the average American living unemployment check to unemployment check, consider that the Congressional Budget Office (CBO) has warned that the Social Security Trust Fund will run out of money by 2031. Closely related, Medicare’s hospital insurance trust fund is now on track to run out of money in 2024.

The debt-to-GDP ratio compares a country’s public debt to its gross domestic product. By comparing what a country owes with what it produces, the ratio indicates that country’s ability to pay back its debts.
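
The arithmetic is simple enough to write down. The figures in this sketch are round, illustrative numbers rather than official statistics.

# Debt-to-GDP: what a country owes divided by what it produces, as a percentage.
def debt_to_gdp(debt_trillions, gdp_trillions):
    return debt_trillions / gdp_trillions * 100

# e.g., roughly $21 trillion of debt against a roughly $20 trillion economy
print(f"{debt_to_gdp(21.0, 20.0):.0f} percent of GDP")   # 105 percent of GDP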

Debt is eating away at the American economy like a swarm of termites invisibly consuming a house. The fiscal follies continue, with the only certainty being that the accumulated debt will be passed on to future generations and jeopardize their chance to live a prosperous life.

It may be time for Washington to consider a new financing instrument to address America’s debt bomb so future generations have a chance to enjoy greater prosperity once the pandemic is behind us. The issuance of 100-year Treasury bonds to fund ballooning deficits, with the interest income indexed to the CPI as a hedge against inflation, may be an idea whose time has come. It would give the next generation, which has to pay down the debt, a break by locking in rock-bottom interest rates. These bonds may appeal to long-term investors, such as pension funds and insurers and be used to fund infrastructure projects.
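
To make the idea concrete, here is a rough sketch of how the interest on such a bond might be indexed, modeled loosely on the way Treasury inflation-protected securities adjust principal for inflation. The 1.5 percent coupon and the constant 2 percent inflation path are hypothetical assumptions, not the terms of any actual proposal.

# Hypothetical CPI-indexed century bond: the principal grows with inflation,
# and each year's interest is paid on the adjusted principal.
FACE_VALUE = 1000.0    # dollars
REAL_COUPON = 0.015    # assumed fixed coupon rate, paid annually
ANNUAL_CPI = 0.02      # assumed constant inflation, for illustration only

for year in (1, 10, 50, 100):
    adjusted_principal = FACE_VALUE * (1 + ANNUAL_CPI) ** year
    interest = adjusted_principal * REAL_COUPON
    print(f"year {year:3d}: adjusted principal ${adjusted_principal:,.0f}, interest ${interest:,.2f}")

At a steady 2 percent inflation rate, the adjusted principal on a $1,000 bond grows to a bit over $7,200 by year 100, which is the kind of built-in inflation hedge that appeals to pension funds and insurers.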

Long-term bonds are not unusual. Disney issued 100-year bonds in 1993; Norfolk Southern did so in 2010; and Coca-Cola, IBM, Ford and other companies have done the same, as have Oxford University, Ohio State, Yale and other universities. Fourteen Organization for Economic Co-operation and Development countries have issued debt with maturities ranging from 40 to 100 years. Austria, Belgium, and Ireland have all issued century bonds within the last two years.

With COVID-19 and the economic contraction, the CBO has estimated that the deficit for fiscal year 2020, which ends this month, will exceed $3 trillion. According to the Committee for a Responsible Federal Budget, this amounts to around 18 percent of GDP for the year. As things stand, the federal debt is expected to reach 108 percent of GDP by next year.

To put these figures into perspective, the U.S.’s highest debt-to-GDP ratio was 112 percent at the end of World War II. The war was financed with a combination of roughly 40 percent taxes and 60 percent debt.

If the great and the good in Washington don’t address how to reduce the deficit-to-GDP ratio and find a fiscally sustainable path after COVID-19, large debt burdens will slow economic growth, raise interest rates, and cause interest on the debt to consume an ever-larger proportion of the federal budget, crowding out spending on other priorities. But there is a trust deficit when it comes to the faith sentient Americans have in Washington’s ability to deal with the issue intelligently.

The only approach politicians can agree on to manage the debt and deficits is to steal from future generations by passing the accumulated debt burden on to them. So much for intergenerational fairness. As Admiral Mike Mullen, the former chairman of the Joint Chiefs of Staff, said: “Our national debt is our biggest national security threat.”

Extraordinary situations call for extraordinary measures and the issuance of 100-year bonds might be one way to deal with intergenerational equity.

Financialization of the economy

Financialization refers to the increase in size and importance of the financial sector relative to the overall U.S. economy. Simply put, it is the wonky term used to describe the growing scale, profitability, and influence of the financial sector over the rest of the economy. Combine it with deregulation, less antitrust enforcement, and easy monetary policy from the 1980s onward and you get financial institutions that were too big and too speculative in the years leading up to the financial crisis in 2008.

Today, Wall Street buccaneers don’t just exert great influence over the economy; they are also a major force in politics and government policy. The financial industry spends millions annually in Washington promoting the Panglossian view that the financial markets promote economic growth and contribute to economic well-being. It would be more accurate to say they contribute to economic inequality and the decline of U.S. manufacturing.

According to data from the Center for Responsive Politics, seven banks spent over $13 million on campaign contributions in the 2018 election cycle and over $38 million on lobbying during the 2017-2018 Congress. Not surprisingly, the top five campaign donors were Bank of America, Goldman Sachs, Morgan Stanley, JPMorgan Chase, and Citigroup.

Any wonder why the Washington crowd favors Wall Street over Main Street? Only the health care industry spends more.

For many Americans, the stock market acts as a barometer for the economy. U.S. financial markets are the largest and most liquid in the world. In 2018, the finance and insurance industries (excluding real estate) represented 7.4 percent or $1.5 trillion of the U.S. gross domestic product. In 1970 the finance and insurance industries accounted for 4.2 percent of GDP, up from 2.8 percent of GDP in 1950. In contrast, manufacturing fell from 30 percent of GDP in 1950 to 11 percent in 2019.

Prior to COVID-19, finance and insurance industry profits were equal to a quarter of the profits of all other sectors combined, even though the industry accounted for just 4 percent of jobs. These data are evidence of the industry’s growing weight within the American economy.

The figures do not reflect the extent to which non-financial firms derive revenues from financial activities, as opposed to productive investments in real assets. For instance, prior to the 2008 market crash and meltdown, GE Capital generated about half of General Electric’s total earnings. GE became an example of the financialization of American business. In the years leading up to the financial crisis, it became one of the world’s largest non-bank financial services companies, meaning it avoided the level of regulatory scrutiny that Wall Street banks face. After it crashed and burned in 2008, GE Capital got a whopping $139 billion taxpayer bailout.

Another example of corporate America moving to the rhythm of Wall Street is Boeing’s 787 Dreamliner aircraft, which famously encountered delays and massive cost overruns because of an incredibly complex supply chain that outsourced 70 percent of the airplane’s component parts to multiple tiers of suppliers scattered around the world. The Dreamliner supply chain reflected the pressure to maximize return on net assets and was consistent with Wall Street’s approach.

Return on net assets is a key measure financial analysts use to evaluate how effectively management is deploying assets. The goal is to make the most money with the fewest possible assets. In the end, the Dreamliner became an embarrassing failure that cost billions more than it should have. In such instances, financialization reduces the dependence of corporate America on domestic workforces, which leads to offshoring manufacturing jobs.
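
A back-of-the-envelope example, with invented figures and using one common definition of net assets (fixed assets plus net working capital), shows why the measure pushes companies toward outsourcing: profit is unchanged, but moving plants and inventory off the balance sheet shrinks the denominator and flatters the ratio.

# Return on net assets with hypothetical figures, in billions of dollars.
def rona(net_income, fixed_assets, net_working_capital):
    """Profit divided by the assets used to earn it."""
    return net_income / (fixed_assets + net_working_capital)

in_house = rona(net_income=1.0, fixed_assets=8.0, net_working_capital=2.0)     # plants and inventory kept
outsourced = rona(net_income=1.0, fixed_assets=3.0, net_working_capital=1.0)   # plants and inventory shed
print(f"in-house: {in_house:.0%}, outsourced: {outsourced:.0%}")   # 10% versus 25%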

The financial sector has amassed great power since the 1980s and contributed to the decline of U.S. manufacturing as well as income and wealth inequality. As Supreme Court Justice Brandeis allegedly said in 1941 with great foresight: “We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can’t have both.”

Executive compensation and economic inequality

Oceans of ink have been consumed writing about the subject of widening economic inequality, declining social mobility, and a shrinking middle class in the United States over the last 40 years. More recently, the subject has emerged as a social and political flash point.

The most commonly cited reasons for this phenomenon are globalization and technology adoption. Improvements in technology, such as more powerful computers and industrial robots, increase the incentive to substitute capital for labor. Increased trade competition from imports made in lower-cost countries, and the threat of exporting jobs to those countries, put pressure on wages and employment. Others point to excessive monopoly power, market consolidation and the hollowing out of labor unions.

For the ordinary working-class American there is plenty to be mad about. While wage growth has remained relatively stagnant for decades, an Economic Policy Institute study reports that extravagant chief executive officer (CEO) pay is a major driver of rising inequality, fueling the growth of top 1 percent and top 0.1 percent incomes.

The report found that the CEOs of the top 350 U.S. companies by sales raked in an average of $21.3 million last year, up from about $18.7 million in 2018. This means that the average CEO made 320 times as much as the average worker earned in wages and benefits. CEO pay went crazy in the 1990s: in 1976 it was 36 times what an average worker earned, 61 times in 1989, and 131 times in 1993.

The authors of the report argue that this “growing earning power at the top has been driving the growth of inequality in our country.” The report attributes the increase to the rapid growth in vested stock awards and exercised stock options tied to stock market growth. Stock-based compensation accounted for about three-fourths of the median CEO’s compensation.

The rise of executive compensation practices linked to stock prices has been the mantra of America, Inc. over the past several decades. In 1982, the Securities and Exchange Commission adopted Rule 10b-18, allowing companies to buy back their own stock without being charged with stock manipulation. Starting in the 1990s many companies introduced stock option grants as a major component of executive compensation. The idea was to better align management interests with those of shareholders. A small circle of highly influential pay consultants, academics, and activist shareholders argued that American firms must pay top dollar for top candidates because they compete in a global market for talent.

While beneficial in some ways, this new form of compensation also created problems quite apart from resentment and lower morale among rank-and-file workers. For example, the incentive for executives to manage earnings through any means, fair or foul, and to focus on the short-term earnings game became strong. Making matters worse, a favorite trick of corporate America is to use stock buybacks to manipulate its companies’ stock prices. By increasing demand for a company’s shares, open-market buybacks lift the stock price and help the company hit quarterly earnings targets. It makes sense: stock buybacks enrich investors, including company executives who receive most of their compensation in company stock.
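
The mechanics are easy to illustrate with hypothetical numbers: a buyback leaves profit untouched but shrinks the share count, so earnings per share rise even though the underlying business has not improved.

# Hypothetical buyback arithmetic.
net_income = 1_000_000_000          # $1 billion of profit, before and after the buyback
shares_before = 500_000_000
shares_repurchased = 50_000_000     # 10 percent of the shares bought back

eps_before = net_income / shares_before
eps_after = net_income / (shares_before - shares_repurchased)
print(f"EPS before buyback: ${eps_before:.2f}")   # $2.00
print(f"EPS after buyback:  ${eps_after:.2f}")    # $2.22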

There are many ideas for addressing extravagant executive compensation, ranging from higher marginal income tax rates for those at the top, to banning stock buybacks, to greater use of “say on pay,” which allows a firm’s shareholders to express dissatisfaction with excessive pay.

While ideas have influence, they are rarely implemented on the strength of their force alone. Instead, there has to be a confluence between the ideas themselves, the zeitgeist of the times, and the interests of “the great and the good” who find the ideas congenial. The pandemic may serve as a wake-up call for boards of directors and institutional investors to circumscribe executive pay.

Closing the carried interest tax preference

Those who can often be found at the very top of the earnings scale – people who manage private investment funds such as hedge funds or private equity and venture funds – enjoy a tax loophole that allows the money they make by investing money for others (their “carried interest”) to be taxed as capital gains rather than earned income, even though they earn the money from work, not as a return on investing their own money.

In plain terms, they reap a benefit even though they don’t put their own capital at risk. It’s a loophole that allows the rich to get richer, and it is why some of the wealthiest Americans pay lower tax rates than their secretaries; its demise is long overdue. Defenders of the loophole argue that taxing those who run these funds at the same rate everyone else pays on earned income would drive away trillions of investment dollars.

These are the same folks, the 1-percenters, who can indulge in any of the 40 items on the Forbes Cost of Living Extremely Well Index (CLEWI). The list, which should not be shared with progressive friends, includes such items as a Learjet, 45 minutes with a shrink on the Upper East Side of Manhattan, Russian sable fur coats, a Har-Tru crushed-stone tennis court and more. Forbes says the CLEWI is to the very rich what the CPI is to “ordinary people.”

The term carried interest goes back to medieval merchants in Genoa, Pisa, Florence, and Venice. These traders carried cargo on their ships belonging to other people and earned 20 percent of the ultimate profits on the “carried product.”

Today, those who manage investments in private equity funds are typically compensated in two ways: with a 2 percent fee on funds under management and a 20 percent cut of the gains they produce for investors. The 20 percent in profits these managers pocket, known as carried interest, is currently treated as a long-term capital gain and taxed at 23.8 percent: the capital gains rate of 20 percent plus the Obamacare surcharge of 3.8 percent on their income. The 2 percent management fee is taxed at the higher ordinary income tax rate.
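
A worked example, using a hypothetical fund, shows what is at stake. The tax rates are the ones cited above: 23.8 percent on carried interest versus a 37 percent top ordinary-income rate.

# Hypothetical "2 and 20" fund: $1 billion under management, $200 million in gains.
fund_size = 1_000_000_000
fund_gains = 200_000_000

management_fee = 0.02 * fund_size       # taxed as ordinary income
carried_interest = 0.20 * fund_gains    # currently taxed as long-term capital gains

tax_as_capital_gain = carried_interest * 0.238
tax_as_ordinary_income = carried_interest * 0.37

print(f"carried interest:         ${carried_interest:,.0f}")        # $40,000,000
print(f"tax at 23.8 percent:      ${tax_as_capital_gain:,.0f}")     # $9,520,000
print(f"tax at 37 percent:        ${tax_as_ordinary_income:,.0f}")  # $14,800,000
print(f"saving from the loophole: ${tax_as_ordinary_income - tax_as_capital_gain:,.0f}")  # $5,280,000

On this single hypothetical fund, the preference is worth more than $5 million a year to its managers.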

Presumptive Democratic presidential nominee Joe Biden has put forward an economic policy platform under which he would repeal many of the tax cuts that went into effect on Jan. 1, 2018. The proposals include increasing the federal corporate tax rate from 21 percent to 28 percent; restoring the top individual tax rate to 39.6 percent for taxable incomes above $400,000, up from the current 37 percent; taxing capital gains as ordinary income for individuals and couples with over $1 million in annual income; and increasing the Social Security earnings cap by applying the payroll tax of 12.4 percent to earnings above $400,000.

While these sweeping tax proposals do not specifically address carried interest, it might reasonably be inferred that carried interest would be taxed at ordinary income rates. In the past, Biden has said he’d like to eliminate the carried interest giveaway. Both Presidents Obama and Trump campaigned on closing the carried interest dodge, yet it’s still there. Their proposals to abolish the carried interest preference were met with pregnant and deadening silence in Congress.

Eliminating the carried interest provision that allows fund managers to get away with bargain-basement tax rates should be low-hanging fruit given the inequality of wealth and income in the United States. Yet despite its unpopularity, this is the tax break that just won’t die. Well-connected lobbyists and trade groups for private equity, hedge funds, and others have mobilized their resources and fought successfully to keep carried interest as is. The nine lives of carried interest are more evidence, if any more were needed, that big money gets its way in Congress. Here’s hoping that the idea of closing the carried interest loophole will gain traction, but for sure it’s a long shot.

Models aren’t crystal balls

Every day, while folks are stuck at home, politicians, public health officials, and slick talking heads point to charts showing the latest statistics on the coronavirus pandemic as they attempt to predict what might happen next in your neck of the woods. Underlying these graphics are various forecasting models, which you should approach with a healthy dose of skepticism.

It is tempting to view the models as oracles that will help predict how the disease will spread, tell you what to do and when to do it. But these models are simplified versions of reality. Reality is reality. Models should be read with the greatest care. They are not a substitute for controlled scientific experiments that generate relevant data.

Models certainly provide information that can create a framework for understanding a situation. But models, including those used to predict COVID-19′s trajectory, aren’t crystal balls. A model is simply a tool. It consists of raw data, along with assumptions based on our best guesses at the time, that together shape an overall forecast.

A model is only as good as its underlying data, which is in short supply. For example, there is still plenty of uncertainty about how many COVID-19 deaths may occur over the next six months under various social distancing and mask wearing scenarios. Also, a model’s accuracy is constrained by uncertainty about how many people are or have been infected.

Assumptions aren’t facts. Put another way, models are constrained by what is known and what is assumed. Understanding these underlying assumptions helps explain why some forecasts have a sunny disposition, while others can’t be pessimistic enough.
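
A stripped-down sketch makes the point. This is not any real epidemiological model; the starting cases and spread rates below are pure assumptions, which is exactly why the two forecasts diverge so wildly.

# The same data (1,000 starting cases), two different assumptions about spread.
def projected_cases(r_assumed, generations=10, initial_cases=1000):
    """Cumulative cases if each case produces r_assumed new ones per generation."""
    cases, total = initial_cases, initial_cases
    for _ in range(generations):
        cases = cases * r_assumed
        total += cases
    return round(total)

print(projected_cases(r_assumed=0.9))   # roughly 6,900 cases: the sunny forecast
print(projected_cases(r_assumed=1.4))   # roughly 98,700 cases: can't be pessimistic enough

Change one assumption and the forecast moves by an order of magnitude; the underlying data never changed.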

There are also economic models. Financial mavens develop them to take stock of how the pandemic has impacted the economy and where they see it and markets heading. With so many countries experiencing sharp declines in gross domestic product, there is a lot of forecasting about what shape the recovery will take. Will there be a quick V-shaped recovery or will it be U-shaped? Or maybe a little bit of both?

These models also have their limitations. Recall how Long-Term Capital Management, an industry-leading hedge fund run by a renowned team of mathematical experts that included two Nobel Prize winners, developed complex quantitative models to analyze markets and placed huge bets on the assumption, among others, that Russia would never default on its bonds. They did a lousy job of stress testing their assumptions and they bet wrong. In September 1998, the firm had to be bailed out by a consortium of Wall Street banks to prevent the bottom dropping out of the financial system.

This episode was a coming attraction for the harrowing financial crisis a decade later in September 2008, which was perhaps the biggest event of the 21st century until COVID-19. Prior to the 2008 crisis, a key assumption in many models was that housing prices would always go up. Indeed, one cause of the meltdown was the quant movement: the proliferation of quantitative models for designing and analyzing financial products as well as for risk management. Many finance professionals mistakenly believed that quantitative tools had allowed them to conquer risk. Products such as derivatives, subprime mortgage-backed securities and activities that relied heavily on quantitative models were at the heart of how financial firms expanded their activities to take more and greater risks.

And of course, with the presidential election just months away, Americans still remember how 2016 election models forecast Hillary Clinton waltzing into the White House. Between now and Nov. 3, many people will take election forecasts with an extra grain of salt or three.

Given the events of the last several months, people should keep a simple fact in mind: Models should not be asked to carry any more than they can bear. So when you hear about models, put on your hmmm face.