Cancel Culture and the Chinese Cultural Revolution

It’s not news that Americans live in a new Age of Magical Thinking. The Enlightenment is seen as the start of hate speech, feelings must always overrule facts, and transubstantiation has taken on a whole new meaning.  Men can become women simply by wishing it so.

Over the last several years, much ink has been spilled about whether there are similarities between cancel culture of the 21st century, particularly in Anglosphere countries, and China’s Great Proletarian Cultural Revolution.  Pundits warn of the dangerous implications of cancel culture.

Both social media and real-life mobs target people who dissent, aiming to ruin their reputations and sometimes getting them fired, all while toppling statues of the Founding Fathers and looting in the name of social justice.

Contemporary events come nowhere near the scale of violence and repression associated with the Cultural Revolution.  Thankfully, social ostracism and unemployment are not the same as firing squads and gulags, but they are still harmful, especially to those committed to free speech.

Ordinary Americans live in an age when they witness “high-tech lynching,” to borrow a phrase coined in 1991 by then-Supreme Court nominee Clarence Thomas. The core features are public smears and ridicule, along with a moralistic mob forcing victims to publicly recant their sins.

Between 1949 and his death in 1976, dictator Mao Zedong directed a radical transformation of China.  He grew increasingly suspicious of government apparatchiks and Chinese Communist Party intellectuals, leading him in 1966 to launch a stunning attack on the establishment in the form of a “Cultural Revolution.”

He encouraged youthful Red Guards, his shock troops, to destroy the “four olds” (old ideas, old culture, old customs, and old habits).  In practice, this meant widespread beating, denunciations, and mob-instigated “trials.”  Red Guards roamed the country attacking establishment elites, including government officials, managers, intellectuals, and former members of the bourgeois class. The goal was to purge the country of anyone who was insufficiently leftist.

Today, America and other Anglosphere countries are going through an admittedly more genteel cultural revolution of activists preoccupied with identity politics and cancel culture preaching the same old shibboleths.  As under Mao, people suffer disproportionate consequences for small ideological heresies.

Cancel culture involves public shaming, boycotts, online harassment, and calls for removing people from positions of influence due to perceived offensive comments or behavior.  It can lead to reputational damage, loss of employment opportunities and social isolation without due process.  Cancelling people who disagree with you is straight out of the playbook of dictators and cults.

For example, when former NFL quarterback Drew Brees stated he could “never agree with anybody disrespecting the flag of the United States of America,” citing his grandfathers’ military service, he was accused of violating contemporary social justice dogmas. Acquiescing to the pressure less than 24 hours later, Brees issued an apology on Instagram. He soon followed up with another apology.  Then his wife apologized.  Any wonder why public figures spend their days walking on eggshells?

It remains to be seen where America goes next in its nascent cultural revolution. Where this trend goes and how long it lasts will ultimately depend on whether Americans stand up for their convictions or cave before online mobs. Maybe nothing permanent will come of it, despite the best efforts of today’s Red Guards. It may well turn out that the worst harm from the legitimization of censorship and cancel culture befalls those on the right or the left who wield these weapons.

We would do well to remember the words of John Stuart Mill:

“He who knows only his own side of the case knows little of that. His reasons may be good, and no one may have been able to refute them. But if he is equally unable to refute the reasons on the opposite side, if he does not so much as know what they are, he has no ground for preferring either opinion.”

Prime Minister Trudeau went too far in dealing with Canada’s ‘Freedom Convoy’

The “Freedom Convoy” of trucks that converged in Ottawa on Jan. 28 began in response to the Canadian government’s requirement that Canadian truck drivers crossing the U.S. border be fully vaccinated to avoid testing and quarantine requirements upon their return. Then it evolved into a protest against all public health measures aimed at fighting the COVID-19 pandemic.

Organizers said they would not end their protest until all pandemic-related public health measures were dropped.

After three weeks of protests, Prime Minister Justin Trudeau invoked the Emergencies Act to deal with the blockades. It was the first time the law had ever been used, and it was invoked even though there were plenty of other laws on the books to deal with peaceful protests. It was a classic example of using a machete when a scalpel would have worked just fine.

The Act gave the Canadian government broad powers to restore order, ranging from placing significant limits on peaceful assembly, to prohibiting travel, to requiring financial institutions to turn over personal financial information to the Canadian Security Intelligence Service and freezing the bank accounts of protestors and anyone who helped them.

The Act also gave the government broad authority over businesses, such as dragooning private tow truck companies to provide services against their will. Insurance companies were required to revoke insurance on any vehicles used in blockades.

The Emergencies Act is only supposed to be invoked in a genuine crisis, such as in wartime. The War Measures Act, its predecessor, was last invoked under the current prime minister’s father, Pierre Trudeau, in response to the 1970 October Crisis, when a group of militant separatists who wanted to create an independent socialist Quebec engaged in numerous bombings and kidnapped and murdered a cabinet minister.

There is a very real difference between invoking a law against violent terrorists and using it to combat a largely peaceful protest by Canadian citizens tired of COVID-19 restrictions and lockdowns.

Riot gear-clad Ottawa police, with provincial and federal help, towed dozens of vehicles that were blocking Ottawa’s downtown streets, used pepper spray and stun grenades to remove demonstrators, and retook control of the area around the Parliament buildings. Ottawa’s streets are now back to normal; there is only snow and silence in the country’s capital.

All this could have been done under existing law. As Alberta Premier Jason Kenney put it, “We have all the legal tools and operational resources required to maintain order.” Put simply, the prime minister could have restored and maintained public order without marginalizing substantial segments of the population.

Trudeau, born and bred a member of the elite, first described the truckers as a fringe minority who held “unacceptable” racist and misogynist views. He refused to meet the protesters or negotiate with them, and he was not interested in hearing about the mandates’ impact on their lives. Many of these truckers had spent the last two years keeping the supply chain running.

Instead of finding ways to defuse the situation, Mr. Trudeau issued the emergency order, which he called a “last resort.” After a conservative member of Parliament and descendant of Holocaust survivors asked him tough questions about his handling of the truckers’ protest, Trudeau denounced conservatives who “stand with people who wave swastikas and Confederate flags.” These comments came from someone who spent his youth wearing blackface.

The role of government is to maintain public order while respecting civil liberties, including the right to peaceful assembly. Many protests are disruptive and often unlawful, so it is reasonable to impose limits on the right to assemble.

But a real leader and statesperson would have gone to the protesters and said: “I’m here. What do you want to say?” Seeking out and meeting with protesters and pursuing dialogue is a far more strategic way to restore the rule of law than imposing martial law.

The First Amendment and free speech

While many national constitutions come and go every few decades, the U.S. Constitution has served the purpose for which it was intended for more than two centuries. The United States is proud of its tradition of freedom of speech that was established in the First Amendment to the Constitution.

It allows for public criticism of the government. Without it, such behavior could land you in prison – just ask Russian opposition leader Alexei Navalny. Still, there have been many times in American history when this principle was betrayed.

For example, some of the same people who ratified the Bill of Rights voted in Congress in 1798, during the presidency of John Adams, to pass the Alien and Sedition Acts that made it a crime to utter “false, scandalous, or malicious” speech against the government or the president.

The first 10 amendments to the constitution are known as the Bill of Rights.  They were proposed by Congress in September 1789 and ratified by the states in December 1791.

Freedom of speech isn’t the only freedom protected by the First Amendment. It reads: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Freedom of speech is considered a fundamental bedrock of liberty, allowing citizens to express their ideas and bring about changes that reflect the needs of the people. It gives voice to conflicting or dissenting opinions, promoting the healthy debate that moves society closer to realizing America’s founding ideals.

The Civil Rights Movement is a perfect example of free speech in action. During the 1950s and 1960s, activists such as Dr. Martin Luther King, Jr. used free speech as a tool to force change in society. By exercising their voice, these activists helped secure laws outlawing the racial discrimination that plagued the country.

But freedom of speech is not an unlimited right. The First Amendment only protects individuals’ speech from U.S. governmental oppression, control, and censorship; it does not extend to private entities. Companies have significant leeway to set their own standards and policies regarding employee conduct.

There is nothing illegal about a private firm censoring people on its platform. For example, Facebook banning former President Trump indefinitely from its platform and Twitter permanently banning him were within the companies’ legal rights in the aftermath of the Capitol incursion on January 6.

The nation has long grappled with which types of speech should be protected and which should not.  Interpreting the broad guarantees of free speech in the First Amendment has not been an easy task.  Over time, the Supreme Court has spilled barrels of ink defining the freedom of speech.  It has upheld people’s right to critique the government and hold political protests, but hasn’t extended protection to those who incite action that might cause harm.

But what constitutes harm is still a matter of debate. For some, it is limited to physical harm, as in the case of falsely shouting “fire” in a crowded movie theater. For others, harm encompasses a compromise to the dignity of others, as in the case of hate speech. Another recent argument is that free speech should be curtailed if it causes offense or makes the listener feel disrespected. This argument may set a lower bar for limiting free speech. But that is a story for another day.

In today’s politically charged climate, some people believe government should restrict certain speech.  But thankfully, the First Amendment protects everything from car commercials to fiery protests.

While it may be unfashionable to quote America’s first President, it merits recalling what he said about free speech: “If freedom of speech is taken away, then dumb and silent we may be led, like sheep to the slaughter.”

Naturally, everyone has their own interpretation of those comments.

Managing the demographic risk of an aging population

One trend that has been largely overlooked by the movers and shakers is our aging population. It is one of the forces that will shape society and the global economy over the next decades and governments need to adjust their policies accordingly.

Around the world, workforces are steadily aging. Among the key drivers of a rapidly aging population are declining fertility rates and increased longevity, itself the product of falling mortality rates. For example, retiring baby boomers in the United States will live longer, but there will not be enough new births to offset the surge in the ranks of the elderly.

The world’s fertility rate fell from five children per woman in 1950 to roughly 2.5 today and is projected to drop to two by 2050. This decline has been the result of such factors as the rising social status of women and their increased participation in the workforce, widespread availability of birth control, and the increasing costs of raising children.

On the other hand, global life expectancy has increased from 50.09 years in 1960 to 72.6 years in 2019 and is expected to rise to 75 years by 2050. In the United States, life expectancy is projected to increase by about six years from 79.7 in 2017 to 85.6 in 2060. By 2035, there will be more people in the U.S. aged 65 and over than there will be children under 18, according to the Gerontology Institute at the McCormack School of Policy and Global Studies at UMass Boston.

The reasons for increased longevity include advances in health care, increased emphasis on personal and social hygiene, and increased government programs for the elderly.

In the developed world, the ratio of dependents to workers is rising sharply as baby boomers retire. Retirees are not only living longer but are increasingly prone to dementia at older ages. As the CEO of Dana-Farber Harvard Cancer Center said, one out of three people who reach 85 will have Alzheimer’s. This is a group largely dependent on others to help with daily living. As the need for caregivers intensifies, there will be fewer workers available for other work.

A rising dependency ratio is inflationary because dependents consume but do not produce. The growth in retirees may trigger a vicious cycle of slower economic growth and higher taxes. Going forward, policy makers should prepare for a progressive decline in the size of the labor force.

With fewer people producing goods and services and significantly more non-working people consuming them, global supply will tend to lag demand. Combined with a greater bargaining power of the workforce in wage negotiations, this may increase inflation.

Meanwhile, workers are likely to consume more as a labor shortage pushes up wages. Investment will rise in advanced countries as companies substitute capital for more expensive labor. Rising wages may also narrow the galloping inequality gap.

Despite these facts, many business leaders and policymakers don’t have a good grasp of the realities of an aging population and the economic challenge it will pose. Aging populations increase the financial burden on governments, creating a pension time bomb, and increasing demands on health care and elderly care systems.

But these outcomes are not inevitable. Greater longevity presents individuals, employers, and policy makers with opportunities to help the elderly live more purposeful lives. Policy makers should take steps to harness the productive potential of older people, for example by promoting an education policy that includes a strategy for supporting lifetime skill formation.

The famous maxim that “demography is destiny” may or may not be attributable to Auguste Comte, the 19th century French sociologist. What is certain is that Comte wrote early and influentially about how population trends could determine the future of a country.

But destiny is not immune to change. Just as societies must adjust their lifestyles to adapt to climate change, societies with aging populations must adjust their policies to promote economic growth.

Demystifying the rule of law

America’s constitutional order is under great stress and foundational principles such as free speech and the rule of law are under attack. The breakdown in respect for American institutions has helped instigate a season of violence and unrest.

The rule of law (ROL) is an expression most Americans are familiar with. It is a popular but vague term often used in political and economic contexts. Americans routinely hear politicians, judges, legislators and prosecutors mention the ROL right up there with freedom and democracy.

Few have paused to say what they actually mean by it. The concept is defined in many ways. For starters, the ROL is an ideal: a standard, a criterion to measure against. It is another way of saying that laws as written are applied equally to everyone. The ROL in its most basic form is captured in the popular quote “no one is above the law.”

It also means that laws should govern a nation and its citizens, as opposed to power resting with a few individuals. In theory, the law of the land is owned by all, made and enforced by representatives of the people.

The notion of the ROL comes with a host of related concepts: the law should be clear, known, and enforced; people are presumed innocent until proven otherwise; and the police cannot arbitrarily arrest or detain people without good reason. Laws are interpreted by an independent judiciary, which provides for the peaceful settlement of disputes.

The ROL requires that the law be enforced equally. The most marginalized people in our society are entitled to be treated exactly the same way as anyone else. It also requires that laws not discriminate against people on arbitrary grounds such as the color of their skin, their nationality, or their gender.

The concept of the ROL dates back thousands of years. For example, the ancient Greeks established democratic law courts in the 5th and 4th centuries BC, with juries that had hundreds of members. At Runnymede in 1215, King John and his barons agreed to Magna Carta (Latin for Great Charter).

One might argue that the exalted Magna Carta was the beginning point of English-speaking peoples’ understanding of the ROL. It was a document in which, for the first time, monarchs and government leaders agreed to subject themselves to the law and recognized that people were entitled to equality before the law and to trial by jury. The immediate practical consequence of Magna Carta was the establishment of an assembly to hold the monarchy to its side of the bargain. These were momentous new concepts.

In the U.S., the most visible symbol of the ROL is the Constitution, which was drafted by a special convention in Philadelphia in 1787. It is the framework for effective and limited government and the supreme law of the land. A congressman once delivered one of the truest statements of American political theory: “There is a straight road which runs from Runnymede to Philadelphia.”

The American effort to make good on the promise of the ROL has been difficult and sometimes bloody.  There is no getting around it – America has struggled to create a legal system that is fair to all its people.

The most glaring example is that the U.S. Constitution did not address the problem of slavery, despite the words in the Declaration of Independence that “all men are created equal.” This was the great flaw in American constitutional history.

America and other countries subscribing to the notion of the rule of law have considerable hard work to do to negotiate the distance between the ideal and the reality on the ground.

The rise of the new left

Much has been said and written about our divided society, in which there appears to be more tension than ever. The nation is angry, and America’s polarized discourse leaves many Americans rightfully fearing for the future.

Some claim the contemporary ideology underlying this division derives from cultural Marxism, a contentious term that refers to the strategy propounded by new left-wing theorists in the last century to use the institutions of a society’s culture to bring about revolution.

Cultural Marxism had its roots in the political philosophy propounded by far-left thinkers known as the Frankfurt School. Founded in Germany in 1923, the “Institute for Social Research” was the official name for a group of intellectuals who would play an important role in Europe and the U.S. Among their aims was to dismantle and undermine the totality of capitalist society.

Fleeing Hitler in the 1930s, these German academics first set up shop at Columbia University in New York City and then, beginning in 1940, in California. They identified popular culture as wielding a pervasive influence that conditioned the masses into acceptance of capitalist society.

From the 1960s onwards, the strategy was to infiltrate and eventually dominate social and cultural institutions, and thereby achieve cultural hegemony. Rather than the class warfare and the plight of workers that were the focus of classical Marxist thinkers, they concentrated on areas such as racial, ethnic, and gender warfare, and identity politics.

The Frankfurt School’s new-left intellectuals realized that a Soviet-style revolution was not attractive to democratic Western societies and was unlikely to succeed. Conditions for the working class were improving due to trade union representation and an expanding franchise, among other things. Communism held little appeal to the industrial working class in whose name it had been invented.

Rather than expecting workers to seize control of the levers of political and economic life, they believed the way to bring about revolutionary change was to seed radical ideas within core institutions of society such as the media, arts, and universities.

They understood that culture mass produces consent for the West’s political system, and political revolution would be impossible without a cultural revolution. A successful revolution requires not just seizing political and economic power, but also conquest of the culture, broadly defined as everything from art and entertainment to social and sexual norms. The 1960s radical left-wing German student leader Rudi Dutschke described the strategy of capturing society’s commanding heights as the “long march through the institutions.” The cultural revolution was to be achieved by using existing institutions, not by overthrowing them.

The outcome of the culture war, like all wars, is wholly uncertain. But what is certain is that the late great Sen. Daniel Patrick Moynihan was right when he said “The central conservative truth is that it is culture, not politics, that determines the success of a society. The central liberal truth is that politics can change a culture and save it from itself.”

In plain terms, if you capture culture, politics will surely follow.

Let them eat credit

The Federal Reserve cut interest rates by a quarter of a point on July 31, the first reduction in more than a decade. The 25-basis-point cut was seen as an effort to stimulate the economy and counteract the escalating tit-for-tat trade war with China, widely viewed as impeding global growth.

The Federal Reserve, the world’s most powerful central bank, is again bearing the burden of keeping the economy growing and minimizing financial instability. What’s more, it is pursuing pro-growth policies without any fiscal policy support from elected officials.

Just last month, President Trump reached a bipartisan two-year budget agreement with Democratic leadership in the House of Representatives and Republican leaders in the U.S. Senate that raises discretionary spending caps by $320 billion and suspends the debt ceiling until July 31, 2021. The legislation will add $1.7 trillion to projected debt. It is a really bad idea to assume the future will look after itself. The good news, supposedly, is that the two parties got along well enough to pass the legislation. If you believe that counts as good news, you can’t be helped.

This budget deal avoids the risk of another partial federal government shutdown and a potentially catastrophic default on the nation’s debt. The Republicans voting for it touted the increase in military spending while the Democrats talked up the additional domestic spending it includes. The federal debt has grown from about $19 trillion in January 2017 to more than $22 trillion now. Fear of debt and its potentially dangerous implications is nowhere to be found in 2019.

But it’s not just the ruling class in Washington that has become addicted to debt; the whole country is waist deep in it. Taken together, all segments of U.S. debt – federal, state, local, corporate, and household – are at 350 percent of the gross domestic product. American household debt continues to climb to record levels, reaching $13.54 trillion in the fourth quarter of 2018, $869 billion above 2008’s $12.68 trillion peak, according to the Federal Reserve Bank of New York.

The Federal Reserve also claims to be tweaking the benchmark federal funds rate because it is worried that inflation is running below its target of 2 percent. According to the Fed, prices rose just 1.6 percent in the year through June, not counting volatile food and fuel prices.

Inflation as defined and measured by the Fed may be running pretty low right now, but bear in mind that the typical family’s living costs may be nothing like the official stats. It is also fair to say that Americans want a bigger paycheck, not higher prices resulting from a 2 percent inflation target. On a daily basis they experience the Dickensian consequences of several decades of stagnant wages.

Don’t forget that even modest inflation for a prolonged period can seriously erode purchasing power. For instance, inflation averaged 2.46 percent annually between 1990 and 2018. Sounds low, but you would need just about $2,000 today to buy what $1,000 would have bought in 1990. You don’t have to be a socialist or an economist to understand that despite the strong labor market, today’s wages provide about the same purchasing power as years ago – if you are lucky.
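The compounding behind that figure is easy to verify. A minimal sketch of the arithmetic, using only the rate and time span cited above (the function name is illustrative, not from any source):

```python
# Check the purchasing-power claim: grow $1,000 at the cited average
# inflation rate of 2.46% per year, compounded annually, from 1990 to 2018.
def inflated_cost(base_amount: float, annual_rate: float, years: int) -> float:
    """Return the amount needed after `years` of compound inflation."""
    return base_amount * (1 + annual_rate) ** years

needed = inflated_cost(1_000, 0.0246, 2018 - 1990)
print(f"${needed:,.0f}")  # roughly $1,975, i.e. "just about $2,000"
```

Even a "low" rate nearly doubles the cost of living over 28 years, which is the point of the example.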

To compensate, households turn to debt. The average American now has about $38,000 in personal debt, excluding home mortgages. The average household carries about $137,000 in debt and the median household income is about $59,000. So when the cost of living rises faster than household income, more Americans use credit to cover basic needs like food, clothing, and medical expenses. When wage growth does not keep up with the cost of living, government promotes cheap credit to grease the economic wheel, especially in an on-demand society that values the immediate over the remote.

Put differently, U.S. economic policy has for decades been, to paraphrase the misquoted Marie Antoinette, “Let them eat credit.”

Demography is destiny

The world is undergoing a dramatic transition due to the confluence of disruptive forces such as accelerating technological change and globalization. But another important factor that often gets overlooked will shape society and the global economy over the coming decades: The life expectancy of humans is increasing. Fertility rates are falling, and the world’s population is growing gray.

This unprecedented demographic shift has major implications for U.S. fiscal policy. Entitlement programs will be increasingly strapped as the number of beneficiaries increases and the number of working people who pay for the benefits shrinks.

Due to advances in medical science and technology, people – especially the well to do – expect to live longer, better lives than they might have imagined even three decades ago. According to the Census Bureau, the average American born today can expect to live to about 80, up dramatically from the average of 68 in 1950.

Additionally, the Census Bureau notes that whereas the average American woman in 1950 had 3.5 children during her lifetime, the figure today has fallen below two. The causes of declining fertility include the rising social status of women, widespread availability of birth control, and the growing cost of raising children.

French sociologist and philosopher Auguste Comte is credited with coining the aphorism “demography is destiny” almost 200 years ago. But that does not mean destiny is immutable or its course inevitable. Just as aging individuals must adjust their lifestyles to maintain personal vitality, societies with aging populations must adjust policies to preserve and promote their economic prosperity.

Demographic trends can have big implications. This shift from a predominantly young to predominantly older population has both broad macro-economic implications and important financial consequences. Consider that many U.S. entitlement programs were created with the assumption that there would be a relatively small group of old people and a large number of working-age people, followed by an even bigger cohort of children.

According to the Census Bureau, 47.8 million Americans are 65 and over. This figure is projected to nearly double to 83.7 million by 2050. Just 10 years ago, 12.5 percent of the population was 65 and over. Today, it is 15 percent, and is projected to reach 21 percent in just 20 years. By 2030, one in every five U.S. residents will be over 65. For decades this was the age when people were expected to end their careers and embrace a life of leisure, following Andrew Carnegie’s advice to spend the first third of life getting educated, the second third getting rich, and the last third giving money away.

As the baby boomer generation retires, fertility rates keep falling, and life expectancy continues to increase, there will be too many beneficiaries and too few taxpayers. In 1950, the American economy had 8.1 people of working age for each person of retirement age. Recent figures indicate that this “dependency ratio,” as demographers call it, has shrunk to just over 5 to 1. By 2030, the Census folks estimate it will have fallen to 3 to 1.

Caring for large numbers of elderly people will put severe pressure on government finances. More specifically and painfully, the U.S. may be facing major tax increases, significant budget cuts, or most likely some combination of the two to secure the future stability of old age entitlement programs. In particular, spending on Social Security and on Medicare, which provides health insurance to the aged, will rise as a share of gross domestic product as baby boomers retire.

With the retirement of baby boomers and the rising number of elderly in the population, the nation will face a slow-motion train wreck absent changes in government fiscal policy. The good news is the slow motion part, which gives Americans enough time to take on the challenge of real entitlement reforms that will allow the country to successfully navigate this demographic transition.

Labor unions and inequality

In the wake of the Great Recession, economic inequality – the extent to which income and wealth are distributed unevenly across a population – has emerged as a major issue in the United States.

Since the late 1970s, there has been enormous change in the distribution of income and wealth in the U.S. The gap between the “haves” and the “have-nots” has widened, with a small portion of the population reaping an increasingly larger share of the country’s economic rewards. Warren Buffett got it right when he said, “There’s been class warfare going on for the last 20 years and my class has won.”

The average American has lost. Since the mid-1970s, wages have remained stagnant and middle-class earnings have lagged the cost of living.

There are a number of factors contributing to economic inequality and downward mobility among working-class Americans, and to the dangerous fissures these trends have opened in American society. These include government tax and regulatory policies, the acceleration of finance capitalism, culture, immigration, globalization, and the rate of technological change.

Frequently overlooked is the declining strength of private-sector labor unions. In 1979, unions represented 24 percent of the private sector labor force; today only 6.5 percent of private-sector workers are unionized.

The effects of this decline are fiercely debated. Conservatives argue that labor unions decrease competitiveness and business profitability. Progressives say that in an era of globalization, companies threaten to ship jobs to factories offshore to extract concessions from unions with impunity. For sure, unions raise wages, but that doesn’t necessarily mean they reduce profitability or diminish competitiveness. Consider the success of unionized firms such as Southwest Airlines and UPS.

American manufacturing and wages suffered as U.S. companies engaged in extensive offshore outsourcing of decent-paying domestic jobs to China and other low-wage countries under the banner of free trade, prioritizing short-term profits over long-term investments and the public interest. For example, from 2000-2016, the U.S. shed five million manufacturing jobs, a fact that supporters of free trade and globalization rarely mention.

The loss of traditional manufacturing jobs has contributed to income inequality and declining union membership. According to a report by the Washington-based think tank the Economic Policy Institute, if unions had the same presence in the private sector today as in 1979, both union members and non-members would be making about $2,500 more each year.

Many companies have built their business models around offshoring manufacturing to reduce costs without passing the savings on to consumers. They view the wages and benefits that once underpinned a middle-class lifestyle as obscenely excessive. That’s why they support free trade and use their political power to garner the support of both major political parties, helping accelerate the demise of labor unions. Government turned a blind eye as corporations packed up good jobs and sent them overseas, weakening private-sector unions.

The American public has repeatedly been told that policies restraining foreign competition are a form of protectionism that subsidizes inefficient domestic industries and raises prices. The issue of job losses is ignored. The benefits of free trade allegedly exceed the costs of lost jobs, even though those costs fall hardest on people who work with their hands. Assumed consumer benefits should be considered when it comes to trade policy, but so should giving working-class people a fair shot at the American Dream. Americans need a more balanced way of thinking about free trade and the offshoring of American jobs.

Is it any wonder that President Trump’s campaign slogan – “Make America Great Again” – resonated with ordinary Americans? This rhetoric is reminiscent of 1988 Democratic presidential candidate Rep. Richard Gephardt’s slogan “Let’s Make America First Again.”

Writing over 2400 years ago, the Greek philosopher Aristotle captured the importance of inequality when he wrote, “A polity with extremes of wealth and poverty is a city not of free persons but of slaves and masters, the ones consumed by envy, the others by contempt.”

Stock market boom doesn’t float everyone’s boat

Forgetting history is an American pastime. The current bull market that ranks among the great rallies in stock market history began 10 years ago this month, just about the time when Lady Gaga’s “Poker Face” was the number one song in America.

The stock market party has been going on for a decade, but many Americans have not been invited. The Standard & Poor’s 500 index has soared over 300 percent since March 2009, but the gains are heavily concentrated among the richest families.

The richest families are far more likely to own stocks than are middle- or working-class families. Eighty-nine percent of families with incomes over $100,000 have at least some money in the market, compared with just 21 percent of households earning $30,000 or less, according to a Gallup survey.

Overall, 62 percent of families owned stocks before 2008. That number has fallen to 54 percent, the Gallup poll found. The psychological and financial damage inflicted by the 2008 financial crisis and the subsequent Great Recession continue to weigh heavily on the average American, just as memories of the Great Depression influenced financial habits for decades.

In March 2008, the Financial Meltdown, Financial Apocalypse, Financial Collapse – call it what you will – began when the feds arranged a shotgun marriage between Bear Stearns and JPMorgan Chase. Bear Stearns, the smallest of the five major Wall Street investment banks, was unable to fund its operations and was bleeding cash, having lost the confidence of the market. The feds faced a choice between letting the company fail or taking extraordinary steps to rescue it. They chose the latter.

Bear Stearns was sold to JPMorgan Chase, with the Federal Reserve providing $29 billion as an inducement to the acquiring bank. Bear Stearns may have ceased to exist as an independent firm, but it continued to haunt the financial world like Marley’s Ghost for months thereafter. Its collapse signaled the real start of the financial crisis. Bear’s demise started a banking liquidity crisis in which financial institutions became unwilling to lend to each other, and credit markets seized up.

A growing number of formerly solid financial institutions were turned into basket cases as their years of kindergarten management games – shooting up on short-term borrowings, making ample use of leverage fueled by low interest rates, and binging on risky trades – blew up in their faces. When they froze their lending to businesses and individuals alike, a vast portion of the nation’s business activity ground to a halt, leading to the Great Recession.

The Financial Meltdown of 2008 was one of the most critical events in American history, a biblical-style plague that, beginning in the fall of 2008, tanked the stock market by nearly 60 percent and killed off other financial and credit markets in the process. Banks and firms either vanished into bankruptcy or had to be rescued by taxpayers. The financial system nearly collapsed, triggering an economic crisis.

The deepest recession in decades wiped out some $11 trillion of wealth and vaporized more than eight million American jobs by September 2009. It froze up the nation’s vast financial credit system, leaving many firms without enough cash to operate. It forced the Federal government to spend $2.8 trillion and commit another $8.2 trillion in taxpayer funds to bail out crippled corporations like General Motors, Chrysler, Citigroup, AIG and a host of other too-big-to-fail private institutions.

In addition to their jobs, the crisis cost millions of Americans their homes, life savings, and hopes for a decent retirement. These Americans were in no position to invest in stocks and benefit from the subsequent run-up in the market. By contrast, the wealthy have gotten even richer.

This was a cataclysm far worse than any natural disaster the nation has experienced, and its ripples continue to be felt today.

Originally Published: March 29, 2019.