Inflating Away the National Debt

Once again, the federal debt is big news, as lawmakers grapple with bumping up against the U.S. debt ceiling. The government has reached its borrowing limit, and House Republicans say they will not vote to raise the debt ceiling and allow further borrowing without real spending cuts, as opposed to mere reductions in planned increases.

High government debt is a significant problem. The higher the debt-to-GDP ratio, the harder it may be for a government to borrow by issuing bonds, because investors demand a higher interest rate on what they view as a riskier investment.

It’s also a political hot potato, but quick fixes come with big downsides.  Slow and steady may be the best solution.

The federal debt held by the public as a share of gross domestic product increased to 98 percent in fiscal 2022. For governments this metric is comparable to the debt-to-income ratio a lender usually wants to know before approving a loan. Think of GDP as the nation’s income.

It is also a key metric investors use to gauge just how creditworthy a country is. When the debt-to-GDP ratio is high, investors begin to question the government's ability to pay back the debt and start demanding higher interest rates.

The Congressional Budget Office forecasts that the federal debt will increase to 185 percent of GDP by 2052. Washington has been ramping up deficit spending for a very long time: U.S. debt has grown more rapidly than national income for more than half a century. To be sure, monetary policy has contributed to the debt by providing cheap credit, keeping interest rates artificially low for more than a decade.

Federal government debt increased by $2.5 trillion in the fiscal year that ended on September 30, 2022, from $28.4 trillion to $30.9 trillion. Since the new millennium, the debt has increased from almost $6 trillion to nearly $31 trillion.

Governments have four tools to retire debt. One is to generate faster economic growth. Growth increases nominal GDP, driving down the debt-to-GDP ratio over time and reducing the risk of a debt crisis.
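To make the mechanics concrete, here is a minimal sketch of how nominal GDP growth alone drives the ratio down even when the debt itself does not shrink. The figures and the 4 percent growth rate are round illustrative assumptions, not official numbers:

```python
# Illustrative sketch: nominal GDP growth eroding the debt-to-GDP ratio,
# assuming the debt is held constant. All numbers are assumptions.
def debt_to_gdp(debt: float, gdp: float) -> float:
    """Return debt as a percentage of GDP."""
    return 100 * debt / gdp

debt = 31.0   # trillions of dollars, roughly the fiscal 2022 level
gdp = 31.6    # trillions, chosen so the ratio starts near 98 percent

ratio = debt_to_gdp(debt, gdp)        # ~98.1 percent

# Freeze the debt and let nominal GDP grow 4 percent a year for a decade:
for _ in range(10):
    gdp *= 1.04

ratio_after = debt_to_gdp(debt, gdp)  # ~66 percent
```

Even with no repayment at all, a decade of 4 percent nominal growth cuts the ratio by roughly a third in this toy example.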

Another option is to reduce deficits by cutting government spending.  This is politically unpopular in the best of times and therefore unpalatable to politicians.  Taking away something the body politic regards as a “right” or an “entitlement” can be a career ender.

Raising taxes is somewhat more palatable, especially on those the electorate views as “rich.” The risk here is that raising taxes may reduce the incentive to work, causing tax revenue to fall, which would force the government to borrow more and thereby exacerbate the debt problem.

One other option remains: inflating your way out of debt, debasing the currency. High inflation reduces the real value of the debt, allowing the government to pay it off with money that is worth less than when it was originally borrowed. As prices go up, so does nominal GDP. It’s a bit like a snake eating its own tail.
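A rough sketch of that erosion, using a fixed nominal debt deflated at a constant inflation rate. Both inputs are illustrative assumptions, not forecasts:

```python
# Illustrative sketch: inflation shrinking the real value of a fixed
# nominal debt. Numbers are assumptions for demonstration only.
def real_value(nominal: float, inflation: float, years: int) -> float:
    """Deflate a nominal amount by a constant annual inflation rate."""
    return nominal / (1 + inflation) ** years

debt = 31.0  # trillions of dollars, nominal

# At the Fed's 2 percent target, a decade quietly erases about 18 percent
# of the debt's real value; at 7 percent, roughly half is gone.
slow = real_value(debt, 0.02, 10)   # ~25.4 trillion in today's dollars
fast = real_value(debt, 0.07, 10)   # ~15.8 trillion
```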

Inflation makes old debt easier to pay off, but it also makes new debt more expensive. That means higher inflation can lead to spiraling hyperinflation. Trying to inflate debt away is dicey; it is by no means a silver bullet.

But compared to the other options, such as getting spending under control, slow, chronic inflation as embodied by the Federal Reserve’s 2 percent annual goal may be the most politically palatable way to reduce the debt.

If all goes well, such an approach might produce the much ballyhooed “soft landing,” or getting inflation under control without triggering a recession.

FTX Collapse Another Regulatory Failure

Disgraced crypto tycoon Sam Bankman-Fried (SBF), a young man with Promethean ambitions, has been arrested for his role in the collapse of FTX, the virtual trading app he founded.  Prosecutors allege that he orchestrated “one of the biggest financial frauds in U.S. history,” using customers’ money to pay the expenses and debts of his hedge fund, Alameda Research.

The episode again raises troubling questions about the effectiveness of government regulators, despite many promises to bring crypto under their regulatory purview and prevent financial fraud.

Americans have gotten used to financial chicanery. They witnessed Bernie Madoff, who ran a multi-billion-dollar Ponzi scheme that wiped out the life savings of thousands of investors. Then there was the 2008 financial meltdown that cost millions of Americans their jobs, homes, life savings, and hopes for decent retirements. Many Americans never recovered from this cataclysm.

A grand jury in the Southern District of New York indicted Bankman-Fried on eight counts, including securities fraud, money laundering, and making illegal political contributions. The 30-year-old faces a maximum combined sentence of 115 years.

Following extradition from the Bahamas and his release on a record-breaking $250 million bail bond, he is holed up at his parents’ $4 million Palo Alto home with an electronic monitoring bracelet while he awaits trial.

Bankman-Fried is also facing a civil case brought by the SEC, and possible civil actions by the Commodity Futures Trading Commission (CFTC) and state banking and securities regulators.

The house of cards collapsed when FTX filed for bankruptcy protection on November 11 with a reported $32 billion in debt. At the heart of the scandal lies a system for defrauding investors. Billions of dollars in customer assets have vanished, used to plug losses at Alameda Research and to finance SBF’s lavish lifestyle, massive political contributions, and speculative investments.

FTX was a platform that let users buy and trade cryptocurrencies, such as bitcoin. The firm also minted its own digital currency called FTT and was big on environmental, social, and governance investments. SBF was a leading proponent of so-called “effective altruism,” a theory that advocates using “evidence and reason” to do societal good. He told the media he planned to give most of his wealth away to make the world a better place.

SBF donated almost $40 million to political candidates and political action committees in the 2022 congressional midterm elections. He was the second-largest individual donor to Democrats, trailing only billionaire businessman George Soros in the 2022 election cycle.

Prosecutors said one reason he made those contributions was to influence policies and laws affecting the cryptocurrency industry. There may not be a criminal trial until late 2023, legal experts say, because the government will need to build an extraordinary case.

Legions of criminal and civil defense attorneys will make bank by the time the dust settles. Case in point: angry investors have already filed class action suits against prominent endorsers such as Tom Brady, Larry David, Steph Curry and Naomi Osaka, all of whom received equity in the company, for allegedly failing to do due diligence before marketing FTX to the public.

The firm’s blue-chip investors included Sequoia Capital, BlackRock, Third Point LLC, Tiger Global Management, the Ontario Teachers’ Pension Plan, SoftBank Group Corp. and Singapore’s state investment company, Temasek Holdings.

Can there be any wonder why public trust is on the wane? The plain truth is that regulators exist to protect the interests of the regulated. Surely another special counsel is needed.

Closely related, Americans should be asking questions of politicians in Washington who sit on key financial oversight committees and were beneficiaries of SBF’s generosity. But that may be wishful thinking. Insulated from oversight and accountability, they will not be performing surgery on themselves anytime soon.

All of which brings to mind Honore de Balzac’s insight that “Behind every great fortune, there is a crime.”

Corporate America and Income Inequality in the U.S.

Economic inequality, the gap between the rich and poor, has always existed. This disparity has increased dramatically in the U.S. over the last four decades.  Inequality can be measured in many ways, frequently using income.

The Gini coefficient is one of the most widely used measures of how income is distributed across a population, with 0 being perfectly equal (everyone receives an equal share) and 1 being completely unequal (100 percent of income goes to one person). The measure has been in use since its development by the Italian statistician Corrado Gini in 1912.
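The definition can be computed directly from a list of incomes. This is a minimal, unoptimized sketch using made-up incomes, intended only to illustrate one standard formula (the mean absolute difference between all pairs of incomes, divided by twice the mean):

```python
# Minimal Gini coefficient from the pairwise-difference definition.
# The incomes below are invented, purely illustrative data.
def gini(incomes: list[float]) -> float:
    n = len(incomes)
    mean = sum(incomes) / n
    # Sum of absolute differences over every ordered pair (O(n^2)).
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

print(gini([50_000, 50_000, 50_000]))   # 0.0 -- perfect equality
print(gini([0, 0, 0, 0, 100_000]))      # 0.8 -- one person gets everything
```

Note that with n people and one person holding all income, this formula gives (n − 1)/n rather than exactly 1, which is why published coefficients for small samples are sometimes rescaled.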

The United States has a Gini coefficient of 0.485, the highest it has been in 50 years according to the Census Bureau, outpacing that of other advanced economies. By this measure, the U.S. is the most unequal high-income economy in the world.

The top 1 percent of earners made a little over 10 percent of the country’s income in 1980.  Currently they take home about 20 percent, more than the entire bottom half of earners.

Academicians and politicians argue over whether automation or overseas manufacturing is more responsible for eliminating American manufacturing jobs and keeping wages low. The question is debatable, but the answer is surely a mosaic of factors, from globalization to automation.

One factor that catches the eye time and time again has been the role of corporate America.  Sure, automation and globalization have transformed labor markets across the globe, but it is important not to overlook corporate America’s role in accelerating these effects.

The late Jack Welch, the CEO of General Electric from 1981 to 2001, captured this reality when he talked of ideally having “every plant you own on a barge”.   He turned the firm from a manufacturing company into more of a financial services firm while offshoring American manufacturing jobs.  In 1999, Fortune Magazine named him manager of the century.

Other leading companies followed Welch’s path. For example, General Motors moved production to low-wage areas like northern Mexico starting in the 1980s.  In 2017 Boeing, America’s biggest exporter, opened a plant in China for its 737 planes.

From both an economic and national security perspective, the US needs to strengthen smart manufacturing and provide good jobs for future generations through effective public policies.  War and the pandemic have exposed the fragility of supply chains. Increasing domestic production of items like energy, food and medicine would better secure supply chains and create high value jobs and support American workers and their families.

For example, semiconductors (chips) are foundational for many industries, as digital technology has transformed every sector of the economy, disrupting entire industries and blurring industry boundaries. Still, the US is suffering from a severe shortage of semiconductors.

While the US global share of semiconductor manufacturing capacity was 37 percent in 1990, the number has fallen to an alarming 12 percent today.  The US has become an outlier in an industry that is a major engine of U.S. economic growth and job creation.

The US has grown dependent on other countries that provide government subsidies and incentives to make it easier and cheaper to manufacture semiconductors.  The European Union is planning to provide the industry with $48 billion over 10 years.

More importantly, China is investing $100 billion into the sector. The Chinese government is funding the construction of more than 60 new semiconductor fabrication plants and is poised to have the single largest share of chip manufacturing by 2030.

When push comes to shove, the political class should remember that the US must be the world leader in advanced manufacturing: “Not only the wealth but the independence and security of a country appear to be materially connected with the prosperity of manufactures”.

Who said that? The never less than interesting Alexander Hamilton, of Broadway fame in his Report to Congress on the Subject of Manufactures in 1791.

Prime Minister Trudeau went too far in dealing with Canada’s ‘Freedom Convoy’

The “Freedom Convoy” of trucks that converged in Ottawa on Jan. 28 began in response to the Canadian government’s requirement that Canadian truck drivers crossing the U.S. border be fully vaccinated to avoid testing and quarantine requirements upon their return. Then it evolved into a protest against all public health measures aimed at fighting the COVID-19 pandemic.

Organizers said they would not end their protest until all pandemic-related public health measures were dropped.

After three weeks of protests, Prime Minister Justin Trudeau invoked the Emergencies Act to deal with the blockades. It was the first time the law had ever been used, and it was invoked even though there were plenty of other laws on the books to deal with peaceful protests. It was a classic example of using a machete when a scalpel would have worked just fine.

The Act gave the Canadian government broad powers to restore order, ranging from placing significant limits on peaceful assembly, to prohibiting travel, to requiring financial institutions to turn over personal financial information to the Canadian Security Intelligence Service and freezing the bank accounts of protestors and anyone who helped them.

The Act also gave the government broad authority over businesses, such as dragooning private tow truck companies to provide services against their will. Insurance companies were required to revoke insurance on any vehicles used in blockades.

The Emergencies Act is only supposed to be invoked in a genuine crisis, such as wartime. The War Measures Act, its predecessor, was last invoked under the current prime minister’s father, Pierre Trudeau, in response to the 1970 October Crisis, when a group of militant separatists who wanted to create an independent socialist Quebec engaged in numerous bombings and kidnapped and murdered a cabinet minister.

There is a very real difference between invoking a law against violent terrorists and using it to combat a largely peaceful protest by Canadian citizens tired of COVID-19 restrictions and lockdowns.

Riot gear-clad Ottawa police, with provincial and federal help, towed dozens of vehicles that were blocking Ottawa’s downtown streets, retook control of the area around the Parliament buildings, and used pepper spray and stun grenades to remove demonstrators. Ottawa’s streets are now back to normal; there is only snow and silence in the country’s capital.

All this could have been done under existing law. As Alberta Premier Jason Kenney put it, “We have all the legal tools and operational resources required to maintain order.” Put simply, the prime minister could have restored and maintained public order without marginalizing substantial segments of the population.

Trudeau, born and bred an elite, first described the truckers as a fringe minority who held “unacceptable” racist and misogynist views. He refused to meet the protesters or negotiate with them, and he was not interested in hearing about the mandates’ impact on their lives. Many of these truckers had spent the last two years keeping the supply chain running.

Instead of finding ways to defuse the situation, Mr. Trudeau issued the emergency order, which he called a “last resort.” After a conservative member of Parliament and descendant of Holocaust survivors asked him tough questions about his handling of the truckers’ protest, Trudeau denounced conservatives who “stand with people who wave swastikas and confederate flags.” These comments came from someone who spent his youth wearing blackface.

The role of government is to maintain public order while respecting civil liberties, including the right to peaceful assembly. Many protests are disruptive and often unlawful, so it is reasonable to impose limits on the right to assemble.

But a real leader and statesperson would have gone to the protesters and said: “I’m here. What do you want to say?” Seeking out and meeting with protesters and pursuing dialogue is a far more strategic way to restore the rule of law than imposing martial law.

The return of the Taliban. What went wrong in Afghanistan?

Writing about recent events is always hazardous. It can be difficult to establish precisely what has happened and why. There is also a lack of clarity about the relative significance of events.

Americans don’t yet know where the collapse of Afghanistan ranks in the list of American military and foreign policy disasters such as the debacle in Iraq, the fall of Saigon, the failed “Bay of Pigs” invasion in Cuba, and the 1979 Iran hostage crisis.

But three points are surely certain. First, the shambolic exit from Afghanistan is a major setback that will undermine U.S. credibility for years to come. As Henry Kissinger said, “To be an enemy of the US is dangerous, to be a friend is fatal”.

Second, Afghanistan fell because America forgot the lessons of history. It does not understand the world beyond its borders, which is very different from the U.S.

Finally, given the atrocious implementation of the pullout of U.S. troops from Afghanistan, Joe Biden will have to wait a bit before he receives his Nobel Peace Prize. Another black eye for the U.S.

There will be lots of talk in the coming days about the harsh lessons to be learned from America’s retreat from Afghanistan. In April, Biden announced the U.S. would withdraw our military from the country without conditions on the 20th anniversary of the 9/11 attacks. What an awful historical irony that the Taliban will once again be in control on Sept. 11.

Looking back, there are some indisputable facts about what went wrong in Afghanistan, and responsibility is certainly divisible by more than one president.

On Oct. 7, 2001, the first of these presidents, George W. Bush, launched Operation Enduring Freedom—the invasion of Afghanistan. The operation sought to bring the architects of 9/11 to justice and reduce the threat of terrorism. Then the Afghan mission, which often lacked strategic clarity, morphed from counterinsurgency to counternarcotics and then into capacity building to remake Afghanistan as an award-winning liberal democracy.

The result is a painful lesson of what can happen when immense military might is put in the hands of politicians and their minions who lack the understanding to employ it properly. Equally culpable are politicized American military leaders who consistently lied about the strength of the Afghan security forces.

The result is that the Taliban, a UN-designated terrorist group, defeated the world’s greatest military power. Another self-inflicted blow to America’s reputation that will complicate Biden administration goals to check China’s rise by building coalitions in the Asia Pacific.

According to the Costs of War project at Brown University, the U.S. has spent more than $2 trillion in Afghanistan since 9/11. That’s $300 million per day for two decades.

And the human costs are even greater. There have been 2,448 service members killed and over 21,000 American soldiers wounded in action, along with 3,846 contractors killed. That pales beside the estimated 66,000 Afghan national military and police and over 47,000 Afghan civilians who were killed.

And because the U.S. borrowed most of the money to pay for the war, generations of Americans will be burdened by the cost of paying for it. The Costs of War researchers estimate that by 2050, interest payments alone on the Afghan war debt could reach $6.5 trillion. That amounts to $20,000 for each and every U.S. citizen.
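The per-citizen figure is simple division, assuming a population of roughly 330 million (an assumption; the article does not state the population it used):

```python
# Back-of-the-envelope check of the per-citizen interest burden.
# Population is an assumed round number; the $6.5 trillion figure is the
# article's Costs of War estimate.
interest_by_2050 = 6.5e12   # dollars
population = 330e6          # assumed U.S. population

per_citizen = interest_by_2050 / population   # ~19,700 dollars
```

At roughly $19,700 per person, the article’s $20,000 figure is a fair rounding.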

You do not need to support a continued presence in that arid, stone-age country to recognize that things have gone badly. The execution of the U.S. withdrawal has been disastrous, deadly, and humiliating, handing power back to the Taliban in a matter of days. The dramatic unravelling of the situation in Afghanistan puts President Biden’s reputation for foreign policy expertise at risk.

It is worth bearing in mind what former Bush and Obama Defense Secretary Robert Gates wrote in his memoirs: Biden has “been wrong on nearly every major foreign policy and national security issue over the past four decades”.

But not to worry, this is not your father’s Taliban. They are smarter and tougher.

A look at how the ‘Nixon Shock’ changed the global economy

If you asked scholars to name the most important events of the last 50 years of American history, they would likely list the Vietnam War, the Civil Rights Movement, the invention of the computer chip, the Sept. 11 terrorist attacks, the Great Recession that officially lasted from 2007 to 2009, and the COVID-19 pandemic.

Missing from this list would be the so-called Nixon Shock, the 50th anniversary of which is upon us.

In a televised address on Aug. 15, 1971, President Nixon (America’s very own Richard III) announced that he was “closing the gold window,” ending the dollar’s convertibility into gold. Unilaterally ending the last vestiges of the gold standard and eliminating the final link between gold and the dollar was a consequential moment in U.S. financial history.
The Nixon Shock had profound implications for the U.S. and the global economy. It unleashed an era of floating exchange rates and a much less stable world economy, since currency values were no longer anchored to anything tangible. Many contend it marked the beginning of an inflationist era of fiat money and created decades of turbulence in currency markets.

The president announced the end of the American commitment to redeem other countries’ dollars for gold at $35 an ounce, a bedrock of the Bretton Woods system of mostly fixed exchange rates that had been in place since 1944 and established the dollar as the world’s reserve currency.

Closing the gold window marked the end of a commodity-based monetary system and the beginning of a new world of fiat currencies backed entirely by full faith and trust in the governments that issue them. This gave the government and the Federal Reserve greater control over the economy, because they can control how much money is printed.

The president’s main concern in 1971 was avoiding a recession that might cost him the 1972 election. He strong-armed Federal Reserve Chair Arthur Burns into keeping interest rates low in the face of rising consumer prices. Nixon allegedly told Burns, “we can take inflation if necessary, but we can’t take unemployment”, setting the stage for the birth of the Great Inflation of the 1970s.

In fairness to President Nixon, he inherited an economy from President Johnson that was under serious strain.  Federal spending to simultaneously fight the Vietnam War and build the Great Society created budget deficits that fueled inflation along with the growing U.S. trade deficit.

The U.S. had printed more dollars than it could back with gold. Inflation had started to rise in the second half of the 1960s, soaring from a mere 1.4 percent in 1960 to 13.5 percent in 1980.

Put plainly, too many dollars were abroad. By 1971, the pledge that an ounce of gold was worth $35 became void. The feds could not make it happen. So, they severed the link. The value of the dollar in foreign exchange markets suddenly plummeted, which caused increases in import prices as well as in the prices of most commodities priced in dollars.

For sure, the Nixon Shock was not the only reason for the accelerating inflation of the 1970s. For example, the Organization of the Petroleum Exporting Countries announced an oil embargo against the U.S. during the October 1973 Yom Kippur War in Israel. Oil prices surged by 400 percent and U.S. economic activity instantly dropped. In 1973 the U.S. entered the deepest recession since the Great Depression, but this time it was coupled with price inflation, not the deflation of the 1930s.

The Nixon Shock was another painful example of the politicization of the economy. Sound familiar? A key lesson for today is that price stability is paramount for a strong and growing economy. Tolerating high inflation in an effort to stimulate the economy is a dangerous game to play.

The U.S., China, and Taiwan

There is no getting around the fact that the United States’ primary strategic competitor for global leadership is the People’s Republic of China, which continues to extend its diplomatic, economic, and military influence internationally. Quite apart from China becoming the world’s second largest economy and its leading trading nation, policy makers increasingly describe its military buildup as a threat to U.S. and allied interests in the Indo-Pacific.

Put simply, the Pentagon considers China its most serious competitor. Taiwan may be the issue with the greatest potential to turn competition into direct confrontation. Many military analysts note that after two decades of counterinsurgency wars, the U.S. can no longer be certain of its ability to uphold a favorable balance of power in the Indo-Pacific.

By contrast, China has the military strength, and in particular the long-range missile capability, to overwhelm the U.S. in the Indo-Pacific region according to the United States Studies Centre at the University of Sydney. China is now an adversary that is also a military peer. It is in the enviable position of being able to use limited force to achieve a fait accompli victory over Taiwan before the U.S. could respond.

This is not unthinkable, since the Chinese Communist Party regards Taiwan as an inalienable part of China.  The U.S. needs to defend Taiwan effectively against a Chinese invasion or blockade, because it is important to frustrating China’s strategy to achieve regional hegemony.  For many countries in the region, it is the canary in the coal mine — a strong indicator of how far the U.S. would go to defend them against China.

The two-million-strong People’s Liberation Army (PLA) is the primary concern of U.S. defense experts. According to a 2020 Department of Defense report, the PLA has “already achieved parity with—or even exceeded—the US” in several areas in which it has focused its military modernization efforts in the Indo-Pacific region, where China certainly has the home court advantage.

The PLA’s modernization program has been supported by China’s rapidly growing economy and augmented by the purchase and alleged theft of militarily useful technologies. In 1996, China was deeply embarrassed and humiliated in the Taiwan Strait Crisis when the U.S. responded to Chinese missile threats meant to intimidate Taiwan with a massive show of force.

Two U.S. aircraft carrier groups steamed into the strait and exposed the weakness of the PLA Navy compared to the U.S. fleet. In response, China’s defense budget rose by about 900 percent between 1996 and 2018 and is now the world’s second largest behind the U.S.

For context, it should be acknowledged that the threats along China’s vast frontier should not be discounted.  With a 13,743-mile land border, it counts 14 sovereign states as neighbors.  It also shares maritime borders with Brunei, Indonesia, Japan, South Korea, Malaysia, the Philippines, and Taiwan.

It should come as no surprise that among China’s grand ambitions is to extend its influence along its frontiers through means such as building and militarizing islands to gain exclusive control over the South China Sea, through which about $3 trillion of trade, or a third of the world’s cargo, flows each year.

Failure to respond to the growing threat China poses to its Indo-Pacific neighbors would raise questions about the U.S.’s willingness and capacity to act as a security guarantor in the region.  Essentially, the U.S. needs support from allies and partners in the region to deter Chinese adventurism, including a potential attack on Taiwan.

The stakes could not be higher in this contest.  As historian Niall Ferguson recently wrote: “Perhaps Taiwan will turn out to be to the American Empire what Suez was to the British Empire in 1956: the moment when the imperial lion is exposed as a paper tiger.  Losing Taiwan would be seen all over Asia as the end of American predominance.”

The First Amendment and free speech

While many national constitutions come and go every few decades, the U.S. Constitution has served the purpose for which it was intended for more than two centuries. The United States is proud of its tradition of freedom of speech that was established in the First Amendment to the Constitution.

It allows for public criticism of the government. Without it, such behavior could land you in prison – just ask Russian opposition leader Alexei Navalny. Still, there were many times in American history when this principle was traduced.

For example, some of the same people who ratified the Bill of Rights voted in Congress in 1798, during the presidency of John Adams, to pass the Alien and Sedition Acts that made it a crime to utter “false, scandalous, or malicious” speech against the government or the president.

The first 10 amendments to the constitution are known as the Bill of Rights.  They were proposed by Congress in September 1789 and ratified by the states in December 1791.

Freedom of speech isn’t the only freedom protected by the First Amendment. It reads: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Freedom of speech is considered a fundamental bedrock of liberty, allowing citizens to express their ideas and bring about changes that reflect the needs of the people. It gives voice to conflicting or dissenting opinions, promoting healthy debate that moves society closer to realizing America’s founding ideals.

The Civil Rights Movement is a perfect example of free speech in action. During the 1950s and 1960s, activists such as Dr. Martin Luther King, Jr. used free speech as a tool to force change in society. Exercising their voice, these activists helped outlaw the racial discrimination that plagued the country.

But freedom of speech is not an unlimited right. The First Amendment only protects individuals’ speech from U.S. governmental oppression, control, and censorship; it does not extend to private entities. Companies have significant leeway to set their own standards and policies regarding employee conduct.

There is nothing illegal about a private firm censoring people on its platform. For example, Facebook’s indefinite ban and Twitter’s permanent ban of former President Trump in the aftermath of the Capitol incursion on January 6 were within the companies’ legal rights.

The nation has long grappled with which types of speech should be protected and which should not.  Interpreting the broad guarantees of free speech in the First Amendment has not been an easy task.  Over time, the Supreme Court has spilled barrels of ink defining the freedom of speech.  It has upheld people’s right to critique the government and hold political protests, but hasn’t extended protection to those who incite action that might cause harm.

But what constitutes harm is still a matter of debate. For some, it is limited to physical harm, as in the case of falsely shouting “fire” in a crowded movie theater. For others, harm encompasses a compromise to the dignity of others, as in the case of hate speech. Another recent argument is that free speech should be curtailed if it causes offense and makes the listener feel disrespected. This argument may set a lower bar for limiting free speech. But that is a story for another day.

In today’s politically charged climate, some people believe government should restrict certain speech.  But thankfully, the First Amendment protects everything from car commercials to fiery protests.

While it may be unfashionable to quote America’s first President, it merits recalling what he said about free speech: “If freedom of speech is taken away, then dumb and silent we may be led, like sheep to the slaughter.”

Naturally, everyone has their own interpretation of those comments.

Demystifying the rule of law

America’s constitutional order is under great stress and foundational principles such as free speech and the rule of law are under attack. The breakdown in respect for American institutions has helped instigate a season of violence and unrest.

The rule of law (ROL) is an expression most Americans are familiar with. It is a popular but vague term often used in political and economic contexts. Americans routinely hear politicians, judges, legislators and prosecutors invoke the ROL right up there with freedom and democracy.

Few have paused to say what they actually mean by it. The concept is defined in many ways. For starters, the ROL is an ideal, a standard to measure against. It is another way of saying that laws as written are applied equally to everyone. The ROL in its most basic form is captured in the popular phrase “no one is above the law.”

It also means that laws should govern a nation and its citizens, as opposed to power resting with a few individuals. In theory, the law of the land is owned by all, made and enforced by representatives of the people.

The notion of the ROL comes with a host of related principles: the law should be clear, known, and enforced; people are presumed innocent until proven otherwise; and the police cannot arbitrarily arrest or detain people without good reason. Laws are interpreted by an independent judiciary, which provides for the peaceful settlement of disputes.

The ROL requires that the law be enforced equally.  The most marginalized people in our society are entitled to be treated exactly the same way as anyone else.  It also requires that laws not discriminate against people on arbitrary grounds such as the color of their skin, their nationality or their gender.

The concept of the ROL dates back thousands of years.  For example, the ancient Greeks established democratic law courts in the 5th and 4th centuries BC with juries that had hundreds of members.  At Runnymede in 1215, English leaders signed the Magna Carta (Latin for Great Charter).

One might argue that the exalted Magna Carta was the starting point of English-speaking peoples’ understanding of the ROL.  It was a document in which, for the first time, monarchs and government leaders agreed to subject themselves to the law and recognized that people were entitled to equality before the law and to a jury trial.  The immediate practical consequence of Magna Carta was the establishment of a council of barons to hold the monarchy to its side of the bargain.  These were momentous new concepts.

In the U.S., the most visible symbol of the ROL is the Constitution, drafted by a special convention in Philadelphia in 1787.  It is the framework for effective and limited government and the supreme law of the land.  A congressman once delivered one of the truest statements of American political theory: “There is a straight road which runs from Runnymede to Philadelphia.”

The American effort to make good on the promise of the ROL has been difficult and sometimes bloody.  There is no getting around it – America has struggled to create a legal system that is fair to all its people.

The most glaring example is that the U.S. Constitution did not address the problem of slavery, despite the words in the Declaration of Independence that “all men are created equal.” This was the great flaw in American constitutional history.

America and other countries subscribing to the notion of the rule of law have considerable hard work to do to negotiate the distance between the ideal and the reality on the ground.

The forgotten tribe: America’s working class

Countless working-class Americans of all races and ethnicities, who work hard and play by the rules, are fed up with the extreme partisanship that permeates the country, and with senseless acts of violence, including the storming of the Capitol. These people are the forgotten tribe in America.

In general, working class people are those with a high school diploma but less than a four-year college degree who live in households with annual incomes roughly between $30,000 and $70,000 for two adults and one child. They are somewhere between the poor and the middle class.

Americans by some measures are more deeply divided politically and culturally than ever before. We live in a period of competing moral certitudes, of people who are sure they are right and prepared to engage in violence to make their point.

For many years now, political correctness; cancel culture; social justice; multiculturalism; the all-pervasive claim to victimhood; judging people on their ethnicity, gender and race rather than the merits of their work; and the politicization of just about everything have generated more heat and fumes than light. For all their rosy rhetoric on the subject, the ruling elites have less experience with ethnic and racial diversity than the working class.

These factors, and probably dozens of others, are contributing to the breakdown in the American genius for reaching compromises that meet the real social and economic needs of the working class.

Both the extreme right and the extreme left are corroded by ideology. Extremists on the right label their counterparts on the left socialists, and the left calls the right fascists. Each faction takes the law into its own hands while politicians see which way the wind is blowing and refuse to intervene. The growing divisions help explain why the nation’s political center is shrinking.

At the same time, the media, both traditional and social media, have accelerated the fragmentation of cultural and political identities. Conservative and liberal TV networks only highlight information that confirms their audiences’ biases, creating ideological echo chambers.

The worst of the fallout from this polarization will be felt by the forgotten tribe. These issues have done little to help them make ends meet and keep their families safe from COVID. Is it any wonder when they walk past a statue of that schnorrer Thomas Jefferson they don’t experience any trauma? Working people, after all, have to work.

America’s working class doesn’t have the luxury of engaging in ideological pursuits; they have to take care of their families by paying for groceries and medical bills and making mortgage or rent payments. The pampered and self-consciously fortunate regard the working class as “deplorables,” half of whom believe Elvis is still alive. Their understanding of diversity is the comic book version. They live in white neighborhoods, send their kids to private schools, and summer in the Hamptons.

These ruling elites don’t have to live with the unintended consequences of their decisions. The working class are the ones who have to work. As long as they do, it hardly matters what color their skin is or what accent they have. All the while, the economic system directs food, shelter and energy away from those who need it most and toward those who need it least.

The causes of the forgotten tribe’s problems have been well documented: the pace of technological change, growing monopoly power and concentration, and globalization. Is it any wonder the working class is losing hope in a better future (get real, they are not Bill Clinton)? They are an endangered species, living paycheck to paycheck.

Despite copious amounts of cash provided to families and unemployed workers, COVID-19 rescue plans don’t provide long-term solutions for making work pay, for giving the working class the education and skills needed to find better work, or for strengthening the families and communities that support work. These omissions only exacerbate the fraying of America’s social cohesion and political fabric.