Corporate America and Income Inequality in the U.S.

Economic inequality, the gap between the rich and poor, has always existed. This disparity has increased dramatically in the U.S. over the last four decades.  Inequality can be measured in many ways, frequently using income.

The Gini coefficient is one of the most widely used measures of how income is distributed across a population, with 0 being perfectly equal (where everyone receives an equal share) and 1 being completely unequal (where 100 percent of income goes to a single person). The measure has been in use since its development by the Italian statistician Corrado Gini in 1912.
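For intuition, the coefficient can be computed in a few lines from a list of incomes. This is a toy illustration of the definition, not the survey-weighted methodology official statistics use:

```python
# Toy Gini calculation: the mean absolute difference between all pairs
# of incomes, normalized by twice the mean, via the standard closed
# form over sorted values. Illustrative only.

def gini(incomes):
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # G = 2 * sum(rank_i * x_i) / (n * total) - (n + 1) / n
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([50_000] * 4))    # everyone equal -> 0.0
print(gini([0, 0, 0, 100]))  # one person gets everything -> 0.75 (approaches 1 as n grows)
```

With four equal incomes the result is exactly 0; when one of four people takes everything it is 0.75, tending toward 1 as the population grows.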

The United States has a Gini coefficient of 0.485, the highest it has been in 50 years according to the Census Bureau, outpacing that of other advanced economies. By this measure, the U.S. is the most unequal high-income economy in the world.

The top 1 percent of earners made a little over 10 percent of the country’s income in 1980.  Currently they take home about 20 percent, more than the entire bottom half of earners.

Academics and politicians argue over whether automation or overseas manufacturing bears more responsibility for eliminating American manufacturing jobs and holding down wages. The question is debatable, but the answer is surely a mosaic of factors, from globalization to automation.

One factor that catches the eye time and time again is the role of corporate America. Sure, automation and globalization have transformed labor markets across the globe, but it is important not to overlook corporate America’s role in accelerating these effects.

The late Jack Welch, the CEO of General Electric from 1981 to 2001, captured this reality when he talked of ideally having “every plant you own on a barge.” He turned the firm from a manufacturing company into more of a financial services firm while offshoring American manufacturing jobs. In 1999, Fortune magazine named him manager of the century.

Other leading companies followed Welch’s path. For example, General Motors moved production to low-wage areas like northern Mexico starting in the 1980s.  In 2017 Boeing, America’s biggest exporter, opened a plant in China for its 737 planes.

From both an economic and a national security perspective, the US needs to strengthen smart manufacturing and provide good jobs for future generations through effective public policies. War and the pandemic have exposed the fragility of supply chains. Increasing domestic production of items like energy, food, and medicine would better secure supply chains, create high-value jobs, and support American workers and their families.

For example, semiconductors (chips) are foundational for many industries: digital technologies have transformed all sectors of the economy, disrupting entire industries and blurring industry boundaries. Still, the US is suffering from a severe shortage of semiconductors.

While the US global share of semiconductor manufacturing capacity was 37 percent in 1990, the number has fallen to an alarming 12 percent today.  The US has become an outlier in an industry that is a major engine of U.S. economic growth and job creation.

The US has grown dependent on other countries that provide government subsidies and incentives to make it easier and cheaper to manufacture semiconductors.  The European Union is planning to provide the industry with $48 billion over 10 years.

More importantly, China is investing $100 billion into the sector. The Chinese government is funding the construction of more than 60 new semiconductor fabrication plants and is poised to have the single largest share of chip manufacturing by 2030.

When push comes to shove, the political class should remember that the US must be the world leader in advanced manufacturing: “Not only the wealth but the independence and security of a country appear to be materially connected with the prosperity of manufactures.”

Who said that? The never less than interesting Alexander Hamilton, of Broadway fame, in his Report to Congress on the Subject of Manufactures in 1791.

Prime Minister Trudeau went too far in dealing with Canada’s ‘Freedom Convoy’

The “Freedom Convoy” of trucks that converged in Ottawa on Jan. 28 began in response to the Canadian government’s requirement that Canadian truck drivers crossing the U.S. border be fully vaccinated to avoid testing and quarantine requirements upon their return. Then it evolved into a protest against all public health measures aimed at fighting the COVID-19 pandemic.

Organizers said they would not end their protest until all pandemic-related public health measures were dropped.

After three weeks of protests, Prime Minister Justin Trudeau invoked the Emergencies Act to deal with the blockades. It was the first time the law had ever been used, and it was invoked even though there were plenty of other laws on the books to deal with peaceful protests. It was a classic example of using a machete when a scalpel would have worked just fine.

The Act gave the Canadian government broad powers to restore order, ranging from placing significant limits on peaceful assembly, to prohibiting travel, to requiring financial institutions to turn over personal financial information to the Canadian Security Intelligence Service and freezing the bank accounts of protestors and anyone who helped them.

The Act also gave the government broad authority over businesses, such as dragooning private tow truck companies to provide services against their will. Insurance companies were required to revoke insurance on any vehicles used in blockades.

The Emergencies Act is only supposed to be invoked in a genuine crisis, such as wartime. The War Measures Act, its predecessor, was last invoked under the current prime minister’s father, Pierre Trudeau, in response to the 1970 October Crisis, when a group of militant separatists who wanted to create an independent socialist Quebec engaged in numerous bombings and kidnapped and murdered a cabinet minister.

There is a very real difference between invoking a law against violent terrorists and using it to combat a largely peaceful protest by Canadian citizens tired of COVID-19 restrictions and lockdowns.

Riot gear-clad Ottawa police, with provincial and federal help, towed dozens of vehicles that were blocking Ottawa’s downtown streets, retook control of the area around the Parliament buildings, and used pepper spray and stun grenades to remove demonstrators. Ottawa’s streets are now back to normal; there is only snow and silence in the country’s capital.

All this could have been done under existing law. As Alberta Premier Jason Kenney put it, “We have all the legal tools and operational resources required to maintain order.” Put simply, the prime minister could have restored and maintained public order without marginalizing substantial segments of the population.

Trudeau, born and bred a member of the elite, first described the truckers as a fringe minority who held “unacceptable” racist and misogynist views. He refused to meet the protesters or negotiate with them, and he was not interested in hearing about the mandates’ impact on their lives. Many of these truckers had spent the last two years keeping the supply chain running.

Instead of finding ways to defuse the situation, Mr. Trudeau issued the emergency order, which he called a “last resort.” After a conservative member of Parliament and descendant of Holocaust survivors asked him tough questions about his handling of the truckers’ protest, Trudeau denounced conservatives who “stand with people who wave swastikas and confederate flags.” These comments came from someone who spent his youth wearing blackface.

The role of government is to maintain public order while respecting civil liberties, including the right to peaceful assembly. Many protests are disruptive and often unlawful, so it is reasonable to impose limits on the right to assemble.

But a real leader and statesperson would have gone to the protesters and said: “I’m here. What do you want to say?” Seeking out and meeting with protesters and pursuing dialogue is a far more strategic way to restore the rule of law than imposing martial law.

The return of the Taliban. What went wrong in Afghanistan?

Writing about recent events is always hazardous. It can be difficult to establish precisely what has happened and why. There is also a lack of clarity about the relative significance of events.

Americans don’t yet know where the collapse of Afghanistan ranks in the list of American military and foreign policy disasters such as the debacle in Iraq, the fall of Saigon, the failed “Bay of Pigs” invasion in Cuba, and the 1979 Iran hostage crisis.

But three points are surely certain. First, the shambolic exit from Afghanistan is a major setback that will undermine U.S. credibility for years to come. As Henry Kissinger said, “To be an enemy of the US is dangerous, to be a friend is fatal.”

Second, Afghanistan fell because America forgot the lessons of history. It does not understand the world beyond its borders, which is very different from the U.S.

Finally, given the atrocious implementation of the pullout of U.S. troops from Afghanistan, Joe Biden will have to wait a bit before he receives his Nobel Peace Prize. Another black eye for the U.S.

There will be lots of talk in the coming days about the harsh lessons to be learned from America’s retreat from Afghanistan. In April, Biden announced the U.S. would withdraw our military from the country without conditions on the 20th anniversary of the 9/11 attacks. What an awful historical irony that the Taliban will once again be in control on Sept. 11.

Looking back, there are some indisputable facts about what went wrong in Afghanistan, and responsibility certainly belongs to more than one president.

On Oct. 7, 2001, the first of these presidents, George W. Bush, launched Operation Enduring Freedom, the invasion of Afghanistan. The operation sought to bring the architects of 9/11 to justice and reduce the threat of terrorism. Then the Afghan mission, which often lacked strategic clarity, morphed from counterinsurgency to counter-narcotics and then into capacity building to remake Afghanistan as an award-winning liberal democracy.

The result is a painful lesson of what can happen when immense military might is put in the hands of politicians and their minions who lack the understanding to employ it properly. Equally culpable are politicized American military leaders who consistently lied about the strength of the Afghan security forces.

The result is that the Taliban, a UN-designated terrorist group, defeated the world’s greatest military power. This is another self-inflicted blow to America’s reputation, one that will complicate the Biden administration’s goal of checking China’s rise by building coalitions in the Asia-Pacific.

According to the Costs of War project at Brown University, the U.S. has spent more than $2 trillion in Afghanistan since 9/11. That’s $300 million per day for two decades.

And the human costs are even greater. There have been 2,448 U.S. service members killed and over 21,000 wounded in action, along with 3,846 contractors killed. That pales beside the estimated 66,000 Afghan national military and police and over 47,000 Afghan civilians who were killed.

And because the U.S. borrowed most of the money to pay for the war, generations of Americans will be burdened by the cost of paying for it. The Costs of War researchers estimate that by 2050, interest payments alone on the Afghan war debt could reach $6.5 trillion. That amounts to $20,000 for each and every U.S. citizen.
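The quoted figures hang together as rough arithmetic. A back-of-the-envelope check, assuming a $2 trillion total, a 365-day year, and roughly 325 million U.S. residents (the population figure is an assumption, not from the article):

```python
# Sanity check of the war-cost figures quoted above.
# Assumed inputs: $2 trillion over 20 years; 325 million people.

total_spent = 2.0e12                 # "more than $2 trillion" since 9/11
per_day = total_spent / (20 * 365)   # about $274 million/day, i.e. roughly $300M

interest_2050 = 6.5e12               # projected war-debt interest by 2050
per_person = interest_2050 / 325e6   # $20,000 per person

print(f"${per_day / 1e6:.0f} million per day")
print(f"${per_person:,.0f} per person")
```

The daily figure comes out just under $300 million (the article's "$300 million per day" rounds up, and the Costs of War total is "more than" $2 trillion), and $6.5 trillion spread over about 325 million people is $20,000 each.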

You do not need to support a continued presence in that arid, stone-age country to recognize that things have gone badly. The execution of the U.S. withdrawal has been disastrous, deadly, and humiliating, handing power back to the Taliban in a matter of days. The dramatic unravelling of the situation in Afghanistan puts President Biden’s reputation for foreign policy expertise at risk.

It is worth bearing in mind what former Bush and Obama Defense Secretary Robert Gates wrote in his memoirs: Biden has “been wrong on nearly every major foreign policy and national security issue over the past four decades”.

But not to worry, this is not your father’s Taliban. They are smarter and tougher.

A look at how the ‘Nixon Shock’ changed the global economy

If you asked scholars to name the most important events of the last 50 years of American history, they would likely list the Vietnam War, the Civil Rights Movement, the invention of the computer chip, the Sept. 11 terrorist attacks, the Great Recession that officially lasted from 2007 to 2009, and the COVID-19 pandemic.

Missing from this list would be the so-called Nixon Shock, the 50th anniversary of which is upon us.

In a televised address on Aug. 15, 1971, President Nixon (America’s very own Richard III) announced that he was “closing the gold window,” ending the dollar’s convertibility into gold. Unilaterally ending the last vestiges of the gold standard and eliminating the final link between gold and the dollar was a consequential moment in U.S. financial history.
The Nixon Shock had profound implications for the U.S. and the global economy. It unleashed an era of floating exchange rates, which created a much less stable world economy, since currency values were no longer anchored to anything tangible. Many contend it marked the beginning of an inflationist era of fiat money and created decades of turbulence in currency markets.

The president announced the end of the American commitment to redeem other countries’ dollars for gold at $35 an ounce, a bedrock of the Bretton Woods system of mostly fixed exchange rates that had been in place since 1944 and established the dollar as the world’s reserve currency.

Closing the gold window marked the end of a commodity-based monetary system and the beginning of a new world of fiat currencies backed entirely by the full faith and credit of the governments that issue them. This gave the government and the Federal Reserve greater control over the economy, because they control how much money is printed.

The president’s main concern in 1971 was avoiding a recession that might cost him the 1972 election. He strong-armed Federal Reserve Chair Arthur Burns into keeping interest rates low in the face of rising consumer prices. President Nixon allegedly told Burns, “we can take inflation if necessary, but we can’t take unemployment,” setting the stage for the birth of the Great Inflation of the 1970s, the Age of Aquarius.

In fairness to President Nixon, he inherited an economy from President Johnson that was under serious strain.  Federal spending to simultaneously fight the Vietnam War and build the Great Society created budget deficits that fueled inflation along with the growing U.S. trade deficit.

The U.S. had printed more dollars than it could back with gold. Inflation had started to rise in the second half of the 1960s, soaring from a mere 1.4 percent in 1960 to 13.5 percent in 1980.

Put plainly, too many dollars were abroad. By 1971, the pledge that an ounce of gold was worth $35 had become void; the federal government could no longer honor it. So, it severed the link. The value of the dollar in foreign exchange markets suddenly plummeted, which raised import prices as well as the prices of most commodities priced in dollars.

For sure, the Nixon Shock was not the only reason for the accelerating inflation of the 1970s. For example, the Organization of the Petroleum Exporting Countries announced an oil embargo against the U.S. during the October 1973 Yom Kippur War in Israel. Oil prices surged by 400 percent and U.S. economic activity instantly dropped. In 1973 the U.S. entered the deepest recession since the Great Depression, but this time it was coupled with price inflation, not the deflation of the 1930s.

The Nixon Shock was another painful example of the politicization of the economy. Sound familiar? A key lesson for today is that price stability is paramount for a strong and growing economy. Tolerating high inflation in an effort to stimulate the economy is a dangerous game to play.

The U.S., China, and Taiwan

There is no getting around the fact that the United States’ primary strategic competitor for global leadership is the People’s Republic of China, which continues to extend its diplomatic, economic, and military influence internationally. Quite apart from China becoming the world’s second largest economy and its leading trading nation, policy makers increasingly describe its military buildup as a threat to U.S. and allied interests in the Indo-Pacific.

Put simply, the Pentagon considers China its most serious competitor. Taiwan may be the issue with the greatest potential to turn competition into direct confrontation. Many military analysts note that after two decades of counterinsurgency wars, the U.S. can no longer be certain of its ability to uphold a favorable balance of power in the Indo-Pacific.

By contrast, China has the military strength, and in particular the long-range missile capability, to overwhelm the U.S. in the Indo-Pacific region according to the United States Studies Centre at the University of Sydney. China is now an adversary that is also a military peer. It is in the enviable position of being able to use limited force to achieve a fait accompli victory over Taiwan before the U.S. could respond.

This is not unthinkable, since the Chinese Communist Party regards Taiwan as an inalienable part of China. The U.S. needs to defend Taiwan effectively against a Chinese invasion or blockade, because doing so is central to frustrating China’s strategy of achieving regional hegemony. For many countries in the region, Taiwan is the canary in the coal mine: a strong indicator of how far the U.S. would go to defend them against China.

The two-million-strong People’s Liberation Army (PLA) is the primary concern of U.S. defense experts. According to a 2020 Department of Defense report, the PLA has “already achieved parity with—or even exceeded—the US” in several of the areas on which it has focused its military modernization efforts in the Indo-Pacific region, where China certainly has the home-court advantage.

The PLA’s modernization program has been supported by China’s rapidly growing economy and augmented by the purchase and alleged theft of militarily useful technologies. In 1996, China was deeply embarrassed and humiliated in the Taiwan Strait Crisis when the U.S. responded to Chinese missile threats meant to intimidate Taiwan with a massive show of force.

Two U.S. aircraft carrier groups deployed to the strait and exposed the weakness of the PLA Navy compared to the U.S. fleet. In response, China’s defense budget rose by about 900 percent between 1996 and 2018 and is now the world’s second largest, behind the U.S.

For context, it should be acknowledged that the threats along China’s vast frontier should not be discounted.  With a 13,743-mile land border, it counts 14 sovereign states as neighbors.  It also shares maritime borders with Brunei, Indonesia, Japan, South Korea, Malaysia, the Philippines, and Taiwan.

It should come as no surprise that among China’s grand ambitions is to extend its influence along its frontiers through means such as building and militarizing islands to gain exclusive control over the South China Sea, through which about $3 trillion of trade, or a third of the world’s cargo transport, flows each year.

Failure to respond to the growing threat China poses to its Indo-Pacific neighbors would raise questions about the U.S.’s willingness and capacity to act as a security guarantor in the region.  Essentially, the U.S. needs support from allies and partners in the region to deter Chinese adventurism, including a potential attack on Taiwan.

The stakes could not be higher in this contest.  As historian Niall Ferguson recently wrote: “Perhaps Taiwan will turn out to be to the American Empire what Suez was to the British Empire in 1956: the moment when the imperial lion is exposed as a paper tiger.  Losing Taiwan would be seen all over Asia as the end of American predominance.”

The First Amendment and free speech

While many national constitutions come and go every few decades, the U.S. Constitution has served the purpose for which it was intended for more than two centuries. The United States is proud of its tradition of freedom of speech that was established in the First Amendment to the Constitution.

It allows for public criticism of the government. Without it, such behavior could land you in prison – just ask Russian opposition leader Alexei Navalny. Still, there were many times in American history when this principle was traduced.

For example, some of the same people who ratified the Bill of Rights voted in Congress in 1798, during the presidency of John Adams, to pass the Alien and Sedition Acts that made it a crime to utter “false, scandalous, or malicious” speech against the government or the president.

The first 10 amendments to the constitution are known as the Bill of Rights.  They were proposed by Congress in September 1789 and ratified by the states in December 1791.

Freedom of speech isn’t the only freedom protected by the First Amendment. It reads: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Freedom of speech is considered a fundamental bedrock of liberty, allowing citizens to express their ideas and bring about changes that reflect the needs of the people. It gives voice to conflicting or dissenting opinions, promoting the healthy debate that moves society closer to realizing America’s founding ideals.

The Civil Rights Movement is a perfect example of free speech in action. During the 1950s and 1960s, activists such as Dr. Martin Luther King, Jr. used free speech as a tool to force change in society. Exercising their voice, these activists helped outlaw the racial discrimination that plagued the country.

But freedom of speech is not an unlimited right. The First Amendment only protects individuals’ speech from U.S. governmental oppression, control, and censorship; it does not extend to private entities. Companies have significant leeway to set their own standards and policies regarding employee conduct.

There is nothing illegal about a private firm censoring people on its platform. For example, Facebook banning former President Trump indefinitely from its platform and Twitter banning him permanently were within the companies’ legal rights in the aftermath of the Capitol incursion on Jan. 6.

The nation has long grappled with which types of speech should be protected and which should not.  Interpreting the broad guarantees of free speech in the First Amendment has not been an easy task.  Over time, the Supreme Court has spilled barrels of ink defining the freedom of speech.  It has upheld people’s right to critique the government and hold political protests, but hasn’t extended protection to those who incite action that might cause harm.

But what constitutes harm is still a matter of debate. For some, it is limited to physical harm, as in the case of falsely shouting “fire” in a crowded movie theater. For others, harm encompasses a compromise to the dignity of others, as in the case of hate speech. Another recent argument is that free speech should be curtailed if it causes offense or makes the listener feel disrespected. This argument may be setting a lower bar for limiting free speech. But that is a story for another day.

In today’s politically charged climate, some people believe government should restrict certain speech.  But thankfully, the First Amendment protects everything from car commercials to fiery protests.

While it may be unfashionable to quote America’s first President, it merits recalling what he said about free speech: “If freedom of speech is taken away, then dumb and silent we may be led, like sheep to the slaughter.”

Naturally, everyone has their own interpretation of those comments.

Demystifying the rule of law

America’s constitutional order is under great stress and foundational principles such as free speech and the rule of law are under attack. The breakdown in respect for American institutions has helped instigate a season of violence and unrest.

The rule of law (ROL) is an expression most Americans are familiar with. It is a popular but vague term often used in political and economic contexts. Americans routinely hear politicians, judges, legislators and prosecutors mention the ROL right up there with freedom and democracy.

Few have paused to say what they actually mean by it. The concept is defined in many ways. For starters, the ROL is an ideal: something to hold up as a standard, a criterion. It is another way of saying that laws as written are applied equally to everyone. The ROL in its most basic form is captured in the popular saying that “no one is above the law.”

It also means that laws should govern a nation and its citizens, as opposed to power resting with a few individuals. In theory, the law of the land is owned by all, made and enforced by representatives of the people.

The notion of the ROL comes with a host of concepts, like the law should be clear, known, and enforced; people are presumed innocent until proven otherwise; the police cannot arbitrarily arrest or detain people without good reason. Laws are interpreted by an independent judiciary which provides for the peaceful settlement of disputes.

The ROL requires that the law be enforced equally. The most marginalized people in our society are entitled to be treated exactly the same way as anyone else. It also requires that laws not discriminate against people on irrelevant grounds such as the color of their skin, their nationality, or their gender.

The concept of the ROL dates back thousands of years. For example, the ancient Greeks ran democratic law courts in the 5th and 4th centuries BC with juries that had hundreds of members. At Runnymede in 1215, King John and his barons agreed to the Magna Carta (Latin for Great Charter).

One might argue that the exalted Magna Carta was the beginning point of English-speaking peoples’ understanding of the ROL.  It was a document in which, for the first time, monarchs and government leaders agreed to subject themselves to the law, recognized that people were entitled to equality before the law and had a right to a jury trial.  The immediate practical consequence of Magna Carta was the establishment of an elected assembly to hold the monarchy to its side of the bargain.  These were momentous new concepts.

In the U.S., the most visible symbol of the ROL is the constitution, which was drafted by a special convention in Philadelphia in 1787.  It is the framework for effective and limited government and the supreme law of the land.  A congressman once delivered one of the truest statements of American political theory: “There is a straight road which runs from Runnymede to Philadelphia”.

The American effort to make good on the promise of the ROL has been difficult and sometimes bloody.  There is no getting around it – America has struggled to create a legal system that is fair to all its people.

The most glaring example is that the U.S. Constitution did not address the problem of slavery, despite the words in the Declaration of Independence that “all men are created equal”. This was the great flaw in American constitutional history.

America and other countries subscribing to the notion of the rule of law have considerable hard work to do to negotiate the distance between the ideal and the reality on the ground.

The forgotten tribe: America’s working class

Countless working-class Americans of all races and ethnicities, who work hard and play by the rules, are fed up with the extreme partisanship that permeates the country, and with meaningless acts of violence, including the storming of the Capitol. These people are the forgotten tribe in America.

In general, working class people are those with a high school diploma but less than a four-year college degree who live in households with annual incomes roughly between $30,000 and $70,000 for two adults and one child. They are somewhere between the poor and the middle class.

Americans by some measures are more deeply divided politically and culturally than ever before. We live in a period of competing moral certitudes, of people who are sure they are right and prepared to engage in violence to make their point.

For many years now, political correctness; cancel culture; social justice; multiculturalism; the all-pervasive claim to victimhood; judging people on their ethnicity, gender and race rather than the merits of their work; and the politicization of just about everything have generated more heat and fumes than light. For all their rosy rhetoric on the subject, the ruling elites have less experience with ethnic and racial diversity than the working class.

These factors, and probably dozens of others, are contributing to the breakdown in the American genius for reaching compromises that meet the real social and economic needs of the working class.

Both the extreme right and the extreme left are corroded by ideology. Extremists on the right label their counterparts on the left socialists, and the left calls the right fascists. Each faction takes the law into its own hands while politicians see which way the wind is blowing and refuse to intervene. The growing divisions help explain why the nation’s political center is shrinking.

At the same time, the media, both traditional and social media, have accelerated the fragmentation of cultural and political identities. Conservative and liberal TV networks only highlight information that confirms their audiences’ biases, creating ideological echo chambers.

The worst of the fallout from this polarization will be felt by the forgotten tribe. These issues have done little to help them make ends meet and keep their families safe from COVID. Is it any wonder when they walk past a statue of that schnorrer Thomas Jefferson they don’t experience any trauma? Working people, after all, have to work.

America’s working class doesn’t have the luxury of engaging in ideological pursuits; they have to take care of their families: paying for groceries and medical bills, making mortgage or rent payments. The pampered and self-consciously fortunate regard the working class as “deplorables,” half of whom believe Elvis is still alive. Their understanding of diversity is the comic-book version. They live in white neighborhoods, send their kids to private schools, and summer in the Hamptons.

These ruling elites don’t have to live with the unintended consequences of their decisions. The working class are the ones who have to work. As long as they do, it hardly matters what color their skin is or what accent they have. All the while, the economic system directs food, shelter, and energy away from those who need them most and toward those who need them least.

The causes of the forgotten tribe’s problems have been well documented: The rate and speed of technological changes, growing monopoly power and concentration, and globalization. Is it any wonder why the working class is losing hope in a better future (get real, they are not Bill Clinton)? They are an endangered species, living paycheck to paycheck.

Despite copious amounts of cash provided to families and unemployed workers, COVID-19 rescue plans don’t provide long-term solutions for making work pay, giving the working class the education and skills needed to get better work, or strengthening families and communities to support work. These omissions only exacerbate the fraying of America’s social and political fabric.

Powell manifesto addressed American economic system under attack

History often has a hidden beginning. Since the 1970s, people who are already well off have enjoyed a rising percentage of income and wealth. Meanwhile, ordinary Americans face declining social mobility, a shrinking middle class, widening income inequality and crumbling infrastructure. There is plenty to be mad about and plenty of blame to go around.

The economic struggles of the American working class since the late 1970s were not just the result of globalization and technology changes. A long series of public policy changes favored the wealthy. Some argue these changes were the result of sophisticated efforts by the corporate and financial sectors to change government policy, from tax laws to deregulation, to favor the wealthy.

In August 1971, less than two months before he was nominated to serve as an associate justice of the Supreme Court, Lewis F. Powell Jr. sent a confidential memorandum to his neighbor and friend Eugene B. Sydnor Jr., chair of the Education Committee of the U.S. Chamber of Commerce. Powell was a leading Virginia corporate lawyer and a former president of the American Bar Association who served on 11 corporate boards.

The 34-page memo was titled “Attack on American Free Enterprise System.” It presented a bold strategy for how business should counter the “broad attack” from “disquieting voices.” The memo, also known as the Powell manifesto, did not become available to the public until after he was confirmed.

He began the memo this way: “No thoughtful person can question that the American economic system is under broad attack.” He went on to write that the assault was coming from “perfectly respectable elements of society: the college campus, the pulpit, the media, the intellectual and literary journals, the arts and sciences, and from politicians.” American business believed it faced a hostile political environment in the late 1960s, under attack from the growth of government authority under the Great Society and from a wave of new regulations ranging from the environment to occupational safety to consumer protection.

The memo outlined a bold strategy and blueprint for corporations to take a much more aggressive and direct role in politics. Powell was following Milton Friedman’s argument that it was time for big business to focus on the bottom line; it was time to fight for capitalism. Powell proposed waging the war on four fronts: academia, the media, the legal system, and politics.

The memo influenced, for example, the creation of new think tanks such as the Heritage Foundation, the Manhattan Institute, and other powerful organizations. As Jane Mayer wrote, the Powell Memo “electrified the Right, prompting a new breed of wealthy ultraconservatives to weaponize their philanthropic giving in order to fight a multifront war of influence over American political thought.”

The venerable National Association of Manufacturers moved its offices from New York City to Washington. Its CEO noted: “The relationship of business with business is no longer so important as the interrelationship of business with government.” The number of corporations with public affairs offices in Washington grew from 100 in 1968 to over 500 in 1978. In 1971, only 175 firms had registered lobbyists in Washington; by 1982, nearly 2,500 did.

When it comes to lobbying, money is the weapon of choice. It looms over the political landscape like the Matterhorn. The number of corporate political action committees (PACs) increased from under 300 in 1971 to over 1,400 by mid-1980. The money they spread around gave lobbyists the clout they needed. The growth of super PACs and lobbyists ensured that any piece of relevant regulation would be watered down, first in Congress and then during implementation.

The Powell memo galvanized corporate America and enlarged the influence of big business over the political landscape. It encouraged business to play a more active role in American politics. Corporate America and the 1 percenters got the memo.

Anti-Catholic bigotry

When President Trump nominated Seventh Circuit Judge Amy Coney Barrett to the Supreme Court last month, some media outlets and politicians suggested she would bring her Catholic faith onto the bench when deciding matters of law.

The roughly 51 million Catholic adults in the United States are racially and ethnically diverse. Politically, registered Catholic voters are evenly split between those who lean toward the Democratic Party (47 percent) and those who favor the Republicans (46 percent).

For a long time, many Americans have seen Catholics as taking their cues from Rome rather than the U.S. Constitution. In the mid-19th century, nativist groups combined to form the Supreme Order of the Star-Spangled Banner, which was obsessed with a hatred of Catholics. Members of the movement ultimately came to be labeled the “Know-Nothings.” They demanded that Catholics be banned from holding public office and feared that the growing Irish population was making the church a force in American government.

For years, American politics remained plagued by widespread anti-Catholic sentiment, especially in the South. The Irish bore the brunt of tensions that sometimes erupted into violence between Catholics and the Protestant majority. It was another instance where white privilege was not equally distributed.

Today Catholics are fully assimilated into society. They inhabit an increasingly secular world in which theological dictates from the church carry far less weight than in earlier generations. Catholics, like members of any faith, pick and choose which teachings to follow. For instance, many U.S. Catholics want the church to allow priests to marry, to ordain women, and to come down hard on child abuse by priests and on the church’s shameful cover-up of it.

The nomination of then-Professor Amy Coney Barrett to the Seventh Circuit Court of Appeals in 2017 stirred up an awakening of anti-Catholicism. California Senator Dianne Feinstein, who has the kind of voice that makes you wish you had a remote control, tried to undermine the candidate’s legitimacy because she was a Catholic. Feinstein, the top Democrat on the Senate Judiciary Committee, told Professor Barrett, “When you read your speeches, the conclusion one draws is that the dogma lives loudly within you. And that’s of concern.”

Senator Feinstein’s brand of bigotry is less like old-fashioned anti-Catholicism and more an objection to Catholicism’s refusal to distinguish between public and private moral duties, as when the Little Sisters of the Poor fought the Affordable Care Act’s contraception mandate all the way to the Supreme Court and won, or in the church’s opposition to capital punishment. Of course, there was no mention of how some governors and mayors are keeping houses of worship closed because of the coronavirus while opening schools, businesses, and even athletic events. Hypocrisy is alive and well.

Ironically, it was the questioning of Judge Barrett during her previous confirmation hearing three years ago, and the subsequent blowback, that led Senate Judiciary Committee members to avoid obsessive and nauseating spritzing about her Catholicism. Republican senators were smart to frame any questioning of the judge’s religious beliefs as bigotry, hoping the Democrats would alienate Catholic voters just before the Nov. 3 election.

Democrats avoided the trap. While arguing that a Justice Barrett would jeopardize Roe v. Wade and the Affordable Care Act, they bent over backward to make clear that they did not oppose the nomination because of her Catholicism. Other senators asked such probing questions as whether she supports white supremacy, whether she has ever committed a sexual assault, who does the laundry in the Barrett household, and whether she plays a musical instrument.

The rest, as they say, is pure commentary.