The First Amendment and free speech

While many national constitutions come and go every few decades, the U.S. Constitution has served the purpose for which it was intended for more than two centuries. The United States is proud of its tradition of freedom of speech that was established in the First Amendment to the Constitution.

It allows for public criticism of the government. Without it, such behavior could land you in prison – just ask Russian opposition leader Alexei Navalny. Still, there have been many times in American history when this principle was betrayed.

For example, some of the same people who ratified the Bill of Rights voted in Congress in 1798, during the presidency of John Adams, to pass the Alien and Sedition Acts that made it a crime to utter “false, scandalous, or malicious” speech against the government or the president.

The first 10 amendments to the Constitution are known as the Bill of Rights.  They were proposed by Congress in September 1789 and ratified by the states in December 1791.

Freedom of speech isn’t the only freedom protected by the First Amendment.  It reads: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Freedom of speech is considered a fundamental bedrock of liberty, allowing citizens to express their ideas and bring about changes that reflect the needs of the people.  It gives voice to conflicting or dissenting opinions, promoting the healthy debate that moves society closer to realizing America’s founding ideals.

The Civil Rights Movement is a perfect example of free speech in action.  During the 1950s and 1960s, activists such as Dr. Martin Luther King, Jr. used free speech as a tool to force change in society.  By exercising their voice, these activists helped bring about laws outlawing the racial discrimination that plagued the country.

But freedom of speech is not an unlimited right. The First Amendment only protects individuals’ speech from U.S. governmental oppression, control, and censorship; it does not extend to private entities. Companies have significant leeway to set their own standards and policies regarding employee conduct.

There is nothing illegal about a private firm censoring people on its platform.  For example, Facebook banning former President Trump indefinitely from its platform and Twitter banning him permanently were within the companies’ legal rights in the aftermath of the Capitol incursion on January 6.

The nation has long grappled with which types of speech should be protected and which should not.  Interpreting the broad guarantees of free speech in the First Amendment has not been an easy task.  Over time, the Supreme Court has spilled barrels of ink defining the freedom of speech.  It has upheld people’s right to critique the government and hold political protests, but hasn’t extended protection to those who incite action that might cause harm.

But what constitutes harm is still a matter of debate.  For some, it is limited to physical harm, as in the case of falsely shouting “fire” in a crowded movie theater.  For others, harm encompasses an affront to the dignity of others, as in the case of hate speech.  A more recent argument holds that free speech should be curtailed if it causes offense or makes the listener feel disrespected, which sets a lower bar for limiting free speech. But that is a story for another day.

In today’s politically charged climate, some people believe government should restrict certain speech.  But thankfully, the First Amendment protects everything from car commercials to fiery protests.

While it may be unfashionable to quote America’s first President, it merits recalling what he said about free speech: “If freedom of speech is taken away, then dumb and silent we may be led, like sheep to the slaughter.”

Naturally, everyone has their own interpretation of those comments.

Electric cars one of several disruptions that will steer auto industry

Electric vehicles, driverless cars, ridesharing, changing patterns of vehicle ownership and use, and the recognition that climate change poses an existential threat are just a few of the major disruptions that may force automakers to modify their current business models.

Climate scientists contend that electric vehicles are one of the best ways to reduce greenhouse gas emissions from transportation, most of which come from cars and trucks. In the United States, the transportation sector is the largest source of emissions, and the automobile industry is under great pressure to meet regulatory emissions targets and do its bit for the planet. Automobile firms, their suppliers, and other mobility players must adapt to an emerging future that threatens their existing business models.

For example, car sharing may undermine the pattern of single-family ownership that has been fundamental to automobile firms’ business model for over a century. If a ride-sharing company such as Lyft or Uber can send a fully self-driving vehicle to customers’ doors and take them wherever they want to go on command, those customers may feel most closely connected to that service, not to the automaker.

Electric vehicles will also impact the traditional business model. For starters, there has been a dramatic increase in new entrants into the market, and sales and distribution channels are moving from physical dealerships to online stores. Service requirements for electric vehicles are lower than for gas-powered internal combustion vehicles because EVs are mechanically simpler, and gross margins may shrink as competition intensifies and electric vehicle prices fall over time. In addition, policy makers in Washington will continue to promote and support faster electric vehicle adoption to deal with climate change.

Increasing electric vehicle ownership is at the heart of the Biden administration’s $2.3 trillion infrastructure package. It would provide $174 billion to spur development and adoption of electric vehicles, including incentives to buy them and money to install more EV charging stations across the country – 500,000 of them by 2030 – so people will feel confident they won’t run out of juice. There are currently about 41,000 charging stations in the U.S., compared to more than 136,000 gas stations.

Surveys indicate that while consumers’ appetite for electric vehicles has grown significantly, they remain concerned about the price of battery-powered cars, which can cost up to $10,000 more than conventional vehicles. But total operating costs for electric vehicles may well be less than for conventional ones. Lower maintenance and fueling costs may offset the higher upfront price over time. Electric vehicles also have fewer moving parts, and they don’t require oil changes.
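
As a rough illustration of that offset, here is a back-of-the-envelope sketch in Python. Every figure in it is an assumption chosen for illustration, not sourced data.

    # Back-of-the-envelope total-cost-of-ownership comparison.
    # All figures are illustrative assumptions, not sourced data.
    YEARS = 8                    # assumed ownership period
    MILES_PER_YEAR = 12_000      # assumed annual mileage

    ev_price, gas_price = 40_000, 30_000   # assumed purchase prices
    ev_fuel, gas_fuel = 0.04, 0.12         # assumed energy/fuel cost per mile
    ev_maint, gas_maint = 400, 900         # assumed annual maintenance

    def total_cost(price, fuel_per_mile, maint_per_year):
        """Purchase price plus fuel and maintenance over the ownership period."""
        return price + YEARS * (MILES_PER_YEAR * fuel_per_mile + maint_per_year)

    print(f"EV:  ${total_cost(ev_price, ev_fuel, ev_maint):,.0f}")     # EV:  $47,040
    print(f"Gas: ${total_cost(gas_price, gas_fuel, gas_maint):,.0f}")  # Gas: $48,720

Under these assumed numbers, the $10,000 sticker premium is roughly recovered within the ownership period; different mileage, energy prices, or incentives would shift the break-even point.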

The hope is that federal largesse will push the growth of electric vehicles, which currently make up just 2 percent of the new car market and 1 percent of all cars, sport-utility vehicles, vans, and pickup trucks on the road, according to the Department of Energy.

Autonomous or fully self-driving vehicles represent perhaps an even greater disruption for the automotive industry, although considerable uncertainty remains around them despite heavy investment. It is difficult for potential customers to imagine what a community in which these vehicles are a viable transportation option would look like.

Even once fully self-driving cars are available, it is extremely difficult to predict their rate of proliferation.  It remains unclear whether they are five, 10, or 15 years away. In any case, they may lead to declining traditional car sales.

All these factors are significantly altering the auto manufacturing landscape. Incumbents will be forced to change their business models, leading to wholesale modifications in their manufacturing base, the closure of current facilities, adjustments to their dealership networks, and fundamental changes to their overall cost structure.  This kind of disruption does not come easily to large, mature companies.

One thing is certain: How people move from one location to another affects numerous aspects of daily life along with hundreds of related industries, and it will be changing in the near future.

Managing the demographic risk of an aging population

One trend that has been largely overlooked by the movers and shakers is our aging population. It is one of the forces that will shape society and the global economy over the coming decades, and governments need to adjust their policies accordingly.

Around the world, workforces are steadily aging. Among the key drivers of a rapidly aging population are declining fertility rates, increased longevity, and the decline in mortality rates. For example, retiring baby boomers in the United States will live longer, but there will not be enough new births to offset the surge in the ranks of the elderly.

The world’s fertility rate fell from five children per woman in 1950 to roughly 2.5 today and is projected to drop to two by 2050. This decline has been the result of such factors as the rising social status of women and their increased participation in the workforce, widespread availability of birth control, and the increasing costs of raising children.

On the other hand, global life expectancy has increased from 50.09 years in 1960 to 72.6 years in 2019 and is expected to rise to 75 years by 2050. In the United States, life expectancy is projected to increase by about six years from 79.7 in 2017 to 85.6 in 2060. By 2035, there will be more people in the U.S. aged 65 and over than there will be children under 18, according to the Gerontology Institute at the McCormack School of Policy and Global Studies at UMass Boston.

The reasons for increased longevity include advances in health care, increased emphasis on personal and social hygiene, and increased government programs for the elderly.

In the developed world, the ratio of dependents to workers is rising sharply as baby boomers retire. Retirees are not only living longer but are increasingly prone to dementia at older ages. As the CEO of the Dana-Farber/Harvard Cancer Center said, one out of three people who reach 85 will have Alzheimer’s. This is a group largely dependent on others to help with daily living. As the need for caregivers intensifies, there will be fewer workers available for other work.

A rising dependency ratio is inflationary because dependents consume but do not produce. The growth in retirees may trigger a vicious cycle of slower economic growth and higher taxes. Going forward, policy makers should plan for a progressive decline in the size of the labor force.

With fewer people producing goods and services and significantly more non-working people consuming them, global supply will tend to lag demand. Combined with a greater bargaining power of the workforce in wage negotiations, this may increase inflation.
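
The arithmetic behind that argument is simple enough to sketch. In the toy example below, the population figures are invented for illustration only.

    # Dependents per 100 working-age people; all population figures invented.
    def dependency_ratio(children, elderly, workers):
        """Dependents (young plus old) per 100 working-age people."""
        return 100 * (children + elderly) / workers

    # Today: a hypothetical population of 100 million
    print(dependency_ratio(children=20, elderly=16, workers=64))  # 56.25

    # Later: same total population, but retirees grow at workers' expense
    print(dependency_ratio(children=18, elderly=26, workers=56))  # ~78.6

Total consumption demand is roughly unchanged while the productive base shrinks – the supply-lags-demand mechanism described above.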

Meanwhile, workers are likely to consume more as a labor shortage pushes up wages. Investment will rise in advanced countries as companies substitute capital for more expensive labor. Rising wages may narrow the galloping inequality gap.

Despite these facts, many business leaders and policymakers don’t have a good grasp of the realities of an aging population and the economic challenge it will pose. Aging populations increase the financial burden on governments, creating a pension time bomb and increasing demands on health care and elderly care systems.

But these outcomes are not inevitable. Greater longevity presents individuals, employers, and policy makers with opportunities to help the elderly live more purposeful lives. Policy makers should take steps to harness the productive potential of older people, for example by promoting an education policy that includes a strategy for supporting lifetime skill formation.

The famous maxim that “demography is destiny” may or may not be attributable to Auguste Comte, the 19th century French sociologist. But it was certainly Comte who first wrote about how population trends could determine the future of a country.

But demographic destiny is not immune to change. Just as societies must adjust their lifestyles to adapt to climate change, societies with aging populations must adjust their policies to promote economic growth.

Threat of rising inflation could burst any number of asset bubbles

Identifying an asset bubble is not easy. Bubbles are only obvious after they have burst – time alone gives definition. Americans may soon get another lesson in asset bubbles. The threat of rising inflation could burst any number of them.

One working definition of a bubble is a situation in which an asset’s market price far exceeds its fundamental value and cannot be justified by estimated earnings: an artificially high valuation based on misconceptions that distort reality.

There is certainly disagreement about the correct measure of earnings, and fundamental value can only be estimated. Even if the diagnosis is correct, you don’t know when the bubble will burst.
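
One crude way to make the definition concrete – a sketch, not a model anyone should trade on – is to estimate fundamental value as a discounted stream of earnings and compare it with the market price. The earnings, discount rate, and growth rate below are all assumptions.

    # Crude bubble check: compare the market price with a discounted-earnings
    # (Gordon growth) estimate of fundamental value. All inputs are assumed.
    def fundamental_value(earnings, discount_rate, growth_rate):
        """Present value of a perpetually growing earnings stream."""
        assert discount_rate > growth_rate, "model requires r > g"
        return earnings / (discount_rate - growth_rate)

    value = fundamental_value(earnings=5.00, discount_rate=0.08, growth_rate=0.03)
    print(value)             # 100.0 -- the price the earnings would justify

    market_price = 250.0     # assumed market price
    if market_price > 2 * value:
        print("price far exceeds estimated fundamentals")

The exercise mostly proves the caveat in the text: every input is an estimate, so the diagnosis is always contestable, and even a correct diagnosis says nothing about timing.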

Put simply, bubbles are booms that go bust, ending in a crash – a rapid decline in market prices. After all, that is what bubbles are supposed to do.

Bubbles make for interesting stories. Charles Mackay’s classic book, “Memoirs of Extraordinary Popular Delusions and the Madness of Crowds,” was first published in 1852 and is still in print. Fabled asset bubbles include the Dutch tulip bubble of the 1630s, the South Sea and Mississippi bubbles of 1720, the run-up in American stock prices in the 1920s, the dot-com bubble of the late 1990s, and the housing bubble of the mid-2000s, when U.S. housing prices were 50 percent above their long-term trend.

Today there may be asset bubbles in specific markets, such as housing, cryptocurrencies, and stocks. U.S. housing prices rose more than 10 percent last year. The housing market is booming for one key reason: low interest rates. Prices for cryptocurrencies – assets that produce no cash flows – rose more than 500 percent in the last year. This surge in the price of unregulated cryptocurrencies such as Bitcoin and Dogecoin has attracted the attention of many investors.

Then there is the incredible explosion in Special Purpose Acquisition Companies (SPACs). A SPAC is a company formed strictly to raise capital through an initial public offering for the purpose of acquiring an existing company. Also known as “blank check companies,” SPACs have been around for decades. But new issuance of SPACs has skyrocketed over the past year. Over $75 billion was issued in 2020. Less than three months into this year, they have raised more than $78 billion.

The U.S. stock market ended 2020, another year that will live in infamy, at record highs. After bottoming out during the initial COVID-19 lockdown in March, the S&P 500 rose 68 percent from its low, finishing the year up more than 16 percent and shattering all-time records along the way. The Dow Jones Industrial Average and the tech-heavy Nasdaq gained 7.25 percent and 43.6 percent respectively, despite the public health and economic crises.

One condition that typically accompanies asset bubbles is easy credit that turbocharges speculation and benefits borrowers.

Federal Reserve Chairman Jerome Powell is continuing the massive expansion of money and credit. Americans hope these policies don’t make a monkey out of Darwin.

Powell recently said the Fed won’t raise rates until it sees a 3.5 percent unemployment rate and inflation averaging better than 2 percent. Inflation is not high on the Fed’s list of worries. Its top concern is mending the labor market. Both the Fed and the Biden administration are focused on getting the 10 million unemployed back on payrolls.

The bond market appears skeptical of letting inflation rise: 10-year bond rates are increasing – a signal from investors that they expect higher inflation. The bond market is apparently concerned that inflation would cut into buying power and force the Fed into rate hikes that could pop some of the asset bubbles that inflated thanks to rock-bottom interest rates, creating big risks to the economy.

To strike a faintly optimistic note, the accelerating rollout of COVID-19 vaccines has set the stage for rapid economic recovery in the second half of this year, hopefully with limited inflation. So, place your bets accordingly.

Demystifying the rule of law

America’s constitutional order is under great stress and foundational principles such as free speech and the rule of law are under attack. The breakdown in respect for American institutions has helped instigate a season of violence and unrest.

The rule of law (ROL) is an expression most Americans are familiar with. It is a popular but vague term often used in political and economic contexts. Americans routinely hear politicians, judges, legislators and prosecutors mention the ROL right up there with freedom and democracy.

Few have paused to say what they actually mean by it. The concept is defined in many ways. For starters, the ROL is an ideal, something to look to as a standard, a criterion. It is another way of saying that laws as written are applied equally to everyone. The ROL in its most basic form is captured in the popular quote “no one is above the law.”

It also means that laws should govern a nation and its citizens, as opposed to power resting with a few individuals. In theory, the law of the land is owned by all, made and enforced by representatives of the people.

The notion of the ROL comes with a host of concepts: the law should be clear, known, and enforced; people are presumed innocent until proven otherwise; and the police cannot arbitrarily arrest or detain people without good reason. Laws are interpreted by an independent judiciary, which provides for the peaceful settlement of disputes.

The ROL requires that the law be enforced equally.  The most marginalized people in our society are entitled to be treated exactly the same way as anyone else.  It also requires that laws not discriminate against people for irrelevant reasons such as the color of their skin, their nationality, or their gender.

The concept of the ROL dates back thousands of years.  For example, the ancient Greeks ran democratic law courts in the 5th and 4th centuries BC with juries that had hundreds of members.  At Runnymede in 1215, English leaders signed the Magna Carta (Latin for Great Charter).

One might argue that the exalted Magna Carta was the beginning point of English-speaking peoples’ understanding of the ROL.  It was a document in which, for the first time, monarchs and government leaders agreed to subject themselves to the law, recognized that people were entitled to equality before the law and had a right to a jury trial.  The immediate practical consequence of Magna Carta was the establishment of an elected assembly to hold the monarchy to its side of the bargain.  These were momentous new concepts.

In the U.S., the most visible symbol of the ROL is the constitution, which was drafted by a special convention in Philadelphia in 1787.  It is the framework for effective and limited government and the supreme law of the land.  A congressman once delivered one of the truest statements of American political theory: “There is a straight road which runs from Runnymede to Philadelphia”.

The American effort to make good on the promise of the ROL has been difficult and sometimes bloody.  There is no getting around it – America has struggled to create a legal system that is fair to all its people.

The most glaring example is that the U.S. Constitution did not address the problem of slavery, despite the words in the Declaration of Independence that “all men are created equal”. This was the great flaw in American constitutional history.

America and other countries subscribing to the notion of the rule of law have considerable hard work to do to negotiate the distance between the ideal and the reality on the ground.

The forgotten tribe: America’s working class

Countless working-class Americans of all races and ethnicities, who work hard and play by the rules, are fed up with the extreme partisanship that permeates the country, and with senseless acts of violence, including the storming of the Capitol. These people are the forgotten tribe in America.

In general, working class people are those with a high school diploma but less than a four-year college degree who live in households with annual incomes roughly between $30,000 and $70,000 for two adults and one child. They are somewhere between the poor and the middle class.

Americans by some measures are more deeply divided politically and culturally than ever before. We live in a period of competing moral certitudes, of people who are sure they are right and prepared to engage in violence to make their point.

For many years now, political correctness; cancel culture; social justice; multiculturalism; the all-pervasive claim to victimhood; judging people on their ethnicity, gender, and race rather than the merits of their work; and the politicization of just about everything have generated more heat and fumes than light. For all their rosy rhetoric on the subject, the ruling elites have less experience with ethnic and racial diversity than the working class.

These factors, and probably dozens of others, are contributing to the breakdown in the American genius for reaching compromises that meet the real social and economic needs of the working class.

Both the extreme right and the extreme left are corroded by ideology. Extremists on the right label their counterparts on the left socialists, and the left calls the right fascists. Each faction takes the law into their own hands while politicians see which way the wind is blowing and refuse to intervene. The growing divisions help explain why the nation’s political center is shrinking.

At the same time, the media, both traditional and social media, have accelerated the fragmentation of cultural and political identities. Conservative and liberal TV networks only highlight information that confirms their audiences’ biases, creating ideological echo chambers.

The worst of the fallout from this polarization will be felt by the forgotten tribe. These issues have done little to help them make ends meet and keep their families safe from COVID. Is it any wonder when they walk past a statue of that schnorrer Thomas Jefferson they don’t experience any trauma? Working people, after all, have to work.

America’s working class doesn’t have the luxury of engaging in ideological pursuits; they have to take care of their families – paying for groceries and medical bills, making mortgage or rent payments. The pampered and self-consciously fortunate regard the working class as “deplorables,” half of whom believe Elvis is still alive. Their understanding of diversity is the comic book version. They live in white neighborhoods, send their kids to private schools, and summer in the Hamptons.

These ruling elites don’t have to live with the unintended consequences of their decisions. The working class are the ones who have to work. As long as they do, it hardly matters what color their skin is or what accent they have. All the while, the economic system directs food, shelter, and energy away from those who need it most and toward those who need it least.

The causes of the forgotten tribe’s problems have been well documented: the pace of technological change, growing monopoly power and concentration, and globalization. Is it any wonder that the working class is losing hope in a better future (get real, they are not Bill Clinton)? They are an endangered species, living paycheck to paycheck.

Despite copious amounts of cash provided to families and unemployed workers, COVID-19 rescue plans don’t provide long-term solutions for making work pay, for giving the working class the education and skills needed to get better work, or for strengthening the families and communities that support work. These omissions only exacerbate the fraying of America’s social and political fabric.

Leadership lessons from ‘Twelve O’Clock High’

The two best examples of crisis leadership for contemporary students of management and leadership are World War I and World War II. The former is a gold mine of information illustrating virtually every conceivable way of doing things wrong; the latter offers a nice balance between doing things wrong and doing things right.

World War II was actually three separate wars that took place at the same time: United States versus Japan in the Pacific, the United States and the United Kingdom (UK) versus Germany in Western and Southern Europe, and the Soviet Union versus Germany in Eastern Europe.

Germany and Japan started World War II having great successes by doing things right. Then they lost their way and ended up doing everything wrong.

In contrast, the Allies (US, UK, and the Soviet Union) started off doing many things wrong, mainly out of ignorance and illusion, including the misuse of air power.  But they managed to get their respective acts together and wound up doing most things right.  They won the war, and in so doing, reshaped the world.

Running a business has a great many parallels with running a war.  To succeed in either, you must set realistic goals, identify and deploy the resources necessary for achieving those goals, and then skillfully implement the options you select.  After that, you have to roll with the punches that unexpected events inevitably throw at you and adjust your strategy with dispatch.

Two fine Hollywood movies made in the late 1940s effectively dramatize “war situations” that are also common in business.

“Command Decision” is one of the movies with themes that translate to business.  It deals with strategic decision making at the command level.  The other is “Twelve O’Clock High,” which is about a manager taking over a failing bomber group and whipping it into shape through a program of stern discipline.

It is the harrowing story of the first B-17 bombers in England in World War II and the terrible losses they took before long-range fighters were available to escort them on combat missions over Europe. The movie was adapted from a popular novel that was, in turn, based on real events that affected the Eighth Air Force in England during 1942 and 1943.

The new leader immediately incurs the hatred of aircrews when he comes down hard on the lack of discipline.  He deals harshly with slackers, segregating the worst misfits into a crew known as “The Leper Colony”. He openly criticizes mistakes, insists on a high level of professionalism and is a straight talker who appreciates straight talk in return.

Resentful of the new management style, all the pilots ask to be transferred out of the unit.  But the new commander sticks to his principles. As the bomber group develops combat effectiveness, its performance improves and the loss of life decreases, and the pilots change their minds, rallying behind the new commander and his leadership style.

This story dramatizes the steps the leader took to restore the morale of men who had come to regard themselves as “hard-luck failures” – a group with the highest loss rate and the worst bombing effectiveness record – and to motivate them to become a winning team.

The film highlights timeless leadership lessons such as creating a strategy; setting clear expectations; creating performance standards; giving clear directions; putting the right people in the right jobs; communicating the “why”; restoring accountability; and pushing, pushing, and pushing until the job is done.

Whether commanding a bomber group or managing employees towards making their numbers, these leadership qualities are essential and universal, especially in situations of extreme emergency and crisis.

Leadership lessons learned from the 1948 movie ‘Command Decision’

COVID-19 has turned the world upside down, and it is clear that things will not return to the status quo ante anytime soon. The pandemic has provided a test for societies and for their leaders.

One dimension of leadership always in short supply is the ability to tell people the truth, even if the message is unwelcome, such as that things will get worse before they get better.

In the current climate of fear and uncertainty, the United States needs leaders who can make strategic decisions independent of politics and do the right thing.  This dimension of leadership is captured in the excellent 1948 movie about strategic bombing in World War II, “Command Decision.”

The film deals with strategy, leadership, and corporate politics, and is probably the most sophisticated American war film ever made.  It dramatizes a fundamental strategic conflict between two Army Air Force generals. Both are West Point graduates who committed themselves early on to Billy Mitchell’s schtick of air power as a “war-winner in its own right.”

The younger general commands the Eighth Air Force’s strategic bombing units in England during 1943. He’s learned that the Luftwaffe has begun production of a whiz-bang new jet fighter at three plants deep inside Germany. He believes these factories must be bombed into oblivion as soon as possible, no matter what the cost in bomber and aircrew losses, to prevent the jet fighter program from creating a defensive shield over Germany that will make strategic bombing impossible and threaten the planned 1944 invasion of France.

But his older and more politically savvy boss is convinced that the ultimate success of strategic bombing depends on the size of the bomber force Washington allocates to the Eighth Air Force. This will be determined by how effective they are at producing acceptable bombing results without high loss rates,  which rules out the go-for-broke raids the younger general wants to mount against the three jet fighter factories.

The younger general insists that they must take advantage of a period of clear weather to complete the destruction of the factories if strategic bombing of Germany is to have any future.  The older general believes the future of strategic bombing depends on the Eighth Air Force getting enough bombers.  This will be determined at an allocation meeting in Washington, where heavy bomber losses certainly won’t help their case.

In other words, the younger general fights the Germans in Europe while the older general has to fight the US Navy, which wants bombers for the Pacific theater, and the Army ground forces, which want to recycle bomber pilots now in training as company commanders.

These dramatic debates between the two generals are breathtaking; two dedicated pros with very different perspectives about the strategic issue at hand pour out their arguments, hopes, fears, and differing career expectations.

The movie’s sympathies lie with the younger general, and it shows him to be right.  At the time the movie was made, there was widespread public acceptance of Air Force propaganda that its strategic bombing concept had been successful.  It turned out, in retrospect, that the pre-war strategic bombing advocates grossly underestimated the resources needed for the concept to succeed, so the older general was actually right.

The problem the two generals confront is similar to the COVID-19 crisis.  You can impose a protracted lockdown and harm the economy to the point where recovery will take decades, or forgo lockdowns and get the economy moving, but with a significant increase in illness and death.

Few people like to hear bad news, but telling the public what it needs to hear and facing problems is an important test of leadership.  The role of a leader is to do the right thing in addressing a wicked problem that may have no clear solution – only an array of possible approaches, each with deleterious consequences.

The Fed and inflation

Life has changed substantially for ordinary working-class Americans in the first two decades of the 21st century. The deification of technology, the growth of globalization, the harrowing financial events of 2008 followed by the Great Recession, and the COVID-19 pandemic have left them struggling psychologically, physically, socially, and economically.

Growing income and wealth inequality were on the radar screen long before the coronavirus pandemic, but the pandemic has made the problem more obvious and urgent. The actions of the Federal Reserve (Fed) have widened the gap. Quite apart from persistently low interest rates, there is the issue of inflation.

Last August, Fed Chair Jerome Powell introduced a policy that not only allows but welcomes an inflation level above 2 percent. The Fed assumes it will be able to just snap its fingers and stop inflation at whatever point it likes, which is the pinnacle of hubris.

Inflation matters. It tends to redistribute income and wealth toward groups that are better able to hedge against inflation by sheltering their assets in ways that earn decent returns.

But for the ordinary American, prices that rise faster than wages mean a decline in real income, less purchasing power and lower living standards. Inflation coupled with wage stagnation is eating away at the working class.

While the cost of many discretionary goods has fallen during the pandemic, basic necessities such as housing, healthcare, education, and food are absorbing an ever-larger portion of the incomes of ordinary Americans.

The cost of groceries has been rising at the fastest pace in decades since the pandemic seized the economy. It’s as if working-class Americans are involuntarily observing Lent all year round. They experience life at the sharp end.

In the United States, the Consumer Price Index (CPI), which reflects retail prices of goods and services, including housing costs, transportation, and healthcare, is the most widely followed indicator of inflation. Food inflation is a major part of the CPI.

But the Fed generally focuses on “core inflation” or “core CPI.” This measure excludes food and energy prices on the grounds that they are volatile, but the exclusion can give a misleading picture of inflation trends – these are non-discretionary items, and in the real world people can’t exclude food from their weekly budget.
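
The mechanics are easy to sketch: core inflation is just the weighted average price change recomputed with food and energy removed and the remaining weights rescaled. The category weights and price changes below are invented for illustration, not actual BLS figures.

    # Headline vs. core inflation from category weights; all numbers invented.
    weights = {"food": 0.14, "energy": 0.07, "shelter": 0.33, "other": 0.46}
    changes = {"food": 0.039, "energy": 0.050, "shelter": 0.020, "other": 0.012}

    def inflation(weights, changes, exclude=()):
        """Weighted average price change, renormalizing after exclusions."""
        kept = {k: w for k, w in weights.items() if k not in exclude}
        total = sum(kept.values())
        return sum(w / total * changes[k] for k, w in kept.items())

    print(f"headline: {inflation(weights, changes):.2%}")                      # 2.11%
    print(f"core:     {inflation(weights, changes, ('food', 'energy')):.2%}")  # 1.53%

When food and energy are running hotter than everything else, as in this toy example, core comes in below headline – which is how a core-focused Fed can sound sanguine while grocery bills climb.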

According to the latest inflation data published by the U.S. Labor Department’s Bureau of Labor Statistics, another light, or as they now say, lite, read, food prices have increased by nearly 4 percent in the last year, higher than at any point since the 1970s.

The increases are even more dramatic for some food items, with beef and veal prices up 25 percent year-over-year, egg prices up 12 percent, potatoes up 13 percent, and tomato prices up 8 percent.

The report is broken into price changes for “food away from home” and “food at home”. In November, the categories registered year-over-year increases of 3.8 percent and 3.6 percent, respectively.

Rising food prices impact everybody, but they are always top of mind for ordinary working Americans. Even more affected are the poor and the unemployed because they are unable to afford basic necessities. Cutting back on food budgets is one of the first things people do to make ends meet.

Central bankers suffer from a Copernican complex – the belief that the sun and planets revolve around them. Real world experience and history demonstrate that inflation can’t be controlled like a thermostat. But one thing you can be certain of is that inflation has a painful effect on working class Americans.

As the COVID-19 pandemic recedes, the national goal should be to Make America’s Working Class Great Again (MAWCGA). If you believe the intellectual gratin and shekel dispensers in D.C. will internalize that notion, perhaps you would be interested in some prime real estate – something deep in the Everglades.

The downside of low interest rates

The Federal Reserve loves low interest rates.  With rates stuck at low levels since the 2008 financial crisis, they have become the rule rather than the exception.

When the coronavirus pandemic plunged the economy into a sudden freeze, the Fed lowered its benchmark borrowing rate to near zero and purchased corporate and government securities like there was no tomorrow to curb unemployment and stimulate the economy.

The federal funds rate is the rate banks charge one another for overnight loans, and it serves as the benchmark interest rate for the economy. While low interest rates may be great for driving up sales of homes and automobiles, artificially low interest rates punish savers.  Money market and certificate of deposit rates head to near zero when the Fed sets the federal funds rate at near zero.

This action disproportionately hurts senior citizens, retirees, savers, and those folks who prefer less risk.  In accepting the lower yield, those people get less income, less ability to consume, a lower quality of life, and take on more risk in the stock market for which they are not prepared. Nasty choices.

Low interest rates force savers to pursue riskier investments in the hunt for yield.  Ten-year Treasury bonds offer a laughable yield of less than 1 percent, making stocks look attractive.  Thank the Fed for the stock market’s run.  The rise in stocks benefits the wealthiest 1 percent or 10 percent or wherever you want to draw the line, who own more than $11 trillion of stock and mutual fund shares.

The Fed’s fundamental imperative is to strong-arm ordinary Americans to spend, spend, spend, or to invest.  The notion is that if, for example, a savings account provides an interest rate that rounds to zero percent, saving makes no sense – especially when inflation is rising faster than the interest earned on a savings account.  Low-risk investments don’t keep up with inflation, and your money doesn’t have as much purchasing power.

The situation for savers isn’t likely to get better soon.  The Fed chair has said rates would remain near zero at least through 2023, though the Fed insists it won’t take interest rates negative. The reality is that when inflation is factored in people are already experiencing negative interest rates.

When more people spend and invest, the economy expands. Of course, every dollar consumers spend instead of saving is several dollars that would have been available in the future had it been earning interest. As low rates discourage people from saving, they must become more and more reliant on government entitlements in old age.
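
The arithmetic behind that claim is ordinary compounding; the interest rates below are assumptions for illustration.

    # Future value of a dollar saved instead of spent; rates are illustrative.
    def future_value(principal, annual_rate, years):
        """Compound interest: FV = P * (1 + r)**n."""
        return principal * (1 + annual_rate) ** years

    for rate in (0.005, 0.03, 0.05):
        print(f"$1 at {rate:.1%} for 30 years -> ${future_value(1, rate, 30):.2f}")
    # $1 at 0.5% for 30 years -> $1.16
    # $1 at 3.0% for 30 years -> $2.43
    # $1 at 5.0% for 30 years -> $4.32

At near-zero rates, the “several dollars” collapses toward one, and once inflation is subtracted the real return turns negative – the squeeze on savers this passage describes.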

To put the worst construction on it, a policy of constant low interest rates is an idea that deserves to be put on a stretcher and carried back to the leisure of the theory class where it was born.  You don’t have to be Philip Marlowe to know these policymakers have more than they can say grace over and are permanently out of the financial wars.

Low interest rates add to the Iliad of woes faced by ordinary Americans. The working class was in chronic crisis, alliteration aside, even before the pandemic. They work hard to make ends meet and stay out of the grasp of poverty, play by the rules, and do everything asked of them but kick extra points.

What is the right interest rate? Here’s a crazy idea: the free-market interest rate.  Cut out the middleman.  This is the rate you get when the Fed does not interfere in financial markets.

Don’t bet on it; the Fed wants to preserve the status quo – preserve, in other words, the wealth of the One Percent and all that.

But not to worry, money isn’t everything – as long as you have enough of it.