Still More On Strategy

In business, tradeoffs occur when companies have to make choices between strategies that are inconsistent. For example, senior executives’ short-term focus on earnings per share may conflict with long-term shareholder value and derail the firm’s strategy.

The pressure from financial markets can tempt executives to perform unnatural acts, such as managing earnings in a fashion that undermines a long-term strategy. They are motivated by the personal impact of the decision and fixated on the short term.

One downside of stock-based compensation for executives is that it incentivizes strategies that might benefit stock prices in the short term but could be detrimental to firms in the long term.  So rather than repairing the unsafe roof on the factory in Toledo, they decide to postpone the work to meet investor expectations for the quarter.

This asymmetrical reality, coupled with the four-to-six-year average tenure for a CEO, undermines the serious consideration of strategy.  This is unfortunate since the stakes and therefore the costs of failure are high.

Limited resources force executives to carefully and wisely match the resources available to the problem or set of problems at hand. They must not only choose among resources, but also integrate and rationalize their use.

One of the challenges in developing a successful strategy is to set goals that are realistic in the context of finite resources and not to confuse means with ends. Since executives can’t have everything at once, they need a set of goals that recognize the firm’s limitations.  Goals should be feasible, not pipe dreams found in the wild blue yonder.

Closely related, since external circumstances are not static, resources that are valuable now may not be in the future.  When this occurs, executives are faced with the interplay and tradeoffs between internal and external considerations.

Keep in mind that executives are trying to perform all these tasks while trying to cope with the tyranny of day-to-day events and crises.  Functioning in this intense environment inevitably affects the quality of the choices made.

Moreover, all strategies are contextual.  Strategies are derived from and shaped by political, regulatory, social-cultural, economic, and technological forces.  None of these contexts should be ignored.  Context provides meaning to events.

Strategies must operate within these multiple contexts.  Successful executives analyze the varied contexts that shape strategy and the ways in which context and ideas act on each other, from the time they are developing strategy through its implementation, a progression that in turn gives rise to further ideas.

While the definition of strategy may have changed over the decades, in a word, strategy remains consequential and the stakes are huge.  It involves long-term commitments, large allocations of resources, and the making of critical decisions – all in a fiercely competitive environment in which the path forward is often unclear.  Executives need to maintain the ability to see the forest rather than the trees.

Strategy is an attempt to control the future.  Long-term goals are translated into proximate goals for execution.  Changing strategies is like the popular metaphor of changing the direction of an aircraft carrier—it doesn’t happen quickly.

Despite all these complex variables, strategy should be kept simple.  Simplicity does not guarantee success, but complexity begs for failure.  There is a chain of events between resources and goals.  A chain is only as strong as its weakest link, and the more links in the chain, the higher the odds that something will go wrong.  The sovereign role of chance in strategy must be respected.

Surfing for Strategy

Does Justice Potter Stewart’s quip about obscenity, “I know it when I see it,” apply to strategy?  Is strategy some MBA type’s interpretation of elaborate Excel spreadsheets that claim to define the shape of an enterprise’s future?  Is it a carbon copy of something that worked well for another enterprise at a different time and place?

Is strategy solely the product of stained-glass rational thought uncontaminated by the hurly-burly of the real world?  Is it something to fall back on when all else fails? Is it the ad hoc play calling of a CEO whose gut instincts, or maybe just pure luck, have made him or her a Wall Street favorite so far?

The word “strategy” is beguiling, but do we really know what it means?  Coining a workable framework is a task fraught with danger.  It has to be right enough. It must highlight the core of the subject.

Perhaps it is time to return to basics.  Let’s make a sharp right turn and consider a somewhat different, perhaps slightly simpler, perspective common to all strategies used in the business world.  Stated in this rough-and-ready way, and in a manner that invites scholarly challenge, it underlies and accommodates a potentially wide variety of strategies.

Let’s take the capacious model advanced by Arthur F. Lykke, Jr., a military strategist who taught a generation of military leaders at the U.S. Army War College.  He divided military strategy into an ends/ways/means/risk equation.  It is a basic framework for discussing the particulars of a military strategy.  For our purposes, focus on means and ends.

At its most basic, overarching level, strategy is the essential linkage that connects resources with a set of defined, prioritized, and feasible goals that fit the competitive environment.  Usable resources are both tangible and intangible.  Strategy demands the intelligent interaction and integration of all the firm’s significant resources to achieve goals.

It aligns means with ends while reserving some resources for rainy days.  It is the link between resources and goals, the scheme for how to make one produce the other.  The alignment, like beauty, is in the eye of the beholder.

Strategy may not be about asking “who” and “why.”  The question that haunts every strategy may be “how.”  How do you get from means to ends?  It is always the how before the who and why. Strategy happens in the space between means and ends.  It is the relationship that unfolds at the intersection of means and ends.

Strategy, according to this model, is a force multiplier when it provides value added to resources.  This perspective can accommodate the various schools of strategic thought and plausible arguments about various definitions and their imperial claims that they are valid for all times and places.  This vantage point may provide a unifying perspective among various strategies, a conceptual center of gravity covering competitive activities.  All the relevant resources come together to create a center of gravity to bring to bear on achieving the goal.

Again, in this context, it is not the strategist’s job to select goals, but he or she is obliged to contribute to the setting of goals by advising what is possible based on resources.  Strategy frequently fails when the resources prove insufficient to achieve the goals.  This can happen because the wrong resources are in play or because the ends are too ambitious.

The strategy adopted may frequently be dictated by the availability of resources rather than by desired goals.  Executives quickly come to understand that strategy is unavoidably and inevitably about trade-offs. Making trade-offs means accepting limits—saying no to some customers, for example, so that you can better serve others.

Strategy As A Way of Thinking

Strategy may well be a disposition rather than a doctrine for practitioners.  Strategy is a way of thinking about issues in the future tense that goes to the success or failure of an enterprise.  From this disposition, certain positions follow, views of change and innovation key among them, along with a deep sense of situational awareness.

Strategy, while essential, is not everybody’s idea of a good time.  In a world clamorous with so many other demands on their attention, it is challenging for practitioners.  Helpful as the various schools of strategy have been, successful practitioners are not intellectually hostage to any one school, consulting reality before embracing any of them.

On the one hand, the strong hand, practitioners can draw on intelligent arguments in the debate over the superior strategy; on the other hand, the shaky hand, it is hard to know who is right when little guidance is provided on which models and tools to use.  Management theorists who seek the Holy Grail of the Great Single Solution to the problems of business are disappointed.  Successful practitioners understand how each of the various strategies advanced works individually, as well as how they might be combined for best effect.

Behind closed doors, senior leaders embrace a number of approaches and tools to reach a decision as to what their strategy should look like and what they should avoid, informed by their own on-the-ground experience.  Choosing a strategy to meet the specific demands of their competitive environment in mature, nascent, growth, and declining industries is a major effort.  Ultimately, strategy is a way of thinking, not the mindless application of models and tools.

However, how and when to use the various tools and their limits is still an outstanding issue.  How to use business strategies is settled only in the minds of the practitioners who know how to apply the art and blend the various schools.  Should the firm in a mature industry pursue an innovation strategy trying to create new markets?  Should it seek to dominate existing markets, or perhaps use a hybrid strategy?

Also, strategy can be a strange and frustrating subject matter for students who frequently feel as though they are lost in a whiteout, paralyzed with boredom.  Many students are none too enthusiastic to study strategy.

Part of the problem is that students are generally unprepared to receive knowledge that is not immediately useful or exciting, that won’t free them from the financial wars and close the book on their debts.  For many students taking the required course in strategy, time seems to pass more slowly than in a laundromat.

Strategy requires students to have well-stocked minds, which means having knowledge of cross-functional disciplines and acquiring a more than nodding acquaintance with history—in short, to be educated. That means having knowledge of literature, history, and philosophy.  Sadly, historical consciousness is no longer in currency, let alone in vogue.

Students need to think in interdisciplinary terms; invariably that means finding dazzling connections, for as historian Edith Hamilton put it, “to see anything in relation to other things is to see it simplified.”  Instead they struggle with trying to integrate and coordinate various functional areas.  Reference to context is de rigueur when discussing and analyzing a particular case or scenario.

Students get caught sometimes between warring disciplines such as finance, accounting, marketing, and other functional subjects.  This is especially difficult in an academic environment with the pressure to specialize and many students living exclusively in the present.  Students who go into the real world and attempt to practice strategy will quickly gain a healthy respect for the myriad challenges it poses.

Jack Welch and Strategy

Everyone, it seems, is in need of a strategy these days.  Luckily, everyone is a strategist. The word is used promiscuously as a value-enhancing modifier: a strategy for tax preparation, a strategy for losing weight, a strategy for coping with stress and the beat goes on.

Overuse has left the word “strategy” devoid of meaning.

As a practical matter, it is about using your limited resources to achieve the best outcome in situations that are both uncertain and contested. In the business world, books about strategy are legion and usually voluminous. These days, no company would dare to admit it lacks one.

One can argue that references to strategy in a business context started in the 1970s, as American companies became subject to increasing global competition and no longer enjoyed benign market conditions.  In 1964, when Peter Drucker sent his publisher the draft of a new book called Business Strategies, the publisher changed the title to Managing for Results, believing that the word “strategy” was associated with politics and the military, not business.

The post-World War II boom in the United States was produced by a massive, global, industrial-scale war that was not fought on American soil and that radically depleted the industrial capacity of America’s most important competitors and potential competitors, including but not limited to Germany, Japan, and Great Britain.

The American economy benefitted from the Marshall Plan and other spending to help rebuild these nations.  They used much of the money to purchase American goods, and for several decades the United States had very few major global competitors.

For instance, post-World War II Japan relied on close ties with the United States to protect its territorial integrity and regional interests.  This enabled Japan to focus its resources on education, economic development, and nondefense production that created competition for the United States.

America provided assistance to rebuild shattered economies in Western Europe and East Asia and opened up its market to their products.  However, by the 1970s, these countries were competing against American corporations.  By then, thanks to negative trade balances, higher oil prices, the combination of high interest rates, unemployment and inflation, and a crushing defeat in Vietnam, American corporations and households were experiencing real distress.

In response, academics, management consultants, armchair strategists, and corporate executives such as Jack Welch, the CEO of General Electric, the Apple of its time, began to transform their business strategies to acknowledge that international competition was a serious threat.  By then, writing and consulting about business strategy had itself become a big business, offering magic-bullet solutions such as “attack the competitor’s strongest point” and “swim in blue oceans away from the competition” as universally valid nostrums.

Jack Welch understood that large firms could use their scale and scope to deal with increasing foreign competition, leverage international opportunities, and exploit the shift from manufacturing to services in the emerging knowledge-based economy, all while managing to stay cool.

Fortunately for Welch, he came to understand that the strategic resource in the new economy was human capital.  He realized that how strategy plays out depends on the operational effectiveness deployed by the Dilberts in the firm.  This is one reason why he was so insistent on learning and sharing knowledge and expertise throughout the organization.  In sum, he got the strategy right in the context of time and place, communicated it relentlessly, and monitored the strategy’s execution.

He understood that it is easier to grasp strategy in theory than to put it in practice, not least because strategy is difficult to develop and implement.  He likely subscribed to Yogi Berra’s perspective: “In theory there is no difference between theory and practice. In practice there is.”

The BRICS and the Almighty Dollar

When the BRICS (Brazil, Russia, India, China, and South Africa) summit was held last month in South Africa, it highlighted both the group’s main economic strengths and the divergent interests that make it difficult for them to leverage those strengths.  Whether those differences can be resolved will have a major impact on the U.S. in general, and the dominance of the dollar in particular.

Nearly two dozen countries formally applied to join the group. The bloc invited top oil exporter Saudi Arabia, along with Iran, Egypt, Argentina, Ethiopia, and the United Arab Emirates, to join in an ambitious push to expand their global influence as a viable counterweight to the West.  This is certainly the goal of Beijing and Moscow.

Developing countries are increasingly the biggest and most dynamic parts of the world economy.  This has resulted in both the shift of a vast amount of know-how from the West to the rest and the development of new know-how in the rest—not just in China but also in India.

The new BRICS members bring together several of the largest energy producers with the developing world’s biggest consumers, potentially giving the bloc outsized economic clout.  Most of the world’s energy trade takes place in dollars, but the expansion could enhance the group’s ability to push more trade to alternative currencies.

This is a win for China and Russia, who would very much like to undermine the dominance of the US dollar. This would be especially helpful to Russia as its economy struggles with sanctions imposed after its invasion of Ukraine last year.  China is looking to build a broader coalition of developing countries to extend Beijing’s influence and reinforce its efforts to compete with the US on the global stage.

Former French President Valery Giscard d’Estaing called the dollar’s role as the world’s reserve currency “America’s exorbitant privilege.” Most Americans don’t think about the value of the dollar.  But for the rest of the world, its value on currency exchanges is a big deal.

U.S. monetary policy is closely watched around the world because interest rate hikes by the Federal Reserve increase the dollar’s value and make loans denominated in dollars more expensive to repay in local currencies. This is certainly an advantage for the U.S.
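A hypothetical example may help make the mechanism concrete: the debt is fixed in dollars, so when the dollar strengthens against the local currency, the local-currency cost of repayment rises even though nothing about the loan itself has changed. The figures below are made up purely for illustration.

```python
# Hypothetical numbers only: a dollar-denominated loan becomes more expensive
# in local-currency terms when the dollar appreciates.
debt_usd = 1_000_000      # loan principal, in dollars
fx_before = 10.0          # local currency units per dollar before Fed hikes
fx_after = 12.0           # units per dollar after the dollar strengthens

cost_before = debt_usd * fx_before
cost_after = debt_usd * fx_after
increase = cost_after / cost_before - 1
print(f"Local-currency repayment cost: {cost_before:,.0f} -> {cost_after:,.0f} "
      f"({increase:.0%} more)")
```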

But the dollar’s unique position is under threat on several fronts and will likely experience a stress test in the future. The most immediate and unnecessary threat would stem from the self-inflicted wound of the U.S. defaulting on its debt.

One of the bedtime stories D.C. politicians tell themselves is that the dollar is unassailable. If Americans have learned anything from history, it is that there is no escaping it. Moving on from history requires some honesty and truth telling, but truth tellers are an endangered species among the political elite.

There are a growing number of countries, notably China and Russia, that resent the US’s weaponization of the dollar on global markets.  Their de-dollarization efforts bear watching. Another threat arises from technology, as central banks around the world work to develop their own digital currency networks.

Though home to about 40 percent of the world’s population and a quarter of global GDP, the bloc’s ambitions of becoming a global political and economic player have long been thwarted by internal divisions and the lack of a coherent vision.

The BRICS countries also have economies that are vastly different in scale and governments that often seem to have few common foreign policy goals, which complicates their decision-making.  China’s economy, for example, is more than 40 times larger than South Africa’s.

Russia, isolated by the United States and Europe over its invasion of Ukraine, is keen to show Western powers it still has friends. Brazil and India, in contrast, have both forged closer ties with the West.  Given these differences, it is unclear how the group will be able to act in unison and enhance its clout on the global stage.

Can Machines Think?

In 1950, Alan Turing, the theoretical mathematician responsible for breaking the Nazi Enigma code during World War II, who is considered the father of modern computer science and artificial intelligence (AI), posed a fundamental question: “Can machines think?”

Today we are on the verge of answering Turing’s question with the creation of AI systems that imitate human cognitive abilities, interact with humans naturally, and even appear capable of human-like thinking.  These developments have sparked a global discussion about the need for comprehensive and coordinated global AI regulation.

Implementation would be a tall order.  Even if regulations could keep up with the pace of technological change, passing a framework acceptable to countries that would view it through the lens of self-interest would be a daunting task.

Turing was just 41 when he died from poisoning in 1954, a death that was deemed a suicide. For decades, his status as a giant in mathematics was largely unknown, thanks to secrecy around his computer research and the social taboos about his homosexuality.  His story became more widely known after the release of the 2014 movie, “The Imitation Game.”

Alan Turing played a foundational role in the conceptual development of machine learning. For example, one of his key contributions is the Turing Test he proposed in his seminal 1950 paper, “Computing Machinery and Intelligence.”

The Turing Test is a deceptively simple method of determining whether a machine can demonstrate human intelligence.  If a machine can converse with a human without the human consistently being able to tell that they are conversing with a machine, the machine is said to have demonstrated human intelligence.
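One way to visualize the structure of the test is as a simple loop in code. The sketch below is purely illustrative, not Turing’s own formulation: the two reply functions are hypothetical stand-ins rather than real AI systems, and the “judge” guesses at random, which is exactly the no-better-than-chance outcome a passing machine would produce.

```python
# A purely illustrative sketch of the imitation game's structure.
# The reply functions are hypothetical stand-ins, not real AI systems.
import random

def machine_reply(question: str) -> str:
    return "I suppose it depends on how you look at it."

def human_reply(question: str) -> str:
    return "Honestly, I'd have to think about that one."

def run_trial(questions) -> bool:
    """One trial: a judge questions two hidden players and guesses which is the machine."""
    labels = ["A", "B"]
    random.shuffle(labels)  # hide which label is the machine
    players = {labels[0]: machine_reply, labels[1]: human_reply}
    transcript = {label: [reply(q) for q in questions] for label, reply in players.items()}
    # A real judge would study the transcript; here the guess is random, which is
    # the "no better than chance" result an indistinguishable machine would force.
    guess = random.choice(list(transcript))
    return players[guess] is machine_reply  # True if the judge caught the machine

if __name__ == "__main__":
    questions = ["What is your favorite memory?", "Write me a short poem."]
    trials = 1_000
    caught = sum(run_trial(questions) for _ in range(trials))
    print(f"Machine identified in {caught}/{trials} trials; "
          f"near 50% means it cannot be reliably told apart from the human.")
```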

Critics of the Turing Test argue that a computer could pass it by merely imitating conversation, without actually thinking or having a mind of its own. While not everyone accepts the test’s validity, the concept remains foundational in artificial intelligence discussions and research.

AI is pretty much just what it sounds like—getting machines to perform tasks by mimicking human intelligence. AI is the simulation of human intelligence by machines. The personal interactions that individuals have with voice assistants such as Alexa or Siri on their smartphones are prime examples of how AI is being integrated into people’s lives.

Generative AI has made a loud entrance. It is a form of machine learning that allows computers to generate all sorts of content. Recently, examples such as ChatGPT and other content creating tools have garnered a whole lot of attention.

Given the rapid advances in AI technology and its potential impact on almost every aspect of society, the future of global AI governance has become a topic of debate and speculation.  Although there is a growing consensus around the need for proactive AI regulation, the optimal path forward remains unclear.

What is the right approach to regulating AI?  A market-driven approach based on self-regulation could drive innovation. However, the absence of a comprehensive AI governance framework might spark a race among commercial and national superpowers to build the most powerful AI system. This winner-take-all approach could lead to a concentration of power and to geopolitical unrest.

Nations will assess any international agreements to regulate AI based on their national interests. If, for instance, the Chinese Communist Party believed global AI regulation would undermine its economic and military competitive edge, it would not comply with any international agreements, just as it has flouted such agreements in the past.

For example, China ratified the Paris Global Climate Agreement in 2016 and pledged to peak its carbon dioxide emissions around 2030. Yet it remains the world’s largest emitter of greenhouse gases. Coal continues to play a dominant role in China’s energy mix and emissions have continued to grow.

It would be wise to be realistic about the development and implementation of global AI regulations.  Technology usually does not advance in a linear fashion. Disruptions will occur with little to no foresight. Even if a regulatory framework can keep pace with technological advancement, countries will be hesitant to adopt regulations that undermine their technological advancement, economic competitiveness, and national security.

Is 2% the Right Inflation Target?

People the world over have been facing a poisonous new economic reality, as inflation has emerged from multi-decade hibernation.  And many of the people dealing with it are too young to remember when inflation was last a serious issue.  It is economically damaging, socially corrosive, and very hard to bring down.

Both the U.S. Federal Reserve (Fed) and the European Central Bank appear dead set on getting inflation back to their 2 percent target. Why did these and other central banks, such as the Bank of Canada, Sweden’s Riksbank, and the Bank of England, gravitate to this 2 percent figure?

In January 2012, a thousand years ago in internet time, the Fed, under Chairman Ben Bernanke, formally adopted an explicit inflation target of 2 percent. This marked the first time the Fed ever officially established a specific numerical inflation target. The 2 percent target was seen as a way to provide clarity and enhance the effectiveness of monetary policy.

Bernanke’s successor Janet Yellen and current chair Jerome Powell maintained the 2 percent inflation target. While Powell has a laser focus on the 2 percent target, the Fed has recently moved to a more flexible 2 percent average over time. This means the Fed would tolerate some periods of inflation above 2 percent to offset periods when inflation was below that level.
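A toy calculation with made-up numbers may help show what “averaging over time” means in practice: years of undershooting can be balanced by years of modest overshooting, and policy is judged against the multi-year average rather than any single reading.

```python
# Hypothetical annual inflation readings, in percent.
yearly_inflation = [1.5, 1.7, 2.4, 2.6, 1.8]

average = sum(yearly_inflation) / len(yearly_inflation)
print(f"Average over {len(yearly_inflation)} years: {average:.2f}%")
# Under a strict 2% target, the 2.4% and 2.6% years look like failures;
# under flexible average targeting, they offset the earlier shortfalls,
# and the five-year average lands at 2.00%.
```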

The 2 percent target was not established based on any specific formula or fixed economic rule. Despite its widespread adoption by central banks, there is little empirical evidence to suggest that 2 percent is the platonic ideal for addressing the Fed’s dual mandate of price stability and maximum employment.

This inflation target is an arbitrary number that originated in New Zealand. Surprisingly, it came not from any academic study, but rather from an offhand comment during a television interview.

During the late 1980s, New Zealand was going through a period of high inflation and inability to achieve stable economic growth – the financial equivalent of a bloody nose.  In 1988, inflation had just come down from a high of 15 percent to around 10 percent. New Zealand’s finance minister, Roger Douglas, went on TV to talk about the government’s approach to monetary policy.

He was pressed during the interview about whether the government was satisfied with the new inflation rate.  Douglas replied that he was not, saying that he ideally wanted inflation between zero and 2 percent.  This involved targeting inflation, a method that had kicked around in economic literature for years but had not been implemented anywhere.

At the time there was no set target for inflation in New Zealand; Douglas’ remark was completely off the cuff. But the inflation target caught the attention of economists around the world and went viral, becoming a kind of orthodoxy.  The approach gained recognition and as noted, was subsequently adopted by many other central banks, making inflation targeting a widely used monetary policy strategy – a classic example of how ideas spread within the small priesthood of central bankers.

The hard truth is that many economic luminaries have tried to come up with what is thought to be the optimum inflation rate, but with little success.

All things considered, the 2 percent target was seen as a kind of sweet spot for inflation despite the lack of serious intellectual groundwork. Simply stated, there is nothing magical about 2 percent.  It is low enough that the public doesn’t feel the need to think about inflation, but not so low as to stifle economic growth.  That, and not much more, is the rationale.

Bankers Once Went to Prison in the U.S.

Once upon a time in America, bank executives went to prison for white-collar crimes. During the Savings and Loan (S&L) debacle, between 1985 and 1995, there were over 1,000 felony convictions in cases designated as major by the U.S. Department of Justice.

In contrast, no senior bank executives faced prosecution for the widespread mortgage fraud that contributed to the 2008 financial apocalypse that precipitated the Great Recession. Not a single senior banker who had a hand in causing the financial crisis went to prison.  Rather than reining in Wall Street, President Obama and Congress restored the status quo ante, even when it meant ignoring a staggering white-collar crime spree.

Indeed, the Department of Justice did not prosecute a single major bank executive in the largest man-made economic catastrophe since the Great Depression. They went after the small fish, not the mortgage executives who created the toxic products or the senior bank executives who peddled them.

The S&L crisis was arguably the most catastrophic collapse of the banking industry since the Great Depression.  S&Ls were banks that for well over a century had specialized in making home mortgage loans.  Across the United States, more than 1,000 S&Ls had failed, nearly a third of the 3,234 savings and loan associations that existed in 1989.  It is estimated that by 2019, there were only 659 S&L institutions in the United States.

In 1979, the S&L industry was facing many problems.  Oil prices doubled, inflation was in double digits for the second time in five years, and the Federal Reserve decided to target the money supply to control inflation. This not only let interest rates rise, it also made them more volatile.

As inflation continued to soar, S&Ls, with their concentration in home loans, found themselves squeezed by an interest rate mismatch.  The 30-year mortgages on their books earned single-digit interest rates, but they either had to pay depositors double-digit rates or lose them to competitors. Overnight, long-term depositors turned short term.  Funding long-term assets like mortgages with short-term liabilities like deposits is a risky formula, and in a high-inflation environment, it quickly makes insolvency inevitable.

For sure there are several parallels between then and the failures of Silicon Valley Bank and other banks over the last several months.  Just as many S&Ls went bust because surging interest rates increased their funding costs while their mortgages earned low fixed rates of interest, many of today’s banks face similar balance sheet problems.

The changing economic and financial environment ruined the “3-6-3” business model that had served thrift executives well for decades: pay 3 percent on savings deposits, charge 6 percent on mortgages, pocket the difference, and play golf at 3:00.
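A back-of-the-envelope sketch, using the 3-6-3 figures above and a hypothetical post-spike deposit rate, shows how quickly that comfortable spread inverted once funding costs went double-digit while mortgage yields stayed fixed.

```python
# Hypothetical numbers: net interest margin = yield on mortgages - cost of deposits.
def net_interest_margin(mortgage_rate: float, deposit_rate: float) -> float:
    return mortgage_rate - deposit_rate

# The "3-6-3" world: charge 6% on mortgages, pay 3% on deposits.
print(net_interest_margin(6.0, 3.0))    # +3.0 -> a comfortable spread

# After rates spike: the 30-year mortgages still yield 6%, but deposits now cost 12%.
print(net_interest_margin(6.0, 12.0))   # -6.0 -> every dollar of funding loses money
```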

In 1982, lobbying from the S&L industry led Congress to permit them to make highly leveraged investments far removed from their original franchise to provide mortgage funding.  In response, the federal government also enacted statutory and regulatory changes that lowered the capital standards that apply to S&Ls.

For the first time, the government approved measures intended to increase S&L profits, as opposed to promoting home ownership.  The premise underlying the changes was that deregulation of markets could let the S&Ls grow out of their insolvency.  Instead, the crisis culminated in the collapse of hundreds of S&Ls, which cost taxpayers many billions of dollars and contributed to the recession of 1990-1991.

And some S&Ls contributed to the development of a Wild West attitude that led to outright fraud among insiders. Many S&Ls ended up defrauding their depositors and speculating on high-risk ventures, executing illegal land flips, fabricating accounting fictions, and engaging in other criminal activities.

The S&L crisis teaches at least one important lesson: There is no ending financial chicanery without holding senior bankers accountable for their wrongdoing.

Ford Motor Co. and Industrial Policy

The U.S. government is giving the Ford Motor Co. a $9.2 billion loan, by far the biggest infusion of taxpayer cash for a U.S. automaker since the bailouts during the 2008 financial crisis, to build three battery factories in Kentucky and Tennessee.  Neither Ford nor the Energy Department (DOE), which provides loans at far lower interest rates than those available in the private market, has revealed details about the loan.

The U.S. is taking a page from Beijing’s playbook.  China has a top-down industrial policy, with serious government planning and support of target industries. China’s sustained industrial policy has yielded the world’s largest battery manufacturers.  Between 2009 and 2021, the Chinese government poured more than $130 billion of subsidies into the EV market, according to a report last year by the Center for Strategic and International Studies.  Today, more than 80 percent of lithium-ion battery cell manufacturing capacity is in China.

Simply put, industrial policy means that centralized agencies formulate national visions and programs to develop specific industries.  It has been a toxic phrase in American politics.

As Gary Becker, who won the Nobel Prize for Economics in 1992, said, “The best industrial policy is none at all.” It has long been associated with pork barrel politics, picking winners, and crony capitalism.  The political rhetoric has been that the free market works best and is closely associated with freedom and democracy. The history of the U.S. does not square with this perspective.

On the surface, Ford would seem an unlikely party to receive the largest loan ever extended by the Department’s Loan Programs Office.  Just last month, Ford touted having almost $29 billion of cash on its balance sheet and more than $46 billion in total liquidity.  It is worth noting that one of the best-known loans made by the DOE was $465 million to Tesla in 2010 to support manufacturing of the Model S.

Ford aims to close the gap with Tesla on electric vehicles, just as the U.S. aims to close a similar gap with China. Ford told investors early last year that it would put $50 billion into its EV manufacturing efforts. By the end of 2026, the company wants to make two million EVs a year.

The practice starts with Alexander Hamilton, the first Secretary of the Treasury, who outlined a strategy for promoting American manufacturing both to catch up with Britain and to provide the material base for a powerful military.  Hamilton’s “Report on the Subject of Manufactures” promoted the use of subsidies and tariffs.  Similar practices have been expressed in various forms throughout American history.

During the 19th and 20th centuries, the government played an active role in promoting economic growth, using policies such as high tariffs to protect strategic industries, federal land grants, and subsidies for infrastructure development. The federal government has sometimes backed failures, but it also has remarkable success stories, such as nuclear energy, computers, the Internet, and the interstate highway system.

These days, industrial policy is viewed more positively, spurred by bipartisan concerns about the competitive threat China poses.  U.S. programs are now underway to support semiconductor production, develop critical technologies, secure key domestic supplies, and sustain industries that are considered strategically important.

For example, subsidies from the Inflation Reduction Act and Infrastructure Investment and Jobs Act are spread across the EV value chain and are carpet bombing the entire automobile industry.  There are tax credits for sourcing critical minerals within the U.S. or friendly countries, for manufacturing or assembling the batteries and EVs they go into, for the consumers who buy the vehicles, and even for anyone building the public chargers needed to keep those vehicles moving.

The debate over industrial policy will continue because it gets to the longstanding controversy over the role of the government in our economy.  One thing is clear: the rosy rhetoric about the U.S. not engaging in industrial policy is contradicted by the country’s history.

Cancel Culture and the Chinese Cultural Revolution

It’s not news that Americans live in a new Age of Magical Thinking. The Enlightenment is seen as the start of hate speech, feelings must always overrule facts, and transubstantiation has taken on a whole new meaning.  Men can become women simply by wishing it so.

Over the last several years, much ink has been spilled about whether there are similarities between cancel culture of the 21st century, particularly in Anglosphere countries, and China’s Great Proletarian Cultural Revolution.  Pundits warn of the dangerous implications of cancel culture.

Both social media and real-life mobs target people who dissent, aiming to ruin their reputations and sometimes getting them fired, all while toppling statues of the Founding Fathers and looting in the name of social justice.

Contemporary events come nowhere near the scale of violence and repression associated with the Cultural Revolution.  Thankfully, social ostracism and unemployment are not the same as firing squads and gulags, but they are still harmful, especially to those committed to free speech.

Ordinary Americans live in an age when they witness “high-tech” lynching, to borrow a phrase coined in 1991 by then Supreme Court nominee Clarence Thomas.  The core features are public smears and ridicule, along with a moralistic mob forcing victims to publicly recant their sins.

Between 1949 and his death in 1976, dictator Mao Zedong directed a radical transformation of China.  He grew increasingly suspicious of government apparatchiks and Chinese Communist Party intellectuals, leading him in 1966 to launch a stunning attack on the establishment in the form of a “Cultural Revolution.”

He encouraged youthful Red Guards, his shock troops, to destroy the “four olds” (old ideas, old culture, old customs, and old habits).  In practice, this meant widespread beatings, denunciations, and mob-instigated “trials.”  Red Guards roamed the country attacking establishment elites, including government officials, managers, intellectuals, and former members of the bourgeois class. The goal was to purge the country of anyone who was insufficiently leftist.

Today, America and other Anglosphere countries are going through an admittedly more genteel cultural revolution of activists preoccupied with identity politics and cancel culture preaching the same old shibboleths.  As under Mao, people suffer disproportionate consequences for small ideological heresies.

Cancel culture involves public shaming, boycotts, online harassment, and calls for removing people from positions of influence due to perceived offensive comments or behavior.  It can lead to reputational damage, loss of employment opportunities and social isolation without due process.  Cancelling people who disagree with you is straight out of the playbook of dictators and cults.

For example, when former NFL quarterback Drew Brees stated he could “never agree with anybody disrespecting the flag of the United States of America,” citing his grandfathers’ military service, he was accused of violating contemporary social justice dogmas. Acquiescing to the pressure less than 24 hours later, Brees issued an apology on Instagram. He soon followed up with another apology.  Then his wife apologized.  Any wonder why public figures spend their days walking on eggshells?

It remains to be seen where America goes next in its nascent cultural revolution.  Where this trend goes and how long it lasts will ultimately depend on whether Americans stand up for their convictions or cave before online mobs.  Maybe nothing permanent will come of it, despite the best efforts of today’s Red Guards.  It may well turn out that the worst harm from legitimization of censorship and cancel culture may befall those on the right or the left who wield these weapons.

We would do well to remember the words of John Stuart Mill:

“He who knows only his side of the case (argument) knows little of that.  His reasons may be good, and no one may have been able to refute them. But if he is equally unable to refute the reasons on the opposite side, if he does not so much as know what they are, he has no grounds for preferring either opinion.”