Trump and Sanders may be right: Free trade is costing U.S. too much

Opposing so-called free trade deals has been an important part of the rhetoric of presidential candidates in both parties, especially polar opposites Donald Trump and Bernie Sanders. They blame free trade for the loss of American jobs, the decline in workers’ real wages, increased income inequality, and a shrinking middle class.

From their opposing ends of the political spectrum, Sanders and Trump have ignited an important debate about just who benefits from free trade. Sanders criticizes free trade as a proxy for corporate greed, while Trump says such deals serve politicians who put the interests of corporate contributors over those of ordinary Americans. Both candidates roll out the full Monty of free trade criticisms and argue that the U.S. needs to be smarter about sustaining a global trading order that supports America’s workers and economic interests, rather than playing the victim to trading partners who steal jobs and play by rules that don’t reflect American social and environmental values.

They are fed up with being out-traded and out-negotiated in deals that are the serial killers of American jobs. They believe other countries engage in managed trade, not free trade, and play the game in a way that produces trade surpluses for them and lost jobs for the U.S.

Opposition to free trade is a major vote-getter, a way to leverage voter anger and bond with ordinary Americans. In some parts of the country, it has served as an organizing principle in a deeply divided electorate.

The typical American family saw its wealth decline significantly in the wake of the Great Recession, and many voters have begun to question the fairness and adequacy of past trade policies. Deals such as the 1994 North American Free Trade Agreement (NAFTA) have been blamed for massive job losses.

Barack Obama repeatedly criticized NAFTA during the 2008 Democratic primary battle, noting that “we can’t keep passing unfair trade deals like NAFTA that put special interests over workers’ interests.” Trump and Sanders, hoping to win support from working class voters who are not fans of globalization, fervently oppose the ambitious 12-member Trans-Pacific Partnership pact the president supports.

Manufacturing’s contribution to U.S. employment has fallen steadily for more than half a century. Over the last 20 years, tens of thousands of factories have closed and many have moved to lower wage countries like Mexico, China and Vietnam. The sword of additional plant closings hangs over the heads of workers as companies pursue the classic go-to move of chasing cheaper labor.

Both Trump and Sanders cite the Carrier Corp.’s recent announcement that it will close its Indianapolis manufacturing plant and move all 1,400 jobs to Mexico. The move comes after the company was awarded $5.1 million in taxpayer money in 2013 under the Clean Energy Tax Credit Program. The funds were supposed to be used to “expand production at its Indianapolis facility to meet increasing demand for its eco-friendly condensing gas furnace product line.” Carrier says it has not received the money and will not claim it despite having been awarded the funds.

Carrier is another example of how low-wage countries can raise their living standards and impoverish American workers by importing American jobs and industries. You could argue that Carrier and other firms are really engaging in the exploitation of cheap labor, a form of economic arbitrage rather than trade, but that framing offers none of the political advantage politicians pursue.

While Carrier’s move will in theory reduce the cost of its products in the U.S., who will compensate the 1,400 workers losing their jobs or the community’s tax base? Is it any wonder that large numbers of voters prefer protecting domestic jobs from low-wage countries over lower prices for consumer goods?

To hold the line, Trump and Sanders contend it is time to rethink free trade and advocate for quotas and tariffs that protect and defend American interests and values rather than those of special interests such as multinational corporations. On that issue they may have a point.

originally published: April 2, 2016

Why ‘good’ job numbers leave us feeling mad, sad and worried

Earlier this month the Bureau of Labor Statistics released its February jobs report. The unemployment rate of 4.9 percent is the lowest since February 2008 and suggests nearly full employment, but the real picture is far more mixed.

The report finds that the country created 242,000 new jobs last month, well ahead of the Wall Street forecast of 190,000. It also revised its December and January reports to add a total of 30,000 more jobs. The numbers suggest that even in the face of financial market turmoil and slowing global demand, the U.S. has averaged about 228,000 new jobs in each of the last three months.

Still, many of the jobs were concentrated in low-wage sectors. Retailers added 54,900 jobs last month and restaurants and drinking establishments another 40,200. Manufacturers cut their payrolls by 16,000 jobs as slow growth in key markets around the world and the rising value of the dollar reduced demand for U.S. products. By far the weakest sector for job growth was the mining sector, which includes oil and gas producers. It cut jobs for the 17th straight month, losing 19,000 in February.

Hiring by employers directly associated with consumers has more than offset layoffs by manufacturers and fossil fuel companies, the two sectors squeezed by declining oil prices and a strong dollar.

An increase in the labor force participation rate was an encouraging sign. The rate of 62.9 percent is the highest in over a year as more than half-a-million people joined the labor force. Fewer and fewer people appear to be sitting on the sidelines.

But there is more to the story. The headline unemployment number does not account for the underemployed, such as those who are involuntarily working part time. And even though labor force participation rose, there are still many long-term unemployed and discouraged workers who have stopped looking because they believe no jobs are available for them. When these groups are included, the February unemployment rate rises to 9.7 percent, which suggests that the labor market is far from overheating.

Other downbeat notes were that the average length of the workweek declined by 0.2 hours, aggregate hours worked fell 0.4 percent, and average wages fell by 3 cents to $25.35 an hour. This put yearly wage growth at 2.2 percent, just slightly ahead of the core inflation rate. That makes it difficult for the average American to keep up with the staples of a middle-class life. Indeed, real wages for most American workers have been flatlining since the 1970s.
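As a rough, back-of-the-envelope check on that wage arithmetic, the sketch below uses the figures cited in the report and an assumed core inflation rate of about 2 percent (the column does not state the exact figure) to translate nominal wage growth into real purchasing power:

```python
# Back-of-the-envelope real-wage arithmetic for the February report.
# The core inflation figure is an assumption for illustration only.

avg_hourly_wage = 25.35        # February average hourly earnings, in dollars
nominal_wage_growth = 0.022    # 2.2 percent year-over-year, per the report
assumed_core_inflation = 0.02  # illustrative; slightly below the wage growth figure

# Roughly what the average hourly wage was a year earlier
wage_a_year_ago = avg_hourly_wage / (1 + nominal_wage_growth)

# Real (inflation-adjusted) wage growth is nominal growth deflated by inflation
real_wage_growth = (1 + nominal_wage_growth) / (1 + assumed_core_inflation) - 1

print(f"Hourly wage a year earlier: about ${wage_a_year_ago:.2f}")
print(f"Real wage growth: about {real_wage_growth:.2%}")  # roughly 0.2 percent
```

On those assumptions, the typical paycheck’s purchasing power grew by only a fraction of a percent over the year, which is the sense in which the headline wage number understates how tight things feel for workers.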

A 4.9 percent unemployment rate masks the fact that things are not going very well for a large share of American workers. Jobs may be plentiful, but they are not paying much. It may be good news that the economy is growing at 2 percent, but ordinary Americans are not reaping the benefits of that growth.

Things are tough on Wall Street, too. Average bonuses paid out in the financial services sector tumbled 9 percent last year to the lowest level in three years, according to new figures from the New York State comptroller. Of course, that average $146,200 bonus is still nearly three times the median annual U.S. household income of about $52,000.

In light of these disparities and glass-half-empty job numbers, is it any wonder that average working-class Americans are seething with anger, anxious about the future, and feeling betrayed? Stalled incomes may be fueling the hard-line positions on illegal immigration and the opposition to job-destroying trade deals that spur the rise of both Donald J. Trump and Bernie Sanders, the yin and yang of America’s season of political discontent and economic stagnation.

If that continues, voters might find themselves liking the cure even less than they like the illness.

originally published: March 19, 2016

Remembering a day that was too big to forget

This month marks the anniversary of the collapse of Bear Stearns, once Wall Street’s fifth-largest investment bank. The demise of the 85-year-old institution signaled the real start of the 2008-2009 financial crisis. Eight years later, we can only hope our leaders learned something from the experience.

The collapse, and Bear’s subsequent bailout by the Federal Reserve with the support of the Treasury Department and JPMorgan Chase, sent shockwaves throughout the financial system. Bear’s incredibly rapid demise raised serious questions about the banking industry’s use of leverage, inadequate oversight of commercial and investment banks, and the role of the Fed and other regulators in preventing the failure of major financial institutions.

Late on Sunday afternoon, March 16, 2008, Bear’s board of directors accepted JPMorgan Chase’s offer to purchase the company. Less than 18 months after its stock was trading at an all-time high of $172.61 a share, Bear Stearns had little choice but to accept the humiliating offer of $2 a share.

JPMorgan Chase later raised the bid to $10 per share and the Fed provided $30 billion in collateral guarantees to facilitate the deal. The Fed considered Bear too large and too interconnected to fail and saw no choice but to arrange a bailout to prevent a global market crisis. It was hoped that the Bear rescue would nip the problem in the bud and avoid the damage to the larger financial world that many policymakers thought would result from the failure of a major investment bank.

The precise nature of the transaction seemed unclear, but the Fed appeared to be accepting responsibility for the toxic, illiquid assets on Bear’s balance sheet if their eventual liquidation resulted in a loss. In this sense, it appeared that the Fed became the residual owner of these securities and put the federal taxpayer on the hook for Bear’s reckless risk taking activities. Some believe that this action exceeded the Fed’s power.

The first sign of trouble at Bear was the July 2007 collapse of two of its hedge funds. The funds had invested heavily in collateralized debt obligations backed by subprime mortgages, and their failure alerted the rest of the financial system to this contagion.

The hedge funds’ collapse also raised concerns about the firm’s own exposure to mortgage-related securities. It damaged the firm’s reputation, weakened its finances and served as the precursor to Bear’s ultimate collapse. Within a year “the plumbing had stopped working” and credit ceased to flow through the financial system.

Hopes that a global financial crisis could be averted proved misplaced just six months later when Lehman Brothers, another bank that was heavily involved in the mortgage business and was even larger than Bear, filed for Chapter 11 bankruptcy on September 15, 2008. The public backlash against the Bear bailout made rescuing the 158-year-old Lehman Brothers politically untenable, especially just weeks before a hotly contested presidential election.

The 2008-2009 financial crisis proved to be the most expensive in history. On July 21, 2010, President Obama signed into law the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank), one of the most sweeping financial reforms in U.S. history.

The law’s stated aim is to “promote the financial stability of the United States by improving accountability and transparency in the financial system, to end ‘too big to fail,’ to protect the American taxpayer by ending bailouts, to protect consumers from abusive financial services practices, and for other purposes.” Essentially, Congress’s intent was to reduce system-wide risk and to prevent another financial collapse.

Let’s hope policy makers actually learned enough about how the economy and the financial system fit together from the 2008-2009 financial crisis to avoid future learning experiences. They would be well advised to recall the words of that prolific author, anonymous, who said, “The past is prologue: but which past?”

originally published: March 5, 2016

Negative interest rates are on the table

Just two months ago the Federal Reserve (the Fed) hiked the short-term interest rate it controls. The quarter-point increase was the first in nine years and followed a long effort to pump up economic growth by keeping the rate close to zero. Now several members of the Fed are talking about reversing themselves and moving interest rates into negative territory.

It’s not such a far-fetched idea. Sweden, Denmark, Switzerland, Japan and the European Central Bank have introduced negative interest rates. It’s the latest toy in the world of monetary policy, where the economy is seen as an automobile and interest rates are the gas pedal.

When Fed Chair Janet Yellen delivered her semiannual testimony before Congress last week, she said the Fed has not fully researched the issue of charging banks to hold their excess reserves. But the plot took a sinister twist when it was disclosed that the Fed asked banks to consider the impact of negative interest rates during the latest round of bank stress tests.

Under a negative interest rate policy, banks are charged to park their cash with the central bank. The hope is that this will encourage banks to stop hoarding money and instead lend to consumers and businesses to accelerate economic growth.

No one knows if negative interest rates would work in the U.S.; the Fed has never tried them. Fully identifying their impact is very complicated, but we know how the story will play out for the average Joe. If the Fed charges banks for excess deposits, the banks will in turn charge customers for depositing money.

Still further, just because the interest rate is negative does not mean a bank will pay you interest (rather than the other way around) when you pay back a loan. The average customer will not get paid to take out a loan, not now, not ever, never.

Negative interest rates effectively charge the customer for deposits, discourage saving and encourage spending. Forget about saving for retirement and a child’s education; this policy is designed to grow the economy by coercing people to spend. Of course, the customer can at least be held harmless by holding cash and earning a zero percent nominal return.

The Fed has effectively punished the millions of Americans who rely on their savings to get by. Safe options such as savings accounts, certificates of deposit and Treasury bonds offer pitiful returns, forcing many people to dig into their principal to make ends meet. Under negative interest rates, the longer funds are on deposit, the less money is available for withdrawal as banks charge to hold the money. And people who are unable to retire will either remain in or re-enter the labor force, competing with younger workers for jobs, or risk their savings by putting money into riskier investments.
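As a minimal illustration of that erosion, consider a hypothetical bank that passes a negative rate straight through to its depositors. The rate and balance below are assumptions chosen purely to show the arithmetic, not actual Fed or bank figures:

```python
# Hypothetical: a deposit charged a negative 0.5 percent annual rate,
# i.e., the bank deducts the "interest" from the balance each year.

deposit = 10_000.00     # starting balance, in dollars (illustrative)
annual_rate = -0.005    # assumed negative rate passed through to the depositor

balance = deposit
for year in range(1, 11):
    balance *= (1 + annual_rate)  # balance shrinks because the rate is negative
    print(f"Year {year:2d}: ${balance:,.2f}")

# After 10 years the balance is roughly $9,511, less than the original deposit,
# whereas cash held outside the bank would still be worth its full $10,000
# nominal value (the zero percent alternative mentioned above).
```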

Crazy as it sounds, this may be the new normal. Remember that when the Fed thought it could not cut the interest rate any further, it engaged in quantitative easing: essentially creating money out of thin air and releasing it into the economy, mainly by buying securities from banks.

Bargain-basement interest rates and flooding the system with trillions of dollars in cheap money have produced sharp stock market gains (though even those have ended in recent months) and enabled corporations to buy back their own shares and pursue mergers and acquisitions instead of expanding production and creating jobs. It’s time for the public to ask what we have to show for these aggressive and addictive monetary policies, which misallocate resources and contribute to income inequality by shifting wealth to asset owners.

Monetary policy does not make for good presidential debate sound bites, but the time is long overdue for candidates to engage on the issue of federal monetary policy and how it has contributed to income inequality.

originally published: February 20, 2016

How do we keep U.S. companies at home?

Last month Milwaukee-based Johnson Controls became the latest American firm to move overseas in pursuit of tax savings. Politicians of all stripes decry these moves, but they have radically different ideas about how to respond.

Johnson Controls is renouncing its U.S. corporate citizenship by selling itself to Tyco International, domiciled in Ireland, in a deal valued at $14 billion. The move, called a corporate inversion, restructures an American company’s corporate form so that it becomes a foreign corporation based in a country with low corporate taxes. It is legal and has become a popular way for American companies to reduce their domestic tax payments.

By moving its headquarters to Ireland, Johnson Controls will pay a corporate tax rate of 12.5 percent and reduce its tax bill by at least $150 million annually. About 50 companies have inverted in the past decade and more are expected to vote with their feet.

These deals allegedly reflect the United States’ failure to change the corporate tax code to make it more attractive for American companies to stay that way. The statutory corporate tax rate of 35 percent (39.1 percent when combined with state rates) is unchanged since 1993 and is the highest among industrialized countries.

The federal government taxes all companies doing business in the U.S. on the income they earn here. It also taxes American firms on their foreign income. This is called a worldwide income tax system. Many countries only tax income generated inside their borders. Because they tax their residents (regardless of citizenship) only on domestic income, such countries are said to use a territorial income tax system.

America is now alone among developed countries in taxing the worldwide income of its corporations. This is in addition to the taxes they pay to foreign governments, but American firms are permitted to avoid double taxation by claiming credits for foreign tax payments.
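A minimal worked example, using hypothetical profit figures and the statutory rates cited in this column, sketches how the foreign tax credit works under the worldwide system and why the residual U.S. tax creates the incentive to defer profits abroad or invert:

```python
# Hypothetical: $100 million earned in Ireland by a U.S.-headquartered firm.
# Rates are the statutory figures cited above; the profit figure is invented.

foreign_profit = 100_000_000
irish_rate = 0.125        # Irish corporate rate
us_statutory_rate = 0.35  # U.S. federal statutory rate

irish_tax = foreign_profit * irish_rate                   # $12.5 million paid to Ireland
tentative_us_tax = foreign_profit * us_statutory_rate     # $35 million before credits
foreign_tax_credit = irish_tax                            # credit avoids double taxation
residual_us_tax = tentative_us_tax - foreign_tax_credit   # $22.5 million due on repatriation

print(f"Residual U.S. tax under the worldwide system: ${residual_us_tax:,.0f}")
print("Residual tax under a territorial system: $0")
```

That residual bill is what firms avoid by leaving profits abroad or, as in the Johnson Controls deal, re-domiciling in a territorial, low-rate country.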

American companies are also permitted to defer domestic tax liabilities on certain unrepatriated foreign profits until they actually receive such profits in the form of dividends. It is estimated that U.S. firms are keeping roughly $2 trillion in profits abroad to reduce their taxes. If an American citizen tried to play the same game of hide and seek, he or she would be in big trouble with the Internal Revenue Service.

When it comes to arguments in support of lowering the corporate rate, cheerleaders never tire of telling the public that American companies can’t compete in a global economy when they are handicapped by the highest corporate tax rate in the developed world. They argue that a significant reduction in the corporate tax rate would improve America’s competitive position, stem the tide of companies leaving the country and perhaps even reverse it by giving foreign corporations an incentive to locate and invest in the U.S.

Overlooked in these arguments is the fact that while on paper American companies are supposed to pay the 35 percent federal income tax rate, the country’s most successful companies pay just a 19.4 percent effective rate after accounting for tax credits, deductions and exemptions, according to Citizens for Tax Justice. If true, this certainly undermines the primary argument for reducing the corporate tax rate.

Neither Democrats nor Republicans like inversions, but they disagree on what to do about them. The former see the issue as one of corporate abuse and want strong rules to stop the exodus of tax dollars; the latter see it as the result of high corporate tax rates and argue for an overhaul of the corporate tax code.

As with so many issues today, you would be right to conclude that achieving consensus on how to halt corporate inversions is the equivalent of cleaning out the Augean stables with the horses still in them.

originally published: February 2, 2016

Mideast tensions are straining U.S.-Saudi ‘special relationship’

To say that no one is very happy about American involvement in the sectarian political cauldron of the Middle East is to exaggerate very little. The public wants the United States to extricate itself from the Sunni vs. Shiite wars that plague the region and reliable allies are not plentiful as long-term alliances shift with the escalating chaos.

Take for instance America’s decades-old “special relationship” with Saudi Arabia. The alliance was first sealed when President Roosevelt met the first Saudi king, Abdul Aziz, in 1945 aboard the cruiser USS Quincy in the Suez Canal. They cut a simple deal: America would bring the Saudis under its security umbrella and the Saudis would supply oil.

For decades, the Saudi-American relationship largely worked well for both parties. After all, Saudi Arabia was the world’s largest oil producer and sat on better than one-fifth of the world’s proven oil reserves, giving it great influence over global oil prices.

The U.S.-Saudi alliance may be an old one, but since the Arab Spring in 2011, the relationship has deteriorated. The latest fissure was sparked by Saudi Arabia’s recent execution of a prominent Shiite Muslim cleric, which prompted condemnation throughout the Middle East.

There are several reasons why the Saudis are upset with America. They bitterly opposed Washington’s support of pro-democracy protestors in Egypt during the Arab Spring and urged President Obama to use force to preserve President Hosni Mubarak’s dictatorship. America’s accommodation with the Muslim Brotherhood during their brief reign in Egypt further angered the Saudi monarchy.

Then Washington was critical of the military coup responsible for displacing Egypt’s Muslim Brotherhood President Mohamed Morsi, while Saudi Arabia pledged billions to the new Egyptian government. After this experience, the Saudis became paranoid that America would sell them up the river as it had Mubarak.

As the Syrian civil war worsened in 2013, President Obama backed off his threat of military force against President Bashar al-Assad, who had allegedly used chemical weapons against his own people, even as Washington announced a rhetorical pivot to Asia. The Saudis and other longtime American allies felt abandoned.

Since the overthrow of the Shah during the 1979 Iranian Revolution, Saudi Arabia and Iran have had what could mildly be described as a tense relationship. While the two Islamic countries are separated by only a few miles of the Persian Gulf, the religious and political gap is much wider. Underlying Saudi concerns is the schism between Sunnis and Shias, who have been at each other’s throats for more than a millennium. Iran is mostly Shia Muslim and, like most of the countries in the Middle East, Saudi Arabia is majority Sunni.

The two are currently engaged in proxy wars in Yemen and Syria that exemplify the Sunni/Shia divide. The Saudis were horrified when the U.S. recently entered into a nuclear deal with Iran. They consider the threat of a nuclear-armed Iran intolerable.

Finally, the United States’ continuing support for beleaguered Israel remains a point of contention. Joint opposition to the emergence of ISIS is the only recent development that reinforces the mutual interests of Saudi Arabia and the United States.

Despite the growing list of grievances, the two countries need each other. The U.S. retains a strong military presence in the Persian Gulf and cannot soon be replaced as the ultimate guarantor of Saudi security. In the midst of regional turmoil and with the ever-present threat of jihadist terrorism, the U.S. still relies heavily on the Saudis to help police the neighborhood.

Still further, the Saudis are a major buyer of U.S. weapons, having spent more than $46 billion on American arms since President Obama took office. The kingdom is also the largest producer in the Organization of the Petroleum Exporting Countries (OPEC), which controls about 40 percent of the world’s oil.

Since sectarian wars in the Middle East are likely to get worse before they get better, the relationship calls to mind the old English proverb: “With friends like these, who needs enemies?”

originally published: January 23, 2016

Middle East violence is a reminder of the Thirty Years War

Mark Twain’s reputed quip that “history doesn’t repeat itself, but it does rhyme” reminds us that historical analogies can sometimes provide a useful perspective on current events and even inform the future. The sectarian violence and bloodletting raging all over the Middle East have given rise to several historical comparisons, not least the hellish Thirty Years’ War that ravaged Europe in the first half of the 17th century.

With apologies to Dickens, it was the worst of times in Europe. This conflict among the Catholics, Calvinists, Lutherans, and Huguenots, involving multiple great powers, became a bloody, protracted struggle over the continent’s political and religious order.

Across the modern Middle East, Western foreign policy blunders have largely, though not entirely, contributed to a growing sense of instability. Many argue that the turmoil currently engulfing the region was born out of the catastrophic American invasion of Iraq in 2003 and its failure to reconstitute an Iraqi state.

The turmoil is fueled by the hatred between the Shia and Sunni branches of Islam that has existed for centuries. Toppling Saddam Hussein unleashed the Shia in Iraq and strengthened Iran’s bid to be the region’s most important actor.

Just as with the Thirty Years’ War, the religious conflict is overlaid by a great rivalry between Iran, leading a Shiite coalition, and Saudi Arabia, which is Sunni central. Add to that the presence of the United States and Russia, which are fighting proxy wars in the region, and you have a precarious and highly flammable mix.

In 16th and 17th century Europe, the Protestant Reformation opened a Pandora’s Box of international and civil conflict culminating in the Thirty Years’ War, the greatest of the so-called wars of religion. Although the struggles that led to it erupted many years earlier, the war is conventionally held to have started in 1618. It lasted through 1648, a seemingly endless and devastating conflict in which millions of Europeans were killed, on a scale unimaginable during the medieval era. An estimated 25 to 40 percent of the German population perished during the war.

The roots of both the Middle Eastern and European conflicts stretch back centuries and center on unresolved questions of religious freedom and power politics. Not unlike the geopolitical and religious contest of wills between Sunni and Shia, the Thirty Years’ War began as a conflict between Protestant nobles in Germany fighting to preserve their autonomy and faith against the Catholic Hapsburg Dynasty (the Holy Roman Empire).

On the political side, the Hapsburg Dynasty wanted to preserve its European hegemony. This triggered a conflict among a conga line of great powers such as France, Denmark and Sweden that was not unlike the modern power struggle between Iran and Saudi Arabia.

The Thirty Years’ War ended with the Peace of Westphalia in 1648, referred to by contemporaries as the Peace of Exhaustion. It established a new political order that irrevocably changed the map of Europe. The Netherlands gained independence from Spain; Sweden gained control of the Baltic; the German Protestant nobles were able to determine the religion of their lands; France was acknowledged as the preeminent Western power; the Holy Roman Empire continued as an empty shell until it was dissolved 150 years later; and the principle of state sovereignty emerged, creating the basis for the modern system of nation states.

In the long run, mitigating the Middle East’s sectarian and geopolitical conflicts may partially center on implementing the Westphalian nation state concept. Some semblance of stability in the Middle East may be restored with the reestablishment of a state-based order. For starters, that may mean a three-state arrangement, redrawing the existing national boundaries to accommodate separate states for the Sunnis, the Shias, and Kurds.

But history teaches us that the West must be prepared to wait a very long time for the latest conflict in the Middle East to subside and for anything that approaches a solution to take hold.

originally published: January 19, 2016

Political rhetoric and the jihadists

Every terrorist attack on a Western target presents the self-styled saints in Washington and other Western capitals with an opportunity to engage in perfectly staged grandiose rhetoric. Employing borrowed words, identical sound bites, and first-cousin cliches designed to curate their images, conceal their ignorance and ignore realities on the ground, world leaders pontificate about destroying ISIS.

But there is precious little explanation of what defeating ISIS really means or how it will be accomplished.

Our leaders’ mandarin rhetoric is reminiscent of Queen Gertrude’s admonition to Polonius in Hamlet: “More matter with less art.” In contemporary parlance, this translates as more substance with less style, more content without the rhetorical ornamentation and digressions. The political classes in God’s menagerie talk until their mouths bleed and reassure the public that they will defeat the terrorists without a hitch, like an Ocean’s Eleven heist.

Best to recall the truth of George Orwell’s comment that “…if thought corrupts language, language can also corrupt thought,” blurring the boundaries between the fake and the real. It is a reminder that the moment to be wariest of political rhetoric is precisely when elite opinion is lined up on one side of the boat.

Those politicians talk about destroying ISIS, but what about other radical Islamic terrorist groups such as al Qaeda, Hezbollah, Hamas, Jabhat al-Nusra, and Boko Haram, which have proliferated all over the globe, facilitated in part by the information revolution?

Does victory over ISIS mean taking the fight to their doorstep in Iraq and Syria? If it means beating them militarily, that is a silly question. If the American public had the stomach to support boots on the ground, with the attendant collateral damage to civilians, the world’s mightiest military could go through ISIS in Syria and Iraq, to take a phrase from General Patton, “like excrement through a goose.” But the American people will not touch this approach with a barge pole.

The U.S. military did not start bombing ISIS’ oil infrastructure and their fleet of tanker trucks because the Obama administration was worried about civilian casualties and environmental damage. You have to wonder whether the allies would have won World War II if they had to submit their bombing targets to the White House for approval.

Is defeating ISIS militarily, stopping its propaganda machine, and blocking its revenue sources sufficient to eliminate radical Islamic terrorism? ISIS and other jihadists’ initial goal is to create a caliphate in Iraq, Syria, Lebanon, Jordan, Yemen, Libya and the Palestinian territories. After that, they want to recreate the caliphate of old and then spread Islam over the entire planet: a global caliphate achieved through a global war. Other than that, they have modest ambitions.

Does it mean making their ideas go away? Does a grand strategy have to deal with the challenge of overthrowing a religion, a belief system? Even if we defeat the extremists militarily, we are still going to be dealing with the sons and daughters of jihadists 20 years from now. The fight against terrorism could become like the endless wars on crime, poverty or cancer.

To reduce and manage the terrorist threat, mainstream Muslims themselves must come out forcefully against the jihadis who are trying to hijack their religion. Political rhetoric comes with the speed of light, while developing and executing a successful strategy to deal with the scourge of radical Islamic terrorism comes with that of sound.

originally published: January 2, 2016

When it comes to Syria, let Iraq be the lesson

How many times do people ask themselves “what if,” consciously considering what has happened and what might have been? What if you had taken a different job, or married someone else? These questions are a fundamental feature of the human condition.

They are also a good exercise in understanding the world and suggesting alternative approaches to identifying and achieving goals. In that sense, they can be applied to current events in the Middle East.

The fancy name for “what if” questions is counterfactual thinking. For historians, this is a way of thinking about a past that did not happen. For the rest of us it is, among other things, one way to make sense of experiences and think about what to do differently in the future.

Popular culture likes counterfactuals as a conventional storytelling device. For example, the current Amazon television series “Man in the High Castle,” an adaptation of Philip K. Dick’s 1962 novel of the same name, is essentially a counterfactual, imagining an America in which the Nazis won World War II.

Today the Middle East is in chaos and is the most active war zone in the world. But what if the United States had not invaded Iraq in 2003? We will never truly know, but let’s look at how things might be different.

Sure, Saddam Hussein was a gruesome dictator who killed hundreds of thousands of people and started wars. But after spending so much American blood and treasure, is Iraq a stable functioning democracy, or did the invasion simply cause geopolitical chaos and a humanitarian tragedy?

Under Hussein, Iraq was a bulwark to contain Iran, which now exercises far greater influence in Iraq, Syria, Lebanon and the entire Middle East than it did before the invasion. After the United States packed up and left in 2011, Iran rushed into the power vacuum.

At the time of the Iraq invasion, Libya had been ruled by the same strongman for over 40 years, but it was stable. Policy makers in Washington decided to go along with France, the United Kingdom and others to topple Muammar Gadhafi’s regime, once again without a viable alternative to the old order in place. The result is a failed state and another humanitarian tragedy.

It can be argued that the invasion of Iraq diverted military and financial resources away from Afghanistan before the Taliban had been defeated. A greater focus on Afghanistan might also have kept Pakistan from engaging in mischief.

Perhaps it doesn’t make sense for the United States to simultaneously pursue ousting Syrian President Bashar al-Assad while targeting ISIS and radical Islamists of all stripes in Syria. And would ousting Assad simply result in the same murderous chaos we have seen in Iraq and Libya?

The United States may be at a stage in the Middle East where it has to decide whether ISIS or keeping Assad in power with the support of Iran and Russia represents the greater national security threat. Perhaps it’s time for President Obama to get past his Putinphobia and cut a deal with the Russian leader, as the allies did with Stalin to defeat the Nazis in World War II. The current strategy isn’t working, so why not work with Russia and Iran to create an international solution?

There are no easy answers in the fight against radical Islamist terror groups, but when the President and others say they will destroy them, it is incumbent on them to explain a detailed strategy. It is not enough for world leaders to say that ISIS will be defeated. They need to describe what those words really mean. What will success look like, how do you measure it, and how long will it take?

Asking what might have been had the United States not invaded Iraq provides an interesting lens through which to view the Middle East. And perhaps it offers some insights into how to deal with that troubled region more than a decade after the invasion.

originally published: December 19, 2015

How Americans became soft targets

Americans can add concerns about their physical safety to a list of worries that already includes job insecurity, record economic inequality, and trust in government reaching an all-time low.

The San Bernardino shootings show that terror attacks on soft targets are not confined to Europe. The premeditated slaughter of innocent civilians by radical Islamic terrorists (dare I say the name) in Paris was followed by promises of similar attacks in other “crusader cities” including Washington, D.C. and New York City.

The most recent attacks are a reminder that American foreign policy blunders have caused chaos in the Middle East, where Islamic State outposts are gaining strength in Libya, Afghanistan, Lebanon, and Egypt. Americans understand you cannot underestimate ISIS, as President Obama did when he characterized the group as a junior varsity team, a contained local actor that did not represent a national security threat.

American troops exited Iraq at the end of 2011, completing a deployment that cost nearly 4,500 American lives, left more than 32,000 wounded and cost taxpayers trillions of dollars. The president said the U.S. was leaving behind a “sovereign, stable, and self-reliant” Iraq.

Instead, the exit left the door open for the Islamic State’s land grab. All the gains made following the “surge” from 2007 to 2011 were washed away, with Islamic State terrorists taking territory and committing mass killings.

The President did not help matters in 2012, when he warned Syrian President Bashar al-Assad that using chemical weapons would cross a “red line.” Yet when Assad did just that in 2013, Obama did nothing. The inaction undermined America’s credibility and exasperated our allies.

The blunders did not start with President Obama. Common sense has gone on holiday among the worthies in Washington since 9/11, beginning with a feckless decision to invade Iraq that was a precipitating event in the unraveling of the Middle East and creation of one of the worst refugee crises since World War II.

The American-organized coalition invaded in 2003 because of Saddam Hussein’s alleged connections to terrorism and the potential threat posed by Iraq’s supposed possession of weapons of mass destruction. It turned out that Iraq did not have WMDs; Hussein’s links to al-Qaida and other terrorist groups were equally illusory.

The invasion was also supposed to transform a country benighted by decades of dictatorship into a Western-style free-market democracy that would be a model for other Middle East nations. Instead it opened Pandora’s Box and promoted Iran’s metastasizing regional hegemony.

The abrupt fall of Baghdad was accompanied by massive civil disorder, including the looting of public and government buildings, as the country slipped into anarchy. There was no plan for what to do after the victory and little recognition given to religious, ethnic, and political complexities among the Shiites, Sunnis and Kurds. Dissolving the army put several hundred thousand armed Iraqis on the street with no jobs, and firing government employees, mostly Sunnis linked to the Hussein regime, transformed the country into a breeding ground for the very terrorism the invasion was supposed to combat.

By the fall of 2003, these blunders and the lack of enough American troops to establish security for the Iraqi people contributed to the growth of the insurgency.

America allowed the old order to topple without a viable alternative in place – a reckless act with no precedent in modern statecraft. Now we are faced with the monumental challenge of picking up the pieces.

Today you need a scorecard to keep track of what is happening in the Middle East. For example, the Kurds have been a strong American partner. But Turkey, an equivocal NATO ally, claims that the Syrian Kurds are a terrorist group and has been bombing America’s most reliable ally in Syria and Iraq, even as ISIS brokers black market oil in Turkey to fund itself.

Sadly, the effects of these blunders aren’t limited to the Middle East. Here in the United States, Americans now live in fear of their physical safety.

originally published: December 12, 2015