In Afghanistan, little to show for America’s longest war

It is the longest war in U.S. history, yet it hardly gets any attention. The public may be suffering from Afghan fatigue, especially when there is little to show for the expenditure of life and treasure.

It has been 15 years since the U.S. invaded Afghanistan to hunt down the architects of the 9/11 attacks. The invasion was an integral piece of President Bush’s hastily conceived “Global War on Terrorism.” By the end of 2001, American forces had toppled the Taliban government. Mission accomplished.

Afghanistan is a country of about 30 million people, approximately the size of Texas, nestled between Pakistan, Iran, and several former Soviet republics. Its location has made it a source of significant geopolitical interest and tension since the 19th century.

The country has historically been racked by turbulence, characterized by chronic instability and repeated bouts of civil war. Since 329 B.C., when Alexander the Great came to Kabul, Afghanistan has been invaded by Arabs, Chinese, Mongols, Mughals, the British, and the Soviets. As scholars have written, Afghanistan is the graveyard of empires.

The Taliban took over in 1996. This fundamentalist Islamic group formed from remnants of the Mujahideen (holy warriors) who battled the Soviets for over a decade.

The United States has spent over $1 trillion on fighting and reconstruction, building an Afghan army, instilling western values in a land of warlords and tribal hostilities, and establishing a functioning democracy in a place no foreign power has ever managed to change. More than 2,200 American lives have been sacrificed.

According to the Special Inspector General for Afghanistan Reconstruction, about $115 billion has been spent to support Afghanistan relief and assist the government. Yet the World Bank reports that Afghanistan remains one of the poorest countries in the world, and three-quarters of its population is illiterate.

While the United States and NATO formally ended the war in Afghanistan on December 28, 2014, a force of 8,500 Americans remains to train and support the Afghan security forces.

Since the Taliban fell, some progress has arguably been made in opening up the country and expanding democratic freedoms, especially for women. However, lack of security still impedes development, and corruption remains a significant barrier to progress. The 2015 Corruption Perceptions Index ranks Afghanistan 166th out of the 168 countries monitored. The Taliban insurgency is on the rise, drug trafficking flourishes, human rights abuses continue, and the rule of law is weak. All the while, the U.S. picks up the tab for the Afghan army and police, and continues to provide foreign aid.

Billions have been wasted on fruitless projects that are awash in corruption and have little government oversight, according to the Special Inspector General. This office has documented a greatest hits compilation of waste, fraud and abuse in U.S. government-sponsored programs.

Among the more egregious boondoggles was importing rare blond male Italian goats to mate with female Afghan goats and make cashmere. The $6 million program included shipping nine male goats to western Afghanistan, setting up a farm, lab, and staff to certify their wool.

But the entire herd of female goats was wiped out by disease. As a result, only two of the imported Italian goats could be accounted for; it could not be confirmed whether the others were dead or alive.

Another baaad idea. You goat to be kidding.

That was not the only example of wasting American taxpayer money. The Pentagon’s Task Force for Business and Stability Operations spent nearly $150 million for employees to stay in private luxury villas with flat-screen TVs rather than bunking at military bases. Another $43 million was spent on a gas station that should have cost about $500,000.

America may have good intentions, but we know which road is paved with those. We will be mowing the grass in Afghanistan for the foreseeable future, like Sisyphus rolling his boulder up the hill for all eternity.

originally published: January 7, 2017

Make High Earners Save Social Security

In these days of presidential interregnum, the American public has seen newspapers and digital media filled with discussions of tax cuts, increased military and infrastructure spending, economic growth proposals, regulatory relief, immigration reform, repealing Obamacare, reducing the national debt, keeping deficits on a short leash, draining the swamp of political and economic favoritism and other domestic traumas.

Social Security, however, has received little attention. How the new administration will accomplish all these promises without yielding to the temptation to cut programs like Social Security is an open question. President-elect Trump, who enjoyed the support of working class Americans, promised during the campaign not to cut Social Security. Speaker Paul Ryan said he has no plans to change Social Security, although he has been outspoken on the need for entitlement reform.

Funny how a politician can forget campaign promises after election day. Loyalty appears to be paramount for these folks until all of a sudden it isn’t. Politicians all too frequently forget, to put it in the cant language of the ‘hood, that a deal is a deal.

Social Security is a promise to all eligible Americans that they will not live in abject poverty if they become disabled or when they get old. But the 2015 Social Security Trustees report finds that the fund has enough money to pay full benefits only until 2034. After that, it will collect only enough in taxes to pay 79 percent of benefits.

With the number of workers paying into the system for each beneficiary falling rapidly, there will inevitably be calls for benefit cuts, higher taxes or both. But there is a better way.

Social Security is not an entitlement program; it is a “pay-as-you-go” system funded by the payroll tax. Companies and nearly 168 million working people pay into it to provide benefits to about 60 million retirees. Each generation pays for current retirees in return for a commitment that the next generation will do the same.

It is the backbone of retirement planning for millions of Americans. Almost a third of retirees receive practically all their retirement income from the system and about two thirds receive the majority of their retirement income from Social Security.

The top 100 CEOs, in contrast, have platinum pension plans. On average, their massive nest eggs are large enough to generate about $253,000 in monthly retirement payments for the rest of their lives. Heaven for them, hell for the ordinary American worker.

Dealing with the coming Social Security funding crisis by raising the payroll tax rate would place a significant burden on low-wage workers, especially when the Federal Reserve has kept interest rates so low that savings accounts yield next to nothing, forcing baby boomers to work longer and retirees to rely even more on Social Security income.

An alternative that merits serious consideration is to increase the ceiling on annual wages subject to Social Security payroll taxes, which is currently $118,500. All annual earnings above that amount are exempt from the tax, meaning that 94 percent of Americans pay Social Security tax on all their earnings but the wealthiest 6 percent do not.

Expanding the payroll tax to all earnings above $118,500 would go a long way toward wiping out the funding shortfall. According to Social Security actuaries, it would keep the Trust Fund solvent for the next 45 years.
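The arithmetic behind the cap is simple enough to sketch. The calculation below uses the 6.2 percent employee share of the payroll tax and the $118,500 ceiling cited above; the individual wage figures are hypothetical, chosen only to show how the tax flattens out once earnings pass the cap.

```python
# Employee-side Social Security (OASDI) tax under the current wage
# cap versus with the cap removed. Wage amounts are illustrative.

RATE = 0.062      # employee share of the payroll tax (6.2 percent)
CAP = 118_500     # 2016 taxable wage ceiling

def tax_with_cap(wages):
    # Only earnings up to the cap are taxed.
    return RATE * min(wages, CAP)

def tax_without_cap(wages):
    # All earnings are taxed if the ceiling is eliminated.
    return RATE * wages

for wages in (50_000, 118_500, 1_000_000):
    print(f"${wages:>9,}: capped ${tax_with_cap(wages):>8,.0f}"
          f"  uncapped ${tax_without_cap(wages):>8,.0f}")
```

A worker earning $50,000 pays the tax on every dollar, while a million-dollar earner pays it on less than 12 percent of earnings; removing the cap equalizes the two.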

Since Social Security began, the need for retirement income has risen as life expectancy has increased by 17 years. Those gains have been concentrated among top earners, who need Social Security the least and whose jobs are less physically demanding than those of construction workers, janitors and the like.

Political leaders have time to decide how to address Social Security’s long-term funding problems. As they contemplate potential solutions, they should consider expanding the payroll tax to include all earnings. It’s a fair way to rescue the program from financial limbo and provide lasting stability without taking draconian measures that would harm tens of millions of hard-working Americans.

Originally Published: December 23, 2016

Infrastructure Yes, But Not Just Any Infrastructure

The American public is routinely bombarded with messages about the need to spend vast sums of money on infrastructure, drumming the subject into the public consciousness with a well-rehearsed catalog of new capital projects. The need is indeed great, but so is the importance of spending wisely. That means emphasizing the lifecycle management of infrastructure assets.

President-elect Trump says his plan to spend $1 trillion on infrastructure projects over 10 years would be paid for by leveraging public-private partnerships and encouraging private investment through tax incentives. Infrastructure spending is a priority Trump shares with congressional Democrats, who have said they believe they can work with him on the Augean task of renewing America’s infrastructure.

What is often overlooked is that infrastructure spending is not just about new construction, but also about maintaining existing assets. Timely lifecycle management and maintenance keep infrastructure assets in a state of good repair, extend their service lives, and significantly reduce overall costs. The rationale is to avoid the high cost of reconstruction and replacement that results from deferred maintenance.

Political leaders frequently say that stewardship of infrastructure assets is essential for economic growth. But the evidence suggests that many of them don’t believe it. They are predisposed to defer maintenance because their concept of the future extends no further than the next election cycle, and the initial timeframe for infrastructure assets to show the effects of irreversible deferred maintenance is much longer than their likely terms in office.

Consider, for example, that large sections of the Washington D.C. transit system are out of service because maintenance has been shortchanged over decades. Service quality declines substantially when maintenance is deferred. Here in Boston, MBTA maintenance has been underfunded for so long that it will take years to eliminate a $7.3 billion maintenance backlog even though the T plans to devote $870 million to the cause this year.

Also, public officials all too frequently understate the true costs of infrastructure projects by focusing on what they cost to build and ignoring operation and maintenance.

Another factor contributing to the failure to maintain infrastructure assets is that highway funding arrangements, for example, traditionally favored capital expenditures for new construction. As originally established by Congress, the Federal-Aid Highway Funding Program specified that Federal Trust Fund grants would cover up to 80 percent of the cost of new construction and subsequent reconstruction or replacement.

But state and local governments had to bear operating and maintenance costs. When highway links inevitably wore out before their time, state and local governments only had to worry about coming up with 20 percent of the total sum from their capital budgets since federal construction grants covered the rest, so maintenance was not a priority.

All but forgotten in this dubious calculus were the costs incurred by motorists who had to struggle with increasingly decrepit highways, as well as plenty of congestion when highway lanes were closed for restoration; an inconvenience, as any driver knows, that always lasts much longer than advertised.

Although later reauthorization bills made federal funds available for rehabilitation, renewal, and reconstruction at levels comparable to new construction, the damage had already been done. Today the advanced deterioration of the nation’s highway system is testament to the consequences of deferred maintenance.

The price tag for renewing America’s infrastructure is astronomical, and comes at a time when a federal funding regime dependent on insufficient fuel tax revenues is least able to afford escalating construction and maintenance costs.

Going forward with a big infrastructure package and setting aside, for the moment, the issue of finding the cash to do it, there needs to be an emphasis on the lifecycle management of infrastructure assets. The health care industry understands that it is far less expensive to keep a patient well than to treat them once they become sick; the same is true for our nation’s infrastructure.

Originally Published: December 10, 2016

Civil and military success depend on developing and adapting strategy

Developing strategy is too often thought of as a by-the-book, one-shot undertaking to provide managers with a comprehensive roadmap that is supposed to cover all eventualities. But in the real world, this is scarcely the case.

Instead, developing an effective strategy is a relatively messy process that involves evaluating everything we know about the external environment at any given time, designing a realistic way to achieve long-term goals, constantly monitoring for changes in the environment, and revising strategies as they are being executed to take such changes into account. Strategy must reflect reality, not what you think the world ought to be like.

With proposals to invest in transportation and other infrastructure currently making headlines, military history provides essential background for those attempting to develop effective strategies for such large undertakings. Without this background, they’re like techno-wannabes trying to do engineering without having studied physics.

As the United States approaches the 75th anniversary of Japan’s surprise attack on Pearl Harbor, we should remember lessons the military has taught us: How to properly develop a strategy, why it must be regarded as an ongoing process, and how it must respect changing realities.

Just before 8 a.m. on Dec. 7, 1941, hundreds of Japanese warplanes attacked the American naval base at Pearl Harbor near Honolulu, Hawaii, killing more than 2,000 and wounding another 1,000. Sixteen warships, including battleships and cruisers, were sunk or disabled in the attack, but the all-important fuel storage and ship repair facilities were left untouched. This omission allowed Pearl Harbor to continue as a forward base for American naval power in the Pacific.

When President Roosevelt delivered his “Day of Infamy” speech the next day, asking Congress to declare war on Japan, the federal government already had a detailed game plan for defeating Japan in the Pacific. It was known as War Plan Orange and had been under development by the U.S. Navy since 1905.

The Navy began this effort and carried it forward in response to growing awareness that the U.S. acquisition of the Philippines during the Spanish-American War was likely to create conflicts with Japan in the western Pacific that could eventually lead to war.

By 1941 War Plan Orange had undergone many revisions and updates to reflect changing political and tactical realities such as the emergence of the aircraft carrier as a naval weapons system that had the potential to become as important as the battleship.

The game plan contained extensive detail about the numbers and types of fighting personnel that would be required to carry out the strategy, and how to recruit, organize and train them. Finally, it detailed the types and quantities of weapons and equipment that would be needed, how to produce them, what kinds and quantities of raw materials their production would require and how and where to allocate them in the theater of war for maximum effect.

It was all there in black and white. And as history has demonstrated, War Plan Orange reflected what actually happened. It was indeed the blueprint for the campaigns that eventually defeated Japan in 1945.

War Plan Orange guided the U.S. to victory over Japan less than four years after Pearl Harbor. This was less than half the time the U.S. spent in Vietnam, and far shorter than the Iraq and Afghanistan wars. It began as a sound strategy and was flexible enough to roll with the punches from events that strategists were unable to anticipate.

Clearly, the United States needs this kind of strategic focus at all levels of government if efforts to address major domestic and foreign policy issues are to succeed. Otherwise the country risks missing worthwhile opportunities, doing new projects and programs without proper coordination, and spending a lot of money just to make things worse.

As a new administration comes into power, it would be wise to recall that, as President Eisenhower famously remarked, “Plans are worthless, but planning is everything.”

Originally Published: November 26, 2016

Infrastructure spending must look forward

Many economists and politicians are once again peddling the conceit that billions of dollars in infrastructure spending (aka investment) will create new jobs, raise incomes, boost productivity and promote economic growth. After all, a report card from the American Society of Civil Engineers gave America’s infrastructure a D+ grade and claimed that an investment of $3.6 trillion is needed by 2020.

But before we accept this idea as gospel, we should remember that the future isn’t likely to look like the past.

Americans are reminded that a large part of President Roosevelt’s New Deal to “Save Capitalism in America” was massive federal investments in economic growth projects like rural electrification, the Tennessee Valley Authority, the Boulder and Grand Coulee Dams, and other monumental hydroelectric generating facilities. Not to mention hundreds of commercial airports like LaGuardia and JFK in New York City, and thousands of modern post offices, schools and courthouses.

The investments culminated in the 41,000-mile National System of Interstate and Defense Highways, begun in the 1950s under President Eisenhower (the “Republican New Dealer”) because of what he had learned from his military experiences leading the allied armies in Europe during World War II.

It is further claimed that Americans have been living off these federal investments ever since. Their contribution to decades of job growth and increasing national prosperity has been so enormous that Americans have come to take them for granted as cost-free gifts from a beneficent God, like the unimaginably bountiful resources of crude oil discovered under that legendary East Texas hill called Spindletop, which came exploding out of the Lucas Number 1 well in 1901 with a roar that shook the world.

The $828 billion stimulus plan President Obama signed in 2009 focused on “shovel-ready” projects like repaving potholed highways and making overdue bridge repairs that could put people to work right away. Still, as Gary Johnson noted in 2011, “My neighbor’s two dogs have created more shovel-ready jobs than the Obama stimulus plan.”

Let’s not kid ourselves, spending for these projects scarcely represented “investment in the future.” Had we been managing infrastructure assets sensibly, they would have been little more than ongoing maintenance activities that should have been funded out of current revenues, like replacing burned-out light bulbs in a factory.

One problem with initiating a massive new capital investment program is figuring out where the dollars to fund it will be found. Projections for escalating federal deficits and skyrocketing debt are bound to raise questions about the federal government’s ability to come up with the necessary cash.

For starters, it’s time to recognize that the future will be quite different from the past, particularly when it comes to transportation infrastructure. Large projects may be rendered obsolete and the burden of stranded fixed costs left to the next generation.

Disruptive technologies such as electric, hybrid, semi-autonomous and self-driving vehicles, along with changing consumer preferences, especially among urban millennials who are more interested in the on-demand riding experience than in driving, are cause for optimism about the future condition of America’s infrastructure.

These new patterns of vehicle ownership and use and the emergence of privately funded technologies are changing the way people and goods move, and transforming the transportation industry in both the public and private sectors. They offer the potential for dramatic improvements in traffic congestion (due to improved safety and reduced spacing between vehicles) and reducing motor vehicle accidents and fatalities.

They can also generate environmental gains from smoother traffic flow, promote productivity growth as reduced congestion improves access to labor markets, and improve utilization of transportation assets such as existing highway capacity, delivering higher throughput without additional capital investment.

These changes create an opportunity for a new generation of political leaders to present the public with a modern vision for transportation, the economy, and the environment, not one that harks back to an earlier time.

Originally Published: Nov 12, 2016

AIG’s $180 billion bailout still stings

 

Eight years ago this month the global financial system seemed on the verge of collapse, and its rescue led to the greatest depredation on the public purse in American history. There were many crucial events during the month of the long knives, but no corporation was more central to the mess than AIG.

On Sept. 7, 2008, the federal government took control of Freddie Mac and Fannie Mae and injected $100 billion to ensure the troubled mortgage lenders could pay their debts. On September 15, Lehman Brothers announced it would file for Chapter 11 bankruptcy and Bank of America acquired Merrill Lynch for $50 billion.

Then, in the biggest bank failure in U.S. history, the Federal Deposit Insurance Corporation (FDIC) seized the assets of Washington Mutual, the sixth largest U.S. bank, and JPMorgan acquired the bank’s deposits, assets, and troubled mortgage portfolio from the FDIC. On Sept. 21, the Federal Reserve approved Morgan Stanley’s and Goldman Sachs’ transition to bank holding companies.

While many blamed the investment banks for high leverage, bad risk management and overreliance on faulty internal models, not to be overlooked is the role AIG, the nation’s largest insurance company, played in the crisis. AIG was once one of the largest and most profitable companies in corporate America, with a gold-plated “AAA” credit rating.

But on September 16, the federal government provided an initial $85 billion in taxpayer cash to bail out the firm. In return, AIG became a ward of Uncle Sam, which acquired 79.9 percent ownership of the company. This was only the first of four bailouts for AIG, totaling an estimated $180 billion.

AIG was in worse shape than Lehman Brothers had been. Yet unlike Lehman, the feds chose to save it. The explanation: AIG was regarded as too big, too global, and too interconnected to fail.

After the 1999 repeal of Glass-Steagall, the law that had regulated financial markets for over six decades, President Clinton signed the Commodity Futures Modernization Act (CFMA) in 2000, which effectively removed derivatives such as Credit Default Swaps (CDS) from federal and state regulation, proving once again that regulators exist to protect the interests of the regulated.

CDS are essentially a bet on whether a company will default on its bonds and loans. AIG was a huge player in the CDS business, which allowed the firm to insure asset-backed securities containing sub-prime mortgages against default.

Although swaps behave similarly to insurance policies, they were not covered by the same regulations as insurance after passage of the CFMA. When an insurance company sells a policy, it is required to set aside a reserve in case of a loss on the insured object. But since credit default swaps were not classified as insurance contracts, there was no such requirement.

AIG’s CDS business caused it serious financial difficulties in 2007, when the housing bubble burst, home values dropped and holders of sub-prime mortgages defaulted on their loans. By selling these contracts without reserves, the firm left itself unprotected if the assets insured by the swaps defaulted. AIG had insured bonds whose repayments were dependent on sub-prime mortgage payments. Yet it never bothered to put money aside to pay claims, leaving the company without sufficient resources to make good on the insurance.

Taxpayers stepped in to pay in full the dozens of banks whose financial products were insured with AIG swaps. Unlike in corporate bankruptcies, none of these counterparties was forced to take a haircut, which meant the government pumped even more public money into the banks.

To add insult to injury, two weeks after the government provided its fourth bailout to AIG in 2009, it was revealed that the firm was paying $165 million in bonuses to retain key employees to unwind the toxic financial waste. Most people understand that when a company goes to the government for a handout, its executives should forgo bonuses. Then again, so much of what happened eight years ago this month defied common sense.

Originally Published: Sep 16, 2016.

Why not just scrap all corporate taxes?

Both major-party presidential candidates claim to have tax plans that will help make the economy work for everyone. They will assist the anxious middle class, the downsized and the dispossessed while growing the economy and jobs. An important component of each is the treatment of corporate profits.

The candidates differ on how to reform the corporate tax code. Front-runner Hillary Clinton has not embraced President Barack Obama’s proposal to reduce the federal corporate marginal tax rate to 28 percent from 35 percent, the highest in the developed world, and pair the reduction with a broader tax base (fewer exemptions) to generate savings to finance the proposed cut. Instead, Clinton has proposed tighter rules to deter corporations from moving abroad and measures to prevent corporate tax avoidance.

Donald Trump, on the other hand, favors a 15 percent corporate income tax rate and would offer corporations a reduced 10 percent rate if they bring home some of the $2 trillion American corporations have stashed overseas.

But the best path might just be to scrap the corporate tax altogether.

Neither candidate has addressed the fact that most high-income countries have adopted a territorial tax system, in which income earned abroad is not taxed by the home country. Yet the U.S. continues to use a version of a global tax system that taxes domestic companies’ income regardless of where it is earned.

The case for lowering the tax rate is that the gap between the U.S. rate and that of other countries encourages companies to shift investment and profits overseas. Corporations complain that high corporate taxes and a global tax system make it more difficult for them to compete in the world economy, attract foreign investment to the U.S. and create American jobs.

The American public is greatly unimpressed by these arguments. Polls show that the majority of Americans believe corporations pay less than their fair share in taxes. According to a survey by Citizens for Tax Justice, many Fortune 500 companies paid an average effective federal tax rate of just 19.4 percent, much less than the 35 percent marginal rate (the additional tax paid on an extra dollar of income).
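The gap between the marginal and effective rates comes down to deductions and exemptions, and is easy to illustrate. The sketch below uses the 35 percent marginal rate from the column; the dollar figures for a hypothetical firm’s profit and deductions are made up purely for illustration.

```python
# A firm's marginal rate applies only to its taxable income; deductions
# and exemptions shrink the taxable base, so the effective rate (tax
# actually paid as a share of total pre-tax profit) can be far lower.

MARGINAL_RATE = 0.35  # statutory federal corporate rate cited above

def effective_rate(pretax_profit, deductions):
    taxable_income = max(pretax_profit - deductions, 0)
    tax_paid = MARGINAL_RATE * taxable_income
    return tax_paid / pretax_profit

# Hypothetical firm: $100M in profit, $45M sheltered by deductions.
rate = effective_rate(100_000_000, 45_000_000)
print(f"effective rate: {rate:.1%}")
```

With 45 percent of profit sheltered, the effective rate works out to roughly 19 percent, which is how a 35 percent statutory rate coexists with the 19.4 percent survey average.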

A key target of public criticism is the expansion of deductions and exemptions from corporate income that has contributed to its decline as a share of total tax revenue over the last several decades. Corporate income taxes accounted for 32 percent of federal tax revenue in 1951; by 2015 the figure was 11 percent.

The U.S. has a dysfunctional and confusing tax code. Lost between fact and fiction is the question of who bears the economic burden of taxing corporate profits. You don’t have to be drunk, crazy or both to understand that this is a nontrivial question.

Are corporate taxes simply another way to tax firms’ shareholders, employees, and customers? When corporate income is paid out as dividends or realized as capital gains, corporations and shareholders pay tax twice on the same income. Are these the only people who actually end up paying the corporate income tax, or do employees also pay in the form of lower wages and fewer benefits?

If that is indeed the case, then perhaps it is time to scrap the corporate income tax altogether and instead tax individuals on their dividends and capital gains at ordinary income tax rates. The corporate income tax would go the way of Prohibition, and in the process make the U.S. a desirable place to locate and build businesses.

This certainly isn’t the last word on the subject, but it isn’t a bad approach to reforming the corporate tax code, which will likely be addressed after the 2016 elections if one party controls the White House and Congress. If not, we’ll just continue to improvise, and likely produce the same dysfunctional results.

Originally Published: Aug 23, 2016

Aftermath of the Brexit vote will be long and uncertain

Once again we are reminded that nothing is forever; not now, not ever, never. On June 23 Great Britain’s electorate voted to quit the 28-member European Union despite threats that economic Armageddon would follow. The Brexit vote represented the sort of populist victory over establishment politics that gives elites, few of whom have the scars of the marketplace, agita.

The take-home message is that the British voted to be free to make their own decisions on issues from trade to immigration and free from burdensome one-size-fits-all E.U. regulations passed by unelected, know-it-all Eurocrats.

Like the Arab Spring, the result took many by surprise. Part of the shock came from the fact that pundits, pollsters, and bookmakers all got it spectacularly wrong. They were pretty sure that the British would reject Brexit, the clever name given to the decision to leave the E.U. Few people were surprised when there was a steep sell-off the following day, with shock and awe in financial markets around the globe.

Leadership in the country’s two major parties was in disarray following the vote and none of the British leaders had their hands on the wheel as the vehicle was careening off the cliff. Conservative Prime Minister David Cameron, an opponent of leaving, fell on his sword and announced the next day that he would resign his post but would linger as a lame duck for several months while leaving the divorce negotiations to his successor.

Labour was also in limbo, with a leadership challenge being organized against Jeremy Corbyn, who was blamed for a half-hearted effort to keep Britain in the E.U. Perhaps they were thinking they would simply call Harry Potter and borrow his magic wand to deal with the aftermath of the vote.

Britain’s departure must be negotiated with the E.U. and should come at less economic and political cost than when America severed its relationship with an offshore power in 1776. The E.U. wants Britain to kick-start the legal process of quitting by immediately invoking Article 50 of the Lisbon Treaty, roughly 250 words that set guidelines for divorce and provide a two-year window for talks.

But there is no requirement that Britain invoke the article until it chooses to do so. Until then it remains a full member, with all privileges and obligations.

All this foot dragging contributes to a policy vacuum and heightens the uncertainty surrounding the divorce. If you are a British firm looking to expand, do you implement your plans or consider relocating somewhere where the relationship with the E.U. is more settled?

The vote also casts uncertainty over the future of the Union Jack. Scotland and Northern Ireland voted to remain in the E.U. Scotland is now considering a second independence referendum that would give its electorate the opportunity to leave the United Kingdom and stay in the E.U.

It will take years to sort through the economic impact of the vote as Great Britain and the E.U. negotiate post-Brexit relationships. Ideally the British want to work out trade agreements that maintain unfettered access to the single E.U. market, but without the requirement for the free movement of people. But it is hard to see why E.U. member states would agree to unravel rules on free movement that they regard as sacrosanct.

The loss of Great Britain raises fears that the E.U. will disintegrate into rival nation states. Other members such as the Netherlands may stage referendums on leaving. You can expect the E.U. to take a tough line on the terms of Britain’s departure to make it clear to any other nation that might try to ride British coattails out of the union that there is a considerable cost to doing so.

The only certainty is the cascade of commentary you will hear about this complicated story as the divorce papers are filed in the coming weeks and months.

Originally Published: Jul 9, 2016

Candidates run from, are ignorant about, and mostly just ignore the national debt

There have been few signs that the three remaining presidential candidates seeking to capture the nation’s commanding heights are willing to confront the subject of America’s public debt, which has grown to over $19 trillion, more than the gross domestic product. It is estimated that by 2023, entitlement payments, military spending and interest on the debt will consume 100 percent of tax revenues.

All three have behaved like they know less than zilch about the subject. Assuming the final match-up is Hillary Clinton versus Donald Trump, you get to choose from two disliked candidates who give egomaniacs a bad name. All in all, this match-up is not a battle of good against evil. It is a choice between bad and less bad.

When it comes to the debt, all three remaining candidates have behaved like Scarlett O’Hara in “Gone with the Wind,” who reacted to every adverse circumstance with the statement: “I can’t think about that right now. If I do, I’ll go crazy. I’ll think about that tomorrow.”

Donald Trump, the presumptive Republican nominee, did wade into the subject several weeks ago. There are a thousand things you can say about Trump, some of which you can even print in newspapers. But we have come to know one thing above all else: He’s going to say what is on his mind.

Several weeks ago, Trump made the stunning suggestion that maybe Uncle Sam can save a few shekels by renegotiating the public debt and paying back holders of United States bonds less than 100 cents on the dollar. Such action would be tantamount to a default. His proposal overshadows everything he has said about the economy. It was greeted as lunacy and created quite a kerfuffle in global financial markets, which found his suggestion as enticing as exploratory surgery.

Despite concerns about the United States putting its fiscal house in order, Treasury securities are seen as among the world’s safest, if not the safest, debt because they are backed by the full faith and credit of the United States government. No other investment carries as strong a guarantee that interest and principal will be paid in full and on time.

Responding to the tsunami of ridicule that greeted this absurd suggestion, Trump walked back his comments the following day, saying he never meant to suggest he wanted the United States to default on its debt.

Some perspective is in order here regarding who owns our nation’s debt. American stakeholders own nearly $13 trillion of the more than $19 trillion. More than $5 trillion is held by trust funds such as Social Security and the Highway Trust Fund; $5.1 trillion is held by individuals, pension funds and state and local governments; and the balance of $2.5 trillion is held by the Federal Reserve.

Of the remaining $6.2 trillion, China holds $1.3 trillion, followed by Japan with $1.1 trillion, and the $3.8 trillion that’s left is held by other countries such as Saudi Arabia, with $117 billion.
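The figures above can be checked with a quick back-of-the-envelope tally. This is just a sketch that re-adds the approximate dollar amounts cited in this column:

```python
# Approximate holders of U.S. public debt (in trillions of dollars),
# using the rounded figures cited above.
domestic = {
    "trust funds (Social Security, Highway)": 5.0,
    "individuals, pensions, state/local governments": 5.1,
    "Federal Reserve": 2.5,
}
foreign = {
    "China": 1.3,
    "Japan": 1.1,
    "other countries (incl. Saudi Arabia)": 3.8,
}

domestic_total = sum(domestic.values())  # roughly 12.6 -- "nearly $13 trillion"
foreign_total = sum(foreign.values())    # roughly 6.2

print(f"Domestic: ${domestic_total:.1f}T")
print(f"Foreign:  ${foreign_total:.1f}T")
print(f"Total:    ${domestic_total + foreign_total:.1f}T")
```

The domestic and foreign totals sum to just under $19 trillion, consistent with the "more than $19 trillion" figure once unrounded amounts are included.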

Foreign governments don’t own us; we owe us.

While nobody knows for certain what would happen, paying creditors anything less than the full amount owed would undermine the very notion of the full faith and credit guarantee of United States government sovereign debt. Americans whose savings and retirement accounts include Treasury bonds would be hurt. International investors would panic and raise future borrowing costs for the United States government by demanding higher interest rates, since the debt would be seen as a less safe investment. This would prompt interest rates around the globe, which are often tied to U.S. Treasuries, to spike. After all, U.S. Treasuries are the pillar of the global financial system.

Sadly, it’s time to toss in the towel, the tablecloth and the rest of the accoutrements and admit it: We got these candidates to this point; they are what the American public deserves.

Originally published: May 28, 2016

Free trade doesn’t work for most American workers

The aphorism “A rising tide lifts all boats” has become entwined with a basic assumption that free trade results in economic wins for all players in the global economy. Of course this assumes you are lucky enough to have a boat that has not run aground.

The classic case for free trade was made nearly 200 years ago by economist David Ricardo. This static argument relies on the principle of comparative advantage: trade enables countries to specialize in the goods and services they produce relatively more efficiently than their trading partners do. This increases overall productivity and total output.

The conclusion follows from countries having different opportunity costs of producing tradeable goods. The opportunity cost of any good is the other goods that could have been produced by the same resources. Each country focuses on what it does best and everyone gains. This notion of free trade has a hallowed status among the cheerleaders for globalization.

Another way to understand comparative advantage is to consider the opportunity cost of undertaking a certain activity. Let’s assume that Lady Gaga, the famous entertainer, also happens to be a world-class typist. Rather than entertaining and typing, she should specialize in entertaining, where her comparative advantage is greatest and she could maximize her income.

In this example, Lady Gaga has a much higher opportunity cost of typing than her secretary does. If Lady Gaga spent an hour typing while the secretary spent that hour on other office work, the entertainment income she would forgo far exceeds the cost of a typist, and there would be a loss of overall output.
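The same logic applies between countries. Here is a minimal sketch with hypothetical numbers (the countries, goods, and output rates are illustrative, not from the column) showing that when each country specializes according to its lower opportunity cost, total output of both goods rises:

```python
# Hypothetical output per worker-hour for two countries and two goods.
#               cloth   wheat
# Country A:      4       2    -> opportunity cost of 1 cloth = 0.5 wheat
# Country B:      1       3    -> opportunity cost of 1 cloth = 3.0 wheat
output = {"A": {"cloth": 4, "wheat": 2},
          "B": {"cloth": 1, "wheat": 3}}

def opportunity_cost_of_cloth(country):
    """Wheat forgone for each unit of cloth produced."""
    return output[country]["wheat"] / output[country]["cloth"]

# A gives up less wheat per unit of cloth, so A specializes in cloth
# and B specializes in wheat.
hours = 10

# Each country splits its 10 hours evenly between the two goods:
split = {
    "cloth": 5 * output["A"]["cloth"] + 5 * output["B"]["cloth"],  # 25
    "wheat": 5 * output["A"]["wheat"] + 5 * output["B"]["wheat"],  # 25
}

# Each country fully specializes in its comparative advantage:
specialized = {
    "cloth": hours * output["A"]["cloth"],  # 40
    "wheat": hours * output["B"]["wheat"],  # 30
}

print("No specialization:  ", split)
print("With specialization:", specialized)
```

With the same total labor, specialization yields more of both goods (40 cloth and 30 wheat versus 25 of each), which is the gain Ricardo's argument points to.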

The real world is much more complex. Free trade has a downside: while its benefits are broadly distributed, costs are often concentrated. Consider the case of American textile workers. In the aggregate, American consumers gain by having access to cheap clothing, but unemployed textile workers bear the loss.

Many free trade cheerleaders confuse it with offshoring jobs, which is simply substituting cheap foreign labor for more expensive American labor when nothing is in fact being traded. Moving production overseas has nothing to do with comparative advantage; it simply reflects wage and price competition from countries seeking jobs and economic growth.

If a firm shifts production to low-wage countries, its profits improve, driving up share prices and senior management performance bonuses. To paraphrase one-time presidential candidate Ross Perot: If you can build a factory overseas, pay about a dollar an hour, have little or no health care or retirement benefits and no environmental controls, then you are the greatest businessman in the world.

But when many firms move overseas, American workers lose their incomes. So when do the costs of lost incomes and forgone government revenues exceed the benefits to consumers of lower prices? Put differently, do the costs of exporting good-paying American jobs outweigh the gains from cheaper imports, and do they contribute to a shrinking middle class?

Free trade advocates contend that the Americans left unemployed have acquired new skills and will find better jobs in “sunrise” industries. In reality, how many steelworkers do you know who have become computer software engineers?

This is one reason why Americans’ real incomes have stopped growing as manufacturing jobs have been moved offshore.

As then-presidential candidate Barack Obama said in 2008, “You go into these small towns in Pennsylvania and like a lot of small towns in the Midwest, the jobs have been gone now for over 25 years and nothing’s replaced them. And it’s not surprising, then they get bitter, they cling to guns, or religion or antipathy to people who aren’t like them or anti-immigrant sentiment or antitrade sentiment to explain their frustrations.”

A former General Motors CEO allegedly said “what is good for GM is good for America.” But offshoring challenges the conventional wisdom that American firms generally advance the nation’s economic interests. When they employ a large foreign workforce but few people within the United States, it certainly is good for the firms, but not for the American worker.

Originally Published: April 16, 2016.