Why not just scrap all corporate taxes?

Both major-party presidential candidates claim to have tax plans that will help make the economy work for everyone. They will assist the anxious middle class, the downsized and the dispossessed while growing the economy and jobs. An important component of each is the treatment of corporate profits.

The candidates differ on how to reform the corporate tax code. Front-runner Hillary Clinton has not embraced President Barack Obama’s proposal to reduce the federal corporate marginal tax rate to 28 percent from 35 percent, the highest in the developed world, and pair the reduction with a broader tax base (fewer exemptions) to generate savings to finance the proposed cut. Instead, Clinton has proposed tighter rules to deter corporations from moving abroad and measures to prevent corporate tax avoidance.

Donald Trump, on the other hand, favors a 15 percent corporate income tax rate and would offer corporations a reduced 10 percent rate if they bring home some of the $2 trillion American corporations have stashed overseas.

But the best path might just be to scrap the corporate tax altogether.

Neither candidate has addressed the fact that most high-income countries have adopted a territorial tax system, in which income earned abroad is not taxed by the home country. Yet the U.S. continues to use a version of a global tax system that taxes domestic companies’ income regardless of where it is earned.

The case for lowering the tax rate is that the gap between the U.S. rate and that of other countries encourages companies to shift investment and profits overseas. Corporations complain that high corporate taxes and a global tax system make it more difficult for them to compete in the world economy, attract foreign investment to the U.S. and create American jobs.

The American public is greatly unimpressed by these arguments. Polls show that a majority of Americans believe corporations pay less than their fair share in taxes. According to a survey by Citizens for Tax Justice, many Fortune 500 companies paid an average effective federal tax rate of just 19.4 percent, much less than the 35 percent marginal rate, the additional tax paid on an extra dollar of income.
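To see how an effective rate can sit so far below the marginal rate, here is a minimal sketch in Python, using purely hypothetical figures rather than any real company’s books: once deductions shrink the taxable base, the tax actually paid falls relative to total profit.

```python
# Hypothetical figures for illustration only; not any real company's taxes.
PRETAX_PROFIT = 100_000_000   # book profit before tax
DEDUCTIONS = 44_000_000       # exemptions, credits, deferrals, etc.
MARGINAL_RATE = 0.35          # statutory rate on the next dollar of income

taxable_income = PRETAX_PROFIT - DEDUCTIONS
tax_paid = taxable_income * MARGINAL_RATE

# The effective rate divides tax actually paid by total pretax profit.
effective_rate = tax_paid / PRETAX_PROFIT
print(f"effective rate: {effective_rate:.1%}")  # 19.6%, despite a 35% marginal rate
```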

A key target of public criticism is the expansion of deductions and exemptions to corporate income, which has contributed to its decline as a share of total tax revenue over the last several decades. Corporate income taxes accounted for 32 percent of federal tax revenue in 1951; by 2015 that share had fallen to 11 percent.

The U.S. has a dysfunctional and confusing tax code. Lost between fact and fiction is the question of who bears the economic burden of taxing corporate profits. You don’t have to be drunk, crazy or both to understand that this is a nontrivial question.

Are corporate taxes simply another way to tax firms’ shareholders, employees, and customers? When corporate income is paid out as dividends or realized as capital gains, corporations and shareholders pay tax twice on the same income. Are these the only people who actually end up paying the corporate income tax, or do employees also pay in the form of lower wages and fewer benefits?

If that is indeed the case, then perhaps it is time to scrap the corporate income tax altogether and instead tax individuals on their dividends and capital gains at ordinary income tax rates. The corporate income tax would go the way of Prohibition, and in the process make the U.S. a desirable place to locate and build businesses.

This certainly isn’t the last word on the subject, but it isn’t a bad approach to reforming the corporate tax code, which will likely be addressed after the 2016 elections if one party controls the White House and Congress. If not, we’ll just continue to improvise, and likely produce the same dysfunctional results.

Originally Published: Aug 23, 2016

And God created woman

The all-female leads in the latest “Ghostbusters” movie reboot have upset many misogynistic male fans. The film’s trailer is the most “disliked” in the history of YouTube.

No surprise here.

There are always men who are angered by women demanding their reproductive rights or running for president. From the dawn of human awareness, men have used their greater physical size and strength to control, oppress, subvert and generally abuse women, betraying a deep fear of losing male power.

Men have always come first in human societies. This is reflected in the standard version of the Adam and Eve myth that is enshrined in Judeo-Christian culture and which, it goes without saying, was invented by men.

But there is a very different version of the myth, which draws on the Talmudic tradition of Midrash. According to this version, as God was nearly done populating the planet earth, he realized that he had not yet developed a really effective serial killer among land animals. He had created the shark, the barracuda, and the piranha, but these were sea animals. He needed a creature at least as formidably murderous to roam the earth’s land.

So God marshaled his creative powers and after much research and development, he finally came up with the greatest serial killer of all – the cat.

God immediately realized that this was his masterpiece among the earth’s creatures, so he developed more versions of cats than of any other species, ranging in size from 900-pound Siberian tigers to tiny felines of little more than a few pounds each, all of them endowed with the physical and instinctive characteristics needed to be world champion serial killers.

God was so pleased with what he had done that he decided to award himself a prize. The prize was Eve, a remarkable creature who seemed to epitomize the grace and mystery inherent in the feline species. After admiring Eve for a while, God placed her in the Garden of Eden for safekeeping while he went off to clean up some loose ends on Jupiter.

But Eve, like all women, had her own ideas. One involved having a bigger, stronger, more-or-less mirror image of herself to take out the garbage, mow Eden’s lawns, bring her armloads of fresh fruit and fill her nights with ecstasy. So by herself, while God’s back was turned, she conceived Adam and brought him into the world to be her companion, even though Adam turned out to be something of a mixed blessing because of his domineering ways and general contrariness.

Thus began the great saga of human dominion over the earth. In metaphorical terms, this story is generally consistent with what many anthropologists believe actually happened when humans first appeared. But this Eve-first reality was too disturbing for the men who wrote the Old Testament, men who, like all men, were instinctively terrified of women.

So they came up with a story that made Adam the first human being and reduced Eve to something of an afterthought created from, of all things, one of Adam’s ribs. They also invented numerous fairy tales to blame women for all the world’s troubles. “The woman made me do it,” Adam insisted when God asked him why he had eaten the forbidden fruit from the Tree of Knowledge.

This atavistic fear of women drove men to use their superior physical size and strength to develop male chauvinist societies in a fruitless attempt to make women seem less intimidating. Men denied them basic human rights, restricting their freedom, reducing them to the status of chattels (“Who giveth this woman in marriage?”) and subjecting them to all the other examples of male tyranny.

In one form or another, these irrational and fear-based attempts to suppress recognition of women as equals are universal in human religions, myths, cultural traditions, knee-jerk social norms and even legal codes. Now we can add the reaction to Hollywood casting decisions to that list.

Originally Published: Aug 6, 2016

Stock buybacks do nothing for most of us

Economic inequality in the United States is at historic levels. In the wake of the Great Recession, the issue has captured the attention of the American public, but there is little consensus about its causes. One of the causes is clearly the rise in corporate stock buybacks and short-term thinking.

In the 1980s, the top 1 percent of Americans accounted for 10 percent of the income generated in the economy; by 2012 it was approaching 20 percent. The top 1 percent controlled nearly 42 percent of the wealth, a level not seen since the roaring ’20s.

This increased inequality does not support, and even inhibits, the consumer spending that drives economic growth in the United States because it leaves the middle class with less buying power.

Those who are supposedly smart on the issue point to a range of reasons for economic inequality, such as technological change, the decline of unions, globalization and trade agreements. Often overlooked is the expansion of the financial sector and corporate America’s Ahab-like obsession with short-term thinking.

According to the Bureau of Economic Analysis, in 1970 the finance and insurance industries accounted for 4.2 percent of gross domestic product, up from 2.8 percent in 1950. By 2012, the sector represented 6.6 percent.

The story with profits is similar: In 1970, finance and insurance industry profits made up about a quarter of the profits of all sectors, up from 8 percent in 1950. Despite the aftereffects of the financial crisis, that number had grown to 37 percent by 2013. Yet these industries create only 4 percent of all jobs, so the profits go to a small minority.

The increase in the influence of the financial sector extends to public corporations, which face increased pressure to make immediate investor payouts through stock buybacks. According to Research Affiliates, S&P 500 companies spent $521 billion on stock buybacks in 2013 and $634 billion in 2014. More than $6.9 trillion has been spent on share buybacks since 2004. Not one dime of this money has gone into expanding operations, hiring more employees, increasing wages, funding research and development, enhancing productivity, or improving the customer experience.

An important part of the appeal of stock buybacks is their ability to increase earnings per share. In theory, buybacks tend to jack up the share price, at least in the short term, by decreasing the number of shares outstanding while increasing earnings per share. Corporations frequently finance these buybacks by issuing debt, taking advantage of the Federal Reserve holding interest rates underwater and the fact that interest expense on the debt is tax deductible.
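The arithmetic is easy to sketch. Assuming a hypothetical company with flat earnings, a debt-financed buyback lifts earnings per share without a dollar of new business:

```python
# Hypothetical company for illustration only: earnings unchanged, share count shrinks.
earnings = 1_000_000_000        # annual net income, flat year over year
shares_before = 500_000_000
buyback = 50_000_000            # 10% of shares repurchased, often with borrowed money

eps_before = earnings / shares_before
eps_after = earnings / (shares_before - buyback)

print(f"EPS before: {eps_before:.2f}")  # 2.00
print(f"EPS after:  {eps_after:.2f}")   # 2.22, an 11% 'gain' with no growth at all
```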

Underlying all this are two notions. First, the only responsibility of the corporation is to maximize shareholder value as reflected in the stock price, as opposed to getting sidetracked by talk about multiple stakeholders such as employees, customers and the community.

The second is that corporate management should be compensated in stock to align their interests with those of shareholders. Since managers’ pay is tied to the firm’s stock performance, the temptation to manage earnings to meet short-term investor expectations, even at the expense of long-term shareholder value, is quite strong. For example, if the choice is between repairing the roof on the factory in Toledo this quarter or missing the quarterly earnings figure, which could cause earnings per share to tumble, corporate management might decide not to make the capital investment.

Stock-based compensation has also contributed to the sharp rise in CEO compensation. Between 1978 and 2013, CEO compensation increased nearly 10-fold while workers experienced stagnant wages and increasing job insecurity.

While corporate and finance executives live in a second gilded age, stock buybacks and short-term thinking contribute to underinvesting in innovation and skilled workers, and ultimately to more economic inequality. But none of this troubles the 1 percenters, and they appear to be the only ones who really matter.

Originally Published: Jul 23, 2016

Aftermath of the Brexit vote will be long and uncertain

Once again we are reminded that nothing is forever; not now, not ever, never. On June 23 Great Britain’s electorate voted to quit the 28-member European Union despite warnings that economic Armageddon would follow. The Brexit vote represented the sort of populist victory over establishment politics that gives elites, few of whom have the scars of the marketplace, agita.

The take-home message is that the British voted to be free to make their own decisions on issues from trade to immigration and free from burdensome one-size-fits-all E.U. regulations passed by unelected, know-it-all Eurocrats.

Like the Arab Spring, the result took many by surprise. Part of the shock came from the fact that pundits, pollsters, and bookmakers all got it spectacularly wrong. They were pretty sure that the British would reject Brexit, the clever name given to the decision to leave the E.U. Few people were surprised when there was a steep sell-off the following day, with shock and awe in financial markets around the globe.

Leadership in the country’s two major parties was in disarray following the vote and none of the British leaders had their hands on the wheel as the vehicle was careening off the cliff. Conservative Prime Minister David Cameron, an opponent of leaving, fell on his sword and announced the next day that he would resign his post but would linger as a lame duck for several months while leaving the divorce negotiations to his successor.

Labour was also in limbo, with a leadership challenge being organized against Jeremy Corbyn, who was blamed for a half-hearted effort to keep Britain in the E.U. Perhaps they were thinking they would simply call Harry Potter and borrow his magic wand to deal with the aftermath of the vote.

Britain’s departure must be negotiated with the E.U. and should come at less economic and political cost than when America severed its relationship with an offshore power in 1776. The E.U. wants Britain to kick-start the legal process of quitting by immediately invoking Article 50, the roughly 250-word treaty provision that sets guidelines for divorce and provides a two-year window for talks.

But there is no requirement that Britain invoke the article until it chooses to do so. Until then it remains a full member, with all privileges and obligations.

All this foot dragging contributes to a policy vacuum and heightens the uncertainty surrounding the divorce. If you are a British firm looking to expand, do you implement your plans or consider relocating somewhere where the relationship with the E.U. is more settled?

The vote also casts uncertainty over the future of the Union Jack. Scotland and Northern Ireland voted to remain in the E.U. Scotland is now considering a second independence referendum that would give its electorate the opportunity to leave the United Kingdom and stay in the E.U.

It will take years to sort through the economic impact of the vote as Great Britain and the E.U. negotiate post-Brexit relationships. Ideally the British want to work out trade agreements that maintain unfettered access to the single E.U. market, but without the requirement for the free movement of people. But it is hard to see why E.U. member states would agree to unravel rules on free movement that they regard as sacrosanct.

The loss of Great Britain raises fears that the E.U. will disintegrate into rival nation states. Other members such as the Netherlands may stage referendums on leaving. You can expect the E.U. to take a tough line on the terms of Britain’s departure to make it clear to any other nation that might try to ride British coattails out of the union that there is a considerable cost to doing so.

The only certainty is the cascade of commentary you will hear about this complicated story as the divorce papers are filed in the coming weeks and months.

Originally Published: Jul 9, 2016

A LESSON OF WAR: Iraq, Afghanistan and a battle from a century past

The Battle of the Somme was a meat grinder. The centenary of this battle, fought midway through World War I, will be commemorated on July 1 in Great Britain, France and other countries that lost men in one of the largest and bloodiest battles in the history of human warfare.

Between July 1 and Nov. 18, 1916, the British suffered about 420,000 casualties, the French about 200,000 and the Germans about 465,000. All told, 300,000 soldiers died and little was achieved. Somme was like America’s recent conflicts in Iraq and Afghanistan writ large.

After two years of relative stalemate, allied forces decided to make a big push to break through the German lines and, they hoped, achieve a quick and decisive victory on the Western Front, much as politicians and generals assumed quick victories would come in Iraq and Afghanistan. The offensive was designed to relieve pressure on French forces under German attack at Verdun and to take control of a 20-mile stretch of the meandering River Somme.

The first day of that battle was the bloodiest in the history of the British army. Of the 120,000 troops who went into battle, about 60,000 became casualties, as many as 20,000 of whom died before the day was over.

The plan drawn up by generals in their chateau headquarters miles behind the battlefield was for an artillery barrage to pound the German defenses to an extent that the attacking British could just walk in and occupy the opposing trenches with minimal opposition. Cavalry units would then gloriously pour through the German lines, pursue the fleeing Germans and turn the tide of a war that had been in a deadly stalemate for the better part of two years.

Before the battle started, the British fired over a million and a half shells at the German soldiers, many of which either did not explode or completely missed their targets.

During seven days and nights of bombardment that removed the element of surprise, German troops simply moved into their deep underground concrete bunkers and waited. When the artillery pounding stopped, rows of British soldiers walked uphill in successive waves across no-man’s-land and were mowed down, easy targets for German machine gun nests. By nightfall, few of the objectives had been taken despite massive loss of life.

The offensive would continue for another 4 1/2 months in a similar vein. After July 1, a long stalemate settled in as the British employed the same hopeless method of attack conforming to a prefabricated interpretation of events on the ground, despite assault after assault turning into a killing ground. Somme became a bloody battle of attrition.

By the end of the battle, a massive loss of human life had netted the allies roughly six miles of German-held territory.

The battle helped cement the reputation of World War I as a war of terrible slaughter caused by poor decisions on the part of high commanders. The troubled British offensive resulted in the epithet “lions led by donkeys.”

Today, revisionist historians contend that the battle, while costly and flawed, put an end to German hopes at Verdun, badly weakened the German army and helped the British learn new tactics for successfully prosecuting future offensives.

Traditionalists believe this interpretation airbrushes reality. They say the battle achieved nothing but untold misery and loss. It was an unjustified bloodbath and evidence of the British high command’s incompetence. They argue that British military leaders failed in the fashion of Pyrrhus, who lamented after the battle at Asculum: “Another such victory over the Romans and we are undone.”

Having just lived through two conflicts, Americans can relate to this quote. Iraq and Afghanistan, the latter of which is ongoing, both created more problems than they solved. Optimistic miscalculations led to unintended consequences and bloody inconclusiveness. And so it goes.

Originally Published: Jun 25, 2016

The Fed got it wrong

The job market received a jolt last week when the Labor Department reported that just 38,000 jobs were added in May, the fewest for any month in more than five years. The experts had expected a gain of 150,000 jobs, a figure that already factored in the loss of about 35,000 striking Verizon workers.

Equally disturbing, the job numbers for the two previous months were revised downward. In total, there were 59,000 fewer jobs in March and April than had previously been reported. This suggests the May numbers will be revised downward next month.

But it gets worse. Of the 38,000 new jobs, only 25,000 were in the private sector. Yet even as job growth stalled, the headline unemployment rate fell to 4.7 percent from 5 percent, in large part due to a drop in the labor force participation rate as many frustrated Americans stopped looking for jobs, meaning they are not counted in the unemployment rate. It’s an ominous sign that suggests the economy may be slowing.
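The arithmetic behind that ominous sign is worth spelling out. In this minimal sketch, with invented round numbers rather than actual Labor Department data, the headline rate improves even though not a single job is created:

```python
# Invented round numbers for illustration; not actual Labor Department data.
employed = 150_000_000
unemployed = 7_900_000   # only those actively looking for work are counted

rate = unemployed / (employed + unemployed)
print(f"before: {rate:.1%}")  # 5.0%

# 600,000 discouraged job seekers give up and leave the labor force entirely.
unemployed -= 600_000
rate = unemployed / (employed + unemployed)
print(f"after:  {rate:.1%}")  # 4.6%, 'better' with zero new jobs
```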

Since the end of the recession, economic growth has been lackluster despite the Federal Reserve putting the pedal to the metal by pursuing zero interest rates and engaging in bond purchases known as quantitative easing. The rationale for this policy is that artificially suppressed interest rates and easy money are required for the Fed to fulfill its full-employment mandate. The assumption is that low rates stimulate business investment and make it easier for consumers to finance big-ticket purchases such as housing and automobiles.

The May employment numbers are just the latest evidence that it isn’t working. This should come as no surprise, since the Fed high priests’ failure to prophesy the 2008 crisis has been well documented.

President Truman once famously asked for a one-armed economist because his economic advisers kept telling him “on the one hand this, and on the other hand that.” For sure, there are pros and cons to the Fed’s monetary policy. Low interest rates have contributed to a partial recovery and growth may be stronger than it would otherwise have been. The rationale was to lower interest rates to encourage movement into riskier assets with higher yields, including stocks, junk bonds, real estate and commodities. The Fed has privileged Wall Street over Main Street in the belief that the wealth effect will trickle down to the ordinary American worker.

Lower rates would encourage greater leverage, i.e., borrowing to invest and boost asset prices. This pseudo “wealth effect” would then stimulate consumption, economic growth, and job creation. Such monetary policy raises the question of whether the Fed should be promoting risk and inflated asset prices that outpace real economic growth.

On the other hand, zero interest rates have created problems for savers, retirees and those on the other side of the velvet rope. Savers get virtually no return on their money market funds and saving accounts. Indeed, after inflation and taxes, real rates on these instruments are negative, promoting inequality and resulting in declining purchasing power. With so many Americans living paycheck to paycheck, is it any wonder that payday lenders are doing record business?

Lost interest is a permanent loss of wealth. Very low interest rates force retirees, who rely on interest income, to reduce their spending. Workers contemplating retirement will stay in the labor force longer to save more, blocking access for younger workers.

More importantly, low interest rates play havoc with retirement planning for both individuals and pension plans. Pension funds face increasing unfunded liabilities. Without adequate future income streams, retirement as Americans have known it is off the table.

Fed policy can’t overcome structural weakness in the job market that results from the twin challenges of globalization and rapid technological change. Continuing the policy of cheap credit is reminiscent of the old lesson about looking for a lost item under a lamppost at night because that’s where the light is. It’s time to look elsewhere for answers.

Originally Published: Jun 11, 2016

Candidates run from, are ignorant about, and mostly just ignore the national debt

There have been few signs that the three remaining presidential candidates seeking to capture the nation’s commanding heights are willing to confront the subject of America’s public debt, which has grown to over $19 trillion, more than the gross domestic product. It is estimated that by 2023, entitlement payments, military spending and interest on the debt will consume 100 percent of tax revenues.

All three have behaved like they know less than zilch about the subject. Assuming the final match-up is Hillary Clinton versus Donald Trump, you get to choose from two disliked candidates who give egomaniacs a bad name. All in all, this match-up is not a battle of good against evil. It is a choice between bad and less bad.

When it comes to the debt, all three remaining candidates have behaved like Scarlett O’Hara in “Gone with the Wind,” who reacted to every adverse circumstance with the statement: “I can’t think about that right now. If I do, I’ll go crazy. I’ll think about that tomorrow.”

Donald Trump, the presumptive Republican nominee, did wade into the subject several weeks ago. There are a thousand things you can say about Trump, some of which you can even print in newspapers. But we have come to know one thing above all else: He’s going to say what is on his mind.

Several weeks ago, Trump made the stunning suggestion that maybe Uncle Sam can save a few shekels by renegotiating the public debt and paying back holders of United States bonds less than 100 cents on the dollar. Such action would be tantamount to a default. His proposal overshadows everything he has said about the economy. It was greeted as lunacy and created quite a kerfuffle in global financial markets, which found his suggestion as enticing as exploratory surgery.

Despite concerns about the United States putting its fiscal house in order, Treasury securities are seen as among the world’s safest, if not the safest, debt because they are backed by the full faith and credit of the United States government. No other investment carries as strong a guarantee that interest and principal will be paid in full and on time.

Responding to the tsunami of ridicule that greeted this absurd suggestion, Trump walked back his comments the following day, saying he never meant to suggest he wanted the United States to default on its debt.

Some perspective is in order here regarding who owns our nation’s debt. American stakeholders own nearly $13 trillion of the more than $19 trillion. More than $5 trillion is held by trust funds such as Social Security and the Highway Trust Fund; $5.1 trillion is held by individuals, pension funds and state and local governments; and the balance of $2.5 trillion is held by the Federal Reserve.

Of the remaining $6.2 trillion, China holds $1.3 trillion, followed by Japan with $1.1 trillion, and the $3.8 trillion that’s left is held by other countries such as Saudi Arabia, with $117 billion.

Foreign governments don’t own us; we owe us.
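A back-of-the-envelope tally of the figures above, sketched in Python, makes the point (the components are rounded, which is why they sum to slightly less than the $19 trillion-plus total):

```python
# Rounded figures from the paragraphs above, in trillions of dollars.
domestic = {"trust funds": 5.0,
            "individuals, pensions, state and local": 5.1,
            "Federal Reserve": 2.5}
foreign = {"China": 1.3, "Japan": 1.1, "all other countries": 3.8}

total = sum(domestic.values()) + sum(foreign.values())
share_home = sum(domestic.values()) / total

print(f"total tallied: ${total:.1f} trillion")  # ~$18.8 trillion after rounding
print(f"held at home: {share_home:.0%}")        # roughly two-thirds
```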

While nobody knows for certain what would happen, paying creditors anything less than the full amount owed would undermine the very notion of the full faith and credit guarantee behind United States sovereign debt. Americans whose savings and retirement accounts include Treasury bonds would be hurt. International investors would panic and raise future borrowing costs for the United States government by demanding higher interest rates, since the debt would be seen as a less safe investment. This would prompt interest rates around the globe, which are often tied to U.S. Treasuries, to spike. After all, U.S. Treasuries are the pillar of the global financial system.

Sadly, it’s time to toss in the towel, the tablecloth and the rest of the accoutrements and admit it: We got these candidates to this point; they are what the American public deserves.

Originally Published: May 28, 2016

Stock options for executives carry unintended consequences

If you are Rip Van Winkle awakening from a 20-year slumber, you might not know about America’s outrageous compensation for chief executive officers. But almost everyone else does. The flow of most income and wealth gains to the few highest earners comes at the expense of everyone else.

Let’s not forget that Americans’ real median incomes have been stagnant. Median annual U.S. household income reached $57,263 this past March but is still below the $57,342 median of January 2000, according to Sentier Research. Any wonder why Americans are angry?

In contrast, a recent AFL-CIO study found that heads of the Standard & Poor’s 500 companies are paid about 330 times as much, on average, as production and non-supervisory employees.

CEO compensation took off in the 1990s because activist shareholders, board members and academics, all mating like alley cats, pushed to better align management’s interests with those of shareholders. So corporations began to award stock options to senior managers.

Executive stock options have been a controversial topic for some time because of the fortunes executives have made under these programs. Stock options come in several forms. In the most common, executives granted stock options have the right but not the obligation to purchase shares of their company’s stock at a favorable set price within a specified time period.
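In sketch form, with hypothetical grant terms, the payoff is simple: the option is worth exercising only when the market price climbs above the set (strike) price, which is exactly what ties the executive’s fortune to the stock:

```python
# Hypothetical grant for illustration; ignores taxes, vesting, and fees.
OPTIONS_GRANTED = 100_000
STRIKE_PRICE = 40.00  # the favorable set purchase price

def exercise_profit(market_price: float) -> float:
    # The holder has the right but not the obligation to buy: below the
    # strike, the rational move is not to exercise, so the floor is zero.
    return OPTIONS_GRANTED * max(market_price - STRIKE_PRICE, 0.0)

print(exercise_profit(35.00))  # 0.0 -- under water, left unexercised
print(exercise_profit(65.00))  # 2500000.0 -- the incentive to lift the stock
```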

Stock options are often used in lieu of signing bonuses as a tool to attract talented executives. In theory, they also align shareholder and management interests. The idea is that granting stock options gives executives skin in the game and creates incentives for them to make decisions that lead to higher stock prices. Vesting periods for options give current managers incentives to remain with the firm.

While beneficial in some ways, this formulation has its downsides. It tempts executives to focus on the short term at the expense of long-term shareholder value. Let the next guy worry about the Ohio factory whose leaky roof should have been replaced years ago while management focuses on managing quarterly earnings figures to meet investor expectations and lift stock prices. Management may decide not to invest in research and development on projects whose payoff is down the road.

Recent revelations about Valeant Pharmaceuticals International offer a treasure trove of teachable moments. Conventional pharmaceutical companies spend about 20 percent of sales on R&D for new drugs. Valeant executives devoted only 3 percent to R&D. The firm also had to restate its 2014 and 2015 earnings because millions in sales had been recognized during the wrong period and an array of costs had been excluded, allowing it to report fantasy earnings of $2.74 a share when each Valeant share actually earned 14 cents.

But wait, there’s more. While CEO Michael Pearson received a base salary of $2 million, his executive pay was tied to Valeant’s stock price. He owned stock and options worth more than $3 billion, putting him on the Forbes billionaire list before the recent scandal crushed the stock.

Today’s business world is a playground for feckless conduct that pats you on the back for behaving badly. Maybe it is time to put an end to that by prohibiting hired gun managers from buying and selling stock in their companies, just like we bar professional athletes from betting on their own games. In lieu of stock options, give them big cash salaries plus generous bonuses linked to how profitable their companies are over several years as an incentive for them to manage for the long term.

When future historians look at whether stock options are an effective way to align the interests of managers and shareholders, they will ask some basic questions: Do they motivate executives to act in the best interest of shareholders? What costs do stock options impose on the company and its shareholders?

The answer is that they may indeed accomplish those things, but with a lot of unintended consequences.

Originally Published: May 14, 2016

The repeal of a Depression-era banking law and the economic crash of 2008

The causes of the 2008 financial crisis are multiple and complicated. Minor deities of finance and even presidential candidates such as Bernie Sanders argue over whether the repeal of the longstanding Glass-Steagall Act laid the groundwork for the financial meltdown. Those who don’t think it did overlook one major unintended consequence of repealing Glass-Steagall: the excessive use of leverage.

After the 1929 stock market crash and the onset of the Great Depression, Congress passed the iconic Glass-Steagall Act in 1933 to help ensure safer banking practices and restore faith in the financial system. Before the Great Depression, banks had engaged in imprudent stock speculation. In addition to traditional staid banking services such as taking in deposits and making loans, they also used depositors’ funds to engage in high-stakes gambling on Wall Street.

The act was passed to halt a wave of bank failures and rein in the excesses that contributed to the 1929 Crash. Among other things, Glass-Steagall separated the more stable consumer-oriented commercial banking from riskier investment banking and set up the bank deposit insurance system to protect small savers against bank failures. The business of accepting deposits and making loans was to be kept separate from underwriting and peddling stocks, bonds, and other securities.

The movement to deregulate the American economy began in the 1970s. It spread to air travel, railroads, electric power, telephone service and other industries, including banking. The sustained bull market of the 1990s supported arguments that financial markets could regulate themselves, and bankers lobbied Congress to further emancipate the financial sector.

Citicorp forced Congress’s hand in 1998 when it announced it would join forces with Travelers Group in a corporate merger. The $70 billion deal would bring together America’s second-largest commercial bank with a sprawling financial conglomerate that offered banking, insurance, and brokerage services. The proposed transaction violated portions of the Glass-Steagall Act, but the newly formed Citigroup obtained a temporary waiver, completed the merger, and then intensified the decades-old effort to repeal Glass-Steagall.

Just a year earlier, Travelers had become the country’s third largest brokerage house with its acquisition of the investment banking firm Salomon Brothers. Touting the pressures of technological change, diversification, globalization of the banking industry, and both individual and corporate customers’ desire for a “one-stop shop,” a financial supermarket, both firms lobbied hard for approval of the merger.

In 1999 a Republican Congress passed and a Democratic President signed the Gramm-Leach-Bliley Act, essentially repealing Glass-Steagall and removing regulatory barriers between commercial banks, investment banks, and insurers.

Advocates of the universal bank model argued that customers preferred to do all their business (life insurance, retail brokerage, retirement planning, checking accounts, mergers and acquisitions advisory, underwriting, and commercial lending) with one financial institution.

The universal bank created an uphill battle for the major investment banks like Lehman Brothers and Bear Stearns. For example, it was believed that the investment banking arms of universal banks would move into the lucrative securities underwriting business, using loans as bait to get the inside track on underwriting engagements, essentially using depositors’ money to drive investment banking fees.

As public companies, these investment banking firms faced pressure to deliver returns on equity comparable to those of the universal banks. To stay competitive, they resorted to excessive leverage, that is, borrowing to juice their returns.

In 2004 they received approval from the Securities and Exchange Commission to increase their leverage from 12-to-1 to better than 30-to-1. The numbers were indeed worrisome. For instance, Bear Stearns was leveraged 33-to-1, and before crashing in September 2008 Lehman Brothers had a 35-to-1 leverage ratio, meaning it held roughly $35 in assets for every dollar of capital.
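To see why those numbers were worrisome, consider a minimal sketch with stylized figures (not any firm’s actual books) of how 35-to-1 leverage magnifies both gains and losses on the firm’s own capital:

```python
# Stylized balance sheet for illustration; not any firm's actual books.
capital = 1.0       # $1 of the firm's own equity
leverage = 35.0     # $35 of assets supported by each $1 of capital
assets = capital * leverage

for move in (0.02, -0.03):  # a 2% gain, then a 3% loss on assets
    equity = capital + assets * move
    print(f"asset move {move:+.0%} -> equity ${equity:+.2f}")
# +2% -> equity $+1.70 (a 70% return on capital)
# -3% -> equity $-0.05 (capital wiped out; the firm is insolvent)
```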

By the winter of 2008, excessive leverage would ravage the investment banking industry, leading to the downfall, merger, or restructuring of all the major investment banking firms and unleashing a global recession. And the American taxpayer would learn that free markets are not free.

Originally Published: April 30, 2016

Free trade doesn’t work for most American workers

The aphorism “A rising tide lifts all boats” has become entwined with a basic assumption that free trade results in economic wins for all players in the global economy. Of course this assumes you are lucky enough to have a boat that has not run aground.

The classic case for free trade was made nearly 200 years ago by economist David Ricardo. This static argument relies on the principle of comparative advantage: trade enables countries to specialize in the goods and services they produce most efficiently relative to their trading partners. This increases overall productivity and total output.

The conclusion follows from countries having different opportunity costs of producing tradeable goods. The opportunity cost of any good is the other goods that could have been produced by the same resources. Each country focuses on what it does best and everyone gains. This notion of free trade has a hallowed status among the cheerleaders for globalization.

Another way to understand comparative advantage is to consider the opportunity cost of undertaking a certain activity. Let’s assume that Lady Gaga, the famous entertainer, also happens to be a world-class typist. Rather than entertaining and typing, she should specialize in entertaining, where her comparative advantage is greatest and she could maximize her income.

In this example, Lady Gaga has a much higher opportunity cost of typing than does her secretary. If Lady Gaga spent an hour typing while the secretary spent the hour running the business, there would be a loss of overall output.
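The same logic can be sketched for two countries with invented numbers, in the spirit of Ricardo’s original example: even when one country is absolutely better at producing everything, both gain when each specializes where its opportunity cost is lowest:

```python
# Invented labor costs (hours per unit), in the spirit of Ricardo's example.
# Country A is absolutely more efficient at both goods, yet trade still pays.
hours = {"A": {"cloth": 1, "wine": 2},
         "B": {"cloth": 4, "wine": 3}}

# Opportunity cost of one unit of wine, measured in cloth forgone.
for country, h in hours.items():
    print(f"{country}: wine costs {h['wine'] / h['cloth']:.2f} units of cloth")
# A: 2.00, B: 0.75 -- wine is relatively cheap in B, so B should make wine,
# A should make cloth, and both gain trading at any ratio between 0.75 and 2.
```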

The real world is much more complex. Free trade has a downside: while its benefits are broadly distributed, costs are often concentrated. Consider the case of American textile workers. In the aggregate, American consumers gain by having access to cheap clothing, but unemployed textile workers bear the loss.

Many free trade cheerleaders confuse it with offshoring jobs, which is simply substituting cheap foreign labor for more expensive American labor when nothing is in fact being traded. Moving production overseas has nothing to do with comparative advantage; it simply reflects wage and price competition from countries seeking jobs and economic growth.

If a firm shifts production to low-wage countries, its profits improve, driving up share prices and senior management performance bonuses. To paraphrase one-time presidential candidate Ross Perot: If you can build a factory overseas, pay about a dollar an hour, have little or no health care or retirement benefits and no environmental controls, then you are the greatest businessman in the world.

But when many firms move overseas, American workers lose their incomes. So when do the costs of lost incomes and forgone government revenues exceed the benefits to consumers of lower prices? Put differently, do the costs of exporting good-paying American jobs outweigh the gains from cheaper imports and contribute to a shrinking middle class?

Free trade advocates contend that the Americans left unemployed have acquired new skills and will find better jobs in “sunrise” industries. In reality, how many steelworkers do you know who have become computer software engineers?

This is one reason why Americans’ real incomes have stopped growing as manufacturing jobs have been moved offshore.

As then-presidential candidate Barack Obama said in 2008, “You go into these small towns in Pennsylvania and like a lot of small towns in the Midwest, the jobs have been gone now for over 25 years and nothing’s replaced them. And it’s not surprising, then they get bitter, they cling to guns, or religion or antipathy to people who aren’t like them or anti-immigrant sentiment or antitrade sentiment to explain their frustrations.”

A former General Motors CEO allegedly said “what is good for GM is good for America.” But offshoring challenges the conventional wisdom that American firms generally advance the nation’s economic interests. When they employ a large foreign workforce but few people within the United States, it certainly is good for the firms, but not for the American worker.

Originally Published: April 16, 2016