In Afghanistan, little to show for America’s longest war

It is the longest war in U.S. history, yet it hardly gets any attention. The public may be suffering from Afghan fatigue, especially when there is little to show for the expenditure of life and treasure.

It has been 15 years since the U.S. invaded Afghanistan to hunt down the architects of the 9/11 attacks. The invasion was an integral piece of President Bush’s hastily conceived “Global War on Terrorism.” By the end of 2001, American forces had toppled the Taliban government. Mission accomplished.

Afghanistan is a country of about 30 million people, approximately the size of Texas, nestled between Pakistan, Iran, and several former Soviet republics. Its location has made it a source of significant geopolitical interest and tension since the 19th century.

The country has historically been fraught with turbulence, characterized by chronic instability and repeated bouts of civil war. Since 329 B.C., when Alexander the Great came to Kabul, Afghanistan has been invaded by Arabs, Chinese, Mongols, Mughals, the British, and the Soviets. As scholars have written, Afghanistan is the graveyard of empires.

The Taliban took over in 1996. This fundamentalist Islamic group formed from remnants of the Mujahideen (holy warriors) who battled the Soviets for over a decade.

The United States has spent over $1 trillion on fighting and reconstruction, building an Afghan army, instilling western values in a land of warlords and tribal hostilities, and establishing a functioning democracy in a place that has never been remade by a foreign power. Over 2,200 American lives have been sacrificed.

According to the Special Inspector General for Afghanistan Reconstruction, about $115 billion has been spent to support Afghanistan relief and assist the government. Yet the World Bank reports that it remains one of the poorest countries in the world, and three quarters of the population is illiterate.

While the United States and NATO formally ended the war in Afghanistan on December 28, 2014, a force of 8,500 Americans remains to train and support the Afghan security forces.

Since the Taliban fell, some progress has arguably been made in opening up the country and expanding democratic freedoms, especially among women. However, lack of security still impedes development, and corruption remains a significant barrier to progress. The 2015 Corruption Perceptions Index ranks Afghanistan 166th out of the 168 countries monitored. Taliban insurgency is on the rise, drug trafficking flourishes, human rights abuses continue, the rule of law is weak. All the while the U.S. picks up the tab for the Afghan army and police, and continues to provide foreign aid.

Billions have been wasted on fruitless projects that are awash in corruption and have little government oversight, according to the Special Inspector General. This office has documented a greatest hits compilation of waste, fraud and abuse in U.S. government-sponsored programs.

Among the more egregious boondoggles was importing rare blond male Italian goats to mate with female Afghan goats and produce cashmere. The $6 million program included shipping nine male goats to western Afghanistan and setting up a farm, a laboratory, and a staff to certify their wool.

But the entire herd of female goats was wiped out by disease. As a result, only two of the imported Italian goats are still usable; it could not be confirmed whether the others were dead or alive.

Another baaad idea. You goat to be kidding.

That was not the only example of wasting American taxpayer money. The Pentagon’s Task Force for Business and Stability Operations spent nearly $150 million for employees to stay in private luxury villas with flat-screen TVs rather than bunking at military bases. Another $43 million was spent on a gas station that should have cost about $500,000.

America may have good intentions, but we know which road is paved with those. We will be mowing the grass in Afghanistan for the foreseeable future, like Sisyphus rolling his boulder up the hill for all eternity.

Originally Published: January 7, 2017

Make High Earners Save Social Security

In these days of presidential interregnum, the American public has seen newspapers and digital media filled with discussions of tax cuts, increased military and infrastructure spending, economic growth proposals, regulatory relief, immigration reform, repealing Obamacare, reducing the national debt, keeping deficits on a short leash, draining the swamp of political and economic favoritism and other domestic traumas.

Social Security, however, has received little attention. How the new administration will accomplish all these promises without yielding to the temptation to cut programs like Social Security is an open question. President-elect Trump, who enjoyed the support of working class Americans, promised during the campaign not to cut Social Security. Speaker Paul Ryan said he has no plans to change Social Security, although he has been outspoken on the need for entitlement reform.

Funny how a politician can forget campaign promises after election day. Loyalty appears to be paramount for these folks until all of a sudden it isn’t. Politicians all too frequently forget, to put it in the cant language of the ‘hood, that a deal is a deal.

Social Security is a promise to all eligible Americans that they will not live in abject poverty if they become disabled or when they grow old. But the Social Security trustees’ 2015 report finds that the trust fund has enough money to pay full benefits only until 2034. After that, it will collect enough in taxes to pay just 79 percent of benefits.

With the number of workers supporting each beneficiary falling rapidly, there will inevitably be calls for benefit cuts, higher taxes or both. But there is a better way.

Social Security is not an entitlement program; it is a “pay-as-you-go” system funded by the payroll tax. Companies and nearly 168 million working people pay into it to provide benefits to about 60 million beneficiaries. Each generation pays for current beneficiaries in return for a commitment that the next generation will do the same.

It is the backbone of retirement planning for millions of Americans. Almost a third of retirees receive practically all their retirement income from the system and about two thirds receive the majority of their retirement income from Social Security.

The top 100 CEOs, in contrast, have platinum pension plans. On average, their massive nest eggs are large enough to generate about $253,000 in monthly retirement payments for the rest of their lives. Heaven for them, hell for the ordinary American worker.

Dealing with the coming Social Security funding crisis by raising the payroll tax rate would place a significant burden on low-wage workers, especially when the Federal Reserve has kept interest rates so low that savings accounts yield next to nothing, forcing baby boomers to work longer and retirees to rely even more on Social Security income.

An alternative that merits serious consideration is to increase the ceiling on annual wages subject to Social Security payroll taxes, currently $118,500. All earnings above that amount are exempt from the tax, meaning that 94 percent of Americans pay Social Security tax on all their earnings but the wealthiest 6 percent do not.

Expanding the payroll tax to cover all earnings above $118,500 would largely wipe out the funding problem. According to Social Security’s actuaries, it would keep the trust fund solvent for the next 45 years.
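To make the arithmetic concrete, here is a minimal sketch in Python. It is an illustration only, assuming the standard 6.2 percent employee-side payroll tax rate along with the $118,500 cap cited above; the sample wage figures are hypothetical.

# Sketch of how the Social Security wage cap works today versus with
# the cap removed. Assumes the standard 6.2% employee-side rate; the
# sample wages below are purely illustrative.

EMPLOYEE_RATE = 0.062
WAGE_CAP = 118_500

def payroll_tax(wages, cap=WAGE_CAP):
    """Employee-side Social Security tax, with an optional wage cap."""
    taxable = wages if cap is None else min(wages, cap)
    return EMPLOYEE_RATE * taxable

for wages in (50_000, 118_500, 1_000_000):
    with_cap = payroll_tax(wages)
    no_cap = payroll_tax(wages, cap=None)
    print(f"wages ${wages:>9,}: tax with cap ${with_cap:>8,.0f} "
          f"({with_cap / wages:.1%} of wages), "
          f"tax with no cap ${no_cap:>8,.0f}")

The point of the exercise is the effective rate: a worker earning $50,000 pays 6.2 percent on every dollar, while a $1 million earner pays the tax on only the first $118,500, about 0.7 percent of wages, which is the regressive feature the proposal above would remove.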

Since Social Security began, the need for retirement income has risen as life expectancy has increased by 17 years. Those gains have been greatest among top earners, who need Social Security the least and whose jobs are less physically demanding than those of construction workers, janitors and the like.

Political leaders have time to decide how to address Social Security’s long-term funding problems. As they contemplate potential solutions, they should consider expanding the payroll tax to include all earnings. It’s a fair way to rescue the program from financial limbo and provide lasting stability without taking draconian measures that would harm tens of millions of hard-working Americans.

Originally Published: December 23, 2016

Infrastructure Yes, But Not Just Any Infrastructure

The American public is routinely bombarded with messages about the need to spend vast sums of money on infrastructure, messages that drum the subject into the public consciousness and promote a well-rehearsed catalog of new capital projects. The need is indeed great, but so is the importance of spending wisely. That means emphasizing the lifecycle management of infrastructure assets.

President-elect Trump says his plan to spend $1 trillion on infrastructure projects over 10 years would be paid for by leveraging public-private partnerships and encouraging private investment through tax incentives. Infrastructure spending is a priority Trump shares with congressional Democrats, who have said they believe they can work with him on the Augean task of renewing America’s infrastructure.

What is often overlooked is that infrastructure spending is not just about new construction, but also about maintaining existing assets. Timely lifecycle management and maintenance keep infrastructure assets in a state of good repair, extend their service lives, and significantly reduce overall costs. The rationale is to avoid the high cost of reconstruction and replacement that results from deferred maintenance.

Political leaders frequently say that stewardship of infrastructure assets is essential for economic growth. But the evidence suggests that many of them don’t believe it. They are predisposed to defer maintenance because their concept of the future extends no further than the next election cycle, and the timeframe for infrastructure assets to show the effects of irreversible deferred maintenance is much longer than their likely terms in office.

Consider, for example, that large sections of the Washington D.C. transit system are out of service because maintenance has been shortchanged over decades. Service quality declines substantially when maintenance is deferred. Here in Boston, MBTA maintenance has been underfunded for so long that it will take years to eliminate a $7.3 billion maintenance backlog even though the T plans to devote $870 million to the cause this year.

Also, public officials all too frequently understate the true costs of infrastructure projects by focusing on what they cost to build and ignoring operation and maintenance.

Another factor contributing to the failure to maintain infrastructure assets is that highway funding arrangements, for example, traditionally favored capital expenditures for new construction. As originally established by Congress, the Federal-Aid Highway Funding Program specified that Federal Trust Fund grants would cover up to 80 percent of the cost of new construction and subsequent reconstruction or replacement.

But state and local governments had to bear operating and maintenance costs. When highway links inevitably wore out before their time, state and local governments only had to worry about coming up with 20 percent of the total sum from their capital budgets since federal construction grants covered the rest, so maintenance was not a priority.

All but forgotten in this dubious calculus were costs incurred by motorists who had to struggle with increasingly decrepit highways, as well as plenty of congestion when highway lanes were closed for restoration; an inconvenience that, as any driver knows, always lasts much longer than advertised.

Although later reauthorization bills made federal funds available for rehabilitation, renewal, and reconstruction at levels comparable to new construction, the damage had already been done. Today the advanced deterioration of the nation’s highway system is testament to the consequences of deferred maintenance.

The price tag for renewing America’s infrastructure is astronomical, and comes at a time when a federal funding regime dependent on insufficient fuel tax revenues is least able to afford escalating construction and maintenance costs.

Going forward with a big infrastructure package, and setting aside for the moment the issue of finding the cash to pay for it, the emphasis needs to be on the lifecycle management of infrastructure assets. The health care industry understands that it is far less expensive to keep a patient well than to treat one who has become sick; the same is true for our nation’s infrastructure.

Originally Published: December 10, 2016

Civil and military success depend on developing and adapting strategy

Developing strategy is too often thought of as a by-the-book, one-shot undertaking to provide managers with a comprehensive roadmap that is supposed to cover all eventualities. But in the real world, this is scarcely the case.

Instead, developing an effective strategy is a relatively messy process that involves evaluating everything we know about the external environment at any given time, designing a realistic way to achieve long-term goals, constantly monitoring for changes in the environment, and revising strategies as they are being executed to take such changes into account. Strategy must reflect reality, not what you think the world ought to be like.

With proposals to invest in transportation and other infrastructure currently making headlines, military history provides essential background for those attempting to develop effective strategies for such large undertakings. Without this background, they’re like techno-wannabes trying to do engineering without having studied physics.

As the United States approaches the 75th anniversary of Japan’s surprise attack on Pearl Harbor, we should remember lessons the military has taught us: How to properly develop a strategy, why it must be regarded as an ongoing process, and how it must respect changing realities.

Just before 8 a.m. on Dec. 7, 1941, hundreds of Japanese fighter planes attacked the American naval base at Pearl Harbor near Honolulu, Hawaii, killing more than 2,000 and wounding another 1,000. Sixteen battleships, cruisers, and other warships were sunk or disabled in the attack, but all-important fuel storage and ship repair facilities were left untouched. This omission allowed Pearl Harbor to continue as a forward base for American naval power in the Pacific.

When President Roosevelt delivered his “Day of Infamy” speech asking Congress to declare war on Japan the next day, the federal government already had a detailed game plan for defeating Japan in the Pacific. It was known as War Plan Orange and had been under development by the U.S. Navy since 1905.

The Navy began this effort and carried it forward in response to growing awareness that the U.S. acquisition of the Philippines during the Spanish-American War was likely to create conflicts with Japan in the western Pacific that could eventually lead to war.

By 1941 War Plan Orange had undergone many revisions and updates to reflect changing political and tactical realities such as the emergence of the aircraft carrier as a naval weapons system that had the potential to become as important as the battleship.

The game plan contained extensive detail about the numbers and types of fighting personnel that would be required to carry out the strategy, and how to recruit, organize and train them. Finally, it detailed the types and quantities of weapons and equipment that would be needed, how to produce them, what kinds and quantities of raw materials their production would require and how and where to allocate them in the theater of war for maximum effect.

It was all there in black and white. And as history demonstrated, events closely followed the plan: War Plan Orange was indeed the blueprint for the campaigns that eventually defeated Japan in 1945.

War Plan Orange guided the U.S. to victory over Japan less than four years after Pearl Harbor. This was less than half the time the U.S. spent in Vietnam, and far shorter than the Iraq and Afghanistan wars. It began as a sound strategy and was flexible enough to roll with the punches from events that strategists were unable to anticipate.

Clearly, the United States needs this kind of strategic focus at all levels of government if efforts to address major domestic and foreign policy issues are to succeed. Otherwise the country risks missing worthwhile opportunities, doing new projects and programs without proper coordination, and spending a lot of money just to make things worse.

As a new administration comes into power, it would be wise to recall that, as former President Eisenhower wisely remarked, “Plans may be irrelevant, but planning is essential.”

Originally Published: November 26, 2016

Infrastructure spending must look forward

Many economists and politicians are once again peddling the conceit that billions of dollars in infrastructure spending (aka investment) will create new jobs, raise incomes, boost productivity and promote economic growth. After all, a report card from the American Society of Civil Engineers gave America’s infrastructure a D+ grade and claimed that an investment of $3.6 trillion is needed by 2020.

But before we accept this idea as gospel, we should remember that the future isn’t likely to look like the past.

Americans are reminded that a large part of President Roosevelt’s New Deal to “Save Capitalism in America” was massive federal investments in economic growth projects like rural electrification, the Tennessee Valley Authority, the Boulder and Grand Coulee Dams, and other monumental hydroelectric generating facilities. Not to mention hundreds of commercial airports like LaGuardia and JFK in New York City, and thousands of modern post offices, schools and courthouses.

The investments culminated in the 41,000-mile Interstate and Defense Highway System, begun in the 1950s under President Eisenhower (the “Republican New Dealer”) because of what he had learned from his military experiences leading the Allied armies in Europe during World War II.

It is further claimed that Americans have been living off these federal investments ever since. Their contribution to decades of job growth and increasing national prosperity has been so enormous that Americans have come to take them for granted as cost-free gifts from a beneficent God, like the unimaginably bountiful resources of crude oil discovered under that legendary East Texas hill called Spindletop, which came exploding out of the Lucas Number 1 well in 1901 with a roar that shook the world.

The $828 billion stimulus plan President Obama signed in 2009 focused on “shovel-ready” projects like repaving potholed highways and making overdue bridge repairs that could put people to work right away. Still, as Gary Johnson noted in 2011, “My neighbor’s two dogs have created more shovel-ready jobs than the Obama stimulus plan.”

Let’s not kid ourselves, spending for these projects scarcely represented “investment in the future.” Had we been managing infrastructure assets sensibly, they would have been little more than ongoing maintenance activities that should have been funded out of current revenues, like replacing burned-out light bulbs in a factory.

One problem with initiating a massive new capital investment program is figuring out where the dollars to fund it will be found. Projections for escalating federal deficits and skyrocketing debt are bound to raise questions about the federal government’s ability to come up with the necessary cash.

For starters, it’s time to recognize that the future will be quite different from the past, particularly when it comes to transportation infrastructure. Large projects may be rendered obsolete and the burden of stranded fixed costs left to the next generation.

Disruptive technologies such as electric and hybrid, semi-autonomous and self-driving vehicles, along with changing consumer preferences, especially among urban millennials who are more interested in the on-demand riding experience than in driving, are cause for optimism about the future condition of America’s infrastructure.

These new patterns of vehicle ownership and use and the emergence of privately funded technologies are changing the way people and goods move, and transforming the transportation industry in both the public and private sectors. They offer the potential for dramatic improvements in traffic congestion (due to improved safety and reduced spacing between vehicles) and reducing motor vehicle accidents and fatalities.

They can also generate environmental gains from smoother traffic flow, promote productivity growth as reduced congestion improves access to labor markets, and improve the utilization of transportation assets such as existing highway capacity by delivering higher throughput without additional capital investment.

These changes create an opportunity for a new generation of political leaders to present the public with a modern vision for transportation, the economy, and the environment, not one that harks back to an earlier time.

Originally Published: Nov 12, 2016

Fraud just another way bankers operate

Once upon a time, the “F” word (fraud) was in vogue when dealing with the U.S. banking community. After the savings and loan scandals of the 1980s, more than 1,100 bankers were prosecuted on felony charges and over 800 sent to prison for white-collar crimes, including top executives at many of the largest failed banks. By throwing the savings and loan bankers in jail, the federal government sent a message: if you rip people off, you will pay for it.

No more. The federal government’s response to the 2008 financial crisis could not have been more different from its response to the savings and loan crisis. The Justice Department has taken the position that these cases are too hard to win, and that criminal charges against some large banks could threaten their existence and endanger the economy. This “collateral consequences” approach basically gives too-big-to-fail banks and their senior executives a get-out-of-jail-free card.

After the man-made 2008 financial meltdown that left millions of Americans jobless and led to a $700 billion taxpayer bailout that dwarfed the savings and loan crisis, not one Wall Street executive went to jail for the events leading up to the crisis.

There were no high-profile big banker prosecutions for the widespread mortgage fraud and financial chicanery that fueled the bubble. These bankers were too big to jail.

Sure, there were prosecutions of small fish like mortgage brokers and loan officers, which is fine if you believe the fraud took place at the bottom of the food chain.

There were billions of dollars in civil settlements but no serious criminal prosecutions.

The notion of accountability is becoming an endangered species. Regulators still treat the banking industry with velvet gloves. Standard fare involves a firm paying a big fine with shareholder money and treating it as a corporate expense that in certain cases is tax deductible. The company promises never to commit such a crime again and in the final analysis it is just a cost of doing business.

For example, Wells Fargo got its rear-end in a sling when it was revealed that since 2011, thousands of employees had secretly opened more than two million bogus bank and credit card accounts using customers’ names and signatures without authorization.

The fraud was so common that employees had a name for it: sandbagging. The firm fired 5,300 employees involved in the scandal who were trying to hit steep sales targets and refunded $2.6 million in customer fees.

Here again, the government got its pound of flesh in fines rather than by prosecuting wrongdoers. WFC was fined $100 million by the federal Consumer Financial Protection Bureau, $35 million by the Office of the Comptroller of the Currency, and $50 million by the city and county of Los Angeles. The $185 million total amounts to three days of profit for the bank. Last week the California attorney general’s office announced it is conducting a criminal investigation into whether employees at San Francisco-based Wells Fargo committed identity theft in violation of state law during the sales practice scandal.

Also, the U.S. Department of Labor is promising a “top-to-bottom” review of the firm.

It is unclear whether the investigation will focus on employees at the bottom of the food chain or senior executives, the banking industry’s untouchables. But if recent history is any guide, the biggest fish face little risk of prosecution. They may have created the cross-selling practices but were not the ones creating the fake accounts.

There is no mystery here, as the American public continues to watch ordinary citizens turned into a veritable basket of deplorables and jailed for minor offenses while the most powerful walk away with complete impunity.

As Cassius says in “Julius Caesar,” “The fault, dear Brutus, is not in our stars, but in ourselves.”

Originally Published: Oct 30, 2016

Give shareholders more say in picking directors

The recent Wells Fargo fraud scandal over its sales practices has once again placed corporate governance in the public eye. While the firm has agreed to pay $185 million in fines for opening 1.5 million bank accounts and 565,000 credit card accounts without customers’ permission over the past five years, it has not admitted to misconduct in defrauding its own customers.

The notion of accountability is becoming an endangered species. One of the reasons is how common it is for a single person to serve as CEO and chair a corporation’s board of directors.

Last week, with public criticism spreading, class-action shareholder lawsuits mounting, a U.S. Labor Department review underway, California and Illinois moving to stop doing business with the firm, and bipartisan pressure coming from lawmakers, the board of directors of Wells Fargo, the country’s third largest bank, finally took action. They announced that Chairman and CEO John Stumpf would forfeit about $41 million in stock awards, forgo his salary during the inquiry, and receive no bonus for 2016.

They also announced the immediate retirement of Ms. Carrie L. Tolstedt, the former senior executive vice president of community banking who ran the unit where the bogus customer accounts were created. She will forfeit $19 million in stock grants and receive neither a bonus this year nor a severance package. Tolstedt should still be able to squeak by; she made $9 million in total compensation and her accumulated stock is supposedly worth over $100 million. If one is going to hell, then best to go first-class.

Then this past Wednesday, John Stumpf abruptly announced his retirement.

Corporate boards are supposed to be the centerpiece of corporate governance, and having one person hold both the CEO and chairman roles creates an inherent conflict of interest. Key among the board’s roles, for instance, is selecting, overseeing, and evaluating the CEO. How can that happen when the CEO heads the board?

In the United States, it is estimated that half of public companies have one person serve as both CEO and chairman. Boards need real independence to exercise real oversight and that starts with the chairman. There could be no better time to get real and make a case for unbundling the two positions.

In recent years, public confidence in board independence has been undermined by an array of scandals: fraud, accounting restatements, options backdating, and CEO compensation abuses. These issues have highlighted the need for boards to be fully independent and free of conflicts to protect shareholder interests.

For this reason, separating the chairman position from that of the CEO is the model of corporate governance in most European countries. More than 90 percent of the Financial Times Stock Exchange 100 companies have long had distinct roles for the CEO and the chairman. The goal is to keep the two roles separate in the interests of proper and strong oversight of corporate activities, including the firm’s culture and CEO evaluation and compensation, not to mention keeping the CEO’s XXL ego in check.

Boards of directors are designed to represent shareholders and provide a critical check and balance on corporate management. Still, it must be acknowledged that board directors have little individual accountability to shareholders because the latter have little influence on who serves on corporate boards.

Prior to the annual shareholders’ meeting, the board proposes a slate of nominees; candidates are typically nominated by the board itself, not by shareholders. When new members join, they join a board with already established norms.

Instead of merely being able to vote for or against directors nominated by the board’s nominating committee, the time has come to allow shareholders to nominate board members.

The recent Wells Fargo scandal is just the latest reminder that splitting the roles of CEO and chairman of the board is the place to begin building independent boards that will deliver better oversight, checks and balances, transparency, and disclosure.

Originally Published: Oct 14, 2016

Wells Fargo scandal highlights failure to hold corporations accountable

Is corporate accountability, like virginity past the age of 16, a dead letter? Unfortunately, based on the latest banking shenanigans, the answer seems to be yes.

On Sept. 8, Wells Fargo, the country’s third largest bank with $1.9 trillion in assets, which has portrayed itself as a bank for Main Street, became the latest to experience a major scandal after it agreed to pay $185 million in fines over the “widespread illegal practice” of opening unwanted accounts to meet sales targets and reap compensation incentives.

Fines included $100 million to the Consumer Financial Protection Bureau, $35 million to the Office of the Comptroller of the Currency and $50 million to the city and county of Los Angeles. Put in context, these fines amount to a rounding error for WFC, which earned $5.6 billion just in the second quarter this year.

The firm also agreed to pay $5 million to customers who incurred fees on the ghost accounts. That works out to an average of about $25 per customer.

As usual, the firm did not admit wrongdoing, despite acknowledging that it has fired roughly 5,300 employees, or about 1 percent of its workforce, over the past five years for fraudulently opening up to 2 million fake fee-generating accounts, for products like credit and debit cards and checking and savings accounts, for unsuspecting customers. By creating these sham accounts, the firm ripped off customers, who paid overdraft and late fees on credit cards and deposit accounts they never asked for.

An aggressive sales culture that includes cross-selling, or getting customers to open multiple deposit, mortgage, and investment accounts, has been a hallmark of WFC’s strategy for years. The bank explicitly cites it as a key strategic goal in its 2015 Annual Report. The policy once again proves that you get the behavior you reward.

You don’t need a PhD to know that other “too big to fail” banks are likely engaging in the same aggressive sales practices to make their numbers. After all, it is the promise of increased pay that keeps the engine running.

The bank said of its settlement: “Wells Fargo reached these agreements consistent with our commitment to customers and in the interests of putting this matter behind us.”

But the executive in charge of WFC’s community banking operations made $9 million in total compensation last year and was set to walk away with an even bigger payday when she retired at age 56 at the end of the year. Her payout had been pegged at $124.6 million in a mixture of shares, options, and restricted stock. Under pressure from lawmakers and others, however, the firm’s board of directors said this week that she will forfeit $19 million of her stock awards immediately.

Also, the board announced that WFC Chairman and CEO John Stumpf, who has led the bank since 2007 and made $19.3 million in 2015, will forfeit $41 million in stock awards and be ineligible for a bonus this year. He has defended the bank’s cross-selling strategy, saying it promotes “deep relationships” and helps customers. And he turned away calls for a clawback of executive compensation when testifying before the Senate last week, punting the question to the board.

While investor support of Wells Fargo continues to deteriorate, Warren Buffett, the bank’s biggest shareholder with 10 percent of its stock, has stayed mum on the scandal.

For all the media attention given to accountability abuses and the continuing debate over whether regulators are doing enough to hold firms accountable, the American public has grown numb to scandalous behavior in the financial community and knows the government won’t punish the perpetrators.

Not all employees are subject to the same standard. Once again, senior executives are granted greater latitude to violate the rules; none of them have lost their jobs. Sadly, it is a truism that accountability rolls downhill in the corporate hierarchy. It’s all very now.

Originally Published: Sep 30, 2016

AIG’s $180 billion bailout still stings

Eight years ago this month the global financial system seemed on the verge of collapse, and its rescue led to the greatest depredation on the public purse in American history. There were many crucial events during the month of the long knives, but no corporation was more central to the mess than AIG.

On Sept. 7, 2008, the federal government took control of Freddie Mac and Fannie Mae, pledging up to $100 billion for each to ensure the troubled mortgage giants could pay their debts. On September 15, Lehman Brothers announced it would file for Chapter 11 bankruptcy and Bank of America acquired Merrill Lynch for $50 billion.

On Sept. 21, the Federal Reserve approved Morgan Stanley’s and Goldman Sachs’ transition to bank holding companies. Then, in the biggest bank failure in U.S. history, the Federal Deposit Insurance Corporation (FDIC) seized the assets of Washington Mutual, the sixth largest U.S. bank, and JPMorgan acquired the bank’s deposits, assets, and troubled mortgage portfolio from the FDIC.

While many blamed the investment banks for high leverage, bad risk management and overreliance on faulty internal models, not to be overlooked is the role AIG, the nation’s largest insurance company, played in the crisis. AIG was once one of the largest and most profitable companies in corporate America, with a gold-plated “AAA” credit rating.

But on September 16, the federal government provided an initial $85 billion in taxpayer cash to bail out the firm. In return, AIG became a ward of Uncle Sam, which acquired 79.9 percent ownership of the company. This was only the first of four bailouts for AIG, totaling an estimated $180 billion.

AIG was in worse shape than Lehman Brothers had been. Yet unlike Lehman, the feds chose to save it. The explanation: AIG was regarded as too big, too global, and too interconnected to fail.

After the 1999 repeal of Glass-Steagall, the law that had regulated financial markets for over six decades, President Clinton signed the Commodity Futures Modernization Act (CFMA) in 2000, which effectively removed derivatives such as Credit Default Swaps (CDS) from federal and state regulation, proving once again that regulators exist to protect the interests of the regulated.

CDS are essentially a bet on whether a company will default on its bonds and loans. AIG was a huge player in the CDS business, which allowed the firm to insure asset-backed securities containing sub-prime mortgages against default.

Although swaps behave similarly to insurance policies, they were not covered by the same regulations as insurance after passage of the CFMA. When an insurance company sells a policy, it is required to set aside a reserve in case of a loss on the insured object. But since credit default swaps were not classified as insurance contracts, there was no such requirement.

AIG’s CDS business caused it serious financial difficulties in 2007, when the housing bubble burst, home values dropped and holders of sub-prime mortgages defaulted on their loans. By selling these contracts without reserves, the firm left itself unprotected if the assets insured by the swaps defaulted. AIG had insured bonds whose repayments were dependent on sub-prime mortgage payments. Yet it never bothered to put money aside to pay claims, leaving the company without sufficient resources to make good on the insurance.
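A minimal sketch in Python illustrates why writing this kind of protection without reserves is so dangerous. The notional amount, premium, default rates, and recovery rate below are invented for illustration; this is not a model of AIG’s actual book.

# Toy model of a credit-default-swap protection seller. All figures
# are hypothetical; the point is the structure, not the numbers.

def seller_outcome(notional, premium_rate, default_rate, recovery, reserve_rate):
    """Net position of a protection seller after one year."""
    premiums = notional * premium_rate                   # income from selling protection
    payouts = notional * default_rate * (1 - recovery)   # claims on defaulted bonds
    reserves = notional * reserve_rate                   # capital set aside up front
    return premiums + reserves - payouts

NOTIONAL = 100_000_000  # $100 million of bonds insured

# A calm year: defaults are rare and the premiums look like free money.
print(seller_outcome(NOTIONAL, premium_rate=0.01, default_rate=0.005,
                     recovery=0.4, reserve_rate=0.0))   # about  +$700,000

# A stress year with no reserves: payouts dwarf premium income.
print(seller_outcome(NOTIONAL, premium_rate=0.01, default_rate=0.20,
                     recovery=0.4, reserve_rate=0.0))   # about -$11,000,000

# The same stress year with an insurance-style reserve set aside.
print(seller_outcome(NOTIONAL, premium_rate=0.01, default_rate=0.20,
                     recovery=0.4, reserve_rate=0.15))  # about  +$4,000,000

Premium income is small and steady, losses are rare but enormous, and without something set aside in advance the seller has nothing to pay claims with, which is roughly the position AIG found itself in when the mortgage market turned.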

Taxpayers stepped in to pay in full the dozens of banks whose financial products were insured with AIG swaps. Unlike in corporate bankruptcies, none of these counterparties were forced to take a haircut, requiring the government to pump more public money into the banks.

To add insult to injury, two weeks after the government provided its fourth bailout to AIG in 2009, it was revealed that the firm was paying $165 million in bonuses to retain key employees to unwind the toxic financial waste. Most people understand that if you go to government for a handout, executives should forgo bonuses. Then again, so much of what happened eight years ago this month defied common sense.

Originally Published: Sep 16, 2016

The merger that hurt

Why the demise of Glass-Steagall helped trigger the 2008 financial meltdown that cost millions of Americans their jobs, homes and savings

This month is the eighth anniversary of the all-enveloping 2008 financial crisis. Wall Street apologists and many of their Washington, D.C., acolytes argue there is zero evidence that the takedown of the Glass-Steagall Act had anything to do with the meltdown, but the assertion ignores the role the law of unintended consequences played in the crisis.

Glass-Steagall was enacted during the Great Depression to separate Main Street from Wall Street, creating a firewall between consumer-oriented commercial banks and riskier, more speculative investment banks. During the six-plus decades the law was in effect, there were few large bank failures and no financial panics comparable to what happened in 2008.

In the 1980s, Sandy Weill, one of the godfathers of modern finance, began acquiring control of various banks, insurance companies, brokerage firms and similar financial institutions. These were cobbled together into a conglomerate under the umbrella of a publicly traded insurance company known as Travelers Group.

In 1998 Weill proposed a $70 billion merger with Citicorp, America’s second-largest commercial bank. It would be the biggest corporate merger in American history and create the world’s largest one-stop financial services institution.

Touting the need to remain competitive in a globalized industry and customers’ desire for a “one-stop shop” (a supermarket bank), both companies lobbied hard for regulatory approval of the merger. Advocates argued that customers preferred to do all their business - life insurance, credit cards, mortgages, retail brokerage, retirement planning, checking accounts, commercial banking, and securities underwriting and trading - with one financial institution.

But the merger’s one-stop-shopping approach would make a mockery of the Glass-Steagall firewall. The proposed transaction violated its prohibition of combining a depository institution, such as a bank holding company, with other financial companies, such as investment banks and brokerage houses.

Citigroup successfully obtained a temporary waiver for the violation, then intensified decades-old efforts to eliminate the last vestige of Depression-era financial market regulation so it could complete the merger. A Republican Congress passed the Financial Services Modernization Act and President Clinton signed it in November 1999. It permitted insurance companies, investment banks, and commercial banks to combine and compete across products and markets, hammering the final nail into the coffin of Glass-Steagall.

Now liberated, the banking industry embarked upon a decade of concentrating financial power in fewer and fewer hands. Acquisitions of investment banks by commercial banks, such as FleetBoston buying Robertson Stephens and Bank of America buying Montgomery Securities, became commonplace.

Traditional investment banks suddenly faced competition from publicly traded commercial banks with huge reserves of federally insured deposits. The investment banks faced pressure to deliver returns on equity comparable to those of the new financial supermarkets, which also put competitive pressure on traditional investment banking businesses such as mergers and acquisitions, underwriting, and sales and trading.

In response, the investment banks sought to raise their leverage limits so they could borrow more money to engage in proprietary, speculative trading activities. In 2005 they convinced the Securities and Exchange Commission to relax the “net capital” rule, raising the ratio of debt to equity these firms could carry from 12-to-1 to 30-to-1, meaning the banks could borrow 30 dollars for every dollar of equity they held.
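A back-of-the-envelope sketch in Python shows why that change mattered. Only the 12-to-1 and 30-to-1 ratios come from the paragraph above; the equity figure and the 5 percent drop in asset values are hypothetical.

# How leverage magnifies losses. Only the 12:1 and 30:1 ratios come
# from the text; the equity and the 5% asset-value decline are
# hypothetical numbers chosen for illustration.

def equity_wiped_out(equity, debt_per_dollar_of_equity, asset_decline):
    """Fraction of shareholder equity lost when asset values fall."""
    assets = equity * (1 + debt_per_dollar_of_equity)  # own money plus borrowed money
    loss = assets * asset_decline
    return loss / equity

EQUITY = 10_000_000_000  # $10 billion of shareholder equity, for illustration
for leverage in (12, 30):
    hit = equity_wiped_out(EQUITY, leverage, asset_decline=0.05)
    print(f"{leverage}-to-1 leverage, 5% asset decline -> {hit:.0%} of equity gone")

At 12-to-1, a 5 percent drop in asset values burns through roughly 65 percent of the equity cushion; at 30-to-1, the same drop wipes out more than the firm’s entire equity, which is essentially the fate that befell the investment banks in 2008.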

By 2008, increased leverage and speculation on toxic assets would ravage investment banking, leading to the collapse, merger, or restructuring of all five major Wall Street investment banks. During a six-month period, Bear Stearns collapsed into the arms of JP Morgan, Lehman Brothers filed for bankruptcy protection, Merrill Lynch merged into Bank of America, and Goldman Sachs and Morgan Stanley converted to bank holding companies, giving them access to precious short-term funds from the Federal Reserve’s discount window.

The demise of Glass-Steagall may not have been at the heart of the 2008 financial crisis, but it certainly contributed to the lunacy of financial deregulation. Had the law not been neutered, it would have lessened the depth and breadth of the crisis that cost millions of Americans their jobs, homes and savings.

Originally Published: Sep 3, 2016