Sausage making and the President’s Build Back Better legislation

The legislative process is rarely pretty in the best of times, never mind in times like these.  Many people console themselves with this reality by quoting Otto von Bismarck, the pragmatic Prussian politician who, among other things, was the first chancellor of the German Empire from 1871 to 1890.

He is often erroneously quoted as saying “Laws are like sausages.  It is best not to see them being made.”  There has been a lot of sausage making going on in full view at the White House and in Congress over the last several months on the President’s Build Back Better legislation.

When a big bill makes its way through Congress, it highlights political divisions and can seem disconnected from the average American’s life. The Biden administration’s quest for a legislatively viable version of its Build Back Better agenda is an example.

Several of the administration’s promises have been abandoned in the new package, such as free community college and instituting a clean electricity standard with penalties for utilities that don’t comply.  Senator Joe Manchin, D-West Virginia, kneecapped the provision to retire coal and natural gas plants.

Other programs that were initially going to be permanent will instead be set to expire in a year, two, or five, like the expanded child tax credit and the expansion of Medicaid in the 12 states that have not already adopted it.  It merits noting that once entitlement programs are established, they are famously difficult to repeal.

Still, the $1.75 trillion package contains a wide-ranging set of programs such as universal preschool for all 3- and 4-year-olds, subsidized child care that caps what parents pay at 7% of their income, expanded Medicare that covers hearing benefits, and expanded tax credits for 10 years for utility and residential clean energy to reduce pollution, including electric vehicles.  Also notable is that although an overwhelming majority of Americans favor government action like Medicare negotiating with drug companies to reduce drug prices, that policy is not in the proposed legislation.

While the White House claims the legislation would not add to the deficit because of tax increases on corporations and the affluent, finding the taxes to pay for this package is proving difficult.  For example, Sen. Kyrsten Sinema, D-Arizona, is opposed to increasing the corporate tax to 25% or 26% and to raising personal income tax rates. The progressive wing of the Democratic Party is now proposing an annual tax on billionaires’ unrealized capital gains, taxing the appreciation of stocks that have not been sold and have produced no income.

According to an analysis from the University of Pennsylvania’s Wharton School of Business, the proposed new taxes and tax increases to pay for the $1.75 trillion bill would raise nearly $470 billion less than the White House claims.

With the President out of the country, Democrats are arguing among themselves over the details of the legislation.  House progressives are adamant about requiring the bill to be a done deal before they will vote for the $1.2 trillion bipartisan infrastructure bill that has been passed by the Senate because they don’t trust moderate Democrats to keep their word.

As the late, great New York Yankee catcher Yogi Berra said: “It ain’t over till it’s over.” So, the public sausage making, also known as lawmaking, will continue on Capitol Hill over the President’s Build Back Better legislation. As always, the devil is in the details.

Irresponsible behavior on immigration reform

President Trump was hoping to mark his first anniversary in office at his Mar-a-Lago estate in Florida, but then the federal government shut down for 69 hours. The high-stakes game of chicken that began Jan. 20 ended when Democrats and Republicans in the Senate reluctantly came to an agreement that will keep the federal government paying its bills until Feb. 8.

Unable to pass a federal budget for the fiscal year that began Oct. 1, Congress has repeatedly resorted to these “continuing resolutions.”

The latest stalemate ended when Senate Democrats woke up, smelled the coffee, and relented on their demand for immigration reform in return for assurances from Majority Leader Mitch McConnell that the Senate will consider immigration proposals in the coming weeks and take up the plight of Deferred Action for Childhood Arrivals recipients, often referred to as “Dreamers.”

Poll after poll has shown that most Americans want the Dreamers, who were brought to the United States illegally as children, protected. But a recent CNN poll also showed that when given a choice between keeping the federal government open and passing DACA legislation, most said they don’t want the government to shut down.

Americans understand that attracting hard-working legal immigrants has been an important reason for the nation’s prosperity. They also understand that promised entitlements like Social Security won’t be around in a few decades unless we have more workers paying into them.

President Obama introduced DACA in 2012 as a stopgap measure to avoid deportations. President Trump rescinded Obama’s executive order creating the program last September, but delayed implementation until March 2018 to give Congress the opportunity to develop a replacement. As a practical matter, Dreamers are not in immediate danger of being deported because any action would trigger legal challenges.

While the media was salivating over the prospect of an extended federal shutdown, this three-day version was uneventful. Unlike the 21-day instance in 1995-1996 and the 16-day shutdown in 2013, the fight was not over raising the federal debt ceiling or health care policy. Instead, it was about Senate Democrats trying to pressure their Republican counterparts to ensure that about 800,000 immigrants, mostly from Mexico, who came to the United States as children could remain.

Before you know it, Feb. 8 will be upon us. There is no end to the suspense.

All this political posturing and blame-gaming is about one part of a much larger immigration issue and the President’s insistence on building a wall on our southern border.

Moreover, both parties dance around an unspoken yet reasonable question: Once DACA recipients are addressed, how long before pressure mounts to accommodate the Obama administration’s Deferred Action for Parents of Americans and Lawful Permanent Residents, which was designed to defer deportation for about five million parents of children born in the United States and also of children brought to the country legally?

“Deferred action” is Washington-speak that, in plain English, means ignoring the law.

The evidence with entitlements suggests that each extension of benefits establishes a new base for future expansion. As time passes, more groups of undocumented immigrants come forth claiming they are no less deserving and political pressure is brought on their behalf to again expand protection. The process repeats itself until a program’s original intention is virtually unrecognizable.

Immigration issues have defied compromise for decades. Americans have a wide range of opinions on the subject, many of which don’t add up to a coherent point of view. These conflicted emotions have blocked comprehensive immigration legislation and allowed politicians to skirt the issue of enforcing existing laws.

Not to be overlooked is the political imperative to be reelected, which incentivizes politicians to follow Scarlett O’Hara’s approach from “Gone with the Wind”: “After all, tomorrow is another day.” Given that we elect politicians, the lack of a well-conceived immigration policy is the price the electorate must pay for their irresponsible behavior.

Originally Published: Feb 3, 2018


Technology transforming the automobile industry

It’s obvious that the automobile industry is on the cusp of a technological revolution. Manufacturers and technology companies are working together to reinvent the automobile, much like the way Apple reinvented itself from a computer company to a cultural force or even how Madonna has remained a media icon by constantly adapting to new trends.

Although new technologies and consumer markets are still in their gestation stage, Ford, for example, is making major investments that will transform it from a company that just makes cars to one that touches all aspects of mobility.

Technology companies see a driverless world of autonomous or robotic vehicles as a software and artificial intelligence play. For them, the car is a platform, a commodity, like a cell-phone body. You can get the car body anywhere; the real smarts are in the software. The car may be the ultimate mobile device.

As the value of each vehicle becomes more dependent upon the software it contains, tech companies may be in a better position to capture this value than the automakers. New technologies are redefining boundaries between software firms and the lumbering dinosaurs of the automobile industry.

Opinions differ as to when widespread adoption of fully autonomous and commercially viable vehicles will occur. They could dot our roadways in five to ten years, but saturation will take several decades.

Market penetration may not be uniform; it could start in trucks, for example, before private cars, or even as part of an on-demand commercial ride sharing fleet. In any case, it is not too early to start planning for the roadway management challenges that will be created by autonomous trucks and cars sharing the roads with driver-operated vehicles.

Autonomous vehicle proponents claim they hold the potential to dramatically reduce traffic casualties by eliminating human error. Activities like speeding and driving while texting are deadly. The National Highway Traffic Safety Administration says human error is a factor in 94 percent of fatal crashes. According to the National Safety Council, as many as 40,000 people died in motor vehicle crashes last year, a 6 percent increase over 2015. An estimated 4.6 million people were seriously injured.

When we begin seeing fully driverless cars hinges as much on the regulatory environment as on advances in self-driving technology. Autonomous vehicles operating without a steering wheel, brake pedals, or human intervention pose questions about whether regulations can catch up to technological advances.

Market participants argue that realizing the safety benefits of autonomous vehicles will require a single national standard, not 50 sets of rules. Automakers complain that states are moving ahead with their own regulations, creating the potential for a confusing “patchwork” of laws under which autonomous vehicles operate. As of December, California, Florida, Michigan, Nevada, Utah, and the District of Columbia had enacted laws authorizing autonomous vehicle testing under certain conditions. Washington, Ohio, Pennsylvania, and Texas have active testing programs but no legislation.

On the same day Uber started to test its self-driving Volvos near its Bay Area headquarters, the state’s Department of Motor Vehicles ordered the firm to stop because its cars did not have the proper registration for such testing. Uber loaded the cars onto a self-driving truck and sent them to Arizona.

Michigan now allows companies to test self-driving vehicles without steering wheels, pedals, or a human who can take over in an emergency. In contrast, California has a rule that self-driving vehicles can only hit the road with a safety driver.

It is uncertain how soon fully autonomous vehicles will enter the mainstream. When they do, avoiding the pushback that on-demand mobility firms such as Uber and Lyft have faced in a variety of cities will require clarifying the proper role of each level of government within the regulatory landscape. If autonomous vehicles are safer than their driver-operated counterparts, it is imperative that regulators not risk preventable injuries and deaths by unnecessarily delaying their deployment.

Originally Published: March 4, 2017

The merger that hurt

Why the demise of Glass-Steagall helped trigger the 2008 financial meltdown that cost millions of Americans their jobs, homes and savings

This month is the eighth anniversary of the all-enveloping 2008 financial crisis. Wall Street apologists and many of their Washington, D.C., acolytes argue there is zero evidence that the takedown of the Glass-Steagall Act had anything to do with the meltdown, but the assertion ignores the role the law of unintended consequences played in the crisis.

Glass-Steagall was enacted during the Great Depression to separate Main Street from Wall Street, creating a firewall between consumer-oriented commercial banks and riskier, more speculative investment banks. During the six-plus decades the law was in effect, there were few large bank failures and no financial panics comparable to what happened in 2008.

In the 1980s, Sandy Weill, one of the godfathers of modern finance, began acquiring control of various banks, insurance companies, brokerage firms and similar financial institutions. These were cobbled together into a conglomerate under the umbrella of a publicly traded insurance company known as Travelers Group.

In 1998 Weill proposed a $70 billion merger with Citicorp, America’s second-largest commercial bank. It would be the biggest corporate merger in American history and create the world’s largest one-stop financial services institution.

Touting the need to remain competitive in a globalized industry and customers’ desire for a “one-stop shop” (a supermarket bank), both companies lobbied hard for regulatory approval of the merger.  Advocates argued that customers preferred to do all their business (life insurance, credit cards, mortgages, retail brokerage, retirement planning, checking accounts, commercial banking, and securities underwriting and trading) with one financial institution.

But the merger’s one-stop-shopping approach would make a mockery of the Glass-Steagall firewall. The proposed transaction violated its prohibition of combining a depository institution, such as a bank holding company, with other financial companies, such as investment banks and brokerage houses.

Citigroup successfully obtained a temporary waiver for the violation, then intensified decades-old efforts to eliminate the last vestige of Depression-era financial market regulation so it could complete the merger. A Republican Congress passed the Financial Services Modernization Act and President Clinton signed it in November 1999. It permitted insurance companies, investment banks, and commercial banks to combine and compete across products and markets, hammering the final nail into the coffin of Glass-Steagall.

Now liberated, the banking industry embarked upon a decade of concentrating financial power in fewer and fewer hands. Acquisitions of investment banks by commercial banks, such as FleetBoston buying Robertson Stephens and Bank of America buying Montgomery Securities, became commonplace.

Traditional investment banks suddenly faced competition from publicly traded commercial banks with huge reserves of federally insured deposits. The investment banks faced pressure to deliver returns on equity comparable to those of the new financial supermarkets, which also put competitive pressure on traditional investment banking businesses such as mergers and acquisitions, underwriting, and sales and trading.

In response, the investment banks sought to raise their leverage limits so they could borrow more money to engage in proprietary, speculative trading activities. In 2004 they convinced the Securities and Exchange Commission to relax the “net capital” rule that restricted the amount of debt these firms could take on, raising the permissible leverage from 12-1 to 30-1, meaning the banks could borrow 30 dollars for every dollar of equity they held.
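
To make that arithmetic concrete, here is a minimal sketch with illustrative numbers only, under the simplifying assumption that the leverage ratio is read as assets per dollar of equity:

```python
# Illustrative sketch: how leverage magnifies returns on equity.
# Simplifying assumption: the ratio is treated as assets per dollar of equity.

def return_on_equity(leverage: float, asset_return: float) -> float:
    """Equity holders absorb the gain or loss on the entire asset base."""
    return leverage * asset_return

for lev in (12, 30):
    up = return_on_equity(lev, 0.02)     # assets gain 2%
    down = return_on_equity(lev, -0.04)  # assets lose 4%
    print(f"{lev}:1 leverage -> +2% assets = {up:+.0%} ROE, "
          f"-4% assets = {down:+.0%} ROE")

# Output:
# 12:1 leverage -> +2% assets = +24% ROE, -4% assets = -48% ROE
# 30:1 leverage -> +2% assets = +60% ROE, -4% assets = -120% ROE
```

The same multiplier that flattered returns on the way up destroyed capital on the way down.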

By 2008, increased leverage and speculation on toxic assets would ravage investment banking, leading to the collapse, merger, or restructuring of all five major Wall Street investment banks. During a six-month period, Bear Stearns collapsed into the arms of JP Morgan, Lehman Brothers filed for bankruptcy protection, Merrill Lynch merged into Bank of America, and Goldman Sachs and Morgan Stanley converted to bank holding companies, giving them access to precious short-term funds from the Federal Reserve’s discount window.

The demise of Glass-Steagall may not have been at the heart of the 2008 financial crisis, but it certainly contributed to the lunacy of financial deregulation. Had the law not been neutered, it would have lessened the depth and breadth of the crisis that cost millions of Americans their jobs, homes and savings.

Originally Published: Sep 3, 2016

Stock buybacks do nothing for most of us

Economic inequality in the United States is at historic levels. In the wake of the Great Recession, the issue has captured the attention of the American public, but there is little consensus about its causes. One of the causes is clearly the rise in corporate stock buybacks and short-term thinking.

In the 1980s, the top 1 percent of Americans accounted for 10 percent of the income generated in the economy; by 2012 it was approaching 20 percent. The top 1 percent controlled nearly 42 percent of the wealth, a level not seen since the roaring ’20s.

This increased inequality does not support, and even inhibits, the consumer spending that drives economic growth in the United States because it leaves the middle class with less buying power.

Those who are supposedly smart on the issue point to a range of reasons for economic inequality, such as technological change, the decline of unions, globalization and trade agreements. Often overlooked is the expansion of the financial sector and corporate America’s Ahab-like obsession with short-term thinking.

According to the Bureau of Economic Analysis, in 1970 the finance and insurance industries accounted for 4.2 percent of gross domestic product, up from 2.8 percent in 1950. By 2012, the sector represented 6.6 percent.

The story with profits is similar: In 1970, finance and insurance industry profits made up about one quarter of the profits of all sectors, up from 8 percent in 1950. Despite the aftereffects of the financial crisis, that number had grown to 37 percent by 2013. Yet these industries create only 4 percent of all jobs, so the profits go to a small minority.

The increase in the influence of the financial sector extends to public corporations, which face increased pressure to make immediate investor payouts through stock buybacks. According to Research Affiliates, S&P 500 companies spent $521 billion on stock buybacks in 2013 and $634 billion in 2014. More than $6.9 trillion has been spent on share buybacks since 2004. Not one dime of this money has gone into expanding operations, hiring more employees, increasing wages, research and development, enhancing productivity, or improving the customer experience.

An important part of the appeal of stock buybacks is their ability to increase earnings per share. In theory, buybacks tend to jack up the share price, at least in the short term, by decreasing the number of shares outstanding and thereby increasing earnings per share. Corporations frequently finance these buybacks by issuing debt, taking advantage of the Federal Reserve holding interest rates underwater and the fact that interest expense on the debt is tax deductible.
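
A minimal worked example, with made-up numbers chosen purely for illustration, shows how a debt-financed buyback can lift earnings per share even though the underlying business earns nothing extra:

```python
# Illustrative sketch: EPS uplift from a debt-financed buyback.
# All figures are hypothetical.

earnings = 1_000.0        # annual net income, $ millions
shares = 500.0            # shares outstanding, millions
price = 40.0              # share price, $
buyback = 2_000.0         # debt raised and spent on buybacks, $ millions
after_tax_cost = 0.03     # after-tax interest rate on the new debt

eps_before = earnings / shares                        # 2.00

shares_retired = buyback / price                      # 50M shares retired
interest_drag = buyback * after_tax_cost              # $60M of new interest
eps_after = (earnings - interest_drag) / (shares - shares_retired)

print(f"EPS before buyback: {eps_before:.2f}")        # 2.00
print(f"EPS after buyback:  {eps_after:.2f}")         # 2.09
```

Earnings per share rise more than 4 percent without a single new customer, product, or productivity gain, which is exactly the short-term appeal described above.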

Underlying all this are two notions. First, the only responsibility of the corporation is to maximize shareholder value as reflected in the stock price, as opposed to getting sidetracked by talk about multiple stakeholders such as employees, customers and the community.

The second is that corporate management should be compensated in stock to align their interests with those of shareholders. Because managers’ pay is tied to the firm’s stock performance, the temptation to manage earnings to meet short-term investor expectations, even at the expense of long-term shareholder value, is quite strong. For example, if the choice is between repairing the roof on the factory in Toledo this quarter or missing the quarterly earnings figure, which could cause the share price to tumble, corporate management might decide not to make the capital investment.

Stock-based compensation has also contributed to the sharp rise in CEO compensation. Between 1978 and 2013, CEO compensation increased nearly 10-fold while workers experienced stagnant wages and increasing job insecurity.

While corporate and finance executives live in a second gilded age, stock buybacks and short-term thinking contribute to underinvestment in innovation and skilled workers, and ultimately to more economic inequality. But none of this troubles the 1 percenters, and they appear to be the only ones who really matter.

Originally Published: Jul 23, 2016

A LESSON OF WAR: Iraq, Afghanistan and a battle from a century past

The Battle of the Somme was a meat grinder. The centenary of this battle, fought midway through World War I, will be commemorated on July 1 in Great Britain, France and other countries that lost men in one of the largest and bloodiest battles in the history of human warfare.

Between July 1 and Nov. 18, 1916, the British suffered about 420,000 casualties, the French about 200,000 and the Germans about 465,000. All told, 300,000 soldiers died and little was achieved. Somme was like America’s recent conflicts in Iraq and Afghanistan writ large.

After two years of relative stalemate, allied forces decided to make a big push to break through the German lines and hopefully achieve a quick and decisive victory on the Western Front, much as politicians and generals assumed quick victories in Iraq and Afghanistan. The offensive was designed to relieve pressure on French forces facing the German offensive at Verdun and to take control of a 20-mile stretch of the meandering River Somme.

The first day of that battle was the bloodiest in the history of the British army. Of the 120,000 troops who went into battle, the British suffered about 60,000 casualties, as many as 20,000 of whom died before the day was over.

The plan drawn up by generals in their chateau headquarters miles behind the battlefield was for an artillery barrage to pound the German defenses to an extent that the attacking British could just walk in and occupy the opposing trenches with minimal opposition. Cavalry units would then gloriously pour through the German lines, pursue the fleeing Germans and turn the tide of a war that had been in a deadly stalemate for the better part of two years.

Before the battle started, the British fired over a million and a half shells at the German soldiers, many of which either did not explode or completely missed their targets.

During seven days and nights of bombardment that removed the element of surprise, German troops simply moved into their deep underground concrete bunkers and waited. When the artillery pounding stopped, scores of British soldiers walked in a row uphill in successive waves across no-man’s-land and were mowed down, easy targets for swarms of German machine gun nests. By nightfall, few of the objectives had been taken despite massive loss of life.

The offensive would continue for another 4 1/2 months in a similar vein. After July 1, a long stalemate settled in as the British employed the same hopeless method of attack conforming to a prefabricated interpretation of events on the ground, despite assault after assault turning into a killing ground. Somme became a bloody battle of attrition.

By the end of the battle, a massive loss of human life had netted the allies roughly six miles of German-held territory.

The battle helped cement the reputation of World War I as a war of terrible slaughter caused by poor decisions on the part of high commanders. The troubled British offensive resulted in the epithet “lions led by donkeys.”

Today, revisionist historians contend that the battle, while costly and flawed, put an end to German hopes at Verdun, badly weakened the German army and helped the British learn new tactics for successfully prosecuting future offensives.

Traditionalists believe this interpretation airbrushes reality. They say the battle achieved nothing but untold misery and loss. It was an unjustified bloodbath and evidence of the British high command’s incompetence. They argue that British military leaders failed in the fashion of Pyrrhus, who lamented after the battle at Asculum: “Another such victory over the Romans and we are undone.”

Having just lived through two such conflicts, Americans can relate to this quote. Iraq and Afghanistan, the latter still ongoing, both created more problems than they solved. Optimistic miscalculations led to unintended consequences and bloody inconclusiveness. And so it goes.

Originally Published: Jun 25, 2016

The repeal of a Depression-era banking law and the economic crash of 2008

The causes of the 2008 financial crisis are multiple and complicated. Minor deities of finance and even presidential candidates such as Bernie Sanders argue over whether the repeal of the longstanding Glass-Steagall Act laid the groundwork for the financial meltdown. Those who don’t think it did overlook one major unintended consequence of repealing Glass-Steagall: the excessive use of leverage.

After the 1929 stock market crash and the onset of the Great Depression, Congress passed the iconic Glass-Steagall Act in 1933 to help ensure safer banking practices and restore faith in the financial system. Before the Great Depression, banks had engaged in imprudent stock speculation. In addition to their traditional staid banking services such as taking in deposits and making loans, they also used depositors’ funds to engage in high-stakes gambling on Wall Street.

The act was passed to halt a wave of bank failures and rein in the excesses that contributed to the 1929 Crash. Among other things, Glass-Steagall separated the more stable consumer-oriented commercial banking from riskier investment banking and set up the bank deposit insurance system to protect small savers against bank failures. The business of accepting deposits and making loans was to be kept separate from underwriting and peddling stocks, bonds, and other securities.

The movement to deregulate the American economy began in the 1970s. It spread to air travel, railroads, electric power, telephone service and other industries, including banking. The sustained bull market of the 1990s supported arguments that financial markets could regulate themselves, and bankers lobbied Congress to further emancipate the financial sector.

Citicorp forced Congress’s hand in 1998 when the firm announced it would join forces with the Travelers Group in a corporate merger. The $70 billion deal would bring together America’s second-largest commercial bank with a sprawling financial conglomerate that offered banking, insurance, and brokerage services. The proposed transaction violated portions of the Glass-Steagall Act, but the newly formed Citigroup obtained a temporary waiver, completed the merger, and then intensified the decades-old effort to repeal Glass-Steagall.

Just a year earlier, Travelers had become the country’s third-largest brokerage house with its acquisition of the investment banking firm Salomon Brothers. Touting the pressures of technological change, diversification, globalization of the banking industry, and both individual and corporate customers’ desire for a “one-stop shop” (a financial supermarket), both firms lobbied hard for approval of the merger.

In 1999 a Republican Congress passed and a Democratic President signed the Gramm-Leach-Bliley Act, essentially repealing Glass-Steagall and removing regulatory barriers between commercial banks, investment banks, and insurers.

Advocates of the universal bank model argued that customers preferred to do all their business (life insurance, retail brokerage, retirement planning, checking accounts, mergers and acquisitions advisory, underwriting, and commercial lending) with one financial institution.

The universal bank created an uphill battle for the major investment banks like Lehman Brothers and Bear Stearns. For example, it was believed that the investment banking arms of universal banks would move into the lucrative securities underwriting business, using loans as bait to get the inside track on underwriting engagements, essentially using depositors’ money to drive investment banking fees.

As public companies, these investment banking firms faced pressure to deliver returns on equity comparable to those of the universal banks. To stay competitive, they resorted to excessive leverage, borrowing heavily to juice their returns.

In 2004 they received approval from the Securities and Exchange Commission to increase their leverage from 12-1 to better than 30-1. The numbers were indeed worrisome. For instance, Bear Stearns was leveraged 33 to 1, and before crashing in September 2008 Lehman Brothers had a 35-to-1 leverage ratio, meaning it borrowed 35 dollars for every dollar of capital.
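
A back-of-the-envelope sketch (treating the ratio as assets per dollar of equity, an approximation) shows how thin the cushion was at those levels:

```python
# Illustrative sketch: the asset decline that erases all equity is
# roughly 1 / leverage when the ratio is read as assets per dollar of equity.

for name, leverage in [("Bear Stearns", 33), ("Lehman Brothers", 35)]:
    wipeout = 1 / leverage
    print(f"{name} at {leverage}-to-1: a {wipeout:.1%} drop in asset "
          f"values erases the firm's entire equity")

# Bear Stearns at 33-to-1: a 3.0% drop in asset values erases the firm's entire equity
# Lehman Brothers at 35-to-1: a 2.9% drop in asset values erases the firm's entire equity
```

A decline of about 3 percent in the value of their assets, trivial in a panic, was enough to render either firm insolvent.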

By the winter of 2008, excessive leverage would ravage the investment banking industry, leading to the downfall, merger, or restructuring of all the major investment banks and unleashing a global recession. And the American taxpayer would learn that free markets are not free.

Originally Published: April 30, 2016

A day that should live in infamy

Early in 1941, the government of resource-poor Japan realized that it needed to seize control of the petroleum and other raw material sources in the Dutch East Indies, French Indochina and the Malay Peninsula. Doing that would require neutralizing the threat posed by the U.S. Navy’s Pacific Fleet based at Pearl Harbor in Hawaii.

The government assigned this task to the Imperial Navy, whose combined fleet was headed by Admiral Isoroku Yamamoto. The Imperial Navy had two strategic alternatives for neutralizing the U.S. Pacific Fleet: cripple the fleet itself through a direct attack on its warships, or cripple Pearl Harbor’s ability to function as the fleet’s forward base in the Pacific.

Crippling the U.S. fleet would require disabling the eight battleships that made up the fleet’s traditional battle line. It was quite a tall order.

The most effective way to cripple Pearl Harbor’s ability to function as a naval base would be to destroy its fuel storage and ship repair facilities. Without them, the Pacific Fleet would have to return to the U.S., where it could no longer deter Japanese military expansion in the region during the year or so it would take to rebuild Pearl Harbor.

It soon became apparent that the basics of either strategy could be carried out through a surprise air raid launched from the Imperial Navy’s six first-line aircraft carriers. Admiral Yamamoto had a reputation as an expert poker player, gained during his years of study at Harvard and as an Imperial Navy attaché in Washington. He decided to attack the U.S. warships that were moored each weekend in Pearl Harbor. But in this case the expert poker player picked the wrong target.

The Imperial Navy’s model for everything it did was the British Royal Navy. Standard histories of the Royal Navy emphasized its victories in spectacular naval battles.

Lost in the shuffle was any serious consideration of trying to cripple Pearl Harbor’s ability to function as a forward naval base. So it was that, in one of history’s finest displays of tactical management, six of the world’s best aircraft carriers furtively approached the Hawaiian Islands from the north just before dawn that fateful Sunday, Dec. 7, 1941, launched their planes into the rising sun, caught the U.S. Pacific Fleet with its pants down and wrought havoc in spectacular fashion. On paper at least, this rivaled the British Royal Navy’s triumph at Trafalgar.

But so what?

The American battleships at Pearl Harbor were slow-moving antiques from the World War I era. The U.S. Navy already had two brand-new battleships in its Atlantic Fleet that could run rings around them. And the eight new ones the navy was building were even better.

More importantly, the Pacific Fleet’s three aircraft carriers weren’t at Pearl Harbor. American shipyards were already building 10 modern carriers whose planes would later devastate Imperial Navy forces in the air/sea battles of the Philippine Sea and Leyte Gulf.

Most importantly, as the sun set on Dec. 7 and the U.S. Navy gathered the bodies of its 2,117 sailors and Marines killed that day, the all-important fuel storage and ship repair facilities remained untouched by Japanese bombs, allowing Pearl Harbor to continue as a forward base for American naval power in the Pacific.

So in reality, Dec. 7 marked the sunset of Japan’s extravagant ambitions to dominate Asia. Admiral Yamamoto and the Imperial Navy’s other tradition-bound leaders chose the wrong targets at Pearl Harbor.

The dictates of tradition are usually the worst guides to follow when it comes to doing anything really important. After all, if they survived long enough to be venerated, they’re probably obsolete.

Originally Published: December 6, 2014

Dancing on the edge of absurdity

There can be little doubt that one of the causes of the 2008 financial crisis was diminished regulatory control, the seeds of which were sown during the three preceding decades.

Recent legislation re-regulates financial markets, but attracting the best and brightest to regulatory jobs is proving to be a major challenge. The congressionally authorized Financial Industry Regulatory Authority, a not-for-profit self-regulator, may offer a solution to the problem.

Beginning with the Carter administration and accelerating during Reagan’s presidency, the banking industry, among others, was steadily deregulated. Not only were leverage requirements continually lowered, but watchdog organizations such as the Securities and Exchange Commission were weakened both by legislation and by the appointment of free-market advocates.

Successive administrations were enthusiastic advocates of deregulation. The dominant economic paradigm was that markets are efficient and inherently maximize welfare and work best when managed least. Moreover, with free-market advocates in charge of regulatory agencies such as the SEC, many existing laws were ignored or rarely enforced.

For example, observers repeatedly warned the Securities and Exchange Commission about suspected irregularities at Bernard Madoff’s investment firm, which was later revealed to be a multi-billion dollar Ponzi scheme. In spite of several warnings, no serious investigation was undertaken until after the firm’s spectacular collapse.

One reason offered for poor financial regulation is that government agencies are seriously disadvantaged when it comes to attracting the best and the brightest. The salaries of elected officials tend to impose an artificial ceiling on how much public employees can be paid. Even though these ceilings ignore marketplace realities, elected officials are reluctant to raise them by advocating higher salaries for themselves because it looks bad to voters.

Consequently, Americans are told that many government regulatory agencies lack the talent to regulate financial markets because they can’t pay the going rate for good people. Thus, the regulatory agencies’ best and brightest flock to higher-paying jobs with the firms they regulate. This leaves the public to complain that our regulatory agencies are less effective than they need to be.

But not all regulators are underpaid. According to the Bond Buyer’s annual salary survey of 21 industry regulatory groups, compensation for the chairman and CEO of the Financial Industry Regulatory Authority, which oversees the 4,100 securities firms and more than 636,000 stockbrokers in the United States, was $2.63 million in 2013.

The perks aren’t bad either: he receives $20,000 annually for admission fees, dues, and house charges to one club each in the Big Apple and Washington, and up to $20,000 annually for personal finance and tax counseling, as well as spousal travel for certain business-related events. The Financial Industry Regulatory Authority also paid four of its top executives more than $1 million in 2013. These folks can spend more on one dinner than the average American, whose wages have been flat for decades, spends on a vacation.

Let’s put these salaries in perspective: The President earns $400,000 annually. Janet Yellen, the chair of the Federal Reserve, who has sway over the entire world economy as opposed to just American stockbrokers, earns $201,700. Securities and Exchange Commission Chair Mary Jo White makes $165,300. White’s predecessor at the SEC, Mary Schapiro, was fresh from running the Financial Industry Regulatory Authority, which gave her a $9 million severance to ease the pain of a low government salary.

These are clearly difficult times for national financial regulators. They are challenged with implementing hideously complicated Dodd-Frank legislation that is supposed to safeguard and stabilize the financial system to avoid another financial crisis.

At 2,319 pages, the Dodd-Frank Act is the most far-reaching financial regulatory undertaking since the 1930s, requiring regulatory agencies that had been withering to enact 447 rules and complete 63 reports and 59 studies within tight congressional deadlines.

It may be at the edge of absurdity, but just maybe the best way to attract the best and the brightest would be to expand the number of one-percenters by outsourcing all regulation to not-for-profit entities such as Financial Industry Regulatory Authority.

Originally Published: October 25, 2014