When corporations treat society like casinos

People joke that there is no such thing as “business ethics.” They call it an oxymoron – a concept that combines contradictory ideas. Sadly, the critics are right. Changing that may require holding those guilty of unethical behavior personally responsible for their actions.

In just the last several weeks, a former peanut company owner was sent to prison for 28 years for his role in knowingly selling salmonella-tainted peanut butter that killed nine people. Volkswagen, the world’s biggest automaker, installed software in its diesel-powered cars that cleverly put a lid on emissions only during testing. Turing Pharmaceuticals jacked up the price of a 62-year-old lifesaving drug from $13.50 to $750 a pill. And the beat goes on.

All these examples capture the zeitgeist of an unethical, winner-take-all business climate in which bottom-line-obsessed high flyers treat society like a giant casino and give the public the middle finger. Is it any wonder that business leaders are among the nation’s least trusted groups, ranking only slightly ahead of members of Congress?

People’s trust in business and those who lead it is eroding. It seems to many that executives no longer run their companies for the benefit of consumers or even shareholders and employees, but rather for the pursuit of personal ambition and financial gain.

In the wake of recent corporate scandals, it is again time to ask the most fundamental of questions: What is the scope of corporate social responsibility?

Maximizing shareholder value has for decades been executives’ top priority. Milton Friedman put it succinctly in a famous New York Times Magazine piece: “The Social Responsibility of Business is to Increase its Profits.” He referred to the concept of businesses having social responsibility as a “fundamentally subversive doctrine.” For Friedman, the role of business was clearly not to act as a social worker.

This is one side of an ongoing debate about corporate responsibility. Friedman’s formulation overlooks the fact that, as shareholders’ agents, business leaders have important responsibilities that extend beyond maximizing stockholder wealth. Simply put, many people believe that corporations are licensed by society to pursue profits with the expectation that they will produce goods and services that are of value to society.

Corporations should also remember that the way to maximize shareholder value is to maximize customer satisfaction. Or as Peter Drucker, hailed as “the man who invented management,” put it, the purpose of business is to create and keep customers. For him, shareholders benefit when customer satisfaction is the major priority.

While there are no black-and-white judgments, corporations can reasonably be expected to identify stakeholders beyond owners and investors, such as customers and employees.

Balancing shareholders’ expectations of maximum return against other priorities is a fundamental problem confronting corporate management. For starters, being responsible means obeying the laws and also behaving in a fashion that society universally values even if it is not required by law.

Believing that the markets will eventually sort the good from the bad is naive. What the public can now see, in hindsight, is that discussions about market discipline and increased government regulation are endless and don’t amount to much.

With the rise of institutional investing represented by large private and public pension funds, ownership in many large corporations is concentrated in the hands of a relatively small number of investors. The public may well have to rely on these investors to monitor managerial misconduct and corporate social responsibility before embarking on another wave of regulatory reform.

Of course, the dearth of personal risk associated with performance failures increases the incentives for corporate misbehavior and argues for personal punishment of the perpetrators. Real punishment may not cure the disease that lies at the core of the business culture and create a new vision of corporate responsibility, but behaving responsibly would surely be taken more seriously by all concerned.

originally published: October 24, 2015

Obama free trade isn’t so free for US

The fight for fast track legislation to allow President Obama to negotiate the secretive Trans-Pacific Partnership trade deal is over. After pulling out all the stops to push the deal through Congress, the President signed legislation giving him the authority to negotiate the trade agreement and put it before Congress for a straight up-or-down vote with no amendments allowed.

Americans are told that free trade is the best strategy for advancing global economic development, reducing poverty and achieving world peace. There is a lot to be said on behalf of the utopian dreams of free traders if you ladle enough frosting on the cake to compensate for its shortcomings. But if we want to help the American middle class – the stated goal of virtually every politician – we would pursue different policy priorities.

To say that everyone benefits from free trade is misleading. Trade creates winners and losers and every American deserves to know the details buried in these deals. The benefits of the North American Free Trade Agreement and other trade deals have not been shared as broadly as promised.

Economists, businessmen and politicians – free trade’s most devoted acolytes – say technological advances lead to increased productivity, which means fewer workers are needed to get the job done. Yes, we have substituted capital for labor. But we have also substituted cheap offshore labor for American workers, and the result is that Americans are losing jobs, their wages are stagnating and the middle class is coming apart at the seams.

How countries trade and whether they benefit from it are important questions. Starting with Adam Smith, economists have emphasized specialization and exchange as essential to increasing productivity and raising living standards.

The economic argument for free trade relies on the principle of comparative advantage developed by David Ricardo in 1817. His quaint theory, which built on Smith’s work, remains the cornerstone of free trade economics. So what, in simple terms, is comparative advantage?

Let’s assume that Lady Gaga, the world-famous entertainer, also happens to be a world-class typist. Rather than both entertaining and typing, she should specialize in entertaining, where her comparative advantage is greatest and she could maximize her income. This key insight is still endorsed today by the overwhelming majority of economists.
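
To make the arithmetic concrete, here is a minimal sketch in Python with invented figures (the hourly earnings and the generic hired typist are hypothetical, not drawn from the column). It compares opportunity costs: Gaga is better at both activities, yet every hour she spends typing costs her an hour of far more lucrative entertaining, so both sides gain when each specializes where the opportunity cost is lowest.

```python
# Hypothetical illustration of comparative advantage; all figures are invented.
# Gaga has an absolute advantage at both tasks, yet specialization still pays.

# Hourly income each person could earn at each activity (in dollars).
hourly_value = {
    "gaga":   {"entertaining": 10_000, "typing": 50},
    "typist": {"entertaining": 20, "typing": 30},
}

def opportunity_cost(person: str, task: str) -> float:
    """Income from the best alternative forgone by spending an hour on `task`."""
    return max(v for t, v in hourly_value[person].items() if t != task)

for person, tasks in hourly_value.items():
    for task, earns in tasks.items():
        print(f"{person} doing {task}: earns ${earns}/hour, "
              f"forgoes ${opportunity_cost(person, task)}/hour")

# Gaga forgoes $10,000 for every hour she types; the typist forgoes only $20.
# The typist has the lower opportunity cost of typing, so Gaga should entertain
# and pay someone else to type, even though she would be the faster typist.
```

The same logic, scaled up to countries and goods, is Ricardo’s case for specialization and exchange.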

Americans who lose their jobs are becoming less rich so people in foreign countries can be less poor. In the aggregate, people are better off, but domestic workers bear the cost. It should be clear by now that on the home front, free trade contributes to rising inequality, wage stagnation, and lost jobs.

The gains from trade are often widely dispersed, while the losses are concentrated. The extent to which offshore outsourcing is responsible for some of our current labor market woes has become highly contentious in recent years.

Perhaps it is time to adopt a national strategy that can make the American economy grow fast enough to produce decent jobs for every member of the American family who wants to work. How about if we start by investing in our broken infrastructure so it can generate economic growth instead of hamstringing it, and educating our children so they become world leaders in something besides sports?

Then we just might become internationally competitive again, and restore our economy to full employment while we’re at it.

originally published: July 11, 2015

US must confront the new realities

The 21st century has witnessed the death of the old world economic order and the birth of a new one. America remains the world’s military superpower but Brazil, Russia, India, China and others are challenging our economic pre-eminence.

The parade of 2016 presidential candidates will offer short-form solutions unconstrained by resource limitations. They will blame others and predict impending doom if they are not elected. And the political rhetoric will surely be accompanied by nostalgia for the golden economic age of the decades that followed World War II.

But America’s dominance in the decades following World War II was a function of unique circumstances. Europe lay in ruins in 1945. In the rest of the world, cities were shattered, economies devastated and people were starving. In the two years after the war, the vulnerability of countries to Soviet expansionism heightened the sense of crisis.

The postwar economy was quite successful by any standard. The American middle class enjoyed rising wages from the end of World War II until the mid-1970s; real wages, after inflation, rose continually until 1973. But that was when the United States accounted for a disproportionate share of the global economy and held nearly two-thirds of the world’s gold reserves, and the dollar was the world’s reserve currency.

Prosperity was the governing theme of the postwar era. During those years, gross domestic product grew 140 percent and real (inflation-adjusted) per capita income doubled. Living standards improved to the point where the large majority of Americans could describe themselves as middle class.

The United States made the economic recovery of Western Europe and Japan a national security priority. Two basic motives guided policy.

First, the United States was increasingly concerned about the ambitions of the Soviet Union, which had imposed communist governments on Eastern Europe, and about the threat it posed to Western democracies.

Second, the United States believed stable, prosperous, democratic governments would serve as ramparts against Soviet expansion and bind these countries to us.

To restore Europe’s economic infrastructure, President Harry S. Truman signed what became known as the Marshall Plan in April 1948. Over the next four years the plan delivered $13 billion to modernize industry in 16 European countries. This funding, which translates into $103 billion in today’s dollars, enabled Europe to rejuvenate its domestic markets as well as export its way to economic recovery. By contrast, Afghanistan still can’t stand on its own after receiving about $110 billion in assistance.

The Marshall Plan, along with a 35 percent cut in American tariffs to accommodate and promote foreign imports – which also provided Americans with cheap foreign goods – supported the development of stable democratic governments in Western Europe. It also provided markets for American goods and services, a grand example of vendor financing.

The United States also developed and helped finance a comprehensive economic recovery program for Japan. The war had devastated the country and terminated almost all of its foreign trade.

It should not be overlooked that it was with America’s help that the world became a more prosperous and competitive place. That competition has indeed put downward pressure on wages, as footloose companies take advantage of the information technology revolution to disperse supply chains, eroding middle-class wages in the face of low-cost competition.

If America wants to maintain its status as the world’s economic superpower, it is time to jettison the addiction to past achievements and focus on new realities: The world is experiencing dramatic technological change and we face economic competition from millions of people around the world who are happy to work for a fraction of Americans’ wages.

We must get serious about issues that are the very foundation of American exceptionalism such as combating economic inequality and declining living standards for the shrinking middle class. If we don’t, Americans will have to drastically adjust their expectations about growth and opportunity and step back from our special place in the world.

originally published: July 4, 2015

How to manage pension liability

Americans today exist amid the tension between hope for better times and the scars of the worst financial crisis since the Great Depression. Among the fiscal challenges we face are states and cities that are struggling to keep up with their promise to set aside enough money to fund public employees’ defined benefit pension plans as those governments also struggle to recover from the longest economic downturn since the 1930s.

As difficult as it will be to live up to their pension promises, governments must avoid the temptation to increase their contributions to pension systems by raising taxes on average Americans. This would only deepen the wounds of those hit hardest by the 2008 financial crisis and subsequent Great Recession.

While corporate America has shifted to defined contribution retirement plans, most state and local government employees still participate in defined benefit plans. There is plenty of evidence that these plans lack the funds to make good on the promises made to public employees, but little evidence that structural reforms are being implemented.

Defined benefit plans promise a set monthly payment during retirement. Major challenges of managing these plans include estimating future retiree obligations and making accurate assumptions about variables such as length of service, future salary levels, retiree mortality and expected return on pension fund assets.

Investment returns are critical, since investment earnings account for the majority of public pension financing. Any shortfall must be made up by increasing contributions or reducing benefits. One common flaw in pension plan management is undervaluing liabilities by assuming unrealistic rates of return on pension assets.

Even small changes in rates of return can have a significant effect on assets. A Congressional Budget Office study found that with an 8 percent projected rate of return, unfunded liabilities amounted to $700 billion for state and local pension plans. If the projected rate of return dropped to 5 percent, the unfunded liability more than tripled to $2.2 trillion.
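
The sensitivity comes straight from present-value arithmetic: the assumed return doubles as the discount rate, so lowering it makes the same promised benefits loom larger today. Here is a minimal sketch in Python under an invented assumption of an evenly spread benefit stream (a hypothetical $1 billion a year for 30 years, not the CBO’s actual data or methodology), discounting the same payments at 8 percent and at 5 percent.

```python
# Hypothetical sketch of discount-rate sensitivity; the cash flows are invented.
# The same promised benefits are worth more today when the assumed return falls.

def present_value(payments, rate):
    """Discount a list of future annual payments (year 1, 2, ...) back to today."""
    return sum(p / (1 + rate) ** year for year, p in enumerate(payments, start=1))

# Suppose a plan has promised $1 billion a year for the next 30 years.
promised_benefits = [1.0] * 30  # billions of dollars

for assumed_return in (0.08, 0.05):
    liability = present_value(promised_benefits, assumed_return)
    print(f"Assumed return {assumed_return:.0%}: liability is about ${liability:.1f} billion")

# Dropping the assumed return from 8 percent to 5 percent raises the measured
# liability from roughly $11 billion to roughly $15 billion, a jump of about
# 37 percent, without changing a single promised benefit.
```

Real plans have uneven, actuarially projected benefit streams, but the direction of the effect is the same: a lower assumed return inflates today’s measure of the very same promise.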

Public pension funds generally assume a high rate of return. For example, many base pension contributions on an assumed 7.75 percent annual return on fund investments, which is difficult to achieve in a near-zero interest rate environment.

Since actual returns have been less than the assumed rates, unfunded liabilities have increased.

There is no consensus on how best to deal with this crisis and preserve the sustainability of public pension plans. Since the 2008 financial crisis, nearly all states have enacted changes to make their defined benefit pension plans more solvent. Common options for overhauling the plans range from increasing employee contributions to reducing benefits, including trimming cost-of-living adjustments, and tightening eligibility requirements such as raising the retirement age for new employees.

A handful of states have passed reforms that replace defined benefit plans with some version of defined contribution plans for new employees. Others have borrowed money through the issuance of pension obligation bonds, floating tax-exempt bonds at low interest rates in the hope of investing the money in securities with higher yields.

Still other states have changed their asset allocation mix to generate higher returns, investing in alternative investments such as private equity, real estate, hedge funds, commodities and derivatives. But along with higher expected returns, these investment vehicles also bring greater risk.

As part of the struggle to deliver promised benefits, some pension plans have begun to focus on investment management fees. For example, the California Public Employees’ Retirement System, the country’s biggest state pension fund with about $300 billion of assets, announced plans earlier this year to cut back on the $1.6 billion in management fees it paid to Wall Street firms last year.

Tough choices are needed to get public pension plans back on track. But in the current economic environment, increasing governments’ contributions to the plans by raising taxes on average taxpayers who have already taken a beating in recent years should be avoided at all costs.

originally published: June 27, 2015

Time to limit immigration of low-wage workers

Politicians often remind us that we are a nation of immigrants. For much of America’s history, immigration strengthened the nation’s economy. But that’s far less clear today.

In an era of global competition, the intake of low-wage immigrant workers – which benefits big businesses at the expense of workers by depressing wages and increasing income inequality – should be limited. The war on terror also raises concerns about just who is coming to our country.

The French philosopher Auguste Comte is reputed to have said “demography is destiny.” American demographics have certainly changed dramatically over the last several decades.

According to the Census Bureau, in 2013 there were 41.3 million immigrants (legal and illegal) living in the United States, an all-time high and double the number in 1990, nearly triple the 1980 number, and quadruple the 1970 count of 9.6 million. Immigrants make up nearly 13 percent of the population, the highest share in 93 years. In 1970, fewer than one in 21 residents was born abroad. Today it is about one in eight.

When you add in their U.S.-born children, this group numbers about 80 million, or one-quarter of the overall U.S. population. The U.S. represents the destination of choice for the world’s migrant population. With less than 5 percent of the world’s population, we attract nearly 20 percent of its migrants.

In 2013, close to 47 percent of immigrants (19.3 million) were naturalized U.S. citizens. The remaining 53 percent (22.1 million) included lawful permanent residents, legal residents on temporary visas such as students and temporary workers, and illegal immigrants. The latter category is estimated at 11-12 million and represents about 3.5 percent of the American population.

Mexican-born immigrants accounted for approximately 28 percent of all immigrants to the U.S., making them by far the largest immigrant group in the country. India was the second largest, closely trailed by China, the Philippines, Vietnam and El Salvador. All told, the top 10 countries of origin accounted for about 60 percent of the immigrant population in 2013.

The demographic diversity of today’s United States is in many ways a direct result of the Immigration and Nationality Act amendments of 1965, which shifted U.S. immigration policy from a historic ethnic European population bias to one that favored a new stream of immigrants from developing countries in Asia and Latin America. Under the old system, admission to the U.S. largely depended upon an immigrant’s country of birth. The new system eliminated the nationality criteria and family reunification became the cornerstone of immigration policy.

The act was shepherded through the Senate by Ted Kennedy and signed by President Johnson at the foot of the Statue of Liberty on October 3, 1965. At the signing Johnson said, “This bill we sign today is not a revolutionary bill. It does not affect the lives of millions. It will not restructure the shape of our daily lives.”

But the law did change the immigration flow. For example, the European and Canadian share of legal immigration fell from 60 percent in the 1950s to 22 percent in the 1970s. By contrast, the Asian share of legal immigration rose from 6 percent in the 1950s to 35 percent by the 1980s and 40 percent in 2013.

Years later, Theodore White, the Pulitzer Prize-winning journalist and historian, called the legislation “noble, revolutionary and one of the most thoughtless of the many acts of the Great Society.”

The evidence now suggests that immigrants are entering the U.S. faster than the economy can absorb them. An oversupply of low-wage immigrant workers has saturated the job market and depressed wages, thereby exacerbating the income inequality and wage stagnation that have been facts of life in the United States for over 40 years.

The time has come to tailor American immigration policy to the 21st century and put the economic interests of American workers at the center of immigration policy. For starters, this means limiting the entry of low-wage workers before the second coming.

originally published: May 23, 2015

When multiculturalism clashes with women’s rights

Some forms of the multiculturalism many Americans favor can only intensify the challenge of reducing the various forms of gender discrimination still common in mainstream America. Consider the insistence by some groups that the cause of multiculturalism is best served by granting special “group rights” to cultural minorities (especially those composed of non-European immigrants) to help them preserve their distinctiveness in a society that emphasizes the white bread, homogenized, sitcom ideal of “real America.”

The problem is that part of the distinctiveness of these minority cultures sometimes stems from their traditional abuse of women by permitting oppressive practices such as forced marriage, female genital mutilation, and physical abuse. While we condemn atrocities done to women abroad, we largely ignore or rationalize discrimination at home.

Rosa Parks must have been spinning in her grave in 2011 when we learned that a Brooklyn public bus catering to a predominately Orthodox Jewish ridership had special rules requiring all women to sit in the back of the bus. Also, signs written in Hebrew and English directed women to use the back door during busy times.

In a closely related incident, a New York-to-London flight was delayed last month by an ultra-Orthodox Jewish man who refused to sit next to a woman because his religion precludes him from sitting next to a woman who is not a family member. The woman agreed to move. It wasn’t the only incident of its kind. It’s another example of religious rights trumping a woman’s civil rights.

Apart from numerous instances of domestic violence and discrimination justified by religious beliefs and cultural practices, we witness the closing of the academic mind when Brandeis University last year rescinded its offer of an honorary degree to the Somali-born Ayaan Hirsi Ali because of her scathing criticism of Islam.

She experienced it firsthand: she says she underwent female genital mutilation at 5 and was targeted by the same Islamic militant who murdered Theo van Gogh; a note pinned to his body said she was next because of her criticism. To be sure, Ms. Ali, author of the memoir “Infidel,” is a controversial public figure who has spoken and written powerfully, at great personal risk, about the oppression of women in Islamic culture. But universities are supposed to be about learning more, not less, and entertaining dissenting views.

Any government action to preserve these discriminatory practices among minority groups living in America is no more defensible than officially sanctioning discrimination in any form. Such actions would interfere with the already too-slow process of weaning mainstream America away from its historical patterns of male-imposed discrimination against women.

Government should insist that everyone living in the United States observe and obey all American laws regarding human rights without regard to membership in certain cultural minorities, religious sects or golf clubs. In short, no special treatment for any group that seeks to defend its abuse of women because it’s part of the group’s cultural distinctiveness.

Put differently, immigrant cultures with ingrained behavior patterns that are contrary to prevailing secular humanist views about the rights of women should not be tolerated. Such groups should not be exempt from American anti-discrimination laws. Minority groups living in America should not receive special rights to discriminate against women as a means of preserving their cultural distinctiveness.

The special rights case for allowing U.S. minority groups to continue practicing their own brand of discrimination against women is claimed by adherents as being consistent with liberal principles. Their main argument seems to be that liberal values require (among other things) tolerance and respect for diverse cultures.

If such tolerance and respect are to have any practical meaning, the practices of these diverse cultures must be consistent with tolerance and respect for all people.

Indeed, one argument for expanding women’s rights in America could well be its potential for restricting the ability of certain religious or cultural groups to encourage discrimination against individuals for reasons such as gender, race, sexual preference, lifestyle or business practices.

Feminism, therefore, could turn out to be nearly as liberating for men as for women.

originally published: May 16, 2015

The allure of Wall Street’s lusty pleasures

Many people believe that a relatively few individuals were the real villains behind the financial heart attack of 2008: Those on Wall Street; in banks and other financial institutions; on the faculties of the nation’s leading graduate business schools, writing financial jabberwocky for small-circulation journals; setting policy in the West Wing of the White House and on Alan Greenspan’s Federal Reserve Board.

It’s popular to believe they hijacked the free market ideal because they could. It was as American as handguns. They then proceeded to twist it to serve personal agendas at great cost to the American people. Enough financial violence was done to make Attila the Hun look like Mother Teresa.

Some of these hijackers could have been hopeless psychopaths whose brains were wired in such a way that they actually got more pleasure scamming $10 from widows and orphans through elaborate Times Square shell games than by honestly earning $100 selling Bibles door-to-door.

In fact, everything needed to understand them is contained in several film noir classics.

Presumably, the only defense against such psychopaths is to isolate them before they can do too much damage. But the overwhelming majority of those assumed to have turned the free market ideal into a rip-off of the American public probably started as fundamentally decent individuals, as morally straight as church deacons.

So what turned these Boy Scouts into shameless hustlers eager to sell their mothers 10 times over for a fast buck? The answer is clear enough to anyone who’s ever been bedazzled by Billy Wilder’s corrosively breathtaking 1944 movie “Double Indemnity,” with Barbara Stanwyck’s pathologically definitive scarlet woman promising poor schnook Fred MacMurray riches and sexual ecstasies beyond his wildest dreams if he helps her with a murderous insurance scam, all while working her own angles and making her own rules. If only he would bend a few rules. Just a little. Even for a short time.

Now imagine Stanwyck is America’s free market and MacMurray is the Wall Street schmuck who should have known better.

Money and sex are hopelessly tangled in the male consciousness. So when a scarlet woman strutting in capitalism’s strapless red gown turns her wet-lipped allure loose on them and moves in close enough to fill their lungs with her dizzying perfume, what hungry Wall Street player is strong enough to resist her? Or even care when their homes and hearths and panoply of family values go rushing down the drain?

And if worse comes to worst, they can always stand up in court and plead the equivalent of Adam’s excuse when God scolded him for having eaten the forbidden fruit.

Barbara Stanwyck’s definitive portrayals of scarlet women throughout her long career make these performances especially relevant in helping us appreciate why so many men in our male-dominated society remain confused little boys who get sex and money all mixed up. They become ready prey for the allure of money and power and all too eagerly sacrifice their careers, families, and very lives for the promise of a tainted dollar.

To our good fortune, many of these classic films noir are now available on DVDs and various video-streaming services. So be on the lookout for “The Lady from Shanghai,” “The Maltese Falcon,” “Out of the Past,” “Touch of Evil,” “The Killers” and many others.

Nothing beats movies from the classic noir era when it comes to exploring the darker side of human nature and providing us with psychological insights into why so many Americans are driven to behave like schmucks. Or at least they offer some convenient and reassuring explanations. 

originally published: May 9, 2015

Billions in bonuses on Wall Street at the expense of Main Street

Seven years after the traumatic 2008 financial crisis, millions of Americans still have not recovered. But a few others are doing quite well, thank you. One of the first signs of the impending implosion in financial markets occurred in the summer of 2007, when two Bear Stearns hedge funds with major investments in mortgage-backed securities collapsed. It was the beginning of the end for the world’s fifth-largest investment bank, which, during its 90-year run, had developed a maverick reputation in the white-shoe culture of investment banking.

During the wee hours of March 24, 2008, just before Asian markets opened, the federal government forced Bear to announce its sale for a few pennies on the dollar to JPMorgan Chase, an offer that would not have been made without government assistance.

The deal was backstopped by the Federal Reserve’s commitment to buy upwards of $30 billion worth of mortgage-backed securities in Bear’s portfolio that Morgan regarded as “too toxic to touch.” It was hoped that the Bear rescue would keep any fallout from spreading into the larger financial world, which many policymakers viewed as likely following the failure of a major investment bank.

Bear’s collapse was a critical event signaling the start of a great unraveling. One of the things that made Bear’s demise such a watershed event was the federal government’s direct involvement in orchestrating the deal that saved the company from having to file for bankruptcy.

Previously, the federal government would become so intimately involved only when a deposit-taking commercial or savings bank got into financial trouble.

If they screwed up and failed? Others would learn from their mistakes. That’s what was supposed to happen under capitalism. That is until the federal government got bushwhacked by Bear, a “don’t get no respect” underdog, and found itself in a jam.

So the feds had to throw out the standard game plan, even if it meant the Federal Reserve buying $30 billion worth of mortgage-backed securities from Bear that nobody else would touch as the financial tsunami of 2008 began rolling across the globe.

Bear Stearns may have ceased to exist on March 24, 2008, but it continued to haunt the financial world like Marley’s ghost for months thereafter as the global meltdown continued, marked by formerly solid financial institutions turning into basket cases that could no longer survive on their own – after years of shooting up on short-term borrowings and boozing away on risky trades that blew up in their faces.

At the beginning of 2008, Merrill Lynch, Goldman Sachs, Morgan Stanley, Lehman Brothers and Bear were the five largest stand-alone investment banks in the world. By the end of the year all would be gone.

Goldman Sachs and Morgan Stanley were converted to bank holding companies while Lehman Brothers filed for bankruptcy and Merrill Lynch was acquired by Bank of America. These supposedly omnipotent institutions proved to be giants with feet of clay.

The financial crisis precipitated the worst economic downturn since the Great Depression, costing millions of Americans their jobs, homes, life savings and hopes for decent retirements. Since then, workers’ median incomes have effectively stayed unchanged while inequality between the top and bottom of the income scale has risen sharply.

Meanwhile, we recently learned from the New York State comptroller that Wall Street banks handed out $28.5 billion in bonuses in 2014. The average bonus was $172,860, more than three times the median household income of about $52,000. To say that anyone is surprised would be selling the truth below wholesale.

It’s reassuring to know that some folks have recovered very nicely from the financial crisis. But Main Street America will apparently have to learn to live with the wounds from the financial crisis.

originally published: March 28, 2015

Federal Reserve, Americans must listen to Carmen Segarra

To most people, the name Carmen Segarra means nothing. But to a few, her fate validates their worst suspicions about regulators who exist to protect the interests of the regulated.

Segarra is a former bank examiner whose job was to be the Federal Reserve Bank of New York’s watchdog at Goldman Sachs. The New York Fed regulates many large New York banks and is the Federal Reserve System’s primary connection to financial and credit markets. She secretly recorded 46 hours of conversations inside the Federal Reserve and Goldman Sachs and released the tapes to ProPublica and the radio show “This American Life.” You can listen to the episode online at ThisAmericanLife.org.

Segarra was fired by the Federal Reserve after seven months, apparently because she refused to budge on her findings that Reserve officials on numerous occasions seemed to treat Goldman Sachs with too much deference. In particular, she insisted based on her fact-finding that the company did not have a policy on conflicts of interest that met regulatory standards.

Her story underscores how regulators have become too cozy with the industry they are charged with policing. Academics call it “regulatory capture.”

This is hardly breaking news. Lax external oversight was among the chief reasons the world’s biggest economy was brought to the brink of depression in 2008. Put bluntly, regulators have to shoulder some of the blame for the financial apocalypse that unleashed the worst economic crisis since the Great Depression of 1929, at a galactic cost to the American taxpayer, and threw millions of Americans out of their jobs and homes. The economy still bears deep scars.

The 2008 financial crisis demonstrated more than ever that the self-regulating financial system was pure myth.

The public has come to catch the joke that on Wall Street, if you represent everyone there is no conflict of interest. Transparency and the financial services industry don’t exactly waltz around arm in arm. In fact, for some bankers transparency is an occupational hazard.

The coverage in the media since the Sept. 26 release of Carmen Segarra’s recordings of Federal Reserve officials not doing their jobs has been minimal. Hers is not a household name like Edward Snowden, who leaked classified National Security Agency information that exposed security vulnerabilities and spying on Americans and international leaders.

It may be that the public’s default mode is indifference; they would like to care but there’s just too much going on at the moment. The average American is too busy worrying about making ends meet. And after all, they already knew that banks hold regulators hostage.

Sure, Sens. Elizabeth Warren, D-Mass., and Sherrod Brown, D-Ohio, both members of the Senate Banking Committee, want Congress to investigate Goldman Sachs’ relationship with the Federal Reserve, but it’s more likely that the issue will quietly disappear.

Wall Street makes generous campaign contributions to the guardians of democracy in Washington and spends big on lobbyists to communicate its policy preferences to government apparatchiks. Despite the rosy rhetoric, that makes it highly unlikely that Congress will hold hearings.

Another problem is that many people see government regulatory jobs as stepping stones to lucrative private-sector careers. They develop useful contacts with key employees in the private-sector firms whose behavior they are supposed to regulate and quietly impress these contacts that their “hearts are in the right place.” In this culture of coziness, nothing should be taken at face value.

In the final analysis, you can write all the tough regulations you want for the financial system and its participants to prevent future financial debacles. But for those regulations to have any teeth, they must be accompanied by closing the revolving door between lavish private-sector executive suites and the basic steel-desk offices of government agencies.

originally published: October 11, 2014

Navigating a free Market (Basket) economy

The bitter clash between factions of the DeMoulas family, the major shareholders in the Market Basket supermarket chain, once again raises the issue of corporate responsibility. Is the sole responsibility of executives and boards of directors to maximize value for stockholders, or are they responsible to a broader array of stakeholders that includes customers, employees, suppliers and host communities?

In recent decades, a grand total of two options have evolved for dealing with the issue of corporate responsibility. If you believe businesses should exist unmolested, solely to serve the interests of stockholders, then the late economist Milton Friedman is your man. He was the most outspoken advocate of that view and argued that corporate social programs add to the cost of doing business. Spending money to reduce pollution, for example, makes a business less profitable.

Many management gurus counter that there is danger in focusing solely on profitability. An overzealous pursuit of stockholder returns can encourage maximizing short-term rather than long-term returns. Such an orientation leads to actions like cutting expenditures judged to be nonessential in the short term such as research and development. The resulting underinvestment jeopardizes long-term returns.

The near financial meltdown in 2008 and the subsequent Great Recession demonstrated the large and diverse group of stakeholders who are affected by companies’ actions. In the wake of this shock to free-market capitalism, the traditional view of corporate responsibility is giving way to a belief that enlightened self-interest requires a business to consider all important stakeholders when running the enterprise, not just stockholders.

Stockholders provide the business with capital, but if customers don’t get value for their money they can take their business elsewhere; employees provide labor and expect commensurate income and job satisfaction in return, or they can leave their jobs; suppliers seek dependable buyers; and local communities want firms that are responsible citizens.

To create customer value, most firms rely on a network of stakeholders. In determining company goals and strategies, executives and board members must recognize that each has justifiable reasons for expecting and often demanding that the firm take its interests into account. Family-owned businesses such as Market Basket are no different.

As Southwest Airlines founding CEO Herb Kelleher noted, the key to delivering outstanding customer service is putting employees first. “If they’re happy, satisfied, dedicated and energetic, they’ll take real good care of the customers. When the customers are happy, they come back. And that makes the shareholders happy.” At Southwest, people and profits are explicitly linked and that has accounted for outstanding profitability over several decades in a highly competitive industry.

Leaders at Market Basket and other companies don’t realize that they don’t hold all the picture cards. If they don’t reform their behavior, an angry public will do it for them by boycotting their businesses.

originally posted: August 16, 2014