Wolves of Wall Street: Financialization and American Inequality

It’s no secret by now that the recent spike in American inequality, and the gains rapidly accruing to the wealthy, are driven in large part by “financialization.” Over the last generation, financial services have expanded not with economic growth, but with stagnation and crisis—and their spectacular rise has accounted for about half of the decline in labor’s share of national income. How did things get this bad?

(Antonio Morales Garcia/Flickr)

This series is adapted from Growing Apart: A Political History of American Inequality, a resource developed for the Project on Inequality and the Common Good at the Institute for Policy Studies and inequality.org. It is presented in nine parts. The introduction laid out the basic dimensions of American inequality and examined some of the usual explanatory suspects. The political explanation for American inequality is developed through chapters looking in turn at labor relations, the minimum wage and labor standards, job-based benefits, social policy, taxes, financialization, executive pay, and macroeconomic policy. Previous installments in this series can be found here.

It’s no secret by now that the recent spike in American inequality, and the gains rapidly accruing to those at the upper end of the income ladder, are driven in large part by “financialization”—the growing scale and profitability of the financial sector relative to the rest of the economy, and the loosening regulation of its practices and returns. The success or failure of the financial sector has a disproportionate impact on the rest of the economy, especially when the combination of too much speculation and too little regulation starts inflating and bursting bubbles. And its returns flow almost exclusively to high earners. An overcharged finance sector, in other words, breeds inequality when it succeeds and when it fails.

A Short History of American Finance
Across the modern era, key moments of economic growth—the railroad and heavy industry development of the 1890s, the advent of electricity and automobiles in the 1920s and 1930s, and the IT boom of the 1990s—have been accompanied by parallel innovations in financial services. Each of these eras, in turn, was punctuated by a crisis in which speculation in new financial instruments, over-exuberance about their prospects, or outright chicanery turned boom into bust. The railroad boom of the nineteenth century yielded a wildly unregulated market for railroad securities and a series of market collapses. The emergence of a consumer-goods economy in the early decades of the twentieth century transformed both corporate finance and consumer credit and helped tip the country into the Great Depression.

Although the public stake in a stable and efficient financial system was readily apparent by the turn of the twentieth century, regulatory oversight and standards developed quite slowly—in large part because of the relative weakness of the federal government. The Federal Reserve Act of 1913 established a central banking system based on twelve district banks and a national board with the power to buy and sell government securities, to alter “reserve” requirements (the share of deposits commercial banks are required to have on hand), and to adjust the interest rates charged to member banks. In this respect, the Federal Reserve had considerable clout in monetary policy, but a light regulatory hand. Indeed, much of this regulatory authority—little exercised—remained with the states, where most banks were chartered.

The fragility of this system was exposed in the late 1920s, an era in which the Federal Reserve abetted stock market speculation through loose monetary policy, exacerbated the subsequent crisis by responding with a tight monetary policy, and proved unwilling or unable to stem the run of bank failures. This led to a number of fairly dramatic changes—designed in part to regulate the speculative fever that led to the crash, in part to restore confidence and stability to the banking system, and in part to bridge the gap between the economy’s ability to produce and its ability to consume.

As part of its response to the Great Depression, the New Deal erected sweeping and systematic new regulations of commercial and investment banking. The Glass-Steagall Act (1933) regulated interest rates (building in a small rate advantage for savings and loans), established deposit insurance, and built a wall between commercial and investment banking by prohibiting the former from “engaging principally” in non-banking activities like securities or insurance. The new Securities and Exchange Commission (SEC) brought similar oversight to Wall Street—primarily through new disclosure rules governing publicly traded companies. And the establishment of federal mortgage insurance, under the auspices of the Home Owners’ Loan Corporation and then the Federal Housing Administration, leveraged new home loans in exchange for new federal underwriting guidelines.

The idea, in all of this, was that finance was a sort of public utility—a tightly regulated service to the rest of the economy. And indeed, the financial sector served (and serves) a number of crucial functions: matching borrowers with lenders, managing the risk associated with borrowing and lending, and providing liquidity to the economy as a whole. But even at the peak of the New Deal regulatory push, the federal government did little to ensure that these services would be provided responsibly. The reforms instead created a patchwork of regulatory authority, an alphabet soup of stand-alone agencies organized by the kind of financial activity or the kind of institution they were responsible for. It was, as Senator William Proxmire famously characterized it, “the most bizarre and tangled financial regulatory system in the world.” While New Deal banking laws changed the landscape of financial regulation, they also created jurisdictional confusion and encouraged a deferential approach to banks and bankers on the part of competing agencies.

Bit by bit, this regulatory framework came under attack. Large banks began pushing against the confines of Glass-Steagall in the 1960s and made substantial deregulatory progress in the 1970s. In 1978, a successful legal challenge to state usury laws led to dramatic growth in the credit card industry, much of it headquartered in the high-interest havens of Delaware and South Dakota. In response to the persistent inflation of the 1970s, investors fled conventional interest-bearing accounts for more lucrative (and lightly regulated) alternatives, including money market accounts and venture capital or hedge funds. The loosening of pension regulation, meanwhile, created new markets for speculation—and the capital to feed them.

In 1980, the Carter Administration set in motion the deregulation of interest rates at conventional banks, a process that would unfold over six years. Between 1980 and 1985, deregulatory reforms reduced or eliminated net-worth requirements, accounting standards, and loan-to-value limits at savings and loan institutions—resulting, in short order, in widespread failures and a sprawling $201 billion bailout.

The savings and loan debacle did little to stem the deregulatory fever, and lawmakers spent the next decade mopping up that mess with one hand while hacking away at the remnants of Glass-Steagall with the other. Between 1986 and 1996, regulators gradually expanded the share of revenues that commercial banks could derive from investments. In 1999, the Financial Services Modernization Act (better known as Gramm-Leach-Bliley) wiped away the last vestiges of Glass-Steagall. By this time, much of the country’s financial activity had drifted to a “shadow banking system” of derivatives and securities, many of which were expressly exempted from regulation by the Commodity Futures Modernization Act of 2000. These new instruments, in turn, fed speculative growth: the ability to securitize (chop up and re-sell) loans and mortgages turned the attention of lenders from quality to quantity. Even loans originating in conventional banks soon ended up, elaborately sliced and repackaged, as securities or derivatives.

One result was a dramatic rise in the financial sector’s share of employment, economic activity, and corporate profits [see graphic above]. Finance accounted for about 10 percent of gross domestic product (value-added) at the end of World War II. This grew to about 15 percent at the end of the 1970s, and to over 20 percent by 2010. Growth was fueled not only by the repeal of Glass-Steagall but by financial globalization and speculative innovation during an era of inflationary and regulatory uncertainty. Finance swelled, in real and relative terms, as the rest of the economy weakened.

Another consequence was the dramatic rise of unregulated “shadow” banking institutions. For most of the post-1945 era (through the late 1970s), assets in conventional banks exceeded those in markets for securities and derivatives by more than five to one. By 1989, the ratio had fallen to two to one. By 1999, assets in the shadow banking system exceeded those in conventional banks. Today, most credit in the U.S. economy is managed through non-bank institutions (finance companies, mortgage pools, brokers) that are both lightly regulated and highly leveraged (that is, they carry a high ratio of debt to equity).
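To see why high leverage matters, consider a stylized example (the numbers here are hypothetical, chosen for illustration rather than drawn from any of the sources above). An institution funding $100 of assets with $4 of equity and $96 of debt carries a debt-to-equity ratio of 24 to 1; a drop of just 4 percent in the value of its assets wipes out its equity entirely:

\[
\text{leverage} = \frac{\text{debt}}{\text{equity}} = \frac{\$96}{\$4} = 24{:}1,
\qquad
\$100 \times 4\% = \$4 = \text{the entire equity cushion}.
\]

A less leveraged institution, say one at 10 to 1, could absorb the same shock with equity to spare. That arithmetic is what makes a lightly regulated, highly leveraged shadow banking system so fragile.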

The problem with all this growth is that, in a sector like finance, bigger is not necessarily better—and may be much worse. Finance makes up a much bigger share of the economy now than at any other time in our history, and it is markedly less efficient. It is far bigger than it needs to be to provide the intermediary services demanded by today’s economy; indeed, the pattern over the last generation has been for financial services to expand not with economic growth, but with stagnation and crisis. What has grown is not the demand for credit or intermediation but the sheer volume of trading in stocks, securities, and foreign exchange. This speculation promises big returns—at a risk dampened by the implicit promise or expectation of a bailout should things turn south.

American Finance and American Inequality
The rise of the financial sector has fed inequality in a number of ways. First, the disproportionate growth of finance diverts income from labor (wages and salaries) to capital. Indeed, recent work by the International Labour Organization suggests that financialization accounts for about half of the decline in labor’s share of national income (in the United States and elsewhere) since 1970.

But even more important than the slow siphoning off of labor’s share is the widening inequality within that share, as top earners pull away from the rest of the pack. Increased employment in finance has been accompanied by rapidly rising compensation in the sector, from about $20,000 per year per employee (including secretaries and clerks) in 1980 to nearly $100,000 today. This is, of course, most pronounced at the top of the income spectrum. In 2004, by one estimate, the combined income of the top twenty-five hedge fund managers exceeded the combined income of all five hundred CEOs of the Standard and Poor’s 500. The number of Wall Street investors earning more than $100 million a year was nine times the number of public-company executives earning that amount. About 14 percent of the “1 percent” are employed in finance, a share that has doubled since 1979.

Financial professionals at the top had substantially faster income growth than almost any other profession: between 1979 and 2005, the share of national income (including capital gains) going to the richest 1 percent doubled (from about 10 percent to about 20 percent), while the share going to finance professionals within that group more than tripled (from less than 1 percent to over 3 percent)—a gap that is even wider for the top 0.1 percent. This is the cutting edge of a “winner-take-all” economy marked by stagnant insecurity for most and unfathomable rewards for the very few. And, of course, this dispersion is further skewed by a tax system in which much of this compensation is taxed at the capital gains rate—to the chagrin of Warren Buffett and the embarrassment of Mitt Romney.

The disproportionate growth of finance not only skews incomes but robs the rest of the economy of resources, skilled workers, and the capacity to grow. As a purely intermediary sector, finance should encourage or sustain growth in the rest of the economy, but as a bloated and speculative sector it can actually do more damage than good—distorting the allocation of capital, slowing the rate of new business creation, and sapping other sectors of technological and entrepreneurial talent. Beyond a certain point, growth in the financial sector actually impedes overall growth, and rapid growth in the financial sector—when boom gives way to bust—can be especially destructive.

In turn, the financial sector is actively involved in the buying, selling, merging, and dismantling of firms—a process of restructuring that invariably benefits one set of stakeholders (managers, shareholders, private equity) at the expense of others (workers and their communities). Such restructuring has often involved the abrogation of collective bargaining agreements, wage concessions, the dissolution of pension funds, layoffs, and even bankruptcy. In this sense, the financial sector is truly detached from the rest of the economy, less interested in providing it liquidity or credit than in stripping it of its assets.

Whatever impact such activity has on business creation or growth, it virtually guarantees that the new economy will offer less social value (decent wages, economic security) than the one it displaced. The trajectories of manufacturing and finance across the modern era drive home this point [see graphic above]. Manufacturing is a labor-intensive sector involved in the production of real goods: it has historically been a source of good “living wage” jobs and an engine of productivity growth across the economy. Finance is a capital-intensive sector whose contributions to employment and economic growth are less clear. Indeed, according to one recent review of the literature, each incremental increase in financialization is associated with deeper inequality, slower growth, and higher unemployment.

Financial growth and deregulation have also led to dramatic increases in household debt, effectively hardening and exaggerating underlying patterns of inequality [see graphic below]. The basic pattern here is not hard to discern, and can be found in the years leading to the crash of 1929 and to the onset of the recession in 2007. In both eras, income inequality widened (the income share of the top 5 percent grew from 24 to 34 percent between 1920 and 1928, and from 22 to 34 percent between 1983 and 2007) and the ratio of household debt to GDP almost doubled. For much of the postwar era, top earners paid the rest of us wages. Since the mid-1970s they have paid us much less in wages—and lent us money to make up the shortfall.

At the same time, the simple provision of credit was gradually displaced by the drive to create novel forms of credit in order to meet the seemingly insatiable market for securities and derivatives. This, alongside the slow collapse of regulatory oversight in consumer and mortgage lending, encouraged and sustained predatory lending (high-interest credit cards, payday loans, subprime mortgages) as lenders scraped the barrel for more borrowers. This pattern, especially in racially and economically segregated urban settings where credit had long been scarce, not only preyed upon the poor but actually made them poorer.

Perhaps more importantly, growing inequality feeds a vicious cycle in which the fortunes of ordinary Americans slip and they are compelled to borrow more to maintain their standard of living—indeed, we can view the rapid expansion of credit card markets in the 1980s as a direct and intentional response to wage and income stagnation. This feeds inequality from both ends: as the poor and middle class borrow to get by, they inevitably borrow from the rich—whose assets, in turn, are increasingly backed by loans that paper over inequality in the short term but exaggerate it in the long term.

This mechanism, by which high-income households recycle their gains back to the rest of us in the form of loans, is spectacularly unsustainable. Across the twentieth century [see graphic below], most recently in the lead-up to 2007, slow wage growth and weak aggregate demand have been accompanied, and compensated for, by speculative lending. As long as the wages and incomes of ordinary Americans stagnate, the underlying debt becomes more fragile—contributing to financial crisis and panic. To add insult to injury, top income shares climbed as banks and other financial institutions failed.

Inequality is fed by both the speculative bubble and the conditions under which it bursts, because financial firms hoard most of the gains from risky investments but bear few of the costs. In part, this reflects the industry’s compensation and reward structure, which generates lucrative fees, bonuses, and dividends in good years—and only slightly less lucrative returns in bad years. It also reflects the peculiar pattern of private gain and socialized risk—“an unholy dynamic,” as Tyler Cowen observes, “of short-term trading and investing, backed up by bailouts and risk reduction from the government and the Federal Reserve.” By one estimate, the housing crash alone wiped out $3.4 trillion in real-estate wealth, and cost the average American household nearly $5,800 in lost income.
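A rough back-of-the-envelope calculation suggests the scale of that wealth loss per household (the household count here is an approximation of my own, not a figure from the estimate itself; the United States had on the order of 115 million households at the time):

\[
\frac{\$3.4\ \text{trillion}}{115\ \text{million households}} \approx \$29{,}600\ \text{per household in lost wealth},
\]

an order of magnitude larger than the per-household loss in income.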

American Finance in International Perspective
In many respects, this is an international story—driven by the globalization of capital markets and by a common deregulatory pattern across national settings. But, in other respects, the American setting is unique. The United States has seen faster growth in the size and share of the financial sector than its peers, and much faster wage growth within the finance sector. The result is a vicious cycle: aggressive deregulation in the United States attracts both global capital (seeking quick and reliable returns) and entrepreneurial talent (for whom the lack of rules invites creativity or chicanery). Much of that creativity, in turn, is devoted to winning further deregulation.

In this sense, financialization sustains inequality in other, less direct, ways. The inflow of global capital inflates the value of the American dollar. This makes U.S. exports artificially expensive and those produced abroad unusually cheap—a combination that has hammered sectors (most prominently manufacturing) that once sustained good middle-class jobs. And, as deregulation and speculative bubbles ratchet up net compensation in the finance sector, they also ratchet up its political clout—and opposition to a range of policies that promise anything better.


Colin Gordon is a professor of history at the University of Iowa. He writes widely on the history of American public policy and is the author, most recently, of Growing Apart: A Political History of American Inequality.