In 2020, the gamified day-trading app Robinhood announced that its mission was "to democratize finance for all." "Democratize," in this case, meant opening the doors to a stock-market casino. Six years earlier, promoters were talking about crypto-trading as "democratizing Wall Street." The cryptocurrency Bitcoin "seeks to democratize currency and payments." In 2011, the résumé-matching site Monster promised to "democratize recruiting" by letting more kinds of job seekers link up with employers.
Start looking, and you will see it everywhere: promises to "democratize" advertising, design, direct marketing, medicine, whatever. Some of these barrier-lowering changes do increase users' powers in real ways. Many just intensify the vulnerability of life in the marketplace, speeding up the already relentless press of speculative bets, pushy ads, and precarious jobs, dressed up to make market vulnerability look like freedom's fun new frontier.
It isn't surprising that touts would debauch a charismatic word. In twenty-first-century America, whatever you care about will be used to try to sell you something. But marketers weren't leading the charge to change the meaning of democracy. In 2010, the Arbuckle Professor at Harvard Business School explained in the Harvard Business Review that Apple cofounder Steve Jobs "set out to democratize computing" by making it "available conveniently to the masses."
In the same year, Robert Zoellick, then the president of the World Bank and previously George W. Bush's trade ambassador, promised to democratize development economics by providing open access to the bank's databases, which included loan records and analyses of economic policies throughout the developing world. Already in 2009, the New York Times was referring to Robinhood's precursors as "democratiz[ing] investment," and in 2007 the Times explained the trend of "democratizing plastic surgery," which meant that people with household incomes under $30,000, who often lacked health insurance, were financing their cosmetic procedures with loans. After all, the paper of record pointed out, earning power follows attractiveness. "I financed my car," the Times reported one patient saying in an emblematic reflection. "Why shouldn't I finance my face?"
These are not random abuses of a word. The professional explainers, like the professional marketers, are using democracy in a way that, when you link the points on the scatter plot, adds up to "universal market participation plus some transparency." It's no surprise that Silicon Valley icon Steve Jobs pops up here, along with the Apple empire that he helped to build.
The meaning of democracy that these uses trace is basically the one that the internet optimists of the 1990s and early 2000s popularized: universal access and transparency would democratize software (through open, unencrypted code), democratize knowledge (through sites such as Wikipedia, which, it became briefly fashionable to say, was better than Encyclopaedia Britannica), democratize the news through blogs and amateur reporting, and democratize democracy itself by enabling citizens to organize online. We now know (and there were warnings at the time, if eyes were open) that actual results would include the largest monopolies in world history, a vile bloom of conspiracy theories and other "alternative" knowledge, and an online Hobbesian dystopia of warring multitudes.
This conflation of democracy with the market is not completely inapposite, but it is a slippery half-truth. The market, like democracy, theoretically (and to a considerable degree in practice) lets everyone in, gives everyone a forum for their convictions or preferences. The market, like democracy, organizes shared life partly by aggregating many dispersed perspectives and values—not by voting, as democracy does, but through purchases. It was partly on the strength of these resemblances that the egalitarian spirit in democracy—"every atom belonging to me as good belongs to you," in Walt Whitman's famous phrase—came to mean the breakdown of barriers to entering markets, of "expert" knowledge, of whatever stood in the way of consumer investors and their plans for their marginal dollars. But a thoroughgoing market order is less a version of democracy than a bizarro democracy, an opposite that, precisely through its resemblances, makes actual democracy ever more unlikely.
What did this new form of political imagination replace, and with what consequences? It grew up within, and in key ways against, the historically unusual period of relative economic equality that coincided with the heart of the Cold War, from the years after World War II until the mid-1970s. Then, as in few other times, high levels of growth were widely shared among the middle classes and many working-class people, including the large portion who were union members or worked in industries where union firms set prevailing wages.
Many prominent commentators assumed that this relative egalitarianism was a new normal. The usual parochialism of the eternal present was assisted in this assumption by the influential research of economist Simon Kuznets, whose study of tax records showed early to mid-twentieth-century inequality rising, then falling. The graphic representation of this trend, dubbed the Kuznets curve, came to be treated as a feature of mature industrial societies: severe early inequality, part of the price of rapid growth, would give way later to moderate inequality.
Yes, there were sweatshops and robber barons early on, but the factory worker's daughter would become a secretary, and his son might go to college or join a union. Confidence in this alleged historical law of economic development was strong enough that those who were vividly left out of general prosperity, notably most Black Americans and many Appalachians, were characterized as exceptions, islands that would take just a little longer to be worn away by the currents of history. Liberal economist John Kenneth Galbraith, arguing in The Affluent Society for bringing these populations into the order of well-being, treated poverty as the condition of being left out of the social bargain. This was a very different picture from today's inequality, which forms a basic, persistent feature of the American political economy.
The very idea of a social bargain made a different kind of sense then than it does today. World War II had spurred an unprecedented mobilization of labor and industrial capacity, turning countries in real, material ways into platforms of common fate and endeavor. In the United States, war mobilization came just a few years after Democratic supermajorities made the New Deal, an experience confirming as few moments in American history have that the country could recast its terms of cooperation through democratic action. A social bargain, that is, need not be just a metaphor: it might be a concrete agreement among living citizens whose fates were thoroughly entwined. (Galbraith saw this, too. He wasn't a fatalist. Rather, he thought the social bargain of the New Deal and World War II was still strong and needed expansion.)
Economic life, too, confirmed the plausibility of a social bargain. The basic strategy of industrial peace in those decades was collective bargaining between unions and management. The two sides of the negotiation needed each other in part because of the technological nature of production. A factory, and an industry made up of a regional chain of factories, mines, and mills, threw people together in a system that was vulnerable to strikes, which meant each side of the bargain could issue a real threat. Today's dispersed supply chains evade this kind of worker power, at the same time that labor law gives unions fewer opportunities to push for collective bargaining.
At quite a different scale, the global economic architecture that World War II's victors composed at the Bretton Woods conference in 1944 (before they had even won the war) gave national governments considerable scope to set their own levels of spending and debt, stimulate or tamp down growth, and make distributional decisions, while capital remained substantially within national borders.
Workers and bosses in a country were, to some meaningful extent, caught together like those in a factory. What was a country, in these decades? Among other things, it was a place where people made things together and so depended on one another, a fact that gave the makers power. Neither that reality nor the World War II mobilization can have been far from the mind of Pacific Theater veteran and eminent political philosopher John Rawls when he described the just society, in his landmark 1971 work A Theory of Justice, as a fair scheme of cooperation.
Rawls's theory of justice shaped decades of subsequent political philosophy and came to occupy a place in the broader world of law, politics, and policy that contemporary philosophers very seldom achieve. Rawls famously asked which social world a person would choose if they could not know where they would fall in its hierarchies. He answered that a just economy was one that ensured the greatest possible benefit to the worst off (limited by personal freedoms of the due-process, free-conscience, and free-speech variety).
Both the question and the answer gave systematic expression to a robust liberal egalitarianism that was the leading politics of the decades when Rawls developed his thought, enshrined in civil rights law and the program of public investment and social support that President Lyndon Johnson called the Great Society. Rawls described his approach as an update of Immanuel Kant's ethics and an improvement on the social-contract theories of John Locke and Jean-Jacques Rousseau, but intellectual historian and political theorist Katrina Forrester has shown that he also drew pivotally on "ordinary language philosophy," seeking to identify implicit but powerful shared meanings in everyday life. He appealed, that is, to an unspoken consensus about basic ideas of fairness, which could be drawn out systematically, rather as one might turn a few basic points about circles and angles into a system of geometry.
____________________________________
Excerpted from Two Cheers for Politics: Why Democracy Is Flawed, Frightening—and Our Best Hope by Jedediah Purdy. Copyright © 2022. Available from Basic Books, an imprint of Hachette Book Group, Inc.