Against unconstrained individualism
For American individualism to thrive, we must rediscover its limits.
Written by Grayson Slover.
There is no trait more synonymous with American culture than individualism. The United States was the first nation in human history to be founded on the concept—derived from a mix of Enlightenment philosophy and Protestant Christianity—that the individual, rather than the group to which they belong, is the fundamental unit of a free society. The self-made man or woman, whose hard work and perseverance enable them to succeed, is the archetypal American.
America’s individualist ethos has been one of its greatest assets throughout its history. Individual freedom is the primary driver of innovation of all kinds, from new inventions to novel forms of artistic expression, and is the basis for the private property rights that protect these innovations. The American conception of the sanctity of the individual is the basis for most of international human rights law. This ethos is also a key reason why America has been able to attract and integrate immigrants from all over the world.
Simply put, America would not be America without its individualist ethos. Yet, for most of American history, this ethos has coexisted with communitarian elements that served as necessary countervailing forces. America’s Founding Fathers—as well as the philosophical founders of classical liberalism, such as John Locke and Adam Smith—took for granted the existence of many of these forces and couldn’t comprehend a society in which they didn’t exist.
Over time, however, these moderating forces have withered, and today they exert a negligible influence on American life. Without them, America’s individualist ethos has grown to occupy an unmanageable and corrosive prominence in our national culture. The result is unconstrained individualism, an individualism hostile to the notion that constraints on individual freedom are ever justified or useful—and it has already had deleterious effects on us.
The first clear signs of the shift to unconstrained individualism appeared in the mid-1960s. As Robert Putnam argued in his 2000 book Bowling Alone, and as Jean Twenge has argued more recently in Generations, the movements of the late 1960s and early 1970s to liberate individuals from all forms of social constraint resulted in a broad decline in community involvement and social trust. This decline has continued and gained momentum in the years since, as individualism has become more radical and less tolerant of any restraint.
According to unconstrained individualism, all restrictions on the individual are presumed unjust from the outset: the main purpose of a society is to create the conditions in which individuals can freely choose how to pursue their idea of a good life. Many constraints on individual freedom throughout history have indeed been very oppressive, and individualism gives us tools to identify and eliminate those oppressive constraints. However, some of those constraints are not only beneficial, but essential to maintaining the delicate web of social relations that give our lives meaning and allow our liberal democracy to thrive.
Reinvigorating Our Institutions
Societies are not simply collections of individuals, compelled by self-interest to work together and live in harmony. For humans to live together in large groups, they must believe they are a “people” with at least a moderate degree of shared culture, values, and purpose. In his excellent book The WEIRDest People in the World: How the West Became Psychologically Peculiar and Particularly Prosperous, Harvard anthropologist Joseph Henrich explains how “human societies…are stitched together by culturally transmitted social norms that cluster into institutions.” Norms, he writes, are
“…stable, self-reinforcing sets of culturally-learned and interlocking beliefs, practices, and motivations that arise as people learn from each other and interact over generations. Norms create social rules or standards that prescribe, forbid, or sometimes endorse some set of actions. These actions are incentivized and sustained by their reputational consequences—by the evaluations and reactions of the community…”
In other words, institutions, composed of social norms, hold societies together. In modern America, institutions can include any social structure that mediates relations between individuals: from schools and private associations to Congress and the Supreme Court. Every institution is different, but one feature they all share is that they inherently set constraints on individual freedom. Even more controversially (at least to modern sensibilities), institutions tell us where to draw the lines between an “in” group and an “out” group, between those who accept and adhere to the social norms and those who do not. At first glance, this might appear to promote the sort of tribal thinking that tears societies apart; and if a society’s social norms are unjust or extreme, or allow no possibility of atonement when violated, that is in fact what can happen. But we human beings, no matter how enlightened, cannot help dividing ourselves into tribes of “us” and “them.” What makes institutions so essential is that they give us the ability to take this intrinsic and often pernicious feature of human nature and direct it in ways that are healthy and constructive.
Unconstrained individualism promotes the quixotic notion that individuals, indeed whole societies of individuals, can learn to reject the tribal mindset entirely. This, presumably, is why the advocates of unconstrained individualism fail to see the severe dangers of attacking or neglecting society’s institutions. In their view, institutions are an oppressive force, or at best a redundancy, since enlightened individuals can reliably recognize and navigate around the parts of human nature that aren’t conducive to a free, democratic society.
Every society in human history has faced the problem of weighing individual freedom against institutional authority, and practically all of them have heavily favored institutions. America’s individualist ethos makes us an outlier in our general aversion to institutional constraints on individual freedom. The dialectic between the competing forces of institutional authority and individual freedom—with the latter taking a moderate precedence over the former—is a key part of what has made America so exceptionally successful. But for that dialectic to function, each side has to be granted its due respect, and unconstrained individualism prevents us from acknowledging that institutions have any value at all.
In his book A Time to Build, Yuval Levin illuminates the purpose of institutions, in contrast to the prevailing view in America today that our institutions are simply the sum of the autonomous individuals who populate them. He contends that the relationship between institutions and individuals is meant to be a two-way street: individuals change the institutions they populate, but our institutions are also supposed to shape us as individuals. Healthy institutions are formative: they act on us as individuals and help us shape our individual identities.
Levin convincingly argues that our institutions today have become performative, mere stages upon which individuals can promote themselves and their own interests. We see Congress continually fail to pass even the most essential legislation; we see the New York Times reject basic journalistic standards to placate censorious leftist staff at the paper; we see prestigious universities such as Harvard investigating protected speech as “microaggressions” while tolerating chants calling for the genocide of Jews. These and many other examples we could cite are not primarily an indictment of the individuals who happen to be working in these institutions (though the failings of individuals are often a relevant part of the problem). Rather, they are a sign that the incentives that once drove individuals to make decisions with the long-term survival of the institution in mind have been superseded by the incentive for individuals to use their position to promote their own following or prestige.
These performative incentives have been supercharged in the age of smartphones and social media. While formative institutions require individuals to sublimate their personalities and talents to a purpose greater than the self, social media is an “anti-institution,” in Levin’s words, because it rewards us for indulging the worst aspects of unconstrained individualism. It replaces our role as part of an institution—which “describes an obligation shaped by constraints”—with the role of a performer acting on a stage. It not only excuses but requires us to think selfishly, to constantly worry—consciously or not—over how others rate us in the virtual world rather than spend that time and attention bettering our communities in the real world.
“In order to be free,” Levin writes, “we need more than just to be liberated…We need to be formed for freedom—given the tools of judgment and character and habit to use our freedom responsibly and effectively.” In other words, the formative role of institutions, which constrain individual freedom, is in fact a prerequisite for the viability of individual freedom on a societal level. Unconstrained individualism assumes that the individual, left to his or her own devices, will naturally gravitate toward and adhere to the norms and ethics required to maintain a liberal democratic society. Formative institutions, by contrast, are based on the understanding that human beings are naturally flawed creatures, inclined to behave in antisocial and unethical ways when it suits us individually, who can live together in relative peace and harmony only through the careful cultivation and maintenance of social norms and the constructive individual incentives that result.
It is true that many of our institutions—political institutions, media, and higher education in particular—have given us ample reason to distrust them in recent years. People should not be expected to recommit to institutions that haven’t done the work to maintain or regain their trust. But much of this distrust is rooted not in the particular failings of these institutions, but rather in unconstrained individualism and the feeling of disconnectedness that stems from it. If institutions are nothing more than the sum of the individuals who happen to populate them at any given time, then the people who aren’t insiders will feel little connection to them. On the other hand, when our institutions are widely understood to have a formative role on the individual, to require individual members to sublimate their personal ambitions to forward the institution’s goals, the broader public will be more inclined to trust that institution and feel solidarity with its members.
The most important institution of all is the family. From our first moment on earth, our family forms who we are on a far deeper level than any other institution. But the incentives of unconstrained individualism are hostile to stable families. In his 2002 book The Marriage Problem, political scientist James Q. Wilson warned that the “cultural, religious, and legal doctrines” that had previously incentivized healthy marriages had mostly disappeared, and that history suggested the remaining incentives—romantic attraction and child care—would not be sufficient on their own at the societal level.
The enactment of “no-fault” divorce laws—which for the first time in history made marriages “more or less terminable at will” rather than terminable only for a few serious causes—provides a clear example of the damaging effects of unconstrained individualism. As Wilson explains, these laws were enacted in many Western European countries and by “almost all American states between 1969 and 1985.” Predictably, rates of divorce and single motherhood skyrocketed soon after.
Centuries ago, marriage constituted a sacrament before God, which implied all of the associated penalties if one should break this sacrament. Then it became a secular-but-still-serious lifelong contract. Today, Wilson explains, marriage for many is “a wager that two people enter into because of sexual attraction and personal friendship (sometimes accompanied by the existence of a fetus), a wager that in the future things will work out for the best. But if things don’t work out, and often they do not, then the bet has been lost and each party is free to end the attachment.”
Divorce rates have declined significantly since the 1990s, but as Jean Twenge points out, this is because fewer young people are getting married in the first place. The Millennials, she writes, “are the first generation in American history in which the majority of 25-to-39-year-olds are not married.” And Gen Z appears even less interested in marriage, or even in having a “lifelong romantic partner.” This is another example of how the destructive trends of unconstrained individualism are magnified by modern technology. When young men have constant access to a virtually endless library of high-definition pornography in their pocket—and, for the better-looking and less lethargic ones, hookup apps like Tinder—it’s no mystery why most of them aren’t interested in putting in the effort to build a long-term relationship. Young women who have grown up in a culture that commodifies sex and intimate relationships in this way won’t be any more eager to put themselves out there.
The norms that incentivized marriage were successful because they imposed shame on individuals who transgressed them. For example, women and men who had children out of wedlock incurred social penalties, which had two obvious social benefits: 1) many men and women were deterred from having sex and bringing children into the world outside of a stable family, and 2) those who got pregnant before marriage were incentivized to marry to avoid social ostracism. This is certainly not to say that the marriage and sexual norms of the past were an unmitigated good—indeed, many of them were appallingly callous and oppressive, disproportionately towards women. But it is also true that shame is probably a necessary ingredient of any viable social norm. As Wilson put it:
“Shame is the inner sense that one has violated an important moral or social rule. Shame once inhibited women from having children without marrying and men from abandoning wives for trophy alternatives. Today it does much less of either. We wrongly suppose, I think, that shame is the enemy of personal emancipation when in fact an emancipated man or woman is one for whom inner control is sufficiently powerful to produce inner limits on actions that once were controlled by external forces.”
People who are only motivated by self-interest can’t be convinced to sacrifice for any greater good. Take for example the desire “to be a great father,” which is almost universally lauded as a virtue. Many positive words could be used to describe great fathers and would likely resonate with most people irrespective of ideology or political affiliation. But it would be quite strange to call the feeling of being a great father “liberating.” The highest good in unconstrained individualism is the liberation of the individual, but being a great father is respectable precisely because it implies a man’s sacrifice of his individual freedom for something more important.
More young adults than ever before are saying they don’t wish to have children—and for the very reasons we’d expect in a culture of unconstrained individualism. As Jean Twenge notes: “When younger adults who don’t want children are asked why, the majority in national polls name not financial issues or climate change but reasons centered on individualism, such as the desire for more leisure time, wanting more personal independence, and the choice-based, matter-of-fact ‘I just don’t want them.’” If unconstrained individualism remains unchallenged, it isn’t unreasonable to worry about population collapse, as more and more people decide to follow the dominant incentive and place personal freedom and desire above everything else. A society of atomized individuals is a society without a future—the future to the self-interested individual is over once they’ve taken their last breath.
Revitalizing Our National Identity
The health of our institutions is inseparable from a strong national identity. An inclusive and compelling national identity gives our institutions a clear, overarching purpose, which makes us more open to sacrificing parts of our individuality to their formative role. Many advocates of unconstrained individualism argue that in order to fully commit to the project of individual self-discovery and fulfillment, we must forgo all attachments to group identity, including national identity. But group identity is an ineradicable part of human nature and an inevitability of human life. The question is which types of group identity we want to promote and which we want to discourage.
For most of American history, our national identity was based in large part on White Anglo-Saxon Protestant (WASP) identity. Thankfully this is no longer the case, but it is understandable why it used to be. Religion and race/ethnicity are historically the most powerful forces for creating a national identity—even today, America is an outlier globally in its insistence that to become an American you need not have been born in a certain place or believe in a specific higher power.
National identity is integral not only to the survival of America, but to the survival of any liberal democracy. As Francis Fukuyama explains in his book Identity: The Demand for Dignity and the Politics of Resentment:
“The final function of national identity is to make possible liberal democracy itself. A liberal democracy is an implicit contract between citizens and their government, and among the citizens themselves, under which they give up certain rights in order that the government protects other rights that are more basic and important. National identity is built around the legitimacy of this contract; if citizens do not believe they are part of the same polity, the system will not function.”
Fukuyama adds that national identity must also be supported by citizens who are in some measure “irrationally attached to the ideas of constitutional government and human equality through feelings of pride and patriotism” (emphasis added). In other words, a national identity must be more than just a self-interested cost-benefit calculation by individuals that their own lives are better in this country rather than another. It must be rooted in deeper sentiments—sentiments that unconstrained individualism views with ambivalence and skepticism.
The necessity of these deeper sentiments is shown clearly in relation to our military and national defense. A YouGov poll in February of this year found that only 16% of Americans would volunteer for military service if “a new world war broke out and the U.S. was under imminent threat of invasion.” This finding is profoundly discouraging, but perhaps it should come as no surprise. A culture that preaches individual fulfillment as the highest good will unavoidably seem hypocritical when it asks citizens to risk the ultimate sacrifice of individual freedom: their lives. As Irving Kristol wrote in 1973, “No merely utilitarian definition of civic loyalty is going to convince anyone that it makes sense for him to die for his country. In actual fact, it has been the secular myth of nationalism which, for the past century and a half, has provided this rationale.” But nationalism—even in its healthy, inclusive form—is incompatible with unconstrained individualism, being a type of group identity that requires the subordination of self-interest.
As with institutions, inherent to the concept of national identity is the distinction between “us” and “them.” In order for people to strongly identify with being an American, to be an American has to mean something importantly different from being a Croatian or a Nepali or a Nigerian, and, to an extent, we have to believe that those important differences are in fact superiorities. Since Americans have rightly rejected an ethno-religious national identity, they must believe—in some ways, as Fukuyama would say, irrationally—that their culture and way of life is not only good, but better than the alternatives around the world.
Rather than promote a strong national identity that is inclusive of America’s diversity, much of our society today rewards a divisive and antagonistic conception of diversity that frames America as a battleground between “oppressor” ethnic groups (white people) and “oppressed” ethnic groups (non-white people). This ideology is often called “wokeness” or, the term I prefer, “critical social justice.”
It is commonly argued that the ascendance of critical social justice ideology, which places a quasi-sacred importance on various “oppressed” identity categories, suggests that the real issue today is not an excess of individualism but a resurgence of collectivism. But this is a misinterpretation of critical social justice ideology and the context in which it arose. The proliferation of new identity groups associated with the ideology is not due to a lack of space for individualism; such a proliferation is only possible because our rejection of national identity leaves a vacuum that will inevitably be filled with something else. Critical social justice is a visceral and intemperate reaction to the unconstrained individualist ethos, which forces people to walk the daunting path of creating an individual identity on their own, without the influence of the national identity and formative institutions that guided previous generations.
The ideology of critical social justice may be collectivist in nature; its advocates may even (and often do) openly praise totalitarian collectivist movements. But it has gained such prominence among younger people primarily because they grew up in a time when America seemed to have no national identity and only performative institutions, and thus they were left to look elsewhere for something that would satisfy their deeply ingrained impulse to identify with a larger cause.
Like every radical movement, the critical social justice movement has at its core a contingent of true believers in the ideology—mostly professors, professional activists, and university students who hope to enter those fields. But outside of that core, the vast majority of those who identify with critical social justice do so for plainly individualistic reasons. As Yascha Mounk observes in his recent book The Identity Trap, critical social justice ideology was only able to go mainstream after the advent of modern social media in the early 2010s. Like everything else on social media, critical social justice became popular mostly because it was a useful marketing tool for individuals to promote themselves.
The post-liberal Right’s reaction to unconstrained individualism is no less misplaced. Its proponents correctly understand that the liberal individualist ethos on its own is not a durable basis for a healthy society. But rather than advocate for the restoration of the counterweights of formative institutions and unifying national identity that made liberal democracy sustainable, they insist that the whole liberal project be replaced by some form of Christian authoritarianism. They accurately point out how unconstrained individualism ignores the deep influence of Christianity on America’s founding values, but they make the opposite mistake of ignoring the centrality of liberal individualism to America’s identity. Instead of rebalancing the dialectic between individual freedom and institutional power, they would return us to the norm of human history before America and before the Enlightenment: a society that forcibly imposes a rigid moral structure on its people without any recognition of individual human rights.
Collective identity is inevitable. We can promote an identity that unifies us as a nation and motivates those “deeper sentiments” that are necessary for true patriotism and civic duty. Or we can continue to pretend that America is merely a collection of autonomous, freely-choosing individuals, while an ever-growing list of divisive and illiberal identitarian movements tears our country apart.
Towards a Better Approach to Individualism
In his bestselling book The Anxious Generation, social psychologist Jonathan Haidt uses the term anomie—“normlessness,” coined by the French sociologist Émile Durkheim in 1897—to describe what has happened to Gen Z since the “Great Rewiring of childhood” began in the early 2010s, when teenagers started to spend less time in the real world and more time in the virtual world of social media, video games, and porn. As an older member of Gen Z, I find Haidt’s thesis very compelling. Today’s virtual world provides the individual a synthetic, degraded version of the fundamental human needs—friendship, competition, sex—that until a decade ago were attainable only through face-to-face social interaction, often mediated through institutions. We are in the midst of “entertaining ourselves to death,” as Sam Harris has observed, and we are doing so in physical solitude while our nation and civilization are left to crumble around us.
But the relatively recent onset of the phone-based society is only part of the story. Today’s general feeling of anomie—not only among Gen Z, I’d argue, but among most Americans—is the culmination of disintegrating forces that began 60 years ago. The Great Rewiring of our social fabric is a cliff at the end of a long downward slope. The movements of the 1960s sought to liberate the individual—not just from clearly oppressive forces like racial segregation, but ultimately from every social constraint on doing whatever one wants. Today we see all too well how successful they were.
The social decay we witness all around us—mass mental illness; the breakdown of trust and social capital; skyrocketing rates of every form of addiction; the normalization of what Christopher Lasch called “the culture of narcissism”; the disappearance of virtues like humility, duty, and charity from public life—is ultimately due not to tech oligarchs, wokeism, income inequality, or globalization, but to our having sleepwalked into the foolish belief that the enlightened individual has no need for antiquities like edifying institutions or a unifying national identity. This explanation might be uncomfortable for us to face, for it forces us to admit that we might need constraints on our desires—even when those constraints do not seem necessary, fair, or reasonable by unconstrained individualist standards. But if we want to save our nation, and the values it was founded upon, including the individualist ethos, we might not have a choice.
Grayson Slover is a freelance writer and the author of Middle Country: An American Student Visits China’s Uyghur Prison-State. Most recently, he was a Policy Analyst at the Foundation Against Intolerance and Racism (FAIR) and the Managing Editor at FAIR Substack.