Few elections in the Western world offer the spectacle that is an American presidential race. While Brits tend to keep a stiff upper lip when jockeying for position, and Canadians prefer the power of positive thinking in determining just who runs the show, every four years the United States of America offers up the kind of no-holds-barred political brawl fit to make a reality show producer green with envy. Will the Bernie Bros defeat the Hillary Sisterhood? Will Black Lives Matter crush Trump? Or will the Trumpeters pummel the activists? Whatever the outcome, the number of breakdowns, collapses, thrashings, annihilations, massacres, meltdowns, and, of course, Armageddons in any American presidential race is truly impressive. And while many people find the pugnacious process of determining just who will lead their country to be at times both embarrassing and exhausting, nearly all agree that it is a very public competition and that that is what democracy is all about—competing for political power. May the fittest survive, may the best candidate win.

Qualms about this process tend not to question the idea of competition per se. Rather, they reflect criticisms of candidates’ conduct as ungentlemanly, of their performance as unconvincing, and, increasingly, of the influence that campaign donors may exercise over policy. This shouldn’t be surprising. Highly aggressive and increasingly expensive campaigns ($5 billion was spent on the 2008 presidential election, a figure surpassed by the $7 billion spent in 2012) have fed the suspicion that wealthy donors—the “1 percent,” in Occupy parlance—exert an influence on politics that extends well beyond their personal vote at the ballot box.

But if the 1 percent are running politics in the United States, how did they gain that position? In a democracy—that is, a country run by the people—why would the majority not simply recalibrate the political system to minimize the advantages that money can buy?

When I first began researching the history of Western democracy 10 years ago, these were the questions that most fascinated me. Democracy, after all, means “people power.” If the people of a nation truly held political power, one would expect them to consistently pass laws in the interests of the majority, and the political landscape to reflect this. But it was hard to see why the majority of citizens would ever agree, for example, to allow wealth to play such a spectacular role in elections (congressional candidates who outspend their opponents win 78 percent of their races), given that the vast majority of people are, by definition, not rich.

Of course, the new American bourgeoisie, populated by privately schooled trust-fund babies and latte-swigging hipsters, has an answer for this, blaming every objectionable aspect of national politics on the influence of an under-educated Idiot America that persists in voting against its own best interests. In this account, the proletariat simply does not know what is good for it. This doesn’t, however, harmonize very well with the fact that, whatever else you want to say about them, Americans generally aren’t any more ignorant than they were 20 or 30 years ago, a time when the political system was arguably more functional than it is now. Indeed, thanks to cheap air travel and the Internet, Americans may even be more knowledgeable about the world beyond their borders than they were in times gone by.

Another explanation of which the liberal elite is fond goes like this: The system is good, but the players are bad. Ted Cruz is a psychopath, Trump a narcissist, and Hillary Clinton an opportunist. If only better people were in politics, everything would be rosy. Yet this still fails to answer the ultimate question: Why does this system allow so many basket cases to climb to the top? In a country of over 300 million people, there must be at least a few decent individuals out there. This brings us to the timely question of why they are so rarely seen on-screen, and the even timelier one of why, when they do appear, they are so quickly and so brusquely escorted off the set.

* * *

The founders had a fairly robust idea of the kind of nation they wanted to create: a republic, or a state without a hereditary ruler. Contrary to their depiction by political figures of both left and right today, they were adamant that the republic would not be a democracy—that is, it would not be a system of simple majoritarian rule. The primary reason democracy was undesirable was that the majority of the people would never agree to guarantee special protections for the wealth and privilege of a few, but would instead insist on greater material equality for all. As James Madison, the so-called “father of the Constitution” and the fourth president, put it in Federalist 10, “democracies have ever been spectacles of turbulence and contention; have ever been found incompatible with personal security or the rights of property; and have in general been as short in their lives as they have been violent in their deaths.”

This was not true. But Madison believed it was, and it was on this set of beliefs that he constructed a political system of checks and balances in which the majority, acting in its capacity as a collective, could never hope to attain control over the levers of political power. Instead, in the new republic that Madison helped to craft, an assortment of “factions” would use elections to compete against one another for the privilege of holding power for set periods of time.

By holding these elections relatively infrequently at the national level, the founders could ensure that they would tilt towards those with fame and resources on their side. They did not consider this a problematic outcome. But it was not nearly as popular with the early settlers as one is often led to believe, and in order to allay opposition to their plans, the framers agreed to institute a set of personal legal protections that would be ratified as amendments to the Constitution. Thus, in one fell swoop, political participation was effectively enclosed as the preserve of the rich and privileged, and a social contract was enshrined in the form of the Bill of Rights to protect the majority from the most offensive excesses of this power. Even Alexander Hamilton, no friend to the poor, noted that “bills of rights are in their origin, stipulations between kings and their subjects, abridgments of prerogative in favor of privilege, reservations of rights not surrendered to the prince.”

Democracy was nowhere to be found in the new highly mediated and controlled American system of government. But despite the fact that most Americans would remain dissociated from personal participation in politics, few of them perceived this as a hardship, because the American government was still able to deliver for its citizens where it mattered most—their wallets.

* * *

Unlike so many other nations, the United States had one advantage in its development that is rarely discussed but is hard to overstate: It was built upon enormous swaths of resource-rich land that had seen little previous exploitation. That meant that from a capitalist point of view the nation had an enormous surplus of resources and a deficit of workers to extract them. For many years American policy reflected this basic equation and was geared to protect the institution of slavery, to attract immigrants, and to enable economic production. In a country where the government periodically gave away prime farmland and adopted high tariffs to protect home-grown industry, it would have been difficult for European settlers not to be significantly better off than they had been in their countries of origin. In this atmosphere, it was possible to work your way up, if not to the very top, at least to a position a little better than the one you had started with. It is unsurprising that so many people were willing to embrace this new way of life with all of its trappings and its accessories.

However, all was not paradise in the land of milk and honey. Americans made one fatal mistake in attributing the fruits of their labor solely to their own hard work, and another in believing that just because they were doing well economically, occasionally voting actually put them in control of the government. These beliefs gave rise to a hyper-individualist philosophy that ascribed a great deal of significance to personal responsibility and minimized the importance of collective action.

The truth, however, was that most of what was described as success was always largely attributable to the force exerted by the new American state on behalf of those deemed to be its most useful citizens. The most privileged were helped further by laws like the Indian Removal Act of 1830, which required Native Americans to leave their land, and the Fugitive Slave Acts of 1793 and 1850, which required runaway slaves to be returned to their masters. Far from earning the fruits of their labor purely on their own, many early Americans had the means of self-improvement handed to them on a silver platter—one branded with the insignia of the federal government. The state’s function was to legitimize the forcible removal of goods and labor from some people and their appropriation by others. As former World Bank economist Branko Milanovic pointed out in his 2010 book The Haves and the Have-Nots, the most important factors in a person’s wealth are the country they are born into and the status of their parents—by comparison, personal effort is negligible. For a long time, this ugly truth was simply covered up by the bonanza of resource exploitation.

The only thing that has recently changed with this picture is that many of the previously favored people stopped being so useful. Continued immigration, automation, and relocation of labor overseas have rendered economically superfluous many Americans who had previously benefited from these policies of state-sponsored protection. Many Americans suddenly discovered that hard work alone didn’t get them anywhere, and that their influence on politics was limited. Only the practice of faction-based election and protection for civil liberties had ever been written into the social contract. Upward mobility and collective action had never been part of the deal. They were just incidentals.

This brought to the surface the long-suppressed truth about the American political and economic system: namely, that the United States was designed to be a collection of elite interests fighting among themselves to govern a mass of proletarians that could be used for whatever purpose they were good for as long as they were good for it.

The truth is that the 1 percent never took over American politics. They had control all along.

* * *

And they don’t have much incentive to act in the interest of the majority. Indeed, the entire constitutional system was designed to ensure that they would not have to be bothered with cries from the unwashed masses. Under the present constitutional system, there simply isn’t much that those masses can threaten them with. They are as superfluous to the economy as they always were to the political system, required to act merely in a superficial capacity as consumers or as voters—roles that have increasingly come to coincide.

The insulation of the elite, however, does not change the fact that for most of America there is an ongoing, and worsening, societal crisis: Living standards continue to slip; quality education has become unaffordable to many; foreign policy ill serves the needs of the majority; and the pressing issues of the day, such as ISIS and climate change, are systematically neglected.

There are only two ways to resolve this crisis: reset the economy or reset the political system. Both would require Americans to moderate their hyper-individualist philosophy.

This has been possible in the past. The rise of labor unions in the 1920s and the New Deal of the 1930s led to an increased focus on collective-bargaining power and national solidarity with the unemployed and the uneducated. This effectively rebalanced the economy, by giving the majority more power in the workplace, if not in the political arena.

Americans could alleviate the current crisis with another New Deal. There are a multitude of means, such as introducing basic social income, raising the inheritance tax, introducing financial-transaction taxes, agreeing to a package of debt relief for former students, and pegging salaries within corporations to prevent severe income inequality within the same company. Together, these reforms could improve the lives of most Americans while preserving the constitutional system and leaving the political power of the 1 percent largely intact, as the founders intended. The elite, however, currently show no interest in taking such system-saving measures. If anything, they have chosen to double down on the myth of go-it-alone individualism that seeks to portray the very wealthy as “job-creators” who must be wheedled into staying in the country lest they seek greener pastures elsewhere.

The New Deal may not be replicable. It was the product of fortuitous circumstances, with President Franklin D. Roosevelt managing to get both Congress and the Supreme Court behind him for a considerable length of time, effectively circumventing the constitutional checks and balances that had been put in place by the founding fathers. It is impossible to say how long Roosevelt’s policies would have remained in place, had WWII and the Cold War not made them so expedient in counteracting the “threat” of communism.

Such an alignment of the planets is not always forthcoming, and the last eight years of American politics have been characterized by inaction on the issue of wealth inequality. The economy has not been reset and there is no sign that with either Hillary Clinton or Donald Trump as president it will be. In the absence of any progress on the economic front, we have to turn to the second alternative for dealing with the crisis—fundamental political reform.

* * *

Madison’s claim to the contrary notwithstanding, democracy as practiced in ancient Greece was not a particularly fraught form of government. In fact, democracy was often noted for its lack of internal violence, as well as what could broadly be called its capitalist values. The ancient democrats of Athens protected private property and, far from going up in flames, their particular form of government ended only with incorporation into the Macedonian empire in the wake of Alexander the Great’s conquests. Considering that killing all the men and selling the women and children into slavery was a relatively common termination for civilizations at the time, the winding-down of democracy was fairly low-key.

Madison was, however, correct in his view that real democracy was much different than the system we know by that name today. Unlike the founders, the Athenians were committed to majoritarian rule. They also had a profound understanding of political equality, believing that every free citizen was equally qualified to participate in government. Political participation was not considered to be a privilege in Athens but, rather, a duty. Pericles, one of Athens’ most admired citizens, noted that Athenians “consider the man who takes no part in public life not as one minding his own business, but rather a good for nothing.”

Because the Athenians believed so strongly in political equality, they regarded democracy as inherently incompatible with the idea of a competition to hold political power. The citizen in a democracy was to be spared the need to compete for the privilege of running the country—instead, he was to be given ample opportunity to do so with minimum inconvenience to himself. The practice of elections did not fulfill these criteria, so the Athenians selected their officials using a sophisticated lottery system instead.

Ancient democracies drew the balance of power between the individual and the community very differently than we do now. In ancient democracies, decision-making power resided in the collective. In fact, this was what democracy was all about: Only in a state in which the people rendered decisions collectively could the people be said to hold power. This was not, as Madison would later allege, a chaotic experience, but rather a structured exercise in civil responsibility.

Modern republics are bereft of institutions for exercising the collective will. They were intentionally left out of the grand American design, as Madison openly acknowledged, because the founders wanted to ensure that the people could never exercise their will, lest they overturn the privileges of the few. But that left the United States with little ability to act as a single nation; instead, it was a collection of individuals and factions who got along more or less well while the going was good, but whose relations are rapidly beginning to deteriorate now that things have gotten a little tougher.

In such circumstances, the attempts to knit the nation back together again using blind patriotism, fear, and xenophobia, by threatening, for example, to expel Muslims or to build a wall along the Mexican border, are as predictable as they are frightening. They are the convulsions of an entity attempting to define itself in purely negative terms, because it lacks any constructive vision of a collective future.

The United States Constitution may have been a progressive document when it was first written, but it was never going to be a permanent solution. Factions would sooner or later beat each other to a standstill, and the fantastic, Monopoly-like fight over American resources would eventually be had out. In a society where political power is a function of competition, wealth and power will always win. Such is the price of refusing to recognize that any people who form a nation together are bound to each other and that if they are to succeed over time, they must find a way to work together. The Constitution of 1787 does not allow for this. It was designed to foreclose its possibility. A renewal is long overdue.

The old dog-eat-dog United States of America never worked for many people, and it is working for even fewer people now. It’s time to let go. Make America great again, by developing a new idea of what being American really means.