Decentralization and democracy: Three centuries of debate

On this day, September 17th, 1787, delegates from all over the former British colonies in North America signed a document at Independence Hall in Philadelphia, Pennsylvania, whose purpose was to overthrow and replace the United States government.

The one they replaced—the Articles of Confederation—had failed so thoroughly as to be a non-entity by the time the delegates gathered at Philadelphia. The one they replaced it with—the United States Constitution—was the result of an intense summer of complex debate surrounding one question: how can we build a functional, authoritative government that does not eventually trend toward autocracy?

The brief but powerful legal document that resulted is still the most successful decentralized system ever designed; one that, in the aggregate and not without error, has trended toward greater liberties and enfranchisement for its constituents over time, not less.

It is a profound and tragic irony, then, to see the United States so unapologetically positioned at the top of the world in the 21st Century, a vast superpower whose interests and institutions hold sway over all of global society. For many, the first nation ever to free itself successfully from internal tyranny has itself become the very kind of tyrant it swore off in 1776.

Now, not quite 250 years later, a global urge for decentralization is pushing back against the United States itself.

The Ideology of Decentralization

The 2008 financial crisis, in which a grossly overleveraged global financial system collapsed under the weight of its own short-sighted avarice, unveiled the fragility and interconnectedness of the world’s centralized financial systems, at the center of which sat the United States, Wall Street, and the Federal Reserve System. It was in this era that we learned that some actors were “too big to fail,” but the common question quickly became how they were ever allowed to get so big in the first place.

Following the crisis, centralized financial systems and the governments that enabled or controlled them became the target of heated criticism. Public confidence in these institutions seriously faltered as calls for transparency, accountability, and easier access to financial services surged.

It was in this climate that the blockchain and cryptocurrency movements began to gain traction, beginning almost immediately with the Bitcoin whitepaper in October 2008 and the network’s genesis block in January 2009, then exploding into the kaleidoscope of projects, chains, concepts, jargon, and dreams that would make up the Web3 world over the next 14 years. Powering the movement is the belief that decentralization, in all of its forms, benefits all of society by ensuring transparency, reducing the risk of power abuse, and democratizing access to finance and political power. This perspective views decentralization as a means to more equitable and resilient systems.

This attitude, however, is not new. It is the very idea upon which the nation was founded and, in many respects, has acted as the beacon that has guided it to success. Yet the U.S. established a robust, decentralized government that worked only after a series of missteps in crafting one that decidedly did not.

Decentralized Disaster

While it’s easy to think of the Constitution as a reaction against centralized power, it was, in fact, quite the opposite. Before the U.S. Constitution, there were the Articles of Confederation, America’s first attempt at a government, which was entirely decentralized—and entirely useless.

The Articles of Confederation, ratified by the thirteen states in 1781, served as the country’s first constitution. They envisioned a nation without a strong central government, placing power entirely in the hands of the individual states. Rather than a formal union, the Articles established a confederation, termed “a firm league of friendship,” among the states, each of which retained its sovereignty, independence, and every power not expressly delegated to the national government. The powers that were delegated were few and unenforceable.

This extreme form of decentralization led to immediate chaos. States acted primarily in their own interests, often in conflict with one another and with no national judiciary to settle disputes. They issued their own currencies, set their own trade rules, imposed their own taxes and tariffs, and thwarted each other’s attempts at commerce. The nation also sat defenseless—the national government held the responsibility of raising an army, but without the power to tax or issue credit, it had no money to do so.

By 1786, it was apparent that the problems of the Articles were not solvable issues but fundamental flaws in the structure of governance, and that the document had to be tossed out altogether. The Philadelphia Convention was convened in 1787 to address these issues and to draft an entirely new constitution. Doing so, however, involved confronting a political question that had never been successfully resolved: can a society vest authority in leaders without inadvertently sowing the seeds of autocracy?

Many thought not. Opponents of the new constitution—called Anti-Federalists—thought that a government’s powers to tax and raise an army were enough to tip the scales toward autocracy, and they complained that the convention focused its attention on the mechanics of power structures over the drafting of a bill of rights. Federalists, though, maintained that a powerful—but somehow decentralized—central government was necessary to have any social order at all. Without that assurance, the Anti-Federalists’ bill of rights wouldn’t be worth more than the broadside it was printed on.

The solution the framers arrived at was a political insight that radically altered the way politics would be structured. Rather than eliminating centralized authorities altogether, the new government would break them into separate institutions under separate leadership. Further, each institution would be provided the legal tools necessary to thwart the others at critical junctures—Congress could remove presidents, presidents could veto bills, Congress could override vetoes, the courts could invalidate laws… the list goes on and on.

In each case, however, the checks and balances built into the system were designed to ensure that while each branch had its distinct powers, it could not act without the consent of the others. This idea was as old as the Roman Republic, but the United States Constitution was the first example of a government consciously designed from the ground up to revolve around the concept. The decentralization of powers is not something that United States politicians have historically found distasteful—in fact, it has been key to the nation’s success from the very beginning.
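To make those mechanics concrete, here is a minimal sketch in Python of the veto-and-override loop described above. It is purely illustrative: the class and function names are invented for this example, and the only rules encoded are the familiar simple-majority and two-thirds thresholds.

# Toy model of the checks described above; names are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Bill:
    name: str
    house_votes: float   # fraction of the House voting yes
    senate_votes: float  # fraction of the Senate voting yes
    vetoed: bool = False
    law: bool = False

def congress_passes(bill: Bill) -> bool:
    # A simple majority in both chambers sends the bill onward.
    return bill.house_votes > 0.5 and bill.senate_votes > 0.5

def president_signs(bill: Bill, approves: bool) -> None:
    # The president may sign the bill into law or veto it.
    if approves:
        bill.law = True
    else:
        bill.vetoed = True

def congress_overrides(bill: Bill) -> None:
    # A two-thirds vote in both chambers overrides the veto.
    if bill.vetoed and bill.house_votes >= 2 / 3 and bill.senate_votes >= 2 / 3:
        bill.law = True

def court_reviews(bill: Bill, constitutional: bool) -> None:
    # The judiciary can invalidate even a duly enacted law.
    if bill.law and not constitutional:
        bill.law = False

# One bill's path through all three branches:
bill = Bill("Example Act", house_votes=0.70, senate_votes=0.68)
if congress_passes(bill):
    president_signs(bill, approves=False)     # the president vetoes
    congress_overrides(bill)                  # two-thirds majorities override
    court_reviews(bill, constitutional=True)  # the courts let it stand
print(bill.law)  # True: no single branch decided the outcome alone

Even in toy form, the structural point survives: no single function can take a bill from proposal to settled law on its own.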

American Irony

“The iron hand crush’d the Tyrant’s head
And became a Tyrant in his stead” — William Blake

The rise of the United States to global prominence is a study in contradiction. Founded on principles of decentralized governance, it has, over two and a half centuries, evolved into the world’s foremost financial, military, and commercial power by any standard of comparison. The paradox is both unnerving and ironic: a nation whose bedrock is decentralization has emerged as arguably the most formidable central authority the world has ever seen.

The United States Government, despite its centralized appearance, draws its resilience and adaptability from its decentralized design. Viewed internally, no one ever really seems to be in charge in the United States, and that’s because no one ever really is. It’s a system that has been meticulously constructed to preserve necessary powers while placing each into its own dedicated silo. And it works very well.

As such, it’s worth taking the time to understand what worked—and didn’t work—about America’s first foray into decentralized governance. The U.S. Constitution was a document born both from the spirit of and reaction to decentralization. Its framers were not ideologues but pragmatists, and their approach to government was informed by the failures of both centralized and decentralized systems alike.

As today’s decentralization movement gains momentum, there is much to glean from the measured approach of the framers. Here are just a few points I took away from assembling this article, though there are doubtless many more:

First, change has to be guided by a nuanced understanding of precisely what it is that we are attempting to change—not a vague idea of it.

Second, basing one’s thinking on ideological convictions, however pure or well-reasoned, only sets the stage for mistakes, as there will always be contingencies for which the ideology cannot account. You cannot write a code for every possibility.

Third—and this one is very important—progress happens when people come together for informed, good-faith conversation, not when they yell at each other from a distance or, God forbid, fight. This was not lost on the delegates, all of whom understood that everything depended on coming to an agreement without forcing anyone into it—violence would follow from that as surely as day follows night. “We are perhaps the only people in the world,” remarked South Carolina delegate Charles Pinckney to the Convention, “who ever had sense enough to appoint delegates to establish a general government.”

It’s a wonder that it took so long.

So, with a bit of wisdom, reflection, and pragmatism, there is immense potential for progress. The technological innovations of recent years have advanced us into a new arena, which means history has opened an opportunity to make change at this moment. Let the success of the Convention of 1787 serve as evidence of the remarkable things people can achieve when they come together to accomplish them.
