Grumpy opinions about everything.


The Republic of Indian Stream: America’s Forgotten Frontier Nation

Did you know that there was once an independent republic in the farthest reaches of northern New Hampshire, where the dense forests blend into the Canadian wilderness? Neither did I until I came across it in a fascinating book titled A History of the World in 47 Borders by Jonn Elledge.

It was a short-lived but remarkable experiment in self-government. For three years in the 1830s, the settlers of a disputed border region declared themselves citizens of an independent republic—complete with their own constitution, legislature, and militia. They called it the Republic of Indian Stream, a name that today sounds almost mythical, yet it was a genuine, functioning democracy. Their story blends frontier improvisation, international diplomacy, and Yankee self-reliance—and it leaves us with a curious artifact: a constitution written not by statesmen in Philadelphia, but by farmers, loggers, and merchants caught between two competing nations.

A Territory in Limbo

The roots of the Indian Stream story go back to the Treaty of Paris (1783), which ended the American Revolution. The treaty defined the U.S.–Canada border but used vague geographic language—particularly the phrase “the northwesternmost head of the Connecticut River.” No one could agree which of several small tributaries the treaty meant.

The ambiguity created a slice of wilderness—about 200 square miles—claimed by both the United States and British Lower Canada (now Quebec). For decades, the region existed in a gray zone. Both countries sent tax collectors and law officers, both demanded military service, and neither provided clear legal protection. Residents couldn’t vote, hold secure property titles, or rely on either government’s courts. To make matters worse, they were sometimes forced to pay taxes twice—once to New Hampshire and once to Canada.

Origins of the Republic

By the late 1820s, frustration had reached a boiling point. Attempts to resolve the border dispute were unsuccessful—including arbitration by the King of the Netherlands, agreed to in 1827, which failed when the United States rejected his decision favoring Great Britain.

With both sides still pressing their claims, the settlers decided they’d had enough of outside interference. On July 9, 1832, they convened a local meeting and declared independence, forming the Republic of Indian Stream. Their constitution—modeled on American state constitutions—began with a simple premise: authority rested with “the citizens inhabiting the territory.”

This wasn’t an act of rebellion but one of survival. The settlers wanted peace, order, and local control. Their goal was to withdraw from ambiguous regulation and to create a government that could function until the border question was finally settled.

The Constitution of Indian Stream

The constitution of the Republic, adopted the same day they declared sovereignty, was an impressively crafted document for a community of barely 300 people. It reflected the settlers’ familiarity with republican ideals and their determination to govern themselves fairly.

Key features included:

  • Democratic foundation: All authority stemmed from the citizens.
  • Annual elections: Officials were chosen yearly; a single House of Representatives made the laws, and a magistrate acted as both executive and judge.
  • Judicial simplicity: Local justices of the peace handled disputes—there were no elaborate court hierarchies.
  • Individual rights: Residents enjoyed protections derived from U.S. constitutions—trial by jury, due process, and freedom from arbitrary arrest.
  • Defense and civic duty: Citizens pledged to defend their independence and assist one another in emergencies.

Despite its modest scale, the system worked. The republic passed laws, issued warrants, collected taxes, and even mustered a small militia to maintain order.

Life on the Frontier

Life in Indian Stream resembled that of many frontier communities: logging, farming, hunting, and trading. The land was rough, winters long, and access to distant markets limited. Yet the people thrived through cooperation and self-reliance. Trade with both Canadian and New Hampshire merchants continued—proof that practicality often trumped politics on the frontier.

The republic’s remote location provided a degree of safety from interference, but not immunity. Both British and American agents continued to assert claims, and occasional arrests or skirmishes kept tensions high.

The End of the Republic

The experiment in independence lasted only three years. In 1835, a dispute between an Indian Stream constable and a Canadian deputy sheriff triggered a diplomatic crisis. Canada sent troops to assert control, prompting New Hampshire’s governor to respond in kind.

Realizing they were caught between two competing governments, the citizens voted in April 1836 to accept New Hampshire’s jurisdiction. Indian Stream became part of the town of Pittsburg, and peace was restored.

The larger boundary issue wasn’t fully settled until the Webster–Ashburton Treaty of 1842, which formally placed Indian Stream within the United States.

Legacy of a Lost Republic

Today, little remains of the Republic of Indian Stream except New Hampshire Historical Marker #1 and a scattering of homesteads near the Connecticut Lakes.

Yet its legacy is profound. It may have lasted only three years, but its story reflects the broader American frontier experience: independence, inventiveness, and determination to live free from arbitrary rule. In an era defined by rigid borders and powerful states, the memory of Indian Stream reminds us that freedom once depended, not on lines on a map, but on the courage of people willing to draw their own lines.

The story also illustrates the complexities of nation-building in the early American period, when borders remained fluid and communities sometimes had to forge their own path toward self-governance. While the republic was short-lived, it stands as a testament to the ingenuity and determination of America’s frontier settlers, who refused to accept statelessness and instead chose to create their own nation in the wilderness.

The Indian Stream constitution reminds us that political order is not always imposed from above; sometimes, out of necessity, it is created from below. The settlers were neither revolutionaries nor idealists—they simply wanted clear rules, fair courts, and predictable taxes. Ordinary citizens, faced with legal chaos and neglect, designed a functioning democracy grounded in fairness and mutual responsibility.

That such a tiny community would craft its own constitution speaks to the enduring appeal of constitutional government in the early 19th century. Even on the edge of two empires, far from capitals and legislatures, these settlers turned to a familiar American solution: write it down, elect your leaders, and hold them accountable every year. Hopefully we can keep their spirit alive and live up to the example of Indian Stream.

How A Nobel Laureate Thinks We Can Save The American Economy…But It Won’t Be Easy

I just finished People, Power, and Profits by Joseph Stiglitz — the Nobel Prize-winning economist. He wrote it near the end of Trump’s first term, but honestly, the world he describes feels even more relevant now.

Stiglitz doesn’t sugarcoat it: capitalism, as we’re practicing it today, is broken. Monopolies dominate markets, inequality has gone wild, and trust in democracy is running on fumes. His proposed fix? Something he calls progressive capitalism — capitalism with guardrails, conscience, and a sense of fairness.

Stiglitz makes the case that our economic system is rigged — not by accident, but by design. Here are his most compelling arguments and what he thinks we should do about them.

1. Taxation and Rent-Seeking: The Rigged Game

Stiglitz draws a sharp distinction between making money through productive work and extracting it through what economists call “rent-seeking” – essentially, using power to skim wealth without creating value. Think of a pharmaceutical company that buys a drug patent and jacks up prices 5,000%, or telecom monopolies that divide up markets to avoid competing.

His argument is straightforward: our tax system rewards the wrong behavior. Capital gains are taxed at lower rates than wages, which means someone living off investments pays less than someone working a regular job. Meanwhile, the wealthy can afford armies of accountants to exploit loopholes that most people don’t even know exist.

What Stiglitz recommends: Tax wealth more aggressively, especially inherited wealth. Close the capital gains loophole. Tax rent-seeking activities heavily while reducing taxes on productive work and innovation. The goal isn’t just revenue – it’s changing incentives so that the path to riches runs through creating value, not extracting it.

2. Green Energy and the True Cost of Pollution

Here’s where Stiglitz gets into what economists call “externalities” – costs that businesses impose on society without paying for them. When a coal plant spews carbon into the atmosphere, we all pay through climate change and increased healthcare costs, but the plant’s balance sheet looks great.

Stiglitz argues this is fundamentally dishonest accounting. If we properly priced pollution and carbon emissions, green energy wouldn’t need subsidies to compete – fossil fuels would suddenly look much more expensive once you factor in their real costs to society.

His recommendation: Implement carbon pricing – either through a carbon tax or cap-and-trade system. Make polluters pay for the damage they cause. This isn’t about punishing business; it’s about honest accounting. Once prices reflect reality, the market will naturally shift toward cleaner energy because it’s actually cheaper when you account for all the costs.

3. Big Business and Big Banks: Concentration of Power

Stiglitz has been particularly vocal about how corporate consolidation hurts everyone except shareholders and executives. His critique of “too big to fail” is sharp. He argues that concentrated economic power — in tech, finance, and even agriculture — undermines both democracy and efficiency. When a few firms dominate markets, they gain power over prices, wages, and even politics—suppressing pay, blocking innovation, and bending regulations in their favor.

The banking sector especially concerns him. After the 2008 financial crisis, which was caused largely by reckless behavior from major banks, these same institutions emerged even larger through government-facilitated mergers. We allowed them to spread their losses across the public while keeping their gains as private profits.

His recommendations: Reinstate and strengthen regulations that were stripped away, including bringing back something like the Glass-Steagall Act that separated commercial and investment banking. Break up banks that are “too big to fail.” Strengthen antitrust enforcement across all industries. Use the government’s regulatory power to promote competition rather than letting industry consolidate.

4. Money in Politics: The Feedback Loop

This is where everything connects for Stiglitz. Concentrated economic power translates directly into political power. Wealthy interests fund campaigns, lobby relentlessly, and effectively write regulations for the agencies that are supposed to oversee them. This creates a vicious cycle: economic inequality begets political inequality, which creates policies that worsen economic inequality.

Stiglitz argues that the Supreme Court’s Citizens United decision, which allowed unlimited corporate spending in elections, turbocharged this problem by treating money as speech and corporations as people.

His recommendations: Limit campaign spending and institute public financing of campaigns to reduce candidates’ dependence on wealthy donors. Place strict limits on lobbying and implement a robust “revolving door” policy that prevents government officials from immediately cashing in with the industries they regulated. Mandate transparency requirements so voters know who’s funding what. Pass Constitutional amendments if necessary to overturn Citizens United.

The Big Picture

What makes Stiglitz’s argument powerful is how these pieces fit together. You can’t fix inequality just through taxation if big corporations control the political process. You can’t address climate change if fossil fuel companies can buy enough influence to block action. Everything is connected.

His recommendations aren’t radical in historical terms – they’re an attempt to restore the balance that existed during the post-war economic boom of the 1950s. Stiglitz’s “progressive capitalism” isn’t socialism. It’s capitalism with a conscience — one that remembers who it’s supposed to serve.

Whether you see that as a rescue plan or a recipe for red tape depends entirely on where you put your faith: in public institutions or private markets. The question is whether we have the political will to implement his recommendations despite entrenched opposition from those who benefit from the current system.

Either way, this debate isn’t going away — it’s the one shaping the 21st-century economy.

No Kings!

When Evidence Isn’t Enough: The Crisis of Science in Public Life

While I would never call myself a scientist, as a physician my whole professional life is built on trust in science. I am distressed that so many people have abandoned that trust in favor of misinformation.

Throughout history, scientific discovery has been humanity’s most reliable guide to progress. From the germ theory of disease to space exploration, science has reshaped how we live and what we believe possible. Yet in recent years, the very foundation of this methodical pursuit—evidence, observation, and experimentation—has come under sustained political, cultural, and economic attack. This struggle is often described as “the war on science,” a phrase that captures how debates once rooted in policy have shifted into battles over truth itself.

The numbers tell a stark story. The National Science Foundation has terminated roughly 1,040 grants that would have awarded $739 million to researchers, and it made only 52 undergraduate research grants in 2025, compared to about 200 annually since 2015. The proposed cuts are staggering: Trump’s fiscal year 2026 budget request would give the NSF just $4 billion, a 55% reduction from what Congress appropriated for 2025.

At the heart of the conflict lies mistrust. Science requires patience, since answers evolve as new data emerge. But in a world driven by instant communication and ideological certainties, that evolving nature is often cast as contradiction or weakness. Critics dismiss changing conclusions not as hallmarks of rigorous inquiry, but as evidence of unreliability. The result is a dangerous fracture: science depends on trust in evidence, while many segments of society increasingly place trust in ideology, anecdote, or even outright falsehoods.

Climate change is one of the most visible fronts in this battle. Virtually every major scientific body worldwide affirms that human activities are driving global warming. Yet climate scientists are routinely accused of bias or conspiracy, their data questioned, and their motives impugned. What is often overlooked is that the controversy stems not from the complexity of climate systems—scientists have long acknowledged uncertainties—but from the political and economic interests threatened by the solutions science prescribes. When climate scientists publish evidence of global warming, their research doesn’t just describe weather patterns—it challenges powerful industries built on fossil fuels.

Public health provides another stark example. During the COVID-19 pandemic, scientific guidance became subject to fierce political polarization. Masking policies, vaccine safety, and even simple social distancing rules morphed into partisan symbols rather than matters of medical evidence. Scientists found themselves vilified, their professional debates distorted into talking points. The losers in this exchange were not the scientists themselves but the broader public, left with diminished trust in the very institutions dedicated to safeguarding their health.

Underlying these conflicts are powerful currents. Some industries resist regulation by casting doubt on findings that threaten profit. Certain political movements thrive on skepticism of expertise, channeling populist distrust of “elites” toward scientists. And in the swirl of social media, misinformation spreads more rapidly than peer-reviewed studies, eroding the influence of evidence before consensus can take hold.

What makes this particularly concerning is the timing. America’s main scientific and technological rivals are rising fast. In terms of federal Research and Development funding as a percentage of GDP, U.S. investment has dropped for decades, and the lead that the U.S. enjoyed over China’s R&D expenditure has largely been erased.

While the war on science is often treated as a distinctly modern dilemma, born of political polarization, mass media, and cultural distrust of expertise, its roots stretch back centuries. Galileo was silenced for challenging religious dogma. Early physicians were scorned when they argued that invisible germs, not miasmas or curses, caused disease.  During the Enlightenment of the 17th and 18th centuries, thinkers faced their own version of this struggle—a battle between dogma and reason, authority and evidence, tradition and discovery.   In every case, vested interests—whether theological, cultural, or economic—feared the disruption that scientific truth carried. Understanding those earlier conflicts provides valuable context for our challenges today.

The stakes today, however, feel higher. Our era’s challenges—climate change, pandemics, artificial intelligence, genetic engineering—demand unprecedented reliance on scientific understanding. To wage war on science is, in effect, to wage war on our own best chance for survival and responsible progress. If truth becomes negotiable, then evidence loses meaning, and with it, the possibility of reasoned self-government. That is why the war on science cannot be dismissed as a technical squabble—it is a philosophical contest echoing the Enlightenment battles that shaped modern civilization.

Ultimately, the struggle is less about data than about values. Do we commit to curiosity, openness, and the willingness to change our minds? Or do we cling to certainties that soothe but endanger us in the end? The war on science will not be won by scientists alone. It can only be resolved if society restores trust in evidence as the most reliable compass we have—however unsettling the direction it may point.  There may be alternative opinions but there are no alternative facts.

Bread and Circuses: From Ancient Rome to Modern America

“Already long ago, from when we sold our vote to no man, the People have abdicated our duties; for the People who once upon a time handed out military command, high civil office, legions — everything, now restrains itself and anxiously desires for just two things: bread and circuses.”

Nearly 2,000 years ago, Roman satirist Juvenal penned one of history’s most enduring political observations: “Two things only the people anxiously desire — bread and circuses.” Writing around 100 CE in his Satire X, Juvenal wasn’t celebrating this phenomenon—he was lamenting it. The poet watched as Roman citizens traded their political engagement for free grain and spectacular entertainment, becoming passive spectators rather than active participants in public life. The phrase has endured for nearly two millennia as shorthand for a troubling political dynamic: entertainment and consumption replacing civic engagement and accountability.

The Roman Warning

Juvenal’s critique came at a pivotal moment in Roman history. The republic had collapsed, and emperors like Augustus had systematically dismantled democratic institutions. Rather than revolt, Roman citizens seemed content as long as the government provided basic sustenance (the grain dole called annona) and elaborate spectacles at venues like the Colosseum. Political participation withered as people focused on immediate pleasures rather than long-term civic responsibilities.

The strategy worked brilliantly for Roman rulers. Keep the masses fed and entertained, and they won’t question your authority or demand meaningful representation. It was political control through distraction—a form of soft authoritarianism that maintained order without overt oppression.  The policy was effective in the short term—peace in the streets and loyalty to the emperors—but disastrous over time. Rome’s population became disengaged from politics, while real power consolidated in the hands of a few.

Modern American Parallels

Fast-forward to contemporary America, and Juvenal’s observation feels uncomfortably relevant. While we don’t have gladiatorial games, we do have our own version of “circuses”—professional sports, reality TV, social media feeds, and celebrity culture that dominate public attention. These aren’t inherently problematic, but they become concerning when they crowd out civic engagement.

Our modern “bread” takes various forms: government assistance programs, subsidies, and economic policies designed to maintain consumer spending. We are saturated with cheap goods, instant delivery services, and mass consumerism. For many, economic struggles are temporarily softened by accessible consumption, from fast food to online shopping. Yet material comfort often masks deeper inequalities and systemic challenges—wage stagnation, healthcare costs, and mounting national debt. These programs often serve legitimate purposes, but they can also function as political tools to maintain public satisfaction and suppress dissent.

Consider how political campaigns increasingly focus on entertainment value rather than substantive policy debates. Politicians hire social media managers and appear on talk shows, understanding that capturing attention often matters more than presenting coherent governance plans. Meanwhile, voter turnout for local elections—where citizens have the most direct impact—remains dismally low.

The Distraction Economy

Perhaps most striking is how our information landscape mirrors Roman spectacles. We’re bombarded with sensational news, viral content, and manufactured controversies that generate strong emotional reactions but little productive action. Complex policy issues get reduced to soundbites and memes, making genuine democratic deliberation increasingly difficult.

Social media algorithms are specifically optimized for engagement, not enlightenment. They feed us content designed to provoke reactions—anger, outrage, schadenfreude—rather than encourage thoughtful consideration of difficult issues. This creates a population that feels politically engaged through constant consumption of political content while remaining largely passive in actual civic participation.

The danger of “bread and circuses” in modern America lies in apathy. When civic participation declines, voter turnout falls, and policy debates get reduced to simplistic slogans, elites face less scrutiny. The result is a weakened democracy, vulnerable to manipulation and short-term thinking.

Breaking the Cycle

Juvenal’s warning doesn’t mean we should abandon entertainment or social programs. Rather, it suggests we need intentional balance. Democratic societies thrive when citizens remain actively engaged in governance beyond just voting every few years.

This means staying informed about local issues, attending town halls, contacting representatives, and participating in community organizations. It means choosing substance over spectacle and long-term thinking over immediate gratification.

The Roman Republic fell partly because its citizens stopped paying attention to governance. Juvenal’s “bread and circuses” reminds us that democracy requires constant vigilance—and that comfortable distraction can be freedom’s most seductive enemy.

The Communist Dream vs. Stalinist Reality: A Tale of Two Visions

Recently, I’ve been looking at various political philosophies. I’ve written about fascism, totalitarianism, authoritarianism, autocracy and kleptocracy. In this post I’m going to look at theoretical communism versus the reality of Stalinist communism, and I would be remiss if I didn’t at least briefly mention oligarchy as currently practiced in Russia.

The gap between Karl Marx’s theoretical vision of communism and its implementation under Joseph Stalin’s leadership in the Soviet Union represents one of history’s most significant divergences between ideological theory and political practice. While both claimed the same ultimate goal—a classless, stateless society—their approaches and outcomes differed in fundamental ways: one promised a workers’ utopia, the other delivered a brutal dictatorship, and the gap between them continues to shape our world today.

The Marxist Vision

Marx envisioned communism as the natural culmination of historical progress, emerging from the inherent conflicts within capitalism. In his theoretical framework, the working class (proletariat) would eventually overthrow the capitalist system through revolution, leading to a transitional socialist phase before achieving true communism. This final stage would be characterized by the absence of social classes, private ownership of the means of production, and ultimately, the state itself.

Central to Marx’s concept was the idea that communism would emerge from highly developed capitalist societies where industrial production had reached its peak. He believed that the abundance created by advanced capitalism would make scarcity obsolete, allowing society to operate according to the principle “from each according to his ability, to each according to his needs.” The state, having served its purpose as an instrument of class rule, would simply “wither away” as class distinctions disappeared.

Marx also emphasized that the transition to communism would be an international phenomenon. He argued that capitalism was inherently global, and therefore its replacement would necessarily be worldwide. The famous rallying cry “Workers of the world, unite!” reflected this internationalist perspective, suggesting that communist revolution would spread across national boundaries as workers recognized their common interests.

The Stalinist Implementation

Vladimir Lenin took firm control of Russia following the revolution in 1917 and oversaw the creation of a state characterized by centralization, suppression of opposition parties, and the establishment of the Cheka (secret police) to enforce party rule. Economically, Lenin’s government shifted from War Communism (state control of production during the civil war) to the New Economic Policy (NEP) in 1921, which allowed limited private trade and small-scale capitalism to stabilize the economy. The country formally became the Union of Soviet Socialist Republics in 1922. This period laid the groundwork for the highly centralized, totalitarian state under Stalin that followed Lenin’s death in 1924.

Stalin’s approach to building communism in the Soviet Union diverged sharply from Marx’s theoretical blueprint. Rather than emerging from advanced capitalism, Stalin attempted to construct socialism in a largely agricultural society that had barely begun industrialization. This fundamental difference in starting conditions shaped every aspect of the Soviet experiment.

Instead of the gradual withering away of the state, Stalin presided over an unprecedented expansion of state power. The Soviet government under his leadership controlled virtually every aspect of economic and social life, from industrial production to agricultural collectivization to cultural expression. The state became not a temporary tool for managing the transition to communism, but a permanent and increasingly powerful institution that dominated all aspects of society. By the early 1930s, Stalin had centralized all power in his own hands, sidelining collective decision-making bodies such as the Politburo and the soviets.

Marx envisioned rule by the proletariat, with power shared equally among all people. Stalin instead fostered a cult of personality through relentless propaganda. His image appeared on posters, statues, and in schools. History books were rewritten to credit him for Soviet successes—often erasing Lenin, Trotsky, or others. He was referred to as the “Father of Nations,” “Brilliant Genius,” and “Great Leader.” Loyalty to Stalin became more important than loyalty to the Communist Party or its ideals. The government and the economy operated at his personal direction, enforced by the secret police, censorship, executions, and mass purges of dissidents.

Stalin implemented a command economy, in which the government or central authority makes all major decisions about production, investment, pricing, and the allocation of resources, rather than leaving those choices to market forces. In this system, planners typically set production targets, control industries, and determine what goods and services will be available, often with the goal of achieving social or political objectives such as central control and rapid industrialization. This is the direct opposite of the voluntary cooperation Marx had envisioned. The forced collectivization of peasants onto government farms, rapid industrialization through five-year plans, and the use of prison labor in gulags represented a top-down model of development that contradicted Marx’s emphasis on worker empowerment and democratic participation.

Where Marx emphasized emancipation and freedom for workers, Stalinist policies involved widespread repression, political purges, forced labor camps, and censorship. Most notable is the period that came to be known as the “Great Purge,” also called the “Great Terror,” a campaign of political repression between 1936 and 1938. It involved widespread arrests, forced confessions, show trials, executions, and imprisonment in labor camps (the Gulag system). Stalin accused perceived political rivals, military leaders, intellectuals, and ordinary citizens of being disloyal or conducting “counter-revolutionary” activities. It is estimated that about 700,000 people were executed by firing squad after being branded “enemies of the people” in show trials or secret proceedings.  Another 1.5 to 2 million people were arrested and sent to Gulag labor camps, prisons, or exile. Many died from overwork, malnutrition, disease, or harsh conditions.

Perhaps most significantly, Stalin abandoned Marx’s internationalist vision in favor of “socialism in one country.” This doctrine, developed in the 1920s, argued that the Soviet Union could build socialism independently of worldwide revolution. This shift not only contradicted Marx’s theoretical framework but also led to policies that prioritized Soviet national interests over international worker solidarity.

Key Contradictions

The differences between Marxist theory and Stalinist practice created several fundamental contradictions. Where Marx predicted the elimination of social classes, Stalin’s Soviet Union developed a rigid hierarchy with the Communist Party elite at the top, followed by technical specialists, workers, and peasants. This new class structure, while different from capitalist society, still involved significant inequalities in power, privilege, and access to resources.

Marx’s vision of worker control over production stood in stark contrast to Stalin’s centralized command economy. Rather than workers democratically managing their workplaces, Soviet workers found themselves subject to increasingly detailed state control over their labor. The factory became less a site of worker empowerment than a component in a vast state machine directed from Moscow.

The treatment of dissent also revealed fundamental differences. Marx believed that communism would eliminate the need for political repression as class conflicts disappeared. Stalin’s regime, however, relied extensively on surveillance, censorship, and violent suppression of opposition. The extensive use of terror against both perceived enemies and ordinary citizens contradicted Marx’s vision of a society based on cooperation and mutual benefit.

Modern Russia

At this point, I want to mention something about modern Russia and its current governmental and economic situation since the breakup of the Soviet Union.

An oligarchy is a form of government where power rests in the hands of a small number of people. These individuals typically come from similar backgrounds – they might be distinguished by wealth, family ties, education, corporate control, military influence, or religious authority. The word comes from the Greek “oligarkhia,” meaning “rule by few.” In an oligarchy, this small group makes the major political and economic decisions that affect the entire population, often prioritizing their own interests over those of the broader society.

Modern Russia’s economy is often described as having oligarchic features because a relatively small group of wealthy business leaders—many of whom made their fortunes during the chaotic privatization of the 1990s—maintain outsized influence over key industries like energy, banking, and natural resources. While Russia is technically a mixed economy with both private and state involvement, political connections determine who gains access to wealth and power. This creates a system where economic opportunity is concentrated among elites closely tied to the Kremlin, most closely resembling an oligarchy.

Historical Context and Consequences

Understanding the differences between Marxist theory and Stalinist implementation requires considering the historical context in which Stalin operated. The Soviet Union faced external threats, internal resistance, and the enormous challenge of rapid modernization. Stalin’s supporters argued that harsh measures were necessary to defend the revolution and build industrial capacity quickly enough to survive in a hostile international environment.

Critics, however, contend that Stalin’s methods created a system fundamentally incompatible with Marx’s vision of human liberation. The concentration of power in a single party (let alone a single person), the suppression of democratic institutions, and the extensive use of violence and coercion all demonstrate that Stalinist practice moved away from, rather than toward, Marx’s goals.

The legacy of this divergence continues to influence contemporary political debates. Supporters of Marxist theory often argue that Stalin’s failures demonstrate the dangers of abandoning egalitarian principles and internationalist perspectives. Meanwhile, critics of communism point to the Soviet experience as evidence that Marxist ideals are inherently unrealistic or even dangerous.

This comparison reveals the complex relationship between political theory and practice, highlighting how historical circumstances, leadership decisions, and practical constraints can shape the implementation of ideological visions in ways that may fundamentally alter their character and outcomes.

The Erosion of Decorum in Public Discourse

The nature of public debate has undergone a dramatic change in recent years. Civility and reasoned discourse—once the hallmarks of political and social commentary—have given way to something closer to a verbal battleground.

Today’s public exchanges are increasingly defined by inflammatory rhetoric, personal attacks, and an abandonment of long-held norms of decorum.

From Respectful Dialogue to Profanity-Laced Exchanges

The decline is nowhere more evident than in the normalization of profanity. What was once limited to private conversations or edgy entertainment now spills freely across digital platforms.

Social media comment threads, online forums, and even professional publications regularly feature language that, not long ago, would have been considered unacceptable in public life. This shift reflects a broader cultural preference for emotional expression over reasoned argument.

Substack and the Temptation of Provocation

Even Substack, often positioned as a refuge for serious, long-form writing, has not been immune.

When I first joined the platform, I was drawn by its promise of thoughtful essays outside the noise of traditional media. Yet I’ve noticed a sharp increase in profanity, personal insults, and derogatory comments—paired with a noticeable decline in reasoned discussion.

False claims, easily disproven with a quick fact-check, are repeated and restacked with little regard for accuracy. The subscription model, rewarding engagement over editorial oversight, can inadvertently encourage more inflammatory tones in order to hold readers’ attention.

The Meme Problem

Memes have only accelerated this decline. And here, I’ll admit my own complicity: I’ve created and shared memes to make ironic or satirical points. But over time, irony can blur into sarcasm, and satire into insult.

Memes thrive on simplification and emotional impact. Complex policies collapse into pithy slogans and mocking images. They’re shareable, entertaining, and easy—but rarely conducive to real understanding.

The result? Substantive debate gets replaced by fast, shallow exchanges of oversimplified (and often misleading) talking points.

From Essays to Punchlines

Essays once demanded careful argument: claims supported by evidence, acknowledgment of counterpoints, and respect for nuance. Memes demand only a laugh—or a groan.

Worse, their viral nature ensures that inflammatory or misleading content spreads faster than any correction ever could.

This isn’t just an aesthetic concern. When communication prioritizes winning over understanding, democracy suffers. Citizens grow less equipped to grapple with complex issues, and leaders find it easier to appeal to emotion rather than present workable solutions.

Can We Reverse the Trend?

The trajectory is worrisome—but not irreversible.

  • Platforms could design features that reward thoughtful engagement instead of amplifying outrage.
  • Educational institutions could recommit to teaching critical thinking and civil debate.
  • Individuals can model better behavior, remembering that persuasion usually requires respect.

Still, if I’m honest, I’m not optimistic. Too many incentives—from clicks to cash—push the culture of discourse in the opposite direction.

Final Thoughts

The health of our public discourse is the health of democracy itself. As writers, readers, and citizens, we carry responsibility for raising the standard.

Our words shape not only our immediate conversations but also the norms of civic life for generations to come. The choice is ours: continue down the path of hostility and simplification—or rebuild the habits of respect and reason.

I hope we choose the latter. But hope, at this moment, feels fragile.

The Electoral College: Should America Go Popular?

Few topics in American politics generate as much perennial debate as the Electoral College. Every four years, calls to abolish it resurface—often with renewed vigor when the electoral vote winner loses the popular vote, as happened in 1824, 1876, 1888, 2000, and 2016. The proposal is to elect the president by a nationwide popular vote, just as we elect governors and senators.

Why We Have an Electoral College

The Electoral College was a late-stage compromise at the Constitutional Convention of 1787. The framers were balancing multiple tensions:

  • Large vs. small states
  • Slave vs. free states
  • Congress choosing the president vs. direct election

Delegates feared that direct election by popular vote would favor populous states, allow urban centers to dominate rural areas, and encourage demagogues to campaign purely on popular passions. At the same time, they worried about giving Congress too much control over the executive branch.

The system for selecting the president—via the Electoral College—was partly designed to prevent direct popular influence. Its original intent, according to historians, was to empower electors (seen as more knowledgeable) and to ensure thoughtful deliberation in choosing the president, guarding against the masses being swayed by charm rather than substance.

Some delegates—like James Madison, James Wilson, and Gouverneur Morris—supported direct popular election of the president, while others, like Elbridge Gerry and Roger Sherman, explicitly distrusted direct election, believing that ordinary voters lacked the impartiality and knowledge to choose wisely.

Institutional and political bargaining ultimately shaped the final structure. Their solution: each state gets electors equal to its total number of representatives and senators. The addition of two electors for the senators ensures that the small states remain, on a population basis, overrepresented in the Electoral College.

State legislatures determine how electors are chosen (eventually, every state moved to popular election). Most states now award all their electoral votes to the statewide popular vote winner—“winner-take-all.”

The Electoral College thus emerged not as anyone’s ideal system, but as a workable compromise that balanced competing regional interests, philosophical concerns about democracy, and the practical realities of governing a large, diverse republic in the 18th century.

Pros of Eliminating the Electoral College

Equal Weight for Every Vote

The most compelling argument for eliminating the Electoral College centers on democratic equality. Under the current electoral system, a vote in Wyoming carries roughly three times the weight of a vote in California when measured by electoral votes per capita. To put this in real numbers: Wyoming has about 193,000 people per electoral vote, while California has about 718,000. This mathematical reality means that some Americans’ voices count more than others in selecting their president, a principle that seems to contradict the foundational democratic ideal of “one person, one vote.”
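For readers who like to check the arithmetic, here is a quick back-of-the-envelope sketch of where those per-vote figures come from. The population numbers are rough estimates chosen to match the figures quoted above, not official census counts.

```python
# Rough check of the "vote weight" claim. Population figures are
# approximations used for illustration, not official census counts.
states = {
    # state: (approximate population, electoral votes)
    "Wyoming": (581_000, 3),
    "California": (38_800_000, 54),
}

for name, (population, electors) in states.items():
    print(f"{name}: ~{population // electors:,} people per electoral vote")

# Relative weight of one Wyoming vote versus one California vote.
ratio = (38_800_000 / 54) / (581_000 / 3)
print(f"A Wyoming vote weighs roughly {ratio:.1f}x a California vote")
```

The exact ratio depends on which population estimates you plug in, but any reasonable choice lands in the neighborhood of three to four.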

A national popular vote would ensure that every American’s vote carries identical weight, regardless of geography. This approach would eliminate scenarios where candidates win the presidency while losing the popular vote. Such outcomes can undermine public confidence in democratic institutions and raise questions about the legitimacy of electoral results.

Reflects the Will of the Majority

In two of the seven presidential elections held since 2000 (2000 and 2016), the candidate with fewer total popular votes became president. While the framers accepted the possibility of divergence between the popular and electoral results, many modern Americans view such outcomes as undermining democratic legitimacy.

Encourages Nationwide Campaigning

Because many states are firmly “red” or “blue,” campaigns focus their energy on a handful of battleground states that could go either way—like Pennsylvania, Wisconsin, and Arizona. Under a popular vote, candidates would have an incentive to compete everywhere, because every additional vote counts the same regardless of location.

Simplifies the Process

The Electoral College system confuses many Americans and can seem archaic in the 21st century. A direct popular vote is straightforward and immediately understandable: the candidate who receives the most votes wins. This simplicity could increase public trust and participation in the democratic process.

Eliminates “Faithless Electors”

Although rare, faithless electors—those who cast electoral votes against their state’s popular choice—are possible under the current system. A direct election would remove this constitutional quirk.

Cons of Eliminating the Electoral College

Federalism Concerns

The United States is a union of states as well as a single nation. The Electoral College reinforces the role of states in presidential elections, reflecting their status as sovereign entities in certain respects. Abolishing it could be seen as eroding federalism by further centralizing power.

Risk of Regional Dominance

Opponents argue that without the Electoral College, candidates could focus disproportionately on high-population regions—California, Texas, Florida, New York—while ignoring rural states and smaller communities. Whether this would happen in practice is debated, but the perception of neglect could deepen regional divides.

Potential for Narrow-Margin Crises

In a popular vote system, a razor-thin margin would require a nationwide recount. Under the Electoral College, disputes are typically contained within a state (e.g., Florida in 2000). A national recount would be a logistical and political nightmare.

Constitutional Hurdles

Abolishing the Electoral College requires a constitutional amendment—an extraordinarily high bar. That means approval by two-thirds of both houses of Congress and ratification by three-quarters of the states. Smaller states, which benefit from the Electoral College’s vote weighting, have little incentive to approve such a change.

Intermediate Options

Since abolishing the Electoral College outright is politically unlikely in the near term, reform advocates have proposed middle-ground solutions.

The National Popular Vote Interstate Compact (NPVIC)

The NPVIC is an agreement among states to award all their electoral votes to the national popular vote winner, but it only takes effect once states totaling at least 270 electoral votes join. As of 2025, 17 states plus D.C. (totaling 209 electoral votes) have joined. This approach sidesteps a constitutional amendment while achieving the functional equivalent of a popular vote, but it relies on states’ willingness to cede control over their electoral votes. It has also never been legally tested and would likely face court challenges. To me, the greatest drawback is that states can withdraw at any time; in a closely contested and contentious election, states unhappy with the national outcome would likely abandon the compact.

Proportional Allocation of Electoral Votes

Instead of winner-take-all, states could allocate electoral votes in proportion to the statewide vote. Maine and Nebraska already use a variation of this idea, awarding some votes by congressional district. In theory, proportional allocation would reduce the outsized impact of battleground states and give minority views within each state some representation. But it could also increase the likelihood of no candidate reaching 270 electoral votes, thereby sending the election into the House of Representatives. And it still preserves the overrepresentation of smaller states, because it retains the two electors for senators.

If electors are awarded proportionally based on the statewide vote, the popular vote may not divide evenly into whole electors, and there is no constitutional provision for awarding partial electors. Some rounding rule would be needed, and the resulting distortion would be especially significant in states with only one or two representatives in the House.
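To make the rounding problem concrete, here is a minimal sketch using the largest-remainder method—one common rounding rule, chosen here for illustration, not something any state has actually adopted for electors. The candidate names and vote totals are hypothetical.

```python
# Sketch of within-state proportional elector allocation using the
# largest-remainder method. Candidates and vote totals are hypothetical;
# this is one possible rounding rule, not an adopted procedure.
def allocate_electors(votes: dict[str, int], electors: int) -> dict[str, int]:
    total = sum(votes.values())
    # Each candidate's ideal (fractional) share of the state's electors.
    quotas = {c: v * electors / total for c, v in votes.items()}
    # Award the whole-number part of each quota first.
    seats = {c: int(q) for c, q in quotas.items()}
    # Give any leftover electors to the largest fractional remainders.
    leftover = electors - sum(seats.values())
    by_remainder = sorted(quotas, key=lambda c: quotas[c] - seats[c], reverse=True)
    for c in by_remainder[:leftover]:
        seats[c] += 1
    return seats

# In a 3-elector state, a 55/45 split still rounds to a coarse 2-1 result.
print(allocate_electors({"A": 550_000, "B": 450_000}, 3))
```

Notice the distortion: 55% of the vote becomes two-thirds of the electors, exactly the small-state coarseness described above.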

If electors were instead awarded to the winner of each congressional district, this would encourage even more gerrymandering than we are currently seeing. Extreme gerrymandering could undermine any progress toward reflecting the popular vote, simply continuing the current mismatch of popular and electoral votes.

Gerrymandering, the practice of manipulating the boundaries of electoral districts to benefit a particular party or group, is nothing new in American politics; it originated in the early 19th century. The term was coined after an 1812 incident in Massachusetts, where Governor Elbridge Gerry signed a bill redrawing district lines to favor his party. One of the districts resembled a mythical salamander in shape, inspiring the portmanteau “Gerry-mander” in a satirical cartoon by Elkanah Tisdale that helped popularize the term. It is interesting that, since the gerrymander favored the Democratic-Republican Party and the newspaper that published the cartoon supported the Federalists, the creature was drawn to look not like a cute salamander but more like an ominous dragon.

Bonus Electoral Votes for National Popular Vote Winner

A hybrid idea would keep the Electoral College but award a fixed number of bonus electors (say, 100) to the national popular vote winner. This would almost guarantee alignment between the popular and electoral results without abandoning the current structure. This option maintains a state-based system and reduces the chance of a split result, but it would also require a constitutional amendment and add complexity that many voters may find confusing.

Feasibility of Change

Reforming or abolishing the Electoral College faces three main obstacles:

  • Constitutional Entrenchment – Article II and the 12th Amendment are clear about elector allocation. Full abolition would require one of the most difficult political feats in American governance—a constitutional amendment.
  • State Incentives – Smaller states and swing states have outsized influence under the current system. They are unlikely to support reforms that dilute their power.
  • Partisan Dynamics – Since recent Electoral College/popular vote splits have benefited Republicans, Democrats tend to favor reform, while Republicans tend to defend the status quo. That dynamic could shift if the pattern changes.

Conclusion

The Electoral College is both a relic of 18th-century compromises and a living feature of America’s federal structure. Its defenders argue that it protects smaller states, contains electoral disputes, and reinforces the states’ role in national governance. Its critics counter that it violates the principle of “one person, one vote” and distorts campaign priorities.

Abolishing it in favor of a direct popular vote would likely make presidential elections more democratic in the literal sense, but it would also raise questions about federalism, campaign strategy, and the handling of close results. The Electoral College preserves federalism and geographic balance but can produce outcomes that seem to contradict majority will.

Intermediate options like the NPVIC or proportional allocation may offer ways to mitigate the College’s most controversial effects without uprooting the constitutional framework but also face significant hurdles for implementation.

Whether reform happens will depend not just on the merits of the arguments, but on the political incentives of the states and the parties. Until those incentives shift, the Electoral College is likely to remain—imperfect, contentious, and uniquely American.

The Constitutional Foundations

Who Controls Elections?

Donald Trump has repeatedly claimed that the president should have broad authority to change how elections are conducted—particularly when it comes to abolishing mail-in voting and voting machines. As recently as August 2025, Trump pledged to issue an executive order banning mail-in ballots and voting machines ahead of the 2026 midterm elections, insisting that states must comply with his directive because, in his words, “States act merely as ‘agents’ for the Federal Government when it comes to counting and tabulating votes.… They are required to follow what the Federal Government, represented by the President of the United States, instructs them to do, FOR THE GOOD OF OUR COUNTRY”.

But this isn’t the first time he has suggested that he could control the election process. In March 2025, Trump issued a major executive order titled “Preserving and Protecting the Integrity of American Elections” that aims to expand presidential control over how elections are run. The order attempts to direct the Election Assistance Commission (EAC) — an independent, bipartisan agency — to mandate that voters show a passport or other similar document proving citizenship when registering to vote using the federal voter registration form. The executive order has been the subject of extensive litigation, and several federal judges have issued injunctions against various portions of it.

Amid the COVID-19 pandemic during his first term, President Trump publicly suggested delaying the election. Constitutional scholars and members of Congress quickly pointed out he lacked such authority—the date of federal elections is set by statute, and only Congress could change it.

The U.S. Constitution provides a clear framework for who holds the authority to control elections, and it is not the president.

Article I, Section 4: Congressional and State Authority

The main constitutional authority over U.S. elections is found in Article I, Section 4, commonly called the “Elections Clause.” It states:

“The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof; but the Congress may at any time by Law make or alter such Regulations…”

This language charges state legislatures with defining the details of congressional elections, including logistics and procedures. Importantly, Congress retains the power to override state laws and impose federal rules—such as standardized Election Days or regulations for voter registration and districting.

What does this mean for the president? The Constitution is clear: the president has no direct authority to determine the conduct of congressional elections or to unilaterally change the way federal elections are held. Presidential influence over elections is limited to signing or vetoing congressional legislation, not acting alone.

Article II and the 12th Amendment: Presidential Elections

Presidential elections are regulated by Article II, which created the Electoral College, and by the 12th Amendment.

Article II, Section 1 provides:

“Each State shall appoint, in such Manner as the Legislature thereof may direct, a Number of Electors…”

States arrange how their presidential electors are selected, subject to changes imposed by congressional law. The federal government, through Congress (not the president!), determines the timing of choosing electors and casting electoral votes. The 12th Amendment sets procedures for how electors meet and vote for both president and vice president.

Again, neither Article II nor the 12th Amendment gives the president authority to independently set election rules. At most, the president can recommend reforms, sign laws crafted by Congress, and advocate for certain policies.

Historical Examples of Limits on Presidential Power Over Elections

Even during national crises, presidents have not been able to unilaterally change election rules:

  • 1864 Election (Lincoln): Despite the Civil War, Abraham Lincoln did not postpone or suspend the presidential election. Elections were carried out in the states, including special arrangements for soldiers to vote.
  • 1944 Election (Roosevelt): In the midst of World War II, Franklin Roosevelt stood for re-election. Again, no effort was made by the president to change election laws.

Presidential Powers: What Can the Executive Branch Do?

The president’s responsibilities in elections are more limited than you might expect and are essentially ministerial and ceremonial, not regulatory.

The executive power in Article II invests the president with broad national leadership, command of the military, and responsibility to “take Care that the Laws be faithfully executed”. This can include enforcing voting rights laws and overseeing federal agencies that support election integrity. However, the Constitution and decades of legal precedent restrict the president from directly controlling election rules.

  • The president cannot by executive order change state rules for voting methods (e.g., mail-in voting, voting machines).
  • The president cannot unilaterally suspend or postpone federal elections.
  • The president cannot direct states to alter their voter registration, polling locations, or other administrative details.
  • The president has no role in certifying state results. That function belongs to state officials, with Congress responsible for counting electoral votes.
  • The president can direct federal agencies like the Department of Justice to enforce federal election laws, protect voting rights, and intervene in cases of fraud or intimidation. The president does not, however, have the authority to direct federal agencies to act in a manner contrary to the law.

When presidents have sought to influence election administration more directly, courts and Congress have reaffirmed the constitutional boundaries. For example, efforts to change the date of an election or prohibit certain voting methods without congressional action have consistently failed in the courts.

Congressional Power: The Real Check on Election Rules

While state legislatures remain the primary managers of elections, Congress retains the final word. The Supreme Court has confirmed that congressional law “preempts” conflicting state rules in matters of federal elections. When Congress acts—through laws like the Voting Rights Act, the Help America Vote Act, and the National Voter Registration Act—states must comply, and the president’s role is simply to sign or veto those laws.

Congress has used its power over the years to:

  • Set a uniform national Election Day.
  • Establish protections for disabled voters and overseas citizens.
  • Mandate requirements around voter registration and accessibility.
  • Regulate campaign finance and transparency.

Checks, Balances, and Modern Tensions

Recent political debates have seen calls for presidents to take stronger action on election oversight, especially regarding the use of mail-in ballots or voting machines. However, these calls run up against clear constitutional limits: the president cannot rewrite the rules of elections without Congress or state legislatures.

Any presidential attempt to do so by executive order would face swift legal challenges and almost certainly be invalidated. The intent of the Framers was to divide election power between the states and Congress, with the president largely excluded from direct rule-making authority. This balance—central to federalism—protects elections from potential abuses of executive power and ensures that reforms require broad democratic consensus. While presidents can champion reforms and enforce federal laws supporting fair elections, they are constitutionally forbidden from unilaterally changing election rules.

Conclusion

The framework isn’t perfect—it can create confusion when state and federal authorities clash. But the basic principle remains: states run elections. Congress can regulate them within constitutional bounds, and presidents enforce the resulting laws.

For citizens, lawmakers, and presidents alike, respect for these boundaries secures the foundation of American democracy. The right to vote—and the integrity of how that vote is counted—is protected not by any single leader, but by enduring constitutional principles and the shared power of states and Congress.

Citizens United: A Supreme Court Decision That Reshaped American Politics

In January 2010, the U.S. Supreme Court issued one of the most consequential and controversial rulings in modern political history: Citizens United v. Federal Election Commission. The 5–4 decision dramatically altered the legal landscape of campaign finance, opening the door for unlimited spending by corporations, unions, and certain nonprofit organizations, potentially giving them disproportionate influence in election campaigning.

Hailed by some as a victory for free expression and condemned by others as unleashing a torrent of special interest cash into politics, Citizens United has continued to define the shape of campaign financing for the past 15 years.

The Origins of the Case

The lawsuit began with a film. In 2008, Citizens United, a conservative nonprofit organization led by David Bossie, produced “Hillary: The Movie,” a 90-minute documentary highly critical of then-Senator Hillary Clinton, who was seeking the Democratic presidential nomination. Citizens United wanted to air the film on cable television through video-on-demand services within 30 days of the Democratic primary elections and planned to run promotional advertisements.

However, federal campaign finance law under the Bipartisan Campaign Reform Act (also known as McCain-Feingold) prohibited corporations from using general treasury funds for “electioneering communications” within specific timeframes before elections.

Fearing civil and criminal penalties, Citizens United sought a court declaration that their film and promotional materials were exempt from these restrictions.

The organization argued that because the documentary didn’t explicitly tell viewers how to vote, it shouldn’t be classified as campaign advocacy subject to corporate spending limits. A federal district court disagreed, ruling unanimously that the film could only be interpreted as telling viewers that Clinton was “unfit for office” and encouraging them to vote against her.

The Supreme Court’s Landmark Ruling

Citizens United appealed, and the Supreme Court ultimately agreed to hear the case, raising questions not just about this particular film, but about the broader constitutionality of limiting corporate and union election expenditures.

In their decision, the justices went far beyond the narrow question in the original case. In a 5-4 decision authored by Justice Anthony Kennedy, and joined by Chief Justice Roberts and Justices Thomas, Scalia, and Alito, the Court ruled that restrictions on independent political expenditures by corporations and unions violated the First Amendment’s free speech protections.

The majority opinion overturned two significant precedents: Austin v. Michigan Chamber of Commerce (1990) and portions of McConnell v. Federal Election Commission (2003). Justice Kennedy wrote that political speech is “indispensable to decision making in a democracy, and this is no less true because the speech comes from a corporation.”

Justice John Paul Stevens, writing in dissent, warned that the decision represented “a rejection of the common sense of the American people, who have recognized a need to prevent corporations from undermining self-government.”

The ruling did not lift limits on direct contributions to candidates; those caps remain in place. But it cleared the way for unlimited spending on independent political advocacy, so long as it is not coordinated with a candidate’s campaign.

The majority decision made two key assumptions: that independent political spending wouldn’t lead to corruption because it would be transparent, and that it would remain truly separate from candidate campaigns. Both assumptions have proven incorrect in practice.

Transforming the Political Landscape

The decision’s impact has been dramatic and far-reaching. Outside spending in federal elections skyrocketed from $730 million at the time of the ruling to $4.5 billion in 2024. The ruling enabled the creation of “super PACs”—political action committees that can raise and spend unlimited amounts as long as they maintain nominal (sometimes fictional) independence from campaigns.

Each election cycle since 2010 has seen a new record in campaign spending, much of it “outside money”—funds raised and spent by organizations not directly affiliated with candidates or parties. The influence of wealthy donors has only grown, with some estimates suggesting that the vast majority of outside election funding comes from a small handful of deep-pocketed interests.

Perhaps more concerning to democracy advocates is the rise of “dark money” — political spending where the funding source remains secret. Dark money expenditures increased from less than $5 million in 2006 to over $1 billion in the 2024 presidential election alone.

The 2024 election exemplified Citizens United’s influence. Billionaire-backed super PACs helped close substantial fundraising gaps, with groups like those funded by Elon Musk taking on core campaign functions including voter outreach operations, while supposedly remaining independent. This concentration of political influence among ultra-wealthy donors represents a fusion of private wealth and political power unseen since the late 19th century.

Current Political Implications

Today, Citizens United remains deeply unpopular with the American public. A Washington Post–ABC News poll found that 80% of Americans opposed the Citizens United ruling, including 85% of Democrats, 76% of Republicans, and 81% of independents.

The ruling has created what some campaign finance experts call a “corruption bomb” effect, where wealthy individuals can effectively buy political influence through seemingly independent expenditures. Recent legislative efforts, including the proposed Abolish Super PACs Act introduced in Congress, aim to restore some limits on political spending, though passage remains unlikely.

The Citizens United decision continues to shape American politics in 2025. Campaigns increasingly rely on outside groups to fund negative advertising, which can deepen political polarization. With no upper limit on independent expenditures, candidates may feel beholden to the interests of big-money backers who can tip the scales in tight races.

Supporters of the ruling argue that it promotes free speech, enabling more voices to be heard in the political arena. They contend that limiting corporate or union spending would amount to government censorship. Opponents counter that equating money with speech effectively drowns out the voices of ordinary voters who cannot match the spending power of corporations or billionaires.

Some reform advocates are pursuing constitutional amendments to overturn Citizens United, though such efforts face steep political and procedural hurdles. Others push for enhanced disclosure laws to ensure voters know who is funding political messages.

As the 2026 midterm elections approach, Citizens United’s legacy continues to define the relationship between money and political power. It raises fundamental questions about whether democratic governance can function effectively when political speech is increasingly dominated by the ultra-wealthy. Should the First Amendment protect unlimited political spending by corporations and unions, or does such spending distort democracy by giving disproportionate influence to the wealthiest? The Court’s ruling in Citizens United has transformed the way American elections are fought, and the consequences of that decision are still unfolding.
