Grumpy opinions about everything.


Bread and Circuses: From Ancient Rome to Modern America

“Already long ago, from when we sold our vote to no man, the People have abdicated our duties; for the People who once upon a time handed out military command, high civil office, legions — everything, now restrains itself and anxiously hopes for just two things: bread and circuses.”

Nearly 2,000 years ago, Roman satirist Juvenal penned one of history’s most enduring political observations: “Two things only the people anxiously desire — bread and circuses.” Writing around 100 CE in his Satire X, Juvenal wasn’t celebrating this phenomenon—he was lamenting it. The poet watched as Roman citizens traded their political engagement for free grain and spectacular entertainment, becoming passive spectators rather than active participants in public life. The phrase has endured for nearly two millennia as shorthand for a troubling political dynamic: entertainment and consumption replacing civic engagement and accountability.

The Roman Warning

Juvenal’s critique came at a pivotal moment in Roman history. The republic had collapsed, and emperors like Augustus had systematically dismantled democratic institutions. Rather than revolt, Roman citizens seemed content as long as the government provided basic sustenance (the grain dole called annona) and elaborate spectacles at venues like the Colosseum. Political participation withered as people focused on immediate pleasures rather than long-term civic responsibilities.

The strategy worked brilliantly for Roman rulers. Keep the masses fed and entertained, and they won’t question your authority or demand meaningful representation. It was political control through distraction—a form of soft authoritarianism that maintained order without overt oppression.  The policy was effective in the short term—peace in the streets and loyalty to the emperors—but disastrous over time. Rome’s population became disengaged from politics, while real power consolidated in the hands of a few.

Modern American Parallels

Fast-forward to contemporary America, and Juvenal’s observation feels uncomfortably relevant. While we don’t have gladiatorial games, we do have our own version of “circuses”—professional sports, reality TV, social media feeds, and celebrity culture that dominate public attention. These aren’t inherently problematic, but they become concerning when they crowd out civic engagement.

Our modern “bread” takes various forms: government assistance programs, subsidies, and economic policies designed to maintain consumer spending. We are saturated with cheap goods, instant delivery services, and mass consumerism. For many, economic struggles are temporarily softened by accessible consumption, from fast food to online shopping. Yet material comfort often masks deeper inequalities and systemic challenges—wage stagnation, healthcare costs, and mounting national debt. These programs often serve legitimate purposes, but they can also function as political tools to maintain public satisfaction and suppress dissent.

Consider how political campaigns increasingly focus on entertainment value rather than substantive policy debates. Politicians hire social media managers and appear on talk shows, understanding that capturing attention often matters more than presenting coherent governance plans. Meanwhile, voter turnout for local elections—where citizens have the most direct impact—remains dismally low.

The Distraction Economy

Perhaps most striking is how our information landscape mirrors Roman spectacles. We’re bombarded with sensational news, viral content, and manufactured controversies that generate strong emotional reactions but little productive action. Complex policy issues get reduced to soundbites and memes, making genuine democratic deliberation increasingly difficult.

Social media algorithms are specifically optimized for engagement, not enlightenment. They feed us content designed to provoke reactions—anger, outrage, schadenfreude—rather than encourage thoughtful consideration of difficult issues. This creates a population that feels politically engaged through constant consumption of political content while remaining largely passive in actual civic participation.

The danger of “bread and circuses” in modern America lies in apathy. When civic participation declines, voter turnout falls, and policy debates get reduced to simplistic slogans, elites face less scrutiny. The result is a weakened democracy, vulnerable to manipulation and short-term thinking.

Breaking the Cycle

Juvenal’s warning doesn’t mean we should abandon entertainment or social programs. Rather, it suggests we need intentional balance. Democratic societies thrive when citizens remain actively engaged in governance beyond just voting every few years.

This means staying informed about local issues, attending town halls, contacting representatives, and participating in community organizations. It means choosing substance over spectacle and long-term thinking over immediate gratification.

The Roman Republic fell partly because its citizens stopped paying attention to governance. Juvenal’s “bread and circuses” reminds us that democracy requires constant vigilance—and that comfortable distraction can be freedom’s most seductive enemy.

The Communist Dream vs. Stalinist Reality: A Tale of Two Visions

Recently, I’ve been looking at various political philosophies. I’ve written about fascism, totalitarianism, authoritarianism, autocracy and kleptocracy. In this post I’m going to look at theoretical communism versus the reality of Stalinist communism, and I would be remiss if I didn’t at least briefly mention oligarchy as currently practiced in Russia.

The gap between Karl Marx’s theoretical vision of communism and its implementation under Joseph Stalin’s leadership in the Soviet Union represents one of history’s most significant divergences between ideological theory and political practice. While both claimed the same ultimate goal—a classless, stateless society—their approaches and outcomes differed in fundamental ways that continue to shape our world today: one promised a workers’ utopia; the other produced a brutal dictatorship.

The Marxist Vision

Marx envisioned communism as the natural culmination of historical progress, emerging from the inherent conflicts within capitalism. In his theoretical framework, the working class (proletariat) would eventually overthrow the capitalist system through revolution, leading to a transitional socialist phase before achieving true communism. This final stage would be characterized by the absence of social classes, the absence of private ownership of the means of production, and ultimately, the absence of the state itself.

Central to Marx’s concept was the idea that communism would emerge from highly developed capitalist societies where industrial production had reached its peak. He believed that the abundance created by advanced capitalism would make scarcity obsolete, allowing society to operate according to the principle “from each according to his ability, to each according to his needs.” The state, having served its purpose as an instrument of class rule, would simply “wither away” as class distinctions disappeared.

Marx also emphasized that the transition to communism would be an international phenomenon. He argued that capitalism was inherently global, and therefore its replacement would necessarily be worldwide. The famous rallying cry “Workers of the world, unite!” reflected this internationalist perspective, suggesting that communist revolution would spread across national boundaries as workers recognized their common interests.

The Stalinist Implementation

Vladimir Lenin took firm control of Russia following the revolution in 1917 and oversaw the creation of a state characterized by centralization, suppression of opposition parties, and the establishment of the Cheka (secret police) to enforce party rule. Economically, Lenin’s government shifted from War Communism (state control of production during the civil war) to the New Economic Policy (NEP) in 1921, which allowed limited private trade and small-scale capitalism to stabilize the economy. The country formally became the Union of Soviet Socialist Republics in 1922. This period laid the groundwork for the highly centralized, totalitarian state that Stalin built after Lenin’s death in 1924.

Stalin’s approach to building communism in the Soviet Union diverged sharply from Marx’s theoretical blueprint. Rather than emerging from advanced capitalism, Stalin attempted to construct socialism in a largely agricultural society that had barely begun industrialization. This fundamental difference in starting conditions shaped every aspect of the Soviet experiment.

Instead of the gradual withering away of the state, Stalin presided over an unprecedented expansion of state power. The Soviet government under his leadership controlled virtually every aspect of economic and social life, from industrial production to agricultural collectivization to cultural expression. The state became not a temporary tool for managing the transition to communism, but a permanent and increasingly powerful institution that dominated all aspects of society. By the early 1930s, Stalin had centralized all power in his own hands, sidelining collective decision-making bodies such as the Politburo and the soviets.

Marx envisioned rule by the proletariat, with power shared equally among all people. Stalin instead fostered a cult of personality through relentless propaganda. His image appeared on posters, on statues, and in schools. History books were rewritten to credit him with Soviet successes—often erasing Lenin, Trotsky, and others. He was referred to as the “Father of Nations,” the “Brilliant Genius,” and the “Great Leader.” Loyalty to Stalin became more important than loyalty to the Communist Party or its ideals. The government and the economy operated at his personal direction, enforced by the secret police, censorship, executions, and mass purges of dissidents.

Stalin implemented a command economy, in which the government or central authority makes all major decisions about production, investment, pricing, and the allocation of resources, rather than leaving those choices to market forces. In this system, planners typically set production targets, control industries, and determine what goods and services will be available, often with the goal of achieving social or political objectives such as central control and rapid industrialization. This is the direct opposite of the voluntary cooperation Marx had envisioned. The forced collectivization of peasants onto government farms, rapid industrialization through five-year plans, and the use of prison labor in gulags represented a top-down model of development that contradicted Marx’s emphasis on worker empowerment and democratic participation.

Where Marx emphasized emancipation and freedom for workers, Stalinist policies involved widespread repression, political purges, forced labor camps, and censorship. Most notable is the period that came to be known as the “Great Purge,” also called the “Great Terror,” a campaign of political repression between 1936 and 1938. It involved widespread arrests, forced confessions, show trials, executions, and imprisonment in labor camps (the Gulag system). Stalin accused perceived political rivals, military leaders, intellectuals, and ordinary citizens of being disloyal or conducting “counter-revolutionary” activities. It is estimated that about 700,000 people were executed by firing squad after being branded “enemies of the people” in show trials or secret proceedings.  Another 1.5 to 2 million people were arrested and sent to Gulag labor camps, prisons, or exile. Many died from overwork, malnutrition, disease, or harsh conditions.

Perhaps most significantly, Stalin abandoned Marx’s internationalist vision in favor of “socialism in one country.” This doctrine, developed in the 1920s, argued that the Soviet Union could build socialism independently of worldwide revolution. This shift not only contradicted Marx’s theoretical framework but also led to policies that prioritized Soviet national interests over international worker solidarity.

Key Contradictions

The differences between Marxist theory and Stalinist practice created several fundamental contradictions. Where Marx predicted the elimination of social classes, Stalin’s Soviet Union developed a rigid hierarchy with the Communist Party elite at the top, followed by technical specialists, workers, and peasants. This new class structure, while different from capitalist society, still involved significant inequalities in power, privilege, and access to resources.

Marx’s vision of worker control over production stood in stark contrast to Stalin’s centralized command economy. Rather than workers democratically managing their workplaces, Soviet workers found themselves subject to increasingly detailed state control over their labor. The factory became less a site of worker empowerment than a component in a vast state machine directed from Moscow.

The treatment of dissent also revealed fundamental differences. Marx believed that communism would eliminate the need for political repression as class conflicts disappeared. Stalin’s regime, however, relied extensively on surveillance, censorship, and violent suppression of opposition. The extensive use of terror against both perceived enemies and ordinary citizens contradicted Marx’s vision of a society based on cooperation and mutual benefit.

Modern Russia

At this point, I want to mention something about modern Russia and its current governmental and economic situation since the breakup of the Soviet Union.

An oligarchy is a form of government where power rests in the hands of a small number of people. These individuals typically come from similar backgrounds – they might be distinguished by wealth, family ties, education, corporate control, military influence, or religious authority. The word comes from the Greek “oligarkhia,” meaning “rule by few.” In an oligarchy, this small group makes the major political and economic decisions that affect the entire population, often prioritizing their own interests over those of the broader society.

Modern Russia’s economy is often described as having oligarchic features because a relatively small group of wealthy business leaders—many of whom made their fortunes during the chaotic privatization of the 1990s—maintain outsized influence over key industries like energy, banking, and natural resources. While Russia is technically a mixed economy with both private and state involvement, political connections determine who gains access to wealth and power. This creates a system where economic opportunity is concentrated among elites closely tied to the Kremlin, most closely resembling an oligarchy.

Historical Context and Consequences

Understanding the differences between Marxist theory and Stalinist implementation requires considering the historical context in which Stalin operated. The Soviet Union faced external threats, internal resistance, and the enormous challenge of rapid modernization. Stalin’s supporters argued that harsh measures were necessary to defend the revolution and build industrial capacity quickly enough to survive in a hostile international environment.

Critics, however, contend that Stalin’s methods created a system that was fundamentally incompatible with Marx’s vision of human liberation. The concentration of power in a single party—let alone a single person—combined with the suppression of democratic institutions and the extensive use of violence and coercion, demonstrates that Stalinist practice moved away from, rather than toward, Marx’s goals.

The legacy of this divergence continues to influence contemporary political debates. Supporters of Marxist theory often argue that Stalin’s failures demonstrate the dangers of abandoning egalitarian principles and internationalist perspectives. Meanwhile, critics of communism point to the Soviet experience as evidence that Marxist ideals are inherently unrealistic or even dangerous.

This comparison reveals the complex relationship between political theory and practice, highlighting how historical circumstances, leadership decisions, and practical constraints can shape the implementation of ideological visions in ways that may fundamentally alter their character and outcomes.

The Erosion of Decorum in Public Discourse

The nature of public debate has undergone a dramatic change in recent years. Civility and reasoned discourse—once the hallmarks of political and social commentary—have given way to something closer to a verbal battleground.

Today’s public exchanges are increasingly defined by inflammatory rhetoric, personal attacks, and an abandonment of long-held norms of decorum.

From Respectful Dialogue to Profanity-Laced Exchanges

The decline is nowhere more evident than in the normalization of profanity. What was once limited to private conversations or edgy entertainment now spills freely across digital platforms.

Social media comment threads, online forums, and even professional publications regularly feature language that, not long ago, would have been considered unacceptable in public life. This shift reflects a broader cultural preference for emotional expression over reasoned argument.

Substack and the Temptation of Provocation

Even Substack, often positioned as a refuge for serious, long-form writing, has not been immune.

When I first joined the platform, I was drawn by its promise of thoughtful essays outside the noise of traditional media. Yet I’ve noticed a sharp increase in profanity, personal insults, and derogatory comments—paired with a noticeable decline in reasoned discussion.

False claims, easily disproven with a quick fact-check, are repeated and restacked with little regard for accuracy. The subscription model, which rewards engagement rather than editorial rigor, can inadvertently encourage more inflammatory tones in order to hold readers’ attention.

The Meme Problem

Memes have only accelerated this decline. And here, I’ll admit my own complicity: I’ve created and shared memes to make ironic or satirical points. But over time, irony can blur into sarcasm, and satire into insult.

Memes thrive on simplification and emotional impact. Complex policies collapse into pithy slogans and mocking images. They’re shareable, entertaining, and easy—but rarely conducive to real understanding.

The result? Substantive debate gets replaced by fast, shallow exchanges of oversimplified (and often misleading) talking points.

From Essays to Punchlines

Essays once demanded careful argument: claims supported by evidence, acknowledgment of counterpoints, and respect for nuance. Memes demand only a laugh—or a groan.

Worse, their viral nature ensures that inflammatory or misleading content spreads faster than any correction ever could.

This isn’t just an aesthetic concern. When communication prioritizes winning over understanding, democracy suffers. Citizens grow less equipped to grapple with complex issues, and leaders find it easier to appeal to emotion rather than present workable solutions.

Can We Reverse the Trend?

The trajectory is worrisome—but not irreversible.

  • Platforms could design features that reward thoughtful engagement instead of amplifying outrage.
  • Educational institutions could recommit to teaching critical thinking and civil debate.
  • Individuals can model better behavior, remembering that persuasion usually requires respect.

Still, if I’m honest, I’m not optimistic. Too many incentives—from clicks to cash—push the culture of discourse in the opposite direction.

Final Thoughts

The health of our public discourse is the health of democracy itself. As writers, readers, and citizens, we carry responsibility for raising the standard.

Our words shape not only our immediate conversations but also the norms of civic life for generations to come. The choice is ours: continue down the path of hostility and simplification—or rebuild the habits of respect and reason.

I hope we choose the latter. But hope, at this moment, feels fragile.

The Electoral College: Should America Go Popular?

Few topics in American politics generate as much perennial debate as the Electoral College. Every four years, calls to abolish it resurface—often with renewed vigor when the electoral vote winner loses the popular vote, as happened in 1876, 1888, 2000, and 2016 (and, arguably, in 1824, when the House of Representatives decided the election). The proposal is to elect the president by a nationwide popular vote, just as we do governors and senators.

Why We Have an Electoral College

The Electoral College was a late-stage compromise at the Constitutional Convention of 1787. The framers were balancing multiple tensions:

  • Large vs. small states
  • Slave vs. free states
  • Congress choosing the president vs. direct election

Delegates feared that direct election by popular vote would favor populous states, allow urban centers to dominate rural areas, and encourage demagogues to campaign purely on popular passions. At the same time, they worried about giving Congress too much control over the executive branch.

The system for selecting the president—via the Electoral College—was partly designed to prevent direct popular influence. Its original intent, according to historians, was to empower electors (seen as more knowledgeable) and to ensure thoughtful deliberation in choosing the president, guarding against the masses being swayed by charm rather than substance.

Some delegates—like James Madison, James Wilson, and Gouverneur Morris—supported direct popular election of the president, while others, like Elbridge Gerry and Roger Sherman, explicitly voiced distrust in direct election of the president and believed ordinary voters lacked impartiality or sufficient knowledge. 

Institutional and political bargaining ultimately shaped the final structure. The delegates’ solution: each state gets electors equal to its total number of representatives and senators. The addition of two electors for the senators ensures that the small states remain, on a population basis, overrepresented in the Electoral College.

State legislatures determine how electors are chosen (eventually, every state moved to popular election). Most states now award all their electoral votes to the statewide popular vote winner—“winner-take-all.”

The Electoral College thus emerged not as anyone’s ideal system, but as a workable compromise that balanced competing regional interests, philosophical concerns about democracy, and the practical realities of governing a large, diverse republic in the 18th century.

Pros of Eliminating the Electoral College

Equal Weight for Every Vote

The most compelling argument for eliminating the Electoral College centers on democratic equality. Under the current system, a vote in Wyoming carries more than three times the weight of a vote in California when measured by electoral votes per capita. To put this in real numbers: Wyoming has about 193,000 people per electoral vote, while California has about 718,000. This mathematical reality means that some Americans’ voices count more than others in selecting their president, a principle that seems to contradict the foundational democratic ideal of “one person, one vote.”
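The vote-weight disparity is simple arithmetic. Here is a minimal sketch, using the approximate population and elector figures cited above (California held 55 electoral votes through the 2020 election; the rounded numbers are illustrative, not exact census data):

```python
# Illustrative sketch of the "vote weight" arithmetic. Population and
# elector counts are the approximate figures cited in the text
# (California held 55 electoral votes through the 2020 election).
states = {
    "Wyoming": {"population": 577_000, "electors": 3},
    "California": {"population": 39_500_000, "electors": 55},
}

def people_per_elector(name: str) -> float:
    s = states[name]
    return s["population"] / s["electors"]

wy = people_per_elector("Wyoming")     # ~192,000 people per elector
ca = people_per_elector("California")  # ~718,000 people per elector

# A Wyoming ballot carries several times the electoral weight of a
# California ballot:
print(f"Relative weight: {ca / wy:.1f}x")  # prints "Relative weight: 3.7x"
```

Rerun with current census figures and the ratio shifts slightly, but it stays well above three.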

A national popular vote would ensure that every American’s vote carries identical weight, regardless of geography. This approach would eliminate scenarios where candidates win the presidency while losing the popular vote. Such outcomes can undermine public confidence in democratic institutions and raise questions about the legitimacy of electoral results.

Reflects the Will of the Majority

Twice in this century (2000 and 2016), the candidate with fewer total popular votes became president. While the framers accepted the possibility of divergence between the popular and electoral results, many modern Americans view such outcomes as undermining democratic legitimacy.

Encourages Nationwide Campaigning

Because many states are firmly “red” or “blue,” campaigns focus their energy on a handful of battleground states that could go either way—like Pennsylvania, Wisconsin, and Arizona. Under a popular vote, candidates would have an incentive to compete everywhere, because every additional vote counts the same regardless of location.

Simplifies the Process

The Electoral College system confuses many Americans and can seem archaic in the 21st century. A direct popular vote is straightforward and immediately understandable: the candidate who receives the most votes wins. This simplicity could increase public trust and participation in the democratic process.

Eliminates “Faithless Electors”

Although rare, faithless electors—those who cast electoral votes against their state’s popular choice—are possible under the current system. A direct election would remove this constitutional quirk.

Cons of Eliminating the Electoral College

Federalism Concerns

The United States is a union of states as well as a single nation. The Electoral College reinforces the role of states in presidential elections, reflecting their status as sovereign entities in certain respects. Abolishing it could be seen as eroding federalism by further centralizing power.

Risk of Regional Dominance

Opponents argue that without the Electoral College, candidates could focus disproportionately on high-population regions—California, Texas, Florida, New York—while ignoring rural states and smaller communities. Whether this would happen in practice is debated, but the perception of neglect could deepen regional divides.

Potential for Narrow-Margin Crises

In a popular vote system, a razor-thin margin would require a nationwide recount. Under the Electoral College, disputes are typically contained within a state (e.g., Florida in 2000). A national recount would be a logistical and political nightmare.

Constitutional Hurdles

Abolishing the Electoral College requires a constitutional amendment—an extraordinarily high bar. That means approval by two-thirds of both houses of Congress and ratification by three-quarters of the states. Smaller states, which benefit from the Electoral College’s vote weighting, have little incentive to approve such a change.

Intermediate Options

Since abolishing the Electoral College outright is politically unlikely in the near term, reform advocates have proposed middle-ground solutions.

The National Popular Vote Interstate Compact (NPVIC)

The NPVIC is an agreement among states to award all their electoral votes to the national popular vote winner, but it takes effect only once states totaling at least 270 electoral votes join. As of 2025, 17 states plus D.C. (totaling 209 electoral votes) have joined. This approach sidesteps a constitutional amendment but relies on states’ willingness to cede control over their electoral votes. The compact could be implemented without amending the Constitution and would achieve the functional equivalent of a popular vote. However, it has not been legally tested and would likely face court challenges. To me, the greatest drawback is that states could withdraw at any time. In a closely contested and contentious election, states unhappy with the national outcome would likely withdraw from the compact.

Proportional Allocation of Electoral Votes

Instead of winner-take-all, states could allocate electoral votes proportionally to the share of the statewide vote. Maine and Nebraska already use a variation of this idea, awarding some votes by congressional district. Theoretically, proportional allocation would reduce the impact of battleground states and increase the representation of minority views within states. But it could also increase the likelihood of no candidate reaching 270 electoral votes, thereby sending the election into the House of Representatives. And it still preserves the overrepresentation of smaller states, because it retains the two electors for senators.

If electors are awarded proportionally based on the statewide vote, the popular vote may not divide in a way that allows awarding whole electors, and there is no constitutional provision for awarding partial electors. This rounding problem is especially significant in states with only one or two representatives in the House.
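To make the whole-elector problem concrete, here is a small sketch—my own illustration with invented vote totals, not any state’s actual rule—that allocates a hypothetical small state’s electors proportionally using the largest-remainder method:

```python
# Hypothetical illustration of proportional allocation with whole electors,
# using the largest-remainder (Hamilton) method. The vote counts below are
# invented for the example.
def allocate_electors(votes, total_electors):
    """Allocate whole electors proportionally to vote counts."""
    total_votes = sum(votes.values())
    # Exact (fractional) entitlement for each candidate
    quotas = {c: v / total_votes * total_electors for c, v in votes.items()}
    # Everyone first gets the whole-number part of their quota...
    seats = {c: int(q) for c, q in quotas.items()}
    # ...then any leftover electors go to the largest fractional remainders
    leftovers = total_electors - sum(seats.values())
    by_remainder = sorted(quotas, key=lambda c: quotas[c] - seats[c], reverse=True)
    for c in by_remainder[:leftovers]:
        seats[c] += 1
    return seats

# A 3-elector state (the smallest possible) with a close three-way race:
result = allocate_electors({"A": 48_000, "B": 45_000, "C": 7_000}, 3)
print(result)  # → {'A': 2, 'B': 1, 'C': 0}
```

Note the distortion: candidate B’s 45% of the vote becomes one elector in three, and candidate C’s 7% vanishes entirely. In a three-elector state, “proportional” allocation can only ever be proportional in steps of one-third.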

If electors were awarded to the winners of each Congressional District this would encourage even more gerrymandering than we are currently seeing. Extreme gerrymandering could undermine any progress towards reflecting the popular vote, simply continuing the current mismatch of popular and electoral votes.

Gerrymandering is a political practice that involves manipulating the boundaries of electoral districts to benefit a particular party or group. It is nothing new in American politics, originating in the early 19th century. The term “gerrymandering” was coined after an 1812 incident in Massachusetts, where Governor Elbridge Gerry signed a bill redrawing district lines to favor his party. One of the districts resembled a mythical salamander in shape, inspiring the portmanteau “Gerry-mander” in a satirical cartoon by Elkanah Tisdale that helped popularize the term. It’s interesting that, since gerrymandering favored the Democratic-Republican Party and the newspaper that published the cartoon supported the Federalist Party, the creature was drawn to look not like a cute salamander but more like an ominous dragon.

Bonus Electoral Votes for National Popular Vote Winner

A hybrid idea would keep the Electoral College but award a fixed number of bonus electors (say, 100) to the national popular vote winner. This would almost guarantee alignment between the popular and electoral results without abandoning the current structure.  This option maintains a state-based system and reduces the chance of a split result. But it would also require a constitutional amendment and add complexity that many voters may find confusing.

Feasibility of Change

Reforming or abolishing the Electoral College faces three main obstacles:

  • Constitutional Entrenchment – Article II and the 12th Amendment are clear about elector allocation. Full abolition would require one of the most difficult political feats in American governance—a constitutional amendment.
  • State Incentives – Smaller states and swing states have outsized influence under the current system. They are unlikely to support reforms that dilute their power.
  • Partisan Dynamics – Since recent Electoral College/popular vote splits have benefited Republicans, Democrats tend to favor reform, while Republicans tend to defend the status quo. That dynamic could shift if the pattern changes.

Conclusion

The Electoral College is both a relic of 18th-century compromises and a living feature of America’s federal structure. Its defenders argue that it protects smaller states, contains electoral disputes, and reinforces the states’ role in national governance. Its critics counter that it violates the principle of “one person, one vote” and distorts campaign priorities.

Abolishing it in favor of a direct popular vote would likely make presidential elections more democratic in the literal sense, but it would also raise questions about federalism, campaign strategy, and the handling of close results. The Electoral College preserves federalism and geographic balance but can produce outcomes that seem to contradict majority will.

Intermediate options like the NPVIC or proportional allocation may offer ways to mitigate the College’s most controversial effects without uprooting the constitutional framework but also face significant hurdles for implementation.

Whether reform happens will depend not just on the merits of the arguments, but on the political incentives of the states and the parties. Until those incentives shift, the Electoral College is likely to remain—imperfect, contentious, and uniquely American.

The Constitutional Foundations

Who Controls Elections?

Donald Trump has repeatedly claimed that the president should have broad authority to change how elections are conducted—particularly when it comes to abolishing mail-in voting and voting machines. As recently as August 2025, Trump pledged to issue an executive order banning mail-in ballots and voting machines ahead of the 2026 midterm elections, insisting that states must comply with his directive because, in his words, “States act merely as ‘agents’ for the Federal Government when it comes to counting and tabulating votes.… They are required to follow what the Federal Government, represented by the President of the United States, instructs them to do, FOR THE GOOD OF OUR COUNTRY”.

But this isn’t the first time he has suggested that he could control elections. In March 2025, Trump issued a major executive order titled “Preserving and Protecting the Integrity of American Elections” that aimed to expand presidential control over the election process. The order attempted to direct the Election Assistance Commission (EAC) — an independent, bipartisan agency — to mandate that voters show a passport or other similar document proving citizenship when registering to vote using the federal voter registration form. The executive order has been the subject of extensive litigation, and several federal judges have issued injunctions against various portions of it.

Amid the COVID-19 pandemic during his first term, President Trump publicly suggested delaying the election. Constitutional scholars and members of Congress quickly pointed out he lacked such authority—the date of federal elections is set by statute, and only Congress could change it.

The U.S. Constitution provides a clear framework for who holds the authority to control elections, and it is not the president.

Article I, Section 4: Congressional and State Authority

The main constitutional authority over U.S. elections is found in Article I, Section 4, commonly called the “Elections Clause.” It states:

“The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof; but the Congress may at any time by Law make or alter such Regulations…”

This language charges state legislatures with defining the details of congressional elections, including logistics and procedures. Importantly, Congress retains the power to override state laws and impose federal rules—such as standardized Election Days or regulations for voter registration and districting.

What does this mean for the president? The Constitution is clear: the president has no direct authority to determine the conduct of congressional elections or to unilaterally change the way federal elections are held. Presidential influence over elections is limited to signing or vetoing congressional legislation, not acting alone.

Article II and the 12th Amendment: Presidential Elections

Presidential elections are regulated by Article II, which created the Electoral College, and by the 12th Amendment.

Article II, Section 1 provides:

“Each State shall appoint, in such Manner as the Legislature thereof may direct, a Number of Electors…”

States arrange how their presidential electors are selected, subject to changes imposed by congressional law. The federal government, through Congress (not the president!), determines the timing of choosing electors and casting electoral votes. The 12th Amendment sets procedures for how electors meet and vote for both president and vice president.

Again, neither Article II nor the 12th Amendment gives the president authority to independently set election rules. At most, the president can recommend reforms, sign laws crafted by Congress, and advocate for certain policies.

Historical Examples of Limits on Presidential Power Over Elections

Even during national crises, presidents have not been able to unilaterally change election rules:

  • 1864 Election (Lincoln): Despite the Civil War, Abraham Lincoln did not postpone or suspend the presidential election. Elections were carried out in the states, including special arrangements for soldiers to vote.
  • 1944 Election (Roosevelt): In the midst of World War II, Franklin Roosevelt stood for re-election. Again, no effort was made by the president to change election laws.

Presidential Powers: What Can the Executive Branch Do?

The president’s responsibilities in elections are more limited than you might expect and are essentially ministerial and ceremonial, not regulatory.

The executive power in Article II vests the president with broad national leadership, command of the military, and responsibility to “take Care that the Laws be faithfully executed”. This can include enforcing voting rights laws and overseeing federal agencies that support election integrity. However, the Constitution and decades of legal precedent restrict the president from directly controlling election rules.

  • The president cannot by executive order change state rules for voting methods (e.g., mail-in voting, voting machines).
  • The president cannot unilaterally suspend or postpone federal elections.
  • The president cannot direct states to alter their voter registration, polling locations, or other administrative details.
  • The president has no role in certifying state results. That function belongs to state officials, with Congress responsible for counting electoral votes.
  • The president can direct federal agencies like the Department of Justice to enforce federal election laws, protect voting rights, and intervene in cases of fraud or intimidation. The president does not, however, have the authority to direct federal agencies to act in a manner contrary to the law.

When presidents have sought to influence election administration more directly, courts and Congress have reaffirmed the constitutional boundaries. For example, efforts to change the date of an election or prohibit certain voting methods without congressional action have consistently failed in the courts.

Congressional Power: The Real Check on Election Rules

While state legislatures remain the primary managers of elections, Congress retains the final word. The Supreme Court has confirmed that congressional law “preempts” conflicting state rules in matters of federal elections. When Congress acts—through laws like the Voting Rights Act, Help America Vote Act, and the National Voter Registration Act—states must comply, and the president’s role is simply to sign or veto those laws.

Congress has used its power over the years to:

  • Set a uniform national Election Day.
  • Establish protections for disabled voters and overseas citizens.
  • Mandate requirements around voter registration and accessibility.
  • Regulate campaign finance and transparency.

Checks, Balances, and Modern Tensions

Recent political debates have seen calls for presidents to take stronger action on election oversight, especially regarding the use of mail-in ballots or voting machines. However, these calls run up against clear constitutional limits: the president cannot rewrite the rules of elections without Congress or state legislatures.

Any presidential attempt to do so by executive order would face swift legal challenges and almost certainly be invalidated. The intent of the Framers was to divide election power between the states and Congress, with the president largely excluded from direct rule-making authority. This balance—central to federalism—protects elections from potential abuses of executive power and ensures that reforms require broad democratic consensus. While presidents can champion reforms and enforce federal laws supporting fair elections, they are constitutionally forbidden from unilaterally changing election rules.

Conclusion

The framework isn’t perfect—it can create confusion when state and federal authorities clash. But the basic principle remains: states run elections. Congress can regulate them within constitutional bounds, and presidents enforce the resulting laws.

For citizens, lawmakers, and presidents alike, respect for these boundaries secures the foundation of American democracy. The right to vote—and the integrity of how that vote is counted—is protected not by any single leader, but by enduring constitutional principles and the shared power of states and Congress.

Citizens United: A Supreme Court Decision That Reshaped American Politics

In January 2010, the U.S. Supreme Court issued one of the most consequential and controversial rulings in modern political history: Citizens United v. Federal Election Commission. The 5–4 decision dramatically altered the legal landscape of campaign finance, opening the door for unlimited spending by corporations, unions, and certain nonprofit organizations, potentially giving them disproportionate influence in election campaigning.

Hailed by some as a victory for free expression and condemned by others as unleashing a torrent of special interest cash into politics, Citizens United has continued to define the shape of campaign financing for the past 15 years.

The Origins of the Case

The lawsuit began with a film. In 2008, Citizens United, a conservative nonprofit organization led by David Bossie, produced “Hillary: The Movie,” a 90-minute documentary highly critical of then-Senator Hillary Clinton, who was seeking the Democratic presidential nomination. Citizens United wanted to air the film on cable television through video-on-demand services within 30 days of the Democratic primary elections and planned to run promotional advertisements.

However, federal campaign finance law under the Bipartisan Campaign Reform Act (also known as McCain-Feingold) prohibited corporations from using general treasury funds for “electioneering communications” within specific timeframes before elections.

Fearing civil and criminal penalties, Citizens United sought a court declaration that their film and promotional materials were exempt from these restrictions.

The organization argued that because the documentary didn’t explicitly tell viewers how to vote, it shouldn’t be classified as campaign advocacy subject to corporate spending limits. A federal district court disagreed, ruling unanimously that the film could only be interpreted as telling viewers that Clinton was “unfit for office” and encouraging them to vote against her.

The Supreme Court’s Landmark Ruling

Citizens United appealed, and the Supreme Court ultimately agreed to hear the case, raising questions not just about this particular film, but about the broader constitutionality of limiting corporate and union election expenditures.

The justices went far beyond the narrow question raised in the original case. In a 5–4 decision authored by Justice Anthony Kennedy, joined by Chief Justice Roberts and Justices Thomas, Scalia, and Alito, the Court ruled that restrictions on independent political expenditures by corporations and unions violated the First Amendment’s free speech protections.

The majority opinion overturned two significant precedents: Austin v. Michigan Chamber of Commerce (1990) and portions of McConnell v. Federal Election Commission (2003). Justice Kennedy wrote that political speech is “indispensable to decision making in a democracy, and this is no less true because the speech comes from a corporation.”

Justice John Paul Stevens, writing in dissent, warned that the decision represented “a rejection of the common sense of the American people, who have recognized a need to prevent corporations from undermining self-government.”

The ruling did not lift limits on direct contributions to candidates; those caps remain in place. But it cleared the way for unlimited spending on independent political advocacy, so long as it is not coordinated with a candidate’s campaign.

The majority decision made two key assumptions: that independent political spending wouldn’t lead to corruption because it would be transparent, and that it would remain truly separate from candidate campaigns. Both assumptions have proven incorrect in practice.

Transforming the Political Landscape

The decision’s impact has been dramatic and far-reaching. Outside spending in federal elections skyrocketed from $730 million at the time of the ruling to $4.5 billion in 2024. The ruling enabled the creation of “super PACs”—political action committees that can raise and spend unlimited amounts as long as they maintain nominal (sometimes fictional) independence from campaigns.

Each election cycle since 2010 has seen a new record in campaign spending, much of it “outside money”—funds raised and spent by organizations not directly affiliated with candidates or parties. The influence of wealthy donors has only grown, with some estimates suggesting that the vast majority of outside election funding comes from a small handful of deep-pocketed interests.

Perhaps more concerning to democracy advocates is the rise of “dark money” — political spending where the funding source remains secret. Dark money expenditures increased from less than $5 million in 2006 to over $1 billion in the 2024 presidential election alone.

The 2024 election exemplified Citizens United’s influence. Billionaire-backed super PACs helped close substantial fundraising gaps, with groups like those funded by Elon Musk taking on core campaign functions including voter outreach operations, while supposedly remaining independent. This concentration of political influence among ultra-wealthy donors represents a fusion of private wealth and political power unseen since the late 19th century.

Current Political Implications

Today, Citizens United remains deeply unpopular with the American public. A Washington Post – ABC News poll found that 80% of Americans opposed the Citizens United ruling, including 85% of Democrats, 76% of Republicans, and 81% of independents.

The ruling has created what some campaign finance experts call a “corruption bomb” effect, where wealthy individuals can effectively buy political influence through seemingly independent expenditures. Recent legislative efforts, including the proposed Abolish Super PACs Act introduced in Congress, aim to restore some limits on political spending, though prospects for passage remain dim.

The Citizens United decision continues to shape American politics in 2025. Campaigns increasingly rely on outside groups to fund negative advertising, which can deepen political polarization. With no upper limit on independent expenditures, candidates may feel beholden to the interests of big-money backers who can tip the scales in tight races.

Supporters of the ruling argue that it promotes free speech, enabling more voices to be heard in the political arena. They contend that limiting corporate or union spending would amount to government censorship. Opponents counter that equating money with speech effectively drowns out the voices of ordinary voters who cannot match the spending power of corporations or billionaires.

Some reform advocates are pursuing constitutional amendments to overturn Citizens United, though such efforts face steep political and procedural hurdles. Others push for enhanced disclosure laws to ensure voters know who is funding political messages.

As the 2026 midterm elections approach, Citizens United’s legacy continues to define the relationship between money and political power. It raises fundamental questions about whether democratic governance can function effectively when political speech is increasingly dominated by the ultra-wealthy. Should the First Amendment protect unlimited political spending by corporations and unions, or does such spending distort democracy by giving disproportionate influence to the wealthiest? The Court’s ruling in Citizens United has transformed the way American elections are fought, and the consequences of that decision are still unfolding.

One Big Disgusting Bill

An Existential Threat to American Democracy

Now that the Senate has shamefully capitulated to Trump and passed the One Big Disgusting Bill, we’re starting to see another flurry of articles appropriately denouncing it. Unfortunately, these articles continue to focus on taxes, Medicaid, and SNAP. They largely ignore the most insidious aspect of this bill—its assault on judicial review.

I’ve discussed this bill in a recent post and won’t go into detail here. This bill represents a flagrant attempt to bypass judicial review and undermine the separation of powers. Given that Congress has willingly abrogated all responsibility and allowed Trump to rule single-handedly, the courts remain our only recourse. But since the bought-and-paid-for Supreme Court also seems to be giving away its authority, I’m not sure whether we have any hope left.

The only course is to remove all Republicans from office and impeach Trump.  It also may be necessary to impeach members of the Supreme Court, particularly those who blatantly accept bribes.

Congress has essentially given Donald Trump a fast pass to dictatorship.  I think this represents the American version of the Enabling Act of 1933 where the German Parliament gave Hitler absolute authority to rule without question.

As we approach the 250th anniversary of the Declaration of Independence, I fear for our Republic.

“The accumulation of all powers, legislative, executive, and judiciary, in the same hands… may justly be pronounced the very definition of tyranny.”
James Madison, Federalist No. 47, 1788

Declaring Independence: The Origin of America’s Founding Document

When Americans celebrate the Fourth of July, we imagine fireworks, flags, and a dramatic reading of the Declaration of Independence. We think we know the story: the Continental Congress selected Thomas Jefferson to write the declaration, he labored alone to produce this famous document, and Congress approved it unanimously and signed it on the 4th of July.

But the truth is far different and more complex. The story behind this iconic document—the how, who, and why of its creation—is just as explosive and illuminating as the day it represents. Far from a spontaneous outburst of rebellion, the Declaration was the product of political strategy, collaborative writing, and a shared sense of urgency among men who knew their words would change the course of history.

Setting the Stage: Why a Declaration?

By the spring of 1776, the American colonies were deep in conflict with Great Britain. Battles at Lexington and Concord had already been fought. George Washington was attempting to transform the Continental Army into a professional fighting force. Thomas Paine’s Common Sense had ignited widespread public support for full separation from the British Crown. The Continental Congress had been meeting in Philadelphia, debating how far they were willing to go. By June, the mood had shifted from reconciliation to revolution.

On June 7, 1776, Richard Henry Lee of Virginia introduced a resolution to the Continental Congress declaring “that these United Colonies are, and of right ought to be, free and independent States.” The motion was controversial—some delegates wanted more time to consult their colonies. But most in Congress knew that if independence was going to happen, it needed to be explained and justified to the world, so they created a committee to draft a formal declaration.

The Committee of Five

On June 11, 1776, the Continental Congress appointed a “Committee of Five” to write the declaration. The members were:

  • Thomas Jefferson of Virginia
  • John Adams of Massachusetts
  • Benjamin Franklin of Pennsylvania
  • Roger Sherman of Connecticut
  • Robert R. Livingston of New York

This was not a random selection. Each man represented a different region of the colonies and had earned the trust of fellow delegates. Jefferson was relatively young but already known for his eloquence. Adams was an outspoken advocate of independence. Franklin brought wisdom, wit, diplomatic experience, and international prestige. Sherman brought New England theological perspectives and legislative experience, while Livingston represented the more moderate New York delegation and brought keen legal insight.

Jefferson Takes the Pen

Although it was a group project on paper, the heavy lifting fell to Thomas Jefferson. The committee chose him to draft the initial version. Why Jefferson? According to John Adams, Jefferson was chosen for three reasons: he was from Virginia (the most influential colony), he was popular, and, Adams admitted, “you can write ten times better than I can.”

Jefferson wrote the draft in a rented room at 700 Market Street in Philadelphia. He leaned heavily on Enlightenment ideas, especially those of John Locke, emphasizing natural rights and the notion that government derives its power from the consent of the governed. He also borrowed phrasing from earlier colonial declarations, including his own A Summary View of the Rights of British America, and drew extensively on George Mason’s Virginia Declaration of Rights.

The Editing Process: Group Work Gets Messy

After Jefferson completed the initial draft (likely by June 28), he shared it with Adams and Franklin. Both men suggested revisions. Franklin, ever the editor, softened some of Jefferson’s sharpest attacks and corrected language for flow and diplomacy. His most famous contribution was changing Jefferson’s phrase “We hold these truths to be sacred and undeniable” to the more secular and philosophically precise “We hold these truths to be self-evident.”  

Adams offered suggestions on structure and tone. He also shaped the strategic presentation of grievances against King George III, understanding that the declaration needed to justify revolution in terms that would be acceptable to both colonial readers and potential European allies.

Sherman and Livingston played more limited but still meaningful roles. Sherman, with his theological background, helped ensure the document’s religious references would appeal to Puritan New England, while Livingston’s legal expertise helped refine the constitutional arguments against British rule.  Otherwise, their involvement in the actual content of the declaration was likely minimal.

The revised draft was presented to the full Continental Congress on June 28, 1776. What followed was a few days of intense debate and revision by the entire body.

Congress Takes the Red Pen

From July 1 to July 4, the Continental Congress debated the resolution for independence and edited the Declaration. Jefferson watched as more than two dozen changes were made to his prose. The Congress cut about a quarter of the original text, including a lengthy passage condemning King George III for perpetuating the transatlantic slave trade that would have sparked deep division among the delegates, especially those from Southern colonies.

Other modifications included strengthening the religious language, toning down some of the more inflammatory rhetoric, and making the grievances more specific and legally grounded. In all, Congress made 86 edits. Jefferson was reportedly frustrated by the changes, calling them “mutilations,” but he recognized that compromise was the cost of consensus.

Approval and Promulgation

Despite the extensive revisions, the core of Jefferson’s vision remained intact and on July 2, 1776, the Continental Congress voted in favor of Lee’s resolution for independence. That’s the actual date the colonies officially broke from Britain. John Adams even predicted in a letter to his wife that July 2 would be celebrated forever as America’s Independence Day. He was close—but the official adoption of the Declaration came two days later.

On July 4, 1776, Congress formally approved the final version of the Declaration of Independence. Contrary to popular belief, most of the signers did not sign it on that day. Only John Hancock, as president of Congress, and Charles Thomson, as secretary, signed then. The famous handwritten version, now in the National Archives, wasn’t signed until August 2. But the document approved on July 4 was immediately printed by John Dunlap, the official printer to Congress.

These first copies, known as Dunlap Broadsides, were distributed throughout the colonies and sent to military leaders, state assemblies, and even King George III. George Washington had it read aloud to the Continental Army. This rapid dissemination was crucial to the document’s impact, rallying public support for the revolutionary cause and explaining the colonies’ actions to the world.

Legacy and Impact

The Declaration wasn’t just a break-up letter to the British Crown—it was a manifesto for a new kind of political order. Its assertion that “all men are created equal” would echo through centuries of American history, invoked by abolitionists, suffragists, civil rights leaders, and more.

The creation of the Declaration of Independence demonstrates that even the most iconic documents in American history emerged from collaborative processes involving compromise, revision, and collective wisdom. While Jefferson deserves primary credit for the document’s eloquent expression of revolutionary ideals, the contributions of his committee colleagues and the broader Continental Congress were essential to creating a text that could unite thirteen diverse colonies in common cause.

This collaborative origin reflects the democratic principles the declaration itself proclaimed, showing that American independence was achieved not through the vision of a single individual, but through the collective efforts of representatives working together to articulate their shared commitment to liberty, equality, and self-governance. The process that created the Declaration of Independence thus embodied the very democratic ideals it proclaimed to the world.

Today, the Declaration of Independence is enshrined as one of the foundational texts of American democracy. But it’s worth remembering that it was created under immense pressure, forged by committee, and edited by compromise. Its authors knew they were taking a dangerous step. As Franklin quipped at the signing, “We must all hang together, or most assuredly we shall all hang separately.”

The Origin of Juneteenth: America’s Second Independence Day

The Juneteenth flag is red, white, and blue to reflect the American flag and includes a bursting star to symbolize freedom.

On June 19, 1865, an event that would forever change American history unfolded in Galveston, Texas. Union Major General Gordon Granger stood before a crowd and read General Order No. 3, announcing that “all slaves are free.” This proclamation marked the beginning of what we now celebrate as Juneteenth, America’s newest federal holiday and a day that celebrates the fulfillment of emancipation for all enslaved people in the United States.

Delayed Freedom

The story of Juneteenth begins with a troubling gap between law and reality. President Abraham Lincoln had issued the Emancipation Proclamation on January 1, 1863, declaring freedom for enslaved people in states “…in rebellion against the United States”. However, enforcement depended on the advance of Union troops, and in the Confederate state of Texas—remote and beyond Union control—the proclamation went unenforced for more than two years. Many slaveholders deliberately withheld information about emancipation, and the absence of Union forces meant that freedom remained out of reach for thousands.

Even after the Civil War effectively ended in April 1865 with Lee’s surrender at Appomattox, news of emancipation remained deliberately suppressed in Texas. Some enslavers continued to hold people in bondage through the spring planting season.  It wasn’t until federal troops arrived in Galveston in sufficient force to ensure compliance that the promise of emancipation became reality for the last enslaved Americans.

Birth of a Celebration

The newly freed Texans didn’t wait for official recognition to begin celebrating their liberation. They called it Juneteenth, a combination of “June” and “nineteenth,” and celebrations erupted spontaneously across Texas as communities gathered to commemorate their freedom with prayer, music, food, and fellowship. These early celebrations were deeply rooted in African American culture, featuring traditional foods and drinks, spirituals and folk songs, and the retelling of the freedom story to younger generations.

As African Americans moved from Texas to other parts of the country during the Great Migration, they carried Juneteenth traditions with them. Throughout the late 19th and early 20th centuries, Juneteenth celebrations grew, often featuring parades, music, food, and family gatherings. The holiday’s popularity waned during the mid-20th century but experienced a resurgence during the Civil Rights Movement, as activists sought to reconnect with their heritage and the ongoing struggle for equality.

From Regional Tradition to National Recognition

For over a century, Juneteenth was primarily a regional and cultural celebration rather than an official holiday. Texas became the first state to make Juneteenth a state holiday in 1980; other states followed gradually. The movement gained momentum in the 21st century as Americans increasingly recognized the need to acknowledge the full history of emancipation.

The nationwide racial justice protests of 2020 brought renewed attention to Juneteenth’s significance. On June 17, 2021, President Joe Biden signed legislation making Juneteenth a federal holiday, acknowledging it as both a celebration of freedom and a reminder of America’s ongoing journey toward equality.

A Day of Reflection and Celebration

Today, Juneteenth serves multiple purposes in American life. It’s a day of celebration, honoring the resilience and culture of African Americans. It’s also a day of education, reminding all Americans about the complexities of emancipation and the ongoing struggle for civil rights. Most importantly, it stands for hope—proof that progress, however delayed, is possible when people demand justice and equality. It honors the struggles and achievements of African Americans, reminding us of the enduring importance of freedom, perseverance, and hope in the face of adversity. As communities gather each year to celebrate Juneteenth, they continue the tradition of remembering the past while striving for a more inclusive and equitable future.

Juneteenth stands as a testament to the truth that freedom delayed need not be freedom denied.

Juneteenth is not an official state holiday in West Virginia. In prior years, former governor Jim Justice issued a proclamation declaring Juneteenth a paid holiday for state employees. The current governor has made no such proclamation. Those who are planning the Juneteenth celebration in West Virginia have scheduled a Juneteenth parade for June 20th, West Virginia Day, which is an official state holiday.

Anti-Vax or Disease Supporter

Between June 9 and 11, 2025, HHS Secretary Robert F. Kennedy Jr. dismissed all 17 members of the CDC’s Advisory Committee on Immunization Practices—a body that has guided U.S. vaccine policy for about 60 years. He followed this by appointing eight new members, the minimum under the charter, including several known vaccine deniers.

In light of this, I have decided to repost an article I wrote over a year ago.  (With new artwork.)

“There are two ways to be fooled. One is to believe what is not true; the other is to refuse to believe what is true.”

– Søren Kierkegaard

Saturday morning, I was reading in the newspaper about the resurgence of measles in West Virginia. I find it appalling that this disease should be returning, given that we have safe and effective vaccinations. What is next, polio, smallpox, or even plague? It is only through the unexpected veto by our [former] governor that the ill-advised bill passed by our legislature, which would have made all vaccinations optional with little more than a request by the parents, did not become law. [The current governor has issued an executive order rendering vaccinations virtually optional for school children.]

Some people may wonder why vaccinations are important. There are two principal reasons to ensure that a large portion of the population is vaccinated against communicable diseases. The first is that vaccination reduces an individual's vulnerability to disease: the person who is vaccinated is protected. But there is also a second, often less well-understood reason: herd immunity.

Communicable diseases need a large susceptible population in order to spread. When a significant portion of the population has been vaccinated, the disease lacks the pool of potential victims it needs to keep spreading. In this way, the vaccinated protect the non-vaccinated, but only if a large enough share of the population is vaccinated. The point of herd immunity is to protect those who cannot be vaccinated because of age, allergies, or other medical conditions; it can never shield a large proportion of the population who simply choose not to be vaccinated. For measles, for example, about 90-95% of the population must be vaccinated to provide herd immunity.
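For readers curious where the 90-95% figure comes from: it follows from a standard epidemiological rule of thumb, under the assumptions of a simple well-mixed (SIR-type) model. If each infected person would, in a fully susceptible population, infect R0 others on average, then spread stops once more than 1 − 1/R0 of the population is immune. A minimal sketch:

```python
# Herd immunity threshold from the basic reproduction number R0,
# using the standard well-mixed-population approximation HIT = 1 - 1/R0.
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    if r0 <= 1:
        return 0.0  # with R0 <= 1, an outbreak dies out on its own
    return 1.0 - 1.0 / r0

# Measles is among the most contagious diseases known; its R0 is
# commonly estimated at roughly 12-18, which is where the ~92-94%
# (often rounded to 90-95%) figure comes from.
for r0 in (12, 18):
    print(f"R0 = {r0}: {herd_immunity_threshold(r0):.0%} must be immune")
```

This is a back-of-the-envelope estimate, not a precise target; real populations are not perfectly mixed, and vaccines are not 100% effective, which pushes the practical coverage requirement toward the high end of that range.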

So why do people who otherwise can be vaccinated choose not to be?

There are, of course, those who have true religious objections to vaccination. Others object on grounds of personal autonomy, believing that their right to refuse vaccination outweighs any concern for the health of the frailest members of our community.

Many people mistrust the medical system. In the past, there were cases in which unethical studies were conducted on unsuspecting populations. Given the rigorous oversight of medical research today, this no longer happens. Information about research into vaccine safety and efficacy can be found on the websites of the Centers for Disease Control and Prevention and the World Health Organization, among others. (Website references are provided at the end of this post.)

What concerns me most are those who refuse to believe reputable medical authorities, government agencies, and mainstream news services. They prefer to get their information from anonymous websites or from conspiracy-theory sites that still give credence to the now-discredited 1998 study linking the MMR vaccine to autism. They ignore the fact that 10 of the study's 12 co-authors later retracted its interpretation. They ignore the fact that the principal author had undisclosed financial conflicts of interest, including payments from lawyers suing vaccine manufacturers and a patent application for a rival vaccine. They also ignore the fact that he lost his medical license over his misconduct in this study. Yet he is still cited in anti-vaccine literature as an expert source.

Equally disturbing is the fact that vaccine resistance has become a part of political identification. Certain reactionary political groups have, for some unfathomable reason, decided that refusing vaccination is a badge of their political allegiance.  They seem to care more about maintaining their political purity than they care about science, public health, or even the welfare of their family and friends.  Politicizing public health is dangerous for all of us.  I’m not sure how we overcome this. It is easy to find the truth and verify it through fact-based studies, yet people refuse to do it. [See my post Choosing Not To Know.]

I encourage everyone to work hard to ensure that our political leaders do not remove vaccination mandates for school children. Those of us my age already have immunity through vaccination or prior exposure to these diseases. It is our grandchildren, and their children, and their children's children who will suffer through the return of these deadly diseases.

Rather than “vaccine deniers,” they should be referred to as “disease supporters.”

SOURCES:

  World Health Organization: https://www.who.int/health-topics/vaccines-and-immunization#tab=tab_1

  CDC: https://www.cdc.gov/vaccines/index.html and https://www.cdc.gov/vaccines/hcp/vis/index.html

  WV DHHR: https://oeps.wv.gov/immunizations/Pages/default.aspx

  Immunize.org: https://www.vaccineinformation.org/

