Grumpy opinions about everything.


Grumpy opinions about American history

Declaring Independence: The Origin of America’s Founding Document

When Americans celebrate the Fourth of July, we imagine fireworks, flags, and a dramatic reading of the Declaration of Independence. We think we know the story: the Continental Congress selected Thomas Jefferson to write the declaration, he labored alone to produce the famous document, Congress approved it unanimously, and it was signed on the Fourth of July.

 But the truth is far different and more complex. The story behind this iconic document—the how, who, and why of its creation—is just as explosive and illuminating as the day it represents. Far from a spontaneous outburst of rebellion, the Declaration was the product of political strategy, collaborative writing, and a shared sense of urgency among men who knew their words would change the course of history.

Setting the Stage: Why a Declaration?

By the spring of 1776, the American colonies were deep in conflict with Great Britain. Battles at Lexington and Concord had already been fought. George Washington was attempting to transform the Continental Army into a professional fighting force. Thomas Paine’s Common Sense had ignited widespread public support for full separation from the British Crown. The Continental Congress had been meeting in Philadelphia, debating how far they were willing to go. By June, the mood had shifted from reconciliation to revolution.

On June 7, 1776, Richard Henry Lee of Virginia introduced a resolution to the Continental Congress declaring “that these United Colonies are, and of right ought to be, free and independent States.” The motion was controversial—some delegates wanted more time to consult their colonies. But most in Congress knew that if independence was going to happen, it needed to be explained and justified to the world, so they created a committee to draft a formal declaration.

The Committee of Five

On June 11, 1776, the Continental Congress appointed a “Committee of Five” to write the declaration. The members were:

  • Thomas Jefferson of Virginia
  • John Adams of Massachusetts
  • Benjamin Franklin of Pennsylvania
  • Roger Sherman of Connecticut
  • Robert R. Livingston of New York

This was not a random selection. Each man represented a different region of the colonies and had earned the trust of fellow delegates. Jefferson was relatively young but already known for his eloquence. Adams was an outspoken advocate of independence. Franklin brought wisdom, wit, diplomatic experience, and international prestige. Sherman brought New England theological perspectives and legislative experience, while Livingston represented the more moderate New York delegation and brought keen legal insight.

Jefferson Takes the Pen

Although it was a group project on paper, the heavy lifting fell to Thomas Jefferson. The committee chose him to draft the initial version. Why Jefferson? According to John Adams, Jefferson was chosen for three reasons: he was from Virginia (the most influential colony), he was popular, and, Adams admitted, “you can write ten times better than I can.”

Jefferson wrote the draft in a rented room at 700 Market Street in Philadelphia. He leaned heavily on Enlightenment ideas, especially those of John Locke, emphasizing natural rights and the notion that government derives its power from the consent of the governed. He also drew phrasing from earlier colonial documents, including his own A Summary View of the Rights of British America, and borrowed extensively from George Mason’s Virginia Declaration of Rights.

The Editing Process: Group Work Gets Messy

After Jefferson completed the initial draft (likely by June 28), he shared it with Adams and Franklin. Both men suggested revisions. Franklin, ever the editor, softened some of Jefferson’s sharpest attacks and corrected language for flow and diplomacy. His most famous contribution was changing Jefferson’s phrase “We hold these truths to be sacred and undeniable” to the more secular and philosophically precise “We hold these truths to be self-evident.”  

Adams offered structural suggestions and helped shape the tone. He also contributed to the strategic presentation of grievances against King George III, understanding that the declaration needed to justify revolution in terms that would be acceptable to both colonial readers and potential European allies.

Sherman and Livingston played more limited but still meaningful roles. Sherman, with his theological background, helped ensure the document’s religious references would appeal to Puritan New England, while Livingston’s legal expertise helped refine the constitutional arguments against British rule.  Otherwise, their involvement in the actual content of the declaration was likely minimal.

The revised draft was presented to the full Continental Congress on June 28, 1776. What followed was a few days of intense debate and revision by the entire body.

Congress Takes the Red Pen

From July 1 to July 4, the Continental Congress debated the resolution for independence and edited the Declaration. Jefferson watched as more than two dozen changes were made to his prose. The Congress cut about a quarter of the original text, including a lengthy passage condemning King George III for perpetuating the transatlantic slave trade that would have sparked deep division among the delegates, especially those from Southern colonies.

Other modifications included strengthening the religious language, toning down some of the more inflammatory rhetoric, and making the grievances more specific and legally grounded. In all, Congress made 86 edits. Jefferson was reportedly frustrated by the changes, calling them “mutilations,” but he recognized that compromise was the cost of consensus.

Approval and Promulgation

Despite the extensive revisions, the core of Jefferson’s vision remained intact and on July 2, 1776, the Continental Congress voted in favor of Lee’s resolution for independence. That’s the actual date the colonies officially broke from Britain. John Adams even predicted in a letter to his wife that July 2 would be celebrated forever as America’s Independence Day. He was close—but the official adoption of the Declaration came two days later.

On July 4, 1776, Congress formally approved the final version of the Declaration of Independence. Contrary to popular belief, most of the signers did not sign it on that day. Only John Hancock, as president of Congress, and Charles Thomson, as secretary, signed then.   The famous handwritten version, now in the National Archives, wasn’t signed until August 2. But the document approved on July 4 was immediately printed by John Dunlap, the official printer to Congress.

These first copies, known as Dunlap Broadsides, were distributed throughout the colonies and sent to military leaders, state assemblies, and even King George III. George Washington had it read aloud to the Continental Army.  This rapid dissemination was crucial to its impact, as it was needed to rally public support for the revolutionary cause and explain the colonies’ actions to the world.

Legacy and Impact

The Declaration wasn’t just a break-up letter to the British Crown—it was a manifesto for a new kind of political order. Its assertion that “all men are created equal” would echo through centuries of American history, invoked by abolitionists, suffragists, civil rights leaders, and more.

The creation of the Declaration of Independence demonstrates that even the most iconic documents in American history emerged from collaborative processes involving compromise, revision, and collective wisdom. While Jefferson deserves primary credit for the document’s eloquent expression of revolutionary ideals, the contributions of his committee colleagues and the broader Continental Congress were essential to creating a text that could unite thirteen diverse colonies in common cause.

This collaborative origin reflects the democratic principles the declaration itself proclaimed, showing that American independence was achieved not through the vision of a single individual, but through the collective efforts of representatives working together to articulate their shared commitment to liberty, equality, and self-governance. The process that created the Declaration of Independence thus embodied the very democratic ideals it proclaimed to the world.

Today, the Declaration of Independence is enshrined as one of the foundational texts of American democracy. But it’s worth remembering that it was created under immense pressure, forged by committee, and edited by compromise. Its authors knew they were taking a dangerous step. As Franklin quipped at the signing, “We must all hang together, or most assuredly we shall all hang separately.”

The Origin of Juneteenth: America’s Second Independence Day

The Juneteenth flag is red, white, and blue to reflect the American flag and includes a bursting star to symbolize freedom.

On June 19, 1865, an event that would forever change American history unfolded in Galveston, Texas. Union Major General Gordon Granger stood before a crowd and read General Order No. 3, announcing that “all slaves are free.” This proclamation marked the beginning of what we now celebrate as Juneteenth, America’s newest federal holiday and a day that celebrates the fulfillment of emancipation for all enslaved people in the United States.

Delayed Freedom

The story of Juneteenth begins with a troubling gap between law and reality. President Abraham Lincoln had issued the Emancipation Proclamation on January 1, 1863, declaring freedom for enslaved people in states “…in rebellion against the United States.” However, enforcement depended on the advance of Union troops, and in the Confederate state of Texas—remote and beyond Union control—the proclamation went unenforced for more than two years. Many slaveholders deliberately withheld information about emancipation, and the absence of Union forces meant that freedom remained out of reach for thousands.

Even after the Civil War effectively ended in April 1865 with Lee’s surrender at Appomattox, news of emancipation remained deliberately suppressed in Texas. Some enslavers continued to hold people in bondage through the spring planting season.  It wasn’t until federal troops arrived in Galveston in sufficient force to ensure compliance that the promise of emancipation became reality for the last enslaved Americans.

Birth of a Celebration

The newly freed Texans didn’t wait for official recognition to begin celebrating their liberation. They called it Juneteenth, a combination of “June” and “nineteenth,” and celebrations erupted spontaneously across Texas as communities gathered to commemorate their freedom with prayer, music, food, and fellowship. These early celebrations were deeply rooted in African American culture, featuring traditional foods and drinks, spirituals and folk songs, and the retelling of the freedom story to younger generations.

As African Americans moved from Texas to other parts of the country during the Great Migration, they carried Juneteenth traditions with them. Throughout the late 19th and early 20th centuries, Juneteenth celebrations grew, often featuring parades, music, food, and family gatherings. The holiday’s popularity waned during the mid-20th century but experienced a resurgence during the Civil Rights Movement, as activists sought to reconnect with their heritage and the ongoing struggle for equality.

From Regional Tradition to National Recognition

For over a century, Juneteenth was primarily a regional and cultural celebration rather than an official holiday. Texas became the first state to make Juneteenth a state holiday in 1980; other states followed gradually. The movement gained momentum in the 21st century as Americans increasingly recognized the need to acknowledge the full history of emancipation.

The nationwide racial justice protests of 2020 brought renewed attention to Juneteenth’s significance. On June 17, 2021, President Joe Biden signed legislation making Juneteenth a federal holiday, acknowledging it as both a celebration of freedom and a reminder of America’s ongoing journey toward equality.

A Day of Reflection and Celebration

Today, Juneteenth serves multiple purposes in American life. It’s a day of celebration, honoring the resilience and culture of African Americans. It’s also a day of education, reminding all Americans about the complexities of emancipation and the ongoing struggle for civil rights. Most importantly, it stands for hope—proof that progress, however delayed, is possible when people demand justice and equality. As communities gather each year to celebrate Juneteenth, they continue the tradition of remembering the past while striving for a more inclusive and equitable future.

Juneteenth stands as a testament to the truth that freedom delayed need not be freedom denied.

Juneteenth is not an official state holiday in West Virginia. In prior years, former governor Jim Justice issued a proclamation declaring Juneteenth a paid holiday for state employees. The current governor has made no such proclamation. Those who are planning the Juneteenth celebration in West Virginia have scheduled a Juneteenth parade for June 20th, West Virginia Day, which is an official state holiday.

From Breaker Boys to Burger Flippers: The Resurgence of Child Labor in America

What West Virginia’s new child labor law tells us about a growing trend and a forgotten history.

📜 Introduction
In April 2025, West Virginia passed a law eliminating work permit requirements for 14- and 15-year-olds and opening hazardous occupations to older teens. It’s a policy shift that echoes a much darker chapter of American history—one most of us thought was long behind us.

As I read the news, I couldn’t help but recall Lewis Hine’s haunting photos of the “Breaker Boys”—children as young as eight sorting coal in dangerous conditions. Their faces were the face of American industry at its most exploitative. Their plight helped spark the labor reforms we now take for granted.

But are those reforms at risk of unraveling?


🕰 A Brief History of Child Labor in America
At the turn of the 20th century, over two million American children worked long hours in factories, coal mines, and fields. Some were as young as five. The wages were low, the conditions dangerous, and the toll—educational, emotional, and physical—immeasurable.

Most of these children came from poor or immigrant families. Factory and mine owners favored child labor because it was cheap, compliant, and expendable.


⚖️ Early Reforms and Legal Battles
The reform movement gained traction in the early 1900s thanks to activists, labor unions, and journalists. The National Child Labor Committee, founded in 1904, worked with photographers like Lewis Hine to expose the brutality of child labor to the American public.

Attempts to legislate federally met fierce resistance. The Keating-Owen Act (1916) was struck down by the Supreme Court in Hammer v. Dagenhart (1918), and a second effort was defeated in 1923. It wasn’t until the Fair Labor Standards Act (FLSA) of 1938 that the federal government established real guardrails:

  • Prohibited employment under 16 in manufacturing/mining
  • Banned hazardous work under 18
  • Limited working hours for minors
  • Authorized federal inspections

The FLSA marked the beginning of consistent national protections for working children.


🎓 Child Labor and Education: A Damaging Tradeoff
There’s a well-documented tradeoff between child labor and education:

  • Working children attend school less, perform worse, and are more likely to drop out.
  • Child labor perpetuates intergenerational poverty.
  • Education access is key to breaking this cycle—but only if children aren’t too exhausted or endangered to learn.

Even today, agricultural labor laws allow children as young as 12 to work long hours, especially among migrant families. These children have some of the country’s highest school dropout rates.


📉 Modern Rollbacks: A Disturbing Trend
Since 2021, over a dozen U.S. states have proposed or passed laws rolling back child labor protections, often citing labor shortages or “career readiness”:

  • Arkansas (2023): Removed permit and parental consent requirements for 14- and 15-year-olds.
  • Iowa: Now allows minors in meatpacking and industrial work, with waivers.
  • Kentucky: Loosened hour limits during the school year.
  • Other states: Missouri, New Jersey, New Hampshire, and others are following suit.

Critics warn that these laws open the door to exploitation, especially in lower-income communities.


🧠 Why It Matters
The repeal of child labor protections isn’t just a policy dispute—it’s a moral referendum. If child labor laws are weakened, the most vulnerable children will bear the cost, just as they did a century ago.

The lesson from history is simple: when economic hardship or political expediency trumps child welfare, it’s children who are put at risk.


📣 Final Thoughts
Public memory is short. But the bodies of exhausted child laborers buried in unknown graves and the broken educational paths of working teens are silent witnesses to the past—and a warning for the future.

If we claim to value children’s futures, our policies must reflect that—not just in schools, but in the workplace.


🔗 Sources and Suggested Further Reading

  • U.S. Department of Labor: Child Labor Provisions
  • National Child Labor Committee Archives
  • Keating-Owen Act Summary – OurDocuments.gov

“America at 250: A Revolution Remembered… or Forgotten?”

I’m old enough to remember the 200th anniversary of the American Revolution. Bicentennial symbols were everywhere. Liberty Bells, eagles, and the ubiquitous Bicentennial logo of the red, white and blue stylized five-point star. They could be found on hats, T-shirts, socks, soft drink cups, beer cans, and even a special “Spirit of ‘76” edition of the Ford Mustang II. Commemorative events and celebrations were being planned everywhere and people had “bicentennial fever”.

But the 250th anniversary is not attracting that same kind of attention or interest. I wonder why that is. Perhaps it’s that the name for a 250th anniversary, Semiquincentennial, doesn’t seem to roll off the tongue the way Bicentennial does. But I suspect it’s far more than just a tongue-twisting name.

The Bicentennial came after a decade of national trauma.  The Vietnam War, Watergate, and the civil rights struggles had all roiled the country.  By 1976, most Americans wanted to feel good about the country again. It became a giant, colorful celebration of “American resilience.”

While the 250th anniversary of the American Revolution is being marked by numerous events, commemorations, and official proclamations, most are local, and it has not yet captured widespread public attention or generated the scale of national excitement seen during previous milestone anniversaries.

The anniversary arrives at a time of deep political polarization, which has complicated celebration plans. There is an ongoing debate within the group tasked with planning the celebration, the U.S. Semiquincentennial Commission, about how to present American history. Some members advocate for a traditional, celebratory approach focusing on the Founding Fathers and patriotic themes. Others push for a more inclusive narrative that acknowledges the complexities of American history, including the experiences of women, enslaved people, Indigenous communities, and other marginalized groups.

Beyond the commission itself, some historians note that the “history wars”—ongoing disputes throughout society over how U.S. history should be taught and remembered—have made it harder to generate broad, enthusiastic buy-in for the anniversary among the general public. 

Commemorations in places like Lexington and Concord have seen anti-Trump protesters carrying signs such as “Resist Like It’s 1775” and “No Kings,” explicitly drawing parallels between opposition to King George III and contemporary resistance to what they perceive as autocratic tendencies in current leadership. At the reenactment of Patrick Henry’s “Give Me Liberty or Give Me Death” speech, Virginia Governor Glenn Youngkin was met with boos and protest chants, highlighting how the Revolution’s legacy is being invoked in current political struggles.

While some organizers and historians hope the anniversary can serve as a unifying moment—emphasizing that “patriotism should not be a partisan issue”—the reality is that commemorations have often become forums for expressing contemporary political grievances and anxieties. The presence of both celebratory and dissenting voices at these events reflects the enduring debate over what it means to be American and who gets to define that identity.  The complexity and messiness of American history, combined with current societal tensions, may dampen the celebratory mood and make it harder for people to connect emotionally with the anniversary.

Even the 250th logo has become a source of dispute, although it is one of the few disagreements that is nonpartisan, centering on the logo’s stylistic and artistic merits. Proponents appreciate its modern and inclusive design, emphasizing that the flowing ribbon represents “unity, cooperation, and harmony” and reflects the nation’s aspirations as it commemorates this milestone. Detractors are concerned about the legibility of the “250” and the lack of traditional American symbols, such as stars, which could have reinforced its patriotic theme.

Surveys by history-related organizations suggest that most Americans are not yet thinking about the 250th anniversary. The run-up to 2026 may bring increased attention, but as of now, the anniversary has not broken through as a major topic of national conversation. If the anniversary continues to be viewed as a contentious partisan undertaking, it may never gain widespread popularity, and the general public may choose to stay away.

A friend who is a member of the West Virginia 250th committee told me that they had an initial meeting at which nothing was accomplished, and they have had no meeting since. It seems to me that it is up to us, the citizens, to ensure that the 250th anniversary of the American Revolution is appropriately remembered. We don’t have to live in an area where a Revolutionary War event occurred to recognize those events. Here in West Virginia, in October 2024 we commemorated the 250th anniversary of the Battle of Point Pleasant, which many consider a precursor to the American Revolution. This event was not organized by any state or national group; it was the result of efforts by the City of Point Pleasant and the West Virginia Sons of the American Revolution.

We do not need to depend on the government; we the people can hold local commemorations of revolutionary events that occurred in other areas. We can hold commemorations of the Battle of Bunker Hill, the signing of the Declaration of Independence, the Battle of Saratoga and many other events. It will take the initiative of local people to organize these events.

It will be our great shame if we allow the commemoration of an event so significant in both American and world history to be turned into something that divides us rather than unites us and strengthens our common bond.

Blockchain

The Origins, Evolution, and Future of a Decentralized Revolution

Introduction

While trying to understand cryptocurrency, I came across blockchain. I found that I understood even less about blockchain than I did about cryptocurrency. The following article is my attempt to explain blockchain to myself.  If you have not read my earlier post The Rise of Cryptocurrency, doing so may be helpful for understanding this post.

Blockchain technology was once a niche topic among cryptographers and libertarians who hoped to be shielded from government scrutiny. It has since evolved into a global force reshaping how we think about data, transactions, and trust. Born in the wake of the 2008 financial crisis, blockchain offers a radically transparent alternative to traditional financial institutions.

Today, it underpins not only cryptocurrencies but also supply chains, voting systems, healthcare, and intellectual property. This article explores the history, mechanics, current applications, and future potential of blockchain technology.

1. Origins of Blockchain

  • Who Created It?  The modern concept of blockchain was introduced in 2008 by a pseudonymous developer (or group) known as Satoshi Nakamoto, in a white paper titled Bitcoin: A Peer-to-Peer Electronic Cash System. While Nakamoto’s identity remains unknown, the paper built on earlier work by cryptographers such as David Chaum (digital cash, 1980s) and Nick Szabo (“bit gold”).
  • Why Was It Developed?  Blockchain emerged in response to a global crisis of trust. The 2008 financial meltdown exposed the dangers of opaque, centralized financial systems. Nakamoto’s vision was a decentralized system that minimized reliance on trust—an alternative in which users wouldn’t need banks or governments to verify transactions.
  • First Use Case: The original application of blockchain was Bitcoin—the first decentralized digital currency. Many people believe that Bitcoin evolved from blockchain, but in fact, blockchain was created to make Bitcoin feasible.  Bitcoin’s blockchain acts as a transparent, time-stamped public ledger to prevent double-spending and centralized tampering.
  • Key Innovation—The Chain of Blocks: At its core, blockchain is a distributed ledger in which transactions are grouped into blocks. Each block is cryptographically linked to the one before it, forming a secure, tamper-resistant chain replicated across many computers on the network.
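To make the “chain of blocks” idea concrete, here is a minimal sketch in Python. It is purely illustrative—no real blockchain works exactly like this, and the block fields and transaction strings here are invented for the example:

```python
import hashlib
import json

def block_hash(index, prev_hash, transactions):
    """Hash a block's contents into a fixed-size fingerprint (SHA-256)."""
    payload = json.dumps(
        {"index": index, "prev": prev_hash, "txs": transactions},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

# Build a tiny three-block chain: each block stores the hash of its predecessor.
chain = []
prev = "0" * 64  # placeholder "previous hash" for the first (genesis) block
for i, txs in enumerate([["Alice pays Bob 5"],
                         ["Bob pays Carol 2"],
                         ["Carol pays Dan 1"]]):
    h = block_hash(i, prev, txs)
    chain.append({"index": i, "prev": prev, "txs": txs, "hash": h})
    prev = h  # the next block will link back to this one

# Verify the links: each block's "prev" equals the hash of the block before it.
for earlier, later in zip(chain, chain[1:]):
    assert later["prev"] == earlier["hash"]
```

Because each block embeds its predecessor’s hash, changing any earlier block would change its hash and break every link after it—which is what makes the chain tamper-resistant.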

2. How Blockchain Works

Blockchain operates on several core principles:

  • Decentralization: Data is stored across a network of nodes (think computers for simplicity) rather than a single server.
  • Immutability: Once added, a block cannot be altered without changing all subsequent blocks.
  • Consensus Mechanisms: Agreement is achieved through protocols like Proof of Work or Proof of Stake (explained below).
  • Transparency with Pseudonymity: Transactions are visible to all but are tied to encrypted addresses—not personal identities.

3. Why Blockchain Is Secure

  • Cryptographic Hashing: Each block contains a cryptographic hash of the previous block’s data.  A cryptographic hash is a mathematical function that takes an input (or “message”) and returns a fixed-size string of characters that appears random.  A full discussion is well beyond the scope of this article (and my understanding as well).  Even a tiny change in the data drastically changes the hash, so any tampering becomes immediately obvious, breaking the chain’s integrity.
  • Decentralization: Every node on the network has a full copy of the blockchain.  If a single node is altered, the change is rejected by the others.  This makes coordinated attacks extremely difficult, especially on large networks.
  • Consensus Mechanisms: Blockchain uses mathematical consensus to validate new blocks:
  • Proof of Work (PoW): Used by Bitcoin; involves solving complex mathematical puzzles. A 51% attack (controlling most of the computing power) is prohibitively expensive and would cost far more than could be realized through manipulation of the blockchain.
  • Proof of Stake (PoS): Used by Ethereum 2.0 and others; validators stake tokens, risking loss if they act dishonestly.  This might be thought of as posting a bond.
  • Immutability: Once a block is added and validated, it’s nearly impossible to alter.  Changing one block would require rewriting all subsequent ones and redoing the work—an impractical task on any meaningful scale.
  • Public and Private Key Cryptography: Each user has a private key (used to sign transactions) and a public key (used to verify them).  This ensures only the rightful owner can authorize a transaction.
  • Auditability: Most public blockchains are fully transparent.  Anyone can audit the ledger, view transaction history, and verify balances—without relying on centralized authorities.
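Two of the points above—hash sensitivity and hash-based linking—can be demonstrated in a few lines of Python. This is only an illustration: the “blocks” here are plain strings, not real blockchain data structures:

```python
import hashlib

def sha256_hex(data: str) -> str:
    """Return the SHA-256 hash of a string as 64 hex characters."""
    return hashlib.sha256(data.encode()).hexdigest()

# The "avalanche effect": a one-character change yields a completely different hash.
h1 = sha256_hex("Alice pays Bob 5 coins")
h2 = sha256_hex("Alice pays Bob 6 coins")
assert h1 != h2 and len(h1) == len(h2) == 64

# Link two "blocks" by hash: block 2 stores block 1's hash.
block1_data = "Alice pays Bob 5 coins"
block2_link = sha256_hex(block1_data)

# Any tampering with block 1 no longer matches the stored link, so it is detected.
tampered = "Alice pays Bob 500 coins"
assert sha256_hex(tampered) != block2_link
```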

4. Current Uses of Blockchain

Blockchain’s applications now stretch across numerous industries:

  • Finance Beyond Bitcoin:
  • Ethereum introduced smart contracts and decentralized apps (dApps).  Think of a smart contract as a digital vending machine: you put in a specific input (e.g., cryptocurrency), and the contract automatically performs a pre-programmed action (e.g., transfer of ownership, release of funds). No lawyer, banker, or notary is needed to oversee or verify the transaction.  dApps are software programs that run on a blockchain or peer-to-peer network rather than being hosted on centralized servers.
  • Decentralized Finance (DeFi) enables peer-to-peer lending, borrowing, and trading without traditional intermediaries.
  • Stablecoins (e.g., USDC, Tether) offer price stability by pegging cryptocurrencies to government-backed currencies.
  • Cross-border payments are cheaper and faster using blockchain.
  • Supply Chain Transparency: Companies like Walmart, IBM, and Maersk use blockchain for traceability.  For example, lettuce traced from farm to shelf helps speed up food recalls.
  • Healthcare uses blockchain to secure medical records and track pharmaceuticals.  Estonia integrates blockchain into its national health system.
  • Voting and Governance: Trials like West Virginia’s 2018 blockchain voting pilot aim to improve election transparency, though concerns remain about digital vote integrity and security.
  • Digital Identity & Intellectual Property: Blockchain allows artists to use Non-Fungible Tokens (NFTs) to register digital ownership of art. An NFT is a unique digital asset that represents ownership or proof of authenticity of a specific item, such as artwork, music, video clips, virtual real estate, or even tweets, and it’s stored on a blockchain—a decentralized digital ledger.  It is used for assets that have no physical existence.  Think of it as owning the rights to a computer program.
  • Self-sovereign identity systems—user-controlled digital credentials—are being developed by companies like Microsoft.
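The “digital vending machine” analogy for smart contracts can be sketched in ordinary Python. Real smart contracts are programs deployed on a blockchain (commonly written in Solidity for Ethereum); the class and names below are invented purely to illustrate the idea of rules that execute automatically, with no intermediary:

```python
class EscrowContract:
    """A toy "vending machine" contract: deterministic, pre-programmed rules
    that execute automatically. This is an analogy only, not a real smart
    contract; all names here are invented for illustration."""

    def __init__(self, price, seller, item):
        self.price = price
        self.seller = seller
        self.item = item
        self.sold_to = None

    def purchase(self, buyer, payment):
        """If the pre-programmed conditions are met, ownership transfers
        automatically—no lawyer, banker, or notary involved."""
        if self.sold_to is not None:
            raise ValueError("item already sold")
        if payment < self.price:
            raise ValueError("insufficient payment")
        self.sold_to = buyer  # transfer of ownership
        return {"pay_to": self.seller,
                "amount": self.price,
                "change": payment - self.price}

contract = EscrowContract(price=10, seller="artist", item="digital artwork #1")
receipt = contract.purchase(buyer="collector", payment=12)
# receipt -> {'pay_to': 'artist', 'amount': 10, 'change': 2}
```

The point of the analogy: once the rules are written, the outcome follows mechanically from the input. On a real blockchain the contract’s code and state live on the shared ledger, so no single party can quietly change the rules after the fact.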

5. Criticisms and Challenges

Despite its promise, blockchain faces significant obstacles:

  • Scalability: Networks like Bitcoin can become slow and costly at high volumes.
  • Energy Consumption: PoW systems have been criticized for their high carbon footprint.  They make high demands on electrical grids and on water systems.
  • Regulatory Uncertainty: Governments differ widely on how to regulate blockchain and crypto.  International agreements will be necessary for widespread implementation, but they have not yet been established; in most cases, negotiations have not even begun.
  • Fraud & Hype: Scams and speculative investments have eroded public trust in some blockchain projects.  Because of their decentralized structure, there’s no central authority to guarantee their security.  Given that the philosophy behind blockchain is to avoid government oversight, this may always be a problem.

6. The Future of Blockchain

  • Greener Alternatives: Proof of Stake (e.g., Ethereum 2.0) significantly reduces energy use and improves scalability.
  • Central Bank Digital Currencies (CBDCs):  Countries like the U.S., China, and Sweden are considering, or in some cases piloting, digital currencies backed by governments and built on blockchain-like infrastructure.
  • Tokenization of Real Assets: Real estate, art, and even wine can be digitally fractionalized, allowing more people to invest in historically exclusive markets.
  • Interoperability: Future systems will allow cross-blockchain communication, improving flexibility and usability across networks.
  • Decentralized Autonomous Organizations (DAOs) can operate through smart contracts and community voting—no CEOs or managers required. Potential applications include governance, philanthropy, and startup funding.

Conclusion

So, do I now fully understand blockchain?  Not hardly.  But it is important to be aware of it and know that it will have a significant impact on our lives.

Blockchain is more than an esoteric new technology—it’s a reimagining of how trust, authority, and ownership work in a digital society. From its roots in cyber-activism to its integration into governments and corporations, blockchain is reshaping the way we do business.

Its future will depend on whether we manage its risks and harness its power responsibly. Done right, blockchain could form a core part of tomorrow’s digital infrastructure. Done poorly, it could become another overhyped fad that imposes additional burdens on society.


🔑 Key Takeaways

  • Blockchain is a decentralized ledger that enhances transparency and trust.
  • It started with Bitcoin but now spans many industries.
  • Key strengths include immutability, transparency, and security.
  • Major challenges include scalability, energy use, and regulatory ambiguity.
  • The future could bring CBDCs, DAOs, interoperability, and asset tokenization.
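
The immutability in the takeaways above comes from hash-chaining: each block stores the hash of the block before it, so changing any past entry invalidates every block that follows. Here is a minimal, illustrative sketch in Python; the helper names (`add_block`, `is_valid`) are my own and this is a toy ledger, not a real blockchain with consensus or mining:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically (sorted keys).
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    # Each new block records the hash of its predecessor.
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev}
    block["hash"] = block_hash({"data": data, "prev_hash": prev})
    chain.append(block)

def is_valid(chain):
    # Recompute every hash and check each link to the predecessor.
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i > 0 else "0" * 64
        if block["prev_hash"] != expected_prev:
            return False
        if block["hash"] != block_hash({"data": block["data"],
                                        "prev_hash": block["prev_hash"]}):
            return False
    return True

ledger = []
add_block(ledger, "Alice pays Bob 5")
add_block(ledger, "Bob pays Carol 2")
print(is_valid(ledger))                   # True
ledger[0]["data"] = "Alice pays Bob 500"  # tamper with history
print(is_valid(ledger))                   # False: the chain detects the change
```

In a real network, thousands of nodes each hold a copy of the chain and a consensus mechanism (PoW or PoS) decides which version is authoritative; that is what turns this simple tamper-evidence into the "trust without a central authority" the post describes.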

What Is Fascism Anyway?

Fascist! The very word conjures up images of totalitarianism, militarism, suppression of dissent and brutality. Unfortunately, it’s become a ubiquitous part of our political discourse. Each side, at one time or another, has accused the other of being fascist. But what do they really mean by fascist? Do they understand the definition and the reality of fascism? Or do they simply mean: “I disagree with you, and I really want to make you sound evil.”

I decided I needed to know more about fascism, so I’ve done some research, and I’d like to share the results with you. As I frequently do, I’ll start with the dictionary definition. According to Merriam-Webster, fascism is “a political philosophy, movement, or regime that exalts nation and often race above the individual and that stands for a centralized autocratic government headed by a dictatorial leader, severe economic and social regimentation, and forcible suppression of opposition.”

As with many dictionary definitions, it gives us the 50,000-foot view without any real detail. What I’d like to do is cover the origins of fascism, its basic principles and how it rose to prominence in the middle of the 20th century. I also want to compare fascism to communism—another ideology that shaped much of the 20th century—and to provide insights into the differences and similarities between these two systems.

The Origins of Fascism

Fascism emerged in the early 20th century, primarily in Italy, as a reaction to the perceived failures of liberal democracy and socialism. The term itself comes from the Italian word “fascio,” meaning a bundle or group, symbolizing unity and collective strength. It also references fasces, a bundle of rods tied around an ax symbolizing authority in the Roman Republic.  It was appropriated as a symbol by Italian fascists in an attempt to identify with Roman history, much as American patriotic symbols are being appropriated by the radical right in the U.S. today.

Benito Mussolini, an Italian political leader, is often credited as the founder of fascism. He established the groundwork for the first fascist regime in Italy beginning in 1922, after he was appointed Prime Minister. Fascism arose in a period of social and economic turmoil following the First World War. Many people in Europe were disillusioned with the existing political systems, which they believed had failed to prevent the war and its devastating consequences. The post-war economic instability, along with fears of communist revolutions like the one in Russia, provided fertile ground for the rise of fascist movements.

Mussolini, together with Italian philosopher Giovanni Gentile, published “The Doctrine of Fascism” (La Dottrina del Fascismo) in 1932, after he had consolidated political power in his hands. It lays out the guiding principles and theoretical foundations of fascism, stressing nationalism, anti-communism, the glorification of the state, the belief in a strong centralized leadership, and the rejection of liberal democracy.

The Philosophical Basis of Fascism

Fascism is rooted in several key philosophical ideas:

  • Nationalism and Militarism: Fascism places the nation or race at the center of its ideology, often elevating it to a quasi-religious status. The state is seen as a living entity that must be protected and expanded through internal police action and external military strength.
  • Authoritarianism: Fascists reject democratic institutions, believing that a strong, centralized authority is necessary to maintain order and achieve national greatness. Individual freedoms are subordinated to the needs of the state.
  • Anti-Communism and Anti-Liberalism: Fascism is explicitly opposed to both communism and liberal democracy. It views communism as a threat to national unity and social order, while liberal democracy is seen as weak and indecisive.
  • Social Darwinism: Fascists often believe in the idea of the survival of the fittest, applying this concept to nations and races. They argue that conflict and struggle are natural and necessary for the advancement of the state.

Implementation and Practice of Fascism

Fascism has been implemented in various forms, with Italy under Mussolini and Nazi Germany under Adolf Hitler being the most prominent examples. In practice, fascist regimes are characterized by:

  • Centralized Power: Fascist governments concentrate power in the hands of a single leader or party, often through the use of propaganda, censorship, political repression, and mass imprisonment and execution of opponents.
  • State Control of the Economy: While fascists generally allow for private ownership, they maintain strict control over the economy, directing resources toward the state’s goals, particularly militarization.
  • Suppression of Dissent: Fascist regimes are intolerant of opposition, often using violence, imprisonment, and even assassination to eliminate political rivals and suppress dissent.
  • Cult of Personality: Fascist leaders often create a cult of personality, presenting themselves as the embodiment of the nation and its destiny.

Comparing Fascism and Communism

While both fascism and communism reject liberal democracy, they differ significantly in their goals and methods.

  • Philosophical Differences:
    • Fascism: As mentioned earlier, fascism emphasizes nationalism, authoritarianism, and social hierarchy. It seeks to create a strong, unified state that can compete with other nations on the global stage.
    • Communism: Communism, based on the ideas of Karl Marx, advocates for a classless society where the means of production are owned collectively. It seeks to eliminate private property and achieve equality among all citizens.
  • Economic Systems:
    • Fascism: Fascists allow for private ownership but maintain state control over key industries and direct economic activity to serve the state’s interests.
    • Communism: Communism advocates for the abolition of private property, with all means of production owned and controlled by the state (or the people in theory). The economy is centrally planned and managed.
  • Political Structures:
    • Fascism: Fascist regimes are typically one-party states with a strong leader at the top. Political pluralism is non-existent, and the government exercises strict control over all aspects of life.
    • Communism: Communist states are also typically one-party systems, but they claim to represent the working class. In practice, these regimes often become highly centralized and authoritarian or totalitarian, similar to fascist states.

Comparative Examples

  • Italy and Nazi Germany (Fascism): Both Mussolini’s Italy and Hitler’s Germany exemplify fascist regimes. They were characterized by aggressive nationalism, military expansionism, and the suppression of political opposition. Hitler’s regime, however, took these ideas to their most extreme and horrifying conclusion with the Holocaust, a genocide driven by racist ideology.
  • Soviet Union (Communism): The Soviet Union under Joseph Stalin provides a clear example of a totalitarian communist state. The government abolished private property, collectivized agriculture, and implemented central planning. Political repression was severe, with millions of people imprisoned, starved to death or executed during Stalin’s purges.  It is important to recognize that Stalinist communism differed significantly from the theoretical communism of Karl Marx.

Conclusion

Fascism and communism, despite their profound differences, share certain similarities in practice, particularly in their authoritarianism and intolerance of dissent. However, their philosophical foundations and goals are fundamentally different: fascism seeks to elevate the nation above all else, while communism theoretically aims to create a classless society. Understanding these ideologies and their historical manifestations is crucial for anyone interested in the political history of the 20th century and its lasting impact on the world today. 

We can use our understanding of fascism, and its comparison with communism and democracy, to ask important questions. What kind of government do we want? Are there any possible crossovers or compromises between these systems? And, importantly, should there be?

Postscript

Many of the ideas in this post were inspired by two excellent books on the subject, “The Origins of Totalitarianism” by Hannah Arendt and “Fascism: A Warning” by Madeleine Albright.

 Doctors of the Deep Blue Sea

A Brief History of the U.S. Navy Medical Corps

The U.S. Navy Medical Corps has a history that evolves from a humble beginning during the Revolutionary War to its current role as a vital component of modern military medicine. The Medical Corps ensures the health and well-being of sailors, Marines, and their families, while contributing to public health and advancements in medical science.

Origins in the Revolutionary War

The roots of Navy medicine trace back to the Revolutionary War, when medical care aboard ships was primitive at best. Shipboard surgeons, often lacking formal medical training, treated injuries and disease with the limited tools and knowledge available to them. In the early days of the U.S. Navy, physicians served without formal commissions, often receiving temporary appointments for specific cruises.  Their primary tasks included amputations, treating infections, and caring for diseases like scurvy and dysentery.

In 1798, Congress formally established the Department of the Navy, creating the foundation for organized medical care within the naval service.  Surgeon Edward Cutbush published the first American text on naval medicine in 1808. The Naval Hospital Act of 1811 marked another milestone, authorizing the construction of naval hospitals to support the growing fleet.

Establishment of the Navy Medical Corps (1871)

The U.S. Navy Medical Corps was officially established on March 3, 1871, by an act of Congress. This legislation created a formal medical staff to support the Navy, setting standards for recruiting and training naval physicians. These physicians were initially known as “Surgeons” and “Assistant Surgeons,” tasked with providing care on ships and at naval hospitals. The act granted Navy physicians rank relative to their line counterparts, acknowledged their role as a staff corps, and established the title of “Surgeon General” for the Navy’s senior medical officer.

During this period, the Navy Medical Corps began to expand its scope. It embraced emerging medical technologies and scientific discoveries, setting the stage for its later contributions to public health and medical innovation.

The Navy Hospital Corps

The U.S. Navy Hospital Corps was established on June 17, 1898. Its creation was prompted by the increased medical needs during the Spanish-American War. Since then, the enlisted corpsmen have served in every conflict involving the United States, providing critical medical care on battlefields, aboard ships, and in hospitals worldwide.

Corpsmen are trained to perform a wide range of medical tasks, including emergency battlefield triage and treatment, surgery assistance, and disease prevention. They are often embedded directly with Marine Corps units, making them indispensable on the battlefield.

The Hospital Corps is the most decorated group in the U.S. Navy. To date, its members have earned numerous high-level awards for valor, including: 22 Medals of Honor, 182 Navy Crosses, 946 Silver Stars, and 1,582 Bronze Stars.

World Wars and the Expansion of Military Medicine

Both World War I and World War II were transformative for the Navy Medical Corps. During World War I, Navy medical personnel treated injuries and illnesses both aboard ships and in field hospitals. Their efforts were instrumental in managing wartime epidemics, including the devastating 1918 influenza pandemic.

World War II brought further advancements. The Navy Medical Corps played a pivotal role in addressing the challenges of warfare in diverse climates, including tropical diseases in the Pacific Theater. It also pioneered methods for treating trauma, burns, and psychiatric conditions.

Cold War Era and Modernization

The Cold War era marked a time of significant innovation for the Navy Medical Corps. The establishment of the Navy Medical Research Institutes advanced studies in areas such as tropical medicine, submarine medicine, and aerospace medicine. These efforts supported the Navy’s global missions and contributed to broader medical advancements.

In the latter half of the 20th century, Navy medical personnel became key players in humanitarian missions, responding to natural disasters and providing aid in conflict zones. Their expertise in public health, infectious disease control, and trauma care enhanced the Navy’s ability to spread goodwill worldwide.

Modern Contributions and Future Challenges

Today, the Navy Medical Corps supports both military readiness and global health. Its personnel provide care on ships, submarines, aircraft carriers, and for Marine Corps forces, and at shore-based facilities. They also participate in humanitarian missions and disaster response, reflecting the Navy’s commitment to a broader vision of security and well-being.

In recent years, Navy medicine has faced challenges such as the COVID-19 pandemic, addressing mental health issues among service members, and adapting to emerging threats like climate change and cyber warfare defense. These challenges underscore the evolving role of the Navy Medical Corps in a complex world.

From its early days of rudimentary care to its modern role in global health and innovation, the U.S. Navy Medical Corps has been a cornerstone of military medicine. Its contributions extend beyond the battlefield, shaping public health, medical research, and humanitarian efforts worldwide.

As the Navy Medical Corps continues to adapt to new challenges, it remains a testament to the enduring value of medical service in the defense of the nation and the promotion of global health.

Superheroes of the American Revolution

The American Revolution was fought not just by great leaders but by ordinary men and women who sacrificed everything for the promise of liberty. These “superheroes” of The Revolution—everyday soldiers of the Continental Army—endured unimaginable hardships and proved their resilience and commitment to a cause greater than themselves.

Who Were the Soldiers?
The typical soldier in the Continental Army was a young, able-bodied man in his late teens or twenties. However, recruits ranged widely in age, from boys as young as 16 to older men in their 40s or 50s. They came from all walks of life, reflecting the agrarian and small-town character of colonial America.

Work Background
Most soldiers were farmers or farm laborers, the backbone of the colonial economy. Others worked as apprentices or tradesmen, honing skills in blacksmithing, carpentry, and shoemaking. In coastal regions, fishermen and sailors also joined the ranks, bringing valuable maritime experience. Whatever their occupation, enlistment often meant leaving behind grueling but steady work, placing enormous burdens on their families and communities.

Education
Formal education was limited for most enlisted men. Literacy rates in colonial America, though higher than in Europe, were modest. Many soldiers could read and write only minimally, though these skills were sufficient for reading orders or sending letters home. Officers were generally better educated, often hailing from wealthier families with access to classical training and instruction in leadership and military strategy.

Family Life
Family ties were integral to the soldiers’ lives. Most were unmarried young men, but some older recruits left wives and children behind. Married soldiers relied on their families to manage farms and households in their absence, with women stepping into traditionally male roles to keep homes running. Communities often influenced enlistment decisions, with entire groups of men from the same town joining together, fostering camaraderie and mutual responsibility.

Why They Fought
Motivations for joining the Continental Army varied:
  • Patriotism: Many believed passionately in independence and the ideals of liberty and self-governance.
  • Economic Opportunity: For poorer colonists, enlistment offered steady (albeit delayed) pay and the promise of land grants after the war.
  • Community Expectations: Peer pressure and local leaders often spurred enlistments.
  • Adventure: For some young men, the army offered a chance for excitement and novelty.

Life in the Continental Army
Soldiers in the Continental Army faced extraordinary challenges that tested their endurance, commitment, and morale.
Logistical Struggles
The army constantly grappled with a lack of basic supplies:
  • Food: Soldiers often endured long periods of hunger, relying on inconsistent local contributions and sometimes going days without eating.
  • Clothing: Many lacked proper uniforms, footwear, and blankets, suffering in harsh weather; some even died from exposure.
  • Ammunition: Weapons and ammunition were scarce, forcing soldiers to scavenge from battlefields.

At Valley Forge in the winter of 1777–1778, these shortages reached a critical point, with thousands suffering from frostbite, near starvation and exposure.
Extreme weather compounded the soldiers’ difficulties. Winter encampments like Valley Forge were marked by freezing temperatures, snow, and overcrowded, unsanitary conditions that led to outbreaks of smallpox, typhus, and dysentery.
Soldiers marched long distances with heavy packs, often on empty stomachs and in worn-out shoes. The physical strain was enormous, and separation from families added emotional stress. Many struggled to adapt to military life, which was vastly different from their previous experiences as farmers or tradesmen.

Financial Hardships
The fledgling American government struggled to fund the war:
  • Soldiers were rarely paid on time, leading to frustration and occasional mutinies.
  • Promised wages were often months or years late, making it difficult for soldiers to support their families.

Inconsistent Leadership and Training
Early in the war, the army lacked professional training and experienced leadership. While General George Washington provided steadfast guidance, many officers were political appointees with little military expertise. This began to change when Baron von Steuben arrived at Valley Forge, introducing systematic training and discipline.

Psychological Strain
The Revolutionary War dragged on for eight years, leaving soldiers to question whether their sacrifices would lead to victory. Early defeats against the better-equipped British Army demoralized many, and desertion rates were high. Still, the shared belief in the cause of liberty and the support of local communities kept many soldiers in the fight.

The Role of Communities
The army’s survival depended on civilian support. Local farmers, tradesmen, and women provided food, clothing, and moral encouragement. Civilians risked their lives to aid soldiers, and the collective belief in independence buoyed spirits even in the darkest times.

Conclusion
The common soldiers of the Continental Army were true superheroes of the American Revolution. Despite enduring hunger, cold, disease, and financial instability, they fought with unwavering determination. Their sacrifices laid the foundation for a new nation, proving that the quest for freedom often requires immense personal and collective sacrifice.

Sources:
  • Robert Middlekauff, The Glorious Cause: The American Revolution, 1763–1789
  • Caroline Cox, A Proper Sense of Honor: Service and Sacrifice in George Washington’s Army
  • Library of Congress: American Revolution resources

Waiting For The Reichstag Fire

On the evening of February 27, 1933, the German Reichstag burst into flames. This attack on the German national parliament building was viewed by many as an attack on Germany itself.

A Dutchman named Marinus van der Lubbe was found and arrested at the scene almost immediately after the fire erupted.

Van der Lubbe confessed to setting the fire alone, but the Nazi Party quickly claimed it was part of a widespread communist conspiracy and used this claim to push for emergency powers. Many people believe that the Nazis may have set the fire themselves and used it as a pretext to declare emergency rule.

 Adolf Hitler persuaded German President Paul von Hindenburg to issue the “Decree for the Protection of the People and the State” which suspended civil liberties, including freedom of speech, press and assembly. It allowed for the arrest and detention of political opponents without due process. Thousands of communists and socialists were arrested.

Within a month new elections were held. While the Nazis did not win an outright majority, they used the fire to create fear that led to passage of the “Enabling Act” on March 23, 1933. The act gave Hitler dictatorial powers, effectively ending democracy in Germany.

The Reichstag Fire was a crucial turning point in world history. Whether it was a Nazi-engineered false flag operation or the act of a lone arsonist, it provided Hitler with the excuse he needed to dismantle democracy and establish a totalitarian dictatorship. It is a chilling example of how fear and propaganda can be weaponized to erase freedom, a lesson that remains relevant today.

The Wisdom of Dietrich Bonhoeffer

Dietrich Bonhoeffer (1906–1945) was a German Lutheran pastor, theologian, and anti-Nazi dissident. Born in Breslau, Germany (now Wrocław, Poland), into an intellectual family, he pursued theology at the University of Berlin, earning a doctorate at just 21. His early work emphasized the importance of the Church in standing against injustice, a principle that would shape his resistance to Adolf Hitler’s regime.

In the 1930s, Bonhoeffer became a leading voice in the Confessing Church, a movement opposing Nazi influence in German Protestantism. He condemned the regime’s treatment of Jews and rejected the idea of a church subservient to state ideology. After the Nazis banned him from teaching and speaking publicly, he joined the German resistance, working secretly with military officers plotting to overthrow Hitler.

Arrested in 1943 for his role in the conspiracy, Bonhoeffer was imprisoned for two years, during which he wrote some of his most profound theological works, including Letters and Papers from Prison. The quotes below are taken from this work.

On April 9, 1945, just weeks before Germany’s surrender, he was executed at Flossenbürg concentration camp. His legacy endures as a model of Christian resistance, moral courage, and faith in action.

Quotes from Letters and Papers from Prison

“The impression one gains is not so much that stupidity is a congenital defect, but that, under certain circumstances, people are made stupid or that they allow this to happen to them.”

“Having thus become a mindless tool, the stupid person will also be capable of any evil and at the same time incapable of seeing that it is evil. This is where the danger of diabolical misuse lurks, for it is this that can once and for all destroy human beings.”

“Stupidity is a more dangerous enemy of the good than malice. One may protest against evil; it can be exposed and, if need be, prevented by use of force. Against stupidity we are defenseless.”

“Neither protests nor the use of force accomplish anything here; reasons fall on deaf ears; facts that contradict one’s prejudgment simply need not be believed – in such moments the stupid person even becomes critical – and when facts are irrefutable they are just pushed aside as inconsequential, as incidental.”

“In all this the stupid person, in contrast to the malicious one, is utterly self satisfied and, being easily irritated, becomes dangerous by going on the attack.”
