
Three Shades of Left

Understanding Classical Socialism, Democratic Socialism, and Social Democracy in Today’s America

If you’ve ever wondered what politicians really mean when they throw around words like “socialism” or “social democracy,” you’re not alone. These ideas used to live mostly in political theory textbooks. Now they show up in campaign speeches and social media debates. With figures like Bernie Sanders and groups like the Democratic Socialists of America bringing these ideas into the mainstream, it’s worth sorting out what each actually means.

Even though classical socialism, democratic socialism, and social democracy all claim to focus on fairness and reducing inequality, they take very different routes to get there. Understanding those differences helps make sense of what’s really being argued about in American politics today.

Classical Socialism: The Original Blueprint

Classical socialism came out of the 19th century, when industrial capitalism was grinding workers down and a couple of guys named Karl Marx and Friedrich Engels thought they had the fix. Their idea: workers should collectively own and control the means of production — factories, land, and major industries.

This wasn’t just about taxing the rich. It was about redesigning the whole system from the ground up, through violent revolution if necessary. In theory, private property creates exploitation; collective ownership ends it. In practice, that often means top-down control by the state, with economies planned from above — as seen in the Soviet Union or Maoist China.

The central ideas of classical socialism are collective ownership of big industries and central or cooperative planning instead of market competition. Production is aimed at meeting needs, not profits, with the eventual goal of a classless, stateless society. Classical socialism accepts that revolution will most likely be necessary for implementation.

In theory, classical socialism wipes out worker exploitation and wealth extremes. Its central tenet is that production serves human needs, not corporate profit. In practice, it often leads to authoritarian governments, clumsy economic planning, and little room for innovation or dissent.

Would it work in America?
Probably not. The U.S. has deep cultural roots in individualism and private enterprise. Replacing markets with centralized planning would clash hard with both our Constitution and national temperament.

The Siblings of Socialism

In the real world, classical socialism has produced two offspring: the confusingly named democratic socialism and social democracy. While they share many similarities, the major difference is that democratic socialism aims to replace capitalism, while social democracy aims to reform capitalism and make it more humane.

Democratic socialism

Democratic socialism shares many of classical socialism’s goals but emphasizes getting there through elections — not revolution. It aims to establish central control of key parts of the economy while protecting some political freedom and most civil rights.

The vision of Democratic Socialism is collective (public) ownership of major industries like energy, transportation, manufacturing, and communications. The economy would be directed and managed by the government, but that government would be elected, not an authoritarian state. Within individual industries there would be worker self-management and workplace democracy, and small-scale private businesses would still be allowed—think Mom and Pop retail. It proposes gradual reform, not violent upheaval, while maintaining democracy and civil liberties.

There are several major drawbacks to democratic socialism. Progress can be slow, easily reversed, and still subject to bureaucratic inefficiencies. Competing globally with capitalist economies might also prove tough. To me the biggest drawback is the question of how major corporations, financial institutions, and wealthy businesspeople could ever be convinced to peacefully hand over control of major portions of the economy to a “people’s collective.”

How it fits in the U.S.:
Democratic Socialism has grown in popularity, especially among younger voters, although many of them seem to believe the label simply means making things fairer rather than the full program of public ownership it actually describes.

Bernie Sanders and Alexandria Ocasio-Cortez wear the label proudly. Still, the idea of government control of a significant portion of the economy faces serious resistance here. Realistically, it’s more a movement that nudges policy leftward than a model ready for prime time.

Social Democracy: Capitalism with Guardrails

Social democracy takes a different track. It doesn’t want to abolish capitalism — it wants to civilize it. Think Scandinavia: private ownership, strong markets, but also universal healthcare, paid leave, and free college.

The central elements of Social Democracy are a mixed economy with both public and private sectors. In some models, the government directly manages public services such as healthcare, energy, and transportation. In other models, these services remain in private hands under strong government regulation.

Regardless of the chosen model, a Social Democracy is a strong welfare state with universal benefits. Welfare in this context means earned support for hardworking citizens. Perhaps it should be called an earned-benefits state, since the term welfare carries a pejorative implication for some.

There is strong market regulation to prevent unfair competition, price gouging, and monopolies that are detrimental to the public good. A progressive tax program rewards productivity while heavily taxing passive or nonproductive income, and those taxes fund generous public services.

The government remains elective and responsive to the public. It’s proven to work: the Nordic countries show that capitalism can coexist with equality and innovation. While it is expensive, and high taxes can be a political lightning rod, it leaves capitalism’s basic structure intact. There is a constant risk that inequality can creep back if protections weaken.

In the U.S. context:
Social democracy may be the most realistic option. As social scientist Lane Kenworthy puts it, America already is a social democracy — just not a particularly generous one. We’ve got Medicare, Social Security, public education — we just underfund them compared to our European cousins.  The reality is that income lost to increased taxation is regained through decreases in insurance premiums, healthcare costs, education expenses and retirement expenses. 

With Elon Musk on the cusp of becoming the world’s first trillionaire we have to ask: “How much is enough before they accept their social responsibility to the working people that made their wealth possible?”  The bottom line is that when the ultra-wealthy are required to pay their fair share of taxes, public services become affordable. We should be supporting people, not yachts.

What’s Realistically Possible Here?

Culturally, Americans value freedom, competition, and property rights. Yet polls show younger voters are warming up to “socialism,” even if most don’t seem to be clear about the specifics. Institutionally, the U.S. political system makes sweeping change tough. Our winner-take-all elections favor a two-party system that leaves little room for socialist parties to grow independently.

Democratic Socialism may continue to shape the conversation, but full socialism — especially the classic Marxist kind — is not likely to take hold here. From my perspective, the most realistic option, Social Democracy, is too often overlooked in these discussions.

Given that, the path of least resistance looks like expanded Social Democracy: things like a revised and equitable tax code, universal healthcare, free or subsidized higher education, paid family leave, stronger labor laws, and public investment in infrastructure and green energy.

Social Democracy looks like the most attainable path — not a revolution, but an evolution toward a fairer society.  Only time will tell.

The Continental Marines: Birth of America’s Amphibious Warriors

When most people think of the American Revolution, they picture Continental soldiers marching across snowy battlefields or patriot militias defending their homes. But there’s another group that played a crucial role in securing American independence: the Continental Marines. These amphibious warriors served in America’s nascent naval force and proved their worth on both land and sea during the eight-year struggle for independence.

The Continental Marines, established in 1775, served as America’s first organized marine force during the Revolutionary War before being disbanded in 1783, laying the foundation for what would eventually become the modern U.S. Marine Corps.  Though short-lived, the original Marine Corps played a significant role in America’s fight for independence, setting precedents that the modern Marine Corps still honors today.

The Legislative Foundation

By the fall of 1775, the American colonies were no longer engaged in mere protest—they were in open rebellion against the British Empire. Battles had already been fought at Lexington, Concord, and Bunker Hill. The Continental Congress, led by figures like John Adams, had begun to organize a Continental Army under George Washington’s command. But many in the Congress, especially Adams, believed a navy was also essential to challenge British power at sea and disrupt its supply lines.

With a navy, it was reasoned, must come Marines—soldiers trained to serve aboard ships, conduct landings, enforce discipline, and fight in close quarters during boarding actions. This model was based on the British Royal Marines, a corps with a long and respected tradition.

The Continental Marines came into existence through a resolution passed by the Second Continental Congress on November 10, 1775. This date, which Marines still celebrate today as their birthday, marked a pivotal moment in American military history.

The Continental Marine Act of 1775 decreed: “That two battalions of Marines be raised consisting of one Colonel, two lieutenant-colonels, two majors and other officers, as usual in other regiments; that they consist of an equal number of privates as with other battalions, that particular care be taken that no persons be appointed to offices, or enlisted into said battalions, but such as are good seamen, or so acquainted with maritime affairs as to be able to serve for and during the present war with Great Britain and the Colonies.”

The legislation was part of Congress’s broader effort to create a Continental Navy capable of challenging British naval supremacy. The resolution was drafted by future U.S. president John Adams and adopted in Philadelphia. This wasn’t just about creating another military unit—Congress recognized that naval warfare required specialized troops who could fight effectively both on ships and on shore. The concept wasn’t entirely new—European navies had long employed marines for similar purposes—but the Continental Marines represented America’s first organized attempt to create a professional amphibious force. The term amphibious didn’t come into military use until the 1930s; at the time, such troops would more likely have been referred to informally as a naval landing force.

Recruitment: From Taverns to the Fleet

The recruitment of the Continental Marines has become the stuff of legend, particularly the story of their traditional birthplace at Tun Tavern in Philadelphia. Though legend places its first recruiting post at Tun Tavern, historian Edwin Simmons surmises that it may as likely have been the Conestoga Waggon [sic], a tavern owned by the Nicholas family. Regardless of which tavern served as the primary recruiting station, the Marines can claim the unique distinction of being the only military branch “born in a bar”.

The first Commandant of the Marine Corps was Captain Samuel Nicholas, and his first Captain and recruiter was Robert Mullan, the owner of Tun Tavern. Samuel Nicholas, a Quaker-born Philadelphia native and experienced mariner, was commissioned on November 28, 1775, becoming the Continental Marines’ senior officer and only commandant throughout their existence. While his background as a Philadelphia tavern keeper may seem unusual for a military leader, his connections in the maritime community proved invaluable for recruiting. The requirement for maritime experience shaped the character of the force from its inception.

The Marines faced immediate recruitment challenges. Originally, Congress envisioned using the Marines for a planned invasion of Nova Scotia and expected them to draw personnel from George Washington’s Continental Army. However, Washington was reluctant to part with his soldiers, forcing the Marines to recruit independently, primarily from the maritime communities of Philadelphia and New York.

By December 1775, Nicholas had raised a battalion of approximately 300 men, organized into five companies, though this fell short of the original plan for two full battalions. Robert Mullan helped to assemble the fledgling fighting force. Plans to form the second battalion were suspended indefinitely after several British regiments of foot and cavalry landed in Nova Scotia, making the planned naval assault impossible.

Organization for Dual Service

The Continental Marines were organized as a flexible force capable of serving both aboard ships and on land. For shipboard service, Marines were organized into small detachments that could be distributed across the Continental Navy’s vessels. Their organization reflected their multi-purpose mission: they served as security forces protecting ship officers, repelling boarders and joining boarding parties during naval engagements, and as assault troops for amphibious operations. Marksmanship received particular emphasis—a tradition that continues to this day—as Marines often served as sharpshooters in naval engagements, targeting enemy officers and sailors from the rigging and fighting tops of ships.

During the Revolutionary War, the Continental Marines uniform directives specified a green jacket with white facings and cuffs. However, when the first sets of uniforms were actually ordered and delivered, red facings were substituted for white. The likely reason was supply availability: red cloth was easier to obtain from Continental or captured British stores. The most authoritative description comes from Captain Samuel Nicholas, who wrote from Philadelphia in 1776 that Marines were outfitted in “green coats faced with red, and lined with white.”

The uniform also included a high leather collar, or stock, ostensibly to protect the neck against sword slashes, although there is some evidence it may actually have been intended to improve posture. This distinctive uniform item helped establish their identity as an elite force and eventually led to their treasured nickname “leathernecks.”

Shipboard Service and Naval Operations

The Continental Marines’ role aboard ship was multifaceted and crucial to naval operations. Their most important duty was to serve as onboard security forces, protecting the captain of a ship and his officers. During naval engagements, in addition to manning the cannons along with the crew of the ship, Marine sharpshooters were stationed in the fighting tops of a ship’s masts specifically to shoot the opponent’s officers and crew. These duties reflected centuries of naval tradition and drew on the example of the British Marines.

The Marines’ first major naval operation came in early 1776, when five companies joined Commodore Esek Hopkins’ Continental Navy squadron on its first cruise in the Caribbean. This deployment demonstrated their value as both shipboard security and assault troops, setting the pattern for their service throughout the war.

Major Land-Based Actions

Despite their naval origins, the Continental Marines proved equally effective in land combat. Their most famous early action was the landing at Nassau on the island of New Providence in the Bahamas in March 1776. The landing, the first by Marines on a hostile shore, was led by Captain Nicholas and consisted of 250 Marines and sailors. After 13 days, the Marines had captured two forts and the Government House, occupied Nassau, and seized cannons and large stores of supplies. While they missed capturing the gunpowder stores (which had been evacuated before their arrival), the raid demonstrated American capability to strike British positions anywhere.

Though modest in scale, this operation carried major symbolic weight and established the Marines as America’s premier amphibious force. The operation did not decisively alter the balance of the war, but it foreshadowed the Marines’ enduring identity as a seafaring, expeditionary force. Today, the Battle of Nassau is remembered less for the supplies seized than for what it represented: the moment the Continental Marines stepped onto the world stage.

Other notable operations included raids on British soil itself. In April of 1778, Marines under the command of John Paul Jones made two daring raids, one at the port of Whitehaven, in northwest England, and the second later that day at St. Mary’s Isle. These operations brought the war directly to British territory, demonstrating American reach and resolve. While the battles had no strategic impact on the outcome of the war, they were a great morale booster when reports, though largely exaggerated, reached the rebellious colonies.

Official Marine Corps history also acknowledges Marine participation in the Battle of Princeton, though it wasn’t a major Marine engagement. Marines from Captain William Shippen’s company, who had been serving aboard Continental Navy ships, participated in this battle as a part of Cadwalader’s Brigade on Washington’s flank.  Some Marines were detached to augment the artillery, with a few eventually transferring to the army.  However, the Marines’ role was relatively minor compared to their more significant naval actions during this period.

The Gradual Decline

As the Revolutionary War progressed, the Continental Marines faced increasing challenges. Financial constraints plagued the Continental forces throughout the war, and the Marines were no exception. The Continental Congress struggled to fund and supply all military branches, and the relatively small Marine force often found itself at a disadvantage competing for resources with the larger Continental Army and Navy.

Recruitment became increasingly difficult as the war dragged on. After the early campaigns, Nicholas’s battalion discontinued independent service, and remaining Marines were reassigned to shipboard detachments. Their numbers had been reduced by transfers, desertion, and the loss of eighty Marines to disease.

The Continental Navy also faced severe challenges that directly impacted the Marines. Many ships were captured, destroyed, or sold, leaving Marines without their primary operational platform. As the naval war shifted toward privateering and smaller-scale operations, the need for organized Marine units diminished.

Beginning in February 1777, two companies of Marines either transferred to Morristown to assume roles in the Continental artillery batteries or left the service altogether. This transfer of Marines to army artillery units reflected the practical reality that their specialized skills were needed elsewhere as the Continental forces adapted to changing circumstances.

Disbanded at War’s End

The end of the Revolutionary War marked the end of the Continental Marines as an organized force. Both the Continental Navy and Marines were disbanded in April 1783. Although a few individual Marines briefly stayed on to provide security for the remaining U.S. Navy vessels, the last Continental Marine was discharged in September 1783.

The last official act of the Continental Marines was escorting a stash of French silver crowns (coins) from Boston to Philadelphia—a loan from Louis XVI to help establish the Bank of North America. This final mission, conducted in 1781, symbolically linked the Marines to the new nation’s financial foundations even as their military role ended.

The disbanding reflected broader American attitudes toward standing military forces. Having won their independence, Americans were skeptical of maintaining large military establishments that might threaten republican government. The Continental Congress, facing financial pressures and political opposition to permanent military forces, chose to disband both the Navy and Marines.

Legacy

The Continental Marines’ contribution to American independence was significant despite their small numbers. In all, over the course of 7 years of battle, the Continental Marines had only 49 men killed and just 70 more wounded, out of a total force of roughly 130 Marine Officers and 2,000 enlisted. These relatively low casualty figures reflected both their effectiveness and the limited size of the force.

Rising tensions with Revolutionary France in the late 1790s led to the Quasi-War, prompting Congress to reestablish the Navy in 1798. On July 11 of that year, President John Adams signed legislation formally creating the United States Marine Corps as a permanent branch of the military, under the jurisdiction of the Department of the Navy. This new Marine Corps inherited the traditions, mission, and esprit de corps of its Revolutionary War predecessors.  Despite the gap between the disbanding of the Continental Marines and the establishment of the new United States Marine Corps, Marines honor November 10, 1775, as the official founding date of their Corps.

The Continental Marines established precedents that would shape American military doctrine for more than two centuries. The Revolutionary War not only led to the founding of the United States (Continental) Marine Corps but also highlighted for the first time the versatility for which Marines have come to be known. They fought on land, they fought at sea on ships, and they performed amphibious assaults.

The Continental Marines represented a crucial innovation in American military organization. Born from congressional resolution and tavern recruitment, these maritime warriors proved their worth in battles from the Caribbean to the British Isles. Though disbanded with the war’s end, their legacy lives on in the traditions and spirit of the modern Marine Corps. While their numbers were small and their existence brief, their impact on American military tradition proved lasting and significant.

The Republic of Indian Stream: America’s Forgotten Frontier Nation

Did you know that there was once an independent republic in the farthest reaches of northern New Hampshire, where the dense forests blend into the Canadian wilderness? Neither did I until I came across it in a fascinating book titled A Brief History of the World in 47 Borders by Jonn Elledge.

It was a short-lived but remarkable experiment in self-government. For three years in the 1830s, the settlers of a disputed border region declared themselves citizens of an independent republic—complete with their own constitution, legislature, and militia. They called it the Republic of Indian Stream, a name that today sounds almost mythical, yet it was a genuine, functioning democracy. Their story blends frontier improvisation, international diplomacy, and Yankee self-reliance—and it leaves us with a curious artifact: a constitution written not by statesmen in Philadelphia, but by farmers, loggers, and merchants caught between two competing nations.

A Territory in Limbo

The roots of the Indian Stream story go back to the Treaty of Paris (1783), which ended the American Revolution. The treaty defined the U.S.–Canada border but used vague geographic language—particularly the phrase “the northwesternmost head of the Connecticut River.” No one could agree which of several small tributaries the treaty meant.

The ambiguity created a slice of wilderness—about 200 square miles—claimed by both the United States and British Lower Canada (now Quebec). For decades, the region existed in a gray zone. Both countries sent tax collectors and law officers, both demanded military service, and neither provided clear legal protection. Residents couldn’t vote, hold secure property titles, or rely on either government’s courts. To make matters worse, they were sometimes forced to pay taxes twice—once to New Hampshire and once to Canada.

Origins of the Republic

By the late 1820s, frustration had reached a boiling point. Attempts to resolve the border dispute were unsuccessful—including arbitration by the King of the Netherlands, arranged in 1827, which failed when the United States rejected his decision favoring Great Britain.

With both sides still pressing their claims, the settlers decided they’d had enough of outside interference. On July 9, 1832, they convened a local meeting and declared independence, forming the Republic of Indian Stream. Their constitution—modeled on American state constitutions—began with a simple premise: authority rested with “the citizens inhabiting the territory.”

This wasn’t an act of rebellion but one of survival. The settlers wanted peace, order, and local control. Their goal was to withdraw from ambiguous regulation and to create a government that could function until the border question was finally settled.

The Constitution of Indian Stream

The constitution of the Republic, adopted the same day they declared sovereignty, was an impressively crafted document for a community of barely 300 people. It reflected the settlers’ familiarity with republican ideals and their determination to govern themselves fairly.

Key features included:

  • Democratic foundation: All authority stemmed from the citizens.
  • Annual elections: A single House of Representatives made the laws, and a magistrate acted as both executive and judge.
  • Judicial simplicity: Local justices of the peace handled disputes—there were no elaborate court hierarchies.
  • Individual rights: Residents enjoyed protections derived from U.S. constitutions—trial by jury, due process, and freedom from arbitrary arrest.
  • Defense and civic duty: Citizens pledged to defend their independence and assist one another in emergencies.

Despite its modest scale, the system worked. The republic passed laws, issued warrants, collected taxes, and even mustered a small militia to maintain order.

Life on the Frontier

Life in Indian Stream resembled that of many frontier communities: logging, farming, hunting, and trading. The land was rough, winters long, and access to distant markets limited. Yet the people thrived through cooperation and self-reliance. Trade with both Canadian and New Hampshire merchants continued—proof that practicality often trumped politics on the frontier.

The republic’s remote location provided a degree of safety from interference, but not immunity. Both British and American agents continued to assert claims, and occasional arrests or skirmishes kept tensions high.

The End of the Republic

The experiment in independence lasted only three years. In 1835, a dispute between an Indian Stream constable and a Canadian deputy sheriff triggered a diplomatic crisis. Canada sent troops to assert control, prompting New Hampshire’s governor to respond in kind.

Realizing they were caught between two competing governments, the citizens voted in April 1836 to accept New Hampshire’s jurisdiction. Indian Stream became part of the town of Pittsburg, and peace was restored.

The larger boundary issue wasn’t fully settled until the Webster–Ashburton Treaty of 1842, which formally placed Indian Stream within the United States.

Legacy of a Lost Republic

Today, little remains of the Republic of Indian Stream except New Hampshire Historical Marker #1 and a scattering of homesteads near the Connecticut Lakes.

Yet its legacy is profound. It may have lasted only three years, but its story reflects the broader American frontier experience: independence, inventiveness, and determination to live free from arbitrary rule. In an era defined by rigid borders and powerful states, the memory of Indian Stream reminds us that freedom once depended, not on lines on a map, but on the courage of people willing to draw their own lines.

The story also illustrates the complexities of nation-building in the early American period when borders remained fluid and communities sometimes had to forge their own path toward self-governance. While the republic was short lived, it stands as a testament to the ingenuity and determination of America’s frontier settlers, who refused to accept statelessness and instead chose to create their own nation in the wilderness.

The Indian Stream constitution reminds us that political order is not always imposed from above; sometimes, out of necessity, it is created from below. The settlers were neither revolutionaries nor idealists—they simply wanted clear rules, fair courts, and predictable taxes. Ordinary citizens, faced with legal chaos and neglect, designed a functioning democracy grounded in fairness and mutual responsibility.

That such a tiny community would craft its own constitution speaks to the enduring appeal of constitutional government in the early 19th century. Even on the edge of two empires, far from capitals and legislatures, these settlers turned to a familiar American solution: write it down, elect your leaders, and hold them accountable every year.  Hopefully we will be able to keep their spirit and live up to the example of Indian Stream.

The Real Enemy of the Revolution: Disease

When you think about the American Revolution, you probably picture dramatic battles like Bunker Hill or the crossing of the Delaware. But here’s something that might surprise you: the biggest killer during the war wasn’t British muskets—it was disease. And it’s not even close.

The Numbers Tell a Grim Story

Let’s talk numbers for a second. On the American side, about 6,800 soldiers died from battlefield wounds. Sounds terrible, right? Well, disease killed an estimated 17,000 to 20,000. That’s roughly three times as many. The British and their Hessian allies faced similar odds: around 7,000 combat deaths versus 15,000 to 25,000 disease deaths.

Think about that for a moment. You were actually safer charging into battle than hanging around camp. In some regiments, disease wiped out more than a third of the troops before they even saw their first fight.

Why Was Disease So Deadly?

Picture yourself in a Revolutionary War military camp. Hundreds of men crammed together in makeshift shelters, no running water, primitive latrines dug too close to where everyone lives, and basically zero understanding of what we’d call “germ theory” today. It’s a perfect storm for infectious disease.

The big killers were:

Smallpox was the heavyweight champion of camp diseases. This virus killed about 30% of people it infected and spread like wildfire through packed military camps. Soldiers tried to protect themselves through a risky practice called inoculation—basically giving themselves a mild case of smallpox on purpose by rubbing infected pus into cuts on their skin. Without proper quarantine procedures, though, this sometimes made outbreaks worse instead of better.

Typhus (called “camp fever” back then) spread through lice and fleas. If you’ve ever been on a prolonged camping trip and felt gross after a few days, imagine that times a hundred. Soldiers lived in the same clothes for weeks, rarely bathed, and the parasites just had a field day. The fever, headaches, and diarrhea that came with typhus made it one of the most dreaded camp diseases.

Dysentery (charmingly nicknamed “bloody flux”) came from contaminated water and poor sanitation. When your latrine is 20 feet from your water source and you don’t understand how disease spreads, this becomes pretty much inevitable. The severe diarrhea weakened soldiers to the point where many couldn’t fight even if they wanted to, and it made them even more susceptible to other diseases.

Malaria was especially important in the South, where mosquitoes thrived in the humid climate. This one actually played a fascinating role in how the war ended—but more on that in a bit.

When Disease Changed Everything

The 1776 invasion of Canada was a disaster largely because of smallpox. Out of 3,200 American soldiers in the Quebec campaign, 1,200 fell sick. You can’t mount much of an offensive when more than a third of your army is flat on their backs with fever. Similarly, during the siege of Boston, Washington couldn’t effectively engage the British because so many of his troops were sick with smallpox. These weren’t just setbacks—they were strategic catastrophes.

This is what pushed George Washington to make one of his boldest decisions in 1777: he ordered a mass inoculation of the Continental Army. This was controversial and dangerous at the time, but it worked. Washington had survived smallpox himself as a young man, so he understood both the risks and the benefits. The inoculation program probably saved the army from complete collapse.

Medical “Treatment” Was Often Worse Than Nothing

Here’s where things get really grim. Eighteenth-century medicine was basically medieval. Doctors believed in “balancing the humors” through bloodletting—literally draining blood from already weakened soldiers. They also gave powerful laxatives to people who were already suffering from diarrhea. Yeah, let that sink in.

Pain relief meant opium-based drinks or just straight alcohol. Some doctors used herbal remedies, but results were inconsistent at best. Quinine helped with malaria, though nobody really understood why. Mostly, if you got seriously sick, your survival came down to luck and a strong constitution.

Valley Forge: The Turning Point

Valley Forge is famous for being a brutal winter encampment, and disease was a huge part of why it was so terrible. Scabies left nearly half the troops unable to serve. Dysentery and camp fever killed somewhere between 1,700 and 2,000 soldiers during that single winter—and remember, these weren’t battle casualties. These men died from preventable diseases in what was supposed to be a safe encampment.

But Valley Forge taught the Continental Army a crucial lesson. After that nightmare winter, military leaders started taking sanitation seriously. They began focusing on camp hygiene, protecting water supplies, placing latrines away from living areas, and making sure soldiers could bathe and wash their clothes and bedding.

Baron von Steuben is famous for teaching the Continental Army how to march and drill, but he also deserves credit for implementing serious sanitation reforms. These changes helped prevent future disease outbreaks and kept the army functional for the rest of the war.

The Secret Weapon at Yorktown

Here’s one of my favorite historical details: mosquitoes may have helped win American independence. At Yorktown, roughly 30% of Cornwallis’s British army was knocked out by malaria and other diseases during the siege. The British commander was trying to hold off the American and French forces while also dealing with the fact that almost a third of his troops were too sick to fight.

Many American soldiers from the southern colonies had grown up with malaria and had some partial immunity. The British? Not so much. Some historians even think Cornwallis himself might have been suffering from malaria, which could have affected his decision-making. His second-in-command, Brigadier General Charles O’Hara, was definitely seriously ill during the siege. Fighting a war while you can barely stand is a pretty significant handicap.

The Bigger Picture

The American Revolution shows us something important: wars aren’t just won on battlefields. They’re won by the side that can keep its soldiers alive and healthy. Disease shaped strategic decisions, determined the outcomes of campaigns, and killed far more men than any British regiment ever did.

Washington’s decision to inoculate the army was genuinely revolutionary (pun intended). It showed a willingness to embrace controversial medical practices for the greater good. The sanitation reforms that came out of Valley Forge laid groundwork for modern military medicine and influenced public health policies in the new United States.

So next time someone mentions the American Revolution, remember: while we celebrate the military victories, one of the most important battles was fought against an enemy you couldn’t see—and for most of the war, nobody really knew how to fight it.

The casualty figures and major disease outbreaks are well-documented in historical records. The specific percentages and numbers are estimates based on historical research, as precise record-keeping was limited during this period. The overall narrative about disease being the primary cause of death is strongly supported by multiple historical sources.

Pistols at Dawn: The Rise and Fall of the Code Duello

Not long ago I was watching a news show and one of the panelists started talking about “a duel of words” that went on in a congressional hearing. I was intrigued by the use of the word duel and I thought I’d look into the history of this strange custom.

In the age before Twitter feuds, internet trolling, and legal settlements, honor was defended with pistols at dawn. The Code Duello, a set of rules governing dueling, offers a fascinating glimpse into how ideas of masculinity, reputation, and justice shaped public and private life in the Anglo-American world from the mid-18th century through the antebellum era.

The Code Duello emerged as one of the most distinctive and controversial aspects of genteel culture in the American colonies and the early United States. This elaborate system of honor-based combat, imported from European aristocratic traditions, would profoundly shape American society between 1750 and 1860, creating a culture where personal honor often trumped legal authority and where violence became a sanctioned means of dispute resolution among the elite.

European Origins 

The Code Duello originated in Renaissance Italy and spread throughout European aristocratic circles as a means of settling disputes while maintaining social hierarchy. The practice reached the American colonies through British and Continental European settlers who brought with them deeply ingrained notions of honor, reputation, and gentlemanly conduct. Unlike random violence or brawling, dueling operated under strict protocols that emphasized courage, skill, and adherence to prescribed rituals.

The most influential codification was the Irish Code Duello of 1777, written by gentlemen of Tipperary and Galway. This twenty-six-rule system established procedures for issuing challenges, selecting weapons, determining conditions of combat, and defining acceptable outcomes. The code emphasized that dueling was a privilege of gentlemen, requiring both participants to be of equal social standing and ensuring that honor could only be satisfied through formal, regulated combat.

Colonial Implementation and Adaptation

The first recorded American duel occurred in 1621 in Plymouth, Massachusetts, between two servants, but the practice soon became the exclusive domain of elites as only “gentlemen” were considered to possess honor worth defending in this way.

The Irish Code Duello was widely adopted in America, though often with local variations. In 1838, South Carolina Governor John Lyde Wilson published an “Americanized” version, known as the Wilson Code, which further codified the practice for the southern states and attempted to increase negotiated settlements. These codes served as the de facto law of honor, even as formal legal systems struggled to suppress dueling.

The practice gained prominence in the southern plantation hierarchy, where dueling fit well with the culture’s emphasis on personal honor. The ritual was highly formal: challenges were issued in writing, seconds (assistants to the duelists) attempted to mediate, weapons were chosen, and terms were carefully negotiated.

Colonial dueling adapted European practices to American circumstances. While European duels often involved swords, reflecting centuries of aristocratic martial tradition, American duelists increasingly favored pistols, which were more readily available and required less specialized training. This shift democratized dueling to some extent, as pistol proficiency was more easily acquired than swordsmanship, though the practice remained largely restricted to the upper classes.

The Revolutionary War significantly expanded dueling’s influence. Military service brought together men from different regions and social backgrounds, spreading dueling customs beyond their original geographic and social boundaries. Officers who had learned European military traditions during the conflict carried these practices into civilian life, establishing dueling as a marker of martial virtue and gentlemanly status.

The Early Republic

Following independence, dueling became increasingly institutionalized in American society.  The young republic’s political culture, characterized by intense partisan conflict and personal attacks in newspapers, created numerous opportunities for perceived slights to honor that demanded satisfaction through combat.

The most famous American duel occurred in 1804 when Aaron Burr killed Alexander Hamilton at Weehawken, New Jersey. This encounter exemplified both the power and the contradictions of dueling culture. Hamilton, despite philosophical opposition to dueling, felt compelled to accept Burr’s challenge to maintain his political viability. The duel’s outcome effectively ended Burr’s political career and demonstrated how adherence to the code could destroy the very honor it purported to defend.

Prior to becoming president, Andrew Jackson took part in at least three duels, although he is rumored to have been in many more. In his most famous duel, Jackson shot and killed a man who had insulted his wife. Jackson was also wounded in the duel and carried the bullet in his chest for the rest of his life.

Political dueling reached epidemic proportions in the antebellum period. Congressional representatives, senators, and other public figures regularly challenged opponents to combat over policy disagreements or personal insults. The practice became so common that some politicians deliberately provoked duels to enhance their reputation for courage, while others saw dueling as essential to maintaining credibility in public life.

Regional Variations and Social Dynamics

Dueling culture varied significantly across regions. The South developed the most elaborate and persistent dueling traditions, where the practice became intimately connected with concepts of honor, masculinity, and social hierarchy that would later influence Confederate military culture. Southern dueling codes often emphasized elaborate rituals and multiple exchanges of fire, reflecting a culture that viewed honor as more important than life itself.

Northern attitudes toward dueling were more ambivalent. While many Northern elites participated in dueling, the practice faced stronger opposition from religious groups, legal authorities, and emerging middle-class values that emphasized commerce over honor. Anti-dueling societies formed in several Northern cities, and some states enacted specific anti-dueling legislation, though enforcement remained inconsistent. Laws against it were passed in several colonies as early as the mid-18th century, with harsh penalties including denial of Christian burial for duelists killed in combat. Clergy denounced it as un-Christian, and reformers sought to eradicate it, but the practice persisted, especially in regions where courts were weak or social hierarchies unstable. The South, with its less institutionalized markets and governance, saw dueling as a quicker, more reliable way to settle disputes.

Western frontier regions adapted dueling to their own circumstances, often emphasizing practical marksmanship over elaborate ceremony. Frontier dueling tended to be less formal than Eastern practices, but it served similar functions in establishing social hierarchies and resolving disputes in areas where legal institutions remained weak.

Decline and Legacy

By the 1850s, dueling faced increasing opposition from legal, religious, and social reform movements. The rise of professional journalism, which could destroy reputations without resort to violence, provided alternative means of defending honor. Changing economic conditions that emphasized commercial success over martial virtue gradually undermined dueling’s social foundations.

The Civil War marked dueling’s effective end as a significant social institution. The massive scale of organized violence made individual combat seem anachronistic, while post-war society increasingly emphasized industrial progress over aristocratic honor. Though isolated duels continued into the 1870s, the practice lost its central role in American elite culture.

The Code Duello’s legacy extended far beyond its formal practice. It established patterns of violence, honor, and masculine identity that would influence American culture for generations, contributing to regional differences in attitudes toward violence and honor that persist today. The code’s emphasis on individual resolution of disputes also reflected broader American skepticism toward institutional authority, helping shape a culture that often preferred private justice to public law.

How the Code Duello Shaped Western Gunfighting Culture

The Code Duello was a script for settling personal disputes through controlled violence. Its influence waned in the East by the mid-1800s, but many of its ideas persisted, especially among military veterans, Southern transplants, and frontiersmen. As the American frontier expanded, the ethic of “settling scores” through personal combat found fertile ground in the west. What changed was the style and setting.

From Pistols at Dawn to High Noon

In the Code Duello, challenges were typically issued in writing, often with formal language and designated seconds. A duel was planned, often days in advance, and fought with flintlock pistols or swords. By contrast, gunfights in the Old West were more spontaneous, often provoked by insults, cheating, or long-standing feuds. Still, both forms were ultimately about defending personal honor in public view.

Gunfighters like Wild Bill Hickok and Wyatt Earp became mythologized partly because they embodied an honor-based culture in an environment where the law was weak or slow. In many ways, the Western gunfight was an informal, democratized version of the Code Duello, stripped of its aristocratic pretenses but keeping its emotional and symbolic core.

Myth vs. Reality

Ironically, formal duels were relatively rare in the actual Old West, and many “gunfights” were closer to ambushes or drunken brawls than ritualized combat. But dime novels, Wild West shows, and later Hollywood films reimagined them using a Code Duello-like template: two men meet face to face, in broad daylight, to resolve a conflict through a test of nerve and skill. The image of the high-noon shootout—with a silent crowd, an agreed time and place, and an implied code of fairness—is the Code Duello in cowboy boots, but it likely never existed.

The Duel That Never Was

I will end the discussion of Code Duello with what may be one of the most unusual of all American dueling stories.  

In 1842, Abraham Lincoln became embroiled in a public dispute with James Shields, the auditor of Illinois, largely over Illinois State banking policy and some satirical letters that mocked Shields.  Shields took great offense to these attacks—particularly the ones written by Lincoln under the pseudonym “Rebecca”—and formally challenged Lincoln to a duel.  According to the rules of dueling, Lincoln, as the one challenged, had the right to choose the weapons. He selected cavalry broadswords of the largest size to take advantage of his own height and reach over Shields.

The Duel’s Outcome

The duel was scheduled for September 22, 1842, on Bloody Island, a sandbar in the Mississippi River near Alton, Illinois, lying on the Missouri side of the channel—chosen because dueling was still legal in Missouri. On the day of the duel, before any blood was shed, Lincoln dramatically demonstrated his advantage by slicing off a high tree branch with his broadsword, showcasing his reach and physical prowess. After witnessing this and following subsequent negotiations by their seconds, Shields and Lincoln decided to call off the duel, resolving their differences without violence.

Legacy

Although the duel never resulted in violence, it became a notorious episode in Lincoln’s life, one he rarely spoke of later, even when asked about it.  The event is commonly cited as a reflection of Lincoln’s quick wit, physical presence, and preference for peaceful resolution when possible.  While Abraham Lincoln never actually fought a duel, he was briefly a participant in one of the more colorful near-duels of American political history.

A Final Thought

Perhaps the world would be a better place if we reinstituted some elements of the Code Duello and, instead of sending armies off to fight bloody battles, national leaders settled disputes by individual combat. I suspect there would be many more negotiated settlements.

Powdered Wigs and Politics: The Rise and Fall of America’s Most Distinguished Hair Trend

I’ve been spending a lot of time recently researching and writing about the 250th Anniversary of the American Revolution and I keep asking myself, “What’s up with the wigs?”   Have you ever wondered why the Founding Fathers look so impossibly fancy in their portraits?  Well, you can thank a French king and a syphilis epidemic. The elaborate wigs worn by early American leaders weren’t just fashion statements—they were complex social symbols that said everything about who you were, what you could afford, and how seriously you wanted to be taken.

Where It All Started

The wig craze didn’t begin in America. It started across the Atlantic when France’s King Louis XIII went bald prematurely in the 1600s and decided to cover it up with a wig. But it was his son, Louis XIV, who really kicked things into high gear. When the Sun King started losing his hair, he commissioned elaborate wigs that became the epitome of aristocratic style. European nobility, desperate to emulate French sophistication, quickly followed suit.

The practice also had a less glamorous origin story. Syphilis was rampant in 17th-century Europe, and one of its unfortunate side effects was hair loss. Wigs conveniently covered up this telltale symptom while also hiding the sores and blemishes that came with the disease.

Europe in the 1600s and 1700s also had frequent outbreaks of lice and other parasites. Shaving one’s natural hair short and wearing a wig—which could be cleaned, boiled, or deloused more easily—became a practical solution. Powdering helped keep wigs fresh and masked odors.

By the time the fashion crossed the ocean to colonial America in the early 1700s, wigs had become standard attire for anyone with social pretensions.

Status on Your Head

In colonial America, your wig announced your place in society before you even opened your mouth. The most expensive and elaborate wigs featured long, flowing curls that cascaded past the shoulders—these full-bottomed wigs could cost the equivalent of several months’ wages for an average worker. Wealthy merchants, successful plantation owners, and colonial officials wore these statement pieces to project authority and refinement.

Professional men like doctors, lawyers, and clergy typically wore more modest styles. The “tie wig” gathered hair at the back with a ribbon, while the “bob wig” featured shorter hair that ended around the neck. These styles were practical enough for men who actually had to work, but still formal enough to command respect. Even the style of curl mattered—tight curls suggested conservatism and tradition, while looser waves indicated a more progressive outlook.

Working-class men generally couldn’t afford real wigs. Some wore simple caps or went bareheaded, while others might invest in a cheap wig made from horsehair or goat hair for special occasions. The quality difference was obvious—human hair wigs, especially those made from blonde or white hair, were luxury items that only the wealthy could obtain.

Many men who did not wear wigs but still wanted the fashionable look would grow their own hair long, pull it into a queue (pony tail), and powder it. George Washington is a good example — portraits show his natural hair powdered white, not a wig.

The Daily Reality of Wig Life

Maintaining these hairpieces was no joke. Owners had to powder their wigs regularly with starch powder, often scented with lavender or orange, to achieve that distinctive white or gray color that signaled refinement. The powder got everywhere, which is why men often wore special dressing gowns during the powdering process.

Wigs required regular cleaning and restyling by professionals called peruke makers or wigmakers. These craftsmen commanded good money in colonial cities, advertising their services alongside other luxury trades. The hot, humid summers in places like Virginia and South Carolina made wig-wearing particularly miserable, but fashion demanded sacrifice.

The Revolutionary Shift

By the time of the American Revolution, attitudes toward wigs were already changing. The shift happened for several interconnected reasons, and it reflected broader transformations in American society.

First, the Revolutionary War itself promoted practical thinking. Military officers found elaborate wigs impractical in the field, and the democratic ideals of the Revolution made aristocratic European fashions seem pretentious. Many younger revolutionaries, including Thomas Jefferson, stopped wearing wigs as a political statement against Old World affectation.

(Image: a young Jefferson with a wig)

Second, France—the original source of wig fashion—underwent its own revolution in 1789. As French revolutionaries literally beheaded the aristocracy, powdered wigs became associated with the despised nobility. What had once symbolized sophistication now suggested tyranny and excess.

In Great Britain, Parliament introduced a tax on hair powder as part of Prime Minister William Pitt the Younger’s revenue-raising measures.  The law required anyone who used hair powder to purchase an annual certificate costing one guinea (a little over $200 in today’s money).  This contributed to the growing sense that wigs were an unnecessary extravagance. Meanwhile, changing ideals of masculinity emphasized natural simplicity over artificial ornamentation.

By the early 1800s, the wig had largely disappeared from everyday American life. A new generation of leaders, including Andrew Jackson, proudly displayed their natural hair. The transition happened remarkably quickly—within a single generation, wigs went from essential to absurd. By the 1820s, anyone still wearing a powdered wig looked hopelessly outdated, clinging to a world that no longer existed.

The Legacy

Today, elaborate wigs survive primarily in British courtrooms, where some judges still wear them in formal proceedings—a deliberate echo of legal tradition. The powdered wigs of the Founding Fathers remain iconic, instantly recognizable symbols of early American history, even though the men who wore them were already abandoning the fashion by the time they built the new nation.

Divine Providence and Patriotism

Religion in the Ranks of the Continental Army

The Continental Army that fought for American independence from 1775 to 1783 represented a cross-section of colonial religious life, bringing together men from diverse faith traditions under a common cause. The religious faith of both enlisted soldiers and officers reflected the broader religious landscape of colonial America, and their regional differences contributed to a complex tapestry of faith within the ranks.

The Continental Army drew from a population where religious diversity was already well-established, particularly when compared to European armies of the same period. Protestant denominations were the majority within the ranks, reflecting the colonial religious demographics.

The American Revolution was not only a political and military struggle but also a deeply religious experience for many Continental Army soldiers. Their faith shaped how they interpreted the war, coped with its hardships, and interacted with comrades from diverse backgrounds.

Enlisted soldiers often relied on providentialism, the belief that God directly intervened in daily life, to make sense of battlefield chaos and suffering. Diaries and letters reveal troops attributing survival in skirmishes, unexpected weather shifts, and even mundane events to divine will. For example, many saw the Continental Army’s unlikely battlefield victories as evidence of God’s favor toward their cause.

Diversity in the Ranks

Congregationalists from New England formed a significant portion of the army, bringing with them the Puritan theological tradition that emphasized divine providence and moral responsibility. Ministers in New England frequently preached that resistance to tyranny was a Christian duty, and many soldiers viewed the fight against tyranny as a continuation of the struggle that had driven their ancestors to flee religious persecution.

Presbyterian soldiers, many of Scots-Irish descent, comprised a substantial group. Concentrated heavily in the Middle Colonies and frontier areas, they carried strong evangelical influences regardless of their home colony. The challenging conditions of frontier life had already created a more individualistic and emotionally intense form of Christianity that adapted well to military service. These soldiers often brought a fatalistic acceptance of divine will combined with fierce determination.

Baptist and Methodist soldiers, though fewer in number, represented growing evangelical movements that would later transform American Christianity. German Reformed and Lutheran soldiers from Pennsylvania added to the religious diversity, while smaller numbers of Catholics, particularly from Maryland and Pennsylvania, served despite facing legal restrictions in many colonies. Even a few Jewish soldiers joined the cause, though their numbers were minimal given the tiny Jewish population in colonial America. The religious pluralism in regiments from the Middle Colonies created a more tolerant atmosphere that foreshadowed the religious diversity of the new nation.  

Quakers were generally pacifists and avoided military service; however, some “Free Quakers” broke with their tradition and joined the Patriot cause. Other pacifist religious groups, such as the Mennonites, abstained from combat but occasionally provided non-combatant support.

Anglicans, ironically fighting against their own church’s mother country, served in significant numbers, particularly from the Southern colonies, where the Church of England had frequently been established as a state-supported church.

Religious diversity was generally a unifying force in the Continental Army, but instances of religious tension, though relatively limited, were not entirely absent. One significant example occurred early in the war, in November 1775, when some American troops planned to burn an effigy of the Pope on Guy Fawkes Day (known as Pope’s Day in New England). General Washington strongly condemned this anti-Catholic action, denouncing it as indecent and lacking common sense.

Washington actively tried to prevent sectarianism from undermining unity in the ranks, whether between Protestants and Catholics or among different Protestant denominations. The overall trend was towards religious tolerance and unity, with religious diversity ultimately contributing positively to the army’s cohesion and morale.

Officer Corps and Religious Leadership

The officer corps of the Continental Army reflected a somewhat different religious profile than the enlisted ranks. Many officers came from the colonial elite and were often Anglican or belonged to more established denominations. However, the Revolution’s anti-episcopal sentiment led many Anglican officers to distance themselves from their church’s political connections while maintaining their basic Christian beliefs.  The relationship between the soldiers and the established Church of England became increasingly strained as the Revolution progressed.

George Washington himself exemplified this complex relationship with religion. Though nominally Anglican, Washington never wrote about his personal faith.  He was likely influenced by Deist philosophy, then popular among Enlightenment thinkers including Jefferson and possibly Franklin. Deism holds that reason and observation of the natural world are sufficient to determine the existence of a Creator, but that this Creator does not intervene in the universe after its creation.

Washington regularly invoked divine providence in his correspondence and orders, understanding the importance of religious sentiment in maintaining morale, even while his personal beliefs remained ambiguous. His famous directive that “the blessing and protection of Heaven are at all times necessary but especially so in times of public distress and danger” reflected his understanding of religion’s role in military leadership.

Other prominent officers brought their own religious convictions to their leadership. Nathanael Greene, the Southern theater commander, was raised as a Quaker but was expelled from the Society of Friends for his military service. Rev. Peter Muhlenberg, a Lutheran minister, left the pulpit and joined the Continental Army, rising to the rank of Major General.  The Marquis de Lafayette, though Catholic, adapted to the predominantly Protestant environment of the American officer corps.

Officers, including Washington, viewed religion as a tool for discipline and unity. Washington mandated Sunday worship and appointed chaplains to every brigade, insisting they foster “obedience and subordination.”

Continental Army Chaplain Service

Recognizing the importance of religion to morale and discipline, the Continental Congress authorized the appointment of chaplains to serve with the army. The chaplain system evolved throughout the war, beginning with regimental chaplains and eventually expanding to include brigade and division chaplains for larger organizational units.

Continental Army chaplains faced unique challenges. Unlike European armies with established state churches, American chaplains served religiously diverse units. They needed to provide spiritual comfort to soldiers from different denominational backgrounds while avoiding sectarian conflicts that could undermine unit cohesion. Most chaplains were Protestant ministers, reflecting the army’s composition, but they were expected to serve all soldiers regardless of specific denominational affiliation.

The duties of Continental Army chaplains extended beyond conducting religious services. They often served as informal counselors, helped soldiers write letters home, provided basic education to illiterate soldiers, and sometimes even served as medical assistants. Their moral authority made them valuable in maintaining discipline, and many commanders relied on chaplains to address problems of desertion, drunkenness, and other behavioral issues.

Chaplains also played important roles in significant military events. They conducted prayers before major battles and led thanksgiving services after victories. They provided comfort to the dying and services for the dead.

The British recognized the importance of chaplains to the Continental Army and in some cases offered rewards for their capture.

The famous painting of “The Prayer at Valley Forge” with its image of Washington praying alone in the snow, whether historically accurate or not, represents the type of spiritual leadership chaplains were expected to provide during the army’s darkest moments.

Conclusion

Religion in the Continental Army reflected both the diversity and the common threads of colonial American faith. While denominational variations existed, most soldiers shared basic Christian beliefs that provided comfort during hardship and gave meaning to their sacrifice. The army’s religious diversity foreshadowed the religious pluralism that would characterize the new American nation, while the chaplain service established precedents for military religious support that continue today. The Revolution’s success owed much to the spiritual resources that sustained these soldiers through eight years of difficult warfare, demonstrating religion’s crucial role in the founding of the American republic.

Always Faithful: A Brief History of the Marine Corps Motto

When I started training as a Marine more than 50 years ago, one of the first things we were taught was the call and response “Semper Fi” followed quickly by “Do or Die.” But to Marines, Semper Fi, Semper Fidelis—Always Faithful—is more than just a motto. It becomes a personal belief system, a statement of individual integrity, and a way of life: faithful to country, faithful to the Corps, faithful to fellow Marines, faithful to duty.

How did Marines come to adopt this distinctly non-martial motto? Other, more military-sounding mottos and nicknames come to mind: “Devil Dogs,” “First to Fight,” and “Leathernecks.” But Semper Fidelis has become the way Marines see themselves, so much so that their greeting to one another is “Semper Fi.” The same ethos is embodied in an unofficial Marine Corps motto, “No Man Left Behind.”

But what is the origin of this motto that seems to sum up the entire philosophy of the Marine Corps?

The United States Marine Corps is known for its discipline, dedication, and fierce loyalty, qualities that are symbolized by Semper Fidelis. Translated from Latin, the phrase means “Always Faithful.” But like many traditions within the military, the motto is rooted in a rich history that stretches back hundreds of years.

The Marine Corps was established in 1775 as the Continental Marines, but the famous motto did not appear until more than a century later. By the early 19th century, several mottos had been associated with the Marines, including “Fortitudine” (With Fortitude) and “By Sea and by Land.” While these phrases captured elements of the Marines’ mission, they lacked the enduring emotional impact that would ultimately come with Semper Fidelis.

It was in 1883 that the motto was formally adopted under the leadership of the 8th Commandant, Colonel Charles McCawley. Colonel McCawley likely chose that motto because it embodies the values of loyalty, faithfulness and dedication that he believed should define every Marine.  Unfortunately, we will never know his exact reason for choosing this specific motto because he did not leave any documentation about his thought process.  Regardless, from that point on, the motto became inseparable from the identity of the Corps.

The phrase “Semper Fidelis” has much older origins than its Marine Corps adoption. It’s believed to have originated from phrases used by senators in ancient Rome, with the earliest recorded use as a motto dating back to the French town of Abbeville in 1369. The phrase has been used by various European families since the 16th century, and possibly as early as the 13th century.

The earliest recorded military use was by the Duke of Beaufort’s Regiment of Foot, raised in southwestern England in 1685. The motto also has connections to Irish, Scottish, and English nobility, as well as 17th-century European military units, some of whose members may have emigrated to the American colonies in the 1690s.

The choice of the Latin phrase by Colonel McCawley was likely deliberate. Latin carries with it a sense of permanence and tradition, and its concise wording communicated volumes in only two words. “Always Faithful” perfectly captured the bond that must exist between Marines and the responsibilities they shoulder. Marines are expected to remain faithful to the mission, to their comrades in arms, and to the United States, regardless of the personal cost. It is this idea of unshakable fidelity that has come to define what it means to wear the Eagle, Globe, and Anchor.

Since its adoption, Semper Fidelis has carried Marines through every conflict the United States has faced. From the battlefields of World War I, where Marines earned the name “Devil Dogs,” to the grueling island campaigns of the Pacific in World War II, to the frozen battlefields of Korea, to the steaming jungles of Vietnam, Marines have demonstrated again and again what it means to be “Always Faithful.” In modern times, whether in Iraq, Afghanistan, or in humanitarian missions across the globe, this motto continues to serve as a reminder of the Corps’ unwavering commitment.

The phrase has also influenced the broader culture of the Marines, inspiring the title of the official Marine Corps march, “Semper Fidelis,” composed by John Philip Sousa in 1888, which remains a powerful symbol of pride and esprit de corps.

The motto’s meaning extends beyond active service. Marines pride themselves on being “once a Marine, always a Marine,” and Semper Fidelis reflects that lifelong bond. Even after leaving the uniform behind, Marines carry that sense of loyalty into civilian life, honoring the values and traditions of their service. For many, it becomes a central guiding principle throughout their lives. Marine veterans never say “I was a Marine”; they say “I am a Marine.”

In the end, the motto “Semper Fidelis” is far more than a catchy phrase. It is both a promise and a challenge—a pledge of unwavering loyalty and a challenge to live up to the highest standards of duty, honor, and fidelity. When Marines declare “Semper Fi,” they acknowledge not only their devotion to the Marine Corps, but also the unbreakable loyalty that binds them together as brothers and sisters in arms.

The celebration of the 250th anniversary of the signing of the Declaration of Independence is coming up next year on July 4th. But what about the events leading up to this? What about the men and women who helped make this happen? There are events coming up to commemorate the 250th anniversary of the founding of the Continental Navy and the Continental Marines in 1775. We will be holding commemorative celebrations here in West Virginia and there will be a national event in Philadelphia in October of this year.

Frédéric Bazille: The Impressionist Who Never Saw Impressionism

I like to think of my research and my writing as being eclectic, although sometimes I have to admit they may be better described as unfocused. This post may be an example of one of those episodes. Recently I was looking through a magazine and saw an ad with an illustration that was obviously based on Monet’s garden at Giverny. That sent me back to a series of lectures I had watched on the French Impressionists, which got me thinking about Frédéric Bazille, an artist I have always found fascinating. I decided to spend a little time looking into him. So I completely forgot about the project I was working on and started on this one. That may be why I have so many unfinished articles and random files of unrelated research.

In French Impressionism there are names that stand in the forefront — Monet, Renoir, Degas, Manet — and then there are names that hover just behind them. Frédéric Bazille is one of those in the shadows. He was part of the same circle, painted with the same daring brush, and showed the same fascination with light and color. Yet his life ended before Impressionism even had a name. Bazille was killed in the Franco-Prussian War in 1870, at just twenty-eight years old. His death robbed the movement of both a gifted painter and a generous friend who helped shape its history.

A Wealthy Outsider

Bazille was born in Montpellier in 1841, the son of a prosperous Protestant family. Unlike Monet and Renoir, who often lived in dire poverty, Bazille never worried about how to pay for paints or rent. That freedom made him unusual in Parisian art circles.

Bazille became interested in painting after seeing works by Eugène Delacroix, but his family insisted he also study medicine to ensure financial independence. By 1859, he had begun taking drawing and painting classes at the Musée Fabre in Montpellier with local sculptors Joseph and Auguste Baussa.

In 1862, Bazille moved to Paris ostensibly to continue his medical studies, but he also enrolled as a painting student in Charles Gleyre’s studio. There he met three fellow students who would become close friends and collaborators: Claude Monet, Pierre-Auguste Renoir, and Alfred Sisley. He soon became part of a group of artists and writers that also included Édouard Manet and Émile Zola. After failing his medical exam (perhaps intentionally) in 1864, Bazille began painting full-time.

Bazille used his money to help everyone in his circle. He rented large studios in Paris, where friends who couldn’t afford space of their own painted and slept. He bought their finished canvases when no one else would. To Manet, Renoir, Monet, and Sisley, Bazille was not just a colleague but a lifeline. Without him, some of the paintings we now consider cornerstones of Impressionism might never have been finished.

Experiments With Light

What makes Bazille more than a wealthy patron is his own work as an artist. He was fascinated by how sunlight transformed color and how outdoor settings could frame the human figure. Long before the Impressionists formally broke with mainstream French art, Bazille was exploring these themes.  In The Pink Dress (1864), he painted his cousin on a terrace overlooking the countryside, her figure half-lost in shadow, half-caught by light.  In Family Reunion (1867), he executed a technically difficult group portrait outside, with natural sunlight revealing the folds of dresses and the textures of grass. In The Studio on the Rue de la Condamine (1870), Bazille turned his brush on his own circle, capturing Manet, Monet, Renoir, and Zola in a collective portrait of the avant-garde.

His style was less free than Monet’s and more deliberate than Renoir’s, but the suggestion of Impressionism is unmistakable. He was bridging the academic precision of his training with the looser brushwork of the new school.

Bazille exhibited at the Salon in Paris in 1866 and 1868.  The Salon was the most prestigious and conservative art exhibition in France. These official exhibitions became increasingly controversial as they repeatedly rejected innovative artists like the Impressionists, leading to the creation of alternative exhibitions such as the famous Salon des Refusés in 1863 and independent Impressionist exhibitions starting in 1874.

A Call to War

When France declared war on Prussia in 1870, Bazille enlisted in the army. He could easily have avoided service — his family connections, his money, and his medical background all gave him options. But he joined a Zouave regiment, an elite infantry unit known for its colorful uniforms and reckless bravery.

On November 28, 1870, at the Battle of Beaune-la-Rolande, Bazille’s commanding officer was struck down. Bazille stepped forward to lead the attack. Within minutes, he too was hit and killed. He never saw the armistice. He never saw the first Impressionist exhibition in 1874. He never saw his friends vindicated by history.

The Spirit of Impressionism

Bazille left fewer than sixty known canvases. That small number alone ensures his reputation will never match Monet’s or Renoir’s. Yet the works he did leave offer glimpses of a painter who might have been one of the movement’s greats. He had both vision and means — a rare combination in the avant-garde world.

Without him, the Impressionists lost not only a friend but also a stabilizing force. Bazille’s studios had been safe havens, his purchases financial lifelines, his company a source of encouragement. Monet and Renoir were devastated by his death, and for years afterward they spoke of him with undiminished grief.

Art historians often speculate about what role he might have played had he lived. Perhaps he would have anchored Impressionism more firmly in the Paris art establishment, or perhaps his money and position would have shielded his friends from years of ridicule. We can only guess.

Remembering Bazille

Today, Bazille’s paintings hang in the Musée d’Orsay in Paris and the Musée Fabre in Montpellier, and in the United States at the National Gallery of Art, the Metropolitan Museum of Art, and the Art Institute of Chicago. To see them is to feel both promise and loss. His canvases are alive with sunlight and color, and they hint at the career that never was.

Bazille reminds us that history is shaped not just by the titans who endure but also by the voices cut short. Impressionism survived and flourished without him, but it was poorer for his absence. In a way, every bright patch of sunlight in Monet’s gardens or flashing dress in Renoir’s dance halls carries a trace of the young man who painted light before the movement even had a name — and who never lived to see it shine.

Bread and Circuses: From Ancient Rome to Modern America

“Already long ago, from when we sold our vote to no man, the People have abdicated our duties; for the People who once upon a time handed out military command, high civil office, legions — everything, now restrains itself and anxiously desires for just two things: bread and circuses.”

Nearly 2,000 years ago, Roman satirist Juvenal penned one of history’s most enduring political observations: “Two things only the people anxiously desire — bread and circuses.” Writing around 100 CE in his Satire X, Juvenal wasn’t celebrating this phenomenon—he was lamenting it. The poet watched as Roman citizens traded their political engagement for free grain and spectacular entertainment, becoming passive spectators rather than active participants in civic life. The phrase has endured for nearly two millennia as shorthand for a troubling political dynamic: entertainment and consumption replacing civic engagement and accountability.

The Roman Warning

Juvenal’s critique came at a pivotal moment in Roman history. The republic had collapsed, and emperors like Augustus had systematically dismantled democratic institutions. Rather than revolt, Roman citizens seemed content as long as the government provided basic sustenance (the grain dole called annona) and elaborate spectacles at venues like the Colosseum. Political participation withered as people focused on immediate pleasures rather than long-term civic responsibilities.

The strategy worked brilliantly for Roman rulers. Keep the masses fed and entertained, and they won’t question your authority or demand meaningful representation. It was political control through distraction—a form of soft authoritarianism that maintained order without overt oppression.  The policy was effective in the short term—peace in the streets and loyalty to the emperors—but disastrous over time. Rome’s population became disengaged from politics, while real power consolidated in the hands of a few.

Modern American Parallels

Fast-forward to contemporary America, and Juvenal’s observation feels uncomfortably relevant. While we don’t have gladiatorial games, we do have our own version of “circuses”—professional sports, reality TV, social media feeds, and celebrity culture that dominate public attention. These aren’t inherently problematic, but they become concerning when they crowd out civic engagement.

Our modern “bread” takes various forms: government assistance programs, subsidies, and economic policies designed to maintain consumer spending. We are saturated with cheap goods, instant delivery services, and mass consumerism. For many, economic struggles are temporarily softened by accessible consumption, from fast food to online shopping. Yet material comfort often masks deeper inequalities and systemic challenges—wage stagnation, healthcare costs, and mounting national debt. These programs often serve legitimate purposes, but they can also function as political tools to maintain public satisfaction and suppress dissent.

Consider how political campaigns increasingly focus on entertainment value rather than substantive policy debates. Politicians hire social media managers and appear on talk shows, understanding that capturing attention often matters more than presenting coherent governance plans. Meanwhile, voter turnout for local elections—where citizens have the most direct impact—remains dismally low.

The Distraction Economy

Perhaps most striking is how our information landscape mirrors Roman spectacles. We’re bombarded with sensational news, viral content, and manufactured controversies that generate strong emotional reactions but little productive action. Complex policy issues get reduced to soundbites and memes, making genuine democratic deliberation increasingly difficult.

Social media algorithms are specifically optimized for engagement, not enlightenment. They feed us content designed to provoke reactions—anger, outrage, schadenfreude—rather than encourage thoughtful consideration of difficult issues. This creates a population that feels politically engaged through constant consumption of political content while remaining largely passive in actual civic participation.

The danger of “bread and circuses” in modern America lies in apathy. When civic participation declines, voter turnout falls, and policy debates get reduced to simplistic slogans, elites face less scrutiny. The result is a weakened democracy, vulnerable to manipulation and short-term thinking.

Breaking the Cycle

Juvenal’s warning doesn’t mean we should abandon entertainment or social programs. Rather, it suggests we need intentional balance. Democratic societies thrive when citizens remain actively engaged in governance beyond just voting every few years.

This means staying informed about local issues, attending town halls, contacting representatives, and participating in community organizations. It means choosing substance over spectacle and long-term thinking over immediate gratification.

The Roman Republic fell partly because its citizens stopped paying attention to governance. Juvenal’s “bread and circuses” reminds us that democracy requires constant vigilance—and that comfortable distraction can be freedom’s most seductive enemy.
