
The Eagle, Globe, and Anchor

How the Marine Corps Found Its Symbol

Few military emblems carry as much history and pride as the Marine Corps’ Eagle, Globe, and Anchor, better known as the EGA or simply as the emblem. New recruits and officer candidates work intensely to earn the right to wear this symbol. It is a source of immense pride for every Marine who achieves that distinction.

When I entered the Corps, I encountered World War II veterans who affectionately called the EGA the “Birdie on the Ball.” But only Marines can take such liberties—outsiders risk giving offense if they use the term.

The emblem is instantly recognizable, yet few realize its deep historical roots or appreciate the transformations it has undergone to become the symbol every Marine wears today.

From Anchors to Eagles: The Early Years (1776–1868)

At its inception in 1776, the Continental Marines lacked any formal insignia. Some Marines, predominantly officers, adopted maritime icons such as the fouled anchor—an anchor entwined with rope—often emblazoned on buttons or hat plates. This design echoed the British Royal Navy and underscored their naval identity, but it was never standardized.

Uniform innovations began in the early 1800s. By 1804, Marines were using brass eagles mounted on square plates. During the War of 1812, octagonal plates appeared, embossed with eagles, anchors, drums, shields, and flags. Later designs were simplified to feature metal letters “U.S.M.” (United States Marines), reflecting the shift towards a national identity.

The example below is an officer’s coat button circa 1805-1820.

A more distinctive step came in 1821: the Corps adopted an eagle perched on a fouled anchor encircled by 13 stars, a motif featured on buttons for nearly four decades. However, similar symbols were also used by the Army and Navy, making it less than unique.

Following the Civil War, Marine Corps leadership under Brigadier General Jacob Zeilin, the seventh Commandant, sought a truly unique insignia for the service.

The Zeilin Board and the Birth of the Modern EGA (1868)

On November 12, 1868, Zeilin established a board of officers “To decide and report upon the various devices of cap ornaments of the Marine Corps.” They wasted no time: by November 19, the Secretary of the Navy, Gideon Welles, had approved the new emblem.

The board drew inspiration from the British Royal Marines’ “Globe and Laurel” emblem.

The American version added a few important touches:

  • Globe showing the Western Hemisphere: Representing the Corps’ defense of the Americas and a global presence.
  • Fouled anchor: Honoring the Corps’ naval origins.
  • Eagle: Symbolizing national service and pride.

Zeilin described the new emblem as representing the Corps’ “readiness to serve anywhere, by sea or land.”

At the same time, a distinct emblem was also created for Marine Corps musicians, still seen today on the formal red and gold uniforms of the U.S. Marine Band—“The President’s Own”.

The Motto and Later Refinements

The Latin motto, Semper Fidelis (“Always Faithful”), was introduced in 1883 under Commandant Charles McCawley, replacing previous mottoes such as Fortitudine (“With Fortitude”). Semper Fidelis became central to the Marine Corps’ ethos.

The emblem saw many variations over the decades. Initial designs featured a crested eagle—borrowed from European heraldry. Semper Fidelis appeared on a scroll held in the eagle’s beak on some versions of the emblem.

Only in 1954, with President Dwight D. Eisenhower’s Executive Order 10538, did the American bald eagle with a scroll officially become part of the emblem. This finalized the design used today.

Officer and Enlisted Differences

Since 1868, design distinctions have marked officer and enlisted EGA emblems. Officers’ original emblems were elaborate—frosted silver hemispheres with gold-plated Americas, crowned by a solid silver eagle. Enlisted emblems were brass, emphasizing practicality.

Modern officers wear a multi-piece, high-relief insignia with fine rope detailing, while enlisted Marines use a one-piece emblem. Notably, officers’ globes omit Cuba to strengthen the emblem structurally.  A running joke among enlisted personnel is that officers couldn’t find Cuba on a map.

Before WWII, officers often purchased insignia from jewelers like Bailey, Banks & Biddle, resulting in stylistic inconsistency. One museum curator quipped, “World War I eagles looked like fat turkeys.” Eventually, standardization brought the crisp, clean look seen today.

A Legacy That Endures

From 18th-century anchors to the refined Eagle, Globe, and Anchor of today, the emblem tracks the Corps’ evolution from shipboard security to a global expeditionary force. Over centuries, its form has varied—engraved by jewelers, stamped for wartime, and cast in silver for dress blues—but its meaning remains constant.

Every Marine who earns the EGA joins a tradition stretching back 250 years, defined by courage, loyalty, and the enduring promise to remain Always Faithful.

The Real Enemy of the Revolution: Disease

When you think about the American Revolution, you probably picture dramatic battles like Bunker Hill or the crossing of the Delaware. But here’s something that might surprise you: the biggest killer during the war wasn’t British muskets—it was disease. And it’s not even close.

The Numbers Tell a Grim Story

Let’s talk numbers for a second. On the American side, about 6,800 soldiers died from battlefield wounds. Sounds terrible, right? Well, disease killed an estimated 17,000 to 20,000. That’s roughly two and a half to three times as many. The British and their Hessian allies faced similar odds: around 7,000 combat deaths versus 15,000 to 25,000 disease deaths.

Think about that for a moment. You were actually safer charging into battle than hanging around camp. In some regiments, disease wiped out more than a third of the troops before they even saw their first fight.

Why Was Disease So Deadly?

Picture yourself in a Revolutionary War military camp. Hundreds of men crammed together in makeshift shelters, no running water, primitive latrines dug too close to where everyone lives, and basically zero understanding of what we’d call “germ theory” today. It’s a perfect storm for infectious disease.

The big killers were:

Smallpox was the heavyweight champion of camp diseases. This virus killed about 30% of people it infected and spread like wildfire through packed military camps. Soldiers tried to protect themselves through a risky practice called inoculation—basically giving themselves a mild case of smallpox on purpose by rubbing infected pus into cuts on their skin. Without proper quarantine procedures, though, this sometimes made outbreaks worse instead of better.

Typhus (called “camp fever” back then) spread through lice and fleas. If you’ve ever been on a prolonged camping trip and felt gross after a few days, imagine that times a hundred. Soldiers lived in the same clothes for weeks, rarely bathed, and the parasites just had a field day. The raging fever, pounding headaches, and rash that came with typhus made it one of the most dreaded camp diseases.

Dysentery (charmingly nicknamed “bloody flux”) came from contaminated water and poor sanitation. When your latrine is 20 feet from your water source and you don’t understand how disease spreads, this becomes pretty much inevitable. The severe diarrhea weakened soldiers to the point where many couldn’t fight even if they wanted to, and it left them even more susceptible to other diseases.

Malaria was especially important in the South, where mosquitoes thrived in the humid climate. This one actually played a fascinating role in how the war ended—but more on that in a bit.

When Disease Changed Everything

The 1776 invasion of Canada was a disaster largely because of smallpox. Out of 3,200 American soldiers in the Quebec campaign, 1,200 fell sick. You can’t mount much of an offensive when more than a third of your army is flat on their backs with fever. Similarly, during the siege of Boston, Washington couldn’t effectively engage the British because so many of his troops were sick with smallpox. These weren’t just setbacks—they were strategic catastrophes.

This is what pushed George Washington to make one of his boldest decisions in 1777: he ordered a mass inoculation of the Continental Army. This was controversial and dangerous at the time, but it worked. Washington had survived smallpox himself as a young man, so he understood both the risks and the benefits. The inoculation program probably saved the army from complete collapse.

Medical “Treatment” Was Often Worse Than Nothing

Here’s where things get really grim. Eighteenth-century medicine was basically medieval. Doctors believed in “balancing the humors” through bloodletting—literally draining blood from already weakened soldiers. They also gave powerful laxatives to people who were already suffering from diarrhea. Yeah, let that sink in.

Pain relief meant opium-based drinks or just straight alcohol. Some doctors used herbal remedies, but results were inconsistent at best. Quinine helped with malaria, though nobody really understood why. Mostly, if you got seriously sick, your survival came down to luck and a strong constitution.

Valley Forge: The Turning Point

Valley Forge is famous for being a brutal winter encampment, and disease was a huge part of why it was so terrible. Scabies left nearly half the troops unable to serve. Dysentery and camp fever killed somewhere between 1,700 and 2,000 soldiers during that single winter—and remember, these weren’t battle casualties. These men died from preventable diseases in what was supposed to be a safe encampment.

But Valley Forge taught the Continental Army a crucial lesson. After that nightmare winter, military leaders started taking sanitation seriously. They began focusing on camp hygiene, protecting water supplies, placing latrines away from living areas, and making sure soldiers could bathe and wash their clothes and bedding.

Baron von Steuben is famous for teaching the Continental Army how to march and drill, but he also deserves credit for implementing serious sanitation reforms. These changes helped prevent future disease outbreaks and kept the army functional for the rest of the war.

The Secret Weapon at Yorktown

Here’s one of my favorite historical details: mosquitoes may have helped win American independence. At Yorktown, roughly 30% of Cornwallis’s British army was knocked out by malaria and other diseases during the siege. The British commander was trying to hold off the American and French forces while also dealing with the fact that almost a third of his troops were too sick to fight.

Many American soldiers from the southern colonies had grown up with malaria and had some partial immunity. The British? Not so much. Some historians even think Cornwallis himself might have been suffering from malaria, which could have affected his decision-making. His second-in-command, Brigadier General Charles O’Hara, was definitely seriously ill during the siege. Fighting a war while you can barely stand is a pretty significant handicap.

The Bigger Picture

The American Revolution shows us something important: wars aren’t just won on battlefields. They’re won by the side that can keep its soldiers alive and healthy. Disease shaped strategic decisions, determined the outcomes of campaigns, and killed far more men than any British regiment ever did.

Washington’s decision to inoculate the army was genuinely revolutionary (pun intended). It showed a willingness to embrace controversial medical practices for the greater good. The sanitation reforms that came out of Valley Forge laid groundwork for modern military medicine and influenced public health policies in the new United States.

So next time someone mentions the American Revolution, remember: while we celebrate the military victories, one of the most important battles was fought against an enemy you couldn’t see—and for most of the war, nobody really knew how to fight it.

The casualty figures and major disease outbreaks are well-documented in historical records. The specific percentages and numbers are estimates based on historical research, as precise record-keeping was limited during this period. The overall narrative about disease being the primary cause of death is strongly supported by multiple historical sources.

How A Nobel Laureate Thinks We Can Save The American Economy…But It Won’t Be Easy

I just finished People, Power, and Profits by Joseph Stiglitz — the Nobel Prize-winning economist. He wrote this near the end of Trump’s first term, but honestly, the world he describes feels even more relevant now.

Stiglitz doesn’t sugarcoat it: capitalism, as we’re practicing it today, is broken. Monopolies dominate markets, inequality has gone wild, and trust in democracy is running on fumes. His proposed fix? Something he calls progressive capitalism — capitalism with guardrails, conscience, and a sense of fairness.

Stiglitz makes the case that our economic system is rigged — not by accident, but by design. Here are his most compelling arguments and what he thinks we should do about them.

1. Taxation and Rent-Seeking: The Rigged Game

Stiglitz draws a sharp distinction between making money through productive work and extracting it through what economists call “rent-seeking” – essentially, using power to skim wealth without creating value. Think of a pharmaceutical company that buys a drug patent and jacks up prices 5,000%, or telecom monopolies that divide up markets to avoid competing.

His argument is straightforward: our tax system rewards the wrong behavior. Capital gains are taxed at lower rates than wages, which means someone living off investments pays less than someone working a regular job. Meanwhile, the wealthy can afford armies of accountants to exploit loopholes that most people don’t even know exist.

What Stiglitz recommends: Tax wealth more aggressively, especially inherited wealth. Close the capital gains loophole. Tax rent-seeking activities heavily while reducing taxes on productive work and innovation. The goal isn’t just revenue – it’s changing incentives so that the path to riches runs through creating value, not extracting it.

2. Green Energy and the True Cost of Pollution

Here’s where Stiglitz gets into what economists call “externalities” – costs that businesses impose on society without paying for them. When a coal plant spews carbon into the atmosphere, we all pay through climate change and increased healthcare costs, but the plant’s balance sheet looks great.

Stiglitz argues this is fundamentally dishonest accounting. If we properly priced pollution and carbon emissions, green energy wouldn’t need subsidies to compete – fossil fuels would suddenly look much more expensive once you factor in their real costs to society.

His recommendation: Implement carbon pricing – either through a carbon tax or cap-and-trade system. Make polluters pay for the damage they cause. This isn’t about punishing business; it’s about honest accounting. Once prices reflect reality, the market will naturally shift toward cleaner energy because it’s actually cheaper when you account for all the costs.

3. Big Business and Big Banks: Concentration of Power

Stiglitz has been particularly vocal about how corporate consolidation hurts everyone except shareholders and executives. His critique of “too big to fail” is sharp. He argues that concentrated economic power — in tech, finance, and even agriculture — undermines both democracy and efficiency. When a few firms dominate markets, they can suppress wages, block innovation, and bend regulations in their favor, gaining power over prices, markets, and even politics.

The banking sector especially concerns him. After the 2008 financial crisis, which was caused largely by reckless behavior at major banks, these same institutions emerged even larger through government-facilitated mergers. We allowed them to socialize their losses, spreading them across taxpayers, while keeping their gains as private profits.

His recommendations: Reinstate and strengthen regulations that were stripped away, including bringing back something like the Glass-Steagall Act that separated commercial and investment banking. Break up banks that are “too big to fail.” Strengthen antitrust enforcement across all industries. Use the government’s regulatory power to promote competition rather than letting industry consolidate.

4. Money in Politics: The Feedback Loop

This is where everything connects for Stiglitz. Concentrated economic power translates directly into political power. Wealthy interests fund campaigns, lobby relentlessly, and effectively write regulations for the agencies that are supposed to oversee them. This creates a vicious cycle: economic inequality begets political inequality, which creates policies that worsen economic inequality.

Stiglitz argues that the Supreme Court’s Citizens United decision, which allowed unlimited corporate spending in elections, turbocharged this problem by treating money as speech and corporations as people.

His recommendations: Limit campaign spending and institute public financing of campaigns to reduce candidates’ dependence on wealthy donors. Place strict limits on lobbying and implement a robust “revolving door” policy that prevents government officials from immediately cashing in with the industries they regulated. Mandate transparency requirements so voters know who’s funding what. Pass Constitutional amendments if necessary to overturn Citizens United.

The Big Picture

What makes Stiglitz’s argument powerful is how these pieces fit together. You can’t fix inequality just through taxation if big corporations control the political process. You can’t address climate change if fossil fuel companies can buy enough influence to block action. Everything is connected.

His recommendations aren’t radical in historical terms – they’re actually trying to restore a balance that existed during the post-war economic boom of the 1950s.  Stiglitz’s “progressive capitalism” isn’t socialism. It’s capitalism with a conscience — one that remembers who it’s supposed to serve.

Whether you see that as a rescue plan or a recipe for red tape depends entirely on where you put your faith: in public institutions or private markets. The question is whether we have the political will to implement his recommendations despite entrenched opposition from those benefiting from the current system.

Either way, this debate isn’t going away — it’s the one shaping the 21st-century economy.

Pistols at Dawn: The Rise and Fall of the Code Duello

Not long ago I was watching a news show and one of the panelists started talking about “a duel of words” that went on in a congressional hearing. I was intrigued by the use of the word duel and I thought I’d look into the history of this strange custom.

In the age before Twitter feuds, internet trolling, and legal settlements, honor was defended with pistols at dawn. The Code Duello, a set of rules governing dueling, offers a fascinating glimpse into how ideas of masculinity, reputation, and justice shaped public and private life in the Anglo-American world from the mid-18th century through the antebellum era.

The Code Duello emerged as one of the most distinctive and controversial aspects of genteel culture in the American colonies and the early United States. This elaborate system of honor-based combat, imported from European aristocratic traditions, would profoundly shape American society between 1750 and 1860, creating a culture where personal honor often trumped legal authority and where violence became a sanctioned means of dispute resolution among the elite.

European Origins 

The Code Duello originated in Renaissance Italy and spread throughout European aristocratic circles as a means of settling disputes while maintaining social hierarchy. The practice reached the American colonies through British and Continental European settlers who brought with them deeply ingrained notions of honor, reputation, and gentlemanly conduct. Unlike random violence or brawling, dueling operated under strict protocols that emphasized courage, skill, and adherence to prescribed rituals.

The most influential codification was the Irish Code Duello of 1777, written by gentlemen of Tipperary and Galway. This twenty-six-rule system established procedures for issuing challenges, selecting weapons, determining conditions of combat, and defining acceptable outcomes. The code emphasized that dueling was a privilege of gentlemen, requiring both participants to be of equal social standing and ensuring that honor could only be satisfied through formal, regulated combat.

Colonial Implementation and Adaptation

The first recorded American duel occurred in 1621 in Plymouth, Massachusetts, between two servants, but the practice soon became the exclusive domain of elites as only “gentlemen” were considered to possess honor worth defending in this way.

The Irish Code Duello was widely adopted in America, though often with local variations. In 1838, South Carolina Governor John Lyde Wilson published an “Americanized” version, known as the Wilson Code, which further codified the practice for the southern states and attempted to increase negotiated settlements. These codes served as the de facto law of honor, even as formal legal systems struggled to suppress dueling.

The practice gained prominence in the hierarchy of southern plantation society, as dueling fit well with its emphasis on personal honor. The ritual was highly formal: challenges were issued in writing, seconds (assistants to the duelists) attempted mediation, weapons were chosen, and terms were carefully negotiated.

Colonial dueling adapted European practices to American circumstances. While European duels often involved swords, reflecting centuries of aristocratic martial tradition, American duelists increasingly favored pistols, which were more readily available and required less specialized training. This shift democratized dueling to some extent, as pistol proficiency was more easily acquired than swordsmanship, though the practice remained largely restricted to the upper classes.

The Revolutionary War significantly expanded dueling’s influence. Military service brought together men from different regions and social backgrounds, spreading dueling customs beyond their original geographic and social boundaries. Officers who had learned European military traditions during the conflict carried these practices into civilian life, establishing dueling as a marker of martial virtue and gentlemanly status.

The Early Republic

Following independence, dueling became increasingly institutionalized in American society.  The young republic’s political culture, characterized by intense partisan conflict and personal attacks in newspapers, created numerous opportunities for perceived slights to honor that demanded satisfaction through combat.

The most famous American duel occurred in 1804 when Aaron Burr killed Alexander Hamilton at Weehawken, New Jersey. This encounter exemplified both the power and the contradictions of dueling culture. Hamilton, despite philosophical opposition to dueling, felt compelled to accept Burr’s challenge to maintain his political viability. The duel’s outcome effectively ended Burr’s political career and demonstrated how adherence to the code could destroy the very honor it purported to defend.

Prior to becoming president, Andrew Jackson took part in at least three duels, although he is rumored to have been in many more. In his most famous duel, Jackson shot and killed a man who had insulted his wife. Jackson was also wounded in the duel and carried the bullet in his chest for the rest of his life.

Political dueling reached epidemic proportions in the antebellum period. Congressional representatives, senators, and other public figures regularly challenged opponents to combat over policy disagreements or personal insults. The practice became so common that some politicians deliberately provoked duels to enhance their reputation for courage, while others saw dueling as essential to maintaining credibility in public life.

Regional Variations and Social Dynamics

Dueling culture varied significantly across regions. The South developed the most elaborate and persistent dueling traditions, where the practice became intimately connected with concepts of honor, masculinity, and social hierarchy that would later influence Confederate military culture. Southern dueling codes often emphasized elaborate rituals and multiple exchanges of fire, reflecting a culture that viewed honor as more important than life itself.

Northern attitudes toward dueling were more ambivalent. While many Northern elites participated in dueling, the practice faced stronger opposition from religious groups, legal authorities, and emerging middle-class values that emphasized commerce over honor. Anti-dueling societies formed in several Northern cities, and some states enacted specific anti-dueling legislation, though enforcement remained inconsistent.

Laws against dueling had been passed in several colonies as early as the mid-18th century, with harsh penalties including denial of Christian burial for duelists killed in combat. Clergy denounced the practice as un-Christian, and reformers sought to eradicate it, but it persisted, especially in regions where courts were weak or social hierarchies unstable. The South, with its less institutionalized markets and governance, saw dueling as a quicker, more reliable way to settle disputes.

Western frontier regions adapted dueling to their own circumstances, often emphasizing practical marksmanship over elaborate ceremony. Frontier dueling tended to be less formal than Eastern practices, but it served similar functions in establishing social hierarchies and resolving disputes in areas where legal institutions remained weak.

Decline and Legacy

By the 1850s, dueling faced increasing opposition from legal, religious, and social reform movements. The rise of professional journalism, which could destroy reputations without resort to violence, provided alternative means of defending honor. Changing economic conditions that emphasized commercial success over martial virtue gradually undermined dueling’s social foundations.

The Civil War marked dueling’s effective end as a significant social institution. The massive scale of organized violence made individual combat seem anachronistic, while post-war society increasingly emphasized industrial progress over aristocratic honor. Though isolated duels continued into the 1870s, the practice lost its central role in American elite culture.

The Code Duello’s legacy extended far beyond its formal practice. It established patterns of violence, honor, and masculine identity that would influence American culture for generations, contributing to regional differences in attitudes toward violence and honor that persist today. The code’s emphasis on individual resolution of disputes also reflected broader American skepticism toward institutional authority, helping shape a culture that often preferred private justice to public law.

How the Code Duello Shaped Western Gunfighting Culture

The Code Duello was a script for settling personal disputes through controlled violence. Its influence waned in the East by the mid-1800s, but many of its ideas persisted, especially among military veterans, Southern transplants, and frontiersmen. As the American frontier expanded, the ethic of “settling scores” through personal combat found fertile ground in the West. What changed was the style and setting.

From Pistols at Dawn to High Noon

In the Code Duello, challenges were typically issued in writing, often with formal language and designated seconds. A duel was planned, often days in advance, and fought with flintlock pistols or swords. By contrast, gunfights in the Old West were more spontaneous, often provoked by insults, cheating, or long-standing feuds. Still, both forms were ultimately about defending personal honor in public view.

Gunfighters like Wild Bill Hickok and Wyatt Earp became mythologized partly because they embodied an honor-based culture in an environment where the law was weak or slow. In many ways, the Western gunfight was an informal, democratized version of the Code Duello, stripped of its aristocratic pretenses but keeping its emotional and symbolic core.

Myth vs. Reality

Ironically, formal duels were relatively rare in the actual Old West, and many “gunfights” were closer to ambushes or drunken brawls than ritualized combat. But dime novels, Wild West shows, and later Hollywood films reimagined them using a Code Duello-like template: two men meet face to face, in broad daylight, to resolve a conflict through a test of nerve and skill. The image of the high-noon shootout—with a silent crowd, an agreed time and place, and an implied code of fairness—is the Code Duello in cowboy boots, but it likely never existed.

The Duel That Never Was

I will end the discussion of the Code Duello with what may be one of the most unusual of all American dueling stories.

In 1842, Abraham Lincoln became embroiled in a public dispute with James Shields, the auditor of Illinois, largely over Illinois state banking policy and some satirical letters that mocked Shields. Shields took great offense at these attacks—particularly the ones written by Lincoln under the pseudonym “Rebecca”—and formally challenged Lincoln to a duel. According to the rules of dueling, Lincoln, as the one challenged, had the right to choose the weapons. He selected cavalry broadswords of the largest size to take advantage of his own height and reach over Shields.

The Duel’s Outcome

The duel was scheduled for September 22, 1842, on Bloody Island, a sandbar in the Mississippi River near Alton, Illinois—chosen because the island fell under Missouri jurisdiction, where dueling was still legal. On the day of the duel, before any blood was shed, Lincoln dramatically demonstrated his advantage by slicing off a high tree branch with his broadsword, showcasing his reach and physical prowess. After witnessing this, and following negotiations by their seconds, Shields and Lincoln called off the duel, resolving their differences without violence.

Legacy

Although the duel never resulted in violence, it became a notorious episode in Lincoln’s life, one he rarely spoke of later, even when asked about it.  The event is commonly cited as a reflection of Lincoln’s quick wit, physical presence, and preference for peaceful resolution when possible.  While Abraham Lincoln never actually fought a duel, he was briefly a participant in one of the more colorful near-duels of American political history.

A Final Thought

Perhaps the world would be a better place if we reinstituted some elements of the Code Duello and, instead of sending armies off to fight bloody battles, national leaders settled disputes by individual combat. I suspect there would be many more negotiated settlements.

Powdered Wigs and Politics: The Rise and Fall of America’s Most Distinguished Hair Trend

I’ve been spending a lot of time recently researching and writing about the 250th anniversary of the American Revolution, and I keep asking myself, “What’s up with the wigs?” Have you ever wondered why the Founding Fathers look so impossibly fancy in their portraits? Well, you can thank a French king and a syphilis epidemic. The elaborate wigs worn by early American leaders weren’t just fashion statements—they were complex social symbols that said everything about who you were, what you could afford, and how seriously you wanted to be taken.

Where It All Started

The wig craze didn’t begin in America. It started across the Atlantic when France’s King Louis XIII went bald prematurely in the 1600s and decided to cover it up with a wig. But it was his son, Louis XIV, who really kicked things into high gear. When the Sun King started losing his hair, he commissioned elaborate wigs that became the epitome of aristocratic style. European nobility, desperate to emulate French sophistication, quickly followed suit.

The practice also had a less glamorous origin story. Syphilis was rampant in 17th-century Europe, and one of its unfortunate side effects was hair loss. Wigs conveniently covered up this telltale symptom while also hiding the sores and blemishes that came with the disease.

Europe in the 1600s and 1700s also had frequent outbreaks of lice and other parasites. Shaving one’s natural hair short and wearing a wig—which could be cleaned, boiled, or deloused more easily—became a practical solution. Powdering helped keep wigs fresh and masked odors.

By the time the fashion crossed the ocean to colonial America in the early 1700s, wigs had become standard attire for anyone with social pretensions.

Status on Your Head

In colonial America, your wig announced your place in society before you even opened your mouth. The most expensive and elaborate wigs featured long, flowing curls that cascaded past the shoulders—these full-bottomed wigs could cost the equivalent of several months’ wages for an average worker. Wealthy merchants, successful plantation owners, and colonial officials wore these statement pieces to project authority and refinement.

Professional men like doctors, lawyers, and clergy typically wore more modest styles. The “tie wig” gathered hair at the back with a ribbon, while the “bob wig” featured shorter hair that ended around the neck. These styles were practical enough for men who actually had to work, but still formal enough to command respect. Even the style of curl mattered—tight curls suggested conservatism and tradition, while looser waves indicated a more progressive outlook.

Working-class men generally couldn’t afford real wigs. Some wore simple caps or went bareheaded, while others might invest in a cheap wig made from horsehair or goat hair for special occasions. The quality difference was obvious—human hair wigs, especially those made from blonde or white hair, were luxury items that only the wealthy could obtain.

Many men who did not wear wigs but still wanted the fashionable look would grow their own hair long, pull it into a queue (ponytail), and powder it. George Washington is a good example — portraits show his natural hair powdered white, not a wig.

The Daily Reality of Wig Life

Maintaining these hairpieces was no joke. Owners had to powder their wigs regularly with starch powder, often scented with lavender or orange, to achieve that distinctive white or gray color that signaled refinement. The powder got everywhere, which is why men often wore special dressing gowns during the powdering process.

Wigs required regular cleaning and restyling by professionals called peruke makers or wigmakers. These craftsmen commanded good money in colonial cities, advertising their services alongside other luxury trades. The hot, humid summers in places like Virginia and South Carolina made wig-wearing particularly miserable, but fashion demanded sacrifice.

The Revolutionary Shift

By the time of the American Revolution, attitudes toward wigs were already changing. The shift happened for several interconnected reasons, and it reflected broader transformations in American society.

First, the Revolutionary War itself promoted practical thinking. Military officers found elaborate wigs impractical in the field, and the democratic ideals of the Revolution made aristocratic European fashions seem pretentious. Many younger revolutionaries, including Thomas Jefferson, stopped wearing wigs as a political statement against Old World affectation.

A young Jefferson with a wig

Second, France—the original source of wig fashion—underwent its own revolution in 1789. As French revolutionaries literally beheaded the aristocracy, powdered wigs became associated with the despised nobility. What had once symbolized sophistication now suggested tyranny and excess.

In Great Britain, Parliament introduced a tax on hair powder in 1795 as part of Prime Minister William Pitt the Younger’s revenue-raising measures. The law required anyone who used hair powder to purchase an annual certificate costing one guinea (a little over $200 in today’s money). This contributed to the growing sense that wigs were an unnecessary extravagance. Meanwhile, changing ideals of masculinity emphasized natural simplicity over artificial ornamentation.

By the early 1800s, the wig had largely disappeared from everyday American life. A new generation of leaders, including Andrew Jackson, proudly displayed their natural hair. The transition happened remarkably quickly—within a single generation, wigs went from essential to absurd. By the 1820s, anyone still wearing a powdered wig looked hopelessly outdated, clinging to a world that no longer existed.

The Legacy

Today, elaborate wigs survive primarily in British courtrooms, where some judges still wear them in formal proceedings—a deliberate echo of legal tradition. The powdered wigs of the Founding Fathers remain iconic, instantly recognizable symbols of early American history, even though the men who wore them were already abandoning the fashion by the time they built the new nation.

Divine Providence and Patriotism

Religion in the Ranks of the Continental Army

The Continental Army that fought for American independence from 1775 to 1783 represented a cross-section of colonial religious life, bringing together men from diverse faith traditions under a common cause. The religious faith of both enlisted soldiers and officers reflected the broader religious landscape of colonial America, and their regional differences contributed to a complex tapestry of faith within the ranks.

The Continental Army drew from a population where religious diversity was already well-established, particularly when compared to European armies of the same period. Protestant denominations were the majority within the ranks, reflecting the colonial religious demographics.

The American Revolution was not only a political and military struggle but also a deeply religious experience for many Continental Army soldiers. Their faith shaped how they interpreted the war, coped with its hardships, and interacted with comrades from diverse backgrounds.

Enlisted soldiers often relied on providentialism, the belief that God directly intervened in daily life, to make sense of battlefield chaos and suffering. Diaries and letters reveal troops attributing survival in skirmishes, unexpected weather shifts, and even mundane events to divine will. For example, many saw the Continental Army’s unlikely battlefield victories as evidence of God’s favor toward their cause.

Diversity in the Ranks

Congregationalists from New England formed a significant portion of the army, bringing with them the Puritan theological tradition that emphasized divine providence and moral responsibility. Ministers in New England frequently preached that resistance to tyranny was a Christian duty, and many soldiers viewed themselves as fighting against tyranny much as their ancestors had fled religious persecution.

Presbyterian soldiers, many of Scots-Irish descent, comprised a substantial group. Concentrated heavily in the Middle Colonies and frontier areas, they carried evangelical influences with them regardless of their home colony. The challenging conditions of frontier life had already created a more individualistic and emotionally intense form of Christianity that adapted well to military service. These soldiers often brought a fatalistic acceptance of divine will combined with fierce determination.

Baptist and Methodist soldiers, though fewer in number, represented growing evangelical movements that would later transform American Christianity. German Reformed and Lutheran soldiers from Pennsylvania added to the religious diversity, while smaller numbers of Catholics, particularly from Maryland and Pennsylvania, served despite facing legal restrictions in many colonies. Even a few Jewish soldiers joined the cause, though their numbers were minimal given the tiny Jewish population in colonial America. The religious pluralism in regiments from the Middle Colonies created a more tolerant atmosphere that foreshadowed the religious diversity of the new nation.  

Quakers were generally pacifists and avoided military service; however, some “Free Quakers” broke from their tradition and joined the Patriot cause. Other pacifist religious groups, such as Mennonites, abstained from combat but occasionally provided non-combatant support.

Anglicans, ironically fighting against their own church’s mother country, served in significant numbers, particularly from the Southern colonies, where the Church of England had frequently been established as a state-supported church.

While religious diversity was generally a unifying force in the Continental Army, instances of religious tension, though relatively limited, were not entirely absent. One significant example occurred early in the war, in November 1775, when some American troops planned to burn an effigy of the Pope on Guy Fawkes Day (also known as Pope’s Day in New England). General Washington strongly condemned this anti-Catholic plan, denouncing it as indecent and lacking common sense.

Washington actively tried to prevent sectarianism from undermining unity in the ranks, whether between Protestants and Catholics or among different Protestant denominations. The overall trend was towards religious tolerance and unity, with religious diversity ultimately contributing positively to the army’s cohesion and morale.

Officer Corps and Religious Leadership

The officer corps of the Continental Army reflected a somewhat different religious profile than the enlisted ranks. Many officers came from the colonial elite and were often Anglican or belonged to more established denominations. However, the Revolution’s anti-episcopal sentiment led many Anglican officers to distance themselves from their church’s political connections while maintaining their basic Christian beliefs.  The relationship between the soldiers and the established Church of England became increasingly strained as the Revolution progressed.

George Washington himself exemplified this complex relationship with religion. Though nominally Anglican, Washington never wrote about his personal faith.  He was likely influenced by Deist philosophy, then popular among Enlightenment thinkers including Jefferson and possibly Franklin. Deism holds that reason and observation of the natural world are sufficient to determine the existence of a Creator, but that this Creator does not intervene in the universe after its creation.

Washington regularly invoked divine providence in his correspondence and orders, understanding the importance of religious sentiment in maintaining morale, even while his personal beliefs remained ambiguous. His famous directive that “the blessing and protection of Heaven are at all times necessary but especially so in times of public distress and danger” reflected his understanding of religion’s role in military leadership.

Other prominent officers brought their own religious convictions to their leadership. Nathanael Greene, the Southern theater commander, was raised as a Quaker but was expelled from the Society of Friends for his military service. Rev. Peter Muhlenberg, a Lutheran minister, left the pulpit and joined the Continental Army, rising to the rank of Major General.  The Marquis de Lafayette, though Catholic, adapted to the predominantly Protestant environment of the American officer corps.

Officers, including Washington, viewed religion as a tool for discipline and unity. Washington mandated Sunday worship. He also appointed chaplains to every brigade, insisting they foster “obedience and subordination”.

Continental Army Chaplain Service

Recognizing the importance of religion to morale and discipline, the Continental Congress authorized the appointment of chaplains to serve with the army. The chaplain system evolved throughout the war, beginning with regimental chaplains and eventually expanding to include brigade and division chaplains for larger organizational units.

Continental Army chaplains faced unique challenges. Unlike European armies with established state churches, American chaplains served religiously diverse units. They needed to provide spiritual comfort to soldiers from different denominational backgrounds while avoiding sectarian conflicts that could undermine unit cohesion. Most chaplains were Protestant ministers, reflecting the army’s composition, but they were expected to serve all soldiers regardless of specific denominational affiliation.

The duties of Continental Army chaplains extended beyond conducting religious services. They often served as informal counselors, helped soldiers write letters home, provided basic education to illiterate soldiers, and sometimes even served as medical assistants. Their moral authority made them valuable in maintaining discipline, and many commanders relied on chaplains to address problems of desertion, drunkenness, and other behavioral issues.

Chaplains also played important roles in significant military events. They conducted prayers before major battles and led thanksgiving services after victories. They provided comfort to the dying and services for the dead.

The British recognized the importance of chaplains to the Continental Army and in some cases offered rewards for their capture.

The famous painting of “The Prayer at Valley Forge” with its image of Washington praying alone in the snow, whether historically accurate or not, represents the type of spiritual leadership chaplains were expected to provide during the army’s darkest moments.

Conclusion

Religion in the Continental Army reflected both the differing and the common aspects of colonial American faith. While denominational variations existed, most soldiers shared basic Christian beliefs that provided comfort during hardship and meaning to their sacrifice. The army’s religious diversity foreshadowed the religious pluralism that would characterize the new American nation, while the chaplain service established precedents for military religious support that continue today. The Revolution’s success owed much to the spiritual resources that sustained these soldiers through eight years of difficult warfare, demonstrating religion’s crucial role in the founding of the American republic.

Always Faithful: A Brief History of the Marine Corps Motto

When I started training as a Marine more than 50 years ago, one of the first things we were taught was the call and response “Semper Fi,” followed quickly by “Do or Die.” But to Marines, Semper Fi—Semper Fidelis, Always Faithful—is more than just a motto. It becomes a personal belief system, a statement of individual integrity, and a way of life: faithful to country, faithful to the Corps, faithful to fellow Marines, faithful to duty.

How did Marines come to adopt this distinctly non-martial motto? Other, more military-sounding mottos and nicknames come to mind: “Devil Dogs,” “First to Fight,” and “Leathernecks.” But Semper Fidelis has become the way Marines see themselves, so much so that their greeting to one another is “Semper Fi.” The same ethos is embodied in an unofficial Marine Corps motto, “No Man Left Behind.”

But what is the origin of this motto that seems to sum up the entire philosophy of the Marine Corps?

The United States Marine Corps is known for its discipline, dedication, and fierce loyalty, qualities that are symbolized by Semper Fidelis. Translated from Latin, the phrase means “Always Faithful.” But like many traditions within the military, the motto is rooted in a rich history that stretches back hundreds of years.

The Marine Corps was established in 1775 as the Continental Marines, but the famous motto did not appear until more than a century later. By the early 19th century, several mottos had been associated with the Marines, including “Fortitudine” (With Fortitude) and “By Sea and by Land.” While these phrases captured elements of the Marines’ mission, they lacked the enduring emotional impact that would ultimately come with Semper Fidelis.

It was in 1883 that the motto was formally adopted under the leadership of the 8th Commandant, Colonel Charles McCawley. Colonel McCawley likely chose that motto because it embodies the values of loyalty, faithfulness and dedication that he believed should define every Marine.  Unfortunately, we will never know his exact reason for choosing this specific motto because he did not leave any documentation about his thought process.  Regardless, from that point on, the motto became inseparable from the identity of the Corps.

The phrase “Semper Fidelis” has much older origins than its Marine Corps adoption. It’s believed to have originated from phrases used by senators in ancient Rome, with the earliest recorded use as a motto dating back to the French town of Abbeville in 1369. The phrase has been used by various European families since the 16th century, and possibly as early as the 13th century.

The earliest recorded military use was by the Duke of Beaufort’s Regiment of Foot, raised in southwestern England in 1685. The motto also has connections to Irish, Scottish, and English nobility, as well as 17th-century European military units, some of whose members may have emigrated to the American colonies in the 1690s.

The choice of the Latin phrase by Colonel McCawley was likely deliberate. Latin carries with it a sense of permanence and tradition, and its concise wording communicated volumes in only two words. “Always Faithful” perfectly captured the bond that must exist between Marines and the responsibilities they shoulder. Marines are expected to remain faithful to the mission, to their comrades in arms, and to the United States, regardless of the personal cost. It is this idea of unshakable fidelity that has come to define what it means to wear the Eagle, Globe, and Anchor.

Since its adoption, Semper Fidelis has carried Marines through every conflict the United States has faced. From the battlefields of World War I, where Marines earned the name “Devil Dogs,” to the grueling island campaigns of the Pacific in World War II, to the frozen battlefields of Korea, to the steaming jungles of Vietnam, Marines have demonstrated again and again what it means to be “Always Faithful.” In modern times, whether in Iraq, Afghanistan, or in humanitarian missions across the globe, this motto continues to serve as a reminder of the Corps’ unwavering commitment.

The phrase has also influenced the broader culture of the Marines, inspiring the title of the official Marine Corps march, “Semper Fidelis,” composed by John Philip Sousa in 1888, which remains a powerful symbol of pride and esprit de corps.

The motto’s meaning extends beyond active service. Marines pride themselves on being “once a Marine, always a Marine,” and Semper Fidelis reflects that lifelong bond. Even after leaving the uniform behind, Marines carry that sense of loyalty into civilian life, honoring the values and traditions of their service. For many, it becomes a central guiding principle throughout their lives. Marine veterans never say “I was a Marine”; they say “I am a Marine.”

In the end, the motto “Semper Fidelis” is far more than a catchy phrase. It is both a promise and a challenge—a pledge of unwavering loyalty and a challenge to live up to the highest standards of duty, honor, and fidelity. When Marines declare “Semper Fi,” they acknowledge not only their devotion to the Marine Corps, but also the unbreakable loyalty that binds them together as brothers and sisters in arms.

The celebration of the 250th anniversary of the signing of the Declaration of Independence is coming up next year on July 4th. But what about the events leading up to this? What about the men and women who helped make this happen? There are events coming up to commemorate the 250th anniversary of the founding of the Continental Navy and the Continental Marines in 1775. We will be holding commemorative celebrations here in West Virginia and there will be a national event in Philadelphia in October of this year.

Why History Still Matters: Lessons for Today’s World

Recently I was reading an article about the poor state of historical knowledge in the United States, and I decided to repost my first article from when I started blogging almost five years ago.  It seems very little has changed.

“Study the past if you would define the future.” —Confucius. I particularly like this quotation. It is similar to the more modern version: those who don’t learn from the past are doomed to repeat it. However, I much prefer the former because it comes in the form of advice or instruction, while the latter is more of a dire warning. Though I suspect, given the current state of the world, a dire warning is in order.

But regardless of whether it comes in the form of advice or warning, people today do not seem to heed the importance of studying the past. The knowledge of history in our country is woeful. The lack of emphasis on the teaching of history in general, and American history in particular, is shameful. While it is tempting to blame this on a lack of interest on the part of the younger generation, I find people my own age also have very little appreciation of the events that shaped our nation, the world, and their lives. Without this understanding, how can we evaluate what is currently happening and understand what we must do to come together as a nation and as a world?

I have always found history to be a fascinating subject. Biographies and nonfiction historical books remain among my favorite reading. In college I always added one or two history courses every semester to raise my grade point average. Even in college I found it strange that many of my friends hated history courses and took only the minimum. At the time, I didn’t realize just how serious this lack of historical perspective was to become.

Several years ago, I became aware of just how little historical knowledge most people possess. At the time, Jay Leno was still doing his late-night show, and he had a segment called “Jaywalking.” During that segment he would ask people on the street questions that were somewhat obscure and to which he could expect to get unusual and generally humorous answers. On one show, on the 4th of July, he asked people, “From what country did the United States declare independence on the 4th of July?” and of course no one knew the answer.

My first response was that he must have gone through dozens of people to find the four or five who did not know the answer to his question. The next day at work, the 5th of July, I decided to ask several people, all of them college graduates, the same question. I got not one single correct answer, although one person at least realized, “I think I should know this.” When I told my wife, a retired teacher, she wasn’t surprised. For a long time, she had been concerned about the lack of emphasis on social studies and the arts in school curriculums. I too was now becoming seriously concerned about the direction of education in our country.

A lot of people are probably thinking, “So what, who really cares what a bunch of dead people did 250 years ago?” But if we don’t know what they did and why they did it, how can we understand its relevance today? We have no way to judge what actions may support the best interests of society and what will ultimately be detrimental.

Failure to learn from and understand the past results in a me-centric view of everything. If you fail to understand how and why things have developed, then you certainly cannot understand what the best course forward will be. Attempting to judge all people and events of the past through your own personal prejudices leads only to continued and worsening conflict.

If you study the past, you will see that there has never been general agreement on anything. There were many disagreements, debates, and even a civil war over differences of opinion. History helps us understand that there are no perfect people who always do everything the right way and at the right time. It helps us appreciate the good that people do while understanding the human weaknesses that led to the things we consider faults today. In other words, we cannot expect anyone to be a 100% perfect person. They may have accomplished many good and meaningful things, and those accomplishments should not be discarded because the person was also a human being with human flaws.

Understanding the past does not mean approving of everything that occurred, but neither does it mean condemning everything that fails to fit twenty-first-century mores. Only by recognizing this, and by seeing what led to the disasters of the past, can we hope to avoid repeating the worst aspects of our history. History teaches lessons in compromise, involvement, and understanding. Ignoring those lessons leads to strident argument and an unwillingness to cooperate with those who differ in even the slightest way. Rather than creating the hoped-for perfect society, it simply produces a new set of problems and a new group of grievances.

In sum, failure to study history is failure to prepare for the future. We owe it to ourselves and to future generations to understand where we came from and to prepare, as best we can, the country and the world we leave to them. They deserve nothing less than a full understanding of the past and a rational way forward.

I’m going to close with a quote I recently came across:

  “Indifference to history isn’t just ignorant, it’s rude.”

                        —David McCullough

Deborah Sampson: A Revolutionary Soldier

In the story of the American Revolution, the names most often remembered are those of the Founding Fathers and battlefield generals. Yet woven through the familiar narrative are lesser-known but extraordinary individuals whose actions defied the norms of their time. One of the most remarkable among them was Deborah Sampson, a Massachusetts woman who disguised herself as a man and served for nearly two years in the Continental Army. Her life reflects not only courage and patriotism, but also the complexity of gender roles in Revolutionary America.

A Difficult Early Life

Deborah Sampson was born in Plympton, Massachusetts, in 1760, the eldest of seven children in a family with deep Pilgrim roots tracing its lineage to Myles Standish and Governor William Bradford. Despite this heritage, her family struggled financially, and her childhood was marked by poverty and abandonment. Her father deserted the family when she was young, leaving her mother with limited resources to care for the children. It was initially thought that he had died at sea, but the family later discovered he had moved to Maine, where he married and raised a second family.

Deborah was still young when her mother died, and she was sent to live with Mary Price Thatcher, a widow then in her 80s, with whom she likely learned to read. After the widow died, Deborah was bound out as an indentured servant to the Thomas family in Middleborough, Massachusetts, where she worked until she turned 18. This experience exposed her to hard physical labor and taught her skills typically associated with men’s work, including farming and carpentry. During this time, she educated herself and developed a keen intellect that would prove invaluable throughout her life.

When her term of indenture ended in 1782, Sampson found herself in a precarious position as a young, unmarried woman with few economic opportunities. She intermittently supported herself as a teacher in the summers and a weaver in the winters.

Enlisting in the Army

The Revolutionary War was still raging, and the Continental Army desperately needed recruits. Motivated by both patriotic fervor and economic necessity, Sampson made the audacious decision to enlist disguised as a man. She first enlisted in 1782 under the name Timothy Thayer and collected a cash enlistment bounty, but she failed to report for duty with her company. She was later recognized as “Thayer” and was required to repay what she had not already spent of the bounty. The civil authorities imposed no further punishment; however, the Baptist Church withdrew its fellowship until she apologized and asked for forgiveness.

She later made a second enlistment, adopting the name Robert Shurtleff (sometimes spelled Shurtlieff or Shirtliff). This time she followed through and reported for duty.

She bound her chest, cut her hair, and donned men’s clothing to complete her transformation.  Sampson’s physical appearance aided her deception. She was tall for a woman of her era, standing nearly six feet, with a lean build and strong constitution developed through years of manual labor. Her lack of facial hair was not unusual among young male recruits, and she successfully passed the initial examination to join the 4th Massachusetts Regiment in May 1782.

The challenge of maintaining her disguise while living in close quarters with other soldiers required constant vigilance. Sampson developed strategies to protect her secret, including volunteering for guard duty to avoid sleeping arrangements that might expose her, and finding private moments to tend to personal needs. She also had to manage the physical demands of military life while dealing with the unique challenges of being a woman in a male-dominated environment.

Sampson’s military career nearly ended when she was wounded during a skirmish. She received a sword cut to her head and was shot in the thigh. Fearing that medical treatment would reveal her true identity, she initially treated her wounds herself, even digging a musket ball out of her own leg with a knife. Some of the shot remained too deep to remove, leaving her with a lifelong disability.

During her military service, Sampson demonstrated exceptional courage and skill as a soldier. She participated in several skirmishes and battles, including engagements near New York City and in Westchester County. Her fellow soldiers respected her for her dedication, marksmanship, and willingness to volunteer for dangerous scouting missions. She proved herself particularly adept at reconnaissance work, using her intelligence and observational skills to gather valuable information about enemy positions and movements.

Discovery and Discharge

During an epidemic in Philadelphia, she fell seriously ill with a fever and was taken to a hospital, where a physician discovered her secret while treating her. Fortunately, the doctor, Barnabas Binney, chose to protect Sampson rather than expose her. He treated her quietly and helped facilitate her honorable discharge from the army. Her commanding officer, General John Paterson, reportedly handled the situation with discretion and respect, recognizing her valuable service to the cause of independence. She was formally discharged by General Henry Knox on October 25, 1783, and was given funds to return home along with a Note of Advice, similar to modern discharge papers.

Life After the War

After the war, Sampson returned to Massachusetts, where she married Benjamin Gannett in 1785 and had three children. But like many veterans, she struggled financially and had difficulty obtaining the military pay and benefits she had earned. In 1792 she successfully petitioned the Massachusetts legislature for back pay, and later, with the help of prominent supporters including Paul Revere, she received a pension from the federal government.

Her story didn’t end with domestic life. She became one of the first women in America to go on a speaking tour, traveling throughout New England and New York to share her experiences. Wearing her military uniform, she delivered a combination of storytelling, dramatic performance of military drills, and patriotic appeal.  These lectures, which began in 1802, were groundbreaking for their time, as respectable women rarely spoke publicly before mixed audiences.

A Lasting Legacy

Deborah Sampson’s legacy extends far beyond her military service. She challenged rigid gender roles and demonstrated that women could serve their country with the same valor and effectiveness as men. Her story inspired future generations of women who sought to break barriers and serve in traditionally male-dominated fields.

After she died in 1827, her story continued to gain recognition. In 1838, her husband was awarded a widow’s pension, possibly the first instance in U.S. history in which that benefit was granted to a man based on his wife’s military service.

She left behind a legacy of courage, determination, and pioneering spirit that continues to resonate today. In 1983, she was declared the Official Heroine of the Commonwealth of Massachusetts, and in 2020, the U.S. House of Representatives passed the Deborah Sampson Act, expanding healthcare and benefits for female veterans. Statues and memorials, including her gravesite in Sharon, Massachusetts, commemorate her contributions. Her wartime exploits have been the subject of books, plays, and scholarly research, and her story continues to inspire generations as a symbol of courage and of the ongoing struggle for gender equality in military service.

She was not the only woman of the Revolution to take up arms; Margaret Corbin and Anna Maria Lane also fought. But Sampson is among the best documented and most celebrated.

Her life represents a crucial chapter in both military history and women’s history, illustrating the complex ways in which the American Revolution created opportunities for individuals to transcend social conventions in service of the greater cause of independence.  Deborah’s journey from indentured servant to Continental Army soldier and national lecturer is a testament to her extraordinary courage and determination. By stepping into a role forbidden to women and excelling under the harshest conditions, she challenged the boundaries of her time and set a precedent for future generations.

Though her wartime exploits may have been exaggerated, a common practice in biographies of the era, her life remains a powerful reminder of the contributions women have made, often unrecognized, to the shaping of American history.

The illustration at the beginning of this post is from The Female Review: Life of Deborah Sampson, the Female Soldier in the War of Revolution (1916), a reprint of the 1797 biography by Herman Mann.  

Declaring Independence: The Origin of America’s Founding Document

When Americans celebrate the Fourth of July, we imagine fireworks, flags, and a dramatic reading of the Declaration of Independence. We think we know the story: the Continental Congress selected Thomas Jefferson to write the declaration, he labored alone to produce the famous document, Congress approved it unanimously, and it was signed on the 4th of July.

But the truth is far more complex. The story behind this iconic document, the how, who, and why of its creation, is just as explosive and illuminating as the day it represents. Far from a spontaneous outburst of rebellion, the Declaration was the product of political strategy, collaborative writing, and a shared sense of urgency among men who knew their words would change the course of history.

Setting the Stage: Why a Declaration?

By the spring of 1776, the American colonies were deep in conflict with Great Britain. Battles at Lexington and Concord had already been fought. George Washington was attempting to transform the Continental Army into a professional fighting force. Thomas Paine’s Common Sense had ignited widespread public support for full separation from the British Crown. The Continental Congress had been meeting in Philadelphia, debating how far they were willing to go. By June, the mood had shifted from reconciliation to revolution.

On June 7, 1776, Richard Henry Lee of Virginia introduced a resolution to the Continental Congress declaring “that these United Colonies are, and of right ought to be, free and independent States.” The motion was controversial—some delegates wanted more time to consult their colonies. But most in Congress knew that if independence was going to happen, it needed to be explained and justified to the world, so they created a committee to draft a formal declaration.

The Committee of Five

On June 11, 1776, the Continental Congress appointed a “Committee of Five” to write the declaration. The members were:

  • Thomas Jefferson of Virginia
  • John Adams of Massachusetts
  • Benjamin Franklin of Pennsylvania
  • Roger Sherman of Connecticut
  • Robert R. Livingston of New York

This was not a random selection. Each man represented a different region of the colonies and had earned the trust of fellow delegates. Jefferson was relatively young but already known for his eloquence. Adams was an outspoken advocate of independence. Franklin brought wisdom, wit, diplomatic experience, and international prestige. Sherman brought New England theological perspectives and legislative experience, while Livingston represented the more moderate New York delegation and brought keen legal insight.

Jefferson Takes the Pen

Although it was a group project on paper, the heavy lifting fell to Thomas Jefferson. The committee chose him to draft the initial version. Why Jefferson? According to John Adams, Jefferson was chosen for three reasons: he was from Virginia (the most influential colony), he was popular, and, Adams admitted, “you can write ten times better than I can.”

Jefferson wrote the draft in a rented room at 700 Market Street in Philadelphia. He leaned heavily on Enlightenment ideas, especially those of John Locke, emphasizing natural rights and the notion that government derives its power from the consent of the governed. He also drew on his own earlier pamphlet, A Summary View of the Rights of British America, and borrowed extensively from George Mason’s Virginia Declaration of Rights.

The Editing Process: Group Work Gets Messy

After Jefferson completed the initial draft (likely by June 28), he shared it with Adams and Franklin. Both men suggested revisions. Franklin, ever the editor, softened some of Jefferson’s sharpest attacks and corrected language for flow and diplomacy. His most famous contribution was changing Jefferson’s phrase “We hold these truths to be sacred and undeniable” to the more secular and philosophically precise “We hold these truths to be self-evident.”  

Adams offered structural suggestions and helped shape the tone. He also contributed to the strategic presentation of grievances against King George III, understanding that the declaration needed to justify revolution in terms acceptable to both colonial readers and potential European allies.

Sherman and Livingston played more limited but still meaningful roles. Sherman, with his theological background, helped ensure the document’s religious references would appeal to Puritan New England, while Livingston’s legal expertise helped refine the constitutional arguments against British rule.  Otherwise, their involvement in the actual content of the declaration was likely minimal.

The revised draft was presented to the full Continental Congress on June 28, 1776. What followed was a few days of intense debate and revision by the entire body.

Congress Takes the Red Pen

From July 1 to July 4, the Continental Congress debated the resolution for independence and edited the Declaration. Jefferson watched as Congress made some 86 changes to his prose and cut about a quarter of the original text, including a lengthy passage condemning King George III for perpetuating the transatlantic slave trade, which would have sparked deep division among the delegates, especially those from Southern colonies.

Other modifications included strengthening the religious language, toning down some of the more inflammatory rhetoric, and making the grievances more specific and legally grounded. Jefferson was reportedly frustrated by the changes, calling them “mutilations,” but he recognized that compromise was the cost of consensus.

Approval and Promulgation

Despite the extensive revisions, the core of Jefferson’s vision remained intact and on July 2, 1776, the Continental Congress voted in favor of Lee’s resolution for independence. That’s the actual date the colonies officially broke from Britain. John Adams even predicted in a letter to his wife that July 2 would be celebrated forever as America’s Independence Day. He was close—but the official adoption of the Declaration came two days later.

On July 4, 1776, Congress formally approved the final version of the Declaration of Independence. Contrary to popular belief, most of the signers did not sign it on that day. Only John Hancock, as president of Congress, and Charles Thomson, as secretary, signed then.   The famous handwritten version, now in the National Archives, wasn’t signed until August 2. But the document approved on July 4 was immediately printed by John Dunlap, the official printer to Congress.

These first copies, known as Dunlap Broadsides, were distributed throughout the colonies and sent to military leaders, state assemblies, and even King George III. George Washington had the Declaration read aloud to the Continental Army. This rapid dissemination was crucial to its impact, rallying public support for the revolutionary cause and explaining the colonies’ actions to the world.

Legacy and Impact

The Declaration wasn’t just a break-up letter to the British Crown—it was a manifesto for a new kind of political order. Its assertion that “all men are created equal” would echo through centuries of American history, invoked by abolitionists, suffragists, civil rights leaders, and more.

The creation of the Declaration of Independence demonstrates that even the most iconic documents in American history emerged from collaborative processes involving compromise, revision, and collective wisdom. While Jefferson deserves primary credit for the document’s eloquent expression of revolutionary ideals, the contributions of his committee colleagues and the broader Continental Congress were essential to creating a text that could unite thirteen diverse colonies in common cause.

This collaborative origin reflects the democratic principles the Declaration itself proclaimed: American independence was achieved not through the vision of a single individual, but through the collective efforts of representatives working together to articulate their shared commitment to liberty, equality, and self-governance.

Today, the Declaration of Independence is enshrined as one of the foundational texts of American democracy. But it’s worth remembering that it was created under immense pressure, forged by committee, and edited by compromise. Its authors knew they were taking a dangerous step. As Franklin quipped at the signing, “We must all hang together, or most assuredly we shall all hang separately.”
