
The Continental Marines: Birth of America’s Amphibious Warriors

When most people think of the American Revolution, they picture Continental soldiers marching across snowy battlefields or patriot militias defending their homes. But there’s another group that played a crucial role in securing American independence: the Continental Marines. These amphibious warriors served in America’s nascent naval force and proved their worth on both land and sea during the eight-year struggle for independence.

The Continental Marines, established in 1775, served as America’s first organized marine force during the Revolutionary War before being disbanded in 1783, laying the foundation for what would eventually become the modern U.S. Marine Corps.  Though short-lived, the original Marine Corps played a significant role in America’s fight for independence, setting precedents that the modern Marine Corps still honors today.

The Legislative Foundation

By the fall of 1775, the American colonies were no longer engaged in mere protest—they were in open rebellion against the British Empire. Battles had already been fought at Lexington, Concord, and Bunker Hill. The Continental Congress, led by figures like John Adams, had begun to organize a Continental Army under George Washington’s command. But many in the Congress, especially Adams, believed a navy was also essential to challenge British power at sea and disrupt its supply lines.

With a navy, it was reasoned, must come Marines—soldiers trained to serve aboard ships, conduct landings, enforce discipline, and fight in close quarters during boarding actions. This model was based on the British Royal Marines, a corps with a long and respected tradition.

The Continental Marines came into existence through a resolution passed by the Second Continental Congress on November 10, 1775. This date, which Marines still celebrate today as their birthday, marked a pivotal moment in American military history.

The Continental Marine Act of 1775 decreed: “That two battalions of Marines be raised consisting of one Colonel, two lieutenant-colonels, two majors and other officers, as usual in other regiments; that they consist of an equal number of privates as with other battalions, that particular care be taken that no persons be appointed to offices, or enlisted into said battalions, but such as are good seamen, or so acquainted with maritime affairs as to be able to serve for and during the present war with Great Britain and the Colonies.”

The legislation was part of Congress’s broader effort to create a Continental Navy capable of challenging British naval supremacy. The resolution was drafted by future U.S. president John Adams and adopted in Philadelphia. This wasn’t just about creating another military unit—Congress recognized that naval warfare required specialized troops who could fight effectively both on ships and on shore. The concept wasn’t entirely new—European navies had long employed marines for similar purposes—but the Continental Marines represented America’s first organized attempt to create a professional amphibious force. (The term “amphibious” didn’t enter military usage until the 1930s; at the time, such troops would simply have been called a naval landing force.)

Recruitment: From Taverns to the Fleet

The recruitment of the Continental Marines has become the stuff of legend, particularly the story of their traditional birthplace at Tun Tavern in Philadelphia. Though legend places its first recruiting post at Tun Tavern, historian Edwin Simmons surmises that it may as likely have been the Conestoga Waggon [sic], a tavern owned by the Nicholas family. Regardless of which tavern served as the primary recruiting station, the Marines can claim the unique distinction of being the only military branch “born in a bar”.

The first Commandant of the Marine Corps was Captain Samuel Nicholas, and his first captain and recruiter was Robert Mullan, the owner of Tun Tavern. Nicholas, a Quaker-born Philadelphia native and experienced mariner, was commissioned on November 28, 1775, becoming the Continental Marines’ senior officer and only commandant throughout their existence. While his background as a Philadelphia tavern keeper may seem unusual for a military leader, his connections in the maritime community proved invaluable for recruiting. The requirement for maritime experience shaped the character of the force from its inception.

The Marines faced immediate recruitment challenges. Originally, Congress envisioned using the Marines for a planned invasion of Nova Scotia.  They expected the Marines to draw personnel from George Washington’s Continental Army.  However, Washington was reluctant to part with his soldiers, forcing the Marines to recruit independently, primarily from the maritime communities of Philadelphia and New York.

By December 1775, Nicholas had raised a battalion of approximately 300 men, organized into five companies, though this fell short of the original plan for two full battalions. Robert Mullan helped to assemble the fledgling fighting force. Plans to form the second battalion were suspended indefinitely after several British regiments of foot and cavalry landed in Nova Scotia, making the planned naval assault impossible.

Organization for Dual Service

The Continental Marines were organized as a flexible force capable of serving both aboard ships and on land. For shipboard service, Marines were organized into small detachments that could be distributed across the Continental Navy’s vessels. Their organization reflected their multi-purpose mission: they served as security forces protecting ship officers, repelling boarders and joining boarding parties during naval engagements, and as assault troops for amphibious operations. Marksmanship received particular emphasis—a tradition that continues to this day—as Marines often served as sharpshooters in naval engagements, targeting enemy officers and sailors from the rigging and fighting tops of ships.

During the Revolutionary War, uniform directives for the Continental Marines specified a green jacket with white facings and cuffs. However, when the first sets of uniforms were actually ordered and delivered, red facings were substituted for white. The likely reason was supply availability: red cloth was easier to obtain from Continental or captured British stores. The most authoritative description comes from Captain Samuel Nicholas, who wrote from Philadelphia in 1776 that Marines were outfitted in “green coats faced with red, and lined with white.”

The uniform also included a high leather collar, or stock, ostensibly to protect the neck against sword slashes, although there is some evidence it may actually have been intended to improve posture. This distinctive uniform item helped establish their identity as an elite force and eventually led to their treasured nickname, “leathernecks.”

Shipboard Service and Naval Operations

The Continental Marines’ role aboard ship was multifaceted and crucial to naval operations. Their most important duty was to serve as onboard security forces, protecting a ship’s captain and his officers. During naval engagements, in addition to manning the cannons alongside the crew, Marine sharpshooters were stationed in the fighting tops of a ship’s masts specifically to pick off the opponent’s officers and crew. These duties reflected centuries of naval tradition and drew on the example of the British Marines.

The Marines’ first major naval operation came in early 1776, when five companies joined Commodore Esek Hopkins’ Continental Navy squadron on its first cruise to the Bahamas. This deployment demonstrated their value as both shipboard security and assault troops, setting the pattern for their service throughout the war.

Major Land-Based Actions

Despite their naval origins, the Continental Marines proved equally effective in land combat. Their most famous early action was the landing at Nassau, on the island of New Providence in the Bahamas, in March 1776. It was the first landing by Marines on a hostile shore. Led by Captain Nicholas, the force of 250 Marines and sailors captured two forts and the Government House, occupied Nassau for 13 days, and seized cannons and large stores of supplies. While they missed capturing the gunpowder stores (which had been evacuated before their arrival), the raid demonstrated American capability to strike British positions anywhere.

Though modest in scale, the operation carried major symbolic weight and established the Marines as America’s premier amphibious force. It did not decisively alter the balance of the war, but it foreshadowed the Marines’ enduring identity as a seafaring, expeditionary force. Today, the Battle of Nassau is remembered less for the supplies seized than for what it represented: the moment the Continental Marines stepped onto the world stage.

Other notable operations included raids on British soil itself. In April 1778, Marines under the command of John Paul Jones made two daring raids, one at the port of Whitehaven in northwest England and the second later that day at St. Mary’s Isle. These operations brought the war directly to British territory, demonstrating American reach and resolve. While the raids had no strategic impact on the outcome of the war, they provided a great morale boost when reports, though largely exaggerated, reached the rebellious colonies.

Official Marine Corps history also acknowledges Marine participation in the Battle of Princeton, though it wasn’t a major Marine engagement. Marines from Captain William Shippen’s company, who had been serving aboard Continental Navy ships, participated in this battle as a part of Cadwalader’s Brigade on Washington’s flank.  Some Marines were detached to augment the artillery, with a few eventually transferring to the army.  However, the Marines’ role was relatively minor compared to their more significant naval actions during this period.

The Gradual Decline

As the Revolutionary War progressed, the Continental Marines faced increasing challenges. Financial constraints plagued the Continental forces throughout the war, and the Marines were no exception. The Continental Congress struggled to fund and supply all military branches, and the relatively small Marine force often found itself at a disadvantage competing for resources with the larger Continental Army and Navy.

Recruitment became increasingly difficult as the war dragged on. After the early campaigns, Nicholas’s battalion discontinued independent service, and the remaining Marines were reassigned to shipboard detachments. Their numbers had been reduced by transfers, desertion, and the loss of eighty Marines to disease.

The Continental Navy also faced severe challenges that directly impacted the Marines. Many ships were captured, destroyed, or sold, leaving Marines without their primary operational platform. As the naval war shifted toward privateering and smaller-scale operations, the need for organized Marine units diminished.

Beginning in February 1777, two companies of Marines either transferred to Morristown to assume roles in the Continental artillery batteries or left the service altogether. This transfer of Marines to army artillery units reflected the practical reality that their specialized skills were needed elsewhere as the Continental forces adapted to changing circumstances.

Disbanded at War’s End

The end of the Revolutionary War marked the end of the Continental Marines as an organized force. Both the Continental Navy and Marines were disbanded in April 1783. Although a few individual Marines briefly stayed on to provide security for the remaining U.S. Navy vessels, the last Continental Marine was discharged in September 1783.

The last official act of the Continental Marines was escorting a stash of French silver crowns from Boston to Philadelphia—a loan from Louis XVI that helped establish the Bank of North America. This final mission symbolically linked the Marines to the new nation’s financial foundations even as their military role ended.

The disbanding reflected broader American attitudes toward standing military forces. Having won their independence, Americans were skeptical of maintaining large military establishments that might threaten republican government. The Continental Congress, facing financial pressures and political opposition to permanent military forces, chose to disband both the Navy and Marines.

Legacy

The Continental Marines’ contribution to American independence was significant despite their small numbers. In all, over seven years of fighting, the Continental Marines had only 49 men killed and 70 more wounded, out of a total force of roughly 130 Marine officers and 2,000 enlisted men. These relatively low casualty figures reflected both their effectiveness and the limited size of the force.

Rising tensions with Revolutionary France in the late 1790s led to the Quasi-War, prompting Congress to reestablish the Navy in 1798. On July 11 of that year, President John Adams signed legislation formally creating the United States Marine Corps as a permanent branch of the military, under the jurisdiction of the Department of the Navy. This new Marine Corps inherited the traditions, mission, and esprit de corps of its Revolutionary War predecessors.  Despite the gap between the disbanding of the Continental Marines and the establishment of the new United States Marine Corps, Marines honor November 10, 1775, as the official founding date of their Corps.

The Continental Marines established precedents that would shape American military doctrine for more than two centuries. The Revolutionary War not only led to the founding of the United States (Continental) Marine Corps but also highlighted for the first time the versatility for which Marines have come to be known. They fought on land, they fought at sea, and they conducted amphibious assaults.

The Continental Marines represented a crucial innovation in American military organization. Born from congressional resolution and tavern recruitment, these maritime warriors proved their worth in battles from the Caribbean to the British Isles. Though disbanded with the war’s end, their legacy lives on in the traditions and spirit of the modern Marine Corps. While their numbers were small and their existence brief, their impact on American military tradition proved lasting and significant.

The Republic of Indian Stream: America’s Forgotten Frontier Nation

Did you know that there was once an independent republic in the farthest reaches of northern New Hampshire, where the dense forests blend into the Canadian wilderness? Neither did I, until I came across it in a fascinating book titled A Brief History of the World in 47 Borders by Jonn Elledge.

It was a short-lived but remarkable experiment in self-government. For three years in the 1830s, the settlers of a disputed border region declared themselves citizens of an independent republic—complete with their own constitution, legislature, and militia. They called it the Republic of Indian Stream, a name that today sounds almost mythical, yet it was a genuine, functioning democracy. Their story blends frontier improvisation, international diplomacy, and Yankee self-reliance—and it leaves us with a curious artifact: a constitution written not by statesmen in Philadelphia, but by farmers, loggers, and merchants caught between two competing nations.

A Territory in Limbo

The roots of the Indian Stream story go back to the Treaty of Paris (1783), which ended the American Revolution. The treaty defined the U.S.–Canada border but used vague geographic language—particularly the phrase “the northwesternmost head of the Connecticut River.” No one could agree which of several small tributaries the treaty meant.

The ambiguity created a slice of wilderness—about 200 square miles—claimed by both the United States and British Lower Canada (now Quebec). For decades, the region existed in a gray zone. Both countries sent tax collectors and law officers, both demanded military service, and neither provided clear legal protection. Residents couldn’t vote, hold secure property titles, or rely on either government’s courts. To make matters worse, they were sometimes forced to pay taxes twice—once to New Hampshire and once to Canada.

Origins of the Republic

By the late 1820s, frustration had reached a boiling point. Attempts to resolve the border dispute were unsuccessful—including an arbitration submitted to the King of the Netherlands in 1827, which failed when the United States rejected his 1831 decision favoring Great Britain.

With both sides still pressing their claims, the settlers decided they’d had enough of outside interference. On July 9, 1832, they convened a local meeting and declared independence, forming the Republic of Indian Stream. Their constitution—modeled on American state constitutions—began with a simple premise: authority rested with “the citizens inhabiting the territory.”

This wasn’t an act of rebellion but one of survival. The settlers wanted peace, order, and local control. Their goal was to withdraw from ambiguous regulation and to create a government that could function until the border question was finally settled.

The Constitution of Indian Stream

The constitution of the Republic, adopted the same day they declared sovereignty, was an impressively crafted document for a community of barely 300 people. It reflected the settlers’ familiarity with republican ideals and their determination to govern themselves fairly.

Key features included:

  • Democratic foundation: All authority stemmed from the citizens.
  • Annual elections: A single House of Representatives made the laws, and a magistrate acted as both executive and judge.
  • Judicial simplicity: Local justices of the peace handled disputes—there were no elaborate court hierarchies.
  • Individual rights: Residents enjoyed protections derived from U.S. constitutions—trial by jury, due process, and freedom from arbitrary arrest.
  • Defense and civic duty: Citizens pledged to defend their independence and assist one another in emergencies.

Despite its modest scale, the system worked. The republic passed laws, issued warrants, collected taxes, and even mustered a small militia to maintain order.

Life on the Frontier

Life in Indian Stream resembled that of many frontier communities: logging, farming, hunting, and trading. The land was rough, winters long, and access to distant markets limited. Yet the people thrived through cooperation and self-reliance. Trade with both Canadian and New Hampshire merchants continued—proof that practicality often trumped politics on the frontier.

The republic’s remote location provided a degree of safety from interference, but not immunity. Both British and American agents continued to assert claims, and occasional arrests or skirmishes kept tensions high.

The End of the Republic

The experiment in independence lasted only three years. In 1835, a dispute between an Indian Stream constable and a Canadian deputy sheriff triggered a diplomatic crisis. Canada sent troops to assert control, prompting New Hampshire’s governor to respond in kind.

Realizing they were caught between two competing governments, the citizens voted in April 1836 to accept New Hampshire’s jurisdiction. Indian Stream became part of the town of Pittsburg, and peace was restored.

The larger boundary issue wasn’t fully settled until the Webster–Ashburton Treaty of 1842, which formally placed Indian Stream within the United States.

Legacy of a Lost Republic

Today, little remains of the Republic of Indian Stream except New Hampshire Historical Marker #1 and a scattering of homesteads near the Connecticut Lakes.

Yet its legacy is profound. It may have lasted only three years, but its story reflects the broader American frontier experience: independence, inventiveness, and the determination to live free from arbitrary rule. In an era defined by rigid borders and powerful states, the memory of Indian Stream reminds us that freedom once depended not on lines on a map, but on the courage of people willing to draw their own lines.

The story also illustrates the complexities of nation-building in the early American period, when borders remained fluid and communities sometimes had to forge their own path toward self-governance. While the republic was short-lived, it stands as a testament to the ingenuity and determination of America’s frontier settlers, who refused to accept statelessness and instead chose to create their own nation in the wilderness.

The Indian Stream constitution reminds us that political order is not always imposed from above; sometimes, out of necessity, it is created from below. The settlers were neither revolutionaries nor idealists—they simply wanted clear rules, fair courts, and predictable taxes. Ordinary citizens, faced with legal chaos and neglect, designed a functioning democracy grounded in fairness and mutual responsibility.

That such a tiny community would craft its own constitution speaks to the enduring appeal of constitutional government in the early 19th century. Even on the edge of two empires, far from capitals and legislatures, these settlers turned to a familiar American solution: write it down, elect your leaders, and hold them accountable every year.  Hopefully we will be able to keep their spirit and live up to the example of Indian Stream.

The Real Enemy of the Revolution: Disease

When you think about the American Revolution, you probably picture dramatic battles like Bunker Hill or the crossing of the Delaware. But here’s something that might surprise you: the biggest killer during the war wasn’t British muskets—it was disease. And it’s not even close.

The Numbers Tell a Grim Story

Let’s talk numbers for a second. On the American side, about 6,800 soldiers died from battlefield wounds. Sounds terrible, right? Well, disease killed an estimated 17,000 to 20,000. That’s roughly three times as many. The British and their Hessian allies faced similar odds: around 7,000 combat deaths versus 15,000 to 25,000 disease deaths.

Think about that for a moment. You were actually safer charging into battle than hanging around camp. In some regiments, disease wiped out more than a third of the troops before they even saw their first fight.

Why Was Disease So Deadly?

Picture yourself in a Revolutionary War military camp. Hundreds of men crammed together in makeshift shelters, no running water, primitive latrines dug too close to where everyone lives, and basically zero understanding of what we’d call “germ theory” today. It’s a perfect storm for infectious disease.

The big killers were:

Smallpox was the heavyweight champion of camp diseases. This virus killed about 30% of people it infected and spread like wildfire through packed military camps. Soldiers tried to protect themselves through a risky practice called inoculation—basically giving themselves a mild case of smallpox on purpose by rubbing infected pus into cuts on their skin. Without proper quarantine procedures, though, this sometimes made outbreaks worse instead of better.

Typhus (called “camp fever” back then) spread through lice and fleas. If you’ve ever been on a prolonged camping trip and felt gross after a few days, imagine that times a hundred. Soldiers lived in the same clothes for weeks, rarely bathed, and the parasites had a field day. The fever, headaches, and rash that came with typhus made it one of the most dreaded camp diseases.

Dysentery (charmingly nicknamed “bloody flux”) came from contaminated water and poor sanitation. When your latrine is 20 feet from your water source and you don’t understand how disease spreads, this becomes pretty much inevitable. The severe diarrhea weakened soldiers to the point where many couldn’t fight even if they wanted to, and it left them even more susceptible to other diseases.

Malaria was especially important in the South, where mosquitoes thrived in the humid climate. This one actually played a fascinating role in how the war ended—but more on that in a bit.

When Disease Changed Everything

The 1776 invasion of Canada was a disaster largely because of smallpox. Out of 3,200 American soldiers in the Quebec campaign, 1,200 fell sick. You can’t mount much of an offensive when more than a third of your army is flat on their backs with fever. Similarly, during the siege of Boston, Washington couldn’t effectively engage the British because so many of his troops were sick with smallpox. These weren’t just setbacks—they were strategic catastrophes.

This is what pushed George Washington to make one of his boldest decisions in 1777: he ordered a mass inoculation of the Continental Army. This was controversial and dangerous at the time, but it worked. Washington had survived smallpox himself as a young man, so he understood both the risks and the benefits. The inoculation program probably saved the army from complete collapse.

Medical “Treatment” Was Often Worse Than Nothing

Here’s where things get really grim. Eighteenth-century medicine was basically medieval. Doctors believed in “balancing the humors” through bloodletting—literally draining blood from already weakened soldiers. They also gave powerful laxatives to people who were already suffering from diarrhea. Yeah, let that sink in.

Pain relief meant opium-based drinks or just straight alcohol. Some doctors used herbal remedies, but results were inconsistent at best. Quinine helped with malaria, though nobody really understood why. Mostly, if you got seriously sick, your survival came down to luck and a strong constitution.

Valley Forge: The Turning Point

Valley Forge is famous for being a brutal winter encampment, and disease was a huge part of why it was so terrible. Scabies left nearly half the troops unable to serve. Dysentery and camp fever killed somewhere between 1,700 and 2,000 soldiers during that single winter—and remember, these weren’t battle casualties. These men died from preventable diseases in what was supposed to be a safe encampment.

But Valley Forge taught the Continental Army a crucial lesson. After that nightmare winter, military leaders started taking sanitation seriously. They began focusing on camp hygiene, protecting water supplies, placing latrines away from living areas, and making sure soldiers could bathe and wash their clothes and bedding.

Baron von Steuben is famous for teaching the Continental Army how to march and drill, but he also deserves credit for implementing serious sanitation reforms. These changes helped prevent future disease outbreaks and kept the army functional for the rest of the war.

The Secret Weapon at Yorktown

Here’s one of my favorite historical details: mosquitoes may have helped win American independence. At Yorktown, roughly 30% of Cornwallis’s British army was knocked out by malaria and other diseases during the siege. The British commander was trying to hold off the American and French forces while also dealing with the fact that almost a third of his troops were too sick to fight.

Many American soldiers from the southern colonies had grown up with malaria and had some partial immunity. The British? Not so much. Some historians even think Cornwallis himself might have been suffering from malaria, which could have affected his decision-making. His second-in-command, Brigadier General Charles O’Hara, was definitely seriously ill during the siege. Fighting a war while you can barely stand is a pretty significant handicap.

The Bigger Picture

The American Revolution shows us something important: wars aren’t just won on battlefields. They’re won by the side that can keep its soldiers alive and healthy. Disease shaped strategic decisions, determined the outcomes of campaigns, and killed far more men than any British regiment ever did.

Washington’s decision to inoculate the army was genuinely revolutionary (pun intended). It showed a willingness to embrace controversial medical practices for the greater good. The sanitation reforms that came out of Valley Forge laid groundwork for modern military medicine and influenced public health policies in the new United States.

So next time someone mentions the American Revolution, remember: while we celebrate the military victories, one of the most important battles was fought against an enemy you couldn’t see—and for most of the war, nobody really knew how to fight it.

The casualty figures and major disease outbreaks are well-documented in historical records. The specific percentages and numbers are estimates based on historical research, as precise record-keeping was limited during this period. The overall narrative about disease being the primary cause of death is strongly supported by multiple historical sources.

Powdered Wigs and Politics: The Rise and Fall of America’s Most Distinguished Hair Trend

I’ve been spending a lot of time recently researching and writing about the 250th Anniversary of the American Revolution and I keep asking myself, “What’s up with the wigs?”   Have you ever wondered why the Founding Fathers look so impossibly fancy in their portraits?  Well, you can thank a French king and a syphilis epidemic. The elaborate wigs worn by early American leaders weren’t just fashion statements—they were complex social symbols that said everything about who you were, what you could afford, and how seriously you wanted to be taken.

Where It All Started

The wig craze didn’t begin in America. It started across the Atlantic when France’s King Louis XIII went bald prematurely in the 1600s and decided to cover it up with a wig. But it was his son, Louis XIV, who really kicked things into high gear. When the Sun King started losing his hair, he commissioned elaborate wigs that became the epitome of aristocratic style. European nobility, desperate to emulate French sophistication, quickly followed suit.

The practice also had a less glamorous origin story. Syphilis was rampant in 17th-century Europe, and one of its unfortunate side effects was hair loss. Wigs conveniently covered up this telltale symptom while also hiding the sores and blemishes that came with the disease.

Europe in the 1600s and 1700s also had frequent outbreaks of lice and other parasites. Shaving one’s natural hair short and wearing a wig—which could be cleaned, boiled, or deloused more easily—became a practical solution. Powdering helped keep wigs fresh and masked odors.

By the time the fashion crossed the ocean to colonial America in the early 1700s, wigs had become standard attire for anyone with social pretensions.

Status on Your Head

In colonial America, your wig announced your place in society before you even opened your mouth. The most expensive and elaborate wigs featured long, flowing curls that cascaded past the shoulders—these full-bottomed wigs could cost the equivalent of several months’ wages for an average worker. Wealthy merchants, successful plantation owners, and colonial officials wore these statement pieces to project authority and refinement.

Professional men like doctors, lawyers, and clergy typically wore more modest styles. The “tie wig” gathered hair at the back with a ribbon, while the “bob wig” featured shorter hair that ended around the neck. These styles were practical enough for men who actually had to work, but still formal enough to command respect. Even the style of curl mattered—tight curls suggested conservatism and tradition, while looser waves indicated a more progressive outlook.

Working-class men generally couldn’t afford real wigs. Some wore simple caps or went bareheaded, while others might invest in a cheap wig made from horsehair or goat hair for special occasions. The quality difference was obvious—human hair wigs, especially those made from blonde or white hair, were luxury items that only the wealthy could obtain.

Many men who did not wear wigs but still wanted the fashionable look grew their own hair long, pulled it into a queue (ponytail), and powdered it. George Washington is a good example — portraits show his natural hair powdered white, not a wig.

The Daily Reality of Wig Life

Maintaining these hairpieces was no joke. Owners had to powder their wigs regularly with starch powder, often scented with lavender or orange, to achieve that distinctive white or gray color that signaled refinement. The powder got everywhere, which is why men often wore special dressing gowns during the powdering process.

Wigs required regular cleaning and restyling by professionals called peruke makers or wigmakers. These craftsmen commanded good money in colonial cities, advertising their services alongside other luxury trades. The hot, humid summers in places like Virginia and South Carolina made wig-wearing particularly miserable, but fashion demanded sacrifice.

The Revolutionary Shift

By the time of the American Revolution, attitudes toward wigs were already changing. The shift happened for several interconnected reasons, and it reflected broader transformations in American society.

First, the Revolutionary War itself promoted practical thinking. Military officers found elaborate wigs impractical in the field, and the democratic ideals of the Revolution made aristocratic European fashions seem pretentious. Many younger revolutionaries, including Thomas Jefferson, stopped wearing wigs as a political statement against Old World affectation.

A young Jefferson with a wig

Second, France—the original source of wig fashion—underwent its own revolution in 1789. As French revolutionaries literally beheaded the aristocracy, powdered wigs became associated with the despised nobility. What had once symbolized sophistication now suggested tyranny and excess.

In Great Britain, Parliament introduced a tax on hair powder in 1795 as part of Prime Minister William Pitt the Younger’s revenue-raising measures. The law required anyone who used hair powder to purchase an annual certificate costing one guinea (a little over $200 in today’s money). This contributed to the growing sense that wigs were an unnecessary extravagance. Meanwhile, changing ideals of masculinity emphasized natural simplicity over artificial ornamentation.

By the early 1800s, the wig had largely disappeared from everyday American life. A new generation of leaders, including Andrew Jackson, proudly displayed their natural hair. The transition happened remarkably quickly—within a single generation, wigs went from essential to absurd. By the 1820s, anyone still wearing a powdered wig looked hopelessly outdated, clinging to a world that no longer existed.

The Legacy

Today, elaborate wigs survive primarily in British courtrooms, where some judges still wear them in formal proceedings—a deliberate echo of legal tradition. The powdered wigs of the Founding Fathers remain iconic, instantly recognizable symbols of early American history, even though the men who wore them were already abandoning the fashion by the time they built the new nation.

Divine Providence and Patriotism

Religion in the Ranks of the Continental Army

The Continental Army that fought for American independence from 1775 to 1783 represented a cross-section of colonial religious life, bringing together men from diverse faith traditions under a common cause. The religious faith of both enlisted soldiers and officers reflected the broader religious landscape of colonial America, and their regional differences contributed to a complex tapestry of faith within the ranks.

The Continental Army drew from a population where religious diversity was already well-established, particularly when compared to European armies of the same period. Protestant denominations were the majority within the ranks, reflecting the colonial religious demographics.

The American Revolution was not only a political and military struggle but also a deeply religious experience for many Continental Army soldiers. Their faith shaped how they interpreted the war, coped with its hardships, and interacted with comrades from diverse backgrounds.

Enlisted soldiers often relied on providentialism, the belief that God directly intervened in daily life, to make sense of battlefield chaos and suffering. Diaries and letters reveal troops attributing survival in skirmishes, unexpected weather shifts, and even mundane events to divine will. For example, many saw the Continental Army’s unlikely battlefield victories as evidence of God’s favor toward their cause.

Diversity in the Ranks

Congregationalists from New England formed a significant portion of the army, bringing with them the Puritan theological tradition that emphasized divine providence and moral responsibility. Ministers in New England frequently preached that resistance to tyranny was a Christian duty, and many soldiers saw their service in exactly those terms, much as their ancestors had fled religious persecution.

Presbyterian soldiers, many of Scots-Irish descent, comprised a substantial group. Concentrated heavily in the Middle Colonies and frontier areas, they carried strong evangelical influences regardless of their home colony. The challenging conditions of frontier life had already created a more individualistic and emotionally intense form of Christianity that adapted well to military service. These soldiers often brought a fatalistic acceptance of divine will combined with fierce determination.

Baptist and Methodist soldiers, though fewer in number, represented growing evangelical movements that would later transform American Christianity. German Reformed and Lutheran soldiers from Pennsylvania added to the religious diversity, while smaller numbers of Catholics, particularly from Maryland and Pennsylvania, served despite facing legal restrictions in many colonies. Even a few Jewish soldiers joined the cause, though their numbers were minimal given the tiny Jewish population in colonial America. The religious pluralism in regiments from the Middle Colonies created a more tolerant atmosphere that foreshadowed the religious diversity of the new nation.  

Quakers were generally pacifists and avoided military service; however, some “Free Quakers” broke from their tradition and joined the Patriot cause. Other pacifist religious groups, such as Mennonites, abstained from combat but occasionally provided non-combatant support.

Anglicans, ironically fighting against their own church’s mother country, served in significant numbers, particularly from the Southern colonies, where the Church of England had frequently been established as the state-supported church.

Religious diversity was generally a unifying force in the Continental Army, but instances of religious tension, while relatively limited, were not entirely absent. One significant example occurred early in the war, in November 1775, when some American troops planned to burn an effigy of the Pope on Guy Fawkes Day (known in New England as Pope’s Day). General Washington strongly condemned this anti-Catholic action, denouncing it as indecent and lacking common sense.

Washington actively tried to prevent sectarianism from undermining unity in the ranks, whether between Protestants and Catholics or among different Protestant denominations. The overall trend was towards religious tolerance and unity, with religious diversity ultimately contributing positively to the army’s cohesion and morale.

Officer Corps and Religious Leadership

The officer corps of the Continental Army reflected a somewhat different religious profile than the enlisted ranks. Many officers came from the colonial elite and were often Anglican or belonged to more established denominations. However, the Revolution’s anti-episcopal sentiment led many Anglican officers to distance themselves from their church’s political connections while maintaining their basic Christian beliefs.  The relationship between the soldiers and the established Church of England became increasingly strained as the Revolution progressed.

George Washington himself exemplified this complex relationship with religion. Though nominally Anglican, Washington rarely wrote explicitly about his personal faith. He was likely influenced by Deist philosophy, then popular among Enlightenment thinkers including Jefferson and possibly Franklin. Deism holds that reason and observation of the natural world are sufficient to determine the existence of a Creator, but that this Creator does not intervene in the universe after its creation.

Washington regularly invoked divine providence in his correspondence and orders, understanding the importance of religious sentiment in maintaining morale, even while his personal beliefs remained ambiguous. His famous directive that “the blessing and protection of Heaven are at all times necessary but especially so in times of public distress and danger” reflected his understanding of religion’s role in military leadership.

Other prominent officers brought their own religious convictions to their leadership. Nathanael Greene, the Southern theater commander, was raised as a Quaker but was expelled from the Society of Friends for his military service. Rev. Peter Muhlenberg, a Lutheran minister, left the pulpit and joined the Continental Army, rising to the rank of Major General.  The Marquis de Lafayette, though Catholic, adapted to the predominantly Protestant environment of the American officer corps.

Officers, including Washington, viewed religion as a tool for discipline and unity. Washington mandated Sunday worship. He also appointed chaplains to every brigade, insisting they foster “obedience and subordination”.

Continental Army Chaplain Service

Recognizing the importance of religion to morale and discipline, the Continental Congress authorized the appointment of chaplains to serve with the army. The chaplain system evolved throughout the war, beginning with regimental chaplains and eventually expanding to include brigade and division chaplains for larger organizational units.

Continental Army chaplains faced unique challenges. Unlike European armies with established state churches, American chaplains served religiously diverse units. They needed to provide spiritual comfort to soldiers from different denominational backgrounds while avoiding sectarian conflicts that could undermine unit cohesion. Most chaplains were Protestant ministers, reflecting the army’s composition, but they were expected to serve all soldiers regardless of specific denominational affiliation.

The duties of Continental Army chaplains extended beyond conducting religious services. They often served as informal counselors, helped soldiers write letters home, provided basic education to illiterate soldiers, and sometimes even served as medical assistants. Their moral authority made them valuable in maintaining discipline, and many commanders relied on chaplains to address problems of desertion, drunkenness, and other behavioral issues.

Chaplains also played important roles in significant military events. They conducted prayers before major battles and led thanksgiving services after victories. They provided comfort to the dying and services for the dead.

The British recognized the importance of chaplains to the Continental Army and in some cases offered rewards for their capture.

The famous painting of “The Prayer at Valley Forge” with its image of Washington praying alone in the snow, whether historically accurate or not, represents the type of spiritual leadership chaplains were expected to provide during the army’s darkest moments.

Conclusion

Religion in the Continental Army reflected both the diversity and the common threads of colonial American faith. While denominational variations existed, most soldiers shared basic Christian beliefs that provided comfort during hardship and meaning to their sacrifice. The army’s religious diversity foreshadowed the religious pluralism that would characterize the new American nation, while the chaplain service established precedents for military religious support that continue today. The Revolution’s success owed much to the spiritual resources that sustained these soldiers through eight years of difficult warfare, demonstrating religion’s crucial role in the founding of the American republic.

The First Amphibious Landing

The Continental Marines at Nassau

When the Second Continental Congress authorized the creation of the Continental Marines on November 10, 1775, few could foresee their pivotal role in orchestrating North America’s first amphibious assault less than four months later.  The operation against Nassau, on New Providence Island in the Bahamas, was born of necessity, marked by improvisation, and ultimately set the tone for Marine Corps operations—an audacious legacy that endures to this day.

Origins: Gunpowder Desperation and Strategic Vision

The American Revolution’s early years were marked by chronic shortages, especially of gunpowder. After the British seized stores destined for the Patriot cause, intelligence uncovered that significant quantities were stockpiled at Nassau. The Continental Congress approached this challenge with typical Revolutionary War creativity—they would use their brand-new Navy and even newer Marines to solve an Army problem. The Congress’ official instructions to Commodore Esek Hopkins focused on patrolling the Virginia and Carolina coasts, but “secret orders” directed attention to the Bahamas, setting in motion a bold plan to directly address the fledgling army’s supply crisis.

Organization: The Making of an Amphibious Battalion

Barely three months old, the Continental Marines had hastily raised five companies totaling around 300 men. Captain Samuel Nicholas, commissioned as the first Marine officer, oversaw their training and organization in Philadelphia. Their equipment was uneven — many wore civilian garb rather than uniforms and carried whatever muskets and bayonets were available. The uniform regulations specifying the now-famous green coats with white facings were not promulgated until several months after the raid was over.

The Voyage South: Challenges and Preparation

Hopkins’ fleet consisted of the ships Alfred, Hornet, Wasp, Fly, Andrew Doria, Cabot, Providence, and Columbus. In addition to ships’ crews, the fleet carried more than 200 Continental Marines under the command of Captain Nicholas. The expedition began inauspiciously on January 4, 1776, when the fleet attempted to leave Philadelphia but became trapped by ice in the Delaware River for six weeks.

When they finally reached the Atlantic on February 17, 1776, the small fleet faced additional challenges. Disease found its way onboard most of the ships. Smallpox was a huge concern and was reported on at least four ships.

The fleet’s journey to the Caribbean took nearly two weeks of sailing through challenging winter conditions. Despite the hardships, Hopkins maintained the element of surprise—British intelligence had detected American naval preparations but assumed the fleet was bound for New York or Boston, not the distant Bahamas.

Implementation: Amphibious Innovation at Nassau

The element of surprise was initially lost when the fleet’s approach triggered alarm at Nassau. Plans to storm the stronger Fort Nassau dissolved, and Hopkins convened a council to identify a new landing point. A revised strategy saw about 230 Marines and 50 sailors, led by Captain Nicholas, land from longboats two miles east of the weaker Fort Montagu on March 3, 1776. They wore a patchwork of civilian clothes and white breeches—some men had managed to find green shirts as a form of identification. They set out marching toward the fort armed with muskets and bayonets, looking perhaps more like pirates than soldiers. 

Their advance was met with only token resistance. Outnumbered and ill-prepared, local militia withdrew as Nicholas’s men captured Fort Montagu in what historian Edwin Simmons called a “battle as bemused as it was bloodless.”

Nicholas decided to wait until morning to advance on the town.  His decision was tactically sound given the circumstances—he’d lost surprise, did not know the enemy’s strength, was operating in unknown terrain, night was falling, and he lacked naval support. However, this prudent military decision allowed Governor Browne to escape with over 80% of Nassau’s gunpowder stores, turning what could have been a complete strategic victory into a partial success. This incident highlights the tension between tactical prudence and strategic urgency that was destined to become a recurring theme in amphibious warfare.

The next day the Americans took Fort Nassau and arrested the Governor, Montfort Browne. Browne had already sent most of the coveted gunpowder on to St. Augustine, Florida, the night before. Despite this, American forces seized cannons, shells, and other military stores before occupying Nassau for nearly two weeks.

Marine discipline and flexibility were evident, as they pivoted from their surprise landing, conducted operations deep inland, and created their evolving amphibious reputation. The fleet departed on March 17, not before stripping Nassau and its forts of anything militarily useful.

Aftermath: Growing Pains and Enduring Lessons

Though the mission failed in its primary objective of securing a cache of gunpowder, its operational successes far outweighed the losses. The Marines returned with large quantities of artillery, munitions, and several recaptured vessels. On the return leg, they faced and fought (though did not defeat) HMS Glasgow; the squadron returned to New England by April 8, with several casualties including the first Marine officer killed in action, Lt. John Fitzpatrick.

Controversy followed—Hopkins was censured for failing to engage British forces as directed in his official orders.  Nicholas was promoted to major and tasked with raising additional Marine companies for new frigates then under construction. These developments reflected both the lessons learned and the growing recognition of the value of the Marine force in expeditionary operations ashore.

A second raid on Nassau by Continental Marines occurred from January 27–30, 1778, under Captain John Peck Rathbun. Marines and seamen landed covertly at midnight, quickly seizing Fort Nassau and liberating American prisoners held by the British. The raiders proceeded to capture five anchored vessels, dismantled Fort Montagu, spiked the guns, and loaded 1,600 lbs of captured gunpowder before departing. This bold operation marked the first time the Stars and Stripes flew over a foreign fort and showcased the resourcefulness of American forces, who managed to strike a valuable blow against British power in the Caribbean without suffering casualties.

Long-Term Implications for the United States Marine Corps

The Nassau operation set powerful precedents:

  • Amphibious Warfare Doctrine: This was the Marines’ first organized amphibious landing, shaping the Corps’ future focus on rapid deployment from sea to shore, a hallmark that continues in modern doctrine. (At the time it would likely have been called simply a naval landing; the word “amphibious” did not come into use in this context until the 1930s.)
  • Adaptability Under Fire: The improvisational tactics used at Nassau foreshadowed the Corps’ reputation for flexibility and mission focus.
  • Naval Integration: Joint operations with the Navy not only succeeded tactically, but helped institutionalize the Marine-Navy partnership, with Marines serving as shipboard security, landing parties, and naval infantry.
  • Legacy of Boldness: This first operation established a “first-in” ethos and a culture embracing challenge and audacity, foundational principles in Marine culture.

After the war, the Continental Marines disbanded, only to be re-established in 1798. Yet the legacy of Nassau endured. Though the motto “Semper Fidelis” — always faithful — was not officially adopted until 1883, the spirit it expresses traces back to that March 1776 assault, when the odds seemed long and the stakes critical.

Today’s United States Marine Corps draws a direct lineage from that small, ragtag battalion of Marines scrambling ashore at Nassau, forever entwining its identity with the promise, risk, and legacy of that first storied mission. Every modern Marine, stepping from ship to shore, walks in the footprints of Captain Samuel Nicholas and his men—soldiers of the sea whose boldness, improvisation, and teamwork have echoed across the centuries.

Deborah Sampson: A Revolutionary Soldier

In the story of the American Revolution, the names most often remembered are those of the Founding Fathers and battlefield generals. Yet woven through the familiar narrative are lesser-known but extraordinary individuals whose actions defied the norms of their time. One of the most remarkable among them was Deborah Sampson, a Massachusetts woman who disguised herself as a man and served for nearly two years in the Continental Army. Her life reflects not only courage and patriotism, but also the complexity of gender roles in Revolutionary America.

A Difficult Early Life

Deborah Sampson was born in Plympton, Massachusetts, in 1760, the eldest of seven children in a family with deep Pilgrim roots, tracing lineage to Myles Standish and Governor William Bradford. Despite this heritage, her family struggled financially, and her childhood was marked by poverty and abandonment. Her father deserted the family when she was young, leaving her mother with limited resources to care for the children. The family initially believed he had died at sea, but later discovered he had moved to Maine, where he married and raised a second family.

Deborah was still young when she was sent to live with a widow, Mary Price Thatcher, then in her 80s, and she likely learned to read while living with her. After Widow Thatcher died, Deborah was bound out as an indentured servant to the Thomas family in Middleborough, Massachusetts, where she worked until she turned 18. This experience exposed her to hard physical labor and taught her skills typically associated with men’s work, including farming and carpentry. During this time, she educated herself and developed a keen intellect that would prove invaluable throughout her life.

When her term of indenture ended in 1782, Sampson found herself in a precarious position as a young, unmarried woman with few economic opportunities. She intermittently supported herself as a teacher in the summers and a weaver in the winters.

Enlisting in the Army

The Revolutionary War was still raging, and the Continental Army desperately needed recruits. Motivated by both patriotic fervor and economic necessity, Sampson made the audacious decision to enlist in the army disguised as a man. She initially enlisted in 1782 under the name Timothy Taylor and collected a cash enlistment bounty, but she failed to report for duty with her company. Her identity was later discovered, and she was required to repay what she had not already spent of the bounty. The civil authorities imposed no further punishment; her Baptist church, however, withdrew its fellowship until she apologized and asked for forgiveness.

She later made a second enlistment, adopting the name Robert Shurtleff (sometimes spelled Shurtlieff or Shirtliff). This time she followed through and reported for duty.

She bound her chest, cut her hair, and donned men’s clothing to complete her transformation.  Sampson’s physical appearance aided her deception. She was tall for a woman of her era, standing nearly six feet, with a lean build and strong constitution developed through years of manual labor. Her lack of facial hair was not unusual among young male recruits, and she successfully passed the initial examination to join the 4th Massachusetts Regiment in May 1782.

The challenge of maintaining her disguise while living in close quarters with other soldiers required constant vigilance. Sampson developed strategies to protect her secret, including volunteering for guard duty to avoid sleeping arrangements that might expose her, and finding private moments to tend to personal needs. She also had to manage the physical demands of military life while dealing with the unique challenges of being a woman in a male-dominated environment.

Sampson’s military career nearly ended when she was wounded during a skirmish. She received a sword cut to her head and was shot in the thigh. Fearing that medical treatment would reveal her true identity, she initially treated her wounds herself, even digging a musket ball out of her own leg with a knife. Some of the shot remained too deep to remove, leaving her with a lifelong disability.

During her military service, Sampson demonstrated exceptional courage and skill as a soldier. She participated in several skirmishes and battles, including engagements near New York City and in Westchester County. Her fellow soldiers respected her for her dedication, marksmanship, and willingness to volunteer for dangerous scouting missions. She proved herself particularly adept at reconnaissance work, using her intelligence and observational skills to gather valuable information about enemy positions and movements.

Discovery and Discharge

During an epidemic in Philadelphia, she fell seriously ill with a fever and was taken to a hospital, where a physician discovered her secret while treating her. Fortunately, the doctor, Barnabas Binney, chose to protect Sampson rather than expose her. He treated her quietly and helped arrange her honorable discharge from the army. Her commanding officer, General John Paterson, reportedly handled the situation with discretion and respect, recognizing her valuable service to the cause of independence. She was formally discharged by General Henry Knox on October 25, 1783, and was given funds to return home along with a Note of Advice, similar to modern discharge papers.

Life After the War

After the war, Sampson returned to Massachusetts, where she married Benjamin Gannett in 1785 and had three children. But like many veterans, she struggled financially and had difficulty obtaining the military pay and benefits she had earned. In 1792, with the help of prominent supporters — including Paul Revere — she successfully petitioned the Massachusetts legislature for back pay and a modest state pension, and she later received a pension from the federal government.

Her story didn’t end with domestic life. She became one of the first women in America to go on a speaking tour, traveling throughout New England and New York to share her experiences. Wearing her military uniform, she delivered a combination of storytelling, dramatic performance of military drills, and patriotic appeal.  These lectures, which began in 1802, were groundbreaking for their time, as respectable women rarely spoke publicly before mixed audiences.

A Lasting Legacy

Deborah Sampson’s legacy extends far beyond her military service. She challenged rigid gender roles and demonstrated that women could serve their country with the same valor and effectiveness as men. Her story inspired future generations of women who sought to break barriers and serve in traditionally male-dominated fields.

After she died in 1827, her story continued to gain recognition. In 1838, her husband was awarded a widow’s pension, possibly the first instance in U.S. history that the benefit was granted to a man based on his wife’s military service.

She left behind a legacy of courage, determination, and pioneering spirit that continues to resonate today. In 1983, she was declared the Official Heroine of the Commonwealth of Massachusetts, and in 2020, the U.S. House of Representatives passed the Deborah Sampson Act, expanding healthcare and benefits for female veterans. Statues and memorials, including her gravesite in Sharon, Massachusetts, commemorate her contributions. Her wartime exploits have been the subject of books, plays, and scholarly research, and her story continues to inspire generations as a symbol of courage and of the ongoing struggle for gender equality in military service.

While she was not the only woman to disguise herself and enlist—others like Margaret Corbin and Anna Maria Lane also took up arms—Sampson is among the best documented and celebrated.

Her life represents a crucial chapter in both military history and women’s history, illustrating the complex ways in which the American Revolution created opportunities for individuals to transcend social conventions in service of the greater cause of independence.  Deborah’s journey from indentured servant to Continental Army soldier and national lecturer is a testament to her extraordinary courage and determination. By stepping into a role forbidden to women and excelling under the harshest conditions, she challenged the boundaries of her time and set a precedent for future generations.

Though her wartime activities may have been exaggerated — a common practice in biographies of the time — her life remains a powerful reminder of the contributions women have made, often unrecognized, in the shaping of American history.

The illustration at the beginning of this post is from The Female Review: Life of Deborah Sampson, the Female Soldier in the War of Revolution (1916), a reprint of the 1797 biography by Herman Mann.  

Button Gwinnett

An Almost Forgotten Signer of the Declaration of Independence

History is full of people, both little known and unknown, who were present at important events. They may have participated, or they may simply have been observers. Learning about their lives and their involvement helps us appreciate the human aspect of historical events. This is what I love most about history: the stories of ordinary people.

Not long ago, I was looking at a copy of a broadside of the Declaration of Independence when I noticed an intriguing signature — Button Gwinnett. He is one of the lesser-known signers of the Declaration of Independence, yet he played a significant role in the early political landscape of Georgia. His life was a blend of ambition and political maneuvering. His dramatic rise and fall remain intriguing to historians. Even though Gwinnett is little remembered today, his story offers a glimpse into the turbulent period of America’s founding.

Early Life and Migration to America

Button Gwinnett was born in 1735 in Down Hatherley, Gloucestershire, England. He was the son of an Anglican vicar and was named after his mother’s cousin, Barbara Button, who was also his godmother.

While details about his early education are scarce, it is believed that he received a basic education typical of the English gentry. Gwinnett’s early adulthood was marked by modest success as a merchant. In the 1760s, facing limited opportunities in England and the promise of economic prosperity in the American colonies, Gwinnett and his wife, Ann, emigrated to the New World.

Initially, Gwinnett settled in Charleston, South Carolina, where he engaged in trade. However, he struggled financially, and by 1765, he had relocated to Savannah, Georgia. This move marked not only the beginning of his political career, but also a period of fluctuating fortune. Gwinnett purchased St. Catherine’s Island off the coast of Georgia, hoping to become a successful plantation owner. Unfortunately, he overextended himself financially, and his attempts to establish a profitable business met with failure. Despite his financial setbacks, Gwinnett’s status as a landowner and merchant allowed him to enter the local political scene.

Rise in Politics and Revolutionary Activity

Gwinnett’s involvement in politics grew as tensions between the American colonies and Britain escalated. By the early 1770s, he had become aligned with the growing revolutionary sentiment. In 1775, he was elected to Georgia’s Provincial Congress, where he quickly rose to prominence due to his vocal support for independence from British rule. Although Georgia had initially shown less enthusiasm for independence than colonies like Massachusetts or Virginia, a growing faction of Georgia patriots, including Gwinnett, began advocating for stronger opposition to British rule. By 1776, Gwinnett had become a delegate to the Second Continental Congress.

Continental Congress and the Declaration

On January 20, 1776, Gwinnett left Georgia for Philadelphia to represent the colony in Congress. This appointment marked the pinnacle of his political career and placed him at the center of the deliberations for American independence. His journey to Philadelphia came at a crucial moment when the Continental Congress was moving toward a formal declaration of independence.

Gwinnett voted for independence on July 2, voted to approve the declaration on July 4, and signed his name to the parchment of the Declaration of Independence on August 2. Of the 56 delegates who signed the Declaration, Gwinnett was one of only eight born in Britain. His British birth added a unique perspective to his role as a Founding Father, representing the immigrant experience that was central to colonial American society.

His signing of the Declaration of Independence would later make his signature one of the most valuable autographs in American history. Gwinnett is known chiefly because his autographs are extremely rare and collectors have paid dearly to obtain one. (In 2001 one of his 36 known autographs sold at public auction for $110,000. Since then, several others have been documented.)

Conflict and Power Struggles in Georgia

Back in Georgia, Gwinnett became embroiled in a power struggle with General Lachlan McIntosh, a prominent figure in the colony’s revolutionary army. The conflict between Gwinnett and McIntosh was fueled by political rivalry and personal animosity. Gwinnett aspired to leadership positions within Georgia’s government and military, and in March 1777, he became acting president of Georgia’s Revolutionary Council after the sudden death of Governor Archibald Bulloch.

During his brief tenure as acting council president, Gwinnett’s leadership was controversial. He proposed a bold military expedition against British-controlled East Florida, intending to bolster his political standing and secure Georgia’s borders. However, the campaign was poorly executed, and it ended in failure. This debacle intensified the feud between Gwinnett and McIntosh, with each blaming the other for the military defeat.

Gwinnett’s promising political career was cut short by an ongoing personal conflict that became intertwined with the honor culture of the American South. The rivalry between Gwinnett and McIntosh reached its climax in May 1777. After a series of public insults, culminating in McIntosh calling Gwinnett a “scoundrel and lying rascal,” Gwinnett responded by challenging him to a duel. Dueling, though technically illegal, was still a common way to resolve disputes among gentlemen of the period. On May 16, 1777, the two men faced each other with pistols in a pasture near Savannah. Both were wounded, but only Gwinnett’s injuries proved fatal. He died three days later, at age 42, and was buried in Savannah’s Colonial Park Cemetery, though the exact location of his grave is still unknown.

Legacy and Historical Significance

Gwinnett’s legacy is visible in his namesake Gwinnett County, one of Georgia’s most populous counties, a tribute to his contributions to the state’s early political history.

In recent decades, historians have taken a renewed interest in Button Gwinnett, examining his role beyond the narrow context of his duel and signature. While he lacked the fame of other founding fathers, Gwinnett’s political maneuvering and his role during the revolutionary period highlight the complexities of early American politics. His rivalry with McIntosh reflects the deep divisions and regional conflicts that existed even among those who supported independence.

Gwinnett’s life also underscores the risks faced by those who ventured into the revolutionary cause. Unlike many of his contemporaries who enjoyed long, celebrated careers, Gwinnett’s story is one of a meteoric rise and abrupt fall. His legacy, while overshadowed by more prominent figures, is a reminder of the many lesser-known men and women who played vital roles in America’s fight for independence.

Button Gwinnett’s life was marked by ambition, conflict, and an untimely death that left him as one of the more obscure figures of the American Revolution. His contributions to the independence movement in Georgia were significant, even if his political career was cut short. Today, Gwinnett’s name lives on in Georgia’s geography, and his autograph serves as a rare artifact of a fleeting yet impactful moment in history.

Sources:

  • National Archives: Declaration of Independence Signers – https://www.archives.gov/founding-docs/signers
  • The Georgia Historical Society: Biography of Button Gwinnett – https://georgiahistory.com
  • Smithsonian Magazine: The Rare Autograph of Button Gwinnett – https://www.smithsonianmag.com
  • Library of Congress: Early American Biographies – https://www.loc.gov

Declaring Independence: The Origin of America’s Founding Document

When Americans celebrate the Fourth of July, we imagine fireworks, flags, and a dramatic reading of the Declaration of Independence. We think we know the story: the Continental Congress selected Thomas Jefferson to write the declaration, he labored alone to produce this famous document, Congress approved it unanimously, and it was signed on the Fourth of July.

But the truth is far different and more complex. The story behind this iconic document—the how, who, and why of its creation—is just as explosive and illuminating as the day it represents. Far from a spontaneous outburst of rebellion, the Declaration was the product of political strategy, collaborative writing, and a shared sense of urgency among men who knew their words would change the course of history.

Setting the Stage: Why a Declaration?

By the spring of 1776, the American colonies were deep in conflict with Great Britain. Battles at Lexington and Concord had already been fought. George Washington was attempting to transform the Continental Army into a professional fighting force. Thomas Paine’s Common Sense had ignited widespread public support for full separation from the British Crown. The Continental Congress had been meeting in Philadelphia, debating how far they were willing to go. By June, the mood had shifted from reconciliation to revolution.

On June 7, 1776, Richard Henry Lee of Virginia introduced a resolution to the Continental Congress declaring “that these United Colonies are, and of right ought to be, free and independent States.” The motion was controversial—some delegates wanted more time to consult their colonies. But most in Congress knew that if independence was going to happen, it needed to be explained and justified to the world, so they created a committee to draft a formal declaration.

The Committee of Five

On June 11, 1776, the Continental Congress appointed a “Committee of Five” to write the declaration. The members were:

  • Thomas Jefferson of Virginia
  • John Adams of Massachusetts
  • Benjamin Franklin of Pennsylvania
  • Roger Sherman of Connecticut
  • Robert R. Livingston of New York

This was not a random selection. Each man represented a different region of the colonies and had earned the trust of fellow delegates. Jefferson was relatively young but already known for his eloquence. Adams was an outspoken advocate of independence. Franklin brought wisdom, wit, diplomatic experience, and international prestige. Sherman brought New England practicality and long legislative experience, while Livingston represented the more moderate New York delegation and brought keen legal insight.

Jefferson Takes the Pen

Although it was a group project on paper, the heavy lifting fell to Thomas Jefferson. The committee chose him to draft the initial version. Why Jefferson? According to John Adams, Jefferson was chosen for three reasons: he was from Virginia (the most influential colony), he was popular, and, Adams admitted, “you can write ten times better than I can.”

Jefferson wrote the draft in a rented room at 700 Market Street in Philadelphia. He leaned heavily on Enlightenment ideas, especially those of John Locke, emphasizing natural rights and the notion that government derives its power from the consent of the governed. He also borrowed phrasing from earlier colonial declarations, including his own A Summary View of the Rights of British America and borrowed extensively from George Mason’s Virginia Declaration of Rights.

The Editing Process: Group Work Gets Messy

After Jefferson completed the initial draft (likely by June 28), he shared it with Adams and Franklin. Both men suggested revisions. Franklin, ever the editor, softened some of Jefferson’s sharpest attacks and corrected language for flow and diplomacy. His most famous contribution was changing Jefferson’s phrase “We hold these truths to be sacred and undeniable” to the more secular and philosophically precise “We hold these truths to be self-evident.”  

Adams contributed structural suggestions and helped shape the tone. He also shaped the strategic presentation of grievances against King George III, understanding that the declaration needed to justify revolution in terms acceptable to both colonial readers and potential European allies.

Sherman and Livingston played more limited but still meaningful roles. Sherman helped ensure the document would appeal to Puritan New England, while Livingston’s legal expertise helped refine the constitutional arguments against British rule. Otherwise, their involvement in the actual content of the declaration was likely minimal.

The revised draft was presented to the full Continental Congress on June 28, 1776. What followed was a few days of intense debate and revision by the entire body.

Congress Takes the Red Pen

From July 1 to July 4, the Continental Congress debated the resolution for independence and edited the Declaration. Jefferson watched as Congress made 86 changes to his prose, cutting about a quarter of the original text, including a lengthy passage condemning King George III for perpetuating the transatlantic slave trade, a passage that would have sparked deep division among the delegates, especially those from Southern colonies.

Other modifications included strengthening the religious language, toning down some of the more inflammatory rhetoric, and making the grievances more specific and legally grounded. Jefferson was reportedly frustrated by the changes, calling them “mutilations,” but he recognized that compromise was the cost of consensus.

Approval and Promulgation

Despite the extensive revisions, the core of Jefferson’s vision remained intact and on July 2, 1776, the Continental Congress voted in favor of Lee’s resolution for independence. That’s the actual date the colonies officially broke from Britain. John Adams even predicted in a letter to his wife that July 2 would be celebrated forever as America’s Independence Day. He was close—but the official adoption of the Declaration came two days later.

On July 4, 1776, Congress formally approved the final version of the Declaration of Independence. Contrary to popular belief, most of the signers did not sign it on that day. Only John Hancock, as president of Congress, and Charles Thomson, as secretary, signed then. The famous handwritten version, now in the National Archives, wasn’t signed until August 2. But the document approved on July 4 was immediately printed by John Dunlap, the official printer to Congress.

These first copies, known as Dunlap Broadsides, were distributed throughout the colonies and sent to military leaders, state assemblies, and even King George III. George Washington had it read aloud to the Continental Army.  This rapid dissemination was crucial to its impact, as it was needed to rally public support for the revolutionary cause and explain the colonies’ actions to the world.

Legacy and Impact

The Declaration wasn’t just a break-up letter to the British Crown—it was a manifesto for a new kind of political order. Its assertion that “all men are created equal” would echo through centuries of American history, invoked by abolitionists, suffragists, civil rights leaders, and more.

The creation of the Declaration of Independence demonstrates that even the most iconic documents in American history emerged from collaborative processes involving compromise, revision, and collective wisdom. While Jefferson deserves primary credit for the document’s eloquent expression of revolutionary ideals, the contributions of his committee colleagues and the broader Continental Congress were essential to creating a text that could unite thirteen diverse colonies in common cause.

This collaborative origin reflects the democratic principles the declaration itself proclaimed, showing that American independence was achieved not through the vision of a single individual, but through the collective efforts of representatives working together to articulate their shared commitment to liberty, equality, and self-governance. The process that created the Declaration of Independence thus embodied the very democratic ideals it proclaimed to the world.

Today, the Declaration of Independence is enshrined as one of the foundational texts of American democracy. But it’s worth remembering that it was created under immense pressure, forged by committee, and edited by compromise. Its authors knew they were taking a dangerous step. As Franklin reportedly quipped at the signing, “We must all hang together, or most assuredly we shall all hang separately.”

America at 250: A Revolution Remembered… or Forgotten?

I’m old enough to remember the 200th anniversary of the American Revolution. Bicentennial symbols were everywhere: Liberty Bells, eagles, and the ubiquitous Bicentennial logo, a stylized red, white, and blue five-point star. They could be found on hats, T-shirts, socks, soft-drink cups, beer cans, and even a special “Spirit of ‘76” edition of the Ford Mustang II. Commemorative events and celebrations were being planned everywhere, and people had “bicentennial fever.”

But the 250th anniversary is not attracting the same kind of attention or interest. I wonder why that is. Perhaps it’s that the name for a 250th anniversary, Semiquincentennial, doesn’t roll off the tongue the way Bicentennial does. But I suspect it’s far more than just a tongue-twisting name.

The Bicentennial came after a decade of national trauma.  The Vietnam War, Watergate, and the civil rights struggles had all roiled the country.  By 1976, most Americans wanted to feel good about the country again. It became a giant, colorful celebration of “American resilience.”

While the 250th anniversary of the American Revolution is being marked by numerous events, commemorations, and official proclamations, most are local, and it has not yet captured widespread public attention or generated the scale of national excitement seen during previous milestone anniversaries.

The anniversary arrives at a time of deep political polarization, which has complicated celebration plans. There is an ongoing debate within the group tasked with planning the celebration, the U.S. Semiquincentennial Commission, about how to present American history. Some members advocate for a traditional, celebratory approach focusing on the Founding Fathers and patriotic themes. Others push for a more inclusive narrative that acknowledges the complexities of American history, including the experiences of women, enslaved people, Indigenous communities, and other marginalized groups.

Beyond the commission itself, some historians note that the “history wars”—ongoing disputes throughout society over how U.S. history should be taught and remembered—have made it harder to generate broad, enthusiastic buy-in for the anniversary among the general public. 

Commemorations in places like Lexington and Concord have seen anti-Trump protesters carrying signs such as “Resist Like It’s 1775” and “No Kings,” explicitly drawing parallels between opposition to King George III and contemporary resistance to what they perceive as autocratic tendencies in current leadership. At the reenactment of Patrick Henry’s “Give Me Liberty or Give Me Death” speech, Virginia Governor Glenn Youngkin was met with boos and protest chants, highlighting how the Revolution’s legacy is being invoked in current political struggles.

While some organizers and historians hope the anniversary can serve as a unifying moment—emphasizing that “patriotism should not be a partisan issue”—the reality is that commemorations have often become forums for expressing contemporary political grievances and anxieties. The presence of both celebratory and dissenting voices at these events reflects the enduring debate over what it means to be American and who gets to define that identity.  The complexity and messiness of American history, combined with current societal tensions, may dampen the celebratory mood and make it harder for people to connect emotionally with the anniversary.

Even the 250th logo has become a source of dispute, though the disagreement is largely nonpartisan, centering on the logo’s stylistic and artistic merits. Proponents appreciate its modern, inclusive design, emphasizing that the flowing ribbon represents “unity, cooperation, and harmony” and reflects the nation’s aspirations as it commemorates this milestone. Detractors worry about the legibility of the “250” and the absence of traditional American symbols, such as stars, that could have reinforced its patriotic theme.

Surveys by history-related organizations suggest that most Americans are not yet thinking about the 250th anniversary. The run-up to 2026 may bring increased attention, but so far the anniversary has not broken through as a major topic of national conversation. If it continues to be viewed as a contentious partisan undertaking, it may never gain widespread popularity, and the general public may simply stay away.

A friend who is a member of the West Virginia 250th committee told me that they held an initial meeting at which nothing was accomplished, and they have not met since. It seems to me that it is up to us, the citizens, to ensure that the 250th anniversary of the American Revolution is appropriately remembered. We don’t have to live where a Revolutionary War event occurred in order to recognize it. Here in West Virginia, in October 2024 we commemorated the 250th anniversary of the Battle of Point Pleasant, which many consider a precursor to the American Revolution. That event was not organized by any state or national group; it was the result of efforts by the City of Point Pleasant and the West Virginia Sons of the American Revolution.

We do not need to depend on the government; we the people can hold local commemorations of revolutionary events that occurred in other areas. We can hold commemorations of the Battle of Bunker Hill, the signing of the Declaration of Independence, the Battle of Saratoga and many other events. It will take the initiative of local people to organize these events.

It will be our great shame if we allow the commemoration of an event so significant in both American and world history to be turned into something that divides us rather than unites us and strengthens our common bond.

