Have you ever wondered how the tradition of sending Christmas cards got started? It’s a story that combines busy social calendars, a new postal system, and one clever solution that became a worldwide phenomenon.
Before Christmas Cards: The Early Messengers
Long before anyone thought to mass-produce holiday greetings, people were already experimenting with seasonal messages. In fifteenth-century Germany, the “Andachtsbilder” appeared—proto-greeting cards with religious imagery, usually depicting the baby Jesus and bearing the inscription “Ein gut selig jar” (A good and blessed year), presented as gifts during the Christmas season. Additionally, handwritten letters wishing “Merry Christmas” date from as early as 1534. These weren’t Christmas cards as we know them, but they laid the groundwork.
The first known Christmas card was sent in 1611 by Michael Maier, a German physician, to King James I of England and his son, with an elaborate greeting celebrating “the birthday of the Sacred King”. This, however, was an ornate document rather than a mass-produced card. The true breakthrough came much later.
In the late 1700s, British schoolchildren were creating their own versions. They would take large sheets of decorated writing paper and pen messages like “Love to Dearest Mummy at the Christmas Season” to show their parents how much their handwriting had improved over the year. It was part homework assignment, part holiday greeting—definitely more practical than sentimental!
Also during the latter part of the 18th century, wealthy British families adopted a more personal variant: handwritten holiday letters. These were carefully composed greetings expressing seasonal goodwill and family updates, often decorated with small flourishes or illustrations, a forerunner of the much-maligned Christmas letter. In Victorian England—where social correspondence was almost an art form—sending letters for Christmas and New Year became fashionable among the middle class. The combination of widespread literacy and improvements in the postal system laid the groundwork for something new: a printed, affordable Christmas greeting.
The Birth of the Modern Christmas Card
The real game-changer came in 1843, thanks to a social problem that sounds remarkably modern: too many people to keep in touch with and not enough time. Henry Cole, a prominent civil servant, helped establish the Penny Post postal system—named after the cost of posting a letter. He found himself with unanswered mail piling up during the busy Christmas season. His solution? Why not create one design that could be sent to everyone?
Cole commissioned his friend, artist John Callcott Horsley, to design what would become the world’s first commercial Christmas card. The design featured three generations of the Cole family raising a toast in celebration, surrounded by scenes depicting acts of charity. The message was simple: “A Merry Christmas and a Happy New Year to You.”
About 2,050 cards were printed in two versions—a black and white version for sixpence and a hand-colored version for one shilling. Interestingly, the card caused some controversy. The image showed young children enjoying glasses of wine with their family, which upset the Victorian temperance movement.
The Penny Post, introduced in 1840, made mailing affordable and accessible. What started as Cole’s time-saving solution quickly caught on among his friends and acquaintances, though it took a few decades for the tradition to really explode in popularity.
Crossing the Atlantic
Christmas cards made their way to America in the late 1840s, but they were expensive luxuries at first. In 1875, Louis Prang, a German-born printer who had worked on early cards in England, began mass-producing cards in America. He made them affordable for average families. His first cards featured flowers, plants, and children. By the 1880s, Prang was producing over five million cards annually.
Christmas cards spread rapidly with improvements in both postal systems and printing. Victorian cards often featured sentimental, elaborate images—sometimes anthropomorphic animals or unexpected motifs. The Hall Brothers Company (later Hallmark) shifted the format to folded cards in envelopes rather than postcards, allowing for more personal written messages—setting the standard still seen today.
The 20th century brought both industrialization and personalization to the Christmas card. Advances in color printing, photography, and mass marketing meant that cards became cheaper and more varied. In the 1920s and 1930s, families began sending cards featuring their own photographs, a tradition that gained momentum after World War II with the rise of suburban life and inexpensive cameras. By the 1950s and 1960s, Christmas cards had become a fixture of middle-class life. Designs reflected changing tastes—from sentimental Victorian nostalgia to sleek mid-century modernism. Surprisingly, the first known Christmas card with a personal photo was sent by Annie Oakley in 1891, using a photo taken during a visit to Scotland.
Christmas Cards Around the World Today
Fast forward to today, and Christmas card traditions vary wildly depending on where you are. In Great Britain and the US, sending cards remains a major tradition. British people send around 55 cards per year on average, with Christmas cards accounting for almost half of all greeting card sales.
But the tradition looks quite different in other parts of the world. In Japan, where only about 1.5% of the population is Christian, Christmas is celebrated as a secular, romantic holiday rather than a religious one. Christmas Eve is treated similarly to Valentine’s Day, with couples exchanging gifts. Many people do send seasonal cards, but these are nengajo—New Year’s cards—sent to friends, family, and business associates to express wishes for a happy and prosperous year.
In the Philippines, one of Asia’s most Christian nations, Christmas is celebrated with incredible enthusiasm starting as early as September, with the season officially beginning with nine days of dawn masses on December 16. Cards are part of the celebration, but they’re just one element of an extended, community-focused holiday.
In Australia, the tradition of sending handwritten Christmas cards remains popular despite the summer heat. Australian cards often feature unique imagery, such as Santa in shorts and sandals or kangaroos instead of reindeer, adapting the tradition to local culture.
The Digital Shift
Today, while e-cards and social media posts have certainly cut into traditional card sales, many people still cherish the ritual of sending and receiving physical cards. There’s something irreplaceable about finding a thoughtful card in your mailbox among the bills and advertisements.
What started as Henry Cole’s practical solution to a busy social calendar has evolved into a diverse global tradition, adapted and reimagined by different cultures worldwide. Whether you’re mailing elaborate family photo cards, sending quick e-greetings, or exchanging romantic messages in Tokyo, you’re participating in a tradition that’s nearly two centuries old.
The Continental Navy, established during the American Revolution, represented the colonies’ first organized attempt to challenge British naval supremacy. Though vastly outnumbered and outgunned by the Royal Navy, this fledgling force played a crucial role in securing American independence through daring raids, strategic disruption of British supply lines, and pivotal battles that helped turn the tide of war.
Congressional Acts and Political Support
The Continental Navy’s creation stemmed from military necessity rather than long-term naval planning. On October 13, 1775, the Continental Congress passed the first naval legislation, authorizing the fitting out of two vessels to intercept British supply ships carrying munitions to loyalist forces. This modest beginning expanded rapidly when Congress passed additional acts on October 30, 1775, calling for the construction of thirteen frigates and establishing the foundation of American naval power.
The Navy’s primary champions in Congress came from maritime colonies that understood sea power’s importance. John Adams of Massachusetts emerged as the Navy’s most vocal advocate, arguing that naval forces were essential for protecting American commerce and challenging British control of coastal waters. Recognizing that their states’ economic survival depended on maintaining sea access, Samuel Chase of Maryland and Christopher Gadsden of South Carolina (designer of the Gadsden Flag) also provided crucial support. Rhode Island’s Stephen Hopkins, whose state had a rich maritime tradition, consistently voted for naval appropriations and expansion.
Opposition came primarily from other southern agricultural colonies that viewed naval expenditures as wasteful diversions from land-based military needs. Virginia’s delegates, despite their state’s extensive coastline, often questioned the wisdom of directly challenging Britain’s naval supremacy. These political divisions reflected deeper disagreements about military strategy and resource allocation during the war.
Ship Acquisition and Fleet Development
The Continental Navy acquired vessels through multiple methods, reflecting the revolution’s improvisational nature. Congress initially authorized the purchase and conversion of merchant ships, transforming trading vessels into warships through the addition of cannons and other military equipment. The brigs Cabot and Andrew Doria, for example, began as merchant vessels before receiving naval modifications.
New construction was the Navy’s most ambitious undertaking. The thirteen frigates authorized in 1775 were built in shipyards from New Hampshire to Georgia, spreading construction contracts across multiple colonies to ensure political support and reduce vulnerability to British attacks. These ships, including the Hancock and Randolph—named after prominent patriots to increase support—varied in size from 24 to 32 guns and represented state-of-the-art naval architecture.
Captured British vessels were also added to the fleet. American naval forces seized numerous enemy ships during the war, with some converted to Continental Navy service. The most famous capture occurred when John Paul Jones took HMS Serapis during his epic battle aboard Bonhomme Richard, though ironically, his own ship sank shortly after the victory.
Private vessels operating under letters of marque also supplemented the official navy. These privateers, while not technically part of the Continental Navy, operated under congressional authorization and contributed significantly to disrupting British commerce. Many, however, considered privateering little more than questionably legal piracy.
Officer and Sailor Recruitment
Recruiting qualified officers proved challenging for a nation lacking naval traditions. Congress appointed many officers based on political connections and regional representation rather than solely on maritime experience. Several appointees, however, possessed substantial seafaring backgrounds: John Paul Jones was a Scottish-born merchant captain with years at sea, and Esek Hopkins, the Navy’s first commander-in-chief, had commanded privateers during the French and Indian War.
Other members of the officer corps reflected colonial society’s diversity. Captains came from various backgrounds, including merchant marine service, privateering, and even some Royal Navy officers. Congress attempted to maintain geographic balance in appointments, ensuring that all colonies felt represented in the naval leadership.
Sailor recruitment proved more difficult. The Continental Navy competed with privateers, merchant ships, and the army for manpower. Privateering offered potentially greater financial rewards through prize money, making it difficult to attract sailors to regular naval service. The navy relied on bounties, promises of prize shares, and appeals to patriotism to fill crew rosters.
Many sailors were drawn from coastal communities with maritime traditions. New England provided the largest contingent, given its extensive fishing and merchant fleets. However, the navy also recruited inland farmers, artisans, and even some former British naval personnel who had deserted or been captured.
The Continental Navy rarely resorted to impressment, a practice little better than kidnapping; the few sailors who were impressed were paid and usually released after completing a single voyage.
Major Naval Battles and Strategic Impact
The Continental Navy’s most famous engagement occurred on September 23, 1779, when John Paul Jones, commanding the Bonhomme Richard, fought HMS Serapis off the English coast. During this brutal three-and-a-half-hour battle, the British called upon Jones to surrender, and he reportedly replied, “I have not yet begun to fight!” His eventual victory provided a massive morale boost and international recognition of American naval capabilities.
The capture of New Providence in the Bahamas during March 1776 marked the navy’s first major operation. Esek Hopkins led a fleet of eight vessels in this successful raid, seizing gunpowder and military supplies desperately needed by Washington’s army. This victory demonstrated the navy’s potential for strategic operations beyond American coastal waters.
Naval battles along the American coast proved equally significant. The Delaware River battles of 1777 saw Continental Navy vessels attempting to prevent British naval forces from supporting the occupation of Philadelphia. Though ultimately unsuccessful, these engagements delayed British operations and demonstrated American willingness to contest enemy naval movements.
The most strategically important naval operations involved disrupting British supply lines and commerce. Continental Navy vessels captured hundreds of British merchant ships, depriving the enemy of supplies while providing America with desperately needed materials. These operations forced Britain to divert warships from other duties to provide convoy protection, reducing pressure on American forces ashore.
The Continental Navy also operated in partnership with French forces after the 1778 alliance. Joint operations extended American reach and contributed to key turning points in the war. French naval victories, especially at the Battle of the Chesapeake in 1781, indirectly sealed the fate of Cornwallis’s army at Yorktown by cutting off British reinforcements. Although this victory was French, it fulfilled the strategic vision the Continental Congress had first imagined in 1775—a sea power capable of shaping the war’s outcome.
Great Lakes Naval Operations
During the Revolution, both sides recognized the Great Lakes’ strategic importance for controlling the northwestern frontier. The British maintained naval superiority on these waters through their base at Detroit and control of key shipbuilding facilities. American forces attempted to challenge this dominance through the construction of small naval vessels on Lake Champlain and other waterways.
The most significant Revolutionary War naval action on inland waters occurred on Lake Champlain in October 1776. Benedict Arnold, commanding a small American fleet built on site, engaged a superior British force in a desperate delaying action. Though Arnold’s fleet was largely destroyed, the battle forced the British to postpone their invasion plans until the following year, providing crucial time for Americans to consolidate defenses and contributing to the American victory at Saratoga.
Trials and Transformations
Despite its courage, the Continental Navy faced constant hardship. Its ships were outgunned, its officers underpaid, and its crews plagued by desertion and disease. Many vessels were captured or scuttled to avoid seizure. The Alfred, the Navy’s first flagship, was taken by the British in 1778; others, like the Reprisal and Lexington, were lost at sea.
After the Treaty of Paris (1783), Congress was burdened by debt and saw no need for a standing blue-water navy. The last remaining ship, USS Alliance, was sold on August 1, 1785, marking the formal end of the Continental Navy, two years after the Revolutionary War ended.
It was not long before increasing attacks on American merchant ships by Barbary corsairs pushed Congress to pass the 1794 Naval Act, authorizing construction of six frigates. This was the first step in rebuilding the naval force, though it wasn’t yet a fully independent service.
On April 30, 1798, Congress created the Department of the Navy, taking naval affairs out of the War Department and officially re-establishing the United States Navy as a separate, permanent institution.
Legacy and Impact on Revolutionary Success
The Continental Navy’s impact on the Revolutionary War extended far beyond what its modest size might suggest. By challenging British naval supremacy, even unsuccessfully at times, the Continental Navy forced Britain to maintain large fleet deployments in American waters, reducing British naval availability for operations elsewhere and increasing the war’s cost.
More importantly, Continental Navy operations helped secure the French alliance that proved decisive in achieving independence. French officials were impressed by American naval courage and potential, viewing the Navy as evidence of serious commitment to independence. Naval victories like Jones’s triumph over HMS Serapis provided powerful propaganda tools for American diplomats seeking European support.
The Continental Navy also established important precedents for American naval development. The officer corps trained during the Revolution provided leadership for subsequent naval expansion. Naval yards and facilities developed during the war became foundations for future fleet construction.
Despite its relatively small size and limited resources, the Continental Navy demonstrated that determined naval forces could challenge even the world’s most powerful fleet. Through courage, innovation, and strategic thinking, America’s first navy helped secure the independence that made possible the nation’s eventual emergence as a global naval power. The lessons learned and traditions established during these formative years continued to influence American naval development long after the Revolution’s end.
Understanding Classical Socialism, Democratic Socialism, and Social Democracy in Today’s America
If you’ve ever wondered what politicians really mean when they throw around words like “socialism” or “social democracy,” you’re not alone. These ideas used to live mostly in political theory textbooks. Now they show up in campaign speeches and social media debates. With figures like Bernie Sanders and groups like the Democratic Socialists of America bringing these ideas into the mainstream, it’s worth sorting out what each actually means.
Even though classical socialism, democratic socialism, and social democracy all claim to focus on fairness and reducing inequality, they take very different routes to get there. Understanding those differences helps make sense of what’s really being argued about in American politics today.
Classical Socialism: The Original Blueprint
Classical socialism came out of the 19th century, when industrial capitalism was grinding workers down and a couple of guys named Karl Marx and Friedrich Engels thought they had the fix. Their idea: workers should collectively own and control the means of production — factories, land, and major industries.
This wasn’t just about taxing the rich. It was about redesigning the whole system from the ground up, through violent revolution if necessary. In theory, private property creates exploitation; collective ownership ends it. In practice, that often means top-down control by the state, with economies planned from above — as seen in the Soviet Union or Maoist China.
The central ideas of classical socialism are collective ownership of big industries and central or cooperative planning instead of market competition. Production is aimed at meeting needs, not profits, with the eventual goal of a classless, stateless society. Classical socialism accepts that revolution will most likely be necessary for implementation.
In theory, classical socialism wipes out worker exploitation and wealth extremes. Its central tenet is that production serves human needs, not corporate profit. In practice, it often leads to authoritarian governments, clumsy economic planning, and little room for innovation or dissent.
Would it work in America? Probably not. The U.S. has deep cultural roots in individualism and private enterprise. Replacing markets with centralized planning would clash hard with both our Constitution and national temperament.
The Siblings of Socialism
In the real world, classical socialism has produced two offshoots: the confusingly named democratic socialism and social democracy. While they share many similarities, the major difference is that democratic socialism aims to replace capitalism, while social democracy aims to reform capitalism and make it more humane.
Democratic Socialism
Democratic socialism shares many of classical socialism’s goals but emphasizes getting there through elections — not revolution. It aims to establish central control of key parts of the economy while protecting some political freedom and most civil rights.
The vision of Democratic Socialism is collective (public) ownership of major industries like energy, transportation, manufacturing, and communications. The economy would be directed and managed by the government, but the government would be elected rather than authoritarian. Within individual industries there would be worker self-management and workplace democracy, and private businesses would still be allowed on a small scale—think Mom and Pop retail. It proposes gradual reform, not violent upheaval, while maintaining democracy and civil liberties.
There are several major drawbacks to democratic socialism. Progress can be slow, easily reversed, and still subject to bureaucratic inefficiencies. Competing globally with capitalist economies might also prove tough. To me, the major drawback is the question of how major corporations, financial institutions, and wealthy businesspeople could be convinced to peacefully hand over control of major portions of the economy to a “people’s collective”.
How it fits in the U.S.: Democratic Socialism has grown in popularity, especially among younger voters, although many of them appear to believe the label simply means making things fairer rather than endorsing the full program of Democratic Socialism.
Bernie Sanders and Alexandria Ocasio-Cortez wear the label proudly. Still, the idea of government control of a significant portion of the economy faces serious resistance here. Realistically, it’s more a movement that nudges policy leftward than a model ready for prime time.
Social Democracy: Capitalism with Guardrails
Social democracy takes a different track. It doesn’t want to abolish capitalism — it wants to civilize it. Think Scandinavia: private ownership, strong markets, but also universal healthcare, paid leave, and free college.
The central elements of Social Democracy are a mixed economy with both public and private sector control. In some models, there is direct government management of public services such as healthcare, energy, and transportation. In other models, these services remain under private control with strong government regulation.
Regardless of the chosen model, a Social Democracy is a strong welfare state with universal benefits. Welfare in this context means providing earned support for hard-working citizens. Perhaps it should be called an earned benefits state, as the term welfare has a pejorative implication for some.
There is strong market regulation to prevent unfair competition, price gouging, and monopolies that are detrimental to public good. There is a progressive tax program designed to reward productivity while heavily taxing passive or nonproductive income. These taxes are used to fund generous public services.
The government remains elected and responsive to the public. The model has proven to work: Nordic countries show that capitalism can coexist with equality and innovation. While it is expensive, and high taxes can be a political lightning rod, it leaves capitalism’s basic structure intact. There is a constant risk that inequality can creep back if protections weaken.
In the U.S. context: Social democracy may be the most realistic option. As social scientist Lane Kenworthy puts it, America already is a social democracy — just not a particularly generous one. We’ve got Medicare, Social Security, public education — we just underfund them compared to our European cousins. The reality is that income lost to increased taxation is regained through decreases in insurance premiums, healthcare costs, education expenses and retirement expenses.
With Elon Musk on the cusp of becoming the world’s first trillionaire, we have to ask: “How much is enough before they accept their social responsibility to the working people that made their wealth possible?” The bottom line is that when the ultra-wealthy are required to pay their fair share of taxes, public services become affordable. We should be supporting people, not yachts.
What’s Realistically Possible Here?
Culturally, Americans value freedom, competition, and property rights. Yet polls show younger voters are warming up to “socialism,” even if most don’t seem to be clear about the specifics. Institutionally, the U.S. political system makes sweeping change tough. Our winner-take-all elections favor a two-party system that leaves little room for socialist parties to grow independently.
Democratic Socialism may continue to shape the conversation, but full socialism — especially the classic Marxist kind — is not likely to take hold here. From my perspective, the most realistic option, Social Democracy, is too often overlooked in these discussions.
Given that, the path of least resistance looks like expanded Social Democracy: things like a revised and equitable tax code, universal healthcare, free or subsidized higher education, paid family leave, stronger labor laws, and public investment in infrastructure and green energy.
Social Democracy looks like the most attainable path — not a revolution, but an evolution toward a fairer society. Only time will tell.
When most people think of the American Revolution, they picture Continental soldiers marching across snowy battlefields or patriot militias defending their homes. But there’s another group that played a crucial role in securing American independence: the Continental Marines. These amphibious warriors served in America’s nascent naval force and proved their worth on both land and sea during the eight-year struggle for independence.
The Continental Marines, established in 1775, served as America’s first organized marine force during the Revolutionary War before being disbanded in 1783, laying the foundation for what would eventually become the modern U.S. Marine Corps. Though short-lived, the original Marine Corps played a significant role in America’s fight for independence, setting precedents that the modern Marine Corps still honors today.
The Legislative Foundation
By the fall of 1775, the American colonies were no longer engaged in mere protest—they were in open rebellion against the British Empire. Battles had already been fought at Lexington, Concord, and Bunker Hill. The Continental Congress, led by figures like John Adams, had begun to organize a Continental Army under George Washington’s command. But many in the Congress, especially Adams, believed a navy was also essential to challenge British power at sea and disrupt its supply lines.
With a navy, it was reasoned, must come Marines—soldiers trained to serve aboard ships, conduct landings, enforce discipline, and fight in close quarters during boarding actions. This model was based on the British Royal Marines, a corps with a long and respected tradition.
The Continental Marines came into existence through a resolution passed by the Second Continental Congress on November 10, 1775. This date, which Marines still celebrate today as their birthday, marked a pivotal moment in American military history.
The Continental Marine Act of 1775 decreed: “That two battalions of Marines be raised consisting of one Colonel, two lieutenant-colonels, two majors and other officers, as usual in other regiments; that they consist of an equal number of privates as with other battalions, that particular care be taken that no persons be appointed to offices, or enlisted into said battalions, but such as are good seamen, or so acquainted with maritime affairs as to be able to serve for and during the present war with Great Britain and the Colonies.”
The legislation was part of Congress’s broader effort to create a Continental Navy capable of challenging British naval supremacy. The resolution was drafted by future U.S. president John Adams and adopted in Philadelphia. This wasn’t just about creating another military unit—Congress recognized that naval warfare required specialized troops who could fight effectively both on ships and on shore. The concept wasn’t entirely new—European navies had long employed marines for similar purposes—but the Continental Marines represented America’s first organized attempt to create a professional amphibious force, though the term amphibious didn’t come into use in a military setting until the 1930s—they would likely have been informally referred to as a naval landing force.
Recruitment: From Taverns to the Fleet
The recruitment of the Continental Marines has become the stuff of legend, particularly the story of their traditional birthplace at Tun Tavern in Philadelphia. Though legend places its first recruiting post at Tun Tavern, historian Edwin Simmons surmises that it may as likely have been the Conestoga Waggon [sic], a tavern owned by the Nicholas family. Regardless of which tavern served as the primary recruiting station, the Marines can claim the unique distinction of being the only military branch “born in a bar”.
The first Commandant of the Marine Corps was Captain Samuel Nicholas, and his first Captain and recruiter was Robert Mullan, the owner of Tun Tavern. Samuel Nicholas, a Quaker-born Philadelphia native and experienced mariner, was commissioned on November 28, 1775, becoming the Continental Marines’ senior officer and only commandant throughout their existence. While his background as a Philadelphia tavern keeper may seem unusual for a military leader, his connections in the maritime community proved invaluable for recruiting. The requirement for maritime experience shaped the character of the force from its inception.
The Marines faced immediate recruitment challenges. Originally, Congress envisioned using the Marines for a planned invasion of Nova Scotia. They expected the Marines to draw personnel from George Washington’s Continental Army. However, Washington was reluctant to part with his soldiers, forcing the Marines to recruit independently, primarily from the maritime communities of Philadelphia and New York.
By December 1775, Nicholas had raised a battalion of approximately 300 men, organized into five companies, though this fell short of the original plan for two full battalions. Robert Mullan helped to assemble the fledgling fighting force. Plans to form the second battalion were suspended indefinitely after several British regiments of foot and cavalry landed in Nova Scotia, making the planned naval assault impossible.
Organization for Dual Service
The Continental Marines were organized as a flexible force capable of serving both aboard ships and on land. For shipboard service, Marines were organized into small detachments that could be distributed across the Continental Navy’s vessels. Their organization reflected their multi-purpose mission: they served as security forces protecting ship officers, repelling boarders and joining boarding parties during naval engagements, and as assault troops for amphibious operations. Marksmanship received particular emphasis—a tradition that continues to this day—as Marines often served as sharpshooters in naval engagements, targeting enemy officers and sailors from the rigging and fighting tops of ships.
During the Revolutionary War, the Continental Marines’ uniform directives specified a green jacket with white facings and cuffs. However, when the first sets of uniforms were actually ordered and delivered, red facings were substituted for white. The likely reason was supply availability: red cloth was easier to obtain from Continental or captured British stores. The most authoritative description comes from Captain Samuel Nicholas, who wrote from Philadelphia in 1776 that Marines were outfitted in “green coats faced with red, and lined with white”.
The uniform also included a high leather collar, or stock, ostensibly to protect the neck against sword slashes, although there is some evidence that it may actually have been intended to improve posture. This distinctive uniform item helped establish their identity as an elite force and eventually led to their treasured nickname “leathernecks”.
Shipboard Service and Naval Operations
The Continental Marines’ role aboard ship was multifaceted and crucial to naval operations. Their most important duty was to serve as onboard security forces, protecting the captain of a ship and his officers. During naval engagements, in addition to manning the cannons along with the crew of the ship, Marine sharpshooters were stationed in the fighting tops of a ship’s masts specifically to shoot the opponent’s officers and crew. These duties reflected centuries of naval tradition and drew on the example of the British Marines.
The Marines’ first major naval operation came in early 1776 when five companies joined Commodore Esek Hopkins’ Continental Navy squadron on its first cruise in the Caribbean. This deployment demonstrated their value as both shipboard security and assault troops, setting the pattern for their service throughout the war.
Major Land-Based Actions
Despite their naval origins, the Continental Marines proved equally effective in land combat. Their most famous early action was the landing at Nassau on the island of New Providence in the Bahamas in March 1776. The landing, the first by Marines on a hostile shore, was led by Captain Nicholas and consisted of 250 Marines and sailors. Within 13 days the Marines had captured two forts and the Government House, occupied Nassau, and seized cannons and large stores of supplies. While they missed capturing the gunpowder stores (which had been evacuated before their arrival), the raid demonstrated American capability to strike British positions anywhere.
Though modest in scale, this operation carried major symbolic weight and established the Marines as America’s premier amphibious force. The operation did not decisively alter the balance of the war, but it foreshadowed the Marines’ enduring identity as a seafaring, expeditionary force. Today, the Battle of Nassau is remembered less for the supplies seized than for what it represented: the moment the Continental Marines stepped onto the world stage.
Other notable operations included raids on British soil itself. In April 1778, Marines under the command of John Paul Jones made two daring raids, one at the port of Whitehaven, in northwest England, and the second later that day at St. Mary’s Isle. These operations brought the war directly to British territory, demonstrating American reach and resolve. While the raids had no strategic impact on the outcome of the war, they were a great morale booster when reports, though largely exaggerated, reached the rebellious colonies.
Official Marine Corps history also acknowledges Marine participation in the Battle of Princeton, though it wasn’t a major Marine engagement. Marines from Captain William Shippen’s company, who had been serving aboard Continental Navy ships, participated in this battle as a part of Cadwalader’s Brigade on Washington’s flank. Some Marines were detached to augment the artillery, with a few eventually transferring to the army. However, the Marines’ role was relatively minor compared to their more significant naval actions during this period.
The Gradual Decline
As the Revolutionary War progressed, the Continental Marines faced increasing challenges. Financial constraints plagued the Continental forces throughout the war, and the Marines were no exception. The Continental Congress struggled to fund and supply all military branches, and the relatively small Marine force often found itself at a disadvantage competing for resources with the larger Continental Army and Navy.
Recruitment became increasingly difficult as the war dragged on. After the early campaigns, Nicholas’s four-company battalion discontinued independent service, and remaining Marines were reassigned to shipboard detachments. Their number had been reduced by transfers, desertion, and the loss of eighty Marines through disease.
The Continental Navy also faced severe challenges that directly impacted the Marines. Many ships were captured, destroyed, or sold, leaving Marines without their primary operational platform. As the naval war shifted toward privateering and smaller-scale operations, the need for organized Marine units diminished.
Beginning in February 1777, two companies of Marines either transferred to Morristown to assume roles in the Continental artillery batteries or left the service altogether. This transfer of Marines to army artillery units reflected the practical reality that their specialized skills were needed elsewhere as the Continental forces adapted to changing circumstances.
Disbanded at War’s End
The end of the Revolutionary War marked the end of the Continental Marines as an organized force. Both the Continental Navy and Marines were disbanded in April 1783. Although a few individual Marines briefly stayed on to provide security for the remaining U.S. Navy vessels, the last Continental Marine was discharged in September 1783.
The last official act of the Continental Marines was escorting a stash of French silver crowns (coins) from Boston to Philadelphia—a loan from Louis XVI used to establish the Bank of North America. This final mission, conducted in 1781, symbolically linked the Marines to the new nation’s financial foundations even as their military role wound down.
The disbanding reflected broader American attitudes toward standing military forces. Having won their independence, Americans were skeptical of maintaining large military establishments that might threaten republican government. The Continental Congress, facing financial pressures and political opposition to permanent military forces, chose to disband both the Navy and Marines.
Legacy
The Continental Marines’ contribution to American independence was significant despite their small numbers. In all, over the course of 7 years of battle, the Continental Marines had only 49 men killed and just 70 more wounded, out of a total force of roughly 130 Marine Officers and 2,000 enlisted. These relatively low casualty figures reflected both their effectiveness and the limited size of the force.
Rising tensions with Revolutionary France in the late 1790s led to the Quasi-War, prompting Congress to reestablish the Navy in 1798. On July 11 of that year, President John Adams signed legislation formally creating the United States Marine Corps as a permanent branch of the military, under the jurisdiction of the Department of the Navy. This new Marine Corps inherited the traditions, mission, and esprit de corps of its Revolutionary War predecessors. Despite the gap between the disbanding of the Continental Marines and the establishment of the new United States Marine Corps, Marines honor November 10, 1775, as the official founding date of their Corps.
The Continental Marines established precedents that would shape American military doctrine for more than two centuries. The Revolutionary War not only led to the founding of the United States (Continental) Marine Corps but also highlighted for the first time the versatility for which Marines have come to be known. They fought on land, they fought at sea on ships, and they performed amphibious assaults.
The Continental Marines represented a crucial innovation in American military organization. Born from congressional resolution and tavern recruitment, these maritime warriors proved their worth in battles from the Caribbean to the British Isles. Though disbanded with the war’s end, their legacy lives on in the traditions and spirit of the modern Marine Corps. While their numbers were small and their existence brief, their impact on American military tradition proved lasting and significant.
Did you know that there was once an independent republic in the farthest reaches of northern New Hampshire, where the dense forests blend into the Canadian wilderness? Neither did I until I came across it in a fascinating book titled A Brief History of the World in 47 Borders by Jonn Elledge.
It was a short-lived but remarkable experiment in self-government. For three years in the 1830s, the settlers of a disputed border region declared themselves citizens of an independent republic—complete with their own constitution, legislature, and militia. They called it the Republic of Indian Stream, a name that today sounds almost mythical, yet it was a genuine, functioning democracy. Their story blends frontier improvisation, international diplomacy, and Yankee self-reliance—and it leaves us with a curious artifact: a constitution written not by statesmen in Philadelphia, but by farmers, loggers, and merchants caught between two competing nations.
A Territory in Limbo
The roots of the Indian Stream story go back to the Treaty of Paris (1783), which ended the American Revolution. The treaty defined the U.S.–Canada border but used vague geographic language—particularly the phrase “the northwesternmost head of the Connecticut River.” No one could agree which of several small tributaries the treaty meant.
The ambiguity created a slice of wilderness—about 200 square miles—claimed by both the United States and British Lower Canada (now Quebec). For decades, the region existed in a gray zone. Both countries sent tax collectors and law officers, both demanded military service, and neither provided clear legal protection. Residents couldn’t vote, hold secure property titles, or rely on either government’s courts. To make matters worse, they were sometimes forced to pay taxes twice—once to New Hampshire and once to Canada.
Origins of the Republic
By the late 1820s, frustration had reached a boiling point. Attempts to resolve the border dispute were unsuccessful—including arbitration by the King of the Netherlands in 1827, which failed when the United States rejected his decision favoring Great Britain.
With both sides still pressing their claims, the settlers decided they’d had enough of outside interference. On July 9, 1832, they convened a local meeting and declared independence, forming the Republic of Indian Stream. Their constitution—modeled on American state constitutions—began with a simple premise: authority rested with “the citizens inhabiting the territory.”
This wasn’t an act of rebellion but one of survival. The settlers wanted peace, order, and local control. Their goal was to withdraw from ambiguous regulation and to create a government that could function until the border question was finally settled.
The Constitution of Indian Stream
The constitution of the Republic, adopted the same day they declared sovereignty, was an impressively crafted document for a community of barely 300 people. It reflected the settlers’ familiarity with republican ideals and their determination to govern themselves fairly.
Key features included:
Democratic foundation: All authority stemmed from the citizens.
Annual elections: A single House of Representatives, elected each year, made the laws, and a magistrate acted as both executive and judge.
Judicial simplicity: Local justices of the peace handled disputes—there were no elaborate court hierarchies.
Individual rights: Residents enjoyed protections derived from U.S. constitutions—trial by jury, due process, and freedom from arbitrary arrest.
Defense and civic duty: Citizens pledged to defend their independence and assist one another in emergencies.
Despite its modest scale, the system worked. The republic passed laws, issued warrants, collected taxes, and even mustered a small militia to maintain order.
Life on the Frontier
Life in Indian Stream resembled that of many frontier communities: logging, farming, hunting, and trading. The land was rough, winters long, and access to distant markets limited. Yet the people thrived through cooperation and self-reliance. Trade with both Canadian and New Hampshire merchants continued—proof that practicality often trumped politics on the frontier.
The republic’s remote location provided a degree of safety from interference, but not immunity. Both British and American agents continued to assert claims, and occasional arrests or skirmishes kept tensions high.
The End of the Republic
The experiment in independence lasted only three years. In 1835, a dispute between an Indian Stream constable and a Canadian deputy sheriff triggered a diplomatic crisis. Canada sent troops to assert control, prompting New Hampshire’s governor to respond in kind.
Realizing they were caught between two competing governments, the citizens voted in April 1836 to accept New Hampshire’s jurisdiction. Indian Stream became part of the town of Pittsburg, and peace was restored.
The larger boundary issue wasn’t fully settled until the Webster–Ashburton Treaty of 1842, which formally placed Indian Stream within the United States.
Legacy of a Lost Republic
Today, little remains of the Republic of Indian Stream except New Hampshire Historical Marker #1 and a scattering of homesteads near the Connecticut Lakes.
Yet its legacy is profound. It may have lasted only three years, but its story reflects the broader American frontier experience: independence, inventiveness, and determination to live free from arbitrary rule. In an era defined by rigid borders and powerful states, the memory of Indian Stream reminds us that freedom once depended, not on lines on a map, but on the courage of people willing to draw their own lines.
The story also illustrates the complexities of nation-building in the early American period when borders remained fluid and communities sometimes had to forge their own path toward self-governance. While the republic was short lived, it stands as a testament to the ingenuity and determination of America’s frontier settlers, who refused to accept statelessness and instead chose to create their own nation in the wilderness.
The Indian Stream constitution reminds us that political order is not always imposed from above; sometimes, out of necessity, it is created from below. The settlers were neither revolutionaries nor idealists—they simply wanted clear rules, fair courts, and predictable taxes. Ordinary citizens, faced with legal chaos and neglect, designed a functioning democracy grounded in fairness and mutual responsibility.
That such a tiny community would craft its own constitution speaks to the enduring appeal of constitutional government in the early 19th century. Even on the edge of two empires, far from capitals and legislatures, these settlers turned to a familiar American solution: write it down, elect your leaders, and hold them accountable every year. Hopefully we will be able to keep their spirit and live up to the example of Indian Stream.
Few military emblems carry as much history and pride as the Marine Corps’ Eagle, Globe, and Anchor, better known as the EGA or simply as the emblem. New recruits and officer candidates work intensely to earn the right to wear this symbol. It is a source of immense pride for every Marine who achieves that distinction.
When entering the Corps, I encountered World War II veterans who affectionately called the EGA the “Birdie on the Ball.” But only Marines can take such liberties—outsiders risk offense if they use the term.
The emblem is instantly recognizable, yet few realize its deep historical roots or appreciate the transformations it has undergone to become the symbol every Marine wears today.
From Anchors to Eagles: The Early Years (1775–1868)
At its inception in 1775, the Continental Marines lacked any formal insignia. Some Marines, predominantly officers, adopted maritime icons such as the fouled anchor—an anchor entwined with rope—often emblazoned on buttons or hat plates. This design echoed the British Royal Navy and underscored their naval identity, but it was never standardized.
Uniform innovations began in the early 1800s. By 1804, Marines were using brass eagles mounted on square plates. During the War of 1812, octagonal plates appeared, embossed with eagles, anchors, drums, shields, and flags. Later designs were simplified to feature metal letters “U.S.M.” (United States Marines), reflecting the shift towards a national identity.
The example below is an officer’s coat button circa 1805-1820.
A more distinctive step came in 1821: the Corps adopted an eagle perched on a fouled anchor encircled by 13 stars, a motif featured on buttons for nearly four decades. However, similar symbols were also used by the Army and Navy, making it less than unique.
Following the Civil War, Marine Corps leadership under Brigadier General Jacob Zeilin, the seventh Commandant, sought a truly unique insignia for the service.
The Zeilin Board and the Birth of the Modern EGA (1868)
On November 12, 1868, Zeilin established a board of officers “To decide and report upon the various devices of cap ornaments of the Marine Corps.” They wasted no time: by November 19, the Secretary of the Navy, Gideon Welles, had approved the new emblem.
The board drew inspiration from the British Royal Marines’ “Globe and Laurel” emblem.
The American version added a few important touches:
Globe showing the Western Hemisphere: Representing the Corps’ defense of the Americas and a global presence.
Fouled anchor: Honoring the Corps’ naval origins.
Eagle: Symbolizing national service and pride.
Zeilin described the new emblem as representing the Corps’ “readiness to serve anywhere, by sea or land.”
At the same time, a distinct emblem was also created for Marine Corps musicians, still seen today on the formal red and gold uniforms of the U.S. Marine Band—“The President’s Own”.
The Motto and Later Refinements
The Latin motto, Semper Fidelis (“Always Faithful”), was introduced in 1883 under Commandant Charles McCawley, replacing previous mottoes such as Fortitudine (“With Fortitude”). Semper Fidelis became central to the Marine Corps’ ethos.
The emblem saw many variations over the decades. Initial designs featured a crested eagle—borrowed from European heraldry. Semper Fidelis appeared on a scroll held in the eagle’s beak on some versions of the emblem.
Only in 1954, with President Dwight D. Eisenhower’s Executive Order 10538, did the American bald eagle with a scroll officially become part of the emblem. This finalized the design used today.
Officer and Enlisted Differences
Since 1868, design distinctions have marked officer and enlisted EGA emblems. Officers’ original emblems were elaborate—frosted silver hemispheres with gold-plated Americas, crowned by a solid silver eagle. Enlisted emblems were brass, emphasizing practicality.
Modern officers wear a multi-piece, high-relief insignia with fine rope detailing, while enlisted Marines use a one-piece emblem. Notably, officers’ globes omit Cuba to strengthen the emblem structurally. A running joke among enlisted personnel is that officers couldn’t find Cuba on a map.
Before WWII, officers often purchased insignia from jewelers like Bailey, Banks & Biddle, resulting in stylistic inconsistency. One museum curator quipped, “World War I eagles looked like fat turkeys.” Eventually, standardization brought the crisp, clean look seen today.
A Legacy That Endures
From 18th-century anchors to the refined Eagle, Globe, and Anchor of today, the emblem tracks the Corps’ evolution from shipboard security to a global expeditionary force. Over centuries, its form has varied—engraved by jewelers, stamped for wartime, and cast in silver for dress blues—but its meaning remains constant.
Every Marine who earns the EGA joins a tradition stretching back 250 years, defined by courage, loyalty, and the enduring promise to remain Always Faithful.
When you think about the American Revolution, you probably picture dramatic battles like Bunker Hill or the crossing of the Delaware. But here’s something that might surprise you: the biggest killer during the war wasn’t British muskets—it was disease. And it’s not even close.
The Numbers Tell a Grim Story
Let’s talk numbers for a second. On the American side, about 6,800 soldiers died from battlefield wounds. Sounds terrible, right? Well, disease killed an estimated 17,000 to 20,000. That’s roughly three times as many. The British and their Hessian allies faced similar odds: around 7,000 combat deaths versus 15,000 to 25,000 disease deaths.
Think about that for a moment. You were actually safer charging into battle than hanging around camp. In some regiments, disease wiped out more than a third of the troops before they even saw their first fight.
Why Was Disease So Deadly?
Picture yourself in a Revolutionary War military camp. Hundreds of men crammed together in makeshift shelters, no running water, primitive latrines dug too close to where everyone lives, and basically zero understanding of what we’d call “germ theory” today. It’s a perfect storm for infectious disease.
The big killers were:
Smallpox was the heavyweight champion of camp diseases. This virus killed about 30% of people it infected and spread like wildfire through packed military camps. Soldiers tried to protect themselves through a risky practice called inoculation—basically giving themselves a mild case of smallpox on purpose by rubbing infected pus into cuts on their skin. Without proper quarantine procedures, though, this sometimes made outbreaks worse instead of better.
Typhus (called “camp fever” back then) spread through lice and fleas. If you’ve ever been on a prolonged camping trip and felt gross after a few days, imagine that times a hundred. Soldiers lived in the same clothes for weeks, rarely bathed, and the parasites just had a field day. The fever, headaches, and diarrhea that came with typhus made it one of the most dreaded camp diseases.
Dysentery (charmingly nicknamed “bloody flux”) came from contaminated water and poor sanitation. When your latrine is 20 feet from your water source and you don’t understand how disease spreads, this becomes pretty much inevitable. The severe diarrhea weakened soldiers to the point where many couldn’t fight even if they wanted to, and it made them even more susceptible to other diseases.
Malaria was especially important in the South, where mosquitoes thrived in the humid climate. This one actually played a fascinating role in how the war ended—but more on that in a bit.
When Disease Changed Everything
The 1776 invasion of Canada was a disaster largely because of smallpox. Out of 3,200 American soldiers in the Quebec campaign, 1,200 fell sick. You can’t mount much of an offensive when more than a third of your army is flat on their backs with fever. Similarly, during the siege of Boston, Washington couldn’t effectively engage the British because so many of his troops were sick with smallpox. These weren’t just setbacks—they were strategic catastrophes.
This is what pushed George Washington to make one of his boldest decisions in 1777: he ordered a mass inoculation of the Continental Army. This was controversial and dangerous at the time, but it worked. Washington had survived smallpox himself as a young man, so he understood both the risks and the benefits. The inoculation program probably saved the army from complete collapse.
Medical “Treatment” Was Often Worse Than Nothing
Here’s where things get really grim. Eighteenth-century medicine was basically medieval. Doctors believed in “balancing the humors” through bloodletting—literally draining blood from already weakened soldiers. They also gave powerful laxatives to people who were already suffering from diarrhea. Yeah, let that sink in.
Pain relief meant opium-based drinks or just straight alcohol. Some doctors used herbal remedies, but results were inconsistent at best. Quinine helped with malaria, though nobody really understood why. Mostly, if you got seriously sick, your survival came down to luck and a strong constitution.
Valley Forge: The Turning Point
Valley Forge is famous for being a brutal winter encampment, and disease was a huge part of why it was so terrible. Scabies left nearly half the troops unable to serve. Dysentery and camp fever killed somewhere between 1,700 and 2,000 soldiers during that single winter—and remember, these weren’t battle casualties. These men died from preventable diseases in what was supposed to be a safe encampment.
But Valley Forge taught the Continental Army a crucial lesson. After that nightmare winter, military leaders started taking sanitation seriously. They began focusing on camp hygiene, protecting water supplies, placing latrines away from living areas, and making sure soldiers could bathe and wash their clothes and bedding.
Baron von Steuben is famous for teaching the Continental Army how to march and drill, but he also deserves credit for implementing serious sanitation reforms. These changes helped prevent future disease outbreaks and kept the army functional for the rest of the war.
The Secret Weapon at Yorktown
Here’s one of my favorite historical details: mosquitoes may have helped win American independence. At Yorktown, roughly 30% of Cornwallis’s British army was knocked out by malaria and other diseases during the siege. The British commander was trying to hold off the American and French forces while also dealing with the fact that almost a third of his troops were too sick to fight.
Many American soldiers from the southern colonies had grown up with malaria and had some partial immunity. The British? Not so much. Some historians even think Cornwallis himself might have been suffering from malaria, which could have affected his decision-making. His second-in-command, Brigadier General Charles O’Hara, was definitely seriously ill during the siege. Fighting a war while you can barely stand is a pretty significant handicap.
The Bigger Picture
The American Revolution shows us something important: wars aren’t just won on battlefields. They’re won by the side that can keep its soldiers alive and healthy. Disease shaped strategic decisions, determined the outcomes of campaigns, and killed far more men than any British regiment ever did.
Washington’s decision to inoculate the army was genuinely revolutionary (pun intended). It showed a willingness to embrace controversial medical practices for the greater good. The sanitation reforms that came out of Valley Forge laid groundwork for modern military medicine and influenced public health policies in the new United States.
So next time someone mentions the American Revolution, remember: while we celebrate the military victories, one of the most important battles was fought against an enemy you couldn’t see—and for most of the war, nobody really knew how to fight it.
The casualty figures and major disease outbreaks are well-documented in historical records. The specific percentages and numbers are estimates based on historical research, as precise record-keeping was limited during this period. The overall narrative about disease being the primary cause of death is strongly supported by multiple historical sources.
Not long ago I was watching a news show and one of the panelists started talking about “a duel of words” that went on in a congressional hearing. I was intrigued by the use of the word duel and I thought I’d look into the history of this strange custom.
In the age before Twitter feuds, internet trolling, and legal settlements, honor was defended with pistols at dawn. The Code Duello, a set of rules governing dueling, offers a fascinating glimpse into how ideas of masculinity, reputation, and justice shaped public and private life in the Anglo-American world from the mid-18th century through the antebellum era.
The Code Duello emerged as one of the most distinctive and controversial aspects of genteel culture in the American colonies and the early United States. This elaborate system of honor-based combat, imported from European aristocratic traditions, would profoundly shape American society between 1750 and 1860, creating a culture where personal honor often trumped legal authority and where violence became a sanctioned means of dispute resolution among the elite.
European Origins
The Code Duello originated in Renaissance Italy and spread throughout European aristocratic circles as a means of settling disputes while maintaining social hierarchy. The practice reached the American colonies through British and Continental European settlers who brought with them deeply ingrained notions of honor, reputation, and gentlemanly conduct. Unlike random violence or brawling, dueling operated under strict protocols that emphasized courage, skill, and adherence to prescribed rituals.
The most influential codification was the Irish Code Duello of 1777, written by gentlemen of Tipperary and Galway. This twenty-six-rule system established procedures for issuing challenges, selecting weapons, determining conditions of combat, and defining acceptable outcomes. The code emphasized that dueling was a privilege of gentlemen, requiring both participants to be of equal social standing and ensuring that honor could only be satisfied through formal, regulated combat.
Colonial Implementation and Adaptation
The first recorded American duel occurred in 1621 in Plymouth, Massachusetts, between two servants, but the practice soon became the exclusive domain of elites as only “gentlemen” were considered to possess honor worth defending in this way.
The Irish Code Duello was widely adopted in America, though often with local variations. In 1838, former South Carolina Governor John Lyde Wilson published an “Americanized” version, known as the Wilson Code, which further codified the practice for the southern states and encouraged negotiated settlements. These codes served as the de facto law of honor, even as formal legal systems struggled to suppress dueling.
The practice gained prominence among the southern planter elite, as dueling fit well with that society’s emphasis on personal honor. The ritual was highly formal: challenges were issued in writing, seconds (assistants to the duelists) attempted to mediate, weapons were chosen, and terms were carefully negotiated.
Colonial dueling adapted European practices to American circumstances. While European duels often involved swords, reflecting centuries of aristocratic martial tradition, American duelists increasingly favored pistols, which were more readily available and required less specialized training. This shift democratized dueling to some extent, as pistol proficiency was more easily acquired than swordsmanship, though the practice remained largely restricted to the upper classes.
The Revolutionary War significantly expanded dueling’s influence. Military service brought together men from different regions and social backgrounds, spreading dueling customs beyond their original geographic and social boundaries. Officers who had learned European military traditions during the conflict carried these practices into civilian life, establishing dueling as a marker of martial virtue and gentlemanly status.
The Early Republic
Following independence, dueling became increasingly institutionalized in American society. The young republic’s political culture, characterized by intense partisan conflict and personal attacks in newspapers, created numerous opportunities for perceived slights to honor that demanded satisfaction through combat.
The most famous American duel occurred in 1804 when Aaron Burr killed Alexander Hamilton at Weehawken, New Jersey. This encounter exemplified both the power and the contradictions of dueling culture. Hamilton, despite philosophical opposition to dueling, felt compelled to accept Burr’s challenge to maintain his political viability. The duel’s outcome effectively ended Burr’s political career and demonstrated how adherence to the code could destroy the very honor it purported to defend.
Prior to becoming president, Andrew Jackson took part in at least three duels, although he is rumored to have been in many more. In his most famous duel, Jackson shot and killed a man who had insulted his wife. Jackson was also wounded in the duel and carried the bullet in his chest for the rest of his life.
Political dueling reached epidemic proportions in the antebellum period. Congressional representatives, senators, and other public figures regularly challenged opponents to combat over policy disagreements or personal insults. The practice became so common that some politicians deliberately provoked duels to enhance their reputation for courage, while others saw dueling as essential to maintaining credibility in public life.
Regional Variations and Social Dynamics
Dueling culture varied significantly across regions. The South developed the most elaborate and persistent dueling traditions, where the practice became intimately connected with concepts of honor, masculinity, and social hierarchy that would later influence Confederate military culture. Southern dueling codes often emphasized elaborate rituals and multiple exchanges of fire, reflecting a culture that viewed honor as more important than life itself.
Northern attitudes toward dueling were more ambivalent. While many Northern elites participated in dueling, the practice faced stronger opposition from religious groups, legal authorities, and emerging middle-class values that emphasized commerce over honor. Anti-dueling societies formed in several Northern cities, and some states enacted specific anti-dueling legislation, though enforcement remained inconsistent. Laws against the practice had been passed in several colonies as early as the mid-18th century, with harsh penalties including denial of Christian burial for duelists killed in combat. Clergy denounced it as un-Christian, and reformers sought to eradicate it, but dueling persisted, especially in regions where courts were weak or social hierarchies unstable. By contrast, the South, with its less institutionalized markets and governance, saw dueling as a quicker, more reliable way to settle disputes.
Western frontier regions adapted dueling to their own circumstances, often emphasizing practical marksmanship over elaborate ceremony. Frontier dueling tended to be less formal than Eastern practices, but it served similar functions in establishing social hierarchies and resolving disputes in areas where legal institutions remained weak.
Decline and Legacy
By the 1850s, dueling faced increasing opposition from legal, religious, and social reform movements. The rise of professional journalism, which could destroy reputations without resort to violence, provided alternative means of defending honor. Changing economic conditions that emphasized commercial success over martial virtue gradually undermined dueling’s social foundations.
The Civil War marked dueling’s effective end as a significant social institution. The massive scale of organized violence made individual combat seem anachronistic, while post-war society increasingly emphasized industrial progress over aristocratic honor. Though isolated duels continued into the 1870s, the practice lost its central role in American elite culture.
The Code Duello’s legacy extended far beyond its formal practice. It established patterns of violence, honor, and masculine identity that would influence American culture for generations, contributing to regional differences in attitudes toward violence and honor that persist today. The code’s emphasis on individual resolution of disputes also reflected broader American skepticism toward institutional authority, helping shape a culture that often preferred private justice to public law.
How the Code Duello Shaped Western Gunfighting Culture
The Code Duello was a script for settling personal disputes through controlled violence. Its influence waned in the East by the mid-1800s, but many of its ideas persisted, especially among military veterans, Southern transplants, and frontiersmen. As the American frontier expanded, the ethic of “settling scores” through personal combat found fertile ground in the West. What changed was the style and setting.
From Pistols at Dawn to High Noon
In the Code Duello, challenges were typically issued in writing, often with formal language and designated seconds. A duel was planned, often days in advance, and fought with flintlock pistols or swords. By contrast, gunfights in the Old West were more spontaneous, often provoked by insults, cheating, or long-standing feuds. Still, both forms were ultimately about defending personal honor in public view.
Gunfighters like Wild Bill Hickok and Wyatt Earp became mythologized partly because they embodied an honor-based culture in an environment where the law was weak or slow. In many ways, the Western gunfight was an informal, democratized version of the Code Duello, stripped of its aristocratic pretenses but keeping its emotional and symbolic core.
Myth vs. Reality
Ironically, formal duels were relatively rare in the actual Old West, and many “gunfights” were closer to ambushes or drunken brawls than ritualized combat. But dime novels, Wild West shows, and later Hollywood films reimagined them using a Code Duello-like template: two men meet face to face, in broad daylight, to resolve a conflict through a test of nerve and skill. The image of the high-noon shootout—with a silent crowd, an agreed time and place, and an implied code of fairness—is the Code Duello in cowboy boots, though such a showdown almost never actually took place.
The Duel That Never Was
I will end the discussion of the Code Duello with what may be one of the most unusual of all American dueling stories.
In 1842, Abraham Lincoln became embroiled in a public dispute with James Shields, the auditor of Illinois, largely over Illinois State banking policy and some satirical letters that mocked Shields. Shields took great offense to these attacks—particularly the ones written by Lincoln under the pseudonym “Rebecca”—and formally challenged Lincoln to a duel. According to the rules of dueling, Lincoln, as the one challenged, had the right to choose the weapons. He selected cavalry broadswords of the largest size to take advantage of his own height and reach over Shields.
The Duel’s Outcome
The duel was scheduled for September 22, 1842, on Bloody Island, a sandbar in the Mississippi River across from Alton, Illinois, on the Missouri side of the river—chosen because dueling was still legal there. On the day of the duel, before any blood was shed, Lincoln dramatically demonstrated his advantage by slicing off a high tree branch with his broadsword, showcasing his reach and physical prowess. After witnessing this and following subsequent negotiations by their seconds, Shields and Lincoln decided to call off the duel, resolving their differences without violence.
Legacy
Although the duel never resulted in violence, it became a notorious episode in Lincoln’s life, one he rarely spoke of later, even when asked about it. The event is commonly cited as a reflection of Lincoln’s quick wit, physical presence, and preference for peaceful resolution when possible. While Abraham Lincoln never actually fought a duel, he was briefly a participant in one of the more colorful near-duels of American political history.
A Final Thought
Perhaps the world would be a better place if we reinstituted some elements of the Code Duello and, instead of sending armies off to fight bloody battles, national leaders settled disputes by individual combat. I suspect there would be many more negotiated settlements.
How A Nobel Laureate Thinks We Can Save The American Economy…But It Won’t Be Easy
By John Turley
On October 19, 2025
I just finished People, Power, and Profits by Joseph Stiglitz — the Nobel Prize-winning economist. He wrote this near the end of Trump’s first term, but honestly, the world he describes feels even more relevant now.
Stiglitz doesn’t sugarcoat it: capitalism, as we’re practicing it today, is broken. Monopolies dominate markets, inequality has gone wild, and trust in democracy is running on fumes. His proposed fix? Something he calls progressive capitalism — capitalism with guardrails, conscience, and a sense of fairness.
Stiglitz makes the case that our economic system is rigged — not by accident, but by design. Here are his most compelling arguments and what he thinks we should do about them.
1. Taxation and Rent-Seeking: The Rigged Game
Stiglitz draws a sharp distinction between making money through productive work and extracting it through what economists call “rent-seeking” – essentially, using power to skim wealth without creating value. Think of a pharmaceutical company that buys a drug patent and jacks up prices 5,000%, or telecom monopolies that divide up markets to avoid competing.
His argument is straightforward: our tax system rewards the wrong behavior. Capital gains are taxed at lower rates than wages, which means someone living off investments pays less than someone working a regular job. Meanwhile, the wealthy can afford armies of accountants to exploit loopholes that most people don’t even know exist.
What Stiglitz recommends: Tax wealth more aggressively, especially inherited wealth. Close the capital gains loophole. Tax rent-seeking activities heavily while reducing taxes on productive work and innovation. The goal isn’t just revenue – it’s changing incentives so that the path to riches runs through creating value, not extracting it.
2. Green Energy and the True Cost of Pollution
Here’s where Stiglitz gets into what economists call “externalities” – costs that businesses impose on society without paying for them. When a coal plant spews carbon into the atmosphere, we all pay through climate change and increased healthcare costs, but the plant’s balance sheet looks great.
Stiglitz argues this is fundamentally dishonest accounting. If we properly priced pollution and carbon emissions, green energy wouldn’t need subsidies to compete – fossil fuels would suddenly look much more expensive once you factor in their real costs to society.
His recommendation: Implement carbon pricing – either through a carbon tax or cap-and-trade system. Make polluters pay for the damage they cause. This isn’t about punishing business; it’s about honest accounting. Once prices reflect reality, the market will naturally shift toward cleaner energy because it’s actually cheaper when you account for all the costs.
3. Big Business and Big Banks: Concentration of Power
Stiglitz has been particularly vocal about how corporate consolidation hurts everyone except shareholders and executives. His critique of “too big to fail” is sharp. He argues that concentrated economic power — in tech, finance, and even agriculture — undermines both democracy and efficiency. When a few firms dominate markets, they can suppress wages, block innovation, and bend regulations in their favor, gaining power over prices, workers, and even politics.
The banking sector especially concerns him. After the 2008 financial crisis, which was caused largely by reckless behavior from major banks, these same institutions emerged even larger through government-facilitated mergers. We allowed them to pass their losses on to taxpayers while keeping their gains as private profits.
His recommendations: Reinstate and strengthen regulations that were stripped away, including bringing back something like the Glass-Steagall Act that separated commercial and investment banking. Break up banks that are “too big to fail.” Strengthen antitrust enforcement across all industries. Use the government’s regulatory power to promote competition rather than letting industry consolidate.
4. Money in Politics: The Feedback Loop
This is where everything connects for Stiglitz. Concentrated economic power translates directly into political power. Wealthy interests fund campaigns, lobby relentlessly, and effectively write regulations for the agencies that are supposed to oversee them. This creates a vicious cycle: economic inequality begets political inequality, which creates policies that worsen economic inequality.
Stiglitz argues that the Supreme Court’s Citizens United decision, which allowed unlimited corporate spending in elections, turbocharged this problem by treating money as speech and corporations as people.
His recommendations: Limit campaign spending and institute public financing of campaigns to reduce candidates’ dependence on wealthy donors. Place strict limits on lobbying and implement a robust “revolving door” policy that prevents government officials from immediately cashing in with the industries they regulated. Mandate transparency requirements so voters know who’s funding what. Pass Constitutional amendments if necessary to overturn Citizens United.
The Big Picture
What makes Stiglitz’s argument powerful is how these pieces fit together. You can’t fix inequality just through taxation if big corporations control the political process. You can’t address climate change if fossil fuel companies can buy enough influence to block action. Everything is connected.
His recommendations aren’t radical in historical terms – they’re actually trying to restore a balance that existed during the post-war economic boom of the 1950s. Stiglitz’s “progressive capitalism” isn’t socialism. It’s capitalism with a conscience — one that remembers who it’s supposed to serve.
Whether you see that as a rescue plan or a recipe for red tape depends entirely on where you put your faith: in public institutions or private markets. The question is whether we have the political will to implement his recommendations despite entrenched opposition from those who benefit from the current system.
Either way, this debate isn’t going away — it’s the one shaping the 21st-century economy.