Grumpy opinions about everything.

Category: History

Grumpy opinions about American history

Bull Markets, Bear Markets and the Story Behind Wall Street’s Most Famous Animals

If you’ve ever caught a business news segment, you’ve probably heard anchors throwing around terms like “bull market” and “bear market” as if everyone just naturally knows what they mean. But beyond the basic idea that one’s good and one’s bad, the real mechanics of these market conditions—and how they got their animal nicknames—are pretty interesting.

How the Stock Market Works (The Quick Version)

Before we dive into bulls and bears, let’s cover the basics. The stock market is essentially a place where people buy and sell ownership stakes in companies. When you buy a share of stock, you’re purchasing a tiny piece of that company. The price of that share goes up or down based on how many people want to buy it versus how many want to sell it—classic supply and demand.

Companies sell shares to raise money for growth, and investors buy them hoping the company will do well and the stock price will increase. The overall “market” is tracked through indexes like the S&P 500 or Dow Jones Industrial Average, which measure how a group of major companies are performing. When most stocks are rising, we say the market is up; when most are falling, the market is down.

What Bull and Bear Markets Actually Mean

A bull market refers to a period when stock prices are rising, typically defined as a 20% or greater increase from recent lows. During bull markets, investors are optimistic, companies are generally doing well, and people are more willing to take risks with their money. Bull markets are usually driven by a strong economy, low inflation, and investor confidence. Think of the economic boom of the late 1990s or the recovery after the 2008 financial crisis—those were classic bull markets where prices kept climbing for years.

A bear market is essentially the opposite: a sustained decline in the stock market, usually defined as a price drop of 20% or more from recent highs over at least a two-month period. During bear markets, investors get nervous, sell off their holdings, and pessimism spreads. A decline of 10% to 20% is classified as a correction; every bear market starts out as a correction, but not every correction deepens into a bear market. The Great Depression, the 2008 financial crisis, and the COVID-19 pandemic’s initial impact all triggered bear markets.
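
If it helps to see those rule-of-thumb thresholds spelled out, here is a minimal sketch in Python. The function, its inputs, and the sample numbers are purely illustrative assumptions, not an official definition, and the sketch ignores the duration requirement mentioned above.

```python
def classify_market(recent_peak: float, recent_low: float, current: float) -> str:
    """Apply the rough 10%/20% rule-of-thumb labels described above."""
    decline_from_peak = (recent_peak - current) / recent_peak
    rise_from_low = (current - recent_low) / recent_low

    if decline_from_peak >= 0.20:
        return "bear market"   # down 20% or more from the recent peak
    if decline_from_peak >= 0.10:
        return "correction"    # down 10-20% from the recent peak
    if rise_from_low >= 0.20:
        return "bull market"   # up 20% or more from the recent low
    return "no clear trend"

# Hypothetical index: peaked at 5,000, bottomed at 3,800, now at 4,600.
# That is only 8% below the peak but 21% above the low, so this prints "bull market".
print(classify_market(5000, 3800, 4600))
```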

The Colorful History Behind the Terms

Now here’s where things get interesting. These terms didn’t come from some modern marketing genius—they trace back to 18th century London, and the story involves everything from old proverbs to violent animal fights to one of history’s biggest financial scandals.

The “bear” term came first. Etymologists point to an old proverb warning that it is not wise “to sell the bear’s skin before one has caught the bear.” This saying was about the foolishness of counting on something before you actually have it. By the early 1700s, traders who engaged in short selling (betting that prices would fall) were called “bear-skin jobbers” because they sold a bear’s skin—the shares—before catching the bear. The term eventually got shortened to just “bears.”

The real watershed moment came with the South Sea Bubble of 1720. The South Sea Company was a British joint-stock company founded by an act of Parliament in 1711; in 1720 it assumed most of the British national debt and convinced investors to trade their state annuities for company stock sold at a very high premium. When everything collapsed, share prices dropped dramatically, setting off an early “bear market,” and the episode became the subject of many literary works and went down in history as an infamous cautionary tale.

As for “bull,” the origins are a bit murkier. The first known instance of the market term “bull” popped up in 1714, shortly after the “bear” term emerged. Most historians think it arose as a natural counterpoint to “bear,” possibly influenced by bull-baiting and bear-baiting, two animal fighting sports popular at the time—though I should note that’s somewhat speculative.

There’s also a popular explanation about how the animals attack: bears swipe downward with their paws while bulls thrust upward with their horns, which nicely mirrors market movements. While that’s a helpful memory device, it’s probably more of a convenient coincidence or a retroactive description than the actual origin. The term “bull” originally meant a speculative purchase in the expectation that stock prices would rise, and was later applied to the person making such purchases.

Why This Still Matters

These metaphors have stuck around for three centuries because they work. They’re visceral and easy to remember—you can picture a charging bull or a hibernating bear without needing an economics degree. They also capture something real about market psychology: the aggressive optimism of bulls pushing prices up versus the defensive pessimism of bears hunkering down.

Understanding these terms helps you follow financial news and, more importantly, recognize when markets are shifting. Knowing you’re in a bull market might make you less surprised by rising prices, while recognizing a bear market can help you avoid panic-selling when things look grim.

“Bull market” and “bear market” are among those terms I’ve heard for years without ever knowing their origin. This article is my attempt to explain them to myself.

Sources:

The Sugar Act of 1764: The Tax Cut That Sparked a Revolution

Imagine a time when people rose up in protest over a tax being lowered. Welcome to the world of the Sugar Act.

The Sugar Act of 1764 stands as one of the most ironic moments in the history of taxation. Here Britain was actually lowering a tax, and yet colonists reacted with a fury that would help spark a revolution. To understand this paradox, we have to recognize that the new act represented something far more threatening than any previous British attempt to regulate its American colonies.

The Old System: Benign Neglect

For decades before 1764, Britain had maintained what historians call “salutary neglect” toward its American colonies. The Molasses Act of 1733 had imposed a steep duty of six pence per gallon on foreign molasses imported into the colonies. On paper, this seemed like a significant burden for the rum-distilling industry, which depended heavily on cheap molasses from French and Spanish Caribbean islands. In practice, though, the tax was rarely collected. Colonial merchants either bribed customs officials or simply smuggled the molasses past them. The British government essentially looked the other way, and everyone profited.

This informal arrangement worked because Britain’s primary interest in the colonies was commercial, not fiscal. The Navigation Acts required colonists to ship certain goods only to Britain and to buy manufactured goods from British merchants, which enriched British traders and manufacturers without requiring aggressive tax collection in America. As long as this system funneled wealth toward London, Parliament didn’t care much about collecting relatively small customs duties across the Atlantic.

Everything Changed in 1763

The Seven Years’ War (which Americans call the French and Indian War) changed this comfortable arrangement entirely. Britain won decisively, driving France out of North America and gaining vast new territories. But victory came with a staggering price tag. Britain’s national debt had nearly doubled to £130 million, and annual interest payments alone consumed half the government’s budget. Meanwhile, Britain now needed to maintain 10,000 troops in North America to defend its expanded empire and manage relations with Native American tribes.

Prime Minister George Grenville faced a political problem. British taxpayers, already heavily burdened, were in no mood for additional taxes. The logic seemed obvious: since the colonies had benefited from the war’s outcome and still required military protection, they should help pay for their own defense. Americans paid far lower taxes than their counterparts in Britain—by some estimates, British residents paid 26 times more per capita in taxes than colonists did.

What the Act Actually Did

The Sugar Act (officially the American Revenue Act of 1764) approached colonial taxation differently than anything before it. First, it cut the duty on foreign molasses from six pence to three pence per gallon—a 50% reduction. Grenville calculated, reasonably, that merchants might actually pay a three-pence duty rather than risk getting caught smuggling, whereas the six-pence duty had been so high it encouraged universal evasion.

But the Act did far more than adjust molasses duties. It added or increased duties on foreign textiles, coffee, indigo, and wine imported into the colonies. It tightened regulations around the colonial lumber trade and banned the import of foreign rum entirely. Most significantly, the Act included elaborate provisions designed to strictly enforce these duties for the first time.

The enforcement mechanisms represented the real revolution in British policy. Ship captains now had to post bonds before loading cargo and had to maintain detailed written cargo lists. Naval patrols increased dramatically. Smugglers faced having their ships and cargo seized.

Significantly, the burden of proof was shifted to the accused: they were required to prove their innocence, a reversal of traditional British justice. Most controversially, accused smugglers would be tried in vice-admiralty courts, which had no juries and whose judges received a cut of any fines levied.

The Paradox of the Lower Tax

So why did colonists react so angrily to a tax cut? The answer reveals the fundamental shift in the British-American relationship that the Sugar Act represented.

First, the issue wasn’t the tax rate. It was the certainty of collection. A six-pence tax that no one paid was infinitely preferable to a three-pence tax rigorously enforced. New England’s rum distilling industry, which employed thousands of distillery workers and sailors, depended on cheap molasses from the French West Indies. Even at three pence per gallon, the tax significantly increased operating costs. Many merchants calculated they couldn’t remain profitable if they had to pay it.

Second, and more importantly, colonists recognized that the Act’s stated purpose marked a fundamental change in the relationship. Previous trade regulations, even if they involved taxes, were ostensibly about regulating commerce within the empire. The Sugar Act openly stated its purpose was raising revenue—the preamble declared it was “just and necessary that a revenue be raised” in America. This might seem like a technical distinction, but to colonists it mattered enormously. British constitutional theory held that subjects could only be taxed by their own elected representatives. Colonists elected representatives to their own assemblies but sent no representatives to Parliament. Trade regulations fell under Parliament’s legitimate authority to govern imperial commerce, but taxation for revenue was something else entirely.

Third, the enforcement mechanisms offended colonial sensibilities about justice and traditional British rights. The vice-admiralty courts denied jury trials, which colonists viewed as a fundamental right of British subjects. Having to prove your innocence rather than being presumed innocent violated another core principle. Customs officials and judges profiting from convictions created obvious incentives for abuse.

Implementation and Colonial Response

The Act took effect in September 1764, and Grenville paired it with an aggressive enforcement campaign. The Royal Navy assigned 27 ships to patrol American waters. Britain appointed new customs officials and instructed them to do their jobs strictly rather than accept bribes. The vice-admiralty court in Halifax, Nova Scotia, became particularly notorious: colonists had to travel hundreds of miles to defend themselves before a court with no jury and a judge whose income came from convictions.

Colonists responded immediately. Boston merchants drafted a protest arguing that the act would devastate their trade. They explained that New England’s economy depended on a complex triangular trade: they sold lumber and food to the Caribbean in exchange for molasses, which they distilled into rum, which they sold in Africa for enslaved people, who were in turn sold to Caribbean plantations for more molasses, and the cycle repeated. Taxing molasses would break this chain and impoverish the region.

But the economic arguments quickly evolved into constitutional ones. Lawyer James Otis argued that “taxation without representation is tyranny”—a phrase that would echo through the coming decade. Colonial assemblies began passing resolutions asserting their exclusive right to tax their own constituents. They didn’t deny Parliament’s authority to regulate trade, but they drew a clear line: revenue taxation required representation.

The protests went beyond rhetoric. Colonial merchants organized boycotts of British manufactured goods. Women’s groups pledged to wear homespun cloth rather than buy British textiles. These boycotts caused enough economic pain in Britain that London merchants began lobbying Parliament for relief.

The Road to Revolution

The Sugar Act’s significance extends far beyond its immediate economic impact. It established precedents and patterns that would define the next decade of imperial crisis.

Most fundamentally, it shattered the comfortable arrangement of salutary neglect. Once Britain demonstrated it intended to actively govern and tax the colonies, the relationship could never return to its previous informality. The colonists’ constitutional objections—no taxation without representation, right to jury trials, presumption of innocence—would be repeated with increasing urgency as Parliament passed the Stamp Act (1765), Townshend Acts (1767), and Tea Act (1773).

The Sugar Act also revealed the practical difficulties of governing an empire across 3,000 miles of ocean. The vice-admiralty courts became symbols of distant, unaccountable power. When colonists couldn’t get satisfaction through established legal channels, they increasingly turned to extralegal methods, including committees of correspondence, non-importation agreements, and eventually armed resistance.

Perhaps most importantly, the Sugar Act forced colonists to articulate a political theory that ultimately proved incompatible with continued membership in the British Empire. Once they agreed to the principle that they could only be taxed by their own elected representatives, and that Parliament’s authority over them was limited to trade regulation, the logic led inexorably toward independence. Britain couldn’t accept colonial assemblies as co-equal governing bodies since Parliament claimed supreme authority over all British subjects. The colonists couldn’t accept taxation without representation since they claimed the rights of freeborn Englishmen. These positions couldn’t be reconciled.

The Sugar Act of 1764 represents the point where the British Empire’s century-long success in North America began to unravel. By trying to make the colonies pay a modest share of imperial costs through what seemed like reasonable means, Britain inadvertently set in motion forces that would break the empire apart just twelve years later.

Sources

Mount Vernon Digital Encyclopedia – Sugar Act https://www.mountvernon.org/library/digitalhistory/digital-encyclopedia/article/sugar-act/ Provides overview of the Act’s provisions, economic context, and relationship to British debt from the Seven Years’ War. Includes information on tax burden comparisons between Britain and the colonies.

Britannica – Sugar Act https://www.britannica.com/event/Sugar-Act Covers the specific provisions of the Act, enforcement mechanisms, vice-admiralty courts, and the shift from the Molasses Act of 1733. Useful for technical details of the legislation.

History.com – Sugar Act https://www.history.com/topics/american-revolution/sugar-act Discusses colonial constitutional objections, the “taxation without representation” argument, and the enforcement provisions including burden of proof reversal and jury trial denial.

American Battlefield Trust – Sugar Act https://www.battlefields.org/learn/articles/sugar-act Details colonial response including boycotts, James Otis’s arguments, and the triangular trade system that the Act disrupted.

Additional Recommended Sources

Library of Congress – The Sugar Act https://www.loc.gov/collections/continental-congress-and-constitutional-convention-broadsides/articles-and-essays/continental-congress-broadsides/broadsides-related-to-the-sugar-act/ Primary source collection including contemporary colonial broadsides and protests against the Act.

National Archives – The Sugar Act (Primary Source Text) https://founders.archives.gov/about/Sugar-Act The actual text of the American Revenue Act of 1764, useful for verifying specific provisions and language.

Yale Law School – Avalon Project: Resolutions of the Continental Congress (October 19, 1765) https://avalon.law.yale.edu/18th_century/resolu65.asp Colonial responses to the Sugar and Stamp Acts, showing how the arguments evolved.

Massachusetts Historical Society – James Otis’s Rights of the British Colonies Asserted and Proved (1764) https://www.masshist.org/digitalhistory/revolution/taxation-without-representation Primary source for the “taxation without representation” argument that emerged from Sugar Act opposition.

Colonial Williamsburg Foundation – Sugar Act of 1764 https://www.colonialwilliamsburg.org/learn/deep-dives/sugar-act-1764/ Discusses economic impact on colonial merchants and the rum distilling industry.

Slavery and the Constitutional Convention: The Compromise That Shaped a Nation

When fifty-five delegates gathered in Philadelphia during the sweltering summer of 1787, they faced a challenge that would haunt American politics for the next eight decades. The question wasn’t whether slavery was morally right—many delegates privately acknowledged its evil—but whether a unified nation could exist with slavery as a part of it. That summer, the institution of slavery nearly killed the Constitution before it was born.

The Battle Lines

The convention revealed a stark divide. On one side stood delegates who spoke forcefully against slavery, though they represented a minority voice. Gouverneur Morris of Pennsylvania delivered some of the most scathing condemnations, calling slavery a “nefarious institution” and “the curse of heaven on the states where it prevailed.” According to James Madison’s notes, Morris argued passionately that counting enslaved people for representation would mean that someone “who goes to the Coast of Africa, and in defiance of the most sacred laws of humanity tears away his fellow creatures from their dearest connections & damns them to the most cruel bondages, shall have more votes in a Government instituted for protection of the rights of mankind.”

Luther Martin of Maryland, himself a slaveholder, joined Morris in opposition. He declared the slave trade “inconsistent with the principles of the revolution and dishonorable to the American character.” Even George Mason of Virginia, who owned over 200 enslaved people, denounced slavery at the convention, warning that “every master of slaves is born a petty tyrant” and that it would bring “the judgment of heaven on a country.”

The Southern Coalition

Facing these critics stood delegates from the Deep South—primarily South Carolina and Georgia—who made it abundantly clear that protecting slavery was non-negotiable. The South Carolina delegation was particularly unified and aggressive in defending the institution. All four of their delegates—John Rutledge, Charles Pinckney, Charles Cotesworth Pinckney, and Pierce Butler—owned slaves, and they spoke with one voice.

Charles Cotesworth Pinckney stated bluntly: “South Carolina and Georgia cannot do without slaves.” John Rutledge framed it even more starkly: “The true question at present is, whether the Southern States shall or shall not be parties to the Union.” The message was unmistakable—attempt to restrict slavery, and there would be no Constitution and perhaps no United States.

The Southern states didn’t just defend slavery; they threatened to walk out repeatedly. When debates over the slave trade heated up on August 22, delegates from North Carolina, South Carolina, and Georgia stated they would “never be such fools as to give up” their right to import enslaved Africans.  These weren’t idle threats—they were credible enough to force compromise.

The Three-Fifths Compromise

The central flashpoint came over representation in Congress. The new Constitution would base representation on population, but should enslaved people count? Southern states wanted every enslaved person counted fully, which would dramatically increase their congressional power. Northern states argued that enslaved people—who had no rights and couldn’t vote—shouldn’t count at all.

The three-fifths ratio had actually been debated before. Back in 1783, Congress had considered using it to calculate state tax obligations under the Articles of Confederation, though that proposal failed. James Wilson of Pennsylvania resurrected the idea at the Constitutional Convention, suggesting that representation be based on the free population plus three-fifths of “all other persons”—the euphemism they used to avoid writing the word “slave” in the Constitution.

The compromise passed eight states to two. New Jersey and Delaware are generally identified as the states voting against it, while New Hampshire is not recorded as taking part in the vote. Rhode Island had not sent a delegation to the convention, and by the time of the vote New York no longer had a functioning delegation.

Though the South ultimately accepted the compromise, it wasn’t what they wanted. Southern delegates had pushed to count enslaved people fully, on equal terms with free persons, for purposes of representation while ignoring them on every other question of human rights. The three-fifths ratio was a reduction from their demands—a limitation on slave-state power, though it still gave them a substantial advantage. With about 93% of the nation’s enslaved population concentrated in just five southern states, this compromise increased the South’s congressional delegation by 42%.
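
Since the counting rule is basically arithmetic, here is a minimal back-of-the-envelope sketch in Python. The population figures are hypothetical, chosen only to illustrate how counting enslaved people at zero, three-fifths, or full weight changes a state’s population for apportionment purposes; they are not census data.

```python
def apportionment_population(free: int, enslaved: int, ratio: float) -> float:
    """Population counted toward House representation for a given counting ratio."""
    return free + ratio * enslaved

# Hypothetical slave-state figures (illustrative only, not 1790 census numbers)
free_pop, enslaved_pop = 450_000, 300_000

northern_position = apportionment_population(free_pop, enslaved_pop, 0.0)    # count no enslaved people
the_compromise    = apportionment_population(free_pop, enslaved_pop, 3 / 5)  # the three-fifths ratio
southern_demand   = apportionment_population(free_pop, enslaved_pop, 1.0)    # count everyone fully

print(northern_position, the_compromise, southern_demand)
# 450000.0 630000.0 750000.0 -- the compromise gives this hypothetical state 40% more
# representation weight than counting free persons alone, though less than the South wanted.
```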

James Madison later recognized the compromise’s significance. He wrote after the convention: “It seems now to be pretty well understood that the real difference of interests lies not between the large and small but between the northern and southern states. The institution of slavery and its consequences form the line of discrimination.”

Could the Constitution Have Happened Without It?

Here’s where I need to speculate, but I’m fairly confident in this assessment: no, the Constitution would not have been ratified without the three-fifths compromise and related concessions on slavery.

The evidence is overwhelming. South Carolina and Georgia delegates stated explicitly and repeatedly that they would not join any union that restricted slavery. Alexander Hamilton himself later acknowledged that “no union could possibly have been formed” without the three-fifths compromise. Even delegates who despised slavery, like Roger Sherman of Connecticut, argued it was “better to let the Southern States import slaves than to part with them.”

The convention negotiated three major slavery compromises, all linked. Beyond the three-fifths clause, they agreed Congress couldn’t ban the international slave trade until 1808, and they included the Fugitive Slave Clause requiring the return of escaped enslaved people even from free states. These deals were struck together on August 29, 1787, in what Madison’s notes reveal was a package negotiation between northern and southern delegates.

Without these compromises, the convention would likely have collapsed. The alternative wouldn’t have been a better Constitution—it would have been no Constitution at all, potentially leaving the thirteen states as separate nations or weak confederations. Whether that would have been preferable is a profound counterfactual question that historians still debate.

The Impact on Early American Politics

The three-fifths compromise didn’t just affect one document—it shaped American politics for decades. Its effects were immediate and substantial.

The most famous early example came in the presidential election of 1800. Thomas Jefferson defeated John Adams in what’s often called the “Revolution of 1800,” the first peaceful transfer of power between opposing political parties. But Jefferson’s victory owed a direct debt to the three-fifths compromise. Virginia’s enslaved population gave the state extra electoral votes that proved decisive. Historian Garry Wills has speculated that without these additional slave-state votes, Jefferson would have lost. Pennsylvania had a free population 10% larger than Virginia’s, yet received 20% fewer electoral votes because Virginia’s numbers were inflated by the compromise.

The impact extended far beyond that single election. Research shows the three-fifths clause changed the outcome of over 55% of legislative votes in the Sixth Congress (1799-1801). The additional southern representatives—about 18 more than their free population warranted—gave the South what became known as the “Slave Power” in Congress.

This power influenced major legislation throughout the antebellum period. The Indian Removal Act of 1830, which forcibly relocated Native Americans to open land for plantation agriculture, passed because of margins provided by these extra southern representatives. The Missouri Compromise, the Kansas-Nebraska Act, and numerous other slavery-related measures bore the fingerprints of this constitutional imbalance.

The compromise also affected Supreme Court appointments and federal patronage. Southern-dominated Congresses ensured pro-slavery justices and policies that protected the institution. The sectional tensions it created led directly to later compromises—the Missouri Compromise of 1820, the Compromise of 1850—each one a temporary bandage on a wound that wouldn’t heal.

By the 1850s, the artificial political power granted to slave states had become intolerable to many northerners. When Abraham Lincoln won the presidency in 1860 without carrying a single southern state, southern political leaders recognized they had lost control of the federal government. Senator Louis Wigfall of Texas complained that non-slaveholding states now controlled Congress and the Electoral College. Eleven southern states seceded, in large part because they believed the three-fifths compromise no longer protected their interests.

The Bitter Legacy

The framers consciously avoided using the words “slave” or “slavery” in the Constitution, recognizing it would “sully the document.” But the euphemisms fooled no one. They had built slavery into the structure of American government, trading moral principles for political union.

The Civil War finally resolved what the Constitutional Convention had delayed. The Thirteenth Amendment abolished slavery in 1865, but not until 1868 did the Fourteenth Amendment finally strike the three-fifths clause from the Constitution, requiring that representation be based on counting the “whole number of persons” in each state.

Was it worth it? That’s ultimately a question of values. The Constitution created a stronger national government that eventually abolished slavery, but it took 78 years and a war that killed over 600,000 Americans. As Thurgood Marshall noted on the Constitution’s bicentennial, the framers “consented to a document which laid a foundation for the tragic events which were to follow.”

The convention delegates knew what they were doing. They chose union over justice, pragmatism over principle. Whether that choice was necessary, wise, or moral remains one of the most contested questions in American history.

____________________________________________________

Sources

  1. https://www.battlefields.org/learn/articles/slavery-and-constitution
  2. https://en.wikipedia.org/wiki/Luther_Martin
  3. https://schistorynewsletter.substack.com/p/7-october-2024
  4. https://www.americanacorner.com/blog/constitutional-convention-slavery
  5. https://www.nps.gov/articles/000/constitutionalconvention-august22.htm
  6. https://en.wikipedia.org/wiki/Three-fifths_Compromise
  7. https://www.brennancenter.org/our-work/analysis-opinion/electoral-colleges-racist-origins
  8. https://www.gilderlehrman.org/history-resources/teaching-resource/historical-context-constitution-and-slavery
  9. https://www.nps.gov/articles/000/constitutionalconvention-august29.htm
  10. https://www.lwv.org/blog/three-fifths-compromise-and-electoral-college
  11. https://www.aaihs.org/a-compact-for-the-good-of-america-slavery-and-the-three-fifths-compromise-part-ii/

Life Below Deck: Enlisted Sailors in America’s Continental Navy

When the Continental Congress established America’s first navy in October 1775, they faced a daunting challenge: how do you build a fleet from scratch when you’re fighting the world’s most powerful naval force? The Continental Navy peaked at around 3,000 men serving on approximately 30 ships, a tiny force compared to Britain’s massive Royal Navy. But who were these sailors who were willing to risk their lives for a fledgling republic?

Where They Came From

The colonial maritime community had extensive seafaring experience, as much of British trade was carried in American vessels, and North Americans made up a significant portion of the Royal Navy’s seamen. Continental Navy sailors came primarily from port cities along the Atlantic coast, particularly New England communities where maritime trades were a way of life. Many had worked as merchant sailors, fishermen, or privateers before joining.

The naval service was notably diverse for its time, including native-born Americans, British deserters, free and enslaved Black sailors, and European immigrants. Unlike the Continental Army, which had periods of banning Black soldiers or sometimes placing them in segregated regiments, the Continental Navy was mostly integrated. At sea, there was less distinction between free and enslaved sailors, and those held in bondage had opportunities to work toward freedom. This maritime tradition of relative equality distinguished naval service from other Revolutionary War experiences.

Getting Into the Service

Recruiting sailors proved to be one of the Continental Navy’s biggest headaches. Navy boards supervised the appointment of petty officers and the enlistment of seamen, though these duties were chiefly performed by ship commanders or recruiting agents. The first Marine recruiting station was Tun Tavern, a public house in Philadelphia.

Enlistment was generally voluntary, though the line between volunteering and impressment—forced service—was sometimes blurred. Recruiting parties would scour port towns seeking able-bodied men, advertising not only pay but also the possibility of capturing British prizes for sale, with proceeds shared among the crew—a powerful incentive.

The problem was competition. Privateering—private ships licensed by Congress to seize enemy vessels—was far more attractive to sailors because cruises were shorter and pay could be better. With over 2,000 privateers operating during the war, the Continental Navy struggled constantly to maintain adequate crew sizes. Continental captains often found themselves unable to man their ships due to privateers’ superior inducements.

Landsmen, Seamen, and Petty Officers

At the bottom rung of a Navy crew stood the landsman—a recruit with little or no sea experience. Many were farm boys or tradesmen who had never set foot on a ship. Their days were filled with the hardest labor: hauling ropes, scrubbing decks, and learning basic seamanship.

Above them were ordinary seamen, who had some experience afloat, and the more skilled able seamen who knew their way around sails, rigging, and naval gunnery. These sailors formed the backbone of the Continental Navy. Sailors skilled in managing the ship’s rigging were said to “know the ropes.” Without their knowledge of wind, tide, and timber, ships would have been little more than floating platforms.

The most experienced enlisted men were promoted to petty officers. These weren’t commissioned officers but rather specialists and leaders—boatswain’s mates directing rigging crews, gunner’s mates overseeing cannon fire, and carpenters’ mates keeping the wooden hulls afloat. They were the Navy’s “non-commissioned officers,” long before the U.S. Navy had a formal NCO corps.

Most Continental Navy ships also carried detachments of Continental Marines. These enlisted men were soldiers at sea, tasked with keeping order on deck, manning small arms in combat, and leading boarding parties.

What They Wore

Unlike officers who had prescribed uniforms, enlisted sailors received no standard clothing from the Continental Navy. Due to meager funds and lack of manufacturing capacity, sailors generally provided their own clothing, usually consisting of pantaloons often tied at the knee or knee breeches, a jumper or shirt, neckerchief, short waisted jacket, and low crowned hats. Most sailors went barefoot, and a kerchief was worn either as a sweat band or as a simple collar closure. The short trousers served a practical purpose—they didn’t interfere with climbing the ship’s rigging. This lack of uniforms reflected the Continental Navy’s financial struggles, where everything from ships to ammunition took priority over standardized clothing.

Daily Life at Sea

Shipboard duties for enlisted sailors were grueling and dangerous. Landsmen cleaned the deck, helped raise or lower the anchor, worked in the galley, and assisted other crew members. More experienced sailors handled the complex work of managing sails, operating guns during combat, standing watch, and maintaining the vessel. Specialized roles were filled by experienced hands, and most sailors worked long shifts in harsh conditions, often enduring crowded, wet, and unsanitary quarters below deck.

Living conditions were cramped. Sailors lived in close quarters with limited privacy, shared hammocks on the lower decks, and endured monotonous food rations. Meals were simple, based on salted meat, ship’s biscuit, and whatever could be supplemented from local ports or captured prizes. Leisure was rare, and recreation was often limited to singing, storytelling, or gambling. The work was physically demanding and accidents were common—falling from rigging, being crushed by shifting cargo, or drowning were constant risks.

Discipline and Relations with Officers

Discipline in the Continental Navy was deeply influenced by the British Royal Navy and the “ancient common law of the sea.” The Continental Congress issued articles governing naval discipline, empowering officers to maintain strict order and punish infractions including drunkenness, blasphemy, theft, or disobedience. Punishments included wearing a wooden collar, spending time in irons, receiving pay deductions, confinement on bread and water, or, for serious offenses, flogging.

Flogging was often done with a multi-thonged whip known as the cat o’ nine tails. The most common flogging consisted of between 12 and 24 lashes, though mutineers might receive sentences in the hundreds of lashes—often becoming a death sentence.

Even though officers held absolute authority aboard their vessels, the Continental Navy sometimes suffered from severe discipline problems. Some commanders found it impossible to maintain control over squadrons made up of crews recruited from one area and commanded by officers from another. The relationship between officers and enlisted men reflected the social hierarchies of the time, with a clear divide between the educated officer class and working-class sailors. However, the shared dangers of combat and the sea could create bonds that transcended these divisions.

A Brief but Important Legacy

Enlisted sailors of the Continental Navy came from diverse and often hardscrabble backgrounds, shaped by the hard labor and hazards of maritime life. These men, whose names are mostly lost to history, formed the foundation of America’s first navy and contributed profoundly—through sacrifice and service—to the establishment of American independence.

Of approximately 65 vessels that served in the Continental Navy, only 11 survived the war, and by 1785 Congress had disbanded the Navy and sold the remaining ships. Despite its short existence and limited impact on the war’s outcome, the sailors of the Continental Navy created a foundation for American naval tradition and provided trained seamen who would serve in future conflicts.

Sources:

Personal note: The Grumpy Doc proudly served as an enlisted sailor in the U.S. Navy from 1967 to 1974.

Thomas Jefferson: The Philosopher Who Played Hardball

Here’s the thing about Thomas Jefferson that doesn’t always make it into the history textbooks: the guy who wrote those soaring words about liberty and limited government? He was also one of early America’s most skilled—and sometimes underhanded—political operators.

It’s surprising when you think about it. Jefferson genuinely believed in transparency, virtue in public life, and keeping government small. He wrote beautifully about these ideals. But when it came to actual politics? He played the game as hard as anyone, often using tactics that directly contradicted what he preached.

Jefferson’s public philosophy was straightforward. He thought America should be a nation of independent farmers—regular people who owned their own land and weren’t dependent on anyone else. He worried constantly about concentrated power, whether in government or in the hands of wealthy financiers or merchants. He believed people should be informed and engaged, and that government worked best when it stayed out of people’s lives.

His Declaration of Independence wasn’t just pretty rhetoric—it laid out a genuinely revolutionary idea: governments only have power because people agree to give it to them, and when governments stop serving the people, those people have the right to change things.

The Reality: How Jefferson Actually Operated

Here’s where it gets interesting. While Jefferson was writing about virtue and transparency, he was simultaneously running what today we’d recognize as opposition research, planting stories in the press, and organizing political operations—sometimes against people he was supposed to be working with.

The Freneau Setup: Paying for Attacks

The most blatant example happened in 1791. Jefferson was serving as Secretary of State under George Washington, which meant he was part of the administration. At the same time, he arranged for a guy named Philip Freneau to get a government job—technically as a translator. The real purpose? To give Freneau money to run a newspaper that would relentlessly attack Alexander Hamilton and other Federalists.

Think about that for a second. Jefferson was using his government position to fund media attacks on his own colleagues. When people called him out on it, he basically said, “Who, me? I have nothing to do with what Freneau publishes.” But the evidence shows Jefferson was actively encouraging and directing these attacks.

John Beckley: The Original Campaign Fixer

Jefferson also worked closely with John Beckley, who was essentially America’s first professional political operative. Beckley coordinated messaging, spread information (and sometimes misinformation) about opponents, and helped build the grassroots organization that would eventually become the Democratic-Republican Party.

This wasn’t a gentlemanly debate about ideas. This was organized political warfare—pamphlets, coordinated newspaper campaigns, and opposition research. Jefferson and James Madison quietly funded much of this work while maintaining public images as above-the-fray philosophers. We can’t know exactly what Jefferson said in every private conversation with Beckley, but the circumstantial evidence of coordination is convincing.

The Hamilton Rivalry: Ideological War

Jefferson’s conflict with Hamilton was both philosophical and deeply personal. Hamilton wanted a strong federal government, a national bank, and close ties with Britain. Jefferson saw all of this as a betrayal of the Revolution—a step toward creating the same kind of corrupt, elite-dominated system they’d just fought to escape.

But rather than just making his arguments publicly, Jefferson worked behind the scenes to undermine Hamilton’s policies. He encouraged Madison to lead opposition in Congress. He fed stories to friendly newspapers. He coordinated with Republican representatives to block Federalist initiatives.

The philosophical disagreement was real, but Jefferson’s methods were pure political calculation.

Turning on Washington: The Ultimate Betrayal?

Maybe the most damaging thing Jefferson did was secretly working against George Washington while still serving in his cabinet. By Washington’s second term, Jefferson had convinced himself that Washington was being manipulated by Hamilton and moving the country toward monarchy.

 Jefferson stayed in the cabinet, maintaining cordial relations with Washington in person, while privately organizing resistance to administration policies. He encouraged attacks on Washington in the press. He coordinated with opposition leaders. And he did all of this while Washington trusted him as a loyal advisor.

When Washington found out, he was devastated. The betrayal broke their relationship permanently.

The Burr Situation: Using People

Jefferson’s handling of Aaron Burr shows just how pragmatic he could be. Jefferson never really trusted Burr—thought he was too ambitious and unprincipled. But in 1800, when Jefferson needed to win the presidency, Burr was useful for delivering New York’s votes.

After winning, Jefferson kept Burr as vice president but froze him out of any real power. Once Burr’s usefulness ended (especially after he killed Hamilton in that duel), Jefferson completely abandoned him, eventually supporting an unsuccessful prosecution for treason.

Deceiving Congress

Another example of Jefferson’s political manipulation was the Louisiana Purchase, a massive land acquisition that doubled the size of the United States. Jefferson knew that under the Constitution he had no clear authority to acquire territory for the United States. He secured the purchase by keeping the negotiations secret from both Congress and his political opponents until the deal was finalized, which allowed him to avoid a debate that could have derailed it. Does this sound familiar?

So, What Do We Make of This?

Here’s the uncomfortable question: Was Jefferson a hypocrite, or was he just being realistic about how politics actually works? His political maneuvering was not always ethical, but it was effective, and he used it to achieve many of his political goals.

You could argue he was doing what he thought necessary to prevent Hamilton’s vision from taking over—that the ends justified the means. You could also argue that by using underhanded tactics, he corrupted the very democratic processes he claimed to be protecting.

My speculation: I think Jefferson was aware of the contradiction and wrestled with it. His private letters show moments of self-justification and lingering doubt. But ultimately, he kept doing it because he believed his vision for America was too important to lose by playing nice.

The Bottom Line

Thomas Jefferson remains one of our most brilliant political thinkers. But he was also willing to play dirty when he thought the stakes were high enough. That duality—beautiful ideals combined with hardball tactics—might actually make him more relevant today than ever. Because let’s be honest, that tension between principles and pragmatism hasn’t gone away in American politics.

Understanding both sides of Jefferson helps us see that even the founders we most revere weren’t simple heroes. They were complicated people operating in a messy political reality, trying to build something new while fighting over what that something should be.

The evidence for Jefferson’s political maneuvering is extensive and well-established by historians. Some interpretations of his motivations involve educated speculation, but the actions themselves are documented in letters, newspaper archives, and contemporary accounts.

Reference List

Primary Sources

Founders Online – National Archives https://founders.archives.gov/

  • Digital collection of correspondence and papers from George Washington, Thomas Jefferson, Alexander Hamilton, Benjamin Franklin, John Adams, and James Madison. Essential for Jefferson’s own words and contemporaneous accounts of his political activities.

Library of Congress – Thomas Jefferson Exhibition https://www.loc.gov/exhibits/jefferson/

  • Comprehensive digital exhibition covering Jefferson’s life, philosophy, and political career with original documents and interpretive essays.

Thomas Jefferson Encyclopedia – Monticello https://www.monticello.org/site/research-and-collections/

  • Scholarly resource maintained by the Thomas Jefferson Foundation, covering specific topics including Jefferson’s relationships with Aaron Burr and other political figures.

Secondary Sources – Books

Chernow, Ron. Alexander Hamilton. New York: Penguin Press, 2004.

  • Pulitzer Prize-winning biography that extensively covers the Jefferson-Hamilton rivalry and Jefferson’s behind-the-scenes political maneuvering, including the Freneau affair. Particularly strong on the 1790s conflicts within Washington’s cabinet.

Chernow, Ron. Washington: A Life. New York: Penguin Press, 2010.

  • Provides Washington’s perspective on Jefferson’s activities within his administration and the betrayal Washington felt when learning of Jefferson’s covert opposition.

Ellis, Joseph J. American Sphinx: The Character of Thomas Jefferson. New York: Alfred A. Knopf, 1996.

  • National Book Award winner that explores Jefferson’s contradictions and complexities, particularly the gap between his philosophical writings and political practices.

Ferling, John. Jefferson and Hamilton: The Rivalry That Forged a Nation. New York: Bloomsbury Press, 2013.

  • Detailed examination of the ideological and personal conflict between Jefferson and Hamilton, showing how their struggle shaped early American politics and party formation.

Isenberg, Nancy. Fallen Founder: The Life of Aaron Burr. New York: Penguin Books, 2007.

  • Comprehensive biography of Burr that includes extensive coverage of his complex relationship with Jefferson, from their 1800 alliance through Jefferson’s eventual abandonment of his vice president.

Pasley, Jeffrey L. The Tyranny of Printers: Newspaper Politics in the Early American Republic. Charlottesville: University of Virginia Press, 2001.

  • Scholarly examination of how newspapers and partisan press became political weapons in the 1790s, with detailed coverage of Jefferson’s relationship with Philip Freneau and the National Gazette.

Secondary Sources – Journal Articles and Academic Papers

Sharp, James Roger. “The Journalist as Partisan: The National Gazette and the Origins of the First Party System.” The Virginia Magazine of History and Biography 97, no. 4 (1989): 391-420.

  • Academic analysis of Freneau’s National Gazette and its role in forming political opposition, including Jefferson’s involvement in funding and directing the publication.

Cunningham, Noble E., Jr. “John Beckley: An Early American Party Manager.” The William and Mary Quarterly 13, no. 1 (1956): 40-52.

  • Scholarly examination of Beckley’s role as America’s first professional political operative and his work organizing Jefferson’s political machine.

Historiographical Note

The interpretation of Jefferson’s political behavior has evolved over time. Earlier biographies (pre-1960s) tended to minimize or excuse his behind-the-scenes maneuvering, while more recent scholarship has been willing to examine the contradictions between his philosophy and practice more critically. The works cited above represent current historical consensus based on documentary evidence, though historians continue to debate Jefferson’s motivations and whether his tactics were justified given the political stakes he perceived.

Why We Make Promises to Ourselves Every January: The History of New Year’s Resolutions

New Year’s resolutions—a practice where individuals set goals or make promises to improve their lives in the upcoming year—have a rich and varied history spanning thousands of years. While the concept of self-improvement at the start of a new year feels distinctly modern, its origins are deeply rooted in ancient civilizations and religious traditions that understood the psychological power of fresh starts.

Origins of New Year’s Resolutions

The tradition of making promises at the start of a new year can be traced back over 4,000 years to ancient Babylon. During their 12-day festival called Akitu, held in mid-March to coincide with the spring harvest and planting season, Babylonians made solemn vows to their gods. These promises typically involved practical matters like repaying debts and returning borrowed items, reflecting the agricultural society’s emphasis on community obligations and divine favor. The Babylonians believed that success in fulfilling these promises would curry favor with their deities, ensuring good harvests and prosperity in the year ahead.

The practice evolved significantly when Julius Caesar reformed the Roman calendar in 46 BCE and established January 1 as the official start of the new year. This wasn’t an arbitrary choice—January was named after Janus, the two-faced Roman god of beginnings, endings, doorways, and transitions. The symbolism was perfect: one face looking back at the year past, the other gazing forward to the future. Romans offered sacrifices to Janus and made promises of good conduct for the coming year, combining reflection on past mistakes with optimism about future improvements.

By the Middle Ages, the focus shifted dramatically toward religious observance. In early Christianity, the first day of the year became a time of prayer, spiritual reflection, and making pious resolutions aimed at becoming better Christians. One of the most colorful New Year’s traditions from this era was the “Peacock Vow,” practiced by Christian knights. At the end of the Christmas season, these knights would reaffirm their commitment to knightly virtue while feasting on roast peacock at elaborate New Year’s celebrations. The peacock, a symbol of pride and nobility, served as the centerpiece for vows promising good behavior and chivalric deeds during the coming year.

In the 17th century, Puritans brought particular intensity to the practice of New Year’s resolutions, focusing them squarely on spiritual and moral improvement. Rather than the broad promises of earlier eras, Puritan resolutions were detailed and specific. They committed to avoiding pride and vanity, practicing charity and liberality toward others, refraining from revenge even when wronged, controlling anger in daily interactions, speaking no evil of their neighbors, and living every aspect of their lives aligned with strict religious principles. Beyond these behavioral commitments, they also resolved to study scriptures diligently throughout the year, improve their religious devotion on a weekly basis, and continually renew their dedication to God. These resolutions were taken with utmost seriousness, often recorded in personal journals and reviewed regularly.

In 1740, John Wesley, the founder of Methodism, formalized this spiritual approach by creating the Covenant Renewal Service, traditionally held on New Year’s Eve or New Year’s Day. These powerful gatherings encouraged participants to reflect deeply on the past year’s failings and successes while making resolutions for spiritual growth in the year ahead. This tradition continues in many Methodist churches today.

Interestingly, the specific phrase “New Year’s resolution” began appearing in print in the early 19th century, in publications such as Walker’s Hibernian Magazine and an 1813 Boston newspaper. The Boston article took a humorous tone, discussing how people broke their New Year’s vows almost as soon as they made them, a wry observation that suggests nothing much has changed in the two centuries since.

The Modern Evolution of New Year’s Resolutions

The secularization of New Year’s resolutions accelerated during the 19th and 20th centuries as Western societies became increasingly diverse and less uniformly religious. Self-improvement and personal growth gradually took precedence over religious vows, though the underlying psychology remained similar. The rise of print media played a crucial role in popularizing the practice beyond religious communities. Newspapers and magazines began publishing advice columns on how to set and achieve goals, turning what had been a primarily spiritual practice into a secular ritual of self-betterment.

The industrial revolution and urbanization also influenced the nature of resolutions. As more people moved to cities and took on wage labor, resolutions began to reflect modern concerns like career advancement, financial stability, and managing the stress of urban life. The self-help movement of the 20th century, spurred by books like Dale Carnegie’s “How to Win Friends and Influence People” and Norman Vincent Peale’s “The Power of Positive Thinking,” further embedded the idea that individuals could transform themselves through conscious effort and goal-setting.

By the 21st century, resolutions were firmly established in Western culture as a beloved tradition of hope and renewal, no longer tied to any particular religious framework. The internet age brought new dimensions to the practice, with social media allowing people to publicly declare their resolutions, fitness tracking apps enabling data-driven self-improvement, and online communities providing support and accountability.

Common New Year’s Resolutions

Resolutions tend to reflect both cultural priorities and universal human aspirations. When researchers survey what people resolve to change, recurring themes emerge that tell us something about areas of discontent in contemporary life. Health and fitness consistently dominate the list, with millions of people vowing to lose weight, exercise more regularly, and eat healthier foods. The popularity of these goals reflects our sedentary modern lifestyles, abundant processed foods, and the cultural premium placed on physical appearance and wellness.

Personal development goals are another major category. People promise themselves they will finally learn that new skill they’ve been putting off, read more books instead of scrolling through social media, and manage their time better to reduce stress and increase productivity. These resolutions speak to a desire for intellectual growth and a nagging sense that we’re not living up to our full potential.

Financial goals also rank high on most people’s resolution lists. Many resolve to save more money for the future, pay off debts that have been accumulating, or stick to a budget instead of impulse spending. These financial resolutions often stem from anxiety about economic security and a recognition that small daily choices compound into major financial consequences over time.

Relationship and community-focused resolutions reflect our social nature and the loneliness epidemic affecting many developed nations. People vow to spend more quality time with family and friends rather than staying busy with work and distractions. They plan to volunteer and to give back to their communities in meaningful ways. They hope to strengthen the social bonds that are crucial to happiness and longevity.

Finally, breaking bad habits remains a perennial favorite. Traditional vices like smoking and excessive alcohol consumption still top many lists, but modern resolutions also target newer concerns like limiting screen time and reducing smartphone addiction. These goals acknowledge how difficult it is to maintain healthy habits in an environment designed to encourage overconsumption and instant gratification.

The Success Rate of Resolutions

Despite their enduring popularity, New Year’s resolutions are notoriously difficult to keep. Multiple studies estimate that approximately 80% of resolutions fail by February, often crashing and burning within just a few days of January 1st. The reasons for this high failure rate are both psychological and practical. Many people set overly ambitious goals without considering the realistic constraints of their lives or the sustained effort needed for meaningful change. Others make vague resolutions like “get healthier” without specific action steps or measurable milestones.

Research in behavioral psychology suggests that setting realistic, measurable, and time-bound goals—often called SMART goals (Specific, Measurable, Achievable, Relevant, and Time-bound)—can significantly improve success rates. Rather than resolving to “exercise more,” for example, a SMART goal would be “go to the gym for 30 minutes every Monday, Wednesday, and Friday morning.” The specificity provides clear direction, and the measurability allows for tracking progress and celebrating small victories along the way.

However, it’s worth noting that most people approach their New Year’s resolutions more as a fun tradition than with serious anticipation that they will actually keep them. There’s a ritualistic, almost playful quality to the practice—we know the odds are against us, but we participate anyway, embracing the hopeful symbolism of a fresh start even if we suspect we’ll be back to our old habits before Valentine’s Day.

The Significance of Resolutions Today

New Year’s resolutions persist across centuries and cultures because they align with a fundamental human desire for self-improvement and the psychological comfort of fresh starts. The appeal of marking time with calendars and treating January 1st as somehow special—despite being astronomically arbitrary—speaks to our need for narrative structure in our lives. Whether rooted in ancient Babylonian pledges to repay debts, Roman sacrifices to Janus, Christian vows of spiritual renewal, or modern goals to lose ten pounds, resolutions represent an enduring belief in the potential for change.

The tradition reminds us that humans have always struggled with the gap between who we are and who we aspire to be, and that we’ve always believed, however naively, that marking a new beginning on the calendar might help us bridge that gap. Even if our resolutions fail more often than they succeed, the very act of making them reaffirms our agency and our hope that we can become better versions of ourselves with just a bit of conscious effort.

Sources:

History.com provides comprehensive coverage of New Year’s resolution traditions: https://www.history.com/news/the-history-of-new-years-resolutions

Britannica offers detailed information on Janus and Roman New Year traditions: https://www.britannica.com/topic/Janus-Roman-god

The Smithsonian Magazine explores New Year’s countdown traditions and their historical context: https://www.smithsonianmag.com/science-nature/why-do-we-count-down-to-the-new-year-180961433/

Anthony Aveni’s “The Book of the Year: A Brief History of Our Seasonal Holidays” provides scholarly analysis of New Year’s traditions across cultures.

Kaila Curry’s article “The Ancient History of New Year’s Resolutions” traces the practice from Babylonian times through the modern era.

Joshua O’Driscoll’s research on “The Peacock Vows” documents medieval chivalric New Year’s traditions, excerpted in various historical compilations.

Study The Past If You Would Define The Future—Confucius

I particularly like this quotation. It is similar to the more modern version: “Those who don’t learn from the past are doomed to repeat it.” However, I much prefer the former because it seems to be more in the form of advice or instruction, while the latter seems more like a dire warning. Though I suspect, given the current state of the world, a dire warning is in order.

But regardless of whether it comes in the form of advice or warning, people today do not seem to heed the importance of studying the past. The knowledge of history in our country is woeful. The lack of emphasis on the teaching of history in general, and American history in particular, is shameful. While it is tempting to blame it on a lack of interest on the part of the younger generation, I find people my own age also have little appreciation of the events that shaped our nation, the world, and their lives. Without this understanding, how can we evaluate what is currently happening and understand what we must do to come together as a nation and as a world?

I have always found history to be a fascinating subject. Biographies and nonfiction historical books remain among my favorite reading. In college I always added one or two history courses every semester to raise my grade point average. Even in college I found it strange that many of my friends hated history courses and took only the minimum. At the time, I didn’t realize just how serious this lack of historical perspective was to become.

Several years ago I became aware of just how little historical knowledge most people had. At the time Jay Leno was still doing his late-night show and he had a segment called Jaywalking. During the segment he would ask people in the street questions that were somewhat esoteric and to which he could expect to get unusual and generally humorous answers. On one show, on the 4th of July, he asked people “From what country did the United States declare independence on the 4th of July?” and of course no one knew the answer.

My first thought was that he must have gone through dozens of people to find the four or five who did not know the answer to his question. The next day at work, the 5th of July, I decided to ask several people, all of whom were college graduates, the same question. I did not get a single correct answer, although one person at least realized, “I think I should know this.” When I told my wife, a retired teacher, she wasn’t surprised. For a long time, she had been concerned about the lack of emphasis on social studies and the arts in school curriculums. I was becoming seriously concerned about the direction of education in our country.

A lot of people are probably thinking, “So what, who really cares what a bunch of dead people did 200 years ago?” But if we don’t know what they did and why they did it, how can we understand its relevance today? We have no way to judge what actions may support the best interests of society and what might ultimately be detrimental.

Failure to learn from and understand the past results in a me-centric view of everything. If you fail to understand how things have developed, then you certainly cannot understand what the best course is to go forward. Attempting to judge all people and events of the past through your own personal prejudices leads only to continued and worsening conflict.

If you study the past, you will see that there has never been general agreement on anything. There were many disagreements, debates, and even a civil war over differences of opinion. It helps us to understand that there are no perfect people who always do everything the right way and at the right time. It helps us to appreciate the good that people do while understanding the human weaknesses that led to the things we consider faults today. In other words, we cannot expect anyone to be 100% perfect. A person may have accomplished many good and meaningful things, and those accomplishments should not be discarded simply because that person was also a human being with human flaws.

Understanding the past does not mean approving of everything that occurred, but it also means not condemning everything that does not fit into twenty-first-century mores. Only by recognizing this, and by seeing what led to the disasters of the past, can we hope to avoid repeating the worst aspects of our history. History teaches lessons in compromise, involvement, and understanding. Failure to recognize that leads to strident argument and an unwillingness to cooperate with those who may differ in even the slightest way. Rather than creating the hoped-for perfect society, it simply leads to a new set of problems and a new group of grievances.

In sum, failure to study history is a failure to prepare for the future. We owe it to ourselves and future generations to understand where we came from and how we can best prepare our country and the world for them. They deserve nothing less than a full understanding of the past and a rational way forward. 

This was my first post after I started my blog in 2021.  I believe it is even more relevant now.

The Freemasons and the Founding Fathers: Secret Society or Just a Really Good Book Club?

You’ve probably heard the whispers—the Freemasons secretly controlled the American Revolution, George Washington wore a special apron, and there’s a hidden pyramid on the dollar bill. It’s the kind of thing that sounds like it came straight from a Nicolas Cage movie. But like most historical legends, the real story is more interesting (and less conspiratorial) than the mythology.

So, what’s the actual deal with Freemasons and America’s founding? Let’s dig in.

What Even Is Freemasonry?

First things first: Freemasonry started out as actual stonemasons’ guilds back in medieval Europe—think guys who built cathedrals sharing trade secrets. But by the early 1700s, it had transformed into something completely different: a philosophical club where educated men gathered to discuss big ideas about morality, reason, and how to be better humans.

The secrecy? That was part of the appeal. Lodges had rituals and passwords, sure, but the core values weren’t exactly hidden. Freemasons were all about Enlightenment thinking—liberty, equality, the pursuit of knowledge. Basically, the kind of stuff that gets you excited if you’re the type who actually enjoys reading philosophy books.

In colonial America, joining a Masonic lodge was a bit like joining an elite networking group today, except instead of swapping business cards, you discussed natural rights and wore fancy aprons. Lawyers, merchants, printers—the educated professional class—flocked to lodges for both the intellectual stimulation and the social connections.

The Founding Fathers: Who Was Actually In?

Let’s separate fact from fiction when it comes to which founders were card-carrying Masons.

Definitely Masons:

George Washington became a Master Mason at 21 in 1753. He wasn’t the most active member—he didn’t attend meetings constantly—but he took it seriously enough to wear his Masonic apron when he laid the cornerstone of the U.S. Capitol in 1793. That’s a pretty public endorsement.

Benjamin Franklin was perhaps the most dedicated Mason among the founders. Initiated in 1731, he eventually became Grand Master of Pennsylvania’s Grand Lodge and helped establish lodges in France during his diplomatic stint. Franklin was basically the poster child for Enlightenment Masonry.

Paul Revere—yes, that Paul Revere—was Grand Master of Massachusetts. His midnight ride gets all the attention, but his Masonic connections were just as important to his Revolutionary activities.

John Hancock also served as Grand Master of Massachusetts. His oversized signature on the Declaration was matched by his outsized commitment to Masonic ideals.

John Marshall, the Chief Justice who shaped American constitutional law, was a dedicated Mason. So was James Monroe, the fifth president.

Here’s a fun stat: of the 56 signers of the Declaration of Independence, at least nine (about 16%) were Masons. Among the 39 who signed the Constitution, roughly thirteen (33%) belonged to the fraternity.

The Maybes:

Thomas Jefferson? Probably not a Mason, despite endless conspiracy theories. There’s no solid evidence of membership, though his Enlightenment philosophy certainly sounded Masonic. His buddy the Marquis de Lafayette was definitely in, which hasn’t helped dispel the rumors.

Alexander Hamilton? The evidence is murky. Some historians think his writings hint at Masonic sympathies, but there’s no membership record.

Definitely Not:

John Adams wasn’t a Mason and was actually skeptical of secret societies. He still believed in many of the same principles, though—virtue, republican government, that sort of thing.

Did the Masons Really Influence the Revolution?

Here’s where it gets interesting. No, the Freemasons didn’t sit around a lodge plotting revolution like some shadowy cabal. But did their ideas and networks matter? Absolutely.

Think about what Masonic lodges provided: a space where educated colonists could meet, discuss radical ideas about natural rights and self-governance, and build trust across colonial boundaries—all without British officials breathing down their necks. These lodges brought together men from different colonies, different religious backgrounds (Anglicans, Quakers, Deists), and different social classes.

The radical part? Inside a lodge, everyone met “on the level.” It didn’t matter if you were born rich or poor—merit and virtue determined your standing. That’s pretty revolutionary thinking in the 1700s when most of the world still believed some people were just born better than others. Sound familiar? “All men are created equal” has a similar ring to it.

Freemasonry also championed religious tolerance. You had to believe in some kind of Supreme Being, but that was it—no specific creed required. This ecumenical approach directly influenced the founders’ commitment to religious freedom and separation of church and state.

The Masonic motto about moving “from darkness to light” through knowledge wasn’t just ritualistic mumbo-jumbo. It reflected genuine Enlightenment belief in reason and progress—the same intellectual current that powered revolutionary thinking.

What About All That Symbolism?

Okay, let’s address the pyramid and the all-seeing eye on the dollar bill. Are they Masonic? Maybe, maybe not. The Great Seal of the United States definitely uses imagery that Masons also used—but so did lots of 18th-century groups drawing on Enlightenment and classical symbolism. The connection is debated among historians.

What’s undeniable is that Masonic culture emphasized architecture and building as metaphors for constructing a just society. When Washington laid that Capitol cornerstone in his Masonic apron, he was making a statement about building something enduring and meaningful.

The “Conspiracy” Question

Let’s be clear: there was no Masonic conspiracy to create America. The fraternity wasn’t even unified—lodges operated independently, and members included both patriots and loyalists. Officially, Masonic organizations tried to stay neutral during the Revolution, though obviously that didn’t work out perfectly when the war split families and communities.

What is true is that many of the Revolution’s most articulate, influential leaders happened to be Masons. And the fraternity’s values—liberty, equality, reason, fraternity—aligned perfectly with revolutionary ideology. Correlation, not conspiracy.

After the Revolution, Freemasonry exploded in popularity. It became associated with the Enlightenment values that had supposedly won the day. Future presidents including Andrew Jackson, James Polk, and Theodore Roosevelt were all Masons. At its 19th-century peak, an estimated one in five American men belonged to a lodge.

What’s the Bottom Line?

The Freemason influence on America’s founding is real, but it’s cultural rather than conspiratorial. The lodges provided a space where Enlightenment ideas could circulate, where colonial leaders could build networks of trust, and where egalitarian principles could be practiced in miniature.

Washington, Franklin, Hancock, and the others weren’t sitting in smoke-filled rooms with secret handshakes planning to overthrow the British crown. They were part of a broader philosophical movement that valued personal improvement, moral virtue, and human rights. The Masonic lodge was one venue—among many—where those ideas took root.

Freemasonry was one tributary feeding into the river of revolutionary thought, along with classical republicanism, British common law, various religious traditions, and plain old grievances about taxes and representation.

The real story is somehow simpler and more fascinating than the conspiracy theories: a bunch of educated colonists joined a fraternity that encouraged them to think big thoughts about human nature and just governance. Those thoughts, debated in lodges and taverns and town halls, eventually sparked a revolution.

Not because of secret symbols or mysterious rituals, but because ideas about liberty and equality—once you start taking them seriously—are genuinely revolutionary.

True confession—The Grumpy Doc is not now, nor has he ever been, a Mason.

The Underground Heroes: How Sewers Built Our Cities

When we think about what makes cities possible, agriculture usually gets top billing. Without a steady food surplus, people could not have stopped foraging long enough to become artisans, priests, merchants, or kings. But once people clustered into towns and cities, another, less glamorous need quickly emerged: what to do with all the waste. While no one likes to think about it, without effective methods of sewage disposal, cities would quickly become uninhabitable.

When you consider the foundations of modern civilization, sewers probably don’t make your top ten list. But these underground networks deserve way more credit than they get. It is no exaggeration to say that sewage systems—whether open drains in the street or vast subterranean tunnels—were one of the most important technologies that made large cities livable. The story of sewers is really the story of how humans figured out how to live together in large numbers without, well, dying from our own waste.

The Ancient World Gets Creative

The earliest cities faced a pretty basic problem: what do you do with human and animal waste when you’ve got thousands of people living close together? The ancient Indus Valley civilization (around 2600-1900 BCE) came up with one of the first solutions. Archaeological evidence from Harappa and Mohenjo-daro shows they built covered drains and even had individual house connections—pretty impressive for 4,000 years ago.

The Romans, being Romans, took this concept and ran with it. The Cloaca Maxima, built around 600 BCE, started as an open drainage canal but eventually became Rome’s main sewer system. What made Roman sewers special wasn’t just their size, but how they integrated with aqueducts to create a flow-through system that actually worked.

Speculation alert: While we know the Romans understood the practical benefits of sewers, they probably didn’t fully grasp the disease prevention aspect in the way we do today.

The Medieval Mess

After Rome fell, European cities pretty much forgot how to manage waste properly. Medieval cities relied on a charming system where people just dumped waste into the streets, hoping rain would wash it away. Some cities built latrines that emptied into rivers, but most urban waste management was… let’s call it “informal.”

This wasn’t just gross—it was deadly. Cities regularly faced outbreaks of cholera, dysentery, and typhoid, though people didn’t yet understand the connection between contaminated water and disease.

London’s Wake-Up Call

The turning point came in 19th-century London. By the 1850s, the Thames had become essentially an open sewer, and the city’s water supply was contaminated. The “Great Stink” of 1858 made the problem impossible to ignore. The smell from the Thames was so bad that Parliament couldn’t meet. When something smells worse than politics, you know it’s bad.

Enter Joseph Bazalgette, chief engineer of London’s Metropolitan Board of Works. His solution was ambitious: build a comprehensive sewer network that would intercept waste before it reached the Thames and carry it downstream to treatment facilities. The system, completed in the 1870s, used gravity and the natural slope of the land to move waste through a network of tunnels—some large enough to drive a carriage through.

Who wouldn’t love the idea that a man named Thomas Crapper invented the flush toilet? But that’s not quite true. Variations of the flush toilet have been around for over 2,000 years. In 1775, a man named Alexander Cummings invented the S-trap—a curved pipe that prevented sewer gases from backing up into the home—finally making toilets tolerable for indoor use. While Mr. Crapper did not invent the toilet, he did make it functional enough to be routinely installed in homes by creating a workable ballcock mechanism that allowed reliable flushing. He also marketed a toilet of his own design, leading to the now-familiar nickname of “the crapper.”

Reversing A River

Chicago’s development of a sewer system was a landmark feat of engineering and urban planning in the 19th century. Faced with flat, swampy terrain and rapid population growth, the city recruited engineer Ellis S. Chesbrough in 1855 to design the first comprehensive underground sewer system in the United States. Because the landscape offered little natural drainage, the entire city center had to be physically raised by several feet—an ambitious task that involved elevating streets and even entire buildings above their original grade to allow for gravity-based drainage into the Chicago River.

Apparently, no one realized this would pollute Lake Michigan, the city’s main drinking water source, a classic example of unintended consequences. This led to further innovation, including the construction of a tunnel extending two miles under the lake to bring in cleaner water (completed in 1866) and, ultimately, the monumental reversal of the Chicago River’s flow in 1900. This project diverted wastewater away from the lake and toward the Mississippi basin, following the time-tested political solution of sending your problems downstream.

The Science Behind the Solution

What made modern sewer systems revolutionary wasn’t just engineering—it was the growing understanding of how disease spreads. Dr. John Snow’s work during London’s 1854 cholera outbreak proved that contaminated water, not “bad air,” was spreading the disease. This discovery gave city planners the scientific backing they needed to invest heavily in sewer infrastructure.

Modern sewer systems work on relatively simple principles: gravity moves waste through sloped pipes to treatment facilities, where biological and chemical processes break down harmful materials before releasing treated water back into the environment. The key innovation was creating separate systems for stormwater and sewage, preventing overflow during heavy rains.

Cities Transform

The impact was immediate and dramatic. Cities with comprehensive sewer systems saw massive drops in waterborne diseases. Life expectancy increased, child mortality plummeted, and for the first time in human history, really large cities became livable spaces rather than death traps.

Sewers enabled urban growth on an unprecedented scale. Without sewers, cities like New York, Chicago, and London couldn’t support populations in the millions. The investment in underground infrastructure became the foundation for everything else—commerce, industry, culture—that makes cities economic powerhouses.

Modern Challenges

Today’s sewer systems face new challenges. Climate change brings more intense storms that can overwhelm older systems. Growing populations strain infrastructure that was built decades ago. Many cities are dealing with the expensive reality that sewer systems built to last 50 to 100 years have outlived their life expectancy and need major upgrades or replacement.

Prediction: Cities will likely need to invest heavily in “smart” sewer systems over the next few decades—networks that use sensors and data to manage flow more efficiently and prevent overflows.

The Bottom Line

Sewers represent one of humanity’s most important but least appreciated innovations. They made modern urban life possible by solving the fundamental problem of waste management on a large scale. Without this underground network, our cities and the economic and cultural benefits they provide simply couldn’t exist.

The next time you turn on a tap or use indoor plumbing, remember that you’re benefiting from centuries of engineering innovation that literally built the foundation of modern civilization, one pipe at a time.

Sometimes when I’m researching articles, I find myself going down a rabbit hole.  This time I went down the drain.

The Fascinating Journey of Christmas Cards: From Victorian Innovation to Global Tradition

Have you ever wondered how the tradition of sending Christmas cards got started? It’s a story that combines busy social calendars, a new postal system, and one clever solution that became a worldwide phenomenon.

Before Christmas Cards: The Early Messengers

Long before anyone thought to mass-produce holiday greetings, people were already experimenting with seasonal messages. In fifteenth-century Germany, the “Andachtsbilder” appeared—proto-greeting cards with religious imagery, usually depicting the baby Jesus, accompanied by the inscription “Ein gut selig jar” (A good and blessed year)—which were presented as gifts during the Christmas season. Additionally, handwritten letters wishing “Merry Christmas” date from as early as 1534. These weren’t Christmas cards as we know them, but they laid the groundwork.

The first known Christmas card was sent in 1611 by Michael Maier, a German physician, to King James I of England and his son, with an elaborate greeting celebrating “the birthday of the Sacred King”.  This, however, was an ornate document rather than a mass-produced card. The true breakthrough came much later.

In the late 1700s, British schoolchildren were creating their own versions. They would take large sheets of decorated writing paper and pen messages like “Love to Dearest Mummy at the Christmas Season” to show their parents how much their handwriting had improved over the year. It was part homework assignment, part holiday greeting—definitely more practical than sentimental!

Also during the latter part of the 18th century, wealthy British families adopted a more personal variant: handwritten holiday letters. These were carefully composed greetings expressing seasonal goodwill and family updates, often decorated with small flourishes or illustrations. They were a forerunner of the much-maligned Christmas letter. In Victorian England—where social correspondence was almost an art form—sending letters for Christmas and New Year became fashionable among the middle class. The combination of widespread literacy and improvements in the postal system laid the groundwork for something new: a printed, affordable Christmas greeting.

The Birth of the Modern Christmas Card

The real game-changer came in 1843, thanks to a social problem that sounds remarkably modern: too many people to keep in touch with and not enough time. Henry Cole, a prominent civil servant, helped establish the Penny Post postal system—named after the cost of posting a letter.  He found himself with unanswered mail piling up during the busy Christmas season. His solution? Why not create one design that could be sent to everyone?

Cole commissioned his friend, artist John Callcott Horsley, to design what would become the world’s first commercial Christmas card. The design featured three generations of the Cole family raising a toast in celebration, surrounded by scenes depicting acts of charity. The message was simple: “A Merry Christmas and a Happy New Year to You.”

About 2,050 cards were printed in two versions—a black and white version for sixpence and a hand-colored version for one shilling. Interestingly, the card caused some controversy. The image showed young children enjoying glasses of wine with their family, which upset the Victorian temperance movement.

The Penny Post, introduced in 1840, made mailing affordable and accessible. What started as Cole’s time-saving solution quickly caught on among his friends and acquaintances, though it took a few decades for the tradition to really explode in popularity.

Crossing the Atlantic

Christmas cards made their way to America in the late 1840s, but they were expensive luxuries at first. In 1875, Louis Prang, a German-born printer who had worked on early cards in England, began mass-producing cards in America. He made them affordable for average families. His first cards featured flowers, plants, and children. By the 1880s, Prang was producing over five million cards annually.

 Christmas cards spread rapidly with improvements in both postal systems and printing. Victorian cards often featured sentimental, elaborate images—sometimes anthropomorphic animals or unexpected motifs. The Hall Brothers Company (later Hallmark) shifted the format to folded cards in envelopes rather than postcards, allowing for more personal written messages—setting the standard still seen today.

The 20th century brought both industrialization and personalization to the Christmas card. Advances in color printing, photography, and mass marketing meant that cards became cheaper and more varied. In the 1920s and 1930s, families began sending cards featuring their own photographs, a tradition that gained momentum after World War II with the rise of suburban life and inexpensive cameras. By the 1950s and 1960s, Christmas cards had become a fixture of middle-class life. Designs reflected changing tastes—from sentimental Victorian nostalgia to sleek mid-century modernism. Surprisingly, the first known Christmas card with a personal photo was sent by Annie Oakley in 1891, using a photo taken during a visit to Scotland.

Christmas Cards Around the World Today

Fast forward to today, and Christmas card traditions vary wildly depending on where you are. In Great Britain and the US, sending cards remains a major tradition. British people send around 55 cards per year on average, with Christmas cards accounting for almost half of all greeting card sales.

But the tradition looks quite different in other parts of the world. In Japan, where only about 1.5% of the population is Christian, Christmas is celebrated as a secular, romantic holiday rather than a religious one. Christmas Eve is treated similarly to Valentine’s Day, with couples exchanging gifts. Many people do observe the custom of sending seasonal cards, but these are nengajo—New Year’s cards—sent to friends, family, and business associates to express wishes for a happy and prosperous year.

In the Philippines, one of Asia’s most Christian nations, Christmas is celebrated with incredible enthusiasm starting as early as September, with the season officially beginning with nine days of dawn masses on December 16. Cards are part of the celebration, but they’re just one element of an extended, community-focused holiday.

In Australia, the tradition of sending handwritten Christmas cards remains popular despite the summer heat.  Australian cards often feature unique imagery—Santa in shorts and sandals, or kangaroos instead of reindeer, adapting the tradition to local culture.

The Digital Shift

Today, while e-cards and social media posts have certainly cut into traditional card sales, many people still cherish the ritual of sending and receiving physical cards. There’s something irreplaceable about finding a thoughtful card in your mailbox among the bills and advertisements.

What started as Henry Cole’s practical solution to a busy social calendar has evolved into a diverse global tradition, adapted and reimagined by different cultures worldwide. Whether you’re mailing elaborate family photo cards, sending quick e-greetings, or exchanging romantic messages in Tokyo, you’re participating in a tradition that’s nearly two centuries old.

