Grumpy opinions about everything.

Tag: History

Thomas Jefferson: The Philosopher Who Played Hardball

Here’s the thing about Thomas Jefferson that doesn’t always make it into the history textbooks: the guy who wrote those soaring words about liberty and limited government? He was also one of early America’s most skilled—and sometimes underhanded—political operators.

It’s surprising when you think about it. Jefferson genuinely believed in transparency, virtue in public life, and keeping government small. He wrote beautifully about these ideals. But when it came to actual politics? He played the game as hard as anyone, often using tactics that directly contradicted what he preached.

Jefferson’s public philosophy was straightforward. He thought America should be a nation of independent farmers—regular people who owned their own land and weren’t dependent on anyone else. He worried constantly about concentrated power, whether in government or in the hands of wealthy financiers or merchants. He believed people should be informed and engaged, and that government worked best when it stayed out of people’s lives.

His Declaration of Independence wasn’t just pretty rhetoric—it laid out a genuinely revolutionary idea: governments only have power because people agree to give it to them, and when governments stop serving the people, those people have the right to change things.

The Reality: How Jefferson Actually Operated

Here’s where it gets interesting. While Jefferson was writing about virtue and transparency, he was simultaneously running what today we’d recognize as opposition research, planting stories in the press, and organizing political operations—sometimes against people he was supposed to be working with.

The Freneau Setup: Paying for Attacks

The most blatant example happened in 1791. Jefferson was serving as Secretary of State under George Washington, which meant he was part of the administration. At the same time, he arranged for a guy named Philip Freneau to get a government job—technically as a translator. The real purpose? To give Freneau money to run a newspaper that would relentlessly attack Alexander Hamilton and other Federalists.

Think about that for a second. Jefferson was using his government position to fund media attacks on his own colleagues. When people called him out on it, he basically said, “Who, me? I have nothing to do with what Freneau publishes.” But the evidence shows Jefferson was actively encouraging and directing these attacks.

John Beckley: The Original Campaign Fixer

Jefferson also worked closely with John Beckley, who was essentially America’s first professional political operative. Beckley coordinated messaging, spread information (and sometimes misinformation) about opponents, and helped build the grassroots organization that would eventually become the Democratic-Republican Party.

This wasn’t a gentlemanly debate about ideas. This was organized political warfare—pamphlets, coordinated newspaper campaigns, and opposition research. Jefferson and James Madison quietly funded much of this work while maintaining public images as above-the-fray philosophers. We can’t know exactly what Jefferson said in every private conversation with Beckley, but the circumstantial evidence of coordination is convincing.

The Hamilton Rivalry: Ideological War

Jefferson’s conflict with Hamilton was both philosophical and deeply personal. Hamilton wanted a strong federal government, a national bank, and close ties with Britain. Jefferson saw all of this as a betrayal of the Revolution—a step toward creating the same kind of corrupt, elite-dominated system they’d just fought to escape.

But rather than just making his arguments publicly, Jefferson worked behind the scenes to undermine Hamilton’s policies. He encouraged Madison to lead opposition in Congress. He fed stories to friendly newspapers. He coordinated with Republican representatives to block Federalist initiatives.

The philosophical disagreement was real, but Jefferson’s methods were pure political calculation.

Turning on Washington: The Ultimate Betrayal?

Maybe the most damaging thing Jefferson did was secretly working against George Washington while still serving in his cabinet. By Washington’s second term, Jefferson had convinced himself that Washington was being manipulated by Hamilton and moving the country toward monarchy.

Jefferson stayed in the cabinet, maintaining cordial relations with Washington in person, while privately organizing resistance to administration policies. He encouraged attacks on Washington in the press. He coordinated with opposition leaders. And he did all of this while Washington trusted him as a loyal advisor.

When Washington found out, he was devastated. The betrayal broke their relationship permanently.

The Burr Situation: Using People

Jefferson’s handling of Aaron Burr shows just how pragmatic he could be. Jefferson never really trusted Burr—thought he was too ambitious and unprincipled. But in 1800, when Jefferson needed to win the presidency, Burr was useful for delivering New York’s votes.

After winning, Jefferson kept Burr as vice president but froze him out of any real power. Once Burr’s usefulness ended (especially after he killed Hamilton in that duel), Jefferson completely abandoned him, eventually supporting an unsuccessful prosecution for treason.

Deceiving Congress

Another example of Jefferson’s political manipulation was the Louisiana Purchase, a massive land acquisition that doubled the size of the United States. Jefferson knew that under the Constitution he had no clear authority to acquire territory for the United States. He secured the purchase by keeping the negotiations quiet, shielding the deal from Congress and his political opponents until it was finalized and thereby avoiding a debate that could have derailed it. Does this sound familiar?

So, What Do We Make of This?

Here’s the uncomfortable question: Was Jefferson a hypocrite, or was he just being realistic about how politics actually works? His political manipulation was not always ethical, but it was effective. He used his skills to achieve many of his political goals.

You could argue he was doing what he thought necessary to prevent Hamilton’s vision from taking over—that the ends justified the means. You could also argue that by using underhanded tactics, he corrupted the very democratic processes he claimed to be protecting.

My speculation: I think Jefferson was aware of the contradiction and wrestled with it. His private letters show moments of self-justification and lingering doubt. But ultimately, he kept doing it because he believed his vision for America was too important to lose by playing nice.

The Bottom Line

Thomas Jefferson remains one of our most brilliant political thinkers. But he was also willing to play dirty when he thought the stakes were high enough. That duality—beautiful ideals combined with hardball tactics—might actually make him more relevant today than ever. Because let’s be honest, that tension between principles and pragmatism hasn’t gone away in American politics.

Understanding both sides of Jefferson helps us see that even the founders we most revere weren’t simple heroes. They were complicated people operating in a messy political reality, trying to build something new while fighting over what that something should be.

The evidence for Jefferson’s political maneuvering is extensive and well-established by historians. Some interpretations of his motivations involve educated speculation, but the actions themselves are documented in letters, newspaper archives, and contemporary accounts.

Reference List

Primary Sources

Founders Online – National Archives https://founders.archives.gov/

  • Digital collection of correspondence and papers from George Washington, Thomas Jefferson, Alexander Hamilton, Benjamin Franklin, John Adams, and James Madison. Essential for Jefferson’s own words and contemporaneous accounts of his political activities.

Library of Congress – Thomas Jefferson Exhibition https://www.loc.gov/exhibits/jefferson/

  • Comprehensive digital exhibition covering Jefferson’s life, philosophy, and political career with original documents and interpretive essays.

Thomas Jefferson Encyclopedia – Monticello https://www.monticello.org/site/research-and-collections/

  • Scholarly resource maintained by the Thomas Jefferson Foundation, covering specific topics including Jefferson’s relationships with Aaron Burr and other political figures.

Secondary Sources – Books

Chernow, Ron. Alexander Hamilton. New York: Penguin Press, 2004.

  • Pulitzer Prize-winning biography that extensively covers the Jefferson-Hamilton rivalry and Jefferson’s behind-the-scenes political maneuvering, including the Freneau affair. Particularly strong on the 1790s conflicts within Washington’s cabinet.

Chernow, Ron. Washington: A Life. New York: Penguin Press, 2010.

  • Provides Washington’s perspective on Jefferson’s activities within his administration and the betrayal Washington felt when learning of Jefferson’s covert opposition.

Ellis, Joseph J. American Sphinx: The Character of Thomas Jefferson. New York: Alfred A. Knopf, 1996.

  • National Book Award winner that explores Jefferson’s contradictions and complexities, particularly the gap between his philosophical writings and political practices.

Ferling, John. Jefferson and Hamilton: The Rivalry That Forged a Nation. New York: Bloomsbury Press, 2013.

  • Detailed examination of the ideological and personal conflict between Jefferson and Hamilton, showing how their struggle shaped early American politics and party formation.

Isenberg, Nancy. Fallen Founder: The Life of Aaron Burr. New York: Penguin Books, 2007.

  • Comprehensive biography of Burr that includes extensive coverage of his complex relationship with Jefferson, from their 1800 alliance through Jefferson’s eventual abandonment of his vice president.

Pasley, Jeffrey L. The Tyranny of Printers: Newspaper Politics in the Early American Republic. Charlottesville: University of Virginia Press, 2001.

  • Scholarly examination of how newspapers and partisan press became political weapons in the 1790s, with detailed coverage of Jefferson’s relationship with Philip Freneau and the National Gazette.

Secondary Sources – Journal Articles and Academic Papers

Sharp, James Roger. “The Journalist as Partisan: The National Gazette and the Origins of the First Party System.” The Virginia Magazine of History and Biography 97, no. 4 (1989): 391-420.

  • Academic analysis of Freneau’s National Gazette and its role in forming political opposition, including Jefferson’s involvement in funding and directing the publication.

Cunningham, Noble E., Jr. “John Beckley: An Early American Party Manager.” The William and Mary Quarterly 13, no. 1 (1956): 40-52.

  • Scholarly examination of Beckley’s role as America’s first professional political operative and his work organizing Jefferson’s political machine.

Historiographical Note

The interpretation of Jefferson’s political behavior has evolved over time. Earlier biographies (pre-1960s) tended to minimize or excuse his behind-the-scenes maneuvering, while more recent scholarship has been willing to examine the contradictions between his philosophy and practice more critically. The works cited above represent current historical consensus based on documentary evidence, though historians continue to debate Jefferson’s motivations and whether his tactics were justified given the political stakes he perceived.

Why We Make Promises to Ourselves Every January: The History of New Year’s Resolutions

New Year’s resolutions—a practice where individuals set goals or make promises to improve their lives in the upcoming year—have a rich and varied history spanning thousands of years. While the concept of self-improvement at the start of a new year feels distinctly modern, its origins are deeply rooted in ancient civilizations and religious traditions that understood the psychological power of fresh starts.

Origins of New Year’s Resolutions

The tradition of making promises at the start of a new year can be traced back over 4,000 years to ancient Babylon. During their 12-day festival called Akitu, held in mid-March to coincide with the spring harvest and planting season, Babylonians made solemn vows to their gods. These promises typically involved practical matters like repaying debts and returning borrowed items, reflecting the agricultural society’s emphasis on community obligations and divine favor. The Babylonians believed that success in fulfilling these promises would curry favor with their deities, ensuring good harvests and prosperity in the year ahead.

The practice evolved significantly when Julius Caesar reformed the Roman calendar in 46 BCE and established January 1 as the official start of the new year. This wasn’t an arbitrary choice—January was named after Janus, the two-faced Roman god of beginnings, endings, doorways, and transitions. The symbolism was perfect: one face looking back at the year past, the other gazing forward to the future. Romans offered sacrifices to Janus and made promises of good conduct for the coming year, combining reflection on past mistakes with optimism about future improvements.

By the Middle Ages, the focus shifted dramatically toward religious observance. In early Christianity, the first day of the year became a time of prayer, spiritual reflection, and making pious resolutions aimed at becoming better Christians. One of the most colorful New Year’s traditions from this era was the “Peacock Vow,” practiced by Christian knights. At the end of the Christmas season, these knights would reaffirm their commitment to knightly virtue while feasting on roast peacock at elaborate New Year’s celebrations. The peacock, a symbol of pride and nobility, served as the centerpiece for vows promising good behavior and chivalric deeds during the coming year.

In the 17th century, Puritans brought particular intensity to the practice of New Year’s resolutions, focusing them squarely on spiritual and moral improvement. Rather than the broad promises of earlier eras, Puritan resolutions were detailed and specific. They committed to avoiding pride and vanity, practicing charity and liberality toward others, refraining from revenge even when wronged, controlling anger in daily interactions, speaking no evil of their neighbors, and living every aspect of their lives aligned with strict religious principles. Beyond these behavioral commitments, they also resolved to study scriptures diligently throughout the year, improve their religious devotion on a weekly basis, and continually renew their dedication to God. These resolutions were taken with utmost seriousness, often recorded in personal journals and reviewed regularly.

In 1740, John Wesley, the founder of Methodism, formalized this spiritual approach by creating the Covenant Renewal Service, traditionally held on New Year’s Eve or New Year’s Day. These powerful gatherings encouraged participants to reflect deeply on the past year’s failings and successes while making resolutions for spiritual growth in the year ahead. This tradition continues in many Methodist churches today.

Interestingly, the first known use of the specific phrase “New Year’s Resolution” appeared in Walker’s Hibernian Magazine, a Dublin publication, in 1813. The article took a humorous tone, discussing how people broke their New Year’s vows almost as soon as they made them—a wry observation that suggests nothing much has changed in the two centuries since.

The Modern Evolution of New Year’s Resolutions

The secularization of New Year’s resolutions accelerated during the 19th and 20th centuries as Western societies became increasingly diverse and less uniformly religious. Self-improvement and personal growth gradually took precedence over religious vows, though the underlying psychology remained similar. The rise of print media played a crucial role in popularizing the practice beyond religious communities. Newspapers and magazines began publishing advice columns on how to set and achieve goals, turning what had been a primarily spiritual practice into a secular ritual of self-betterment.

The industrial revolution and urbanization also influenced the nature of resolutions. As more people moved to cities and took on wage labor, resolutions began to reflect modern concerns like career advancement, financial stability, and managing the stress of urban life. The self-help movement of the 20th century, spurred by books like Dale Carnegie’s “How to Win Friends and Influence People” and Norman Vincent Peale’s “The Power of Positive Thinking,” further embedded the idea that individuals could transform themselves through conscious effort and goal-setting.

By the 21st century, resolutions were firmly established in Western culture as a beloved tradition of hope and renewal, no longer tied to any particular religious framework. The internet age brought new dimensions to the practice, with social media allowing people to publicly declare their resolutions, fitness tracking apps enabling data-driven self-improvement, and online communities providing support and accountability.

Common New Year’s Resolutions

Resolutions tend to reflect both cultural priorities and universal human aspirations. When researchers survey what people resolve to change, recurring themes emerge that tell us something about areas of discontent in contemporary life. Health and fitness consistently dominate the list, with millions of people vowing to lose weight, exercise more regularly, and eat healthier foods. The popularity of these goals reflects our sedentary modern lifestyles, abundant processed foods, and the cultural premium placed on physical appearance and wellness.

Personal development goals are another major category. People promise themselves they will finally learn that new skill they’ve been putting off, read more books instead of scrolling through social media, and manage their time better to reduce stress and increase productivity. These resolutions speak to a desire for intellectual growth and a nagging sense that we’re not living up to our full potential.

Financial goals also rank high on most people’s resolution lists. Many resolve to save more money for the future, pay off debts that have been accumulating, or stick to a budget instead of impulse spending. These financial resolutions often stem from anxiety about economic security and a recognition that small daily choices compound into major financial consequences over time.

Relationship and community-focused resolutions reflect our social nature and the loneliness epidemic affecting many developed nations. People vow to spend more quality time with family and friends rather than staying busy with work and distractions. They plan to volunteer and to give back to their communities in meaningful ways. They hope to strengthen the social bonds that are crucial to happiness and longevity.

Finally, breaking bad habits remains a perennial favorite. Traditional vices like smoking and excessive alcohol consumption still top many lists, but modern resolutions also target newer concerns like limiting screen time and reducing smartphone addiction. These goals acknowledge how difficult it is to maintain healthy habits in an environment designed to encourage overconsumption and instant gratification.

The Success Rate of Resolutions

Despite their enduring popularity, New Year’s resolutions are notoriously difficult to keep. Multiple studies estimate that approximately 80% of resolutions fail by February, often crashing and burning within just a few days of January 1st. The reasons for this high failure rate are both psychological and practical. Many people set overly ambitious goals without considering the realistic constraints of their lives or the sustained effort needed for meaningful change. Others make vague resolutions like “get healthier” without specific action steps or measurable milestones.

Research in behavioral psychology suggests that setting realistic, measurable, and time-bound goals—often called SMART goals (Specific, Measurable, Achievable, Relevant, and Time-bound)—can significantly improve success rates. Rather than resolving to “exercise more,” for example, a SMART goal would be “go to the gym for 30 minutes every Monday, Wednesday, and Friday morning.” The specificity provides clear direction, and the measurability allows for tracking progress and celebrating small victories along the way.

However, it’s worth noting that most people approach their New Year’s resolutions more as a fun tradition than with serious anticipation that they will actually keep them. There’s a ritualistic, almost playful quality to the practice—we know the odds are against us, but we participate anyway, embracing the hopeful symbolism of a fresh start even if we suspect we’ll be back to our old habits before Valentine’s Day.

The Significance of Resolutions Today

New Year’s resolutions persist across centuries and cultures because they align with a fundamental human desire for self-improvement and the psychological comfort of fresh starts. The appeal of marking time with calendars and treating January 1st as somehow special—despite being astronomically arbitrary—speaks to our need for narrative structure in our lives. Whether rooted in ancient Babylonian pledges to repay debts, Roman sacrifices to Janus, Christian vows of spiritual renewal, or modern goals to lose ten pounds, resolutions represent an enduring belief in the potential for change.

The tradition reminds us that humans have always struggled with the gap between who we are and who we aspire to be, and that we’ve always believed, however naively, that marking a new beginning on the calendar might help us bridge that gap. Even if our resolutions fail more often than they succeed, the very act of making them reaffirms our agency and our hope that we can become better versions of ourselves with just a bit of conscious effort.

Sources:

History.com provides comprehensive coverage of New Year’s resolution traditions: https://www.history.com/news/the-history-of-new-years-resolutions

Britannica offers detailed information on Janus and Roman New Year traditions: https://www.britannica.com/topic/Janus-Roman-god

The Smithsonian Magazine explores New Year’s countdown traditions and their historical context: https://www.smithsonianmag.com/science-nature/why-do-we-count-down-to-the-new-year-180961433/

Anthony Aveni’s “The Book of the Year: A Brief History of Our Seasonal Holidays” provides scholarly analysis of New Year’s traditions across cultures.

Kaila Curry’s article “The Ancient History of New Year’s Resolutions” traces the practice from Babylonian times through modern era.

Joshua O’Driscoll’s research on “The Peacock Vows” documents medieval chivalric New Year’s traditions, excerpted in various historical compilations.

The Freemasons and the Founding Fathers: Secret Society or Just a Really Good Book Club?

You’ve probably heard the whispers—the Freemasons secretly controlled the American Revolution, George Washington wore a special apron, and there’s a hidden pyramid on the dollar bill. It’s the kind of thing that sounds like it came straight from a Nicolas Cage movie. But like most historical legends, the real story is more interesting (and less conspiratorial) than the mythology.

So, what’s the actual deal with Freemasons and America’s founding? Let’s dig in.

What Even Is Freemasonry?

First things first: Freemasonry started out as actual stonemasons’ guilds back in medieval Europe—think guys who built cathedrals sharing trade secrets. But by the early 1700s, it had transformed into something completely different: a philosophical club where educated men gathered to discuss big ideas about morality, reason, and how to be better humans.

The secrecy? That was part of the appeal. Lodges had rituals and passwords, sure, but the core values weren’t exactly hidden. Freemasons were all about Enlightenment thinking—liberty, equality, the pursuit of knowledge. Basically, the kind of stuff that gets you excited if you’re the type who actually enjoys reading philosophy books.

In colonial America, joining a Masonic lodge was a bit like joining an elite networking group today, except instead of swapping business cards, you discussed natural rights and wore fancy aprons. Lawyers, merchants, printers—the educated professional class—flocked to lodges for both the intellectual stimulation and the social connections.

The Founding Fathers: Who Was Actually In?

Let’s separate fact from fiction when it comes to which founders were card-carrying Masons.

Definitely Masons:

George Washington became a Master Mason at 21 in 1753. He wasn’t the most active member—he didn’t attend meetings constantly—but he took it seriously enough to wear his Masonic apron when he laid the cornerstone of the U.S. Capitol in 1793. That’s a pretty public endorsement.

Benjamin Franklin was perhaps the most dedicated Mason among the founders. Initiated in 1731, he eventually became Grand Master of Pennsylvania’s Grand Lodge and helped establish lodges in France during his diplomatic stint. Franklin was basically the poster child for Enlightenment Masonry.

Paul Revere—yes, that Paul Revere—was Grand Master of Massachusetts. His midnight ride gets all the attention, but his Masonic connections were just as important to his Revolutionary activities.

John Hancock also served as Grand Master of Massachusetts. His oversized signature on the Declaration was matched by his outsized commitment to Masonic ideals.

John Marshall, the Chief Justice who shaped American constitutional law, was a dedicated Mason. So was James Monroe, the fifth president.

Here’s a fun stat: of the 56 signers of the Declaration of Independence, at least nine (about 16%) were Masons. Among the 39 who signed the Constitution, roughly thirteen (33%) belonged to the fraternity.

The Maybes:

Thomas Jefferson? Probably not a Mason, despite endless conspiracy theories. There’s no solid evidence of membership, though his Enlightenment philosophy certainly sounded Masonic. His buddy the Marquis de Lafayette was definitely in, which hasn’t helped dispel the rumors.

Alexander Hamilton? The evidence is murky. Some historians think his writings hint at Masonic sympathies, but there’s no membership record.

Definitely Not:

John Adams wasn’t a Mason and was actually skeptical of secret societies. He still believed in many of the same principles, though—virtue, republican government, that sort of thing.

Did the Masons Really Influence the Revolution?

Here’s where it gets interesting. No, the Freemasons didn’t sit around a lodge plotting revolution like some shadowy cabal. But did their ideas and networks matter? Absolutely.

Think about what Masonic lodges provided: a space where educated colonists could meet, discuss radical ideas about natural rights and self-governance, and build trust across colonial boundaries—all without British officials breathing down their necks. These lodges brought together men from different colonies, different religious backgrounds (Anglicans, Quakers, Deists), and different social classes.

The radical part? Inside a lodge, everyone met “on the level.” It didn’t matter if you were born rich or poor—merit and virtue determined your standing. That’s pretty revolutionary thinking in the 1700s when most of the world still believed some people were just born better than others. Sound familiar? “All men are created equal” has a similar ring to it.

Freemasonry also championed religious tolerance. You had to believe in some kind of Supreme Being, but that was it—no specific creed required. This ecumenical approach directly influenced the founders’ commitment to religious freedom and separation of church and state.

The Masonic motto about moving “from darkness to light” through knowledge wasn’t just ritualistic mumbo-jumbo. It reflected genuine Enlightenment belief in reason and progress—the same intellectual current that powered revolutionary thinking.

What About All That Symbolism?

Okay, let’s address the pyramid and the all-seeing eye on the dollar bill. Are they Masonic? Maybe, maybe not. The Great Seal of the United States definitely uses imagery that Masons also used—but so did lots of 18th-century groups drawing on Enlightenment and classical symbolism. The connection is debated among historians.

What’s undeniable is that Masonic culture emphasized architecture and building as metaphors for constructing a just society. When Washington laid that Capitol cornerstone in his Masonic apron, he was making a statement about building something enduring and meaningful.

The “Conspiracy” Question

Let’s be clear: there was no Masonic conspiracy to create America. The fraternity wasn’t even unified—lodges operated independently, and members included both patriots and loyalists. Officially, Masonic organizations tried to stay neutral during the Revolution, though obviously that didn’t work out perfectly when the war split families and communities.

What is true is that many of the Revolution’s most articulate, influential leaders happened to be Masons. And the fraternity’s values—liberty, equality, reason, fraternity—aligned perfectly with revolutionary ideology. Correlation, not conspiracy.

After the Revolution, Freemasonry exploded in popularity. It became associated with the Enlightenment values that had supposedly won the day. Future presidents including Andrew Jackson, James Polk, and Theodore Roosevelt were all Masons. At its 19th-century peak, an estimated one in five American men belonged to a lodge.

What’s the Bottom Line?

The Freemason influence on America’s founding is real, but it’s cultural rather than conspiratorial. The lodges provided a space where Enlightenment ideas could circulate, where colonial leaders could build networks of trust, and where egalitarian principles could be practiced in miniature.

Washington, Franklin, Hancock, and the others weren’t sitting in smoke-filled rooms with secret handshakes planning to overthrow the British crown. They were part of a broader philosophical movement that valued personal improvement, moral virtue, and human rights. The Masonic lodge was one venue—among many—where those ideas took root.

Freemasonry was one tributary feeding into the river of revolutionary thought, along with classical republicanism, British common law, various religious traditions, and plain old grievances about taxes and representation.

The real story is somehow simpler and more fascinating than the conspiracy theories: a bunch of educated colonists joined a fraternity that encouraged them to think big thoughts about human nature and just governance. Those thoughts, debated in lodges and taverns and town halls, eventually sparked a revolution.

Not because of secret symbols or mysterious rituals, but because ideas about liberty and equality—once you start taking them seriously—are genuinely revolutionary.

True confession—The Grumpy Doc is not now, nor has he ever been, a Mason.

The Fascinating Journey of Christmas Cards: From Victorian Innovation to Global Tradition

Have you ever wondered how the tradition of sending Christmas cards got started? It’s a story that combines busy social calendars, a new postal system, and one clever solution that became a worldwide phenomenon.

Before Christmas Cards: The Early Messengers

Long before anyone thought to mass-produce holiday greetings, people were already experimenting with seasonal messages. In fifteenth-century Germany, the “Andachtsbilder” appeared—proto-greeting cards with religious imagery, usually depicting the baby Jesus, accompanied by the inscription “Ein gut selig jar” (“A good and blessed year”) and presented as gifts during the Christmas season. Handwritten letters wishing “Merry Christmas” date from as early as 1534. These weren’t Christmas cards as we know them, but they laid the groundwork.

The first known Christmas card was sent in 1611 by Michael Maier, a German physician, to King James I of England and his son, with an elaborate greeting celebrating “the birthday of the Sacred King.” This, however, was an ornate document rather than a mass-produced card. The true breakthrough came much later.

In the late 1700s, British schoolchildren were creating their own versions. They would take large sheets of decorated writing paper and pen messages like “Love to Dearest Mummy at the Christmas Season” to show their parents how much their handwriting had improved over the year. It was part homework assignment, part holiday greeting—definitely more practical than sentimental!

Also during the latter part of the 18th century, wealthy British families adopted a more personal variant: handwritten holiday letters. These were carefully composed greetings expressing seasonal goodwill and family updates, often decorated with small flourishes or illustrations—a forerunner of the much-maligned Christmas letter. In Victorian England—where social correspondence was almost an art form—sending letters for Christmas and New Year became fashionable among the middle class. The combination of widespread literacy and improvements in the postal system laid the groundwork for something new: a printed, affordable Christmas greeting.

The Birth of the Modern Christmas Card

The real game-changer came in 1843, thanks to a social problem that sounds remarkably modern: too many people to keep in touch with and not enough time. Henry Cole, a prominent civil servant, helped establish the Penny Post postal system—named after the cost of posting a letter.  He found himself with unanswered mail piling up during the busy Christmas season. His solution? Why not create one design that could be sent to everyone?

Cole commissioned his friend, artist John Callcott Horsley, to design what would become the world’s first commercial Christmas card. The design featured three generations of the Cole family raising a toast in celebration, surrounded by scenes depicting acts of charity. The message was simple: “A Merry Christmas and a Happy New Year to You.”

About 2,050 cards were printed in two versions—a black and white version for sixpence and a hand-colored version for one shilling. Interestingly, the card caused some controversy. The image showed young children enjoying glasses of wine with their family, which upset the Victorian temperance movement.

The Penny Post, introduced in 1840, made mailing affordable and accessible. What started as Cole’s time-saving solution quickly caught on among his friends and acquaintances, though it took a few decades for the tradition to really explode in popularity.

Crossing the Atlantic

Christmas cards made their way to America in the late 1840s, but they were expensive luxuries at first. In 1875, Louis Prang, a German-born printer who had worked on early cards in England, began mass-producing cards in America. He made them affordable for average families. His first cards featured flowers, plants, and children. By the 1880s, Prang was producing over five million cards annually.

Christmas cards spread rapidly with improvements in both postal systems and printing. Victorian cards often featured sentimental, elaborate images—sometimes anthropomorphic animals or unexpected motifs. The Hall Brothers Company (later Hallmark) shifted the format to folded cards in envelopes rather than postcards, allowing for more personal written messages—setting the standard still seen today.

The 20th century brought both industrialization and personalization to the Christmas card. Advances in color printing, photography, and mass marketing meant that cards became cheaper and more varied. In the 1920s and 1930s, families began sending cards featuring their own photographs, a tradition that gained momentum after World War II with the rise of suburban life and inexpensive cameras. By the 1950s and 1960s, Christmas cards had become a fixture of middle-class life. Designs reflected changing tastes—from sentimental Victorian nostalgia to sleek mid-century modernism. Surprisingly, the first known Christmas card with a personal photo was sent by Annie Oakley in 1891, using a photo taken during a visit to Scotland.

Christmas Cards Around the World Today

Fast forward to today, and Christmas card traditions vary wildly depending on where you are. In Great Britain and the US, sending cards remains a major tradition. British people send around 55 cards per year on average, with Christmas cards accounting for almost half of all greeting card sales.

But the tradition looks quite different in other parts of the world. In Japan, where only about 1.5% of the population is Christian, Christmas is celebrated as a secular, romantic holiday rather than a religious one. Christmas Eve is treated similarly to Valentine’s Day, with couples exchanging gifts. Rather than Christmas cards, many Japanese send nengajo—New Year’s cards—to friends, family, and business associates, expressing wishes for a happy and prosperous year.

In the Philippines, one of Asia’s most Christian nations, Christmas is celebrated with incredible enthusiasm starting as early as September, with the season officially beginning with nine days of dawn masses on December 16. Cards are part of the celebration, but they’re just one element of an extended, community-focused holiday.

In Australia, the tradition of sending handwritten Christmas cards remains popular despite the summer heat.  Australian cards often feature unique imagery—Santa in shorts and sandals, or kangaroos instead of reindeer, adapting the tradition to local culture.

The Digital Shift

Today, while e-cards and social media posts have certainly cut into traditional card sales, many people still cherish the ritual of sending and receiving physical cards. There’s something irreplaceable about finding a thoughtful card in your mailbox among the bills and advertisements.

What started as Henry Cole’s practical solution to a busy social calendar has evolved into a diverse global tradition, adapted and reimagined by different cultures worldwide. Whether you’re mailing elaborate family photo cards, sending quick e-greetings, or exchanging romantic messages in Tokyo, you’re participating in a tradition that’s nearly two centuries old.

Banned, Blessed, and Brewed

How Coffee Conquered the World

I don’t know about you, but I can’t get moving in the morning without a cup of coffee—or, if I’m honest, about three. Coffee has been a faithful companion through late nights and early mornings for most of my adult life.

I’ve written about it before, but there’s one story I’ve never shared—the time coffee actually sent me to the hospital.

A Pain in the Chest (and a Lesson Learned)

It happened not long before I turned forty. Back then, forty felt ancient. I started getting chest pains bad enough to send me to a cardiologist. After a battery of expensive tests, he said, “I don’t know what’s causing your pain, but it’s not your heart. Go see your family doctor.”

Problem was, I didn’t have one. (This was before I thought seriously about medical school.) So, I found a doctor, went in for a full workup, and after all the poking and prodding he casually asked, “How much coffee do you drink?”

“About eight cups a day,” I told him.

He raised an eyebrow. “You need to stop that.”

I asked if he really thought that was the problem. He didn’t hesitate—“Absolutely.”

This was before anyone talked much about reflux, at least not the way we do now. But I quit coffee cold turkey, and just like that, the chest pain disappeared.

These days I’ve learned my limit: three cups in the morning, and that’s it. Any more and the reflux reminds me who’s in charge.

It’s funny how something so simple can be both a comfort and a curse. Still, for all its quirks, I wouldn’t trade that first morning cup for anything.

From Goats to Global Obsession

My little coffee story fits neatly into a much older one. For centuries, coffee has stirred passion and controversy in equal measures. Its history is full of smuggling, religion, politics—and even the occasional threat of beheading.

The story begins in the Ethiopian highlands, in a region called Kaffa—possibly the origin of the word coffee. Wild Coffea arabica plants grew there long before anyone thought to roast their seeds.

According to legend, around 850 CE a goat herder named Kaldi noticed his goats acting wildly energetic after eating the red berries. We will never know if Kaldi was real or just a great marketing story.

By the 1400s, Yemeni traders had brought coffee plants from Ethiopia across the Red Sea to Yemen. The first recorded coffee drinker was Sheikh Jamal-al-Din al-Dhabhani of Aden, around 1454. He and other Sufi mystics used the brew to stay alert during long nights of prayer—a kind of early spiritual espresso shot.

Coffee and the Muslim World

By 1514, coffee had reached Mecca and through the early 1500s it spread across Egypt and North Africa, beginning in the Yemeni port of Mocha (yes, that Mocha). Coffeehouses—qahveh khaneh—sprang up everywhere. They were the original social networks: lively centers for news, politics, debate, and gossip, often called “Schools of the Wise.”

Coffee also had its critics. Some Muslim scholars debated whether it was halal, arguing that its stimulating effect made it suspiciously close to an intoxicant.

The governor of Mecca banned coffee altogether, calling coffeehouses hotbeds of sedition. Thirteen years later the Ottoman sultan lifted the ban, recognizing that you can’t outlaw people’s favorite drink. Similar bans came and went—including one by Sultan Murad IV in the 1600s, who reportedly made drinking coffee a capital crime. It didn’t work. Coffee had already conquered the Middle East.

Europe’s Complicated Love Affair

When coffee reached Europe—most likely through Venetian traders—it faced new suspicion. To many Europeans, coffee was “the drink of the infidel,” something foreign and threatening.

Some Catholic priests went so far as to call it “the bitter invention of Satan” or “the wine of Araby.” The issue was both secular and theological—wine played a central role in Christian ritual, while Muslims, forbidden to drink wine, had elevated coffee to their own social centerpiece.

Then around 1600 Pope Clement VIII joined the debate. Instead of banning coffee, he decided to try it first. The story goes that he found it so delicious he “baptized” it, declaring it too good to leave to the infidels.

True or not, coffee won papal approval—and from there, Europe was hooked. Coffeehouses spread like wildfire.

In England, they were called “penny universities” because for the price of a penny (the cost of a cup), you could join conversations on politics, science, and philosophy. Coffeehouses became the fuel of the Enlightenment—an alternative to taverns and alehouses. King Charles II tried to ban them in 1675, fearing they encouraged sedition, but public outrage forced him to back down.

The Global Takeover

For a long time, Yemen held a monopoly on coffee exports, carefully boiling or roasting beans to prevent anyone from planting them elsewhere. But where there’s money there’s smuggling.

The Dutch managed to steal a few live plants in 1616 and began growing them in Ceylon and Java—hence the nickname “java.” The French followed suit, planting coffee across the Caribbean. One French officer famously smuggled a single seedling to Martinique in 1723; within fifty years, it had produced over 18 million trees.

Brazil entered the scene in 1727 when Francisco de Melo Palheta snuck seeds out of French Guiana. Brazil’s climate proved perfect, and before long, it became the world’s coffee superpower.

The Bitter Truth

Coffee’s global spread had a dark side. Its plantations across the Caribbean and Latin America were built on enslaved labor. The beverage that fueled Enlightenment discussion in Europe was produced through brutality and exploitation in the colonies.

That’s the paradox of coffee—it has always been both a social leveler and a symbol of inequality.

Why It Still Matters

From Ethiopia’s wild forests to Ottoman coffeehouses, from Parisian salons to Brazilian plantations, coffee’s story mirrors the forces that shaped our modern world—trade, religion, colonization, and globalization.

That cup you’re sipping this morning connects you to centuries of human ingenuity, faith, conflict, and resilience.

Your latte isn’t just caffeine—it’s history in a cup.

Pistols at Dawn: The Rise and Fall of the Code Duello

Not long ago I was watching a news show and one of the panelists started talking about “a duel of words” that went on in a congressional hearing. I was intrigued by the use of the word duel and I thought I’d look into the history of this strange custom.

In the age before Twitter feuds, internet trolling, and legal settlements, honor was defended with pistols at dawn. The Code Duello, a set of rules governing dueling, offers a fascinating glimpse into how ideas of masculinity, reputation, and justice shaped public and private life in the Anglo-American world from the mid-18th century through the antebellum era.

The Code Duello emerged as one of the most distinctive and controversial aspects of genteel culture in the American colonies and the early United States. This elaborate system of honor-based combat, imported from European aristocratic traditions, would profoundly shape American society between 1750 and 1860, creating a culture where personal honor often trumped legal authority and where violence became a sanctioned means of dispute resolution among the elite.

European Origins 

The Code Duello originated in Renaissance Italy and spread throughout European aristocratic circles as a means of settling disputes while maintaining social hierarchy. The practice reached the American colonies through British and Continental European settlers who brought with them deeply ingrained notions of honor, reputation, and gentlemanly conduct. Unlike random violence or brawling, dueling operated under strict protocols that emphasized courage, skill, and adherence to prescribed rituals.

The most influential codification was the Irish Code Duello of 1777, written by gentlemen of Tipperary and Galway. This twenty-six-rule system established procedures for issuing challenges, selecting weapons, determining conditions of combat, and defining acceptable outcomes. The code emphasized that dueling was a privilege of gentlemen, requiring both participants to be of equal social standing and ensuring that honor could only be satisfied through formal, regulated combat.

Colonial Implementation and Adaptation

The first recorded American duel occurred in 1621 in Plymouth, Massachusetts, between two servants, but the practice soon became the exclusive domain of elites as only “gentlemen” were considered to possess honor worth defending in this way.

The Irish Code Duello was widely adopted in America, though often with local variations. In 1838, South Carolina Governor John Lyde Wilson published an “Americanized” version, known as the Wilson Code, which further codified the practice for the southern states and attempted to increase negotiated settlements. These codes served as the de facto law of honor, even as formal legal systems struggled to suppress dueling.

The practice gained prominence in southern plantation society, whose emphasis on personal honor suited dueling well. The ritual was highly formal: challenges were issued in writing, seconds (assistants to the duelists) attempted to mediate, weapons were chosen, and terms were carefully negotiated.

Colonial dueling adapted European practices to American circumstances. While European duels often involved swords, reflecting centuries of aristocratic martial tradition, American duelists increasingly favored pistols, which were more readily available and required less specialized training. This shift democratized dueling to some extent, as pistol proficiency was more easily acquired than swordsmanship, though the practice remained largely restricted to the upper classes.

The Revolutionary War significantly expanded dueling’s influence. Military service brought together men from different regions and social backgrounds, spreading dueling customs beyond their original geographic and social boundaries. Officers who had learned European military traditions during the conflict carried these practices into civilian life, establishing dueling as a marker of martial virtue and gentlemanly status.

The Early Republic

Following independence, dueling became increasingly institutionalized in American society.  The young republic’s political culture, characterized by intense partisan conflict and personal attacks in newspapers, created numerous opportunities for perceived slights to honor that demanded satisfaction through combat.

The most famous American duel occurred in 1804 when Aaron Burr killed Alexander Hamilton at Weehawken, New Jersey. This encounter exemplified both the power and the contradictions of dueling culture. Hamilton, despite philosophical opposition to dueling, felt compelled to accept Burr’s challenge to maintain his political viability. The duel’s outcome effectively ended Burr’s political career and demonstrated how adherence to the code could destroy the very honor it purported to defend.

Prior to becoming president, Andrew Jackson took part in at least three duels, although he is rumored to have been in many more. In his most famous duel, Jackson shot and killed a man who had insulted his wife. Jackson was also wounded in the duel and carried the bullet in his chest for the rest of his life.

Political dueling reached epidemic proportions in the antebellum period. Congressional representatives, senators, and other public figures regularly challenged opponents to combat over policy disagreements or personal insults. The practice became so common that some politicians deliberately provoked duels to enhance their reputation for courage, while others saw dueling as essential to maintaining credibility in public life.

Regional Variations and Social Dynamics

Dueling culture varied significantly across regions. The South developed the most elaborate and persistent dueling traditions, where the practice became intimately connected with concepts of honor, masculinity, and social hierarchy that would later influence Confederate military culture. Southern dueling codes often emphasized elaborate rituals and multiple exchanges of fire, reflecting a culture that viewed honor as more important than life itself.

Northern attitudes toward dueling were more ambivalent. While many Northern elites participated in dueling, the practice faced stronger opposition from religious groups, legal authorities, and emerging middle-class values that emphasized commerce over honor. Anti-dueling societies formed in several Northern cities, and some states enacted specific anti-dueling legislation, though enforcement remained inconsistent. Laws against it were passed in several colonies as early as the mid-18th century, with harsh penalties including denial of Christian burial for duelists killed in combat. Clergy denounced it as un-Christian, and reformers sought to eradicate it, but the practice persisted, especially in regions where courts were weak or social hierarchies unstable. The South, with its less institutionalized markets and governance, saw dueling as a quicker, more reliable way to settle disputes.

Western frontier regions adapted dueling to their own circumstances, often emphasizing practical marksmanship over elaborate ceremony. Frontier dueling tended to be less formal than Eastern practices, but it served similar functions in establishing social hierarchies and resolving disputes in areas where legal institutions remained weak.

Decline and Legacy

By the 1850s, dueling faced increasing opposition from legal, religious, and social reform movements. The rise of professional journalism, which could destroy reputations without resort to violence, provided alternative means of defending honor. Changing economic conditions that emphasized commercial success over martial virtue gradually undermined dueling’s social foundations.

The Civil War marked dueling’s effective end as a significant social institution. The massive scale of organized violence made individual combat seem anachronistic, while post-war society increasingly emphasized industrial progress over aristocratic honor. Though isolated duels continued into the 1870s, the practice lost its central role in American elite culture.

The Code Duello’s legacy extended far beyond its formal practice. It established patterns of violence, honor, and masculine identity that would influence American culture for generations, contributing to regional differences in attitudes toward violence and honor that persist today. The code’s emphasis on individual resolution of disputes also reflected broader American skepticism toward institutional authority, helping shape a culture that often preferred private justice to public law.

How the Code Duello Shaped Western Gunfighting Culture

The Code Duello was a script for settling personal disputes through controlled violence. Its influence waned in the East by the mid-1800s, but many of its ideas persisted, especially among military veterans, Southern transplants, and frontiersmen. As the American frontier expanded, the ethic of “settling scores” through personal combat found fertile ground in the West. What changed was the style and setting.

From Pistols at Dawn to High Noon

In the Code Duello, challenges were typically issued in writing, often with formal language and designated seconds. A duel was planned, often days in advance, and fought with flintlock pistols or swords. By contrast, gunfights in the Old West were more spontaneous, often provoked by insults, cheating, or long-standing feuds. Still, both forms were ultimately about defending personal honor in public view.

Gunfighters like Wild Bill Hickok and Wyatt Earp became mythologized partly because they embodied an honor-based culture in an environment where the law was weak or slow. In many ways, the Western gunfight was an informal, democratized version of the Code Duello, stripped of its aristocratic pretenses but keeping its emotional and symbolic core.

Myth vs. Reality

Ironically, formal duels were relatively rare in the actual Old West, and many “gunfights” were closer to ambushes or drunken brawls than ritualized combat. But dime novels, Wild West shows, and later Hollywood films reimagined them using a Code Duello-like template: two men meet face to face, in broad daylight, to resolve a conflict through a test of nerve and skill. The image of the high-noon shootout—with a silent crowd, an agreed time and place, and an implied code of fairness—is the Code Duello in cowboy boots, but it likely never existed.

The Duel That Never Was

I will end the discussion of the Code Duello with what may be one of the most unusual of all American dueling stories.

In 1842, Abraham Lincoln became embroiled in a public dispute with James Shields, the auditor of Illinois, largely over Illinois state banking policy and some satirical letters that mocked Shields. Shields took great offense at these attacks—particularly the ones written by Lincoln under the pseudonym “Rebecca”—and formally challenged Lincoln to a duel. According to the rules of dueling, Lincoln, as the one challenged, had the right to choose the weapons. He selected cavalry broadswords of the largest size to take advantage of his own height and reach over Shields.

The Duel’s Outcome

The duel was scheduled for September 22, 1842, on Bloody Island, a sandbar in the Mississippi River near Alton, Illinois—the island lay on the Missouri side of the river, where dueling was still legal. On the day of the duel, before any blood was shed, Lincoln dramatically demonstrated his advantage by slicing off a high tree branch with his broadsword, showcasing his reach and physical prowess. After witnessing this, and following negotiations by their seconds, Shields and Lincoln called off the duel, resolving their differences without violence.

Legacy

Although the duel never resulted in violence, it became a notorious episode in Lincoln’s life, one he rarely spoke of later, even when asked about it.  The event is commonly cited as a reflection of Lincoln’s quick wit, physical presence, and preference for peaceful resolution when possible.  While Abraham Lincoln never actually fought a duel, he was briefly a participant in one of the more colorful near-duels of American political history.

A Final Thought

Perhaps the world would be a better place if we reinstituted some elements of the Code Duello: instead of sending armies off to fight bloody battles, national leaders would settle disputes by individual combat. I suspect there would be many more negotiated settlements.

Bread and Circuses: From Ancient Rome to Modern America

“Already long ago, from when we sold our vote to no man, the People have abdicated our duties; for the People who once upon a time handed out military command, high civil office, legions — everything, now restrains itself and anxiously desires for just two things: bread and circuses.”

Nearly 2,000 years ago, Roman satirist Juvenal penned one of history’s most enduring political observations: “Two things only the people anxiously desire — bread and circuses.” Writing around 100 CE in his Satire X, Juvenal wasn’t celebrating this phenomenon—he was lamenting it. The poet watched as Roman citizens traded their political engagement for free grain and spectacular entertainment, becoming passive spectators rather than active participants in public life. The phrase has endured for nearly two millennia as shorthand for a troubling political dynamic: entertainment and consumption replacing civic engagement and accountability.

The Roman Warning

Juvenal’s critique came at a pivotal moment in Roman history. The republic had collapsed, and emperors like Augustus had systematically dismantled democratic institutions. Rather than revolt, Roman citizens seemed content as long as the government provided basic sustenance (the grain dole called annona) and elaborate spectacles at venues like the Colosseum. Political participation withered as people focused on immediate pleasures rather than long-term civic responsibilities.

The strategy worked brilliantly for Roman rulers. Keep the masses fed and entertained, and they won’t question your authority or demand meaningful representation. It was political control through distraction—a form of soft authoritarianism that maintained order without overt oppression.  The policy was effective in the short term—peace in the streets and loyalty to the emperors—but disastrous over time. Rome’s population became disengaged from politics, while real power consolidated in the hands of a few.

Modern American Parallels

Fast-forward to contemporary America, and Juvenal’s observation feels uncomfortably relevant. While we don’t have gladiatorial games, we do have our own version of “circuses”—professional sports, reality TV, social media feeds, and celebrity culture that dominate public attention. These aren’t inherently problematic, but they become concerning when they crowd out civic engagement.

Our modern “bread” takes various forms: government assistance programs, subsidies, and economic policies designed to maintain consumer spending. We are saturated with cheap goods, instant delivery services, and mass consumerism. For many, economic struggles are temporarily softened by accessible consumption, from fast food to online shopping. Yet material comfort often masks deeper inequalities and systemic challenges—wage stagnation, healthcare costs, and mounting national debt. These programs often serve legitimate purposes, but they can also function as political tools to maintain public satisfaction and suppress dissent.

Consider how political campaigns increasingly focus on entertainment value rather than substantive policy debates. Politicians hire social media managers and appear on talk shows, understanding that capturing attention often matters more than presenting coherent governance plans. Meanwhile, voter turnout for local elections—where citizens have the most direct impact—remains dismally low.

The Distraction Economy

Perhaps most striking is how our information landscape mirrors Roman spectacles. We’re bombarded with sensational news, viral content, and manufactured controversies that generate strong emotional reactions but little productive action. Complex policy issues get reduced to soundbites and memes, making genuine democratic deliberation increasingly difficult.

Social media algorithms are specifically optimized for engagement, not enlightenment. They feed us content designed to provoke reactions—anger, outrage, schadenfreude—rather than encourage thoughtful consideration of difficult issues. This creates a population that feels politically engaged through constant consumption of political content while remaining largely passive in actual civic participation.

The danger of “bread and circuses” in modern America lies in apathy. When civic participation declines, voter turnout falls, and policy debates get reduced to simplistic slogans, elites face less scrutiny. The result is a weakened democracy, vulnerable to manipulation and short-term thinking.

Breaking the Cycle

Juvenal’s warning doesn’t mean we should abandon entertainment or social programs. Rather, it suggests we need intentional balance. Democratic societies thrive when citizens remain actively engaged in governance beyond just voting every few years.

This means staying informed about local issues, attending town halls, contacting representatives, and participating in community organizations. It means choosing substance over spectacle and long-term thinking over immediate gratification.

The Roman Republic fell partly because its citizens stopped paying attention to governance. Juvenal’s “bread and circuses” reminds us that democracy requires constant vigilance—and that comfortable distraction can be freedom’s most seductive enemy.
