Grumpy opinions about everything.


Grumpy opinions about American history

A Bleak Christmas

 Surviving the Winter at Valley Forge

Christmas at Valley Forge in 1777 was a somber affair for the Continental Army. On December 19, weary soldiers arrived at the encampment after a string of defeats and the loss of Philadelphia to British forces. They faced immediate challenges: inadequate shelter, scarce provisions, disease, and the onset of a harsh winter. Although the construction of over 1,500 log huts provided some relief, many troops lacked proper clothing and shoes, enduring bitter cold with little protection.

The army’s religious diversity shaped Christmas observances. Denominations like Episcopalians and Lutherans celebrated the holiday, while others, including Quakers and Presbyterians, did not. As a result, any Christmas observances were likely subdued and personal.

Amid the hardships, General George Washington sought to alleviate suffering. On Christmas Eve, he ordered each regiment to draw provisions to complete rations for the following day. Despite these efforts, Christmas morning brought little relief. Many soldiers faced the day with only “firecakes”—a meager mixture of flour and water—as their meal. The harsh conditions compelled them to spend the day building and repairing huts, collecting firewood, and foraging for food. Others dug defensive works or endured rotating guard duty through the bitter night.

While Continental soldiers struggled at Valley Forge, British forces in Philadelphia enjoyed relative comfort. British troops were quartered in colonial homes, staying warm and well-fed. Some local farmers secretly sold provisions to the British, drawn by payments in gold or silver.

Despite the immense suffering, the winter at Valley Forge marked a turning point for the Continental Army. The arrival of Baron Friedrich Wilhelm von Steuben in February 1778 brought much-needed training and discipline, transforming the army into a more effective fighting force.

In summary, Christmas at Valley Forge was a time of hardship, sacrifice, and reflection for the Continental Army. The bitter experiences of that winter tested their resolve but also laid the groundwork for their ultimate success in the fight for independence.

Here are the sources referenced for the discussion on Christmas at Valley Forge:

  1. National Park Service – Valley Forge History and Significance
    Link: nps.gov
  2. National Park Service – Christmas at Valley Forge
    Link: nps.gov
  3. Mount Vernon – George Washington at Christmas
    Link: mountvernon.org
  4. History.com – Valley Forge
    Link: history.com

From Fact to Folklore: The Evolution of Thanksgiving Traditions

Thanksgiving has become one of the most cherished holidays in the United States, steeped in tradition, gratitude, and shared meals. Its origins are often traced back to the Pilgrims’ harvest celebration in 1621, yet much of what we “know” about that event has been shaped by legend. The historical facts surrounding the first Thanksgiving differ significantly from the modern narrative, which has evolved into a romanticized story of harmony and feasting. Let’s explore what history tells us about that pivotal celebration, examining the number of attendees, the types of food served, the length of the event, and the subsequent creation of the Thanksgiving legend.

The First Thanksgiving

In the autumn of 1621, after a successful harvest, the Pilgrims at Plymouth Colony held a three-day celebration that is often considered the “first Thanksgiving.” This event marked a period of gratitude and alliance-building between the Pilgrims and the Wampanoag people, who were critical to the settlers’ survival during their first year in the New World.

Who Attended?

Approximately 90 Wampanoag men, led by Chief Massasoit, joined 50 surviving Pilgrims for the event. The Pilgrims had arrived aboard the Mayflower the previous year, with 102 passengers. However, disease, harsh conditions, and starvation during the brutal winter of 1620-1621 had decimated their numbers. By the time of the harvest feast, only about half of the original settlers remained. Among the Pilgrims, there were 22 men, 4 married women, and about 25 children and teenagers.

The Wampanoag, who had been instrumental in teaching the Pilgrims essential survival skills, were invited as honored guests.

The Menu

The food served at the 1621 gathering was vastly different from today’s traditional Thanksgiving meal.  The feast was primarily prepared by the four surviving adult Pilgrim women: Eleanor Billington, Elizabeth Hopkins, Mary Brewster, and Susanna White. They were assisted by their daughters and four household servants.

While there are no definitive records of the exact dishes, historical accounts and the resources available to the settlers provide clues:

  • Meat and Game: The primary protein source was likely wildfowl, such as ducks, geese, and possibly turkey. Deer (venison) brought by the Wampanoag was also a centerpiece.
  • Seafood: The Pilgrims relied heavily on the ocean for sustenance, so fish, clams, mussels, and possibly lobster may have been included.
  • Grains and Vegetables: Corn was a staple, though it was likely prepared as a simple porridge or bread, not the sweetened dishes we know today. Other vegetables like squash, beans, onions, and native wild plants such as Jerusalem artichokes were likely served.
  • Fruits and Nuts: Wild berries, cranberries (unsweetened), and nuts like walnuts and chestnuts may have been part of the feast.
  • Beverages: The Pilgrims likely drank water or weak beer, as clean drinking water was not always available.

Absent from the feast were many items central to a contemporary Thanksgiving, such as mashed potatoes, pumpkin pie, and sweetened cranberry sauce. Potatoes and sugar were not readily available, and ovens for baking were primitive at best.

The Celebration’s Length

The first Thanksgiving was not a single meal but rather a three-day event. The Pilgrims and Wampanoag likely engaged in feasting, games, and possibly ceremonial activities. For the Pilgrims, it was a religious occasion, giving thanks to God for their survival and harvest. For the Wampanoag, such feasts were part of their cultural traditions, celebrating seasonal abundance and community.

Myth vs History

As Thanksgiving became a national holiday, myths about the first celebration began to overshadow historical facts. Much of the modern narrative can be traced back to the 19th century, when the holiday was popularized and romanticized.

The Romanticized Myth

The traditional narrative depicts Pilgrims and Native Americans sharing a harmonious meal, much like today’s Thanksgiving dinner. This portrayal emphasizes mutual goodwill and cultural exchange, but it simplifies a far more complex reality.

This narrative began to take shape in the mid-19th century when writer Sarah Josepha Hale, editor of Godey’s Lady’s Book, campaigned to make Thanksgiving a national holiday. In 1863, during the Civil War, President Abraham Lincoln declared Thanksgiving a national holiday, emphasizing unity and gratitude. Hale’s writings, along with paintings and school textbooks, reinforced the idyllic imagery of Pilgrims and Native Americans dining together peacefully.

The Historical Complexities

While there was cooperation and mutual benefit between the Pilgrims and Wampanoag during the early years of Plymouth Colony, the relationship was far more complex than the legend suggests. The Wampanoag helped the settlers survive, teaching them to fish and to grow corn in the unfamiliar landscape. However, this alliance was forged out of necessity. The Wampanoag were seeking allies against rival tribes, and the Pilgrims needed help to avoid starvation.

Furthermore, the long-term relationship between European settlers and Native Americans was marked by conflict, displacement, and violence. By the late 17th century, tensions had escalated into King Philip’s War (1675-1678), one of the bloodiest conflicts in colonial American history, leading to the near-destruction of the Wampanoag people. These later events cast a shadow over the harmony celebrated in Thanksgiving lore.

Thanksgiving’s Evolution Over Time

As the centuries passed, the story of this harvest feast evolved into something far removed from its origins.

The mythologizing of Thanksgiving served a broader cultural purpose. During the 19th century, the holiday was framed as a uniquely American tradition, emphasizing family, gratitude, and unity at a time when the nation was deeply divided.

In modern times, Thanksgiving has become a secular holiday centered on food, family, and football, often disconnected from its historical roots. While many still reflect on gratitude, the original religious significance observed by the Pilgrims has largely faded. Similarly, the role of Native Americans in the holiday’s origins is often reduced to a simplistic narrative, overshadowing the complex history of their interactions with settlers.

Reclaiming the Story

In recent years, schools and communities have been actively reshaping the Thanksgiving narrative to present a more accurate and inclusive account of its history. This shift aims to acknowledge the complexities of the holiday’s origins and the experiences of Indigenous peoples.

Efforts have been made to present a more nuanced understanding of Thanksgiving. For example, Native American communities use Thanksgiving as a time for remembrance, marking it as a “National Day of Mourning” to honor ancestors and reflect on the impact of colonization. Educators and historians strive to balance the narrative, acknowledging both the cooperation and conflict between Pilgrims and Native Americans.

Understanding the historical first Thanksgiving as a multi-day harvest celebration shared by two very different cultures can enrich our appreciation of the holiday. By recognizing the complexities of the Pilgrims’ survival and the Wampanoag’s contributions, we can honor the real history while still finding meaning in Thanksgiving as a time for gratitude and reflection.

Conclusion

The first Thanksgiving of 1621 was a far cry from the turkey-laden feasts of today. It was a modest harvest celebration involving around 140 people, featuring wild game, seafood, and native vegetables. The three-day event was as much about survival and diplomacy as it was about gratitude.

Over centuries, this historical gathering has transformed into a powerful national myth that emphasizes unity and abundance. While the legend simplifies and sanitizes a more complex reality, it also reflects the evolving cultural values of the United States. By understanding the truth behind the Thanksgiving story, we can celebrate the holiday with a deeper sense of history, recognizing both its origins and its modern meaning.

Thanksgiving remains a day to give thanks, share food, and connect with loved ones—but it also offers an opportunity to reflect on the broader history it represents.

The most iconic Thanksgiving image.

 Sources:

Primary Accounts of the First Thanksgiving:

  • Bradford, William. Of Plymouth Plantation. Original accounts describing the Pilgrims’ settlement and their harvest celebration in 1621.
  • Winslow, Edward. Mourt’s Relation. An early Pilgrim document providing descriptions of their experiences.

  Attendees of the First Thanksgiving:

  • Pilgrim Hall Museum. “What Happened in 1621?”

  The Menu of the 1621 Feast:

  • History Channel. “What Was on the Menu at the First Thanksgiving?”

  The Role of Women and Servants:

  • New England Historical Society. “The Women Who Cooked the First Thanksgiving.”
  • Wikipedia. “List of Mayflower Passengers.”

  Evolution of the Thanksgiving Legend:

  • Smithsonian Magazine. “The Thanksgiving Myth and What We Should Be Teaching Kids.”

  Complex Relationships Between Pilgrims and Wampanoag:

  • Smithsonian Magazine. “The History Behind the Thanksgiving Holiday.”

 War and Medicine

The Evolution of the Army Medical Corps

The history of military medicine in the United States during the 18th and 19th centuries is essentially the history of the Army Medical Corps. It is no surprise that the Army Medical Corps played a significant role in advances in battlefield medicine. However, many people do not appreciate that it also played a significant role in the treatment of infectious diseases and in improvements in general sanitation. For example, one of the first public health inoculation efforts was ordered by General George Washington to protect Continental Army troops against smallpox. Walter Reed led an Army Medical Corps team that proved yellow fever was transmitted by mosquitoes. The Army Medical Corps fielded an effective typhoid vaccine in the wake of the Spanish-American War, and in World War II it led research to develop anti-malarial drugs.

Revolutionary War and the Founding of the Army Medical Corps

The formal beginnings of military medical organization in the United States trace back to 1775, with the establishment of a Medical Department for the Continental Army. On July 27, 1775, the Continental Congress created the Army Medical Service to care for wounded soldiers. Dr. Benjamin Church was appointed as the first “Director General and Chief Physician” of the Medical Service, equivalent to today’s Surgeon General. However, Church’s tenure was brief and marred by scandal: he was exposed as a British spy who had been passing secrets to the enemy.

Church’s arrest in 1775 created a leadership vacuum, and the fledgling medical service had to reorganize quickly under Dr. John Morgan, who became the second Director General. Morgan sought to professionalize the medical corps, emphasizing proper record-keeping and standards of care. However, the Revolutionary War medical system struggled with limited resources, inadequate supplies, poor funding and an overworked staff. The lack of an effective supply chain for medicine, bandages, and surgical instruments was a significant issue throughout the conflict.

Early Challenges in Battlefield Medicine

During the Revolutionary War, military medical practices were rudimentary. Medical knowledge and understanding of disease processes had advanced little since the days of ancient Greece. Medical training was inconsistent and was delivered principally through apprenticeship. In 1775 there were only two small medical schools in all of the thirteen colonies, and one of them closed with the onset of the Revolution.

Field surgeons primarily treated gunshot wounds, fractures, and infections. Most treatments were painful and often involved amputation, as this was one of the few ways to prevent infections from spreading in an era without antibiotics. Battlefield medicine was further hampered by the fact that surgeons often had to work without proper sanitation or anesthesia.

One of the most significant health challenges faced by the Continental Army was disease, including smallpox, typhoid, dysentery, and typhus. In fact, more soldiers died from disease than from combat injuries. Recognizing the threat of smallpox, General George Washington made the controversial but strategic decision in 1777 to inoculate his troops against smallpox, significantly reducing mortality and helping to preserve the fighting force. At Valley Forge, almost half of the Continental troops were unfit for duty due to scabies infestation, and approximately 1,700 to 2,000 soldiers died of complications of typhoid and diarrhea.

It’s estimated that there were approximately 25,000 deaths among American soldiers, both Continental and militia, in the American Revolution. An estimated 7,000 died from battlefield wounds; an additional 17,000 to 18,000 died from disease and infection. This loss of soldiers to non-combat deaths has been one of the biggest challenges faced by the Army Medical Corps through much of its history.

Post-Revolution: Developing a Medical Framework (1783-1812)

After the Revolutionary War, the United States Army Medical Department went through a period of instability. There were ongoing debates about the structure and necessity of a standing army and medical service in peacetime. However, the need for an organized military medical service became apparent during the War of 1812. The war underscored the importance of medical organization, especially in terms of logistics and transportation of the wounded.

The Army Medical Department grew, and by 1818, the government established the position of Surgeon General. Joseph Lovell became the first to officially hold the title of Surgeon General of the United States Army. Lovell introduced improvements to record-keeping and hospital management and laid the groundwork for future medical advances, though the department remained small and under-resourced.

Advancements in Military Medicine: The Mexican-American War (1846-1848)

The Mexican-American War provided an opportunity for the Army Medical Corps to refine its practices. Field hospitals were more structured, and new surgical techniques were tested. However, disease continued to be a significant challenge; yellow fever and dysentery plagued American troops. The war also underscored the importance of sanitation in camps, though knowledge about disease transmission was still limited.

The aftermath of the Mexican-American War saw the construction of permanent military hospitals and better organization of medical personnel, setting the stage for the much larger and more complex demands of the Civil War.

Civil War: The Birth of Modern Battlefield Medicine (1861-1865)

The Civil War represented a turning point in military medicine, with significant advances in both battlefield care and medical logistics. By the start of the war, the Army Medical Corps was better organized than during previous conflicts, though it still faced many challenges. Jonathan Letterman, the Medical Director of the Army of the Potomac, revolutionized battlefield medicine by creating the Letterman System, which included:

  1. Field Dressing Stations: Located near the front lines to provide immediate care.
  2. Ambulance System: Trained ambulance drivers transported wounded soldiers from the battlefield to hospitals.
  3. Field Hospitals and General Hospitals: These provided surgical care and longer-term treatment.

The Civil War saw the introduction of anesthesia (chloroform and ether), which reduced the suffering of wounded soldiers and made more complex surgeries possible. However, infection remained a major problem, as antiseptic techniques were not yet widely practiced and the germ theory of disease and infection was poorly understood. Surgeons worked in unsanitary conditions, often reusing instruments without sterilization and frequently doing little more than rinsing the blood off their hands between patients.

Sanitation and Public Health Measures

One of the most critical lessons of the Civil War was the importance of camp sanitation and disease prevention. Dr. William Hammond, appointed Surgeon General in 1862, emphasized the need for hygiene and camp inspections. Under his leadership, new regulations improved the quality of food and water supplies. Though disease still claimed many lives, these efforts marked the beginning of a more systematic approach to military public health.

Additionally, the United States Sanitary Commission (USSC) was established in 1861. It was a civilian organization created to support the Union Army by promoting sanitary practices and improving medical care for soldiers. Its objectives included improving camp sanitation, providing medical supplies, promoting hygiene and preventive care, supporting wounded soldiers, and advocating for soldiers’ welfare.

Hammond also promoted the use of the Army Medical Museum to collect specimens and study diseases, fostering a more scientific approach to military medicine. Though he faced resistance from some military leaders, his reforms laid the foundation for modern military medical practices.

Conclusion

The evolution of the Army Medical Corps from the Revolutionary War to the Civil War reflects a gradual shift from rudimentary care to more organized, systematic medical practices. Early efforts were hindered by leadership issues, such as the betrayal by Benjamin Church, and by the challenges of disease and limited resources. However, over the decades, the Army Medical Department improved its structure, introduced innovations like inoculation and anesthesia, and laid the groundwork for advances in battlefield care. The Civil War, in particular, was pivotal in transforming military medicine, with lessons in logistics, sanitation, and surgical care that would shape the future of military and civilian medical systems.

For further reading, the following sources provide excellent insights:

  • Office of Medical History – U.S. Army
  • “Gangrene and Glory: Medical Care during the American Civil War” by Frank R. Freemon

History Rocks!

Always has, always will.

Rock on!

What Would George Washington and Thomas Jefferson Think About Our Current Political Climate?

In considering what George Washington and Thomas Jefferson might think of today’s political situation, it’s tempting to view their perspectives through the lens of nostalgia, believing that the founders had an idealistic vision that, if followed, would have prevented many modern problems. It’s impossible, of course, to know what they would have thought about our current environment. Certainly, such things as a 24-hour news cycle on cable television and social media would have been beyond their comprehension. While both men lived in a vastly different era, their writings and philosophies give us a sense of how they might respond to the polarization and tensions we witness today.

George Washington: A Warning Against Partisanship

George Washington was deeply concerned about the rise of factions in the United States. (Political parties as such were unknown at the beginning of our republic.) In his famous Farewell Address in 1796, he warned that factions could lead to division and weaken the unity of the country. Washington was worried that faction (party) loyalty would surpass loyalty to the nation, creating conflict between groups and impairing the ability of government to function for the common good. He feared that excessive partisanship would “distract the public councils and enfeeble the public administration,” leaving the nation vulnerable to foreign influence and internal discord.

If Washington could observe today’s political environment, he likely would be saddened by the partisanship which dominates political discourse. The gridlock, belligerent rhetoric, and divisiveness we experience today demonstrate the appropriateness of his concern. Washington would likely advocate for a return to greater civility, urging Americans to focus on the common good and to set aside factionalism for the sake of national unity. While political parties have become integral to our system, Washington would likely still press for cooperation, mutual respect, and compromise among all groups.

Thomas Jefferson: Liberty, Democracy, and the People’s Role

Thomas Jefferson, while more supportive of political parties than Washington, had his own complex views about governance. Jefferson believed in the power of the people to govern themselves and was a passionate advocate for liberty, democracy, and decentralization. He distrusted concentrated power, whether in government or economic institutions, and feared that it could lead to tyranny. Jefferson was famously a champion of agrarianism and believed that widespread participation in the democratic process was the best defense against corruption and the loss of liberty.

Jefferson, while a proponent of states’ rights and individual liberties, might view polarization as a threat to democratic ideals if it stifles dialogue and compromise. He believed in the potential for free men to govern wisely, but would caution against the erosion of civil discourse that might follow the rise of extreme factionalism.

Faced with the highly charged political debates of today, Jefferson would likely express concern over the increasing centralization of power in government, banks, and large corporations. He would, without doubt, be troubled by the outsized influence of money in politics.

Jefferson was also a firm believer in education as a cornerstone of democracy; he would stress the importance of an informed electorate, particularly in an age where misinformation can spread rapidly.

However, Jefferson was no stranger to political conflict, having played a central role in the fiercely partisan battles of his time. He understood the value of vigorous debate but would probably urge that such debate remain focused on the core democratic principles of liberty, justice, and equality rather than devolving into personal attacks.

Media and Civil Discourse

Of course, it is impossible to know what Washington and Jefferson would think about the current role of media, particularly social media, which would be beyond anything in their experience. Washington felt strongly aggrieved by the attacks upon him in the newspapers of his time. He felt unfair attacks would undermine national unity. Jefferson, on the other hand, was a strong proponent of freedom of the press. He was also very adept at using newspapers to accomplish political ends.

However, it is likely that both would caution against the power of misinformation and partisan bias to distort public perception. Most likely both would emphasize the need for a responsible press that distinguishes between fact and opinion and supports a healthy democracy. Both would be opposed to using false or misleading statements to influence the public.

Unity and Civic Responsibility

Despite their differences, both Washington and Jefferson would likely agree on one thing: the importance of unity and civic responsibility. They envisioned a country where citizens were deeply involved in a participatory government, contributing not just with votes but with informed, constructive dialogue. Washington would call for a spirit of national unity above party lines, while Jefferson would insist that the preservation of liberty relies on active and informed participation from the public.

Both founders would encourage a healthier, more cooperative political environment, one where differences are respected and not allowed to fracture the country. They would likely see today’s polarization as a threat to the very ideals they fought to establish, and both would urge Americans to remember their shared values.

Conclusion

In short, George Washington and Thomas Jefferson, while men of their own time, had insights that are still relevant today. Neither man could have predicted the exact nature of modern politics, but their wisdom offers enduring guidance: political disagreements must not undermine the unity, liberty, and civic responsibility that are the foundation of the American experiment.  We owe it to them not to lose the promise of the American Revolution.

A Book Report

On Tyranny: Twenty Lessons From The Twentieth Century

Timothy Snyder

I don’t believe I have written a book report since I was a freshman in college. This is a small book that I believe is well worth your time to read. I mean small quite literally: it’s about 4 1/2 inches by 6 1/2 inches and only about 126 pages of fairly widely spaced text. You can read the entire book in an afternoon and still have time left over. The book is also available as an audiobook, combined with Twenty New Lessons From Russia’s War On Ukraine. I got the audiobook and listened to it as well. That will take about eight hours to complete.

I’m calling this a book report rather than a book review because I want to tell you what’s in the book rather than what I think about it, so that you can form your own opinion. First, I’ll give you an overall impression. It’s thought-provoking and raises issues of concern to us today. You may not agree with it. I don’t agree with everything he said. I found some of his 20 lessons to be redundant, because I thought he was stretching to get 20 lessons for the 20th century. I also found one or two of them to be painfully obvious. What follows is just a summary of the book. Read it and e-mail me or call me if you’d like to discuss it. I will send the book to the first person who asks for it, but only if you promise to pass it on when you’re finished.

Doctor Snyder is a professor of history at Yale University. He specializes in Central and Eastern Europe and has spent considerable time in Ukraine. He is the author of 15 books, speaks five European languages, and reads ten.

Preface

The book begins with a preface in which Doctor Snyder establishes the importance of historical awareness. He describes the failure of some 20th century European democracies and their descent into authoritarian rule. He uses these examples to illustrate how democracy cannot be taken for granted. He makes the argument that democracy is fragile and must be defended.  He further argues that even in the United States, the continuation of our democracy is not a given and we must fight for freedom and maintain constant vigilance over our democratic processes.

Each of his twenty lessons is contained in a separate chapter, varying from a few pages to a single sentence. 

 The Twenty Lessons

  1. Do Not Obey In Advance: Many authoritarian regimes have achieved their power from the people’s willingness to conform without being forced. Do not comply with unjust policies that are not being actively enforced.
  2. Defend Institutions: Democratic institutions such as courts, the press, and unions are an important part of a free society. We must support and defend them against attacks and subversion.
  3. Beware The One-Party State:  A multiparty system is crucial for the survival of liberty. The rise of a one-party system often precedes the decline of democratic government.
  4. Take Responsibility For The Face Of The World: The use of symbols in public spaces matters. Authoritarian regimes can shape public perceptions by co-opting national symbols and using them to create division rather than unity. Be vigilant for and challenge symbols of hate and remove these symbols from our public spaces.
  5. Remember Professional Ethics: Professionals, especially those in law, medicine, education and government service must uphold ethical standards, even when pressured to conform to the aims of authoritarian regimes. Their integrity is vital for maintaining democracy.
  6. Be Wary of Paramilitaries: Resist the rise of unofficial armed groups that support authoritarian figures. Such groups may eventually take the place of the legally constituted police and military services.  “When the men with guns who have always claimed to be against the system start wearing uniforms and marching with torches and pictures of a leader, the end is nigh. When the pro-leader paramilitary and official police and military intermingle, the end has come.”
  7. Be Reflective If You Must Be Armed: If you are part of law enforcement or the military, refuse to participate in unjust actions or repressions of the population. Ensure that those who bear arms do so with a deep sense of responsibility and ethics and a great regard for the value of democracy.
  8. Stand Out: Authoritarian regimes thrive on compliance. Demonstrate opposition publicly, show visible dissent, and encourage others to show resistance whenever possible.
  9. Be Kind To Our Language: Be precise in your use of language and avoid repeating the phrases made popular by authoritarian leaders. Independent thinking is facilitated by the careful use of language. Manipulation of language is a powerful tool for the control of the population.
  10. Believe In Truth: Truth is the linchpin of justice and democracy. Uphold and defend objective truths. Reject the false narratives or “big lies” spread by authoritarian regimes. Make every attempt to shed light on their falsehoods regardless of how often or loudly they proclaim them.
  11. Investigate: Seek out reliable sources of information. Support independent journalism. Identify, evaluate and encourage trusted sources. Do not accept any claims by the authoritarian regime without your personal verification. Remember critical thinking is important to resisting manipulation.
  12. Make Eye Contact and Small Talk: Build personal connections within the community to foster solidarity and resilience. Visible civic involvement demonstrates unity and a commitment to democratic principles.
  13. Practice Corporeal Politics: Participate in protests, marches, and public events. Physically demonstrating support and civic engagement shows a strong commitment to democratic principles.
  14. Establish A Private Life:  Protecting your privacy from surveillance is essential. Authoritarian regimes often invade personal spaces to control and manipulate individuals.
  15. Contribute To Good Causes: Support organizations and causes that defend democracy and human rights. Both financial and moral support will strengthen resistance.
  16. Learn From Peers In Other Countries:  Understand how some nations have succumbed to authoritarianism while other nations have resisted it. Learn from their experiences and strategies.
  17. Listen for Dangerous Words:  Be alert to the use of language that demonizes opponents or glorifies violence. Such rhetoric often precedes more severe actions against targeted groups. Be sensitive to efforts to convert the population to harmful ideologies through repeated derogatory speech.
  18. Be Calm When The Unthinkable Arrives: Stay composed in moments of crisis and resist the urge to act impulsively. Authoritarian leaders often attempt to exploit chaos to seize power. Be alert for such attempts and be ready to counter them.
  19. Be A Patriot: True patriotism involves defending democratic values and principles, not blind loyalty to a leader or party.  Waving flags and shouting slogans is not patriotism. Defending democracy is the true mark of a patriot. 
  20. Be As Courageous As You Can: “If none of us is prepared to die for freedom, then all of us will die under tyranny.” (That is the total of this lesson; perhaps the most important one of all.)

Epilogue

The book concludes with an epilogue entitled History and Liberty. In the epilogue Doctor Snyder discusses his theories of the Politics of Inevitability and the Politics of Eternity.

The Politics of Inevitability is the belief that history naturally progresses in a linear forward direction, typically toward a better future. It assumes that liberal democracy and capitalism will inevitably spread across the world, leading to a more prosperous and freer global system. Historical forces are seen as predictable, and human agency is often downplayed. People who live under the politics of inevitability tend to believe that the current state of affairs will persist because it represents the end point of political and economic evolution.

The Politics of Eternity, on the other hand, rejects the linear progression of history and emphasizes a cyclical view. It supposes that nations are perpetually threatened by external forces. It glorifies a supposed golden age of the past that is idealized and mythologized. It views current politics as a struggle to return to that idealized past.

I found his discussions to be more theoretical than practical. I can see aspects of both in most current societies. I invite you to please read this and I would love to discuss it. I may be missing something in his presentation of inevitability and eternity.

In Conclusion

If you are concerned about the survival of democracy, read this book.

 

Despotism In America

Recently, I was listening to a series of lectures based on Democracy in America, the classic review of politics and society in the United States during the 1830s. Alexis de Tocqueville (1805 – 1859) was a young Frenchman who visited the United States for nine months in 1831 and 1832. Ostensibly, he was here at the behest of the French government to review the prison system. His personal goals were much broader.

He and a friend, Gustave de Beaumont, visited much of the United States. They interviewed citizens, reviewed documents, attended community meetings, and observed federal, state, and local governmental activities of all branches: executive, legislative, and judicial. They also collected books, newspapers, and documents. They visited cities and rural areas in the North and in the South. They even ventured as far as Wisconsin, then the western edge of the American frontier.

While they did produce a report on American prisons, which were then relatively progressive compared to those in the rest of the world, de Tocqueville had in mind all along that he would write a critique of the United States as he saw it. This eventually became a four-volume work published between 1835 and 1840.

I first became familiar with de Tocqueville when I read a much-abridged version of Democracy in America for an Early American History course. I believe it was probably about 250 pages. That is brief compared to the 926-page behemoth that I recently bought online.

I was interested not so much in what I remembered from my previous reading of his works as I was with what I didn’t remember. In particular, in one of his last chapters, de Tocqueville talks about the conditions under which despotism may arise in America.

As I have done previously with the writings of historic figures, I’m going to present de Tocqueville’s words without comment or analysis by me. Keep in mind that he wrote nearly 180 years ago. What is remarkable is not that he got some things wrong, but how much insight he had into the problems that might one day arise in America.

The excerpts in this post are from Book 4, Chapter 6: What Sort of Despotism Democratic Nations Have to Fear.

I had remarked during my stay in the United States, that a democratic state of society, similar to that of the Americans, might offer singular facilities for the establishment of despotism; and I perceived, upon my return to Europe, how much use had already been made by most of our rulers, of the notions, the sentiments, and the wants engendered by this same social condition, for the purpose of extending the circle of their power.

But it would seem, that if despotism were to be established among the democratic nations of our days, it might assume a different character; it would be more extensive and more mild; it would degrade men without tormenting them.

I think then that the species of oppression by which democratic nations are menaced is unlike anything which ever existed before in the world: our contemporaries will find no prototypes in their memories. I am trying myself to choose an expression which will accurately convey the whole of the idea I have formed of it, but in vain; the old words “despotism” and “tyranny” are inappropriate: the thing itself is new; and since I cannot name it, I must attempt to define it.

The first thing that strikes the observation is an innumerable multitude of men, all equal and alike, incessantly endeavoring to procure the petty and paltry pleasures with which they glut their lives. Each of them, living apart, is a stranger to the fate of all the rest – his children and his private friends constitute to him the whole world of mankind; as for the rest of his fellow citizens, he feels them not; he exists but in himself and for himself alone; and if his kindred still remain to him, he may be said at any rate to have lost his country.

Above this race of men stands an immense and tutelary power… That power is absolute, minute, regular, provident, and mild. It would be like the authority of a parent, if, like that authority, its object was to prepare men for manhood; but it seeks on the contrary to keep them in perpetual childhood: it is well content that the people should rejoice, provided they think of nothing but rejoicing.

… What remains, but to spare them all the cares of thinking and all the troubles of living?

After having thus successively taken each member of the community into its powerful grasp, and fashioned them at will, the supreme power then extends its arm over the whole community. It covers the surface of society with a network of small, complicated rules, minute and uniform, through which the most original minds and the most energetic characters cannot penetrate, to rise above the crowd. The will of man is not shattered, but softened, bent, and guided: men are seldom forced to act but they’re constantly restrained from acting… It does not tyrannize but it compresses, enervates, extinguishes, and stupefies the people…

Subjugation in minor affairs breaks out every day, and is felt by the whole community indiscriminately. It does not drive men to resistance, but it crosses them at every turn, till they are led to surrender the exercise of their will.

It is in vain to summon the people, which has been rendered so dependent on the central power, to choose from time to time the representative of that power; this rare and brief exercise of their free choice, however important it may be, will not prevent them from gradually losing the faculties of thinking, feeling, and acting for themselves, and thus gradually falling below the level of humanity. I add that they will soon become incapable of exercising the great and only privilege which remains to them.

 The nations of our time cannot prevent the conditions of men from becoming equal; but it depends upon themselves whether the principle of equality is to lead them to servitude or freedom, to knowledge or barbarism, to prosperity or to wretchedness.

The illustration at the beginning of this post is not intended to be a portrait of de Tocqueville, but rather illustrative of the time.

More Than Just Fake News: The Pernicious Effect Of Modern Propaganda

Propaganda does not deceive people; it merely helps them to deceive themselves.  Eric Hoffer

What is propaganda?

Propaganda! The very word conjures up images of sinister people involved in nefarious activities meant to delude the innocent. But this has not always been the case. Propaganda has, through much of history, been viewed as information, though frequently of a biased or misleading nature, used to promote or publicize a particular political cause or point of view.

Propaganda has always involved exaggeration and omission in order to achieve a specific goal.  It was intended to shape beliefs and attitudes without actually lying to the listeners. At its core, there was a basis of truth.

We generally think of propaganda as the domain of governments.  But, in its broadest definition, advertising might be considered as propaganda. It’s intended to create the impression that specific products contribute real advantage to your life.  Drinking a specific beer will make you have a better time. Driving a certain car will show that you are more environmentally concerned. Wearing specific clothes will make you more popular.

It wasn’t until the 20th century that the incorporation of falsehoods, deception, and other activities intended to create a totally false impression and to promulgate untruths became the mainstay of propaganda.

Philip Taylor, in his book “Munitions of the Mind,” presents an excellent history of propaganda from its origins in the early years of civilization, through its rapid evolution in the 20th century, to its infiltration of all aspects of society in the 21st century.

Propaganda began as early as ancient Mesopotamia, when the boastings of kings were inscribed on stone monuments. It continued, principally as a way for monarchs to justify their rule, up through the 19th century.

The earliest use of the term propaganda was in the early 17th century, when the Catholic Church, wishing to spread Catholic doctrine, support the faithful, and counter the Protestant Reformation, established the Congregation for the Propagation of the Faith (Sacra Congregatio de Propaganda Fide).

World War I saw the beginnings of the disconnection of propaganda and truth. Both sides in that war created knowingly false narratives to bolster civilian morale and increase the fighting spirit of their soldiers. World War II took this process to a whole new level as false propaganda was used to justify mass murder and enslavement of an entire continent.  In the 21st century propaganda techniques have been raised to a new level of technical sophistication. Social media, artificial intelligence and modern psychological techniques can create images, sounds and documents completely unrelated to reality but almost impossible for the average person to recognize as false.

Elements of propaganda.

One of the classic elements of propaganda is repetition: the more a statement is repeated, the more likely people are to believe it. There is a concept called the “illusory truth effect,” whereby the more often you hear a statement, the more it feels true.

In past centuries, reference was made to respected people in authority to give credence to statements.  Over the years, this has evolved into celebrity endorsements and continues to expand with the recent emergence of instant celebrities in the form of social media influencers.  

Emotional appeals have always been a significant part of propaganda, emotions being more easily manipulated than facts. The audience is encouraged to react rather than think.

Simplification is also a central tenet of propaganda; complex ideas are reduced to simple slogans that can be repeated over and over again. Slogans that are catchy and clever will encourage people to repeat them without considering their true meaning.

The repeated use of slogans contributes to the bandwagon effect, a critical propaganda technique for creating the impression of widespread acceptance. The more a person believes everyone else supports the program, the more likely they will be to support it without detailed personal analysis. 

Evolving propaganda.

In the early years of the 20th century, propaganda began to take a more malicious path. It began to lose its grounding in truth, except where necessary to sell the lie. As propaganda evolved through the first few decades of the 20th century, it became a specialized and highly effective weapon of statecraft.

It’s important to recognize that the ultimate goal of propaganda is not merely manipulating opinions and beliefs. It is a tool for obtaining and using political power.

The following quote, which I will leave unattributed, underlies the objective of propaganda from the mid-20th century on.

 “All propaganda has to be popular and has to accommodate itself to the comprehension of the least intelligent of those whom it seeks to reach. The great mass of the people will more easily fall victim to a big lie than to a small one. If you tell a lie that is big enough and tell it frequently enough, it will be believed.”

Propaganda in Action

A propaganda program that is designed to achieve political goals has several key elements.

The Target

The first step is to decide on the target population. These are the people you wish to cultivate as supporters and whom you wish to manipulate into specific actions. It’s important to understand what they consider to be their critical concerns. Whether you share those concerns or not isn’t important if you are able to convince the target population that you care about them and that you will meet their needs. Once you have analyzed the concerns of your target population you can develop your message to best appeal to and manage their opinions.

The Leader

The second element is to create a cult of personality around the leader. Generally, the leader will be a charismatic and effective speaker. On other occasions, he simply may be someone they would “like to have a beer with”. If a bond can be created it doesn’t matter how. The leader doesn’t have to have a true concern for the target group as long as they believe he does.  Once the leader and the target group have bonded, he will have an easier time manipulating them.  The stronger they are connected to him personally, the less scrutiny they will give to his ideas.

The Others

The next element is to identify the “other” group that will be the focus of attacks. The first step is to create fear of this group. Once your target population has developed a significant fear of whatever this group may be accused of, be it crime, immorality, or “un-Americanism”, a program is put in place to demonize them. The purpose of the early program is to generate a high level of unreasoning fear of this group within the target population. Fear is difficult to control, so once this stage has been reached, the fear must be converted to hate through repeated attacks blaming the “others” for every grievance the target group has experienced. Hate is easier to focus and to direct. People can be more easily rallied to action, even violence, in response to hate.

Action

Once hate of the “other” group has been raised to a significant level, your target population can be moved to action, be that unquestioning acceptance of ideas, voting for whatever candidates you identify, or even resorting to violence to suppress the “others”.

This is the stage where real political power begins to flow from your propaganda program.  Your supporters have given up all efforts at critical thinking and blindly accept whatever orders you give in the misguided thought that you are concerned about them and their needs and are doing what is best for them and the country.  They have become the weapon for implementing your agenda.

Conclusion

For those of you with an appreciation of history, this should resonate not only with the 20th century but with current events. If you would like to know the source of the quote I gave at the beginning of this section, contact me. 

Having seen the effects of modern propaganda on our society, I am left in great despair.  In a future post I’m going to be discussing how social media has significantly increased the rate of spread and the effectiveness of propaganda and other disinformation programs.

The Agony of Presidential Elections

Over the past several months I have become increasingly discouraged as I watch the sad spectacle of the presidential election play out in the newspapers and on the electronic media. It seems the candidates are more interested in serving themselves than in serving the people. Each is convinced that he is the only one who can save the country. While one is obsessed with regaining the power of the president, the other is reluctant to relinquish it.

This has left me wondering if there’s not a better way to elect the president. As everyone knows, we elect a president for four years, and then they can be reelected for an additional four years. But this was not always the way our electoral process was structured.

A brief history of presidential terms

Perhaps second only to slavery, the office of president was one of the most controversial subjects at the 1787 Constitutional Convention. There were extensive debates about the structure of the office, the power of the chief executive, and even if it should be a single person.  There was a proposal put forward by Edmund Randolph of Virginia for a three-person executive committee to head the government. 

There were proposals for a short-term presidency of one or two years, arguing that more frequent elections would ensure the chief executive was more responsible to the people. There were also proposals for longer terms such as 7 or even 15 years. The intent of longer terms was to provide the president with independence from the influence of special interests and provide for more stable government.

There is a popular misconception that Alexander Hamilton favored a president for life. While he advocated for a strong presidency with certain aspects of a monarch, he also recognized the importance of a fixed term and the provision for impeachment to safeguard the country against abuse of power.

After it became clear to most delegates that George Washington would become the first president, the convention decided on a four-year term for the president. The Constitution as originally adopted did not explicitly state that the president could be reelected, nor did it prohibit reelection. Not all delegates were in favor of allowing the president to be reelected, because they felt it would result in too much power being consolidated in the hands of a single person. As often happens in politics, no decision was made, and the Constitution as adopted was silent on reelection.

The issue of term length did not end with the ratification of the constitution. The first proposed constitutional amendment to change the length of the presidential term was introduced in 1808. Since then, multiple amendments have been proposed to extend the term to five, six, seven or even eight years. By the early 1900s the single six-year term had become the dominant idea being proposed. Amendments to change the term of the presidential office were introduced as late as the 1990s. Although persistent in their reappearance, none have ever had any realistic chance of being approved.

 George Washington set the precedent of serving only two terms by retiring to Mount Vernon at the end of eight years in office. This pattern was followed until Franklin Roosevelt ran for a third and then a fourth term.  After Roosevelt’s death, the 22nd Amendment to the Constitution was passed by Congress in 1947, and ratified in 1951, limiting the president to two total terms in office. 

Other countries have different patterns for their chief executive’s term of office. Parliamentary countries have no set term. Elections can be called, and the chief executive removed from office whenever confidence in the government has fallen, and the public demands a change.  Other countries such as Mexico, the Philippines and Chile have a single six-year term for president.

If we were to consider a single six-year term for the US president, how would that change things? Let’s look at a few of the pros and cons.

The arguments in favor of a single six-year term

The president would be relieved of the burden of an almost constant reelection campaign. With a single term, a president can focus on policy and governance without being distracted by the next election cycle. This may lead to bolder, more decisive leadership, unencumbered by the need to cater to special interest groups. Those groups may be less inclined to make large political contributions knowing that they hold no future sway over the actions of the president.

There may be potential for greater continuity in policymaking. Presidents could be more inclined to tackle unpopular long-term challenges head-on, rather than deferring them to a second term or even to their successor. The president may be more likely to engage in bipartisan programs knowing that there is no need to appease party radicals during a reelection process.  The president could be more likely to engage in long term planning rather than worry about what will look good in the polls in the short term. This may result in fewer political decisions made just to improve reelection chances.

Not having to run for reelection would also give the president more time to work for the country and the citizens. The time and the effort put into planning campaigns, making speeches, and attending reelection events could be spent improving the government and the country. If the president has more time to work closely with Congress, then there could be less gridlock in Washington.  In sum, the single six-year term may allow the president to be the statesman they all claim to be.

The arguments against a single six-year term

A single six-year term could diminish the accountability of the president to the electorate. In a traditional two-term system, presidents are incentivized to deliver on their promises and perform effectively to secure re-election. With only one term to serve, there may be less pressure for a president to maintain high levels of performance throughout their tenure. This could result in complacency or a lack of motivation to pursue ambitious reforms, knowing there won’t be a chance for the electorate to hold them accountable.

A single six-year term could disrupt the balance of power between the executive and legislative branches. With the absence of a potential second term, presidents may become more inclined to bypass Congress and govern through executive orders and unilateral actions. This could lead to increased polarization and gridlock, as Congress may resist executive overreach, exacerbating tensions between branches of government.

If the president proves inept or stops acting in the best interests of the people and the nation, six years is a long time for that person to remain in office. The only remedy would be impeachment, and we know from recent experience that impeachment is a long and painful process that frequently generates more ill will than good results.

And in conclusion…

So, would a single six-year presidential term be a good thing or a bad thing?  Would it help us get out of our current political mess?

Perhaps it would have no impact at all. Perhaps it’s not the length of the presidential term, the number of terms the president serves, or even how we elect the president. Perhaps the real problem lies with us. It doesn’t matter how we elect our president if we don’t do a better job deciding who the candidates will be. It’s been said many times that people get the government they deserve. We all complain about it and yet we continue voting for the same type of people year after year. 

Of all the people to blame for the current political situation, the one in the mirror is the one to whom we should look first.  Because, after all, that’s the only person we can control.

And that is my grumpy opinion.

New Myths Arise

So why should we consider myths anything but an anachronistic curiosity, given that we consider ourselves a rational, scientific society? Because the willingness to believe in myths is as strong today as it has ever been. While belief in Olympian gods, elves, and fairies has faded away, new myths have arisen. Since the late 1800s, at least two new myths have spread in the United States.

The Golden Age of America

Our political situation today has much to do with belief in the Myth of the Golden Age of America. I strongly believe that the United States has been and continues to be the best hope for personal freedom in the world today. But the idea that at some time in the recent past everything was wonderful for everyone is a myth. In this “Golden Age” to which some people wish to return, women, minorities, gays, and the disabled were clearly discriminated against.

One key contributor to the Golden Age Myth was the economic boom that followed World War II. The United States was a global industrial leader, and the economy showed significant gains in jobs, wages, and consumer goods. The middle class expanded, college enrollment rose thanks to the GI Bill, and home ownership reached new levels. However, these advantages did not reach all members of our society.

Racial minorities continued to face active discrimination. Segregation, particularly under the Jim Crow laws of the South, limited economic, social, and political opportunities for our Black citizens.

Women's roles and opportunities were similarly constrained. Women were expected to prioritize domestic duties over professional or personal aspirations. Even women who earned advanced college degrees were expected to stay home and raise their families, or to take “appropriate” jobs such as secretary, teacher, or nurse.

Reaction to the realities masked by the Golden Age Myth led to movements such as women's liberation and civil rights, followed shortly by gay rights and advocacy for the disabled. The Golden Age seems to have existed principally in popular television shows such as Ozzie and Harriet and Father Knows Best. In short, this “Golden Age” was not golden for everyone.

The Lost Cause

This myth arose in the South in the late 19th and early 20th centuries, though it had its origins while the war still raged. According to the Myth of the Lost Cause, the Civil War had nothing to do with slavery. It was instead about the southern states fighting for states' rights and their prerogatives of self-government, against northern aggression bent on destroying the southern way of life. Most Confederate soldiers were poor farmers who neither owned nor could afford slaves. They had to be convinced that they were fighting for their way of life against a malicious Union army intent on invading their homes and forcing “northern ways” on them. The soldiers had to be distracted from the fact that they were fighting, suffering, and dying to protect the way of life of the wealthy slave-owning aristocracy.

The evolution of legend was also involved in the creation of this myth. Robert E. Lee and Stonewall Jackson were idealized as true southern gentlemen struggling in a glorious but doomed battle against overwhelming odds.

Of course, the myth completely ignores the fact that the southern way of life and those states' rights were predominantly based on slavery. A critical element of the myth is the claim that slavery was good for the slaves. We continue to hear this stunning falsehood from politicians today as they try to describe slavery as little more than job training.

Many historians today agree that this myth is an intentional distortion of historical fact. The Lost Cause Myth reached its fully developed form in the years surrounding the turn of the 20th century and was intended to change the historical narrative of the South's role in the Civil War by minimizing the central role of slavery in the origin of the conflict. This revisionism was part of the broader social effort used to justify segregation, Jim Crow laws, and white supremacy.

The Lost Cause is a compelling example of history being reshaped by myth and legend. It shows how, over time, people can come to accept whatever most supports their personal beliefs despite evidence to the contrary. It continues to dominate much of our current debate about race relations, voting rights, and social welfare policy. Today, however, open advocacy of the Lost Cause Myth has receded into the background; it is seldom mentioned by name, though its tenets are reflected in the opinions and statements of many.

Why believe in myths?

So why do we have a widespread belief in these myths? There are several reasons people persist in a false belief even after it has been largely disproven.

The most obvious reason is that myths meet emotional needs. They can be deeply ingrained in a person's identity, beliefs, and values. When a myth is tied to political or religious beliefs, people resist changing their minds even in the face of contradictory evidence. Admitting a previous error of belief is, in some ways, viewed as a form of weakness.

There is also the phenomenon of confirmation bias: people are inclined to seek out, and accept without question, information that confirms their pre-existing beliefs and opinions, while ignoring anything inconsistent with a position they already hold.

When presented with information that conflicts with previously held opinions, people can experience what is known as cognitive dissonance. This is the emotional distress people feel when attempting to hold two contradictory ideas, or when trying to reconcile new information that challenges their behaviors or previously held views. To reduce this distress, people either ignore or, in some cases, vehemently reject anything that conflicts with what they previously believed.

Critical thinking involves objectively evaluating evidence, identifying inconsistencies, and applying reasoning skills. It isn't always a natural process. When I was in high school, we were expected to memorize facts and then duplicate them on tests. I don't remember even hearing the term critical thinking until I was in graduate school. Even then, I'm not sure I understood how important it is to develop a personal understanding of facts and events, or how difficult it would be to acquire those skills.

Teachers now make an effort to teach critical thinking skills. But it can be difficult for students to translate those skills into their lives outside the classroom. Many of them may think of critical thinking the way they think of algebra: something they must do in school but will never use in everyday life. A well-informed citizenry requires all of us to encourage critical thinking. We need to ensure that our young people are reading, listening, and using those skills outside the classroom. That is easy to say, but venturing out of our comfort zone takes strength and purpose for all of us.

We can quickly fall into the habit of listening to or reading only a narrow range of opinions from a limited number of sources. It is particularly easy when those sources don’t challenge us to think or to analyze. It is too easy to reject new information rather than trying to evaluate and reconcile it with our previously held beliefs.

Obviously, these reasons are not mutually exclusive; they blend into a continuum of explanations for the rejection of fact in favor of myth.

Both the Lost Cause Myth and the Golden Age Myth arose much more quickly, and for more limited purposes, than the classic myths. In this way their evolution has much in common with propaganda. In my next post I will look at the difference between lies and myths, how both relate to propaganda, and how propaganda has evolved in modern times.
