Grumpy opinions about everything.


Grumpy opinions about American history

What Is Fascism Anyway?

Fascist! The very word conjures up images of totalitarianism, militarism, suppression of dissent and brutality. Unfortunately, it’s become a ubiquitous part of our political discourse. Each side, at one time or another, has accused the other of being fascist. But what do they really mean by fascist? Do they understand the definition and the reality of fascism? Or do they simply mean, “I disagree with you, and I really want to make you sound evil”?

I decided I needed to know more about fascism, so I’ve done some research, and I’d like to share the results with you. As I frequently do, I’ll start with the dictionary definition. According to Merriam-Webster, fascism is “a political philosophy, movement, or regime that exalts nation and often race above the individual and that stands for a centralized autocratic government headed by a dictatorial leader, severe economic and social regimentation, and forcible suppression of opposition.”

As with many dictionary definitions, it gives us the 50,000-foot view without any real detail. What I’d like to do is cover the origins of fascism, its basic principles and how it rose to prominence in the middle of the 20th century. I also want to compare fascism to communism—another ideology that shaped much of the 20th century—and to provide insights into the differences and similarities between these two systems.

The Origins of Fascism

Fascism emerged in the early 20th century, primarily in Italy, as a reaction to the perceived failures of liberal democracy and socialism. The term itself comes from the Italian word “fascio,” meaning a bundle or group, symbolizing unity and collective strength. It also references fasces, a bundle of rods tied around an ax symbolizing authority in the Roman Republic.  It was appropriated as a symbol by Italian fascists in an attempt to identify with Roman history, much as American patriotic symbols are being appropriated by the radical right in the U.S. today.

Benito Mussolini, an Italian political leader, is often credited as the founder of fascism. He laid the groundwork for the first fascist regime in Italy beginning in 1922, after he was appointed Prime Minister. Fascism arose in a period of social and economic turmoil following the First World War. Many people in Europe were disillusioned with the existing political systems, which they believed had failed to prevent the war and its devastating consequences. The post-war economic instability, along with fears of communist revolutions like the one in Russia, provided fertile ground for the rise of fascist movements.

Mussolini, together with Italian philosopher Giovanni Gentile, published “The Doctrine of Fascism” (La Dottrina del Fascismo) in 1932, after he had consolidated political power in his hands. It lays out the guiding principles and theoretical foundations of fascism, stressing nationalism, anti-communism, the glorification of the state, the belief in a strong centralized leadership, and the rejection of liberal democracy.

The Philosophical Basis of Fascism

Fascism is rooted in several key philosophical ideas:

  • Nationalism and Militarism: Fascism places the nation or race at the center of its ideology, often elevating it to a quasi-religious status. The state is seen as a living entity that must be protected and expanded through internal police action and external military strength.
  • Authoritarianism: Fascists reject democratic institutions, believing that a strong, centralized authority is necessary to maintain order and achieve national greatness. Individual freedoms are subordinated to the needs of the state.
  • Anti-Communism and Anti-Liberalism: Fascism is explicitly opposed to both communism and liberal democracy. It views communism as a threat to national unity and social order, while liberal democracy is seen as weak and indecisive.
  • Social Darwinism: Fascists often believe in the idea of the survival of the fittest, applying this concept to nations and races. They argue that conflict and struggle are natural and necessary for the advancement of the state.

Implementation and Practice of Fascism

Fascism has been implemented in various forms, with Italy under Mussolini and Nazi Germany under Adolf Hitler being the most prominent examples. In practice, fascist regimes are characterized by:

  • Centralized Power: Fascist governments concentrate power in the hands of a single leader or party, often through the use of propaganda, censorship, political repression, and mass imprisonment and execution of opponents.
  • State Control of the Economy: While fascists generally allow for private ownership, they maintain strict control over the economy, directing resources toward the state’s goals, particularly militarization.
  • Suppression of Dissent: Fascist regimes are intolerant of opposition, often using violence, imprisonment, and even assassination to eliminate political rivals and suppress dissent.
  • Cult of Personality: Fascist leaders often create a cult of personality, presenting themselves as the embodiment of the nation and its destiny.

Comparing Fascism and Communism

While both fascism and communism reject liberal democracy, they differ significantly in their goals and methods.

  • Philosophical Differences:
    • Fascism: As mentioned earlier, fascism emphasizes nationalism, authoritarianism, and social hierarchy. It seeks to create a strong, unified state that can compete with other nations on the global stage.
    • Communism: Communism, based on the ideas of Karl Marx, advocates for a classless society where the means of production are owned collectively. It seeks to eliminate private property and achieve equality among all citizens.
  • Economic Systems:
    • Fascism: Fascists allow for private ownership but maintain state control over key industries and direct economic activity to serve the state’s interests.
    • Communism: Communism advocates for the abolition of private property, with all means of production owned and controlled by the state (or the people in theory). The economy is centrally planned and managed.
  • Political Structures:
    • Fascism: Fascist regimes are typically one-party states with a strong leader at the top. Political pluralism is non-existent, and the government exercises strict control over all aspects of life.
    • Communism: Communist states are also typically one-party systems, but they claim to represent the working class. In practice, these regimes often become highly centralized and authoritarian or totalitarian, similar to fascist states.

Comparative Examples

  • Italy and Nazi Germany (Fascism): Both Mussolini’s Italy and Hitler’s Germany exemplify fascist regimes. They were characterized by aggressive nationalism, military expansionism, and the suppression of political opposition. Hitler’s regime, however, took these ideas to their most extreme and horrifying conclusion with the Holocaust, a genocide driven by racist ideology.
  • Soviet Union (Communism): The Soviet Union under Joseph Stalin provides a clear example of a totalitarian communist state. The government abolished private property, collectivized agriculture, and implemented central planning. Political repression was severe, with millions of people imprisoned, starved to death or executed during Stalin’s purges.  It is important to recognize that Stalinist communism differed significantly from the theoretical communism of Karl Marx.

Conclusion

Fascism and communism, despite their profound differences, share certain similarities in practice, particularly in their authoritarianism and intolerance of dissent. However, their philosophical foundations and goals are fundamentally different: fascism seeks to elevate the nation above all else, while communism theoretically aims to create a classless society. Understanding these ideologies and their historical manifestations is crucial for anyone interested in the political history of the 20th century and its lasting impact on the world today. 

We can use our understanding of fascism, and how it contrasts with democracy, to ask important questions. What kind of government do we want? Are there any possible crossovers or compromises between the two? And, importantly, should there be?

Postscript

Many of the ideas in this post were inspired by two excellent books on the subject, “The Origins of Totalitarianism” by Hannah Arendt and “Fascism: A Warning” by Madeleine Albright.

Doctors of the Deep Blue Sea

A Brief History of the U.S. Navy Medical Corps

The U.S. Navy Medical Corps has evolved from humble beginnings during the Revolutionary War to its current role as a vital component of modern military medicine. The Medical Corps ensures the health and well-being of sailors, Marines, and their families, while contributing to public health and advancements in medical science.

Origins in the Revolutionary War

The roots of Navy medicine trace back to the Revolutionary War, when medical care aboard ships was primitive at best. Shipboard surgeons, often lacking formal medical training, treated injuries and disease with the limited tools and knowledge available to them. In the early days of the U.S. Navy, physicians served without formal commissions, often receiving temporary appointments for specific cruises.  Their primary tasks included amputations, treating infections, and caring for diseases like scurvy and dysentery.

In 1798, Congress formally established the Department of the Navy, creating the foundation for organized medical care within the naval service.  Surgeon Edward Cutbush published the first American text on naval medicine in 1808. The Naval Hospital Act of 1811 marked another milestone, authorizing the construction of naval hospitals to support the growing fleet.

Establishment of the Navy Medical Corps (1871)

The U.S. Navy Medical Corps was officially established on March 3, 1871, by an act of Congress. This legislation created a formal medical staff to support the Navy, setting standards for recruiting and training naval physicians. These physicians were initially known as “Surgeons” and “Assistant Surgeons,” tasked with providing care on ships and at naval hospitals. The act granted Navy physicians rank relative to their line counterparts, acknowledged their role as a staff corps, and established the title of “Surgeon General” for the Navy’s senior medical officer.

During this period, the Navy Medical Corps began to expand its scope. It embraced emerging medical technologies and scientific discoveries, setting the stage for its later contributions to public health and medical innovation.

The Navy Hospital Corps

The U.S. Navy Hospital Corps was established on June 17, 1898. Its creation was prompted by the increased medical needs during the Spanish-American War. Since then, the enlisted corpsmen have served in every conflict involving the United States, providing critical medical care on battlefields, aboard ships, and in hospitals worldwide.

Corpsmen are trained to perform a wide range of medical tasks, including emergency battlefield triage and treatment, surgery assistance, and disease prevention. They are often embedded directly with Marine Corps units, making them indispensable on the battlefield.

The Hospital Corps is the most decorated group in the U.S. Navy. To date, its members have earned numerous high-level awards for valor, including 22 Medals of Honor, 182 Navy Crosses, 946 Silver Stars, and 1,582 Bronze Stars.

World Wars and the Expansion of Military Medicine

Both World War I and World War II were transformative for the Navy Medical Corps. During World War I, Navy medical personnel treated injuries and illnesses both aboard ships and in field hospitals. Their efforts were instrumental in managing wartime epidemics, including the devastating 1918 influenza pandemic.

World War II brought further advancements. The Navy Medical Corps played a pivotal role in addressing the challenges of warfare in diverse climates, including tropical diseases in the Pacific Theater. It also pioneered methods for treating trauma, burns, and psychiatric conditions.

Cold War Era and Modernization

The Cold War era marked a time of significant innovation for the Navy Medical Corps. The establishment of the Navy Medical Research Institutes advanced studies in areas such as tropical medicine, submarine medicine, and aerospace medicine. These efforts supported the Navy’s global missions and contributed to broader medical advancements.

In the latter half of the 20th century, Navy medical personnel became key players in humanitarian missions, responding to natural disasters and providing aid in conflict zones. Their expertise in public health, infectious disease control, and trauma care enhanced the Navy’s ability to spread goodwill worldwide.

Modern Contributions and Future Challenges

Today, the Navy Medical Corps supports both military readiness and global health. Its personnel provide care aboard ships, submarines, and aircraft carriers, with Marine Corps forces, and at shore-based facilities. They also participate in humanitarian missions and disaster response, reflecting the Navy’s commitment to a broader vision of security and well-being.

In recent years, Navy medicine has faced challenges such as the COVID-19 pandemic, mental health issues among service members, and emerging threats like climate change and cyber warfare. These challenges underscore the evolving role of the Navy Medical Corps in a complex world.

From its early days of rudimentary care to its modern role in global health and innovation, the U.S. Navy Medical Corps has been a cornerstone of military medicine. Its contributions extend beyond the battlefield, shaping public health, medical research, and humanitarian efforts worldwide.

As the Navy Medical Corps continues to adapt to new challenges, it remains a testament to the enduring value of medical service in the defense of the nation and the promotion of global health.

Superheroes of the American Revolution

The American Revolution was fought not just by great leaders but by ordinary men and women who sacrificed everything for the promise of liberty. These “superheroes” of the Revolution—everyday soldiers of the Continental Army—endured unimaginable hardships and proved their resilience and commitment to a cause greater than themselves.

Who Were the Soldiers?
The typical soldier in the Continental Army was a young, able-bodied man in his late teens or twenties. However, recruits ranged widely in age, from boys as young as 16 to older men in their 40s or 50s. They came from all walks of life, reflecting the agrarian and small-town character of colonial America.

Work Background
Most soldiers were farmers or farm laborers, the backbone of the colonial economy. Others worked as apprentices or tradesmen, honing skills in blacksmithing, carpentry, and shoemaking. In coastal regions, fishermen and sailors also joined the ranks, bringing valuable maritime experience. Whatever their occupation, enlistment often meant leaving behind grueling but steady work, placing enormous burdens on their families and communities.

Education
Formal education was limited for most enlisted men. Literacy rates in colonial America, though higher than in Europe, were modest. Many soldiers could read and write only minimally, though these skills were sufficient for reading orders or sending letters home. Officers were generally better educated, often hailing from wealthier families with access to classical training and instruction in leadership and military strategy.

Family Life
Family ties were integral to the soldiers’ lives. Most were unmarried young men, but some older recruits left wives and children behind. Married soldiers relied on their families to manage farms and households in their absence, with women stepping into traditionally male roles to keep homes running. Communities often influenced enlistment decisions, with entire groups of men from the same town joining together, fostering camaraderie and mutual responsibility.

Why They Fought
Motivations for joining the Continental Army varied:
  • Patriotism: Many believed passionately in independence and the ideals of liberty and self-governance.
  • Economic Opportunity: For poorer colonists, enlistment offered steady (albeit delayed) pay and the promise of land grants after the war.
  • Community Expectations: Peer pressure and local leaders often spurred enlistments.
  • Adventure: For some young men, the army offered a chance for excitement and novelty.

Life in the Continental Army
Soldiers in the Continental Army faced extraordinary challenges that tested their endurance, commitment, and morale.

Logistical Struggles
The army constantly grappled with a lack of basic supplies:
  • Food: Soldiers often endured long periods of hunger, relying on inconsistent local contributions, sometimes going days without eating.
  • Clothing: Many lacked proper uniforms, footwear, and blankets, suffering in harsh weather; some even died from exposure.
  • Ammunition: Weapons and ammunition were scarce, forcing soldiers to scavenge from battlefields.

At Valley Forge in the winter of 1777–1778, these shortages reached a critical point, with thousands suffering from frostbite, near starvation, and exposure.

Extreme weather compounded the soldiers’ difficulties. Winter encampments like Valley Forge were marked by freezing temperatures, snow, and overcrowded, unsanitary conditions that led to outbreaks of smallpox, typhus, and dysentery.

Soldiers marched long distances with heavy packs, often on empty stomachs and in worn-out shoes. The physical strain was enormous, and separation from families added emotional stress. Many struggled to adapt to military life, which was vastly different from their previous experiences as farmers or tradesmen.

Financial Hardships
The fledgling American government struggled to fund the war:
  • Soldiers were rarely paid on time, leading to frustration and occasional mutinies.
  • Promised wages were often months or years late, making it difficult for soldiers to support their families.

Inconsistent Leadership and Training
Early in the war, the army lacked professional training and experienced leadership. While General George Washington provided steadfast guidance, many officers were political appointees with little military expertise. This began to change when Baron von Steuben arrived at Valley Forge, introducing systematic training and discipline.

Psychological Strain
The Revolutionary War dragged on for eight years, leaving soldiers to question whether their sacrifices would lead to victory. Early defeats against the better-equipped British Army demoralized many, and desertion rates were high. Still, the shared belief in the cause of liberty and the support of local communities kept many soldiers in the fight.

The Role of Communities
The army’s survival depended on civilian support. Local farmers, tradesmen, and women provided food, clothing, and moral encouragement. Civilians risked their lives to aid soldiers, and the collective belief in independence buoyed spirits even in the darkest times.

Conclusion
The common soldiers of the Continental Army were true superheroes of the American Revolution. Despite enduring hunger, cold, disease, and financial instability, they fought with unwavering determination. Their sacrifices laid the foundation for a new nation, proving that the quest for freedom often requires immense personal and collective sacrifice.

Sources:
  • Robert Middlekauff, The Glorious Cause: The American Revolution, 1763–1789
  • Caroline Cox, A Proper Sense of Honor: Service and Sacrifice in George Washington’s Army
  • Library of Congress: American Revolution resources

Waiting For The Reichstag Fire

On the evening of February 27, 1933, the German Reichstag burst into flames. This attack on the German national parliament building was viewed by many as an attack on Germany itself.

A Dutchman named Marinus van der Lubbe was found and arrested at the scene almost immediately after the fire erupted. Van der Lubbe confessed to setting the fire alone, but the Nazi Party quickly claimed that the fire was part of a widespread communist conspiracy and used this claim to push for emergency powers.

Many people believe that the Nazis may have set the fire themselves and used it as a pretext to declare emergency rule.

Adolf Hitler persuaded German President Paul von Hindenburg to issue the “Decree for the Protection of the People and the State,” which suspended civil liberties, including freedom of speech, press, and assembly. It allowed for the arrest and detention of political opponents without due process. Thousands of communists and socialists were arrested.

Within a month, new elections were held. While the Nazis did not win an outright majority, they used the fire to create fear that led to passage of the “Enabling Act” on March 23, 1933. The act gave Hitler dictatorial powers, effectively ending democracy in Germany.

The Reichstag Fire was a crucial turning point in world history. Whether it was a Nazi-engineered false flag operation or the act of a lone arsonist, it provided Hitler with the excuse he needed to dismantle democracy and establish a totalitarian dictatorship. It is a chilling example of how fear and propaganda can be weaponized to erase freedom; a lesson that remains relevant today.

The Wisdom of Dietrich Bonhoeffer

Dietrich Bonhoeffer (1906–1945) was a German Lutheran pastor, theologian, and anti-Nazi dissident. Born in Breslau, Germany (now Wrocław, Poland), into an intellectual family, he pursued theology at the University of Berlin, earning a doctorate at just 21. His early work emphasized the importance of the Church in standing against injustice, a principle that would shape his resistance to Adolf Hitler’s regime.

In the 1930s, Bonhoeffer became a leading voice in the Confessing Church, a movement opposing Nazi influence in German Protestantism. He condemned the regime’s treatment of Jews and rejected the idea of a church subservient to state ideology. After the Nazis banned him from teaching and speaking publicly, he joined the German resistance, working secretly with military officers plotting to overthrow Hitler.

Arrested in 1943 for his role in the conspiracy, Bonhoeffer was imprisoned for two years, during which he wrote some of his most profound theological works, including Letters and Papers from Prison. The quotes below are taken from this work.

On April 9, 1945, just weeks before Germany’s surrender, he was executed at Flossenbürg concentration camp. His legacy endures as a model of Christian resistance, moral courage, and faith in action.

Quotes from Letters and Papers from Prison

“The impression one gains is not so much that stupidity is a congenital defect, but that, under certain circumstances, people are made stupid or that they allow this to happen to them.”

“Having thus become a mindless tool, the stupid person will also be capable of any evil and at the same time incapable of seeing that it is evil. This is where the danger of diabolical misuse lurks, for it is this that can once and for all destroy human beings.”

“Stupidity is a more dangerous enemy of the good than malice. One may protest against evil; it can be exposed and, if need be, prevented by use of force. Against stupidity we are defenseless.”

“Neither protests nor the use of force accomplish anything here; reasons fall on deaf ears; facts that contradict one’s prejudgment simply need not be believed – in such moments the stupid person even becomes critical – and when facts are irrefutable they are just pushed aside as inconsequential, as incidental.”

“In all this the stupid person, in contrast to the malicious one, is utterly self satisfied and, being easily irritated, becomes dangerous by going on the attack.”

What Would Jefferson Think About Inserting Religion Into Public Education?

Jefferson on Religion

Thomas Jefferson had strong views on the separation of church and state, and based on his writings, it’s likely that he would have opposed any attempt to inject religion into public education.  Jefferson’s views on religion were deeply influenced by Enlightenment principles, particularly the era’s emphasis on reason, skepticism of traditional authority, and commitment to individual liberty.

While Jefferson respected personal religious beliefs, he believed religion should remain a private matter, free from government influence. His 1786 Virginia Statute for Religious Freedom declared it immoral to compel anyone to support or participate in religious activities, emphasizing individual choice in matters of faith. This stance guided his actions, including the disestablishment of the Anglican Church as the official church of Virginia after the Revolution.

He famously wrote about the need for a “wall of separation between Church & State” in his 1802 letter to the Danbury Baptist Association. This idea became one of the foundational principles behind the First Amendment’s protection of religious liberty.

Although Jefferson was not opposed to religious belief, he supported individual freedom of conscience and he was adamant that religion should be a personal matter, not one enforced, promoted, or influenced by the government.

Religion in Education

When it came to education, Jefferson was passionate about public schooling and saw it as essential to maintaining a democratic society. He believed in the importance of a secular education system that promoted knowledge and reasoning. Jefferson envisioned public education as a way to cultivate informed citizens who could participate in self-governance.

Jefferson’s University of Virginia reflected these ideals, excluding religious instruction and ensuring a secular educational environment. He insisted that religion be studied alongside philosophy and ethics, rather than as a doctrinal subject.

If Jefferson were to assess attempts to inject religion into public education today, it’s reasonable to assume he would view such efforts as a violation of the principles of religious freedom he worked to establish. Jefferson would likely argue that public education, funded by taxpayer dollars and serving people of diverse religious backgrounds, should remain secular to respect the individual rights of all citizens. For him, blending government and religion risked infringing on personal freedoms and undermining the equality of all citizens under the law.

He would probably agree with later interpretations of the Constitution, such as Supreme Court rulings that have affirmed the separation of church and state in the context of public schools. These decisions typically uphold the principle that government institutions, including public schools, should not promote or endorse any particular religion.

Thomas Jefferson’s views on religious freedom, the separation of church and state, and public education suggest that he would strongly oppose any attempt to inject religion into public education. He believed that the role of public schools was to educate citizens in a way that fosters critical thinking, civic engagement, and respect for individual liberties, including the right to practice any religion or none at all. For Jefferson, keeping religion out of public institutions was essential to preserving a free and diverse society.

Jefferson’s unwavering commitment to individual liberty and reason over dogma continues to resonate, emphasizing the enduring value of secular education in fostering democratic principles.

Reshaping Collective Memory

How Governments and Organizations Influence History

The reshaping of societal memories by governments and powerful organizations is a complex, often subtle process driven by political, cultural, or economic goals. At its core, it involves shaping collective memory—the shared pool of knowledge and information within a society—so that certain narratives or interpretations of events are emphasized, while others are diminished or erased altogether. This process can occur overtly through official policies, education, or media, or covertly through subtle shifts in cultural emphasis. This post explores historical precedents, modern examples, the methods employed, the role of large organizations, and the ethical implications of manipulating collective memory.

Historical Precedents and Modern Examples

Governments have long engaged in the manipulation of collective memory, and history is filled with examples of this practice. In the Soviet Union, leaders who fell out of favor were frequently “erased” from photographs, history books, and public memory—a practice similar to the ancient concept of damnatio memoriae, the Roman practice of condemning those deemed enemies of the state by erasing their existence from public records. Similarly, in the aftermath of revolutions, new governments often attempt to rewrite history to legitimize their rule and justify their actions. Monuments, statues, and even place names can be altered or destroyed to erase the memory of a prior regime and reimagine the past in ways that support the new political narrative.

In more recent times, authoritarian regimes have used similar tactics, from China’s control of information surrounding the Tiananmen Square protests to North Korea’s highly curated historical narrative that glorifies its leaders. Even in democratic societies, where manipulation of collective memory is often less overt, there are still examples of governments attempting to control public discourse and memory.

Methods of Restructuring Collective Memory

The restructuring of collective memory can occur in a variety of ways, ranging from subtle shifts in emphasis to overt censorship:

  1. Education and Curriculum Control: By shaping school curricula, governments emphasize certain historical events or figures, creating narratives that align with political or ideological goals.
  2. Media Control: State-influenced media outlets shape public memory by controlling the flow of information, ensuring that only certain versions of history or current events are disseminated.
  3. Censorship and Information Suppression: Governments may restrict access to documents, films, or books, effectively controlling the narratives available to the public.
  4. Commemorations and Public Symbols: Through monuments, statues, holidays, and public spaces, societies decide what to commemorate, reinforcing specific narratives.

Role of Large Organizations

While governments are often the primary actors in reshaping societal memories, large organizations such as multinational corporations, international Non-Governmental Organizations (NGOs), and global media companies also play a significant role. Corporations often use “corporate social responsibility” (CSR) initiatives to align their brands with social movements or values, subtly shaping public perceptions of historical and current events.

Media conglomerates, by controlling vast networks of information dissemination, influence which stories are told, retold, or forgotten. Social media platforms, through their algorithms and content moderation policies, significantly influence collective memory by determining which narratives remain visible and which fade into obscurity. As a result, collective memory becomes fragmented, influenced as much by corporate interests and technological algorithms as by government policies.

Ethical Concerns and the Struggle for Truth

The ethical implications of reshaping societal memories are vast. While some argue that reshaping collective memory is necessary for social progress, particularly when it comes to rectifying historical injustices or fostering reconciliation, others view it as a dangerous form of manipulation that can obscure truth and stifle dissent.

This tension reflects a broader debate about the nature of memory and history itself. Is there an objective “true” version of history, or is all history inherently subject to reinterpretation as societal values and perspectives evolve? This ongoing tension between interpretation and truth underscores the need for a careful approach to shaping collective memory, one that remains open, inclusive, and truthful rather than driven solely by those in power.

Conclusion

Restructuring societal memories is a powerful tool that governments and large organizations can use to influence culture, politics, and identity. The methods they use, whether through education, media, censorship, or public symbols, can have profound impacts on how societies understand their past and imagine their future. While some reshaping of collective memory is inevitable, it is essential to approach this process with caution, prioritizing the public interest over the narrow objectives of the powerful. With the rise of digital platforms and globalized media, the struggle for control over collective memory is more relevant than ever, raising important ethical questions about who gets to shape the stories we live by.

Further Reading

  • Items: Insights from the Social Sciences.
  • The Oxford Handbook of Contextual Political Analysis, https://academic.oup.com/edited-volume/34357

A Bleak Christmas

Surviving the Winter at Valley Forge

Christmas at Valley Forge in 1777 was a somber affair for the Continental Army. On December 19, weary soldiers arrived at the encampment after a string of defeats and the loss of Philadelphia to British forces. They faced immediate challenges: inadequate shelter, scarce provisions, disease, and the onset of a harsh winter. Although the construction of over 1,500 log huts provided some relief, many troops lacked proper clothing and shoes, enduring bitter cold with little protection.

The army’s religious diversity shaped Christmas observances. Denominations like Episcopalians and Lutherans celebrated the holiday, while others, including Quakers and Presbyterians, did not. As a result, any Christmas observances were likely subdued and personal.

Amid the hardships, General George Washington sought to alleviate suffering. On Christmas Eve, he ordered each regiment to draw provisions to complete rations for the following day. Despite these efforts, Christmas morning brought little relief. Many soldiers faced the day with only “firecakes”—a meager mixture of flour and water—as their meal. The harsh conditions compelled them to spend the day building and repairing huts, collecting firewood, and foraging for food. Others dug defensive works or endured rotating guard duty through the bitter night.

While Continental soldiers struggled at Valley Forge, British forces in Philadelphia enjoyed relative comfort. British troops were quartered in colonial homes, staying warm and well-fed. Some local farmers secretly sold provisions to the British, drawn by payments in gold or silver.

Despite the immense suffering, the winter at Valley Forge marked a turning point for the Continental Army. The arrival of Baron Friedrich Wilhelm von Steuben in February 1778 brought much-needed training and discipline, transforming the army into a more effective fighting force.

In summary, Christmas at Valley Forge was a time of hardship, sacrifice, and reflection for the Continental Army. The bitter experiences of that winter tested their resolve but also laid the groundwork for their ultimate success in the fight for independence.

Here are the sources referenced for the discussion on Christmas at Valley Forge:

  1. National Park Service – Valley Forge History and Significance
    Link: nps.gov
  2. National Park Service – Christmas at Valley Forge
    Link: nps.gov
  3. Mount Vernon – George Washington at Christmas
    Link: mountvernon.org
  4. History.com – Valley Forge
    Link: history.com

From Fact to Folklore: The Evolution of Thanksgiving Traditions

Thanksgiving has become one of the most cherished holidays in the United States, steeped in tradition, gratitude, and shared meals. Its origins are often traced back to the Pilgrims’ harvest celebration in 1621, yet much of what we “know” about that event has been shaped by legend. The historical facts surrounding the first Thanksgiving differ significantly from the modern narrative, which has evolved into a romanticized story of harmony and feasting. Let’s explore what history tells us about that pivotal celebration, examining the number of attendees, the types of food served, the length of the event, and the subsequent creation of the Thanksgiving legend.

The First Thanksgiving

In the autumn of 1621, after a successful harvest, the Pilgrims at Plymouth Colony held a three-day celebration that is often considered the “first Thanksgiving.” This event marked a period of gratitude and alliance-building between the Pilgrims and the Wampanoag people, who were critical to the settlers’ survival during their first year in the New World.

Who Attended?

Approximately 90 Wampanoag men, led by Chief Massasoit, joined 50 surviving Pilgrims for the event. The Pilgrims had arrived aboard the Mayflower the previous year, with 102 passengers. However, disease, harsh conditions, and starvation during the brutal winter of 1620-1621 had decimated their numbers. By the time of the harvest feast, only about half of the original settlers remained. Among the Pilgrims, there were 22 men, 4 married women, and about 25 children and teenagers.

The Wampanoag, who had been instrumental in teaching the Pilgrims essential survival skills, were invited as honored guests.

The Menu

The food served at the 1621 gathering was vastly different from today’s traditional Thanksgiving meal.  The feast was primarily prepared by the four surviving adult Pilgrim women: Eleanor Billington, Elizabeth Hopkins, Mary Brewster, and Susanna White. They were assisted by their daughters and four household servants.

While there are no definitive records of the exact dishes, historical accounts and the resources available to the settlers provide clues:

  • Meat and Game: The primary protein source was likely wildfowl, such as ducks, geese, and possibly turkey. Deer (venison) brought by the Wampanoag was also a centerpiece.
  • Seafood: The Pilgrims relied heavily on the ocean for sustenance, so fish, clams, mussels, and possibly lobster may have been included.
  • Grains and Vegetables: Corn was a staple, though it was likely prepared as a simple porridge or bread, not the sweetened dishes we know today. Other vegetables like squash, beans, onions, and native wild plants such as Jerusalem artichokes were likely served.
  • Fruits and Nuts: Wild berries, cranberries (unsweetened), and nuts like walnuts and chestnuts may have been part of the feast.
  • Beverages: The Pilgrims likely drank water or weak beer, as clean drinking water was not always available.

Absent from the feast were many items central to a contemporary Thanksgiving, such as mashed potatoes, pumpkin pie, and sweetened cranberry sauce. Potatoes and sugar were not readily available, and ovens for baking were primitive at best.

The Celebration’s Length

The first Thanksgiving was not a single meal but rather a three-day event. The Pilgrims and Wampanoag likely engaged in feasting, games, and possibly ceremonial activities. For the Pilgrims, it was a religious occasion, giving thanks to God for their survival and harvest. For the Wampanoag, such feasts were part of their cultural traditions, celebrating seasonal abundance and community.

Myth vs History

As Thanksgiving became a national holiday, myths about the first celebration began to overshadow historical facts. Much of the modern narrative can be traced back to the 19th century, when the holiday was popularized and romanticized.

The Romanticized Myth

The traditional narrative depicts Pilgrims and Native Americans sharing a harmonious meal, much like today’s Thanksgiving dinner. This portrayal emphasizes mutual goodwill and cultural exchange, but it simplifies a far more complex reality.

This narrative began to take shape in the mid-19th century when writer Sarah Josepha Hale, editor of Godey’s Lady’s Book, campaigned to make Thanksgiving a national holiday. In 1863, during the Civil War, President Abraham Lincoln declared Thanksgiving a national holiday, emphasizing unity and gratitude. Hale’s writings, along with paintings and school textbooks, reinforced the idyllic imagery of Pilgrims and Native Americans dining together peacefully.

The Historical Complexities

While there was cooperation and mutual benefit between the Pilgrims and Wampanoag during the early years of Plymouth Colony, the relationship was far more complex than the legend suggests. The Wampanoag helped the settlers survive, teaching them to fish and to grow corn in the unfamiliar landscape. However, this alliance was forged out of necessity. The Wampanoag were seeking allies against rival tribes, and the Pilgrims needed help to avoid starvation.

Furthermore, the long-term relationship between European settlers and Native Americans was marked by conflict, displacement, and violence. By the late 17th century, tensions had escalated into King Philip’s War (1675-1678), one of the bloodiest conflicts in colonial American history, leading to the near-destruction of the Wampanoag people. These later events cast a shadow over the harmony celebrated in Thanksgiving lore.

Thanksgiving’s Evolution Over Time

As the centuries passed, the story of this harvest feast evolved into something far removed from its origins.

The mythologizing of Thanksgiving served a broader cultural purpose. During the 19th century, the holiday was framed as a uniquely American tradition, emphasizing family, gratitude, and unity at a time when the nation was deeply divided.

In modern times, Thanksgiving has become a secular holiday centered on food, family, and football, often disconnected from its historical roots. While many still reflect on gratitude, the original religious significance observed by the Pilgrims has largely faded. Similarly, the role of Native Americans in the holiday’s origins is often reduced to a simplistic narrative, overshadowing the complex history of their interactions with settlers.

Reclaiming the Story

In recent years, schools and communities have been actively reshaping the Thanksgiving narrative to present a more accurate and inclusive account of its history. This shift aims to acknowledge the complexities of the holiday’s origins and the experiences of Indigenous peoples.

Efforts have been made to present a more nuanced understanding of Thanksgiving. For example, Native American communities use Thanksgiving as a time for remembrance, marking it as a “National Day of Mourning” to honor ancestors and reflect on the impact of colonization. Educators and historians strive to balance the narrative, acknowledging both the cooperation and conflict between Pilgrims and Native Americans.

Understanding the historical first Thanksgiving as a multi-day harvest celebration shared by two very different cultures can enrich our appreciation of the holiday. By recognizing the complexities of the Pilgrims’ survival and the Wampanoag’s contributions, we can honor the real history while still finding meaning in Thanksgiving as a time for gratitude and reflection.

Conclusion

The first Thanksgiving of 1621 was a far cry from the turkey-laden feasts of today. It was a modest harvest celebration involving around 140 people, featuring wild game, seafood, and native vegetables. The three-day event was as much about survival and diplomacy as it was about gratitude.

Over centuries, this historical gathering has transformed into a powerful national myth that emphasizes unity and abundance. While the legend simplifies and sanitizes a more complex reality, it also reflects the evolving cultural values of the United States. By understanding the truth behind the Thanksgiving story, we can celebrate the holiday with a deeper sense of history, recognizing both its origins and its modern meaning.

Thanksgiving remains a day to give thanks, share food, and connect with loved ones—but it also offers an opportunity to reflect on the broader history it represents.


Sources:

Primary Accounts of the First Thanksgiving:

  • Bradford, William. Of Plymouth Plantation. Original accounts describing the Pilgrims’ settlement and their harvest celebration in 1621.
  • Winslow, Edward. Mourt’s Relation. An early Pilgrim document providing descriptions of their experiences.

Attendees of the First Thanksgiving:

  • Pilgrim Hall Museum. “What Happened in 1621?”

The Menu of the 1621 Feast:

  • History Channel. “What Was on the Menu at the First Thanksgiving?”

The Role of Women and Servants:

  • New England Historical Society. “The Women Who Cooked the First Thanksgiving.”
  • Wikipedia. “List of Mayflower Passengers.”

Evolution of the Thanksgiving Legend:

  • Smithsonian Magazine. “The Thanksgiving Myth and What We Should Be Teaching Kids.”

Complex Relationships Between Pilgrims and Wampanoag:

  • Smithsonian Magazine. “The History Behind the Thanksgiving Holiday.”

War and Medicine

The Evolution of the Army Medical Corps

The history of military medicine in the United States during the 18th and 19th centuries is essentially the history of the Army Medical Corps. It is no surprise that the Army Medical Corps played a significant role in advances in battlefield medicine. What many people do not appreciate, however, is that it also played a significant role in the treatment of infectious diseases and in improvements to general sanitation. For example, one of the first public health inoculation efforts was ordered by General George Washington to protect the Continental Army against smallpox. Walter Reed led an Army Medical Corps team that proved yellow fever was transmitted by mosquitoes. The Army Medical Corps developed the first effective typhoid vaccine during the Spanish-American War, and in World War II it led research to develop anti-malarial drugs.

Revolutionary War and the Founding of the Army Medical Corps

The formal beginnings of military medical organization in the United States trace back to 1775, with the establishment of a Medical Department for the Continental Army. On July 27, 1775, the Continental Congress created the Army Medical Service to care for wounded soldiers. Dr. Benjamin Church was appointed as the first “Director General and Chief Physician” of the Medical Service, equivalent to today’s Surgeon General. However, Church’s tenure was brief and marred by scandal: he was exposed as a British spy, passing secrets to the enemy.

Church’s arrest in 1775 created a leadership vacuum, and the fledgling medical service had to reorganize quickly under Dr. John Morgan, who became the second Director General. Morgan sought to professionalize the medical corps, emphasizing proper record-keeping and standards of care. However, the Revolutionary War medical system struggled with limited resources, inadequate supplies, poor funding, and an overworked staff. The lack of an effective supply chain for medicine, bandages, and surgical instruments was a significant issue throughout the conflict.

Early Challenges in Battlefield Medicine

During the Revolutionary War, military medical practices were rudimentary. Medical knowledge and the understanding of disease processes had advanced little since the days of ancient Greece. Medical training was inconsistent, conducted principally through apprenticeship. In 1775 there were only two small medical schools in all of the 13 colonies, and one of them closed with the onset of the Revolution.

Field surgeons primarily treated gunshot wounds, fractures, and infections. Most treatments were painful and often involved amputation, as this was one of the few ways to prevent infections from spreading in an era without antibiotics. Battlefield medicine was further hampered by the fact that surgeons often had to work without proper sanitation or anesthesia.

One of the most significant health challenges faced by the Continental Army was disease, including smallpox, typhoid, dysentery, and typhus. In fact, more soldiers died from disease than from combat injuries. Recognizing the threat of smallpox, General George Washington made the controversial but strategic decision in 1777 to inoculate his troops against smallpox, significantly reducing mortality and helping to preserve the fighting force. At Valley Forge, almost half of the Continental troops were unfit for duty due to scabies infestation, and approximately 1,700 to 2,000 soldiers died of complications from typhoid and diarrhea.

It’s estimated that there were approximately 25,000 deaths among American soldiers, both Continental and militia, in the American Revolution. An estimated 7,000 died from battlefield wounds; an additional 17,000 to 18,000 died from disease and infection. This loss of soldiers to non-combat deaths has been one of the greatest challenges faced by the Army Medical Corps through much of its history.

Post-Revolution: Developing a Medical Framework (1783-1812)

After the Revolutionary War, the United States Army Medical Department went through a period of instability. There were ongoing debates about the structure and necessity of a standing army and medical service in peacetime. However, the need for an organized military medical service became apparent during the War of 1812. The war underscored the importance of medical organization, especially in terms of logistics and transportation of the wounded.

The Army Medical Department grew, and by 1818, the government established the position of Surgeon General. Joseph Lovell became the first to officially hold the title of Surgeon General of the United States Army. Lovell introduced improvements to record-keeping and hospital management and laid the groundwork for future medical advances, though the department remained small and under-resourced.

Advancements in Military Medicine: The Mexican-American War (1846-1848)

The Mexican-American War provided an opportunity for the Army Medical Corps to refine its practices. Field hospitals were more structured, and new surgical techniques were tested. However, disease continued to be a significant challenge; yellow fever and dysentery plagued American troops. The war also underscored the importance of sanitation in camps, though knowledge about disease transmission was still limited.

The aftermath of the Mexican-American War saw the construction of permanent military hospitals and better organization of medical personnel, setting the stage for the much larger and more complex demands of the Civil War.

Civil War: The Birth of Modern Battlefield Medicine (1861-1865)

The Civil War represented a turning point in military medicine, with significant advances in both battlefield care and medical logistics. By the start of the war, the Army Medical Corps was better organized than during previous conflicts, though it still faced many challenges. Jonathan Letterman, the Medical Director of the Army of the Potomac, revolutionized battlefield medicine by creating the Letterman System, which included:

  1. Field Dressing Stations: Located near the front lines to provide immediate care.
  2. Ambulance System: Trained ambulance drivers transported wounded soldiers from the battlefield to hospitals.
  3. Field Hospitals and General Hospitals: These provided surgical care and longer-term treatment.

The Civil War saw the introduction of anesthesia (chloroform and ether), which reduced the suffering of wounded soldiers and made more complex surgeries possible. However, infection remained a major problem, as antiseptic techniques were not yet widely practiced and the germ theory of disease was poorly understood. Surgeons worked in unsanitary conditions, often reusing instruments without sterilization and frequently doing little more than rinsing the blood off of their hands between patients.

Sanitation and Public Health Measures

One of the most critical lessons of the Civil War was the importance of camp sanitation and disease prevention. Dr. William Hammond, appointed Surgeon General in 1862, emphasized the need for hygiene and camp inspections. Under his leadership, new regulations improved the quality of food and water supplies. Though disease still claimed many lives, these efforts marked the beginning of a more systematic approach to military public health.

Additionally, the United States Sanitary Commission (USSC) was established in 1861. This civilian organization was created to support the Union Army by promoting sanitary practices and improving medical care for soldiers. Its objectives included improving camp sanitation, providing medical supplies, promoting hygiene and preventive care, supporting wounded soldiers, and advocating for soldiers’ welfare.

Hammond also promoted the use of the Army Medical Museum to collect specimens and study diseases, fostering a more scientific approach to military medicine. Though he faced resistance from some military leaders, his reforms laid the foundation for modern military medical practices.

Conclusion

The evolution of the Army Medical Corps from the Revolutionary War to the Civil War reflects a gradual shift from rudimentary care to more organized, systematic medical practices. Early efforts were hindered by leadership issues, such as the betrayal by Benjamin Church, and by the challenges of disease and limited resources. However, over the decades, the Army Medical Department improved its structure, introduced innovations like inoculation and anesthesia, and laid the groundwork for advances in battlefield care. The Civil War, in particular, was pivotal in transforming military medicine, with lessons in logistics, sanitation, and surgical care that would shape the future of military and civilian medical systems.

For further reading, the following sources provide excellent insights:

  • Office of Medical History – U.S. Army
  • “Gangrene and Glory: Medical Care during the American Civil War” by Frank R. Freemon

