The Grumpy Doc

Grumpy opinions about everything.

What Is Fascism Anyway?

Fascist! The very word conjures up images of totalitarianism, militarism, suppression of dissent and brutality. Unfortunately, it’s become a ubiquitous part of our political discourse. Each side, at one time or another, has accused the other of being fascist. But what do they really mean by fascist? Do they understand the definition and the reality of fascism? Or do they simply mean: “I disagree with you, and I really want to make you sound evil.”

I decided I needed to know more about fascism, so I’ve done some research, and I’d like to share the results with you. As I frequently do, I’ll start with the dictionary definition. According to Merriam-Webster, fascism is “a political philosophy, movement, or regime that exalts nation and often race above the individual and that stands for a centralized autocratic government headed by a dictatorial leader, severe economic and social regimentation, and forcible suppression of opposition.”

As with many dictionary definitions, it gives us the 50,000-foot view without any real detail. What I’d like to do is cover the origins of fascism, its basic principles and how it rose to prominence in the middle of the 20th century. I also want to compare fascism to communism—another ideology that shaped much of the 20th century—and to provide insights into the differences and similarities between these two systems.

The Origins of Fascism

Fascism emerged in the early 20th century, primarily in Italy, as a reaction to the perceived failures of liberal democracy and socialism. The term itself comes from the Italian word “fascio,” meaning a bundle or group, symbolizing unity and collective strength. It also references fasces, a bundle of rods tied around an ax symbolizing authority in the Roman Republic.  It was appropriated as a symbol by Italian fascists in an attempt to identify with Roman history, much as American patriotic symbols are being appropriated by the radical right in the U.S. today.

Benito Mussolini, an Italian political leader, is often credited as the founder of fascism. He established the groundwork for the first fascist regime in Italy beginning in 1922, after he was appointed Prime Minister. Fascism arose in a period of social and economic turmoil following the First World War. Many people in Europe were disillusioned with the existing political systems, which they believed had failed to prevent the war and its devastating consequences. The post-war economic instability, along with fears of communist revolutions like the one in Russia, provided fertile ground for the rise of fascist movements.

Mussolini, together with Italian philosopher Giovanni Gentile, published “The Doctrine of Fascism” (La Dottrina del Fascismo) in 1932, after he had consolidated political power in his hands. It lays out the guiding principles and theoretical foundations of fascism, stressing nationalism, anti-communism, the glorification of the state, the belief in a strong centralized leadership, and the rejection of liberal democracy.

The Philosophical Basis of Fascism

Fascism is rooted in several key philosophical ideas:

  • Nationalism and Militarism: Fascism places the nation or race at the center of its ideology, often elevating it to a quasi-religious status. The state is seen as a living entity that must be protected and expanded through internal police action and external military strength.
  • Authoritarianism: Fascists reject democratic institutions, believing that a strong, centralized authority is necessary to maintain order and achieve national greatness. Individual freedoms are subordinated to the needs of the state.
  • Anti-Communism and Anti-Liberalism: Fascism is explicitly opposed to both communism and liberal democracy. It views communism as a threat to national unity and social order, while liberal democracy is seen as weak and indecisive.
  • Social Darwinism: Fascists often believe in the idea of the survival of the fittest, applying this concept to nations and races. They argue that conflict and struggle are natural and necessary for the advancement of the state.

Implementation and Practice of Fascism

Fascism has been implemented in various forms, with Italy under Mussolini and Nazi Germany under Adolf Hitler being the most prominent examples. In practice, fascist regimes are characterized by:

  • Centralized Power: Fascist governments concentrate power in the hands of a single leader or party, often through the use of propaganda, censorship, political repression, and mass imprisonment and execution of opponents.
  • State Control of the Economy: While fascists generally allow for private ownership, they maintain strict control over the economy, directing resources toward the state’s goals, particularly militarization.
  • Suppression of Dissent: Fascist regimes are intolerant of opposition, often using violence, imprisonment, and even assassination to eliminate political rivals and suppress dissent.
  • Cult of Personality: Fascist leaders often create a cult of personality, presenting themselves as the embodiment of the nation and its destiny.

Comparing Fascism and Communism

While both fascism and communism reject liberal democracy, they differ significantly in their goals and methods.

  • Philosophical Differences:
    • Fascism: As mentioned earlier, fascism emphasizes nationalism, authoritarianism, and social hierarchy. It seeks to create a strong, unified state that can compete with other nations on the global stage.
    • Communism: Communism, based on the ideas of Karl Marx, advocates for a classless society where the means of production are owned collectively. It seeks to eliminate private property and achieve equality among all citizens.
  • Economic Systems:
    • Fascism: Fascists allow for private ownership but maintain state control over key industries and direct economic activity to serve the state’s interests.
    • Communism: Communism advocates for the abolition of private property, with all means of production owned and controlled by the state (or the people in theory). The economy is centrally planned and managed.
  • Political Structures:
    • Fascism: Fascist regimes are typically one-party states with a strong leader at the top. Political pluralism is non-existent, and the government exercises strict control over all aspects of life.
    • Communism: Communist states are also typically one-party systems, but they claim to represent the working class. In practice, these regimes often become highly centralized and authoritarian or totalitarian, similar to fascist states.

Comparative Examples

  • Italy and Nazi Germany (Fascism): Both Mussolini’s Italy and Hitler’s Germany exemplify fascist regimes. They were characterized by aggressive nationalism, military expansionism, and the suppression of political opposition. Hitler’s regime, however, took these ideas to their most extreme and horrifying conclusion with the Holocaust, a genocide driven by racist ideology.
  • Soviet Union (Communism): The Soviet Union under Joseph Stalin provides a clear example of a totalitarian communist state. The government abolished private property, collectivized agriculture, and implemented central planning. Political repression was severe, with millions of people imprisoned, starved to death or executed during Stalin’s purges.  It is important to recognize that Stalinist communism differed significantly from the theoretical communism of Karl Marx.

Conclusion

Fascism and communism, despite their profound differences, share certain similarities in practice, particularly in their authoritarianism and intolerance of dissent. However, their philosophical foundations and goals are fundamentally different: fascism seeks to elevate the nation above all else, while communism theoretically aims to create a classless society. Understanding these ideologies and their historical manifestations is crucial for anyone interested in the political history of the 20th century and its lasting impact on the world today. 

We can use our understanding of fascism, and how it contrasts with democracy, to ask important questions. What kind of government do we want? Are there any possible crossovers or compromises between the two? And, importantly, should there be?

Postscript

Many of the ideas in this post were inspired by two excellent books on the subject, “The Origins of Totalitarianism” by Hannah Arendt and “Fascism: A Warning” by Madeleine Albright.

Doctors of the Deep Blue Sea

A Brief History of the U.S. Navy Medical Corps

The U.S. Navy Medical Corps has evolved from humble beginnings during the Revolutionary War to its current role as a vital component of modern military medicine. The Medical Corps ensures the health and well-being of sailors, Marines, and their families, while contributing to public health and advancements in medical science.

Origins in the Revolutionary War

The roots of Navy medicine trace back to the Revolutionary War, when medical care aboard ships was primitive at best. Shipboard surgeons, often lacking formal medical training, treated injuries and disease with the limited tools and knowledge available to them. In the early days of the U.S. Navy, physicians served without formal commissions, often receiving temporary appointments for specific cruises.  Their primary tasks included amputations, treating infections, and caring for diseases like scurvy and dysentery.

In 1798, Congress formally established the Department of the Navy, creating the foundation for organized medical care within the naval service.  Surgeon Edward Cutbush published the first American text on naval medicine in 1808. The Naval Hospital Act of 1811 marked another milestone, authorizing the construction of naval hospitals to support the growing fleet.

Establishment of the Navy Medical Corps (1871)

The U.S. Navy Medical Corps was officially established on March 3, 1871, by an act of Congress. This legislation created a formal medical staff to support the Navy, setting standards for the recruitment and training of naval physicians. These physicians were initially known as “Surgeons” and “Assistant Surgeons,” tasked with providing care on ships and at naval hospitals. The act granted Navy physicians rank relative to their line counterparts, acknowledged their role as a staff corps, and established the title of “Surgeon General” for the Navy’s senior medical officer.

During this period, the Navy Medical Corps began to expand its scope. It embraced emerging medical technologies and scientific discoveries, setting the stage for its later contributions to public health and medical innovation.

The Navy Hospital Corps

The U.S. Navy Hospital Corps was established on June 17, 1898. Its creation was prompted by the increased medical needs during the Spanish-American War. Since then, the enlisted corpsmen have served in every conflict involving the United States, providing critical medical care on battlefields, aboard ships, and in hospitals worldwide.

Corpsmen are trained to perform a wide range of medical tasks, including emergency battlefield triage and treatment, surgery assistance, and disease prevention. They are often embedded directly with Marine Corps units, making them indispensable on the battlefield.

The Hospital Corps is the most decorated group in the U.S. Navy. To date, its members have earned numerous high-level awards for valor, including: 22 Medals of Honor, 182 Navy Crosses, 946 Silver Stars, and 1,582 Bronze Stars.

World Wars and the Expansion of Military Medicine

Both World War I and World War II were transformative for the Navy Medical Corps. During World War I, Navy medical personnel treated injuries and illnesses both aboard ships and in field hospitals. Their efforts were instrumental in managing wartime epidemics, including the devastating 1918 influenza pandemic.

World War II brought further advancements. The Navy Medical Corps played a pivotal role in addressing the challenges of warfare in diverse climates, including tropical diseases in the Pacific Theater. It also pioneered methods for treating trauma, burns, and psychiatric conditions.

Cold War Era and Modernization

The Cold War era marked a time of significant innovation for the Navy Medical Corps. The establishment of the Navy Medical Research Institutes advanced studies in areas such as tropical medicine, submarine medicine, and aerospace medicine. These efforts supported the Navy’s global missions and contributed to broader medical advancements.

In the latter half of the 20th century, Navy medical personnel became key players in humanitarian missions, responding to natural disasters and providing aid in conflict zones. Their expertise in public health, infectious disease control, and trauma care enhanced the Navy’s ability to spread goodwill worldwide.

Modern Contributions and Future Challenges

Today, the Navy Medical Corps supports both military readiness and global health. Its personnel provide care aboard ships, submarines, and aircraft carriers, with Marine Corps forces, and at shore-based facilities. They also participate in humanitarian missions and disaster response, reflecting the Navy’s commitment to a broader vision of security and well-being.

In recent years, Navy medicine has faced challenges such as the COVID-19 pandemic, mental health issues among service members, and emerging threats like climate change and cyber warfare. These challenges underscore the evolving role of the Navy Medical Corps in a complex world.

From its early days of rudimentary care to its modern role in global health and innovation, the U.S. Navy Medical Corps has been a cornerstone of military medicine. Its contributions extend beyond the battlefield, shaping public health, medical research, and humanitarian efforts worldwide.

As the Navy Medical Corps continues to adapt to new challenges, it remains a testament to the enduring value of medical service in the defense of the nation and the promotion of global health.

Choosing Not to Know

Why We Avoid Truths That Make Us Uncomfortable

One afternoon during the COVID lockdown I was scrolling through online news sites looking for something to read. I realized I was intentionally bypassing sites I knew I would disagree with. This surprised me because I have always been a proponent of critical thinking. Here I was practicing its antithesis—willful ignorance—intentionally avoiding evidence that contradicts my beliefs or preferences.

This behavior may seem irrational, yet it persists across all aspects of life, from personal relationships to religious beliefs to political ideologies. Understanding why we cling to falsehoods, what value we derive from this behavior, and how we can counter it is essential for fostering open-mindedness and informed decision-making.

We often assume that willful ignorance is something that affects “them”—the people with whom we disagree. But anyone can fall victim to willful ignorance, even you and me.

When we encounter evidence that contradicts our beliefs, we experience cognitive dissonance—a state of mental discomfort caused by holding two conflicting ideas simultaneously. To resolve this discomfort, we often reject new evidence rather than altering our existing worldview.

We tend to seek out and interpret information in ways that confirm our pre-existing beliefs while ignoring or dismissing evidence to the contrary. This confirmation bias reinforces our opinions and shields us from uncomfortable truths.

Our beliefs are often tied to our social identity. Having our beliefs challenged can feel like an attack on our sense of self or on our group affiliations. Maintaining allegiance to a shared belief—whether religious, political, or cultural—can feel more important than factual accuracy.

Contradictory evidence can create fear and uncertainty, especially if it undermines our understanding of the world. Clinging to familiar falsehoods can provide us a sense of security and predictability.

We invest time, energy, and emotions into our beliefs. Admitting we were wrong may feel like a personal failure or a waste of effort, making it easier to reject new information than to reconsider long-held positions.

Despite its drawbacks, willful ignorance offers psychological and social benefits that make it appealing.  Ignoring uncomfortable truths can protect us against guilt, shame, or fear, while providing a sense of inner peace and emotional comfort.  We may attempt to maintain our sense of self and group identification by avoiding information that threatens our worldview. Engaging with complex or contradictory information requires mental effort. Ignoring it simplifies decision-making, reducing cognitive load.  Aligning with a group’s shared beliefs—regardless of their accuracy—fosters social cohesion and acceptance.

While anyone can fall into willful ignorance, certain factors may make some groups more prone to it.  Studies show that individuals across the political spectrum exhibit willful ignorance, though the issues they ignore vary. For example, conservatives may deny climate change, while progressives may overlook the economic costs of policies they favor.  Groups that emphasize doctrinal adherence may be more resistant to evidence that challenges theological teachings.  Older adults may resist evidence that challenges long-held beliefs. However, younger individuals can also exhibit willful ignorance, particularly in social media echo chambers.

We are more likely to reconsider our beliefs in an environment where we feel we have been heard and understood rather than attacked and ridiculed. Constructive dialogue, rather than confrontation, opens the door to change.  Facts alone often fail to persuade. Framing evidence within emotionally resonant stories can make it more effective.  Presenting new information in small, digestible portions helps reduce cognitive dissonance and makes new ideas less threatening.  We are more likely to accept information from sources we trust, particularly those who share our cultural or ideological background.

Convincing someone that their beliefs are counterproductive requires tact and patience.  But, before trying to change others, we must first examine our own beliefs to ensure we are not guilty of the same behavior.  Self-examination is the first step in addressing willful ignorance.

Willful ignorance thrives in environments of fear, division, and mistrust. Countering it requires empathy, compassion, and truth. If we engage with others in a spirit of understanding rather than confrontation, we have a better chance of bridging divides and creating meaningful change.

The journey is challenging, but the rewards—for both individuals and society—will be worth the effort.

Don’t Forget Climate Change

It Affects Us All

Climate change, one of the most critical challenges facing humanity in the 21st century, seems to be forgotten in all the controversy surrounding DOGE. Regardless of everything else going on, we can’t ignore climate change because it affects global temperatures, weather patterns, ecosystems, and economies. The overwhelming scientific consensus is that human activities—primarily the burning of fossil fuels—are driving climate change.

The science of climate change and the impact of human activity, like any other field of science, includes areas of disagreement among researchers. One of the principal areas of disagreement is the sensitivity of the climate to increases in CO2 and the rate at which global warming will occur. There is also discussion about how reliable climate models are, with some arguing that the models may either overestimate or underestimate certain effects. A significant area of disagreement concerns what are known as “tipping points”: a debate about when or if certain events, such as ice sheet collapse, permafrost thaw, or ocean circulation changes, might occur. Some argue these events could trigger rapid, self-reinforcing climate shifts, while others believe changes will be more gradual. Even with this disagreement, there is broad acceptance that climate change has increased the frequency and intensity of heat waves, heavy rain, and extreme weather.

As intense as some of these scientific debates may be, they pale in significance beside the political debates being generated around climate change.

When the possibility of climate change was first recognized in the 1970s and 1980s there was bipartisan support to address possible remediation of long-term impacts. Republican President Richard Nixon signed landmark environmental laws including the Clean Air Act.

During the 1990s the climate change debate became more polarized. President George H. W. Bush began to frame climate change policy as an economic threat. George W. Bush rejected the Kyoto Protocol to avoid “economic hindrance.”

By 2008 the partisan divide had significantly increased. Republicans increasingly dismissed climate risks while Democrats amplified the urgency of taking action. By 2023, 78% of Democrats prioritized climate policy, but only 21% of Republicans viewed climate action as urgent, despite increasing climate risks in some GOP-dominated states such as Florida and Texas.

The partisan gap expanded as conservative science skeptics continued to raise issues about rates of change, economic impacts, and potential solutions. These conservatives tend to view climate policies as government overreach, while progressives hold the position that government-led initiatives are essential to combat environmental threats.

As they have on many other issues, the media have lined up into conservative and progressive camps. Conservative-leaning media downplay climate risks while liberal-leaning media emphasize the danger and the need for urgent action. As with many other things, this leads to an “echo chamber” effect, simply reinforcing political beliefs without adding anything new of significance to the debate.

The Trump administration has signaled its desire to undo many of the climate change initiatives put in place by Democratic administrations. On January 20, 2025, President Trump signed Executive Order 14162 directing the immediate withdrawal of the United States from the Paris Climate Agreement and related international climate commitments. He has declared a “National Energy Emergency” to accelerate fossil fuel development and ease restrictions on the construction of new oil and gas projects. As part of this effort, he has weakened environmental reviews. This is expected to significantly increase fossil fuel consumption and associated greenhouse gas emissions. The Trump administration has begun the rollback of environmental regulations. Lobbyists for the oil, gas and chemical industries have been appointed to the Environmental Protection Agency to reverse climate regulations and pollution controls.

The administration is withdrawing funding for clean energy initiatives including those aimed at reducing carbon emissions and promoting renewable energy resources. The administration has initiated a review of the “legality and continued applicability” of the EPA’s endangerment finding which is the basis of most federal regulations on greenhouse gas.  The administration rolled back regulations limiting methane emissions from oil and gas operations. The definition of “waters of the United States” under the Clean Water Act was narrowed, potentially allowing increased pollution in streams and wetlands.

We can expect increases in severe weather because of Trump’s environmental policies. These policy decisions collectively hinder efforts to mitigate climate change, potentially leading to increased greenhouse gas emissions and global warming. Reductions in funding for climate change research and the rollback of environmental regulations will have long-term adverse effects on both domestic and global environmental health.

Significant budget cuts and layoffs within agencies like the National Oceanic and Atmospheric Administration (NOAA) could impair the ability to forecast and respond to severe weather events. For instance, the reduction of meteorologists and environmental scientists may hinder critical forecasting services, affecting public safety during events like hurricanes, tornadoes and floods.

The U.S. withdrawal from international climate initiatives, such as the Loss and Damage Fund, reduces financial support for developing countries dealing with climate-induced disasters. This could lead to inadequate infrastructure and preparedness in vulnerable regions, potentially increasing the severity of weather-related impacts.

While it is challenging to attribute specific future weather events directly to current policy changes, the administration’s environmental policies will likely contribute to conditions that favor more frequent and intense extreme weather events. The combination of increased greenhouse gas emissions, weakened environmental regulations, reduced climate research capabilities, and diminished global climate cooperation collectively increases the likelihood and impact of severe weather phenomena. This damage to our environment needs to be prevented! Once it occurs, it will be difficult ever to reverse, and our children and grandchildren will suffer as a result.

More Than Just Glasses – Eye Health for Adults

Most of us don’t consider getting an eye exam until we think we need new glasses or maybe when we think we need glasses for the first time. But that’s not the only reason we should be visiting the eye doctor. For adults, maintaining eye health becomes increasingly important as we get older. Vision changes are a natural part of aging, and many serious eye conditions can be managed or even prevented with regular care. Conditions such as cataracts, glaucoma, macular degeneration, and diabetic retinopathy can be discovered during routine exams. Additionally, routine exams can detect rarer conditions, such as ocular cancers, that may not be symptomatic initially but can lead to vision loss and can even be fatal.

Timely diagnosis and treatment of eye diseases are crucial to preserving sight and overall quality of life.  Your eye exam is about far more than just a new pair of glasses.

This issue will cover the major eye diseases affecting adults: their symptoms, available treatments, and the complications of late diagnosis.


Cataracts

A cataract is a clouding of the eye’s natural lens, leading to blurry or diminished vision. Cataracts are one of the most common causes of vision loss in older adults.

Symptoms:

  • Blurred or cloudy vision
  • Difficulty seeing at night
  • Sensitivity to light and glare
  • Seeing halos around lights
  • Fading or yellowing of colors
  • Double vision in one eye

Treatment:

In the early stages, stronger lighting and prescription glasses may help. However, the only definitive treatment is cataract surgery, where the cloudy lens is replaced with an artificial intraocular lens (IOL). Cataract surgery is one of the safest and most effective procedures available.

Complications of Late Diagnosis:

Delaying treatment can lead to significant vision impairment, increasing the risk of falls, depression, and loss of independence. In advanced cases, cataracts can cause complete blindness.


Glaucoma

Glaucoma is a group of diseases that damage the optic nerve, often due to high intraocular pressure. Open-angle glaucoma is the most common form. It typically develops slowly without noticeable symptoms. Angle-closure glaucoma appears more suddenly and generally involves severe eye pain.  Glaucoma is a leading cause of blindness worldwide and often develops without noticeable symptoms until significant vision loss occurs.

Symptoms:

  • Gradual loss of peripheral vision (in open-angle glaucoma)
  • Sudden, severe eye pain (in angle-closure glaucoma)
  • Blurred vision
  • Halos around lights
  • Nausea and vomiting (in acute cases)

Treatment:

Glaucoma cannot be cured, but it can be managed with:

  • Prescription eye drops to reduce intraocular pressure
  • Laser therapy to improve fluid drainage
  • Surgery in severe cases

Complications of Late Diagnosis:

Glaucoma-related vision loss is irreversible. Without timely intervention, glaucoma can lead to tunnel vision and complete blindness.


Age-Related Macular Degeneration (AMD)

Macular degeneration, or age-related macular degeneration (AMD), primarily affects the macula, the central part of the retina responsible for sharp, central vision. There are two main forms of AMD: dry (non-neovascular) and wet (neovascular). Dry AMD is more common and progresses slowly, while wet AMD is less common but more severe and leads to rapid vision loss.

Symptoms:

  • Blurred or distorted central vision
  • Difficulty reading or recognizing faces
  • Straight lines appearing wavy
  • Need for brighter light when reading
  • Dark or empty areas in the center of vision

Treatment:

There is no cure for AMD, but treatment options include:

  • Injections to slow the progression of wet AMD
  • Laser therapy in some cases
  • Lifestyle changes, including a diet rich in leafy greens, omega-3 fatty acids, and antioxidant supplements

Complications of Late Diagnosis:

Without early treatment, AMD can progress to severe vision loss, making everyday activities like reading and driving difficult.


Diabetic Retinopathy

This condition occurs in people with diabetes when high blood sugar damages the blood vessels in the retina. In early stages it is not symptomatic, but it can lead to blindness if untreated.

Symptoms:

  • Floaters or dark spots in vision
  • Blurry vision
  • Difficulty seeing colors
  • Vision loss in advanced cases

Treatment:

  • Better blood sugar control to slow progression
  • Injections to prevent spread
  • Laser treatment to seal leaking blood vessels
  • Surgery for severe cases

Complications of Late Diagnosis:

Delaying treatment can result in retinal detachment, complete vision loss, and an increased risk of other eye diseases.


Cancers of the Eye

Although rare, cancers such as ocular melanoma can develop in the eye. Eye cancers are a diverse group of malignancies that can affect different parts of the eye and its surrounding structures. They can originate within the eye or spread to the eye from other parts of the body. They can be aggressive and vision-threatening, requiring prompt diagnosis and treatment.

Symptoms vary depending on the type and location of the cancer and can include many of the same symptoms as other eye diseases. Prognosis and treatment depend on the type of cancer and stage at the time of diagnosis. Treatment can include surgery, radiation therapy, laser therapy, chemotherapy and targeted immunotherapy. Early diagnosis is critical.

The Importance of Regular Eye Examinations

The American Academy of Ophthalmology recommends that adults over 65 have a comprehensive eye exam at least once a year, even if they have no noticeable vision problems. Those with conditions like diabetes, glaucoma, or AMD may need more frequent exams.

Maintaining Good Eye Health

  • Eat a Vision-Friendly Diet: Foods rich in vitamins A, C, and E, zinc, and omega-3 fatty acids help protect eyesight.
  • Control Chronic Conditions: Managing diabetes and high blood pressure reduces the risk of eye complications.
  • Protect Your Eyes: Wear sunglasses with UV protection and blue light filtering when using digital screens.
  • Quit Smoking: Smoking significantly increases the risk of AMD, cataracts, and other eye diseases.
  • Stay Active: Regular exercise improves circulation and overall eye health.
  • Use Proper Lighting: Ensure good lighting at home to prevent strain and falls.
  • Follow Medication Instructions: Use prescribed eye drops and medications consistently to manage conditions like glaucoma.

Prioritizing Eye Health for a Better Quality of Life

Vision loss can significantly impact independence, mobility, and mental well-being. The key to maintaining good eye health is early detection and timely treatment. By scheduling regular eye exams and adopting healthy habits, you can preserve your vision and enjoy a higher quality of life.

If you haven’t had an eye exam in the past year, now is the time to schedule one. It’s about more than just a new pair of glasses. Protecting your eyesight today can ensure a clearer, brighter tomorrow.

Don’t Cut and Run on Ukraine

Like many Americans, my wife and I were both embarrassed and disgusted by the Oval Office ambush of Ukrainian President Volodymyr Zelenskyy by Donald Trump and JD Vance. We were so upset by this disgraceful treatment of the visiting president of a sovereign nation that we followed the lead of a friend and immediately ordered “I Stand With Ukraine” T-shirts.

The Oval Office meeting, held on February 28, 2025, was ostensibly intended to finalize a mineral rights agreement between the United States and Ukraine. The deal was seen as a strategic move to reduce US dependence on Chinese rare earth minerals and to support Ukraine’s economy amidst its ongoing conflict with Russia.

In what appeared to be a planned attack, Vice President Vance berated President Zelenskyy, making false claims of ingratitude on the part of Ukraine. President Trump quickly escalated the situation by criticizing Zelenskyy’s approach to the war and asserting that Ukraine was “gambling with World War III.” He then demanded that Zelenskyy admit that Ukraine was responsible for both initiating and prolonging the war and that he could end it at any time by making a deal.

If there is any doubt that this was a planned and likely scripted meeting on the part of the Trump administration, you only have to look at Donald Trump’s closing statement: “I think we’ve seen enough. This is going to be great television.”

The fallout from this event has significant implications for international diplomacy and the ongoing conflict in Eastern Europe. The suspension of U.S. military aid to Ukraine following the meeting has raised concerns about Ukraine’s ability to defend itself against Russian advances. Ukrainian officials expressed disappointment but remained defiant, with one military official stating, “we will fight with or without their help.”

President Trump has labeled Zelenskyy a dictator who is unwilling to negotiate peace. He claims that Ukraine initiated hostilities against the Russian-speaking population, requiring Russia to intervene. These claims have long since been debunked, yet Donald Trump continues to repeat them. It has been interesting this past week to watch Trump nominees try to avoid saying whether they believed Russia had invaded Ukraine. They evaded questions by saying they didn’t have all the facts, or that it wasn’t appropriate for them to respond, when obviously they did not want to lie under oath and claim that Russia had not invaded Ukraine.

Russian officials and state media reacted with approval to the Oval Office clash.  China, Syria, North Korea and Iran also supported the Trump administration’s approach. 

The French President and the British Prime Minister both reaffirmed their commitment to Ukrainian sovereignty and condemned the manner in which the meeting was conducted.

Decide with whom you prefer to have the United States aligned: our long-standing allies and other democratic governments, or autocrats and dictators.

We invite you to join us and proudly proclaim “I STAND WITH UKRAINE.”

Superheroes of the American Revolution

The American Revolution was fought not just by great leaders but by ordinary men and women who sacrificed everything for the promise of liberty. These “superheroes” of the Revolution—everyday soldiers of the Continental Army—endured unimaginable hardships and proved their resilience and commitment to a cause greater than themselves.

Who Were the Soldiers?
The typical soldier in the Continental Army was a young, able-bodied man in his late teens or twenties. However, recruits ranged widely in age, from boys as young as 16 to older men in their 40s or 50s. They came from all walks of life, reflecting the agrarian and small-town character of colonial America.

Work Background
Most soldiers were farmers or farm laborers, the backbone of the colonial economy. Others worked as apprentices or tradesmen, honing skills in blacksmithing, carpentry, and shoemaking. In coastal regions, fishermen and sailors also joined the ranks, bringing valuable maritime experience. Whatever their occupation, enlistment often meant leaving behind grueling but steady work, placing enormous burdens on their families and communities.

Education
Formal education was limited for most enlisted men. Literacy rates in colonial America, though higher than in Europe, were modest. Many soldiers could read and write only minimally, though these skills were sufficient for reading orders or sending letters home. Officers were generally better educated, often hailing from wealthier families with access to classical training and instruction in leadership and military strategy.

Family Life
Family ties were integral to the soldiers’ lives. Most were unmarried young men, but some older recruits left wives and children behind. Married soldiers relied on their families to manage farms and households in their absence, with women stepping into traditionally male roles to keep homes running. Communities often influenced enlistment decisions, with entire groups of men from the same town joining together, fostering camaraderie and mutual responsibility.

Why They Fought
Motivations for joining the Continental Army varied:
⦁ Patriotism: Many believed passionately in independence and the ideals of liberty and self-governance.
⦁ Economic Opportunity: For poorer colonists, enlistment offered steady (albeit delayed) pay and the promise of land grants after the war.
⦁ Community Expectations: Peer pressure and local leaders often spurred enlistments.
⦁ Adventure: For some young men, the army offered a chance for excitement and novelty.

Life in the Continental Army
Soldiers in the Continental Army faced extraordinary challenges that tested their endurance, commitment, and morale.
Logistical Struggles
The army constantly grappled with a lack of basic supplies:
⦁ Food: Soldiers often endured long periods of hunger, relying on inconsistent local contributions and sometimes going days without eating.
⦁ Clothing: Many lacked proper uniforms, footwear, and blankets, suffering in harsh weather; some even died from exposure.
⦁ Ammunition: Weapons and ammunition were scarce, forcing soldiers to scavenge from battlefields.

At Valley Forge in the winter of 1777–1778, these shortages reached a critical point, with thousands suffering from frostbite, near starvation, and exposure.
Extreme weather compounded the soldiers’ difficulties. Winter encampments like Valley Forge were marked by freezing temperatures, snow, and overcrowded, unsanitary conditions that led to outbreaks of smallpox, typhus, and dysentery.
Soldiers marched long distances with heavy packs, often on empty stomachs and in worn-out shoes. The physical strain was enormous, and separation from families added emotional stress. Many struggled to adapt to military life, which was vastly different from their previous experiences as farmers or tradesmen.

Financial Hardships
The fledgling American government struggled to fund the war:
⦁ Soldiers were rarely paid on time, leading to frustration and occasional mutinies.
⦁ Promised wages were often months or years late, making it difficult for soldiers to support their families.

Inconsistent Leadership and Training
Early in the war, the army lacked professional training and experienced leadership. While General George Washington provided steadfast guidance, many officers were political appointees with little military expertise. This began to change when Baron von Steuben arrived at Valley Forge, introducing systematic training and discipline.

Psychological Strain
The Revolutionary War dragged on for eight years, leaving soldiers to question whether their sacrifices would lead to victory. Early defeats against the better-equipped British Army demoralized many, and desertion rates were high. Still, the shared belief in the cause of liberty and the support of local communities kept many soldiers in the fight.

The Role of Communities
The army’s survival depended on civilian support. Local farmers, tradesmen, and women provided food, clothing, and moral encouragement. Civilians risked their lives to aid soldiers, and the collective belief in independence buoyed spirits even in the darkest times.

Conclusion
The common soldiers of the Continental Army were true superheroes of the American Revolution. Despite enduring hunger, cold, disease, and financial instability, they fought with unwavering determination. Their sacrifices laid the foundation for a new nation, proving that the quest for freedom often requires immense personal and collective sacrifice.

Sources:
⦁ Robert Middlekauff, The Glorious Cause: The American Revolution, 1763–1789
⦁ Caroline Cox, A Proper Sense of Honor: Service and Sacrifice in George Washington’s Army
⦁ Library of Congress: American Revolution resources

Oppression in Politics: Totalitarian and Authoritarian Systems

Since January 20th, there has been extensive use of the terms authoritarian and totalitarian to refer to the actions of the current administration. While totalitarian and authoritarian are often used interchangeably, they represent similar but distinct forms of governance with critical differences. If we’re going to hold rational discussions about these systems, we should be using the same terminology.

A totalitarian government seeks to control every aspect of public and private life, including political, economic, social, and cultural domains. The government uses a specific ideology to unify and dominate society. The government strives to regulate all aspects of life, leaving no room for personal freedoms or independent thought.  A guiding ideology is central, often enforced by propaganda, indoctrination, and censorship.  The government frequently relies on widespread surveillance, police state tactics, and brutal suppression of dissent.  All institutions, media, education, economy, and religion are state-controlled.

Examples include Nazi Germany, unified under an ideology of racial purity, and Stalin’s Soviet Union, ostensibly organized under a Marxist ideology. Both governments maintained control of their populations through propaganda, brutal police actions, terror, and murder.

An authoritarian government is characterized by strong central power with limited political freedoms, but it does not seek to control all aspects of life. Unlike totalitarian regimes, authoritarian states often allow some degree of personal freedom in areas like culture, business, or religion, as long as these do not challenge political authority. Typically, these regimes are pragmatic and focused on maintaining power, not enforcing an all-encompassing ideology. They are more likely to be organized around the personality of the dictatorial leader. While repression is common, it is often less pervasive and targeted primarily at political opponents.

Franco’s Spain had limited political freedoms but allowed religious and cultural autonomy.  Putin’s Russia allows limited economic freedom for members of the Russian oligarchy.

The main distinction lies in the scope of control.  Totalitarian regimes seek to control all aspects of life and demand ideological conformity.  Authoritarian regimes primarily focus on political power and allow some personal autonomy as long as it does not threaten the regime.

In summary, all totalitarian governments are authoritarian, but not all authoritarian governments are totalitarian.

Waiting For The Reichstag Fire

On the evening of February 27th, 1933 the German Reichstag burst into flames. This attack on the German national parliament building was viewed by many as an attack on Germany itself.

A Dutchman named Marinus van der Lubbe was found and arrested at the scene almost immediately after the fire erupted. Although van der Lubbe confessed to setting the fire alone, the Nazis quickly claimed it was part of a widespread communist conspiracy and used this claim to push for emergency powers. Many believe the Nazis may have set the fire themselves as a pretext to declare emergency rule.

Adolf Hitler persuaded German President Paul von Hindenburg to issue the “Decree for the Protection of the People and the State,” which suspended civil liberties, including freedom of speech, press, and assembly. It allowed for the arrest and detention of political opponents without due process. Thousands of communists and socialists were arrested.

Within a month new elections were held. While the Nazis did not win an outright majority, they used the fire to create fear that led to passage of the “Enabling Act” on March 23, 1933. The act gave Hitler dictatorial powers, effectively ending democracy in Germany.

The Reichstag Fire was a crucial turning point in world history. Whether it was a Nazi-engineered false flag operation or the act of a lone arsonist, it provided Hitler with the excuse he needed to dismantle democracy and establish a totalitarian dictatorship. It is a chilling example of how fear and propaganda can be weaponized to erase freedom, a lesson that remains relevant today.

Telehealth: Revolutionizing Healthcare

Or Is It Simply a Band-Aid?

When I first started hearing about telemedicine in the 1990s, I was dubious at best. How could I treat a patient I couldn’t examine? Too many things ran through my mind. I couldn’t listen to their heart, I couldn’t listen to them breathe, I couldn’t even look in their throat or their ears. What if I needed an EKG? How could I check their blood pressure? I was worried that telemedicine might be “second-rate medicine.”

I was worried about misdiagnosis and overprescribing antibiotics. If you couldn’t actually examine a patient, you might decide to play it safe and prescribe an antibiotic whether it was really needed or not. It might also result in people being sent to the emergency room who could have been treated as outpatients if you had been able to examine them in person.

As I looked into it, I discovered that the idea of telemedicine was not really new. As early as 1879, the British medical journal The Lancet discussed the possibility of using the telephone, then a revolutionary new technology, to reduce unnecessary doctors’ visits. It took the advent of the computer age and audio-video technology to make telemedicine a real possibility. But even then, I was still skeptical. I preferred to see my patients in person and did not get involved in telemedicine until the great societal upheaval of COVID.

I happened to retire from the emergency department three months before COVID hit. I was still doing primary care two days a week for an employee clinic. Like everyone else, we were shut down.

Reluctantly, we decided the only way to provide a service to our patients was to start using telehealth. Of course, we had none of the audio-video equipment we needed, so we initially did it by telephone. That just confirmed most of my worries about providing poor care. We soon acquired the audio-video capabilities, which gave us a little more insight into the patients we were dealing with. Over the next few months, I learned who was and was not a good candidate for telemedicine and how I could best care for patients I could not physically examine. I’m going to share with you some of the things I’ve learned over the past four years. Thankfully, telehealth is now the exception rather than the rule it was early in COVID. But it’s here to stay, and we need to learn how to make it work.

Advantages of Telehealth

Convenience and Accessibility: Telehealth’s most immediate and tangible benefit is convenience. With the simple click of a button, patients can consult a physician from the comfort of their home. This is particularly helpful for those living in rural areas or those who are physically unable to travel to a clinic or hospital. According to a study by the American Medical Association, telehealth has increased access to care for patients who otherwise might not be able to receive it, whether due to geographical limitations, lack of transportation, or mobility issues.

For working professionals or parents who find it difficult to carve out time for in-person visits, telehealth allows consultations to occur from anywhere, drastically reducing travel time and missed work or family obligations. Patients also benefit from shorter wait times, as virtual queues tend to move more quickly than physical ones.

 Cost Efficiency:  Telehealth services can be more cost-effective for both patients and healthcare providers. For patients, the expenses associated with travel, parking, and time away from work are minimized. Healthcare providers, particularly in large hospital networks, can allocate resources more efficiently by integrating telemedicine into their workflow. Many telehealth services also offer more affordable consultation fees compared to in-office visits. A report from the National Bureau of Economic Research found that telemedicine visits are often less expensive for both insurers and healthcare systems.

Continuity of Care:  Telehealth allows for more frequent follow-ups, which is critical for managing chronic diseases such as diabetes, hypertension, and asthma. Instead of requiring patients to come to the clinic for every minor adjustment or medication change, telehealth allows for regular check-ins from home. This facilitates better long-term disease management and patient compliance. It can also enable quick intervention in cases where a patient’s symptoms escalate, potentially reducing the likelihood of emergency room visits.

Disadvantages of Telehealth

Limited Physical Examination:  The inability to perform a comprehensive physical examination is a significant limitation of telehealth. While many aspects of healthcare can be effectively managed through conversation, video, and shared data, some conditions require a hands-on exam. For example, a doctor might not be able to detect subtle signs of a skin condition, a heart murmur, or abdominal tenderness through a video screen. This limitation can hinder accurate diagnoses and delay proper treatment.

Privacy and Data Security:  Healthcare data is among the most sensitive forms of personal information. The shift to telehealth introduces significant concerns about data security, especially given the increase in cyberattacks on healthcare systems. The Health Insurance Portability and Accountability Act (HIPAA) mandates strict guidelines for protecting patient privacy, but not all telehealth platforms may be fully compliant. In some cases, platforms may use third-party applications that could compromise patient information. The risk of hacking, data breaches, or improper data handling adds another layer of complexity to the telehealth debate.

Connectivity Issues: High-speed internet is a luxury that is still not available in many rural and underserved areas. Telehealth relies heavily on stable and fast internet connections to facilitate real-time communication between patient and provider. In regions where broadband access is limited, telehealth appointments can be riddled with delays, interruptions, or complete disconnections. This not only disrupts the flow of the consultation but can also compromise the quality of care provided.

Lack of Universal Standards: Unlike in-person healthcare, where the processes are well-established and regulated, telehealth practices can vary significantly between providers and systems. The lack of universal standards for telehealth can lead to inconsistencies in the quality of care. Some platforms might not integrate well with electronic health records (EHRs), making it difficult for physicians to access a complete patient history during the virtual consultation. Platforms may not function seamlessly across different devices (e.g., Android vs. iOS) or different browsers. Technical support may not always be readily available to address these issues, leading to delays in care or missed appointments.

Medical Problems Not Appropriate for Telehealth

While telehealth has proven to be effective for certain conditions, it is not a one-size-fits-all solution. There are specific medical problems that necessitate an in-person visit, where a physical examination and specialized equipment are crucial.

 Acute Injuries and Trauma:  Telehealth is not suitable for diagnosing or treating acute injuries such as fractures, deep cuts, burns, or other types of trauma. These conditions require immediate hands-on evaluation, imaging (e.g., X-rays or CT scans), and possibly surgical intervention. A telehealth consultation cannot provide the necessary tools to address these problems adequately, and any delays in care could worsen the patient’s condition.

Cardiovascular Emergencies: Conditions such as chest pain, heart attack symptoms, or strokes demand immediate in-person evaluation. The time-sensitive nature of these issues means that telehealth would not be appropriate for diagnosis or treatment. Patients experiencing these symptoms require rapid testing, monitoring, and possibly life-saving interventions that cannot be performed remotely.

Neurological Symptoms: Patients presenting with acute neurological symptoms such as sudden onset of weakness, slurred speech, confusion, or seizure activity require immediate in-person evaluation. These symptoms could indicate a stroke, transient ischemic attack (TIA), or another serious neurological condition that cannot be diagnosed or managed through a telehealth appointment.

Surgical Consultations: While telehealth can be a valuable tool for follow-up appointments post-surgery, the initial evaluation for surgical candidates should take place in person. Surgeons often rely on physical examinations and imaging results to determine whether surgery is necessary and to plan the procedure effectively.

Striking a Balance

Telehealth has transformed healthcare in a multitude of ways, providing unprecedented access to care for millions of patients. Its convenience, cost efficiency, and ability to promote continuity of care make it a powerful tool in the modern healthcare landscape. However, the limitations of telehealth, especially in cases requiring hands-on care or in emergencies, cannot be ignored. As healthcare systems continue to integrate telehealth into routine practice, it is essential to strike a balance between virtual and in-person care to ensure that all patients receive the level of medical attention they need. For now, I believe telehealth should be viewed as a complement to, rather than a replacement for, traditional healthcare.
