Grumpy opinions about everything.

Author: John Turley

Doctors of the Deep Blue Sea

A Brief History of the U.S. Navy Medical Corps

The U.S. Navy Medical Corps has evolved from humble beginnings during the Revolutionary War to its current role as a vital component of modern military medicine. The Medical Corps ensures the health and well-being of sailors, Marines, and their families, while contributing to public health and advancements in medical science.

Origins in the Revolutionary War

The roots of Navy medicine trace back to the Revolutionary War, when medical care aboard ships was primitive at best. Shipboard surgeons, often lacking formal medical training, treated injuries and disease with the limited tools and knowledge available to them. In the early days of the U.S. Navy, physicians served without formal commissions, often receiving temporary appointments for specific cruises.  Their primary tasks included amputations, treating infections, and caring for diseases like scurvy and dysentery.

In 1798, Congress formally established the Department of the Navy, creating the foundation for organized medical care within the naval service.  Surgeon Edward Cutbush published the first American text on naval medicine in 1808. The Naval Hospital Act of 1811 marked another milestone, authorizing the construction of naval hospitals to support the growing fleet.

Establishment of the Navy Medical Corps (1871)

The U.S. Navy Medical Corps was officially established on March 3, 1871, by an act of Congress. This legislation created a formal medical staff to support the Navy, setting standards for recruiting and training naval physicians. These physicians were initially known as “Surgeons” and “Assistant Surgeons,” tasked with providing care on ships and at naval hospitals.  The act granted Navy physicians rank relative to their line counterparts, acknowledged their role as a staff corps, and established the title of “Surgeon General” for the Navy’s senior medical officer.

During this period, the Navy Medical Corps began to expand its scope. It embraced emerging medical technologies and scientific discoveries, setting the stage for its later contributions to public health and medical innovation.

The Navy Hospital Corps

The U.S. Navy Hospital Corps was established on June 17, 1898. Its creation was prompted by the increased medical needs during the Spanish-American War. Since then, the enlisted corpsmen have served in every conflict involving the United States, providing critical medical care on battlefields, aboard ships, and in hospitals worldwide.

Corpsmen are trained to perform a wide range of medical tasks, including emergency battlefield triage and treatment, surgery assistance, and disease prevention. They are often embedded directly with Marine Corps units, making them indispensable on the battlefield.

The Hospital Corps is the most decorated group in the U.S. Navy. To date, its members have earned numerous high-level awards for valor, including: 22 Medals of Honor, 182 Navy Crosses, 946 Silver Stars, and 1,582 Bronze Stars.

World Wars and the Expansion of Military Medicine

Both World War I and World War II were transformative for the Navy Medical Corps. During World War I, Navy medical personnel treated injuries and illnesses both aboard ships and in field hospitals. Their efforts were instrumental in managing wartime epidemics, including the devastating 1918 influenza pandemic.

World War II brought further advancements. The Navy Medical Corps played a pivotal role in addressing the challenges of warfare in diverse climates, including tropical diseases in the Pacific Theater. It also pioneered methods for treating trauma, burns, and psychiatric conditions.

Cold War Era and Modernization

The Cold War era marked a time of significant innovation for the Navy Medical Corps. The establishment of the Navy Medical Research Institutes advanced studies in areas such as tropical medicine, submarine medicine, and aerospace medicine. These efforts supported the Navy’s global missions and contributed to broader medical advancements.

In the latter half of the 20th century, Navy medical personnel became key players in humanitarian missions, responding to natural disasters and providing aid in conflict zones. Their expertise in public health, infectious disease control, and trauma care enhanced the Navy’s ability to spread goodwill worldwide.

Modern Contributions and Future Challenges

Today, the Navy Medical Corps supports both military readiness and global health. Its personnel provide care aboard ships, submarines, and aircraft carriers, with Marine Corps forces, and at shore-based facilities. They also participate in humanitarian missions and disaster response, reflecting the Navy’s commitment to a broader vision of security and well-being.

In recent years, Navy medicine has faced challenges such as the COVID-19 pandemic, mental health issues among service members, and emerging threats like climate change and cyber warfare. These challenges underscore the evolving role of the Navy Medical Corps in a complex world.

From its early days of rudimentary care to its modern role in global health and innovation, the U.S. Navy Medical Corps has been a cornerstone of military medicine. Its contributions extend beyond the battlefield, shaping public health, medical research, and humanitarian efforts worldwide.

As the Navy Medical Corps continues to adapt to new challenges, it remains a testament to the enduring value of medical service in the defense of the nation and the promotion of global health.

Choosing Not to Know

Why We Avoid Truths That Make Us Uncomfortable

One afternoon during the COVID lockdown I was scrolling through online news sites looking for something to read. I realized I was intentionally bypassing sites I knew I would disagree with. This surprised me because I have always been a proponent of critical thinking. Here I was practicing its antithesis—willful ignorance—intentionally avoiding evidence that contradicts my beliefs or preferences.

This behavior may seem irrational, yet it persists across all aspects of life, from personal relationships to religious beliefs to political ideologies. Understanding why we cling to falsehoods, what value we derive from this behavior, and how we can counter it is essential for fostering open-mindedness and informed decision-making.

We often assume that willful ignorance is something that affects “them”—the people with whom we disagree. But anyone can fall victim to it, even you and me.

 When we encounter evidence that contradicts our beliefs, we experience cognitive dissonance—a state of mental discomfort caused by holding two conflicting ideas simultaneously. To resolve this discomfort, we often reject new evidence rather than altering our existing worldview.

We tend to seek out and interpret information in ways that confirm our pre-existing beliefs while ignoring or dismissing evidence to the contrary. This confirmation bias reinforces our opinions and shields us from uncomfortable truths.

 Our beliefs are often tied to our social identity. Having our beliefs challenged can feel like an attack on our sense of self or on our group affiliations. Maintaining allegiance to a shared belief—whether religious, political, or cultural—can feel more important than factual accuracy.

Contradictory evidence can create fear and uncertainty, especially if it undermines our understanding of the world. Clinging to familiar falsehoods can provide us a sense of security and predictability.

We invest time, energy, and emotions into our beliefs. Admitting we were wrong may feel like a personal failure or a waste of effort, making it easier to reject new information than to reconsider long-held positions.

Despite its drawbacks, willful ignorance offers psychological and social benefits that make it appealing.  Ignoring uncomfortable truths can protect us against guilt, shame, or fear, while providing a sense of inner peace and emotional comfort.  We may attempt to maintain our sense of self and group identification by avoiding information that threatens our worldview. Engaging with complex or contradictory information requires mental effort. Ignoring it simplifies decision-making, reducing cognitive load.  Aligning with a group’s shared beliefs—regardless of their accuracy—fosters social cohesion and acceptance.

While anyone can fall into willful ignorance, certain factors may make some groups more prone to it.  Studies show that individuals across the political spectrum exhibit willful ignorance, though the issues they ignore vary. For example, conservatives may deny climate change, while progressives may overlook the economic costs of policies they favor.  Groups that emphasize doctrinal adherence may be more resistant to evidence that challenges theological teachings.  Older adults may resist evidence that challenges long-held beliefs. However, younger individuals can also exhibit willful ignorance, particularly in social media echo chambers.

We are more likely to reconsider our beliefs in an environment where we feel we have been heard and understood rather than attacked and ridiculed. Constructive dialogue, rather than confrontation, opens the door to change.  Facts alone often fail to persuade. Framing evidence within emotionally resonant stories can make it more effective.  Presenting new information in small, digestible portions helps reduce cognitive dissonance and makes new ideas less threatening.  We are more likely to accept information from sources we trust, particularly those who share our cultural or ideological background.

Convincing someone that their beliefs are counterproductive requires tact and patience.  But, before trying to change others, we must first examine our own beliefs to ensure we are not guilty of the same behavior.  Self-examination is the first step in addressing willful ignorance.

Willful ignorance thrives in environments of fear, division, and mistrust. Countering it requires empathy, compassion, and truth. If we engage with others in a spirit of understanding rather than confrontation, we have a better chance of bridging divides and creating meaningful change.

The journey is challenging, but the rewards—for both individuals and society—will be worth the effort.

Don’t Forget Climate Change

It Affects Us All

Climate change, one of the most critical challenges facing humanity in the 21st century, seems to be forgotten in all the controversy surrounding DOGE. Regardless of everything else going on, we can’t ignore climate change because it affects global temperatures, weather patterns, ecosystems, and economies. The overwhelming scientific consensus is that human activities—primarily the burning of fossil fuels—are driving climate change.

The existence of climate change and the impact of human activity, like any other field of science, include areas of disagreement among researchers. One of the principal disputes concerns the sensitivity of the climate to increases in CO2 and the rate at which global warming will occur. There is also debate about how accurate climate models are, with some arguing that the models may either overestimate or underestimate certain effects. A significant area of disagreement is over what are known as “tipping points”: when, or if, events such as ice sheet collapse, permafrost thaw, or ocean circulation changes might occur. Some argue these events could trigger rapid, self-reinforcing climate shifts, while others believe changes will be more gradual. Even with this disagreement, there is broad acceptance that climate change has increased the frequency and intensity of heat waves, heavy rain, and extreme weather.

As intense as some of these scientific debates may be, they pale in significance beside the political debates being generated around climate change.

When the possibility of climate change was first recognized in the 1970s and 1980s, there was bipartisan support for addressing possible remediation of long-term impacts. Republican President Richard Nixon signed landmark environmental laws, including the Clean Air Act.

During the 1990s the climate change debate became more polarized. President George H. W. Bush began to frame climate change policy as an economic threat. George W. Bush rejected the Kyoto Protocol to avoid “economic hindrance.”

By 2008 the partisan divide had significantly increased. Republicans increasingly dismissed climate risks while Democrats amplified the urgency of taking action. By 2023, 78% of Democrats prioritized climate policy, but only 21% of Republicans viewed climate action as urgent, despite increasing climate risks in GOP-dominated states such as Florida and Texas.

The partisan gap expanded as conservative science skeptics continued to raise issues about rates of change, economic impacts, and potential solutions. These conservatives tend to view climate policies as government overreach, while progressives hold the position that government-led initiatives are essential to combat environmental threats.

As they have on many other issues, the media have lined up into conservative and progressive camps. Conservative-leaning media downplay climate risks while liberal-leaning media emphasize the danger and the need for urgent action. As with many other things, this leads to an “echo chamber” effect, simply reinforcing political beliefs without adding anything new of significance to the debate.

The Trump administration has signaled its desire to undo many of the climate change initiatives put in place by Democratic administrations. On January 20, 2025, President Trump signed Executive Order 14162 directing the immediate withdrawal of the United States from the Paris Climate Agreement and related international climate commitments. He has declared a “National Energy Emergency” to accelerate fossil fuel development and ease restrictions on the construction of new oil and gas projects. As part of this effort, he has weakened environmental reviews. This is expected to significantly increase fossil fuel consumption and associated greenhouse gas emissions. The Trump administration has begun the rollback of environmental regulations. Lobbyists for the oil, gas, and chemical industries have been appointed to the Environmental Protection Agency to reverse climate regulations and pollution controls.

The administration is withdrawing funding for clean energy initiatives including those aimed at reducing carbon emissions and promoting renewable energy resources. The administration has initiated a review of the “legality and continued applicability” of the EPA’s endangerment finding which is the basis of most federal regulations on greenhouse gas.  The administration rolled back regulations limiting methane emissions from oil and gas operations. The definition of “waters of the United States” under the Clean Water Act was narrowed, potentially allowing increased pollution in streams and wetlands.

We can expect increases in severe weather because of Trump’s environmental policies. These policy decisions collectively hinder efforts to mitigate climate change, potentially leading to increased greenhouse gas emissions and global warming. Reductions in funding for climate change research and the rollback of environmental regulations will have long-term adverse effects on both domestic and global environmental health.

Significant budget cuts and layoffs within agencies like the National Oceanic and Atmospheric Administration (NOAA) could impair the ability to forecast and respond to severe weather events. For instance, the reduction of meteorologists and environmental scientists may hinder critical forecasting services, affecting public safety during events like hurricanes, tornadoes, and floods.

The U.S. withdrawal from international climate initiatives, such as the Loss and Damage Fund, reduces financial support for developing countries dealing with climate-induced disasters. This could lead to inadequate infrastructure and preparedness in vulnerable regions, potentially increasing the severity of weather-related impacts.

While it is challenging to attribute specific future weather events directly to current policy changes, the administration’s environmental policies will likely contribute to conditions that favor more frequent and intense extreme weather events. Increased greenhouse gas emissions, combined with weakened environmental regulations, reduced climate research capabilities, and diminished global climate cooperation, collectively increase the likelihood and impact of severe weather phenomena. This damage to our environment needs to be prevented!  Once it occurs it will be difficult to reverse, and our children and grandchildren will suffer as a result.

More Than Just Glasses – Eye Health for Adults

Most of us don’t consider getting an eye exam until we think we need new glasses, or glasses for the first time. But that’s not the only reason we should be visiting the eye doctor.  For adults, maintaining eye health becomes increasingly important as we get older. Vision changes are a natural part of aging, and many serious eye conditions can be managed or even prevented with regular care. Conditions such as cataracts, glaucoma, macular degeneration, and diabetic retinopathy can be discovered during routine exams. Rarer conditions can be detected as well, such as ocular cancers, which may cause no symptoms initially but can lead to vision loss and can even be fatal.

Timely diagnosis and treatment of eye diseases are crucial to preserving sight and overall quality of life.  Your eye exam is about far more than just a new pair of glasses.

This issue will cover major eye diseases affecting adults, the symptoms, available treatments, and complications of late diagnoses.


Cataracts

A cataract is a clouding of the eye’s natural lens, leading to blurry or diminished vision. Cataracts are one of the most common causes of vision loss in older adults.

Symptoms:

  • Blurred or cloudy vision
  • Difficulty seeing at night
  • Sensitivity to light and glare
  • Seeing halos around lights
  • Fading or yellowing of colors
  • Double vision in one eye

Treatment:

In the early stages, stronger lighting and prescription glasses may help. However, the only definitive treatment is cataract surgery, where the cloudy lens is replaced with an artificial intraocular lens (IOL). Cataract surgery is one of the safest and most effective procedures available.

Complications of Late Diagnosis:

Delaying treatment can lead to significant vision impairment, increasing the risk of falls, depression, and loss of independence. In advanced cases, cataracts can cause complete blindness.


Glaucoma

Glaucoma is a group of diseases that damage the optic nerve, often due to high intraocular pressure. Open-angle glaucoma is the most common form. It typically develops slowly without noticeable symptoms. Angle-closure glaucoma appears more suddenly and generally involves severe eye pain.  Glaucoma is a leading cause of blindness worldwide and often develops without noticeable symptoms until significant vision loss occurs.

Symptoms:

  • Gradual loss of peripheral vision (in open-angle glaucoma)
  • Sudden, severe eye pain (in angle-closure glaucoma)
  • Blurred vision
  • Halos around lights
  • Nausea and vomiting (in acute cases)

Treatment:

Glaucoma cannot be cured, but it can be managed with:

  • Prescription eye drops to reduce intraocular pressure
  • Laser therapy to improve fluid drainage
  • Surgery in severe cases

Complications of Late Diagnosis:

Glaucoma-related vision loss is irreversible. Without timely intervention, glaucoma can lead to tunnel vision and complete blindness.


Age-Related Macular Degeneration (AMD)

Macular degeneration, or age-related macular degeneration (AMD), primarily affects the macula, the central part of the retina responsible for sharp, central vision. There are two main forms of AMD: dry (non-neovascular) and wet (neovascular). Dry AMD is more common and progresses slowly, while wet AMD is less common but more severe and leads to rapid vision loss.

Symptoms:

  • Blurred or distorted central vision
  • Difficulty reading or recognizing faces
  • Straight lines appearing wavy
  • Need for brighter light when reading
  • Dark or empty areas in the center of vision

Treatment:

There is no cure for AMD, but treatment options include:

  • Injections to slow the progression of wet AMD
  • Laser therapy in some cases
  • Lifestyle changes, including a diet rich in leafy greens, omega-3 fatty acids, and antioxidant supplements

Complications of Late Diagnosis:

Without early treatment, AMD can progress to severe vision loss, making everyday activities like reading and driving difficult.


Diabetic Retinopathy

This condition occurs in people with diabetes when high blood sugar damages the blood vessels in the retina. In its early stages it causes no symptoms, but it can lead to blindness if untreated.

Symptoms:

  • Floaters or dark spots in vision
  • Blurry vision
  • Difficulty seeing colors
  • Vision loss in advanced cases

Treatment:

  • Better blood sugar control to slow progression
  • Injections to prevent spread
  • Laser treatment to seal leaking blood vessels
  • Surgery for severe cases

Complications of Late Diagnosis:

Delaying treatment can result in retinal detachment, complete vision loss, and an increased risk of other eye diseases.


Cancers of the Eye

Although rare, cancers such as ocular melanoma can develop in the eye. These are a diverse group of malignancies that can affect different parts of the eye and its surrounding structures. They can originate within the eye or can spread to the eye from other parts of the body. They can be aggressive and vision-threatening, requiring prompt diagnosis and treatment.

Symptoms vary depending on the type and location of the cancer and can include many of the same symptoms as other eye diseases. Prognosis and treatment depend on the type of cancer and stage at the time of diagnosis. Treatment can include surgery, radiation therapy, laser therapy, chemotherapy and targeted immunotherapy. Early diagnosis is critical.

The Importance of Regular Eye Examinations

The American Academy of Ophthalmology recommends that adults over 65 have a comprehensive eye exam at least once a year, even if they have no noticeable vision problems. Those with conditions like diabetes, glaucoma, or AMD may need more frequent exams.

Maintaining Good Eye Health

  • Eat a Vision-Friendly Diet: Foods rich in vitamin A, C, E, zinc, and omega-3 fatty acids help protect eyesight.
  • Control Chronic Conditions: Managing diabetes and high blood pressure reduces the risk of eye complications.
  • Protect Your Eyes: Wear sunglasses with UV protection and blue light filtering when using digital screens.
  • Quit Smoking: Smoking significantly increases the risk of AMD, cataracts, and other eye diseases.
  • Stay Active: Regular exercise improves circulation and overall eye health.
  • Use Proper Lighting: Ensure good lighting at home to prevent strain and falls.
  • Follow Medication Instructions: Use prescribed eye drops and medications consistently to manage conditions like glaucoma.

Prioritizing Eye Health for a Better Quality of Life

Vision loss can significantly impact independence, mobility, and mental well-being. The key to maintaining good eye health is early detection and timely treatment. By scheduling regular eye exams and adopting healthy habits, you can preserve your vision and enjoy a higher quality of life.

If you haven’t had an eye exam in the past year, now is the time to schedule one. It’s about more than just a new pair of glasses. Protecting your eyesight today can ensure a clearer, brighter tomorrow.

Don’t Cut and Run on Ukraine

Like many Americans, my wife and I were both embarrassed and disgusted by the Oval Office ambush of Ukrainian President Volodymyr Zelenskyy by Donald Trump and JD Vance.  We were so upset by this disgraceful treatment of the visiting president of a sovereign nation that we followed the lead of a friend and immediately ordered “I Stand With Ukraine” T-shirts.

The Oval Office meeting, held on February 28, 2025, was ostensibly intended to finalize a mineral rights agreement between the United States and Ukraine. The deal was seen as a strategic move to reduce US dependence on Chinese rare earth minerals and to support Ukraine’s economy amidst its ongoing conflict with Russia.

In what appeared to be a planned attack, Vice President Vance berated President Zelenskyy, making false claims of ingratitude on the part of Ukraine. President Trump quickly escalated the situation by criticizing Zelenskyy’s approach to the war and asserting that Ukraine was “gambling with World War III.”  He then demanded that Zelenskyy admit that Ukraine was responsible for both initiating and prolonging the war, and that he could end it at any time by making a deal.

If there is any doubt that this was a planned and likely scripted meeting on the part of the Trump administration, you have only to look at Donald Trump’s closing statement for the meeting: “I think we’ve seen enough. This is going to be great television.”

The fallout from this event has significant implications for international diplomacy and the ongoing conflict in Eastern Europe. The suspension of U.S. military aid to Ukraine following the meeting has raised concerns about Ukraine’s ability to defend itself against Russian advances. Ukrainian officials expressed disappointment but remained defiant, with one military official stating, “we will fight with or without their help.”

President Trump has labeled Zelenskyy a dictator who is unwilling to negotiate peace. He claims that Ukraine initiated hostilities against the Russian-speaking population, requiring Russia to intervene. These claims have long since been debunked, yet Donald Trump continues to repeat them. It has been interesting this past week to watch Trump nominees try to avoid saying whether they believe Russia invaded Ukraine. They evaded questions by saying they didn’t have all the facts, or that it wasn’t appropriate for them to respond, when obviously they did not want to lie under oath and claim that Russia had not invaded Ukraine.

Russian officials and state media reacted with approval to the Oval Office clash.  China, Syria, North Korea and Iran also supported the Trump administration’s approach. 

The French President and the British Prime Minister both reaffirmed their commitment to Ukrainian sovereignty and condemned the manner in which the meeting was conducted.

Decide with whom you prefer to have the United States aligned, our long-standing allies and other democratic governments, or with autocrats and dictators. 

We invite you to join us and proudly proclaim “I STAND WITH UKRAINE.”

Superheroes of the American Revolution

The American Revolution was fought not just by great leaders but by ordinary men and women who sacrificed everything for the promise of liberty. These “superheroes” of the Revolution—everyday soldiers of the Continental Army—endured unimaginable hardships and proved their resilience and commitment to a cause greater than themselves.

Who Were the Soldiers?
The typical soldier in the Continental Army was a young, able-bodied man in his late teens or twenties. However, recruits ranged widely in age, from boys as young as 16 to older men in their 40s or 50s. They came from all walks of life, reflecting the agrarian and small-town character of colonial America.

Work Background
Most soldiers were farmers or farm laborers, the backbone of the colonial economy. Others worked as apprentices or tradesmen, honing skills in blacksmithing, carpentry, and shoemaking. In coastal regions, fishermen and sailors also joined the ranks, bringing valuable maritime experience. Whatever their occupation, enlistment often meant leaving behind grueling but steady work, placing enormous burdens on their families and communities.

Education
Formal education was limited for most enlisted men. Literacy rates in colonial America, though higher than in Europe, were modest. Many soldiers could read and write only minimally, though these skills were sufficient for reading orders or sending letters home. Officers were generally better educated, often hailing from wealthier families with access to classical training and instruction in leadership and military strategy.

Family Life
Family ties were integral to the soldiers’ lives. Most were unmarried young men, but some older recruits left wives and children behind. Married soldiers relied on their families to manage farms and households in their absence, with women stepping into traditionally male roles to keep homes running. Communities often influenced enlistment decisions, with entire groups of men from the same town joining together, fostering camaraderie and mutual responsibility.

Why They Fought
Motivations for joining the Continental Army varied:
⦁ Patriotism: Many believed passionately in independence and the ideals of liberty and self-governance.
⦁ Economic Opportunity: For poorer colonists, enlistment promised steady (albeit delayed) pay and land grants after the war.
⦁ Community Expectations: Peer pressure and local leaders often spurred enlistments.
⦁ Adventure: For some young men, the army offered a chance for excitement and novelty.

Life in the Continental Army
Soldiers in the Continental Army faced extraordinary challenges that tested their endurance, commitment, and morale.
Logistical Struggles
The army constantly grappled with a lack of basic supplies:
⦁ Food: Soldiers often endured long periods of hunger, relying on inconsistent local contributions and sometimes going days without eating.
⦁ Clothing: Many lacked proper uniforms, footwear, and blankets, suffering in harsh weather; some even died from exposure.
⦁ Ammunition: Weapons and ammunition were scarce, forcing soldiers to scavenge from battlefields.

At Valley Forge in the winter of 1777–1778, these shortages reached a critical point, with thousands suffering from frostbite, near starvation, and exposure.

Extreme weather compounded the soldiers’ difficulties. Winter encampments like Valley Forge were marked by freezing temperatures, snow, and overcrowded, unsanitary conditions that led to outbreaks of smallpox, typhus, and dysentery.

Soldiers marched long distances with heavy packs, often on empty stomachs and in worn-out shoes. The physical strain was enormous, and separation from families added emotional stress. Many struggled to adapt to military life, which was vastly different from their previous experiences as farmers or tradesmen.

Financial Hardships
The fledgling American government struggled to fund the war:
⦁ Soldiers were rarely paid on time, leading to frustration and occasional mutinies.
⦁ Promised wages were often months or years late, making it difficult for soldiers to support their families.

Inconsistent Leadership and Training
Early in the war, the army lacked professional training and experienced leadership. While General George Washington provided steadfast guidance, many officers were political appointees with little military expertise. This began to change when Baron von Steuben arrived at Valley Forge, introducing systematic training and discipline.

Psychological Strain
The Revolutionary War dragged on for eight years, leaving soldiers to question whether their sacrifices would lead to victory. Early defeats against the better-equipped British Army demoralized many, and desertion rates were high. Still, the shared belief in the cause of liberty and the support of local communities kept many soldiers in the fight.

The Role of Communities
The army’s survival depended on civilian support. Local farmers, tradesmen, and women provided food, clothing, and moral encouragement. Civilians risked their lives to aid soldiers, and the collective belief in independence buoyed spirits even in the darkest times.

Conclusion
The common soldiers of the Continental Army were true superheroes of the American Revolution. Despite enduring hunger, cold, disease, and financial instability, they fought with unwavering determination. Their sacrifices laid the foundation for a new nation, proving that the quest for freedom often requires immense personal and collective sacrifice.

Sources:
⦁ Robert Middlekauff, The Glorious Cause: The American Revolution, 1763–1789
⦁ Caroline Cox, A Proper Sense of Honor: Service and Sacrifice in George Washington’s Army
⦁ Library of Congress: American Revolution resources

Oppression in Politics: Totalitarian and Authoritarian Systems

Since January 20th there has been extensive use of the terms authoritarian and totalitarian to refer to the actions of the current administration.  While totalitarian and authoritarian are often used interchangeably, they represent distinct forms of governance with critical differences. If we’re going to hold rational discussions about these concepts, we should be using the same terminology.

A totalitarian government seeks to control every aspect of public and private life, including the political, economic, social, and cultural domains. A guiding ideology is central, used to unify and dominate society and enforced by propaganda, indoctrination, and censorship. The government leaves no room for personal freedoms or independent thought, frequently relying on widespread surveillance, police-state tactics, and brutal suppression of dissent. All institutions, including the media, education, the economy, and religion, are state-controlled.

Examples include Nazi Germany, unified under an ideology of racial purity, and Stalin’s Soviet Union, ostensibly organized under a Marxist ideology.  Both governments maintained control of their populations through propaganda, brutal police actions, terror, and murder.

An authoritarian government is characterized by strong central power and limited political freedoms, but it does not seek to control all aspects of life.  Unlike totalitarian regimes, authoritarian states often allow some degree of personal freedom in areas like culture, business, or religion, as long as these do not challenge political authority.  Typically, these regimes are pragmatic and focused on maintaining power rather than enforcing an all-encompassing ideology.  They are more likely to be organized around the personality of a dictatorial leader.  While repression is common, it is often less pervasive and is targeted primarily at political opponents.

Franco’s Spain had limited political freedoms but allowed religious and cultural autonomy.  Putin’s Russia allows limited economic freedom for members of the Russian oligarchy.

The main distinction lies in the scope of control.  Totalitarian regimes seek to control all aspects of life and demand ideological conformity.  Authoritarian regimes primarily focus on political power and allow some personal autonomy as long as it does not threaten the regime.

In summary, all totalitarian governments are authoritarian, but not all authoritarian governments are totalitarian.

Waiting For The Reichstag Fire

On the evening of February 27th, 1933 the German Reichstag burst into flames. This attack on the German national parliament building was viewed by many as an attack on Germany itself.

A Dutchman named Marinus van der Lubbe was arrested at the scene almost immediately after the fire erupted. Although van der Lubbe confessed to setting the fire alone, the Nazi Party quickly claimed that the fire was part of a widespread communist conspiracy and used this claim to push for emergency powers. Many people believe the Nazis may have set the fire themselves, using it as a pretext to declare emergency rule.

Adolf Hitler persuaded German President Paul von Hindenburg to issue the “Decree for the Protection of the People and the State,” which suspended civil liberties, including freedom of speech, press, and assembly. It allowed for the arrest and detention of political opponents without due process. Thousands of communists and socialists were arrested.

Within a month new elections were held. While the Nazis did not win an outright majority, they used the fire to create fear that led to passage of the “Enabling Act” on March 23, 1933. The act gave Hitler dictatorial powers, effectively ending democracy in Germany.

The Reichstag Fire was a crucial turning point in world history. Whether it was a Nazi-engineered false flag operation or the act of a lone arsonist, it provided Hitler with the excuse he needed to dismantle democracy and establish a totalitarian dictatorship. It is a chilling example of how fear and propaganda can be weaponized to erase freedom, a lesson that remains relevant today.

Telehealth: Revolutionizing Healthcare

Or Is It Simply a Band-Aid?

When I first started hearing about telemedicine in the 1990s, I was dubious at best. How could I treat a patient I couldn’t examine? Too many things ran through my mind. I couldn’t listen to their heart, I couldn’t listen to them breathe, I couldn’t even look in their throat or their ears. What if I needed an EKG? How could I check their blood pressure? I was worried that telemedicine might be “second-rate medicine.”

I was worried about misdiagnosis and overprescribing antibiotics. If you couldn’t actually examine a patient, you might decide to play it safe and prescribe an antibiotic whether it was really needed or not. It might result in people being sent to the emergency room who might have been treated as an outpatient if you could have examined them in person.

As I looked into it, I discovered that the idea of telemedicine was not really new. As early as 1879, the British medical journal The Lancet discussed the possibility of using the telephone, then a revolutionary new technology, to reduce unnecessary doctors’ visits.  It took the advent of the computer age and audio-video technology to make telemedicine a real possibility.  But even then, I was still skeptical. I preferred to see my patients in person and did not get involved in telemedicine until the great societal upheaval of COVID.

I happened to retire from the emergency department three months before COVID hit. I was still doing primary care two days a week at an employees’ clinic. Like everyone else, we were shut down.

Reluctantly, we decided the only way to provide a service to our patients was to start using telehealth. Of course, we had none of the audio-video equipment we needed, so we initially did it by telephone. That just confirmed most of my worries about providing poor care. We soon acquired the audio-video capabilities, which gave us a little more insight into the patients we were dealing with. Over the next few months, I learned who was and was not a good candidate for telemedicine and how I could best care for patients I could not physically examine. I’m going to share with you some of the things that I’ve learned over the past four years. Thankfully, telehealth is now the exception rather than the rule, as it was early in COVID. But it’s here to stay, and we need to learn how to make it work.

Advantages of Telehealth

Convenience and Accessibility: Telehealth’s most immediate and tangible benefit is convenience. With the simple click of a button, patients can consult a physician from the comfort of their home. This is particularly helpful for those living in rural areas or those who are physically unable to travel to a clinic or hospital. According to a study by the American Medical Association, telehealth has increased access to care for patients who otherwise might not be able to receive it, whether due to geographical limitations, lack of transportation, or mobility issues.

For working professionals or parents who find it difficult to carve out time for in-person visits, telehealth allows consultations to occur from anywhere, drastically reducing travel time and missed work or family obligations. Patients also benefit from shorter wait times, as virtual queues tend to move more quickly than physical ones.

 Cost Efficiency:  Telehealth services can be more cost-effective for both patients and healthcare providers. For patients, the expenses associated with travel, parking, and time away from work are minimized. Healthcare providers, particularly in large hospital networks, can allocate resources more efficiently by integrating telemedicine into their workflow. Many telehealth services also offer more affordable consultation fees compared to in-office visits. A report from the National Bureau of Economic Research found that telemedicine visits are often less expensive for both insurers and healthcare systems.

Continuity of Care:  Telehealth allows for more frequent follow-ups, which is critical for managing chronic diseases such as diabetes, hypertension, and asthma. Instead of requiring patients to come to the clinic for every minor adjustment or medication change, telehealth allows for regular check-ins from home. This facilitates better long-term disease management and patient compliance. It can also enable quick intervention in cases where a patient’s symptoms escalate, potentially reducing the likelihood of emergency room visits.

Disadvantages of Telehealth

Limited Physical Examination:  The inability to perform a comprehensive physical examination is a significant limitation of telehealth. While many aspects of healthcare can be effectively managed through conversation, video, and shared data, some conditions require a hands-on exam. For example, a doctor might not be able to detect subtle signs of a skin condition, a heart murmur, or abdominal tenderness through a video screen. This limitation can hinder accurate diagnoses and delay proper treatment.

Privacy and Data Security:  Healthcare data is among the most sensitive forms of personal information. The shift to telehealth introduces significant concerns about data security, especially given the increase in cyberattacks on healthcare systems. The Health Insurance Portability and Accountability Act (HIPAA) mandates strict guidelines for protecting patient privacy, but not all telehealth platforms may be fully compliant. In some cases, platforms may use third-party applications that could compromise patient information. The risk of hacking, data breaches, or improper data handling adds another layer of complexity to the telehealth debate.

Connectivity Issues: High-speed internet is a luxury that is still not available in many rural and underserved areas. Telehealth relies heavily on stable and fast internet connections to facilitate real-time communication between patient and provider. In regions where broadband access is limited, telehealth appointments can be riddled with delays, interruptions, or complete disconnections. This not only disrupts the flow of the consultation but can also compromise the quality of care provided.

Lack of Universal Standards: Unlike in-person healthcare, where the processes are well-established and regulated, telehealth practices can vary significantly between providers and systems. The lack of universal standards for telehealth can lead to inconsistencies in the quality of care. Some platforms might not integrate well with electronic health records (EHRs), making it difficult for physicians to access a complete patient history during the virtual consultation.  Platforms may not function seamlessly across different devices (e.g., Android vs. iOS) or different browsers. Technical support may not always be readily available to address these issues, leading to delays in care or missed appointments.

Medical Problems Not Appropriate for Telehealth

While telehealth has proven to be effective for certain conditions, it is not a one-size-fits-all solution. There are specific medical problems that necessitate an in-person visit, where a physical examination and specialized equipment are crucial.

 Acute Injuries and Trauma:  Telehealth is not suitable for diagnosing or treating acute injuries such as fractures, deep cuts, burns, or other types of trauma. These conditions require immediate hands-on evaluation, imaging (e.g., X-rays or CT scans), and possibly surgical intervention. A telehealth consultation cannot provide the necessary tools to address these problems adequately, and any delays in care could worsen the patient’s condition.

Cardiovascular Emergencies: Conditions such as chest pain, heart attack symptoms, or strokes demand immediate in-person evaluation. The time-sensitive nature of these issues means that telehealth would not be appropriate for diagnosis or treatment. Patients experiencing these symptoms require rapid testing, monitoring, and possibly life-saving interventions that cannot be performed remotely.

Neurological Symptoms: Patients presenting with acute neurological symptoms such as sudden onset of weakness, slurred speech, confusion, or seizure activity require immediate in-person evaluation. These symptoms could indicate a stroke, transient ischemic attack (TIA), or another serious neurological condition that cannot be diagnosed or managed through a telehealth appointment.

Surgical Consultations: While telehealth can be a valuable tool for follow-up appointments post-surgery, the initial evaluation for surgical candidates should take place in person. Surgeons often rely on physical examinations and imaging results to determine whether surgery is necessary and to plan the procedure effectively.

Striking a Balance

Telehealth has transformed healthcare in a multitude of ways, providing unprecedented access to care for millions of patients. Its convenience, cost efficiency, and ability to promote continuity of care make it a powerful tool in the modern healthcare landscape. However, the limitations of telehealth, especially in cases requiring hands-on care or in emergencies, cannot be ignored. As healthcare systems continue to integrate telehealth into routine practice, it is essential to strike a balance between virtual and in-person care to ensure that all patients receive the level of medical attention they need. For now, I believe telehealth should be viewed as a complement to, rather than a replacement for, traditional healthcare.

Hijacked Healthcare: A System in Crisis

For more than 30 years I have watched our health care system become increasingly politicized. As a physician, I have become concerned with the direction it has recently taken.

Until the early 20th century, healthcare was mostly private, and medical expenses were paid out of pocket. Early calls for national health insurance began with labor organizations and were quickly joined by progressive politicians. President Franklin Roosevelt wanted to include health insurance in the Social Security Act of 1935 but was unable to get it passed. President Harry Truman also proposed a national health insurance program in 1945, but it was denounced as socialized medicine.  All these efforts were opposed by business interests, conservative politicians, particularly southern ones, and, surprisingly, the American Medical Association.

Finally, in the 1960s, as part of his “Great Society” programs, President Lyndon Johnson pushed for the passage of both Medicare and Medicaid. Rising health care costs under President Richard Nixon led to the introduction of Health Maintenance Organizations (HMOs) as an attempt to encourage cost efficiency. President Ronald Reagan reduced federal health care spending and pushed for more privatization. In the 1990s President Bill Clinton attempted to introduce universal health coverage, but it was met with fierce opposition from the insurance industry, business, and the Republican Party, which labeled it government “overreach.”

In 2010, President Obama’s Affordable Care Act (ACA), also called “Obamacare,” became the most significant health care reform since Medicare and Medicaid. It too faced legal challenges and political resistance, with Republicans repeatedly attempting to repeal it. During his first term, President Donald Trump reduced ACA funding and repealed the individual mandate penalty that had required people who did not maintain health insurance to pay a fee. The elimination of the penalty weakened the law and reduced the number of people who sought coverage.  We can expect further efforts to weaken the provisions of the ACA, but given that it is now well entrenched in the US healthcare system, it is unlikely to be completely repealed.

While early health care programs faced significant controversy and strong debate, progress in providing expanded coverage and improved care was continuous.  I’m concerned that we’re about to enter an era where many of our gains in public health are going to be reversed.  The United States remains unique among wealthy nations as the only one without universal health care and I fear that we will begin to lose what gains we have made over the past several decades. 

I’ve written previously about my concerns with vaccine resistance and the elimination of vaccination requirements for school children. I believe that this is an impending public health disaster and I’m afraid there are even greater disasters on the horizon. 

Robert F. Kennedy Jr. has been nominated by President Trump to be the Secretary of Health and Human Services, and by the time you read this he may well have been confirmed. During his confirmation hearings Kennedy made a few positive statements. He expressed an intent to increase focus on chronic diseases such as diabetes and obesity. He indicated support for rural hospitals. He would like to increase training for physicians in addiction care and increase access to treatment programs. He has also indicated plans to improve the American diet by targeting ultra-processed foods and contaminants in food and by placing restrictions on food additives. He has also proposed reforms that include stricter FDA oversight of the food supply.

However, there are several very troubling aspects to his nomination. He has a history as a vaccine denier, although he is currently denying that denial. He has said he is not anti-vaccine but pro-safety. He has stated he will support polio and measles vaccines and that all his children have been vaccinated. (In 2020, while speaking on the podcast of his nonprofit organization Children’s Health Defense, Kennedy said that he would do anything, pay anything, to be able to go back in time and avoid giving his children the vaccines that he gave them.)  Given his history of anti-vaccine statements and the fact that he profits from anti-vaccine litigation, it is likely he will return to his previous anti-vaccine positions once confirmed.

He has proposed significant changes to both the CDC and the NIH including significant staff changes. He has proposed redirecting funding to preventative/alternative medicine. 

Most troubling is his poor understanding of Medicare and Medicaid programs. During questioning he showed a lack of understanding of the funding sources and statutory requirements of the two programs. 

The Centers for Disease Control and Prevention (CDC) faces considerable threat. House Republicans have proposed a $1.8 billion cut (22%) to the CDC’s budget. These budget cuts target programs that address opioid overdoses, firearm injuries, and food safety monitoring. This budget conflicts with Kennedy’s statements about his priorities, and it remains to be seen how the conflict will be resolved. The Heritage Foundation’s Project 2025 has advocated splitting the CDC into two separate entities: one for data collection and another for limited public health guidance. The intent is to reduce its influence on social policies. The administration has already imposed communications restrictions, requiring that CDC announcements, social media posts, and scientific reports undergo political review. There is currently a proposal to reduce in-house reviews of medical research; there is even a proposal to “deputize the public” to challenge scientific findings used in regulations. This would leave medical research open to review by the least qualified. Unfortunately, the current nominee for CDC director, David Weldon, a physician and former Republican congressman, has signaled his intent to narrow the agency’s scope and his support for administration policies.

Highly contentious issues such as gender affirming care and reproductive health have already been severely restricted. It is likely that these areas will come under continued attack by the current administration. 

This administration also poses a threat to global health. By executive order the US was withdrawn from the World Health Organization. Additionally, the US Agency for International Development (USAID) has been significantly reduced with all major programs placed on hold. Not only does USAID support foreign aid programs, but it is also a major player in global health. 

USAID sponsored programs identify and monitor disease outbreaks, provide treatment and preventive measures for local populations and provide global disease alerts that help protect United States citizens.  We are already seeing the beginnings of a worldwide humanitarian healthcare emergency.  Not only will this affect healthcare systems but eventually the economic systems in countries who have lost their access to modern medical assistance.  We will lose the advanced notice about disease outbreak and spread.  Without this remote surveillance, it is possible that we may be caught unaware by the next pandemic until it is ravaging our population. 

This administration claims to support “the average American,” yet it seems intent on destroying the health of us all.
