Grumpy opinions about everything.

Author: John Turley

Understanding Herd Immunity

Your Community’s Shield Against Disease

Picture your community as a fortress. The stronger the walls and the more guards on duty, the harder it becomes for invaders to breach the defenses. Herd immunity works similarly—it’s your community’s invisible shield against infectious diseases, and vaccination is the primary way we build and maintain that protection.

Initial observations of herd immunity arose from livestock studies in the early twentieth century. Farmers noticed that once most animals in a herd recovered from a disease, future outbreaks diminished or disappeared altogether. Public health scientists later confirmed that this same principle applies to humans.

What Is Herd Immunity?

Herd immunity means that enough people in a group or area have achieved immunity against a virus or other infectious agent so that it becomes very difficult for the infection to spread. When a critical proportion of the population becomes immune, called the herd immunity threshold, the disease may no longer persist in the population, ceasing to be endemic.

Think of it like a firebreak in a forest. If enough trees have already been burned (past infection) or treated with flame retardant (vaccination), the fire has a harder time jumping from tree to tree. Similarly, with herd immunity, the chain of transmission is disrupted.

Individuals who are immune to a specific disease act as a barrier to its spread, slowing or preventing transmission to others. This protection can come from two main sources: surviving a natural infection or receiving vaccines. However, vaccination is by far the safer and more reliable path to immunity.

The Math Behind Community Protection

The magic number for herd immunity isn’t the same for every disease—it depends on how contagious the illness is. Scientists use the basic reproduction number (R₀), the average number of people a single infected person goes on to infect in a fully susceptible population, to work this out. The herd immunity threshold is roughly 1 – (1/R₀). For measles, one of the most contagious diseases (R₀ ≈ 15), that works out to 1 – (1/15) ≈ 0.933, so about 93% of the population must be immune. Polio, which is less contagious, requires about 80%.
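
Written out as a worked calculation (a minimal sketch using the standard approximation, with the measles R₀ quoted above and a polio R₀ implied by the 80% figure), the threshold looks like this:

```latex
% Herd immunity threshold p_c as a function of the basic reproduction number R_0.
% Simple homogeneous-mixing approximation. R_0 = 15 for measles is the figure
% quoted above; R_0 = 5 for polio is an assumption implied by the 80% threshold.
\begin{align*}
  p_c &= 1 - \frac{1}{R_0} \\
  \text{Measles } (R_0 = 15):\quad p_c &= 1 - \tfrac{1}{15} \approx 0.933 \approx 93\% \\
  \text{Polio } (R_0 = 5):\quad p_c &= 1 - \tfrac{1}{5} = 0.80 = 80\%
\end{align*}
```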

For COVID-19, the target has been a moving one. At the start of the pandemic, researchers thought that immunizing 60% to 70% of the world’s population, through vaccination or infection, would be enough to reach herd immunity. However, the contagiousness of the delta and omicron variants has made researchers rethink that number. It could now be as high as 85%.

Protecting the Most Vulnerable

Here’s where herd immunity becomes truly meaningful: it’s not just about personal protection—it’s about creating a safety net for those who need it most. Herd immunity gives protection to vulnerable people such as newborn babies, elderly people and those who are too sick to be vaccinated. In every community, you will find individuals in these categories, making herd immunity that much more important.

Consider these community members who depend on herd immunity:

– Newborns who are too young to receive certain vaccines

– People undergoing cancer treatment whose immune systems are compromised

– Elderly individuals whose immune responses may be weaker

– Those with autoimmune diseases who cannot safely receive live vaccines

– People with severe allergies to vaccine components

These people depend on others getting vaccinated so that they can be protected indirectly. When vaccination rates drop in a community, these vulnerable populations face the greatest risk.

Vaccination: The Cornerstone of Herd Immunity

While natural infection can provide immunity, vaccination is the only viable path to herd immunity for most diseases. The alternative—letting diseases spread naturally—comes with devastating costs. Achieving herd immunity the ‘natural’ way would mean that many people would die and many more would fall ill, some seriously.

Vaccines have transformed herd immunity from a risky process—one that relied on dangerous natural infection—into a safe and reliable public health strategy. When people are vaccinated, they receive a controlled stimulus that trains their immune systems to recognize and fight particular pathogens, without causing the disease itself. Widespread vaccination reduces the pool of susceptible hosts, “starving” the disease of opportunities to spread.

Real-world examples demonstrate vaccination’s power. In 2000, measles was declared eliminated in the U.S. In 2019, however, a surge of new cases was recorded as a result of declining vaccination rates, a clear demonstration of how much herd immunity depends on sustained vaccination.

The success stories of vaccination are impressive: Global vaccination campaigns have eradicated smallpox from the planet, and they have eliminated polio from almost all countries in the world.

A Historical Speculation: What If We Had Vaccines in the Past?

Note: The following section involves speculation based on historical analysis.

The 1918 influenza pandemic, often called the Spanish flu, killed an estimated 50 million people worldwide—more than World War I. Sometimes called “the mother of all pandemics,” it involved a particularly virulent new strain of the influenza A (H1N1) virus that swept across the world from 1918 to 1919 and is estimated to have infected 500 million people.

Had a vaccine been available—and administered on a global scale—herd immunity might have dramatically altered the pandemic’s trajectory. Even 50–60% coverage could have slowed transmission enough to flatten the curve, sparing millions of lives. Hospitals, already overwhelmed, might have had more capacity to care for the sick.

Another instructive example is smallpox, which killed an estimated 300 million people in the 20th century alone. Historically, populations never exposed to smallpox—such as indigenous communities in the New World—suffered catastrophic losses, sometimes as high as 90% when the virus first arrived. European societies, by contrast, had some community immunity from years of prior exposure, but still suffered mortality rates as high as 25%. 

Once the smallpox vaccine became widely used, herd immunity did its work so effectively that the disease was declared eradicated in 1980—the only human disease to be eradicated globally. This success story underscores the potential power herd immunity might have had against earlier plagues.

In the 1940s and 1950s, polio terrified parents across the United States. Summer outbreaks paralyzed thousands of children each year. Once the Salk and Sabin vaccines became available, vaccination campaigns rapidly built herd immunity. Within a few decades, polio was virtually eliminated in the U.S. and reduced worldwide by over 99%. Without herd immunity, the virus would still be circulating widely today.

The Reality Check: Why Herd Immunity Isn’t Always Achievable

Modern societies are paradoxically both more capable and more vulnerable when it comes to herd immunity. Global travel means diseases can spread between continents in hours. Vaccine hesitancy, fueled by misinformation, creates gaps in immunity. At the same time, scientific advances allow us to develop vaccines faster than ever—COVID-19 vaccines were available within a year of the virus’s emergence.

The COVID-19 pandemic also revealed the complexity of herd immunity. High transmission rates, evolving variants, and waning immunity made it nearly impossible to reach a stable herd immunity threshold. Instead, vaccines reduced severity and death, while natural infections layered additional immunity in populations. The lesson: herd immunity isn’t always permanent or perfect, but even partial protection can save countless lives.

This doesn’t mean vaccination is pointless—far from it. Even when herd immunity isn’t achievable, vaccination still provides crucial individual protection and reduces the overall burden of disease in communities.

Your Role in Community Protection

Herd immunity is one of our best tools for the prevention of infectious diseases, but it is a tool that must be continuously sharpened.

Understanding herd immunity helps us see vaccination not just as a personal choice, but as a community responsibility. Every person who gets vaccinated contributes to the collective shield that protects the most vulnerable members of our communities.  It is a story about interdependence.

While the concept can seem abstract, its effects are concrete and measurable. When vaccination rates remain high, diseases that once terrorized communities become rare memories. When they drop, we see the return of preventable illnesses and, tragically, preventable deaths.

The next time you roll up your sleeve for a vaccination, remember you’re not just protecting yourself—you’re helping to maintain your community’s invisible fortress against disease.

This post reflects current scientific understanding of herd immunity and vaccination. For specific medical advice, always consult with a healthcare professional.

Smartphones, Smartwatches & Wearables for Seniors

A Simple Guide to What Helps—and What’s Just Noise

If you’re over 60 and trying to figure out whether a smartphone, smartwatch, or wearable can genuinely make life healthier—or you’re helping a spouse or parent decide—you’re not alone. A lot of people feel overwhelmed by all the features, apps, alerts, and promises.

The good news: some of this tech actually helps. It won’t replace your doctor, but it can flag early problems, keep you safer at home, and make it easier for your family or care team to stay in the loop. The trick is knowing what’s useful and what’s just hype.

Let’s walk through it in plain English.


Why This Stuff Matters Now

Ten years ago, the idea that a watch could detect a fall or an irregular heartbeat felt like science fiction. Today, it’s routine. About a third of adults over 50 now use smartwatches or other wearables—and the number keeps rising.

For many older adults, these devices have quietly become part of the “safety net” that helps them stay independent.


How Smartphones Actually Help Your Health

1. Keeping Medications on Track

If you’ve ever forgotten a pill—or doubled a dose—you’re in good company. Medication mix-ups are incredibly common.

Apps like:

  • Medisafe – shows pill images, keeps a schedule, and even sends caregiver alerts.
  • Apple’s Medications app – built right into iPhones and Apple Watches.
  • CareClinic – tracks meds, moods, blood pressure, and symptoms in one place.

Studies indexed in the National Library of Medicine show that people using reminder apps stick to their meds far better than those who don’t.

2. Telemedicine That Actually Works

Telehealth isn’t a pandemic fad anymore—it’s now a standard part of care. Apps like Walmart Health Virtual Care or Heal let you talk to a clinician on video, sometimes even with Medicare coverage. Many can pull in data from wearables so your doctor gets a bigger picture than just your office visit.

3. Everyday Tools for Wellness

Your phone can track blood pressure, sleep, relaxation, and even your medical records.

  • Qardio for blood pressure and weight
  • Insight Timer for stress and sleep
  • My Medical for storing labs and appointment notes

Simple but surprisingly useful.


Smartwatches: What They Really Do Well

Modern smartwatches are basically mini health monitors. Not perfect—but often helpful.

The genuinely useful features

  • Irregular heartbeat detection (A-fib alerts). Apple’s A-fib notification is FDA-cleared and backed by a huge 419,000-person study.
  • Fall detection. If you take a hard fall and don’t respond, the watch can call 911.
  • Walking steadiness alerts. Your phone can notice changes in your balance.
  • Sleep tracking. Good for patterns—not a medical diagnosis.
  • Blood oxygen trends. Not perfect, but another piece of data.

Devices seniors tend to like

  • Apple Watch Series 9 / Ultra 2
  • Samsung Galaxy Watch7
  • Medical alert watches (like Medical Guardian or Bay Alarm), which keep things simple and focus on emergency features.

Continuous Glucose Monitors (CGM): A Game Changer

If you or a loved one has diabetes, CGMs may be the single most meaningful wearable health tool available.

They sit on your arm or abdomen and send glucose numbers to your phone every few minutes. No more finger sticks. No guessing. No surprises.

Why seniors like them

  • Far fewer finger pricks
  • Alerts for highs or lows (can literally prevent emergencies)
  • Better long-term glucose control
  • Optional caregiver alerts

Top CGM options

  • Dexcom G7 – Medicare-covered for many users
  • FreeStyle Libre 3 – small, simple, affordable
  • Medtronic Guardian Connect – syncs with insulin pumps

In 2023, Medicare expanded coverage, so more seniors now qualify.

Speculation: non-invasive glucose sensors (no needles at all) are being tested, but none are FDA-approved yet. Expect progress in the next few years.


Other Wearables That Actually Help

Not everything is a watch:

  • KardiaMobile 6L – a pocket-sized, FDA-cleared ECG that takes a reading in 30 seconds
  • Tango Belt – a wearable “airbag” that inflates during a fall
  • Hero Health – a smart pill dispenser that takes the guesswork out of meds

These tend to be more practical than trendy.


How to Choose: Start with Your Goal

Instead of shopping features, pick the problem you’re trying to solve:

  • Worried about falls? Get a watch with fall detection.
  • Blood pressure issues? Pair your phone with a good upper-arm cuff.
  • Managing diabetes? Ask your doctor about CGM eligibility.
  • Heart rhythm concerns? Add a handheld ECG like Kardia.

And make sure the device is easy to share with family or clinicians. Apple’s Health Sharing is especially simple.


Remote Patient Monitoring (RPM)

This is where your doctor gets readings from your home devices automatically. Medicare even pays for it. It can catch early issues—like rising blood pressure—before they turn into bigger problems.

Just be aware not every clinic uses it yet.


Privacy: A Quick Reality Check

Most people assume health apps follow HIPAA. Many don’t.

  • HIPAA covers your doctor—not your app.
  • The FTC now requires some health apps to notify you of breaches.
  • Always review privacy policies to see who gets your data.  Not fun, but necessary.

What Wearables Don’t Do Well

Here’s where things get messy:

  • Heart rate sensors can misread darker skin tones, tattoos, or movement.
  • SpO₂ readings can vary widely—enough that the FDA has issued warnings.
  • Sleep trackers estimate; they don’t diagnose.
  • Step counts vary by 10–30% depending on brand.

Think of wearables as “trends over time,” not medical tests.


Downsides to Keep in Mind

A few honest drawbacks:

  • Daily or near-daily charging
  • Subscription fees that creep up
  • Too many alerts (which most people eventually shut off)
  • Physical challenges like tiny text, small buttons, stiff bands
  • Data that doesn’t always sync with your doctor’s record
  • False reassurance (“My watch didn’t alert, so I’m fine”)

None of these are dealbreakers—but they’re worth knowing.


Where This Is All Going

Wearable tech will keep getting smaller and more accurate: rings, adhesive patches, even hearing aids that monitor your vitals.

Prediction (speculation): Within a few years, AI will connect your meds, sleep, glucose, heart data, and activity into simple daily guidance you can actually use. It’s not quite here yet, but it’s coming.


The Bottom Line

Smartphones and wearables can genuinely improve health and independence—but only if you choose based on your real needs. You don’t need every bell and whistle.

Start small.
Pick one goal.
Choose one device that helps with that goal.

Sometimes a simple fall-detection watch or a glucose sensor does far more good than the fanciest new feature. Used wisely, these tools give seniors—and their families—more safety, more independence, and more peace of mind.

Three Shades of Left

Understanding Classical Socialism, Democratic Socialism, and Social Democracy in Today’s America

If you’ve ever wondered what politicians really mean when they throw around words like “socialism” or “social democracy,” you’re not alone. These ideas used to live mostly in political theory textbooks. Now they show up in campaign speeches and social media debates. With figures like Bernie Sanders and groups like the Democratic Socialists of America bringing these ideas into the mainstream, it’s worth sorting out what each actually means.

Even though classical socialism, democratic socialism, and social democracy all claim to focus on fairness and reducing inequality, they take very different routes to get there. Understanding those differences helps make sense of what’s really being argued about in American politics today.

Classical Socialism: The Original Blueprint

Classical socialism came out of the 19th century, when industrial capitalism was grinding workers down and a couple of guys named Karl Marx and Friedrich Engels thought they had the fix. Their idea: workers should collectively own and control the means of production — factories, land, and major industries.

This wasn’t just about taxing the rich. It was about redesigning the whole system from the ground up, through violent revolution if necessary. In theory, private property creates exploitation; collective ownership ends it. In practice, that often means top-down control by the state, with economies planned from above — as seen in the Soviet Union or Maoist China.

The central ideas of classical socialism are collective ownership of big industries and central or cooperative planning instead of market competition. Production is aimed at meeting needs, not profits, with the eventual goal of a classless, stateless society. Classical socialism accepts that revolution will most likely be necessary for implementation.

In theory, classical socialism wipes out worker exploitation and wealth extremes. Its central tenet is that production serves human needs, not corporate profit. In practice, it often leads to authoritarian governments, clumsy economic planning, and little room for innovation or dissent.

Would it work in America?
Probably not. The U.S. has deep cultural roots in individualism and private enterprise. Replacing markets with centralized planning would clash hard with both our Constitution and national temperament.

The Siblings of Socialism

In the real world, classical socialism has produced two offspring, the confusingly named democratic socialism and social democracy. While they share many similarities, the major difference is that democratic socialism aims to replace capitalism, while social democracy aims to reform capitalism and make it more humane.

Democratic socialism

Democratic socialism shares many of classical socialism’s goals but emphasizes getting there through elections — not revolution. It aims to establish central control of key parts of the economy while protecting some political freedom and most civil rights.

The vision of Democratic Socialism is collective (public) ownership of major industries like energy, transportation, manufacturing, and communications. The economy would be directed and managed by the government, but the government would be elected rather than authoritarian. Within individual industries there would be worker self-management and workplace democracy, and private businesses would still be allowed on a small scale—think Mom and Pop retail. It envisions gradual reform, not violent upheaval, while maintaining democracy and civil liberties.

There are several major drawbacks to democratic socialism. Progress can be slow, easily reversed, and still subject to bureaucratic inefficiencies. Competing globally with capitalist economies might also prove tough. To me, the major drawback is the question of how major corporations, financial institutions, and wealthy businesspeople could ever be convinced to peacefully hand over control of major portions of the economy to a “people’s collective”.

How it fits in the U.S.:
Democratic Socialism has grown in popularity, especially among younger voters, although many of them seem to believe it simply means making things more fair rather than supporting the reality of Democratic Socialism.

Bernie Sanders and Alexandria Ocasio-Cortez wear the label proudly. Still, the idea of government control of a significant portion of the economy faces serious resistance here. Realistically, it’s more a movement that nudges policy leftward than a model ready for prime time.

Social Democracy: Capitalism with Guardrails

Social democracy takes a different track. It doesn’t want to abolish capitalism — it wants to civilize it. Think Scandinavia: private ownership, strong markets, but also universal healthcare, paid leave, and free college.

The central elements of Social Democracy are a mixed economy with both public and private sector control. In some models, there is direct government management of public services such as healthcare, energy, and transportation. In other models, these services remain privately controlled under strong government regulation.

Regardless of the chosen model, a Social Democracy is a strong welfare state with universal benefits. Welfare in this context means earned support for hard-working citizens. Perhaps it should be called an earned-benefits state, since the term welfare carries a pejorative implication for some.

There is strong market regulation to prevent unfair competition, price gouging, and monopolies that are detrimental to the public good. There is a progressive tax program designed to reward productivity while heavily taxing passive or nonproductive income. These taxes are used to fund generous public services.

The government remains elected and responsive to the public. The model has proven to work: Nordic countries show that capitalism can coexist with equality and innovation. While it is expensive, and high taxes can be a political lightning rod, it leaves capitalism’s basic structure intact. There is a constant risk that inequality can creep back if protections weaken.

In the U.S. context:
Social democracy may be the most realistic option. As social scientist Lane Kenworthy puts it, America already is a social democracy — just not a particularly generous one. We’ve got Medicare, Social Security, public education — we just underfund them compared to our European cousins.  The reality is that income lost to increased taxation is regained through decreases in insurance premiums, healthcare costs, education expenses and retirement expenses. 

With Elon Musk on the cusp of becoming the world’s first trillionaire, we have to ask: “How much is enough before they accept their social responsibility to the working people who made their wealth possible?” The bottom line is that when the ultra-wealthy are required to pay their fair share of taxes, public services become affordable. We should be supporting people, not yachts.

What’s Realistically Possible Here?

Culturally, Americans value freedom, competition, and property rights. Yet polls show younger voters are warming up to “socialism,” even if most don’t seem to be clear about the specifics. Institutionally, the U.S. political system makes sweeping change tough. Our winner-take-all elections favor a two-party system that leaves little room for socialist parties to grow independently.

Democratic Socialism may continue to shape the conversation, but full socialism — especially the classic Marxist kind — is not likely to take hold here. From my perspective, the most realistic option, Social Democracy, is too often overlooked in these discussions.

Given that, the path of least resistance looks like expanded Social Democracy: things like a revised and equitable tax code, universal healthcare, free or subsidized higher education, paid family leave, stronger labor laws, and public investment in infrastructure and green energy.

Social Democracy looks like the most attainable path — not a revolution, but an evolution toward a fairer society.  Only time will tell.

250 Years Strong!

The Continental Marines: Birth of America’s Amphibious Warriors

When most people think of the American Revolution, they picture Continental soldiers marching across snowy battlefields or patriot militias defending their homes. But there’s another group that played a crucial role in securing American independence: the Continental Marines. These amphibious warriors served in America’s nascent naval force and proved their worth on both land and sea during the eight-year struggle for independence.

The Continental Marines, established in 1775, served as America’s first organized marine force during the Revolutionary War before being disbanded in 1783, laying the foundation for what would eventually become the modern U.S. Marine Corps.  Though short-lived, the original Marine Corps played a significant role in America’s fight for independence, setting precedents that the modern Marine Corps still honors today.

The Legislative Foundation

By the fall of 1775, the American colonies were no longer engaged in mere protest—they were in open rebellion against the British Empire. Battles had already been fought at Lexington, Concord, and Bunker Hill. The Continental Congress, led by figures like John Adams, had begun to organize a Continental Army under George Washington’s command. But many in the Congress, especially Adams, believed a navy was also essential to challenge British power at sea and disrupt its supply lines.

With a navy, it was reasoned, must come Marines—soldiers trained to serve aboard ships, conduct landings, enforce discipline, and fight in close quarters during boarding actions. This model was based on the British Royal Marines, a corps with a long and respected tradition.

The Continental Marines came into existence through a resolution passed by the Second Continental Congress on November 10, 1775. This date, which Marines still celebrate today as their birthday, marked a pivotal moment in American military history.

The Continental Marine Act of 1775 decreed: “That two battalions of Marines be raised consisting of one Colonel, two lieutenant-colonels, two majors and other officers, as usual in other regiments; that they consist of an equal number of privates as with other battalions, that particular care be taken that no persons be appointed to offices, or enlisted into said battalions, but such as are good seamen, or so acquainted with maritime affairs as to be able to serve for and during the present war with Great Britain and the Colonies.”

The legislation was part of Congress’s broader effort to create a Continental Navy capable of challenging British naval supremacy. The resolution was drafted by future U.S. president John Adams and adopted in Philadelphia. This wasn’t just about creating another military unit—Congress recognized that naval warfare required specialized troops who could fight effectively both on ships and on shore. The concept wasn’t entirely new—European navies had long employed marines for similar purposes—but the Continental Marines represented America’s first organized attempt to create a professional amphibious force. (The term amphibious didn’t come into military use until the 1930s; at the time, such troops would likely have been referred to informally as a naval landing force.)

Recruitment: From Taverns to the Fleet

The recruitment of the Continental Marines has become the stuff of legend, particularly the story of their traditional birthplace at Tun Tavern in Philadelphia. Though legend places its first recruiting post at Tun Tavern, historian Edwin Simmons surmises that it may as likely have been the Conestoga Waggon [sic], a tavern owned by the Nicholas family. Regardless of which tavern served as the primary recruiting station, the Marines can claim the unique distinction of being the only military branch “born in a bar”.

The first Commandant of the Marine Corps was Captain Samuel Nicholas, and his first Captain and recruiter was Robert Mullan, the owner of Tun Tavern. Nicholas, a Quaker-born Philadelphia native and experienced mariner, was commissioned on November 28, 1775, becoming the Continental Marines’ senior officer and only commandant throughout their existence. While his background as a Philadelphia tavern keeper may seem unusual for a military leader, his connections in the maritime community proved invaluable for recruiting. The requirement for maritime experience shaped the character of the force from its inception.

The Marines faced immediate recruitment challenges. Originally, Congress envisioned using the Marines for a planned invasion of Nova Scotia.  They expected the Marines to draw personnel from George Washington’s Continental Army.  However, Washington was reluctant to part with his soldiers, forcing the Marines to recruit independently, primarily from the maritime communities of Philadelphia and New York.

By December 1775, Nicholas had raised a battalion of approximately 300 men, organized into five companies, though this fell short of the original plan for two full battalions. Robert Mullan helped to assemble the fledgling fighting force. Plans to form the second battalion were suspended indefinitely after several British regiments of foot and cavalry landed in Nova Scotia, making the planned naval assault impossible.

Organization for Dual Service

The Continental Marines were organized as a flexible force capable of serving both aboard ships and on land. For shipboard service, Marines were organized into small detachments that could be distributed across the Continental Navy’s vessels. Their organization reflected their multi-purpose mission: they served as security forces protecting ship officers, repelling boarders and joining boarding parties during naval engagements, and as assault troops for amphibious operations. Marksmanship received particular emphasis—a tradition that continues to this day—as Marines often served as sharpshooters in naval engagements, targeting enemy officers and sailors from the rigging and fighting tops of ships.

During the Revolutionary War, the Continental Marines’ uniform directives specified a green jacket with white facings and cuffs. However, when the first sets of uniforms were actually ordered and delivered, red facings were substituted for white. The likely reason was supply availability: red cloth was easier to obtain from Continental or captured British stores. The most authoritative description comes from Captain Samuel Nicholas, who wrote from Philadelphia in 1776 that Marines were outfitted in “green coats faced with red, and lined with white.”

The uniform also included a high leather collar, or stock, ostensibly to protect the neck against sword slashes, although there is some evidence that it may actually have been intended to improve posture. This distinctive uniform item helped establish their identity as an elite force and eventually led to their treasured nickname “leathernecks”.

Shipboard Service and Naval Operations

The Continental Marines’ role aboard ship was multifaceted and crucial to naval operations. Their most important duty was to serve as onboard security forces, protecting the captain of a ship and his officers. During naval engagements, in addition to manning the cannons along with the ship’s crew, Marine sharpshooters were stationed in the fighting tops of a ship’s masts specifically to shoot the opponent’s officers and crew. These duties reflected centuries of naval tradition and drew on the example of the British Marines.

The Marines’ first major naval operation came in early 1776, when five companies joined Commodore Esek Hopkins’ Continental Navy squadron on its first cruise in the Caribbean. This deployment demonstrated their value as both shipboard security and assault troops, setting the pattern for their service throughout the war.

Major Land-Based Actions

Despite their naval origins, the Continental Marines proved equally effective in land combat. Their most famous early action was the landing at Nassau on the island of New Providence in the Bahamas in March 1776, the first landing by Marines on a hostile shore. Led by Captain Nicholas, the force of 250 Marines and sailors captured two forts and the Government House, occupied Nassau, and seized cannons and large stores of supplies over the course of 13 days. While they missed capturing the gunpowder stores (which had been evacuated before their arrival), the raid demonstrated American capability to strike British positions anywhere.

Though modest in scale, this operation carried major symbolic weight and established the Marines as America’s premier amphibious force. The operation did not decisively alter the balance of the war, but it foreshadowed the Marines’ enduring identity as a seafaring, expeditionary force. Today, the Battle of Nassau is remembered less for the supplies seized than for what it represented: the moment the Continental Marines stepped onto the world stage.

Other notable operations included raids on British soil itself. In April 1778, Marines under the command of John Paul Jones made two daring raids, one at the port of Whitehaven in northwest England and the second later that day at St. Mary’s Isle. These operations brought the war directly to British territory, demonstrating American reach and resolve. While the raids had no strategic impact on the outcome of the war, they were a great morale booster when reports of them, though largely exaggerated, reached the rebellious colonies.

Official Marine Corps history also acknowledges Marine participation in the Battle of Princeton, though it wasn’t a major Marine engagement. Marines from Captain William Shippen’s company, who had been serving aboard Continental Navy ships, participated in this battle as a part of Cadwalader’s Brigade on Washington’s flank.  Some Marines were detached to augment the artillery, with a few eventually transferring to the army.  However, the Marines’ role was relatively minor compared to their more significant naval actions during this period.

The Gradual Decline

As the Revolutionary War progressed, the Continental Marines faced increasing challenges. Financial constraints plagued the Continental forces throughout the war, and the Marines were no exception. The Continental Congress struggled to fund and supply all military branches, and the relatively small Marine force often found itself at a disadvantage competing for resources with the larger Continental Army and Navy.

Recruitment became increasingly difficult as the war dragged on. After the early campaigns, Nicholas’s four-company battalion discontinued independent service, and remaining Marines were reassigned to shipboard detachments.  Their number had been reduced by transfers, desertion, and the loss of eighty Marines through disease.

The Continental Navy also faced severe challenges that directly impacted the Marines. Many ships were captured, destroyed, or sold, leaving Marines without their primary operational platform. As the naval war shifted toward privateering and smaller-scale operations, the need for organized Marine units diminished.

Beginning in February 1777, two companies of Marines either transferred to Morristown to serve in the Continental artillery batteries or left the service altogether. This transfer of Marines to army artillery units reflected the practical reality that their specialized skills were needed elsewhere as the Continental forces adapted to changing circumstances.

Disbanded at War’s End

The end of the Revolutionary War marked the end of the Continental Marines as an organized force. Both the Continental Navy and Marines were disbanded in April 1783. Although a few individual Marines briefly stayed on to provide security for the remaining U.S. Navy vessels, the last Continental Marine was discharged in September 1783.

The last official act of the Continental Marines was escorting a stash of French silver crowns (coins) from Boston to Philadelphia—a loan from Louis XVI to help establish the Bank of North America. This final mission, conducted in 1781, symbolically linked the Marines to the new nation’s financial foundations even as their military role wound down.

The disbanding reflected broader American attitudes toward standing military forces. Having won their independence, Americans were skeptical of maintaining large military establishments that might threaten republican government. The Continental Congress, facing financial pressures and political opposition to permanent military forces, chose to disband both the Navy and Marines.

Legacy

The Continental Marines’ contribution to American independence was significant despite their small numbers. In all, over the course of 7 years of battle, the Continental Marines had only 49 men killed and just 70 more wounded, out of a total force of roughly 130 Marine Officers and 2,000 enlisted. These relatively low casualty figures reflected both their effectiveness and the limited size of the force.

Rising tensions with Revolutionary France in the late 1790s led to the Quasi-War, prompting Congress to reestablish the Navy in 1798. On July 11 of that year, President John Adams signed legislation formally creating the United States Marine Corps as a permanent branch of the military, under the jurisdiction of the Department of the Navy. This new Marine Corps inherited the traditions, mission, and esprit de corps of its Revolutionary War predecessors.  Despite the gap between the disbanding of the Continental Marines and the establishment of the new United States Marine Corps, Marines honor November 10, 1775, as the official founding date of their Corps.

The Continental Marines established precedents that would shape American military doctrine for more than two centuries. The Revolutionary War not only led to the founding of the United States (Continental) Marine Corps but also highlighted for the first time the versatility for which Marines have come to be known. They fought on land, they fought at sea on ships, and they performed amphibious assaults.

The Continental Marines represented a crucial innovation in American military organization. Born from congressional resolution and tavern recruitment, these maritime warriors proved their worth in battles from the Caribbean to the British Isles. Though disbanded with the war’s end, their legacy lives on in the traditions and spirit of the modern Marine Corps. While their numbers were small and their existence brief, their impact on American military tradition proved lasting and significant.

The Republic of Indian Stream: America’s Forgotten Frontier Nation

Did you know that there was once an independent republic in the farthest reaches of northern New Hampshire, where the dense forests blend into the Canadian wilderness? Neither did I, until I came across it in a fascinating book titled A Brief History of the World in 47 Borders by Jonn Elledge.

It was a short-lived but remarkable experiment in self-government. For three years in the 1830s, the settlers of a disputed border region declared themselves citizens of an independent republic—complete with their own constitution, legislature, and militia. They called it the Republic of Indian Stream, a name that today sounds almost mythical, yet it was a genuine, functioning democracy. Their story blends frontier improvisation, international diplomacy, and Yankee self-reliance—and it leaves us with a curious artifact: a constitution written not by statesmen in Philadelphia, but by farmers, loggers, and merchants caught between two competing nations.

A Territory in Limbo

The roots of the Indian Stream story go back to the Treaty of Paris (1783), which ended the American Revolution. The treaty defined the U.S.–Canada border but used vague geographic language—particularly the phrase “the northwesternmost head of the Connecticut River.” No one could agree which of several small tributaries the treaty meant.

The ambiguity created a slice of wilderness—about 200 square miles—claimed by both the United States and British Lower Canada (now Quebec). For decades, the region existed in a gray zone. Both countries sent tax collectors and law officers, both demanded military service, and neither provided clear legal protection. Residents couldn’t vote, hold secure property titles, or rely on either government’s courts. To make matters worse, they were sometimes forced to pay taxes twice—once to New Hampshire and once to Canada.

Origins of the Republic

By the late 1820s, frustration had reached a boiling point. Attempts to resolve the border dispute were unsuccessful—including arbitration by the King of the Netherlands in 1827 that failed when the United States rejected his decision that favored Great Britain.

With both sides still pressing their claims, the settlers decided they’d had enough of outside interference. On July 9, 1832, they convened a local meeting and declared independence, forming the Republic of Indian Stream. Their constitution—modeled on American state constitutions—began with a simple premise: authority rested with “the citizens inhabiting the territory.”

This wasn’t an act of rebellion but one of survival. The settlers wanted peace, order, and local control. Their goal was to withdraw from ambiguous regulation and to create a government that could function until the border question was finally settled.

The Constitution of Indian Stream

The constitution of the Republic, adopted the same day they declared sovereignty, was an impressively crafted document for a community of barely 300 people. It reflected the settlers’ familiarity with republican ideals and their determination to govern themselves fairly.

Key features included:

  • Democratic foundation: All authority stemmed from the citizens.
  • Annual elections: A single House of Representatives made the laws, and a magistrate acted as both executive and judge.
  • Judicial simplicity: Local justices of the peace handled disputes—there were no elaborate court hierarchies.
  • Individual rights: Residents enjoyed protections derived from U.S. constitutions—trial by jury, due process, and freedom from arbitrary arrest.
  • Defense and civic duty: Citizens pledged to defend their independence and assist one another in emergencies.

Despite its modest scale, the system worked. The republic passed laws, issued warrants, collected taxes, and even mustered a small militia to maintain order.

Life on the Frontier

Life in Indian Stream resembled that of many frontier communities: logging, farming, hunting, and trading. The land was rough, winters long, and access to distant markets limited. Yet the people thrived through cooperation and self-reliance. Trade with both Canadian and New Hampshire merchants continued—proof that practicality often trumped politics on the frontier.

The republic’s remote location provided a degree of safety from interference, but not immunity. Both British and American agents continued to assert claims, and occasional arrests or skirmishes kept tensions high.

The End of the Republic

The experiment in independence lasted only three years. In 1835, a dispute between an Indian Stream constable and a Canadian deputy sheriff triggered a diplomatic crisis. Canada sent troops to assert control, prompting New Hampshire’s governor to respond in kind.

Realizing they were caught between two competing governments, the citizens voted in April 1836 to accept New Hampshire’s jurisdiction. Indian Stream became part of the town of Pittsburg, and peace was restored.

The larger boundary issue wasn’t fully settled until the Webster–Ashburton Treaty of 1842, which formally placed Indian Stream within the United States.

Legacy of a Lost Republic

Today, little remains of the Republic of Indian Stream except New Hampshire Historical Marker #1 and a scattering of homesteads near the Connecticut Lakes.

Yet its legacy is profound. It may have lasted only three years, but its story reflects the broader American frontier experience: independence, inventiveness, and determination to live free from arbitrary rule. In an era defined by rigid borders and powerful states, the memory of Indian Stream reminds us that freedom once depended not on lines on a map, but on the courage of people willing to draw their own lines.

The story also illustrates the complexities of nation-building in the early American period when borders remained fluid and communities sometimes had to forge their own path toward self-governance. While the republic was short lived, it stands as a testament to the ingenuity and determination of America’s frontier settlers, who refused to accept statelessness and instead chose to create their own nation in the wilderness.

The Indian Stream constitution reminds us that political order is not always imposed from above; sometimes, out of necessity, it is created from below. The settlers were neither revolutionaries nor idealists—they simply wanted clear rules, fair courts, and predictable taxes. Ordinary citizens, faced with legal chaos and neglect, designed a functioning democracy grounded in fairness and mutual responsibility.

That such a tiny community would craft its own constitution speaks to the enduring appeal of constitutional government in the early 19th century. Even on the edge of two empires, far from capitals and legislatures, these settlers turned to a familiar American solution: write it down, elect your leaders, and hold them accountable every year.  Hopefully we will be able to keep their spirit and live up to the example of Indian Stream.

The Eagle, Globe, and Anchor

How the Marine Corps Found Its Symbol

Few military emblems carry as much history and pride as the Marine Corps’ Eagle, Globe, and Anchor, better known as the EGA or simply as the emblem. New recruits and officer candidates work intensely to earn the right to wear this symbol. It is a source of immense pride for every Marine who achieves that distinction.

When entering the Corps, I encountered World War II veterans who affectionately called the EGA the “Birdie on the Ball.” But only Marines can take such liberties—outsiders risk offense if they use the term.

The emblem is instantly recognizable, yet few realize its deep historical roots or appreciate the transformations it has undergone to become the symbol every Marine wears today.

From Anchors to Eagles: The Early Years (1775–1868)

At its inception in 1775, the Continental Marines lacked any formal insignia. Some Marines, predominantly officers, adopted maritime icons such as the fouled anchor—an anchor entwined with rope—often emblazoned on buttons or hat plates. This design echoed the British Royal Navy and underscored their naval identity, but it was never standardized.

Uniform innovations began in the early 1800s. By 1804, Marines were using brass eagles mounted on square plates. During the War of 1812, octagonal plates appeared, embossed with eagles, anchors, drums, shields, and flags. Later designs were simplified to feature metal letters “U.S.M.” (United States Marines), reflecting the shift towards a national identity.

The example below is an officer’s coat button circa 1805-1820.

A more distinctive step came in 1821: the Corps adopted an eagle perched on a fouled anchor encircled by 13 stars, a motif featured on buttons for nearly four decades. However, similar symbols were also used by the Army and Navy, making it less than unique.

Following the Civil War, Marine Corps leadership under Brigadier General Jacob Zeilin, the seventh Commandant, sought a truly unique insignia for the service.

The Zeilin Board and the Birth of the Modern EGA (1868)

On November 12, 1868, Zeilin established a board of officers “To decide and report upon the various devices of cap ornaments of the Marine Corps.” They wasted no time: by November 19, the Secretary of the Navy, Gideon Welles, had approved the new emblem.

The board drew inspiration from the British Royal Marines’ “Globe and Laurel” emblem.

The American version added a few important touches:

  • Globe showing the Western Hemisphere: Representing the Corps’ defense of the Americas and a global presence.
  • Fouled anchor: Honoring the Corps’ naval origins.
  • Eagle: Symbolizing national service and pride.

Zeilin described the new emblem as representing the Corps’ “readiness to serve anywhere, by sea or land.”

At the same time, a distinct emblem was also created for Marine Corps musicians, still seen today on the formal red and gold uniforms of the U.S. Marine Band—“The President’s Own”.

The Motto and Later Refinements

The Latin motto, Semper Fidelis (“Always Faithful”), was introduced in 1883 under Commandant Charles McCawley, replacing previous mottoes such as Fortitudine (“With Fortitude”). Semper Fidelis became central to the Marine Corps’ ethos.

The emblem saw many variations over the decades. Initial designs featured a crested eagle—borrowed from European heraldry. Semper Fidelis appeared on a scroll held in the eagle’s beak on some versions of the emblem.

Only in 1954, with President Dwight D. Eisenhower’s Executive Order 10538, did the American bald eagle with a scroll officially become part of the emblem. This finalized the design used today.

Officer and Enlisted Differences

Since 1868, design distinctions have marked officer and enlisted EGA emblems. Officers’ original emblems were elaborate—frosted silver hemispheres with gold-plated Americas, crowned by a solid silver eagle. Enlisted emblems were brass, emphasizing practicality.

Modern officers wear a multi-piece, high-relief insignia with fine rope detailing, while enlisted Marines use a one-piece emblem. Notably, officers’ globes omit Cuba to strengthen the emblem structurally.  A running joke among enlisted personnel is that officers couldn’t find Cuba on a map.

Before WWII, officers often purchased insignia from jewelers like Bailey, Banks & Biddle, resulting in stylistic inconsistency. One museum curator quipped, “World War I eagles looked like fat turkeys.” Eventually, standardization brought the crisp, clean look seen today.

A Legacy That Endures

From 18th-century anchors to the refined Eagle, Globe, and Anchor of today, the emblem tracks the Corps’ evolution from shipboard security to a global expeditionary force. Over centuries, its form has varied—engraved by jewelers, stamped for wartime, and cast in silver for dress blues—but its meaning remains constant.

Every Marine who earns the EGA joins a tradition stretching back 250 years, defined by courage, loyalty, and the enduring promise to remain Always Faithful.

The Real Enemy of the Revolution: Disease

When you think about the American Revolution, you probably picture dramatic battles like Bunker Hill or the crossing of the Delaware. But here’s something that might surprise you: the biggest killer during the war wasn’t British muskets—it was disease. And it’s not even close.

The Numbers Tell a Grim Story

Let’s talk numbers for a second. On the American side, about 6,800 soldiers died from battlefield wounds. Sounds terrible, right? Well, disease killed an estimated 17,000 to 20,000. That’s roughly three times as many. The British and their Hessian allies faced similar odds: around 7,000 combat deaths versus 15,000 to 25,000 disease deaths.

Think about that for a moment. You were actually safer charging into battle than hanging around camp. In some regiments, disease wiped out more than a third of the troops before they even saw their first fight.

Why Was Disease So Deadly?

Picture yourself in a Revolutionary War military camp. Hundreds of men crammed together in makeshift shelters, no running water, primitive latrines dug too close to where everyone lives, and basically zero understanding of what we’d call “germ theory” today. It’s a perfect storm for infectious disease.

The big killers were:

Smallpox was the heavyweight champion of camp diseases. This virus killed about 30% of people it infected and spread like wildfire through packed military camps. Soldiers tried to protect themselves through a risky practice called inoculation—basically giving themselves a mild case of smallpox on purpose by rubbing infected pus into cuts on their skin. Without proper quarantine procedures, though, this sometimes made outbreaks worse instead of better.

Typhus (called “camp fever” back then) spread through lice and fleas. If you’ve ever been on a prolonged camping trip and felt gross after a few days, imagine that times a hundred. Soldiers lived in the same clothes for weeks, rarely bathed, and the parasites just had a field day. The fever, headaches, and diarrhea that came with typhus made it one of the most dreaded camp diseases.

Dysentery (charmingly nicknamed “bloody flux”) came from contaminated water and poor sanitation. When your latrine is 20 feet from your water source and you don’t understand how disease spreads, this becomes pretty much inevitable. The severe diarrhea weakened soldiers to the point where many couldn’t fight even if they wanted to, and it made them even more susceptible to other diseases.

Malaria was especially important in the South, where mosquitoes thrived in the humid climate. This one actually played a fascinating role in how the war ended—but more on that in a bit.

When Disease Changed Everything

The 1776 invasion of Canada was a disaster largely because of smallpox. Out of 3,200 American soldiers in the Quebec campaign, 1,200 fell sick. You can’t mount much of an offensive when more than a third of your army is flat on their backs with fever. Similarly, during the siege of Boston, Washington couldn’t effectively engage the British because so many of his troops were sick with smallpox. These weren’t just setbacks—they were strategic catastrophes.

This is what pushed George Washington to make one of his boldest decisions in 1777: he ordered a mass inoculation of the Continental Army. This was controversial and dangerous at the time, but it worked. Washington had survived smallpox himself as a young man, so he understood both the risks and the benefits. The inoculation program probably saved the army from complete collapse.

Medical “Treatment” Was Often Worse Than Nothing

Here’s where things get really grim. Eighteenth-century medicine was basically medieval. Doctors believed in “balancing the humors” through bloodletting—literally draining blood from already weakened soldiers. They also gave powerful laxatives to people who were already suffering from diarrhea. Yeah, let that sink in.

Pain relief meant opium-based drinks or just straight alcohol. Some doctors used herbal remedies, but results were inconsistent at best. Quinine helped with malaria, though nobody really understood why. Mostly, if you got seriously sick, your survival came down to luck and a strong constitution.

Valley Forge: The Turning Point

Valley Forge is famous for being a brutal winter encampment, and disease was a huge part of why it was so terrible. Scabies left nearly half the troops unable to serve. Dysentery and camp fever killed somewhere between 1,700 and 2,000 soldiers during that single winter—and remember, these weren’t battle casualties. These men died from preventable diseases in what was supposed to be a safe encampment.

But Valley Forge taught the Continental Army a crucial lesson. After that nightmare winter, military leaders started taking sanitation seriously. They began focusing on camp hygiene, protecting water supplies, placing latrines away from living areas, and making sure soldiers could bathe and wash their clothes and bedding.

Baron von Steuben is famous for teaching the Continental Army how to march and drill, but he also deserves credit for implementing serious sanitation reforms. These changes helped prevent future disease outbreaks and kept the army functional for the rest of the war.

The Secret Weapon at Yorktown

Here’s one of my favorite historical details: mosquitoes may have helped win American independence. At Yorktown, roughly 30% of Cornwallis’s British army was knocked out by malaria and other diseases during the siege. The British commander was trying to hold off the American and French forces while also dealing with the fact that almost a third of his troops were too sick to fight.

Many American soldiers from the southern colonies had grown up with malaria and had some partial immunity. The British? Not so much. Some historians even think Cornwallis himself might have been suffering from malaria, which could have affected his decision-making. His second-in-command, Brigadier General Charles O’Hara, was definitely seriously ill during the siege. Fighting a war while you can barely stand is a pretty significant handicap.

The Bigger Picture

The American Revolution shows us something important: wars aren’t just won on battlefields. They’re won by the side that can keep its soldiers alive and healthy. Disease shaped strategic decisions, determined the outcomes of campaigns, and killed far more men than any British regiment ever did.

Washington’s decision to inoculate the army was genuinely revolutionary (pun intended). It showed a willingness to embrace controversial medical practices for the greater good. The sanitation reforms that came out of Valley Forge laid groundwork for modern military medicine and influenced public health policies in the new United States.

So next time someone mentions the American Revolution, remember: while we celebrate the military victories, one of the most important battles was fought against an enemy you couldn’t see—and for most of the war, nobody really knew how to fight it.

The casualty figures and major disease outbreaks are well-documented in historical records. The specific percentages and numbers are estimates based on historical research, as precise record-keeping was limited during this period. The overall narrative about disease being the primary cause of death is strongly supported by multiple historical sources.

How A Nobel Laureate Thinks We Can Save The American Economy…But It Won’t Be Easy

I just finished People, Power, and Profits by Joseph Stiglitz, the Nobel Prize-winning economist. He wrote this near the end of Trump's first term, but honestly, the world he describes feels even more relevant now.

Stiglitz doesn’t sugarcoat it: capitalism, as we’re practicing it today, is broken. Monopolies dominate markets, inequality has gone wild, and trust in democracy is running on fumes. His proposed fix? Something he calls progressive capitalism — capitalism with guardrails, conscience, and a sense of fairness.

Stiglitz makes the case that our economic system is rigged — not by accident, but by design. Here are his most compelling arguments and what he thinks we should do about them.

1. Taxation and Rent-Seeking: The Rigged Game

Stiglitz draws a sharp distinction between making money through productive work and extracting it through what economists call “rent-seeking” – essentially, using power to skim wealth without creating value. Think of a pharmaceutical company that buys a drug patent and jacks up prices 5,000%, or telecom monopolies that divide up markets to avoid competing.

His argument is straightforward: our tax system rewards the wrong behavior. Capital gains are taxed at lower rates than wages, which means someone living off investments pays less than someone working a regular job. Meanwhile, the wealthy can afford armies of accountants to exploit loopholes that most people don’t even know exist.
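
To make the gap concrete, here's a minimal sketch in Python comparing the tax owed on the same $100,000 earned as wages versus as long-term capital gains. The flat rates are simplified, illustrative assumptions (real tax law has brackets, deductions, and payroll taxes), not a claim about anyone's actual bill.

```python
# Illustrative only: flat rates standing in for a real bracketed tax system.
WAGE_TAX_RATE = 0.32       # assumed marginal rate on ordinary wage income
CAPITAL_GAINS_RATE = 0.20  # assumed rate on long-term capital gains

def tax_owed(income: float, rate: float) -> float:
    """Deliberately oversimplified flat-rate tax on a lump of income."""
    return income * rate

income = 100_000
wage_tax = tax_owed(income, WAGE_TAX_RATE)
gains_tax = tax_owed(income, CAPITAL_GAINS_RATE)

print(f"Tax on ${income:,} of wages:         ${wage_tax:,.0f}")   # $32,000
print(f"Tax on ${income:,} of capital gains: ${gains_tax:,.0f}")  # $20,000
print(f"The investor keeps ${wage_tax - gains_tax:,.0f} more for the same income.")
```

Same dollar amount, two very different bills. That is the incentive Stiglitz wants flipped.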

What Stiglitz recommends: Tax wealth more aggressively, especially inherited wealth. Close the capital gains loophole. Tax rent-seeking activities heavily while reducing taxes on productive work and innovation. The goal isn’t just revenue – it’s changing incentives so that the path to riches runs through creating value, not extracting it.

2. Green Energy and the True Cost of Pollution

Here’s where Stiglitz gets into what economists call “externalities” – costs that businesses impose on society without paying for them. When a coal plant spews carbon into the atmosphere, we all pay through climate change and increased healthcare costs, but the plant’s balance sheet looks great.

Stiglitz argues this is fundamentally dishonest accounting. If we properly priced pollution and carbon emissions, green energy wouldn’t need subsidies to compete – fossil fuels would suddenly look much more expensive once you factor in their real costs to society.

His recommendation: Implement carbon pricing – either through a carbon tax or cap-and-trade system. Make polluters pay for the damage they cause. This isn’t about punishing business; it’s about honest accounting. Once prices reflect reality, the market will naturally shift toward cleaner energy because it’s actually cheaper when you account for all the costs.
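
Here's a tiny back-of-the-envelope sketch of what that honest accounting looks like. Every number below (the carbon price, the sticker costs, the emissions intensities) is a made-up placeholder chosen to show the mechanism, not real market data:

```python
# Hypothetical carbon-pricing illustration: a carbon tax folds each source's
# emissions back into its price. All figures are invented for illustration.
CARBON_PRICE = 50.0  # assumed $ per tonne of CO2

# source: (sticker cost in $/MWh, emissions in tonnes of CO2 per MWh)
sources = {
    "coal": (50.0, 1.0),
    "gas":  (55.0, 0.4),
    "wind": (60.0, 0.0),
}

for name, (sticker, emissions) in sources.items():
    true_cost = sticker + emissions * CARBON_PRICE
    print(f"{name:>4}: sticker ${sticker:.0f}/MWh -> ${true_cost:.0f}/MWh with carbon priced in")
```

With these invented numbers, coal looks cheapest on paper and ends up the most expensive once its emissions are paid for, while wind goes from priciest to cheapest. That reshuffling, not subsidies, is the shift Stiglitz is counting on.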

3. Big Business and Big Banks: Concentration of Power

Stiglitz has been particularly vocal about how corporate consolidation hurts everyone except shareholders and executives. His critique of “too big to fail” is sharp: concentrated economic power in tech, finance, and even agriculture undermines both democracy and efficiency. When a few firms dominate markets, they gain power over prices, wages, and even politics; they can suppress pay, block innovation, and bend regulations in their favor.

The banking sector especially concerns him. After the 2008 financial crisis, which was caused largely by reckless behavior from major banks, these same institutions emerged even larger through government-facilitated mergers. We socialized their losses through taxpayer-funded bailouts but let them keep their gains as private profits.

His recommendations: Reinstate and strengthen regulations that were stripped away, including bringing back something like the Glass-Steagall Act that separated commercial and investment banking. Break up banks that are “too big to fail.” Strengthen antitrust enforcement across all industries. Use the government’s regulatory power to promote competition rather than letting industry consolidate.

4. Money in Politics: The Feedback Loop

This is where everything connects for Stiglitz. Concentrated economic power translates directly into political power. Wealthy interests fund campaigns, lobby relentlessly, and effectively write regulations for the agencies that are supposed to oversee them. This creates a vicious cycle: economic inequality begets political inequality, which creates policies that worsen economic inequality.

Stiglitz argues that the Supreme Court’s Citizens United decision, which allowed unlimited corporate spending in elections, turbocharged this problem by treating money as speech and corporations as people.

His recommendations: Limit campaign spending and institute public financing of campaigns to reduce candidates’ dependence on wealthy donors. Place strict limits on lobbying and implement a robust “revolving door” policy that prevents government officials from immediately cashing in with the industries they regulated. Mandate transparency requirements so voters know who’s funding what. Pass Constitutional amendments if necessary to overturn Citizens United.

The Big Picture

What makes Stiglitz’s argument powerful is how these pieces fit together. You can’t fix inequality just through taxation if big corporations control the political process. You can’t address climate change if fossil fuel companies can buy enough influence to block action. Everything is connected.

His recommendations aren’t radical in historical terms – they’re actually trying to restore a balance that existed during the post-war economic boom of the 1950s.  Stiglitz’s “progressive capitalism” isn’t socialism. It’s capitalism with a conscience — one that remembers who it’s supposed to serve.

Whether you see that as a rescue plan or a recipe for red tape depends entirely on where you put your faith: in public institutions or private markets. The question is whether we have the political will to implement his recommendations despite entrenched opposition from those benefiting from the current system.

 Either way, this debate isn’t going away — it’s the one shaping the 21st-century economy.

No Kings!
