Grumpy opinions about everything.


Military Purges and Democratic Stability: Why History Still Matters

When political power is on the line, history shows that the military often becomes the make-or-break institution. Authoritarian leaders—from Hitler to Erdogan—have long understood that a professional military answers to the state, not to any one person. That independence can be inconvenient for leaders who want fewer limits to their power. So, the classic move is simple: replace seasoned, independent officers with people whose primary loyalty is personal rather than constitutional.

This isn’t speculation; it’s a familiar historical pattern.

How Authoritarians Reshape Militaries

Professional militaries promote based on experience, training, and merit. They’re built to resist illegal orders and to stay out of domestic politics. For an authoritarian-leaning leader, military professionalism is a potential obstacle. Purges serve a purpose: clear out officers who take institutional norms seriously, and elevate those who won’t push back.

Two cases illustrate how this works.

Hitler and the German Army

After consolidating political power, Hitler moved aggressively to dominate the military. In 1934, the army was pressured to swear a personal oath of loyalty to him—not to the state or constitution.

By 1938 he removed two top commanders, Werner von Blomberg and Werner von Fritsch, through trumped-up scandals after they questioned his rush toward war. Dozens of senior generals were pushed out soon after.

The goal was not efficiency—it was control.

Turkey After the 2016 Coup Attempt

Following the failed coup, President Erdogan launched the largest purge in modern Turkish history. Tens of thousands across the military, police, and judiciary were arrested or fired, including nearly half of Turkey’s generals.

Later reporting showed that many dismissed officers had no link to the coup at all; they were targeted for being politically unreliable or pro-Western.

These cases differ in scale and context, but the pattern is strikingly similar: the professional military is reshaped to serve the leader.

What Healthy Civil–Military Relations Look Like

In stable democracies, civilian leaders set policy, but the military retains professional autonomy. Officers swear loyalty to the constitution. Promotions are merit-based. And there’s a bright line between national service and political allegiance.

One important safeguard: every member of the U.S. military swears an oath to the Constitution, and under military law is obligated to refuse unlawful orders. It’s not optional—it’s core to American military ethics.

Research consistently shows that professional, apolitical militaries strengthen democracies, while politically entangled militaries make coups and repression more likely.

The Current U.S. Debate

Since early 2025, Defense Secretary Pete Hegseth has removed or sidelined more than two dozen generals and admirals, raising alarms within the military and among lawmakers. The moves include the unprecedented firing of a sitting Chairman of the Joint Chiefs of Staff and significant cuts to senior officer billets.

Hegseth has framed these moves as reforms—streamlining, eliminating “woke politicization,” and aligning leadership with the administration’s national-security priorities.

Many inside the services describe the environment as unpredictable and politically charged. Officers report confusion about why certain leaders are removed and others promoted, and some say the secretary’s rhetoric has alienated the very institution he’s trying to lead. Public reporting describes an “atmosphere of uncertainty and fear” inside the officer corps.

Similarities and Differences to Classic Purges

Where patterns overlap

  • Large-scale personnel changes in a short time
  • Emphasis on loyalty to a person rather than institutional norms
  • Limited transparency in the selection and removal process
  • Signals that dissent or disagreement are disqualifying

Where the U.S. still differs

  • Congress can investigate and slow actions
  • Courts remain independent (for now)
  • Officers swear loyalty to the Constitution, not the president
  • No arrests, detentions, or manufactured scandals
  • The press is free to report and criticize

Why This Matters

Institutional Readiness

Purges can weaken the military by removing seasoned leaders and creating gaps in institutional memory.

Professionalism

If officers think advancement depends on political alignment instead of performance, the talent pipeline changes. Some of the best people simply leave.

Civil–Military Trust

The relationship between elected leaders and the military rests on mutual respect. Reports of intimidation or political litmus tests damage that trust.

Democratic Stability

Democracies depend on militaries that stay out of politics. History shows that once political loyalty becomes the main metric for advancement, the slope toward politicization—and eventually erosion of democratic norms—gets much steeper.

The Real Question

It’s not whether current events equal Turkey in 2016 or Germany in 1938. They don’t.

The real question is much simpler:

Will we maintain a military that is professional, apolitical, and loyal to the Constitution—or move toward a military where career survival depends on political loyalty?

That direction matters far more than any single personnel decision.

Bottom Line

History shows that authoritarianism doesn’t arrive all at once; it arrives incrementally. One of the clearest patterns is reshaping the military to reward personal loyalty over constitutional loyalty.

The United States still has strong guardrails: congressional oversight, rule of law, open media, and a military culture steeped in constitutional commitment. But those guardrails only work if they’re maintained—by political leaders, by officers, and by citizens paying attention. Many are concerned that the deployment of military forces in American cities, and their use against purported drug traffickers, is a way to acclimate senior officers to following questionable orders.

Watching these trends isn’t alarmist. It’s simply responsible. It’s our duty as citizens.

Three Shades of Left

Understanding Classical Socialism, Democratic Socialism, and Social Democracy in Today’s America

If you’ve ever wondered what politicians really mean when they throw around words like “socialism” or “social democracy,” you’re not alone. These ideas used to live mostly in political theory textbooks. Now they show up in campaign speeches and social media debates. With figures like Bernie Sanders and groups like the Democratic Socialists of America bringing these ideas into the mainstream, it’s worth sorting out what each actually means.

Even though classical socialism, democratic socialism, and social democracy all claim to focus on fairness and reducing inequality, they take very different routes to get there. Understanding those differences helps make sense of what’s really being argued about in American politics today.

Classical Socialism: The Original Blueprint

Classical socialism came out of the 19th century, when industrial capitalism was grinding workers down and a couple of guys named Karl Marx and Friedrich Engels thought they had the fix. Their idea: workers should collectively own and control the means of production — factories, land, and major industries.

This wasn’t just about taxing the rich. It was about redesigning the whole system from the ground up, through violent revolution if necessary. In theory, private property creates exploitation; collective ownership ends it. In practice, that often means top-down control by the state, with economies planned from above — as seen in the Soviet Union or Maoist China.

The central ideas of classical socialism are collective ownership of big industries and central or cooperative planning instead of market competition. Production is aimed at meeting needs, not profits, with the eventual goal of a classless, stateless society. Classical socialism accepts that revolution will most likely be necessary for implementation.

In theory, classical socialism wipes out worker exploitation and extremes of wealth. Its central tenet is that production serves human needs, not corporate profit. In practice, it often leads to authoritarian government, clumsy economic planning, and little room for innovation or dissent.

Would it work in America?
Probably not. The U.S. has deep cultural roots in individualism and private enterprise. Replacing markets with centralized planning would clash hard with both our Constitution and national temperament.

The Siblings of Socialism

In the real world, classical socialism has produced two offspring, the confusingly named democratic socialism and social democracy. While they share many similarities, the major difference is that democratic socialism aims to replace capitalism, while social democracy aims to reform capitalism and make it more humane.

Democratic socialism

Democratic socialism shares many of classical socialism’s goals but emphasizes getting there through elections — not revolution. It aims to establish central control of key parts of the economy while protecting some political freedom and most civil rights.

The vision of Democratic Socialism is collective (public) ownership of major industries like energy, transportation, manufacturing, and communications. The economy would be directed and managed by the government, but that government would be elected, not an authoritarian state. Within individual industries there would be worker self-management and workplace democracy. Private businesses would still be allowed on a small scale: think Mom and Pop retail. It proposes gradual reform, not violent upheaval, while maintaining democracy and civil liberties.

There are several major drawbacks to democratic socialism. Progress can be slow, easily reversed, and still subject to bureaucratic inefficiencies. Competing globally with capitalist economies might also prove tough. To me, the biggest drawback is practical: how would major corporations, financial institutions, and wealthy businesspeople be convinced to peacefully hand over control of large portions of the economy to a “people’s collective”?

How it fits in the U.S.:
Democratic Socialism has grown in popularity, especially among younger voters, though many of them seem to equate the label with making things fairer rather than with the reality of public ownership of major industries.

Bernie Sanders and Alexandria Ocasio-Cortez wear the label proudly. Still, the idea of government control of a significant portion of the economy faces serious resistance here. Realistically, it’s more a movement that nudges policy leftward than a model ready for prime time.

Social Democracy: Capitalism with Guardrails

Social democracy takes a different track. It doesn’t want to abolish capitalism — it wants to civilize it. Think Scandinavia: private ownership, strong markets, but also universal healthcare, paid leave, and free college.

The central elements of Social Democracy are a mixed economy with both public and private sector control. In some models, the government directly manages public services such as healthcare, energy, and transportation. In other models, these services remain in private hands under strong government regulation.

Regardless of the chosen model, a Social Democracy is a strong welfare state with universal benefits. In this context, “welfare” means earned support for hard-working citizens. Perhaps it should be called an earned-benefits state, since the term welfare carries a pejorative implication for some.

There is strong market regulation to prevent unfair competition, price gouging, and monopolies that are detrimental to public good. There is a progressive tax program designed to reward productivity while heavily taxing passive or nonproductive income. These taxes are used to fund generous public services.

The government remains elective and responsive to the public. And it’s proven to work: Nordic countries show that capitalism can coexist with equality and innovation. While it is expensive, and high taxes can be a political lightning rod, it leaves capitalism’s basic structure intact. There is a constant risk that inequality can creep back if protections weaken.

In the U.S. context:
Social democracy may be the most realistic option. As social scientist Lane Kenworthy puts it, America already is a social democracy — just not a particularly generous one. We’ve got Medicare, Social Security, public education — we just underfund them compared to our European cousins.  The reality is that income lost to increased taxation is regained through decreases in insurance premiums, healthcare costs, education expenses and retirement expenses. 

With Elon Musk on the cusp of becoming the world’s first trillionaire we have to ask: “How much is enough before they accept their social responsibility to the working people that made their wealth possible?”  The bottom line is that when the ultra-wealthy are required to pay their fair share of taxes, public services become affordable. We should be supporting people, not yachts.

What’s Realistically Possible Here?

Culturally, Americans value freedom, competition, and property rights. Yet polls show younger voters are warming up to “socialism,” even if most don’t seem to be clear about the specifics. Institutionally, the U.S. political system makes sweeping change tough. Our winner-take-all elections favor a two-party system that leaves little room for socialist parties to grow independently.

Democratic Socialism may continue to shape the conversation, but full socialism — especially the classic Marxist kind — is not likely to take hold here. From my perspective, the most realistic option, Social Democracy, is too often overlooked in these discussions.

Given that, the path of least resistance looks like expanded Social Democracy: things like a revised and equitable tax code, universal healthcare, free or subsidized higher education, paid family leave, stronger labor laws, and public investment in infrastructure and green energy.

Social Democracy looks like the most attainable path — not a revolution, but an evolution toward a fairer society.  Only time will tell.

The Republic of Indian Stream: America’s Forgotten Frontier Nation

Did you know that there was once an independent republic in the farthest reaches of northern New Hampshire, where the dense forests blend into the Canadian wilderness? Neither did I until I came across it in a fascinating book titled A Brief History of the World in 47 Borders by Jonn Elledge.

It was a short-lived but remarkable experiment in self-government. For three years in the 1830s, the settlers of a disputed border region declared themselves citizens of an independent republic—complete with their own constitution, legislature, and militia. They called it the Republic of Indian Stream, a name that today sounds almost mythical, yet it was a genuine, functioning democracy. Their story blends frontier improvisation, international diplomacy, and Yankee self-reliance—and it leaves us with a curious artifact: a constitution written not by statesmen in Philadelphia, but by farmers, loggers, and merchants caught between two competing nations.

A Territory in Limbo

The roots of the Indian Stream story go back to the Treaty of Paris (1783), which ended the American Revolution. The treaty defined the U.S.–Canada border but used vague geographic language—particularly the phrase “the northwesternmost head of the Connecticut River.” No one could agree which of several small tributaries the treaty meant.

The ambiguity created a slice of wilderness—about 200 square miles—claimed by both the United States and British Lower Canada (now Quebec). For decades, the region existed in a gray zone. Both countries sent tax collectors and law officers, both demanded military service, and neither provided clear legal protection. Residents couldn’t vote, hold secure property titles, or rely on either government’s courts. To make matters worse, they were sometimes forced to pay taxes twice—once to New Hampshire and once to Canada.

Origins of the Republic

By the late 1820s, frustration had reached a boiling point. Attempts to resolve the border dispute were unsuccessful—including arbitration by the King of the Netherlands, agreed to in 1827, which failed when the United States rejected his 1831 decision as favoring Great Britain.

With both sides still pressing their claims, the settlers decided they’d had enough of outside interference. On July 9, 1832, they convened a local meeting and declared independence, forming the Republic of Indian Stream. Their constitution—modeled on American state constitutions—began with a simple premise: authority rested with “the citizens inhabiting the territory.”

This wasn’t an act of rebellion but one of survival. The settlers wanted peace, order, and local control. Their goal was to withdraw from ambiguous regulation and to create a government that could function until the border question was finally settled.

The Constitution of Indian Stream

The constitution of the Republic, adopted the same day they declared sovereignty, was an impressively crafted document for a community of barely 300 people. It reflected the settlers’ familiarity with republican ideals and their determination to govern themselves fairly.

Key features included:

  • Democratic foundation: All authority stemmed from the citizens.
  • Annual elections: A single House of Representatives made the laws, and a magistrate acted as both executive and judge.
  • Judicial simplicity: Local justices of the peace handled disputes—there were no elaborate court hierarchies.
  • Individual rights: Residents enjoyed protections derived from U.S. constitutions—trial by jury, due process, and freedom from arbitrary arrest.
  • Defense and civic duty: Citizens pledged to defend their independence and assist one another in emergencies.

Despite its modest scale, the system worked. The republic passed laws, issued warrants, collected taxes, and even mustered a small militia to maintain order.

Life on the Frontier

Life in Indian Stream resembled that of many frontier communities: logging, farming, hunting, and trading. The land was rough, winters long, and access to distant markets limited. Yet the people thrived through cooperation and self-reliance. Trade with both Canadian and New Hampshire merchants continued—proof that practicality often trumped politics on the frontier.

The republic’s remote location provided a degree of safety from interference, but not immunity. Both British and American agents continued to assert claims, and occasional arrests or skirmishes kept tensions high.

The End of the Republic

The experiment in independence lasted only three years. In 1835, a dispute between an Indian Stream constable and a Canadian deputy sheriff triggered a diplomatic crisis. Canada sent troops to assert control, prompting New Hampshire’s governor to respond in kind.

Realizing they were caught between two competing governments, the citizens voted in April 1836 to accept New Hampshire’s jurisdiction. Indian Stream became part of the town of Pittsburg, and peace was restored.

The larger boundary issue wasn’t fully settled until the Webster–Ashburton Treaty of 1842, which formally placed Indian Stream within the United States.

Legacy of a Lost Republic

Today, little remains of the Republic of Indian Stream except New Hampshire Historical Marker #1 and a scattering of homesteads near the Connecticut Lakes.

Yet its legacy is profound. It may have lasted only three years, but its story reflects the broader American frontier experience: independence, inventiveness, and determination to live free from arbitrary rule. In an era defined by rigid borders and powerful states, the memory of Indian Stream reminds us that freedom once depended, not on lines on a map, but on the courage of people willing to draw their own lines.

The story also illustrates the complexities of nation-building in the early American period, when borders remained fluid and communities sometimes had to forge their own path toward self-governance. While the republic was short-lived, it stands as a testament to the ingenuity and determination of America’s frontier settlers, who refused to accept statelessness and instead chose to create their own nation in the wilderness.

The Indian Stream constitution reminds us that political order is not always imposed from above; sometimes, out of necessity, it is created from below. The settlers were neither revolutionaries nor idealists—they simply wanted clear rules, fair courts, and predictable taxes. Ordinary citizens, faced with legal chaos and neglect, designed a functioning democracy grounded in fairness and mutual responsibility.

That such a tiny community would craft its own constitution speaks to the enduring appeal of constitutional government in the early 19th century. Even on the edge of two empires, far from capitals and legislatures, these settlers turned to a familiar American solution: write it down, elect your leaders, and hold them accountable every year.  Hopefully we will be able to keep their spirit and live up to the example of Indian Stream.

Powdered Wigs and Politics: The Rise and Fall of America’s Most Distinguished Hair Trend

I’ve been spending a lot of time recently researching and writing about the 250th Anniversary of the American Revolution and I keep asking myself, “What’s up with the wigs?”   Have you ever wondered why the Founding Fathers look so impossibly fancy in their portraits?  Well, you can thank a French king and a syphilis epidemic. The elaborate wigs worn by early American leaders weren’t just fashion statements—they were complex social symbols that said everything about who you were, what you could afford, and how seriously you wanted to be taken.

Where It All Started

The wig craze didn’t begin in America. It started across the Atlantic when France’s King Louis XIII went bald prematurely in the 1600s and decided to cover it up with a wig. But it was his son, Louis XIV, who really kicked things into high gear. When the Sun King started losing his hair, he commissioned elaborate wigs that became the epitome of aristocratic style. European nobility, desperate to emulate French sophistication, quickly followed suit.

The practice also had a less glamorous origin story. Syphilis was rampant in 17th-century Europe, and one of its unfortunate side effects was hair loss. Wigs conveniently covered up this telltale symptom while also hiding the sores and blemishes that came with the disease.

Europe in the 1600s and 1700s also had frequent outbreaks of lice and other parasites. Shaving one’s natural hair short and wearing a wig—which could be cleaned, boiled, or deloused more easily—became a practical solution. Powdering helped keep wigs fresh and masked odors.

By the time the fashion crossed the ocean to colonial America in the early 1700s, wigs had become standard attire for anyone with social pretensions.

Status on Your Head

In colonial America, your wig announced your place in society before you even opened your mouth. The most expensive and elaborate wigs featured long, flowing curls that cascaded past the shoulders—these full-bottomed wigs could cost the equivalent of several months’ wages for an average worker. Wealthy merchants, successful plantation owners, and colonial officials wore these statement pieces to project authority and refinement.

Professional men like doctors, lawyers, and clergy typically wore more modest styles. The “tie wig” gathered hair at the back with a ribbon, while the “bob wig” featured shorter hair that ended around the neck. These styles were practical enough for men who actually had to work, but still formal enough to command respect. Even the style of curl mattered—tight curls suggested conservatism and tradition, while looser waves indicated a more progressive outlook.

Working-class men generally couldn’t afford real wigs. Some wore simple caps or went bareheaded, while others might invest in a cheap wig made from horsehair or goat hair for special occasions. The quality difference was obvious—human hair wigs, especially those made from blonde or white hair, were luxury items that only the wealthy could obtain.

Many men who did not wear wigs but still wanted the fashionable look would grow their own hair long, pull it into a queue (ponytail), and powder it. George Washington is a good example — portraits show his natural hair powdered white, not a wig.

The Daily Reality of Wig Life

Maintaining these hairpieces was no joke. Owners had to powder their wigs regularly with starch powder, often scented with lavender or orange, to achieve that distinctive white or gray color that signaled refinement. The powder got everywhere, which is why men often wore special dressing gowns during the powdering process.

Wigs required regular cleaning and restyling by professionals called peruke makers or wigmakers. These craftsmen commanded good money in colonial cities, advertising their services alongside other luxury trades. The hot, humid summers in places like Virginia and South Carolina made wig-wearing particularly miserable, but fashion demanded sacrifice.

The Revolutionary Shift

By the time of the American Revolution, attitudes toward wigs were already changing. The shift happened for several interconnected reasons, and it reflected broader transformations in American society.

First, the Revolutionary War itself promoted practical thinking. Military officers found elaborate wigs impractical in the field, and the democratic ideals of the Revolution made aristocratic European fashions seem pretentious. Many younger revolutionaries, including Thomas Jefferson, stopped wearing wigs as a political statement against Old World affectation.

A young Jefferson with a wig

Second, France—the original source of wig fashion—underwent its own revolution in 1789. As French revolutionaries literally beheaded the aristocracy, powdered wigs became associated with the despised nobility. What had once symbolized sophistication now suggested tyranny and excess.

In Great Britain, Parliament introduced a tax on hair powder in 1795 as part of Prime Minister William Pitt the Younger’s revenue-raising measures. The law required anyone who used hair powder to purchase an annual certificate costing one guinea (a little over $200 in today’s money). This contributed to the growing sense that wigs were an unnecessary extravagance. Meanwhile, changing ideals of masculinity emphasized natural simplicity over artificial ornamentation.

By the early 1800s, the wig had largely disappeared from everyday American life. A new generation of leaders, including Andrew Jackson, proudly displayed their natural hair. The transition happened remarkably quickly—within a single generation, wigs went from essential to absurd. By the 1820s, anyone still wearing a powdered wig looked hopelessly outdated, clinging to a world that no longer existed.

The Legacy

Today, elaborate wigs survive primarily in British courtrooms, where some judges still wear them in formal proceedings—a deliberate echo of legal tradition. The powdered wigs of the Founding Fathers remain iconic, instantly recognizable symbols of early American history, even though the men who wore them were already abandoning the fashion by the time they built the new nation.
