Grumpy opinions about everything.


The Broken Promises: How America Treated Its Revolutionary War Veterans

The story of how the Continental Army’s veterans were treated after winning independence reads like a betrayal. These men had endured Valley Forge, fought without pay — often without food or clothing — risking everything for a revolution that promised liberty and opportunity. What many received instead was financial ruin, confiscated land, and a harsh lesson in how political power and economic class determined who really benefited from their shared sacrifice.

The Pay That Never Came

Let me start with the most basic broken promise — pay. Continental soldiers were supposed to receive regular wages, but the Continental Congress lacked the power to tax and relied on increasingly worthless paper money. By war’s end, many soldiers hadn’t been paid in months or even years. When they finally returned home, they carried IOUs called “certificates of indebtedness” rather than actual money.

The wealthy and well-connected quickly figured out how to profit from this situation. Speculators traveled through rural areas buying up these certificates from desperate veterans at pennies on the dollar. The soldiers, facing immediate debts and no income, often had no choice but to sell. When the federal government eventually redeemed these certificates at full value under Alexander Hamilton’s financial plan in the 1790s, it was the speculators who made fortunes, not the men who had earned the money by suffering through and winning the revolution.

Pensions: Promised to Officers, Denied to Enlisted Men

The pension situation revealed the class divisions even more starkly. In 1780, Congress promised officers who served until the war’s end a pension of half-pay for life. Common soldiers received no such promise. When the officers’ pensions proved controversial and expensive, Congress “commuted” them in 1783 to a one-time payment of five years’ full pay — still nothing for the enlisted men who’d done most of the fighting and dying.

It wasn’t until 1818 that Congress finally created a pension for Continental Army privates, and even then, only for those in “reduced circumstances” — meaning you had to prove you were poor to get it. The maximum annual pension was $96, hardly generous compensation for years of service. Soldiers who had served in militia units were generally excluded. By contrast, officers had already received their commutations decades earlier and often held positions of economic and political power.

Land Bounties: Another Empty Promise

Land bounties represented another avenue where common soldiers got shortchanged. Various colonies and Congress promised land grants to encourage enlistment — typically 100 acres for privates, scaling up to 500+ acres for officers and thousands of acres for generals. But there were problems from the start.

First, much of the promised land was in frontier territories like the Ohio Country, which remained dangerous and largely unsurveyed for years after the war. Second, the process of claiming your land required navigating bureaucratic systems, paying surveying fees, and sometimes traveling hundreds of miles. Third, the land often turned out to be of poor quality or in disputed areas. The average veteran, with little education, almost no money, and no political influence, was seldom able to take advantage of the land bounty.

Predictably, speculators moved in. They bought up land bounty warrants from soldiers who lacked the resources or knowledge to claim them directly. One study found that in Virginia, which promised the most generous bounties, speculators ultimately controlled vast tracts while many veterans received little or nothing.

The Tax Collector Cometh

Here’s where the story gets particularly cruel. While veterans struggled with unpaid wages and unredeemed promises, the new state governments faced their own financial crises. They’d accumulated massive war debts and needed revenue. Their solution? Property taxes.

In Massachusetts, the legislature imposed heavy taxes payable in hard currency — gold or silver — which almost nobody in rural areas possessed. The very certificates of indebtedness the government had issued to soldiers weren’t accepted for tax payments, even though the state owed them that money. Veterans who’d sold their certificates for a fraction of their value to pay immediate debts now faced tax bills they couldn’t pay. These policies were not accidental side effects; they reflected the priorities of creditor classes concentrated in coastal towns, who preferred regressive property taxes over inflation or debt relief for veterans.

When farmers and veterans couldn’t pay these taxes, local sheriffs seized and auctioned their property. In many cases, the buyers at these auctions were the same merchant elites and speculators who’d bought up the certificates. This wasn’t accidental — it was a systematic transfer of wealth and property from those who’d fought the war to those who’d financed it, avoided personal risk, and now controlled state governments. Elites did not overtly confiscate veterans’ land through direct political targeting; instead, they relied on neutral-looking fiscal policy — strict tax collection, aggressive debt enforcement, and courts unsympathetic to insolvency — to transfer property legally. The effect was unmistakable: veterans who had fought for independence lost their farms to satisfy debts incurred during or immediately after their service, while wealthier investors accumulated land and made financial gains.

The Massachusetts situation became particularly egregious. Between 1784 and 1786, thousands of foreclosure proceedings were filed. Veterans who’d survived the war returned to find themselves losing their farms and, in some cases, being thrown into debtors’ prison.

Shays’ Rebellion: When Veterans Fought Back

The breaking point came in 1786 with Shays’ Rebellion in Massachusetts. Daniel Shays, a former Continental Army captain, led hundreds of veterans and farmers in an armed uprising against foreclosures and debt courts. They physically prevented courts from sitting, trying to halt the cascade of farm seizures. These were the soldiers who’d won independence, and they felt the new government had betrayed them.

The rebellion was suppressed by a militia funded by wealthy Boston merchants and creditors, because the state treasury lacked ready cash to pay troops. It was a stark demonstration of how thoroughly economic power had concentrated among elites, and it sent shockwaves through the political establishment. To many rural farmers, the suppression looked like creditors hiring an army to enforce unjust laws against impoverished veterans.

Interestingly, most of the rebels received pardons, and Massachusetts did eventually reduce some taxes and reform debtor laws. But the damage was done, and the pattern had been established.

The Class Divide in Revolutionary Benefits

The broader pattern is unmistakable. Officers, who were generally drawn from propertied classes, received pensions and larger land bounties, had the education and connections to navigate bureaucratic systems, and often held the political power to protect their interests. Common soldiers, usually farmers or laborers, received certificates they had to sell at a loss, faced tax collectors seizing their property, and had little political voice. They disproportionately bore the costs of the new fiscal order through unpaid or depreciated wages, lack of early pension support, and vulnerability to foreclosure, while many of the tangible financial benefits of their service migrated to wealthier elites.

Some historians argue this wasn’t conspiracy but circumstance — that the new nation genuinely lacked resources and that markets naturally concentrated certificates in wealthier hands. There’s some truth to this. The Continental Congress was genuinely broke, and state governments faced real fiscal crises.

But the specific policy choices — redeeming certificates from speculators at full value while rejecting them for tax payments, creating pensions for officers but not enlisted men, setting tax policies that required hard currency that poor farmers didn’t have — these weren’t inevitable; they were choices. They reflected the interests of those who held power in state legislatures and the Continental Congress.

The Long Echo

The treatment of Continental Army veterans established patterns that would echo through American history: promises made during wartime, broken during peace; benefits flowing more generously to officers than enlisted men; economic and political elites using legal mechanisms to transfer wealth from those who fought the revolution to those who financed it.

The first genuinely “service‑based” pension law that broadly covered surviving Continental soldiers — regardless of disability — did not arrive until 1818, three decades after the war, and it initially required proof of indigence, effectively screening out better‑off veterans and stigmatizing poorer ones. Not until the 1832 act did Congress move toward full pay for life for many surviving officers and enlisted men — including militia — based on length of service alone. But large numbers of veterans had already lost their farms, spent years in poverty, or died. The benefits came too late and too meagerly to undo decades of hardship. They were owed better.

Illustrations generated by author using ChatGPT.

Sources:

Mount Vernon Digital Encyclopedia — Veterans of the Revolutionary War https://www.mountvernon.org/library/digitalhistory/digital-encyclopedia/article/veterans-of-the-revolutionary-war/

This George Washington Presidential Library resource provides an overview of how Continental Army veterans were treated, including details on certificate speculation, payment issues, and the general economic struggles veterans faced after the war.

National Archives — Revolutionary War Pension Files https://www.archives.gov/research/military/war-of-1812/pension-files

While this link references War of 1812 pensions, the National Archives maintains extensive documentation on Revolutionary War pensions as well. The site explains the evolution of pension systems and eligibility requirements, including the 1818 act that finally provided pensions to enlisted men who could prove poverty.

Encyclopedia Virginia — Military Bounty Lands https://www.encyclopediavirginia.org/entries/military-bounty-lands/

This scholarly resource details Virginia’s land bounty system, which was among the most extensive. It documents how these bounties were promised, the challenges veterans faced in claiming them, and how speculators ultimately acquired much of the promised land.

American Battlefield Trust — Shays’ Rebellion https://www.battlefields.org/learn/articles/shays-rebellion

This article provides context on the 1786–1787 uprising in Massachusetts, explaining the economic conditions that drove veterans to armed resistance, the foreclosure crisis, and the rebellion’s impact on constitutional debates.

Massachusetts Historical Society — Shays’ Rebellion https://www.masshist.org/features/shays/

The Fatal Meeting: When Hamilton and Burr Settled Fifteen Years of Rivalry with Pistols

The story of the Hamilton-Burr duel has all the elements of a Greek tragedy: brilliant men, political ambition, an unforgiving honor culture, and an ending that destroyed victor and vanquished alike. When Aaron Burr shot Alexander Hamilton on the morning of July 11, 1804, he didn’t just kill one of America’s founding architects—he also ended his own political career and helped doom the entire Federalist Party to irrelevance. Let’s rewind the clock more than a decade to try to understand how these two gifted lawyers and Revolutionary War veterans ended up facing each other with loaded pistols.

The Long Road to Weehawken

Hamilton and Burr moved in the same elite New York political circles from the 1790s onward, but they had remarkably different temperaments and political beliefs. Hamilton was ideological, prolific, and combative—often too much so for his own good. Burr was pragmatic, opaque, self-serving, and famously hard to pin down on principle. They distrusted each other deeply.

Their rivalry stretched back to 1791, when Burr defeated Philip Schuyler for a U.S. Senate seat representing New York. This wasn’t just any political defeat for Hamilton—Schuyler was his father-in-law and a crucial Federalist ally on whom Hamilton had counted to support his ambitious financial programs. Hamilton, who was serving in George Washington’s cabinet as Treasury Secretary, never forgave Burr for this loss. In correspondence from June 1804, Hamilton himself referenced “a course of fifteen years competition” between the two men.  

Their philosophical differences ran deep. Hamilton was an ideological Federalist who dreamed of transforming the United States into a modern economic power rivaling European empires through strong central government, industrial development, and military strength. Burr, by contrast, approached politics more pragmatically—he saw it as a vehicle for advancing his own interests and those of his allies rather than as a way to implement sweeping political visions. As Burr himself allegedly said, politics were nothing more than “fun and honor and profit”. Hamilton viewed Burr as fundamentally dangerous due to his lack of fixed ideological principles. Hamilton wrote in 1792 that he considered it his “religious duty to keep this man from office”.

The election of 1800 brought their animosity to a boiling point. Due to a quirk in the original Constitution’s electoral system, Thomas Jefferson and his running mate Aaron Burr tied in the Electoral College with 73 votes each, allowing the Federalists to briefly consider elevating Burr to the presidency.  The decision went to the House of Representatives, and Hamilton—despite despising Jefferson’s Democratic-Republican politics—campaigned hard to ensure Jefferson won the presidency rather than Burr. Hamilton argued that Jefferson, however wrong in policy, had convictions, whereas Burr had none.  In the end, Jefferson gained the presidency and Burr became Vice President, but their relationship was never collegial and Burr was excluded from any meaningful participation in Jefferson’s administration.

By 1804, it was clear Jefferson would not consider Burr for a second term as Vice President. Desperate to salvage his political career, Burr made a surprising move: he sought the Federalist nomination for governor of New York, switching from his Democratic-Republican affiliation. It was a strange gambit—essentially betting that his political enemies might support him if it served their interests. Hamilton, predictably, worked vigorously to block Burr’s ambitions yet again. Although Hamilton’s opposition wasn’t the only factor, Burr lost badly to Morgan Lewis, the Democratic-Republican candidate, in April 1804.

The Cooper Letter and the Challenge

The immediate trigger for the duel came from a relatively minor slight in the context of their long feud. In February 1804, Dr. Charles Cooper attended a dinner party where Hamilton spoke forcefully against Burr’s candidacy. Cooper later wrote to Philip Schuyler describing Hamilton’s comments, noting that Hamilton had called Burr “a dangerous man” and referenced an even “more despicable opinion” of him. This letter was published in the Albany Register in April, after Burr’s electoral defeat.

When the newspaper reached Burr, he was already politically ruined—still Vice President of the United States, but with no prospects for future office. He demanded that Hamilton acknowledge or deny the statements attributed to him. What followed was a formal exchange of letters between the two men and their representatives that lasted through June. Hamilton refused to give Burr the straightforward denial he sought, explaining that he couldn’t reasonably be expected to account for everything he might have said about a political opponent during fifteen years of competition. Burr, seeing his honor impugned and his options exhausted, invoked the code of honor and issued a formal challenge to duel.

Hamilton found himself in an impossible position. If he retracted remarks he believed were substantially true, he would compromise his integrity; if he refused to duel, the code of honor would brand him a coward and his political career would effectively end anyway. Hamilton had personal and moral objections to dueling. His eldest son Philip had died in a duel just three years earlier, at the same Weehawken location where Hamilton and Burr would meet. Yet Hamilton calculated that maintaining his political influence required him to conform to the codes of honor that governed gentlemen’s behavior in early America.

Dawn at Weehawken

At 5:00 AM on the morning of July 11, 1804, the men departed Manhattan from separate docks. They were each rowed across the Hudson River to the Heights of Weehawken, New Jersey—a popular dueling ground where at least 18 known duels took place between 1700 and 1845. They chose New Jersey because, while dueling had been outlawed in both New York and New Jersey, the penalties there were less severe.

Burr arrived first around 6:30 AM, with Hamilton landing about thirty minutes later. Each man was accompanied by his “second”—an assistant responsible for ensuring the duel followed proper protocols. Hamilton brought Nathaniel Pendleton, a Revolutionary War veteran and Georgia district court judge, while Burr’s second was William Van Ness, a New York federal judge. Hamilton also brought Dr. David Hosack, a Columbia College professor of medicine and botany, in case medical attention proved necessary.

Shortly after 7 a.m., the seconds measured out ten paces, loaded the .56‑caliber pistols, and explained the firing rules before Hamilton and Burr took their positions. What exactly happened next remains one of history’s enduring mysteries. The seconds gave conflicting accounts, and historians still debate the sequence and meaning of events.

In a written statement before the duel, Hamilton expressed religious and moral objections to dueling, worry for his family and creditors, and professed no personal hatred of Burr, yet concluded that honor and future public usefulness compelled him to accept. By some accounts, Hamilton had also written to confidants indicating his intention to “throw away my shot”—essentially to deliberately miss Burr, satisfying the requirements of honor without attempting to kill his opponent. Burr, by contrast, appears to have aimed directly at Hamilton.

Some accounts suggest Hamilton fired first, with his shot hitting a tree branch above and behind Burr’s head. Other versions claim Burr shot first. There’s even a theory that Hamilton’s pistol had a hair trigger that caused an accidental discharge after Burr wounded him.

What’s undisputed is the outcome: Burr’s shot struck Hamilton in the lower abdomen, with the bullet lodging near his spine. Hamilton fell, and Burr reportedly started toward his fallen opponent before Van Ness held him back, worried about the legal consequences of lingering at the scene. The two parties crossed back to Manhattan in their respective boats, with Hamilton taken to the home of William Bayard Jr. in what is now Greenwich Village.

Hamilton survived long enough to say goodbye to his wife Eliza and their children. He died at 2 PM on July 12, 1804, approximately 31 hours after being shot.

Political Aftershocks

The nation was outraged. While duels were relatively common in early America, they rarely resulted in death, and the killing of someone as prominent as Alexander Hamilton sparked widespread condemnation. The political consequences proved catastrophic for everyone involved—and reshaped American politics for the next two decades.

Hamilton’s death turned him into a Federalist martyr. Even many who had disliked his arrogance now praised his intellect, service, and sacrifice. His economic vision, already embedded in American institutions, gained a kind of posthumous authority.

For Aaron Burr, the duel destroyed him politically and socially. Murder charges were filed against him in both New York and New Jersey, though neither reached trial—a grand jury in Bergen County, New Jersey indicted him for murder in November 1804, but the New Jersey Supreme Court quashed the indictment. Nevertheless, Burr fled to St. Simons Island, Georgia, staying at the plantation of Pierce Butler, before returning to Washington to complete his term as Vice President.

Rather than restoring his reputation as he’d hoped, the duel made Burr a pariah. He would never hold elected office again. His subsequent attempt to regain power through what historians call the “Burr Conspiracy”—an alleged plan to create an independent nation along the Mississippi River by separating territories from the United States and Spain—led to a treason trial in 1807. Chief Justice John Marshall presided and Burr was ultimately acquitted, but the trial further cemented Burr’s reputation as a dangerous schemer. He spent his later years quietly practicing law in New York.

For the Federalist Party, Hamilton’s death proved even more devastating than Burr’s personal ruin. Hamilton had been the party’s intellectual architect and most effective leader. At the time of his death, the Federalists were attempting a comeback after their national defeat in the 1800 election. Without Hamilton’s energy, strategic thinking, and ability to articulate a compelling vision for the country, the Federalists lost direction. As one historian put it, “The Federalists would be unable to find another leader as forceful and energetic as Hamilton had been, and their movement would slowly suffocate before finally petering out in the early 1820s”. The party’s decline ended what historians consider the first round of partisan struggles in American history.

An interesting footnote: while many Federalists wanted to portray Hamilton as a political martyr, Federalist clergy broke with the party line to condemn dueling itself as a violation of the sixth commandment. These ministers used Hamilton’s death as an opportunity to wage a moral crusade against the practice of dueling, helping to accelerate its decline in American culture—particularly in the northern states where it was already losing favor.

The duel produced a triple tragedy: Hamilton dead at age 47 (or 49—his birth year remains disputed), Burr politically destroyed despite being acquitted of murder charges, and the Federalist Party fatally weakened at a critical moment in American political development.

The Hamilton–Burr duel sits at the intersection of politics, personality, and culture. It reminds us that the early republic was not a calm, rational experiment run by marble statues, but a volatile environment shaped by ego, fear, and ambition. Institutions were young, norms were fragile, and reputations were all-important. What began as fifteen years of professional rivalry and personal enmity ended with two brilliant men eliminating each other from the political stage, neither achieving what he’d hoped for through their fatal meeting on the heights of Weehawken.

Sources

Encyclopedia Britannica “Burr-Hamilton duel | Summary, Background, & Facts” https://www.britannica.com/event/Burr-Hamilton-duel

History.com “Aaron Burr slays Alexander Hamilton in duel” https://www.history.com/this-day-in-history/july-11/burr-slays-hamilton-in-duel

Library of Congress “Today in History – July 11” https://www.loc.gov/item/today-in-history/july-11

National Constitution Center “The Burr vs. Hamilton duel happened on this day” https://constitutioncenter.org/blog/burr-vs-hamilton-behind-the-ultimate-political-feud

National Park Service “Hamilton-Burr Duel” https://www.nps.gov/articles/000/hamilton-burr-duel.htm

PBS American Experience “Alexander Hamilton and Aaron Burr’s Duel” https://www.pbs.org/wgbh/americanexperience/features/duel-alexander-hamilton-and-aaron-burrs-duel/

The Gospel Coalition “American Prophets: Federalist Clergy’s Response to the Hamilton–Burr Duel of 1804” https://www.thegospelcoalition.org/themelios/article/american-prophets-federalist-clergys-response-to-the-hamilton-burr-duel-of-1804/

Wikipedia “Burr–Hamilton duel” https://en.wikipedia.org/wiki/Burr–Hamilton_duel

World History Encyclopedia “Hamilton-Burr Duel” https://www.worldhistory.org/article/2548/hamilton-burr-duel/

For more information about the history of dueling in early America see my earlier post: Pistols at Dawn, The Rise and Fall of the Code Duello.

Images generated by author using ChatGPT.

How Do We Know What We Know? An Introduction to Epistemology

In a world awash with conflicting information, how do we know what is true? How do we know what to believe? How can we even begin to assess it?  My ongoing interest in critical thinking has led me to epistemology.

Epistemology is the branch of philosophy that asks one of the most fundamental questions humans can consider: How do we know what we know? It’s essentially the study of knowledge itself—what counts as knowledge, how we acquire it, and what makes our beliefs justified or true.

Think about it this way: You believe the Earth revolves around the Sun. But why do you believe that? Maybe you learned it in school, maybe you’ve seen evidence from astronomy, or maybe you just trust what scientists tell you. Epistemology digs into questions like these—examining the difference between simply believing something and actually knowing it.

The field explores several core questions. What’s the difference between knowledge and mere opinion? Can we ever be absolutely certain about anything, or is all knowledge provisional?  What are the sources of knowledge—experience, reason, intuition, testimony from others? Epistemology also wrestles with skepticism—the worry that our beliefs might be systematically wrong. How do we know the world isn’t an illusion? How do we justify trusting memory, perception, or testimony? When is it rational to believe something, and when should we remain skeptical?

Epistemologists have developed various theories over the centuries. Some argue that true knowledge comes primarily through sensory experience (empiricism), while others emphasize the role of reason and logic (rationalism). Still others focus on whether knowledge requires absolute certainty or just a high degree of justified confidence.

These might seem like abstract concerns, but epistemology has real-world implications. When you’re deciding whether to trust a news source, evaluating scientific claims, or determining what evidence you need before making an important decision, you’re engaging with epistemological questions.  Epistemology doesn’t tell you what to believe about climate change, vaccines, or economics—but it sharpens your sense of why some beliefs deserve more confidence than others. It encourages intellectual humility without sliding into cynicism.

Ultimately, epistemology concerns itself with concepts of knowledge, belief, truth, and justification. Its primary focus is understanding not only what is believed, but why those beliefs are considered warranted. Far from being an ivory-tower exercise in abstract philosophy, epistemology fosters disciplined critical thinking—a vital skill in societies inundated with competing perspectives.

Source: For a comprehensive academic overview of epistemology and its central questions, see the Stanford Encyclopedia of Philosophy’s entry: https://plato.stanford.edu/entries/epistemology/

From Reagan Conservative to Social Democrat: A Political Evolution

Political beliefs rarely change overnight. Mine certainly didn’t. My journey from Reagan-era conservatism to social democracy unfolded slowly, shaped less by ideology than by lived experience and an accumulating body of evidence about what actually works.

Morning in America

Like many Americans of my generation, my political awakening came during the Reagan years. The message was optimistic and reassuring: limited government, free markets, individual responsibility, and a strong national defense would restore American greatness. Reagan’s charisma made complex economic ideas feel like common sense. Lower taxes would spur growth. Deregulation would unleash innovation. Markets would reward effort and discipline.

That worldview was personally affirming. Success was earned. Failure reflected poor choices. Government’s role should be narrow—defense, public order, and little else. Social programs, we were told, fostered dependency rather than opportunity. It was a coherent framework, and for a time, it seemed to fit the facts.

Cracks in the Foundation

By the 1990s, inconsistencies began to surface. Economic growth continued, but inequality widened. Entire industrial communities collapsed despite residents working hard and playing by the rules. The benefits of “trickle-down” economics were not trickling very far.

Personal experiences made the abstractions impossible to ignore. Families lost health insurance because of pre-existing conditions. Medical bills pushed insured households into bankruptcy. These outcomes weren’t failures of character; they were failures of systems.

The 2008 financial crisis shattered whatever illusions remained. Financial institutions that preached personal responsibility engaged in reckless speculation, then received massive government bailouts, while homeowners were left to face foreclosure. Like millions of others, I lost nearly half of my retirement savings. The contradiction was glaring: socialism for the wealthy, harsh market discipline for everyone else. Individual responsibility meant little when systemic risk brought down the entire economy.

A Turning Point

Job loss during the Great Recession completed the lesson. Despite qualifications and work history, employment opportunities vanished. Unemployment benefits—once easy to dismiss in theory as handouts—became essential in practice. The bootstrap mythology doesn’t hold up when the floor is pulled away.

This period also exposed the fragility of employer-based healthcare and retirement systems. COBRA coverage was unaffordable. 401(k)s evaporated. The safety net that once seemed excessive suddenly looked inadequate. Meanwhile, countries with stronger social protections weathered the recession better than the United States.

Seeing Other Models

Travel and research broadened my perspective further. Nations like Germany, Denmark, France, and Sweden paired market economies with robust social programs—and consistently outperformed the U.S. on measures of health, social mobility, and life satisfaction.

These were not stagnant, overregulated societies. They were thriving capitalist democracies that simply made different choices about public investment and risk-sharing.

Writers like Joseph Stiglitz and Thomas Piketty documented how concentrated wealth undermines both democracy and long-term growth. Historical evidence showed that America’s most prosperous era—the post-World War II boom—coincided with high marginal tax rates, strong unions, and major public investment.

Healthcare Changed Everything

Healthcare ultimately crystallized my shift. The U.S. spends far more per capita than any other nation yet produces worse outcomes on many basic measures.

As a physician, I watched patients struggle with insurance denials, opaque pricing, and medical debt. Healthcare markets don’t function like normal markets. You can’t comparison shop during a heart attack. When insurers profit by denying care, the system aligns against patients. Medical bankruptcy is virtually unknown in countries with universal coverage—for a reason. We have a system where the major goal of health insurance companies is making a profit for their investors—not providing affordable healthcare to their subscribers. 

Climate and Collective Action

Climate change further exposed the limits of market fundamentalism. Individualism and laissez-faire policies have failed to account for shared environmental costs and long-term consequences. Markets alone cannot price long-term environmental harm or coordinate collective action at the necessary scale. Addressing climate risk requires regulation, public investment, and democratic planning.

What Social Democracy Is—and Isn’t

Social democracy is not the rejection of capitalism. It is regulated capitalism with guardrails—markets where they work well, public systems where markets fail. Healthcare, education, infrastructure, and basic income security perform better with strong public involvement.

This differs from democratic socialism, a distinction I’ve explored elsewhere. Social democracy embraces entrepreneurship and competition while preventing monopoly power, protecting workers, and taxing fairly to fund shared prosperity.

As sociologist Lane Kenworthy notes, the U.S. already has elements of social democracy—Social Security, Medicare, public education—but we simply underfund them compared to European nations.

A Pragmatic Conclusion

My evolution wasn’t ideological betrayal; it was pragmatic learning. I adjusted my beliefs based on outcomes, not slogans. Countries with strong social democracies routinely outperform the U.S. on health, mobility, education, and even business competitiveness.

True prosperity requires both entrepreneurial freedom and collective investment. The choice isn’t markets or government—it’s how to balance them intelligently. This lesson took me decades to learn, but the evidence now feels hard to ignore.

References

  1. Federal Reserve History – The Great Recession
    Overview of causes, systemic failures, and economic consequences of the 2007–2009 financial crisis.
    https://www.federalreservehistory.org/essays/great-recession
  2. OECD – Social Protection and Economic Resilience
    Comparative data on how countries with stronger social safety nets performed during economic downturns.
    https://www.oecd.org/economy
  3. World Happiness Report (United Nations / Oxford)
    Cross-national comparisons of well-being, social trust, and economic security.
    https://worldhappiness.report
  4. Joseph Stiglitz – Inequality and Economic Growth (IMF Finance & Development)
    Analysis of how income concentration undermines long-term economic performance and democracy.
    https://www.imf.org/en/Publications/fandd/issues/2019/09/inequality-and-economic-growth-stiglitz
  5. Thomas Piketty – Capital in the Twenty-First Century (Data Companion & Summaries)
    Historical evidence on wealth concentration and taxation in advanced economies.
    https://wid.world
  6. Tax Policy Center – Historical Top Marginal Income Tax Rates
    U.S. tax rate history showing high marginal rates during the post-war economic boom.
    https://www.taxpolicycenter.org/statistics/historical-highest-marginal-income-tax-rates
  7. The Commonwealth Fund – U.S. Health Care from a Global Perspective
    Comparative analysis of health spending, outcomes, and access across developed nations.
    https://www.commonwealthfund.org/publications/issue-briefs/2023/jan/us-health-care-global-perspective-2022
  8. OECD Health Statistics
    International comparisons of healthcare costs, outcomes, and system performance.
    https://www.oecd.org/health/health-data.htm
  9. IPCC Sixth Assessment Report – Synthesis Report
    Scientific consensus on climate change risks and the need for coordinated public action.
    https://www.ipcc.ch/report/ar6/syr
  10. Lane Kenworthy – Social Democratic Capitalism
    Comparative research on social democracy, public investment, and economic performance.
    https://lanekenworthy.net

Understanding Parkinson’s Disease: From Diagnosis to Daily Living

When most people think of Parkinson’s disease, they picture the characteristic tremor—that involuntary shaking that has become almost synonymous with the condition. But the reality is far more complex than just one visible symptom. Let’s dig into what’s actually happening in the brain, how doctors figure out what’s going on, and what living with this condition really looks like.

What Causes Parkinson’s Disease?

Here’s where things get frustrating for researchers: despite decades of study, scientists still don’t know exactly what causes the nerve cells in the brain to die. I’m going to apologize in advance because I’m going to be using a lot of “doctor talk”—no way around it. 

What we do know is that nerve cells (neurons) in the substantia nigra portion of the basal ganglia—an area of the brain controlling movement—become impaired or die, and these neurons normally produce dopamine, an important brain chemical. When these cells stop working properly, dopamine levels drop, and that’s when movement problems begin showing up.

But dopamine isn’t the whole story. People with Parkinson’s also lose nerve endings that produce norepinephrine, the main chemical messenger of the sympathetic nervous system, which helps explain why the disease affects so much more than just movement—things like blood pressure, digestion, and energy levels all take a hit.

Most Parkinson’s cases are idiopathic, meaning the cause is unknown, though contributing factors have been identified. Current thinking suggests a complicated mix of genetic and environmental factors. About 5% to 10% of cases begin before age 50, and these early-onset forms are often, though not always, inherited.

Some risk factors have emerged from research: age is the most significant, with about 1% of those over 65 and around 4.3% of those over 85 affected. Traumatic brain injury significantly increases risk, especially if recent, and repeated head injuries from contact sports can cause what’s called post-traumatic parkinsonism.  Muhammad Ali is a classic example of this.

Exposure to pesticides and industrial chemicals has also been identified as a risk factor. Interestingly, large epidemiologic studies consistently show that people who smoke have a lower risk of being diagnosed with Parkinson’s disease than never‑smokers, although smoking is still strongly discouraged because of its many other harmful effects. Large cohort studies in the U.S. and Europe generally find no direct association between alcohol consumption and Parkinson’s disease. A few observational studies show that moderate drinkers have slightly lower Parkinson’s rates. However, researchers believe this may be due to reverse causation (people in early or undiagnosed stages often reduce drinking because of GI or mood changes) and lifestyle confounders (moderate drinkers may differ in socioeconomic status, diet, or activity level). So the “protective” effect is considered speculative, not causal.

The Symptoms: More Than Just Shaking

The hallmark movement symptoms—what doctors call “motor symptoms”—are what usually bring people to the doctor. Slowed movement, called bradykinesia, is required for a Parkinson’s diagnosis. People often describe it as muscle weakness, though it’s really a problem of control, not strength. The classic tremor, stiffness, and balance problems round out the main movement issues. Patients frequently show reduced arm swing, shuffling gait, difficulty initiating movement or turning, masked facial expression, decreased blinking, and soft or monotone speech.

But here’s what often surprises people: many individuals later diagnosed with Parkinson’s notice that prior to experiencing stiffness and tremor, they had sleep problems, constipation, loss of smell, and restless legs. These “prodromal symptoms” can show up years before the movement problems become obvious. Other early signs include mood disorders like anxiety and depression.

The cognitive side deserves attention too. Some people experience changes in cognitive function, including problems with memory, attention, and the ability to plan and accomplish tasks. These changes are hard to pin down because they overlap with ordinary age-related memory problems, but a commonly cited figure is that about 20% of patients show cognitive impairment at the time of diagnosis. More contested is how many eventually develop Parkinson’s dementia, with estimates ranging from 20% all the way to 85%.

How Doctors Make the Diagnosis

Here’s something that might surprise you: there are currently no blood or laboratory tests to diagnose non-genetic cases of Parkinson’s. The standard diagnosis is clinical, meaning there’s no test that can give a conclusive result—certain physical symptoms need to be present.

Doctors typically diagnose Parkinson’s by taking a detailed medical history and performing a neurological examination. If symptoms improve after starting medication, that’s another indicator that the person has Parkinson’s.

There are some imaging tools available. The FDA approved an imaging scan called the DaTscan in 2011, which allows doctors to see detailed pictures of the brain’s dopamine system using a radioactive drug and a SPECT scanner. But this scan can’t definitively diagnose Parkinson’s, though it helps rule out conditions that mimic it. A hallmark of Parkinson’s is the buildup of misfolded alpha-synuclein proteins (Lewy bodies) inside neurons. Whether this is a cause, an effect, or both is still under study—this part of the science remains somewhat speculative.

Recently, researchers developed something more promising: the alpha-synuclein seeding amplification assay can detect abnormal alpha-synuclein in spinal fluid and may detect Parkinson’s in people who haven’t been diagnosed yet. The catch? It requires a spinal tap and isn’t widely available, though scientists are working on blood and saliva tests.

The early diagnostic challenge is real. Many disorders can cause similar symptoms, and people with Parkinson’s-like symptoms from other causes are sometimes said to have parkinsonism, which includes conditions like multiple system atrophy and Lewy body dementia that require different treatments.

What to Expect: The Prognosis

Let’s address the big question: how does Parkinson’s affect life expectancy? The news here is better than you might think. The average life expectancy of a person with Parkinson’s is generally the same as for someone without the disease.

More specifically, average life expectancy has increased by about 55% since 1967, rising to more than 14.5 years from diagnosis. Modern treatments have made a huge difference. Research indicates that those with Parkinson’s and normal cognitive function appear to have a largely normal life expectancy.

That said, timing matters. Research from 2020 suggests that people who receive a diagnosis before age 70 usually experience a greater reduction in life expectancy, and males with Parkinson’s may have a greater reduction in life expectancy than females.

The disease is progressive, meaning it gets worse over time, but symptoms and progression vary from person to person, and neither you nor your doctor can predict which symptoms you’ll get, when, or how severe they’ll be. The tremor-dominant type usually has a more favorable prognosis than the hypokinetic type.

What actually causes death in advanced Parkinson’s? Advanced symptoms can cause falls, pressure ulcers, swallowing difficulties and general frailty, all of which are linked to death. Aspiration pneumonia—when you inhale food or liquid into the lungs—is the leading cause of death for people with Parkinson’s.

Managing the Disease

Currently, there’s no cure for Parkinson’s, but medications or surgery can improve many of the movement symptoms.

The gold standard medication is levodopa (often combined with carbidopa as Sinemet). Healthcare providers use levodopa cautiously, commonly combining it with other medications to keep the body from processing it before it reaches the brain. This helps avoid side effects like nausea, vomiting, and low blood pressure when standing up. The tricky part? Over time, the way your body uses levodopa changes, and it can lose effectiveness.

Beyond levodopa, doctors use MAO-B inhibitors and dopamine agonists. As the disease progresses, these medications become less effective and may cause involuntary muscle movements. When drugs stop working well, there are surgical options to treat severe motor symptoms.

The main surgical treatment today is called deep brain stimulation (DBS). Widely considered the most important therapeutic advance since the development of levodopa, it has been FDA-approved since the late 1990s. A surgeon places thin metal wires called electrodes into one or both sides of the brain, in specific areas that control movement. A second procedure implants an impulse generator battery under the collarbone or in the abdomen. Similar to a heart pacemaker and about the size of a stopwatch, this device delivers electrical stimulation to those targeted brain areas.

A newer treatment now in use is focused ultrasound. Guided by MRI, high-intensity, inaudible sound waves are emitted into the brain, and where these waves converge, they create enough energy to destroy a very specific area connected to tremor. It’s considered non-invasive, and the FDA has approved it for Parkinson’s tremor that doesn’t respond to medications.

Don’t underestimate lifestyle interventions either. Physical therapy can improve balance and address muscle stiffness, and regular exercise improves strength, flexibility, and balance. Eating a balanced diet helps—drinking plenty of water and eating enough fiber reduces constipation, while omega-3 fats and magnesium may boost cognition and help with anxiety.

Parkinson’s disease sits at the intersection of aging, genetics, environment, and biology. Diagnosis is clinical, progression is gradual and variable, and treatment has become increasingly sophisticated. While it remains incurable, early diagnosis, personalized medication plans, targeted therapies like DBS, and consistent exercise allow many people to maintain meaningful independence for years.

The key message from specialists? Treatment makes a major difference in keeping symptoms from having worse effects, and adjustments to medications and dosages can hugely impact how Parkinson’s affects your life.

When Your World Goes Dark: A Simple Guide to Fainting

So you want to know about fainting—or as doctors call it, “syncope” (sink-oh-pee)? Let’s talk about it like we’re grabbing coffee, because this is something that happens to a lot of people and it’s worth understanding.

What’s Actually Happening When You Faint

Here’s the basics: fainting is when your brain temporarily doesn’t get enough blood flow, and it hits the “off” switch for a few seconds. Your body does this as a protective mechanism—when you’re horizontal on the ground, it’s easier for blood to reach your brain again. Not exactly elegant, but your body is doing its best.

Most of the time, you’ll get some warning signs before you go down. Your vision might get blurry or narrow like you’re looking through a tunnel. You might feel dizzy, sweaty, nauseous, or just generally weird and weak. Some people describe feeling really warm right before it happens. If you’re lucky enough to recognize these signs, you can sometimes sit or lie down before you actually lose consciousness.

When you do faint, it usually only lasts a few seconds to maybe a couple of minutes. You’ll collapse, your muscles will relax, and you’ll be out. Sometimes your body might jerk a little bit—not like a full seizure, just brief movements because your brain is momentarily starved for oxygen. Then you wake up, and within moments you’re usually back to normal, though you might feel tired or a bit confused for a short while.

Why This Happens: The Age Factor

The interesting thing is that why people faint changes a lot depending on how old they are.

If you’re younger, the most common culprit is what’s called vasovagal syncope: your nervous system overreacts to something and suddenly drops your heart rate and blood pressure. This can happen when you’re stressed, in pain, standing for too long, or even just dehydrated. Ever heard someone say they “can’t stand the sight of blood” or they got woozy at a concert? That’s usually vasovagal syncope. Standing up too fast is another big one—you’ve probably experienced that head rush where everything goes spotty for a second. Sometimes specific situations trigger it: coughing really hard, swallowing, even urinating or exercising intensely can mess with your blood pressure just enough to cause problems.

There are also some rarer causes in young people, like inherited heart rhythm problems—conditions with names like long QT syndrome or Wolff-Parkinson-White syndrome. These are less common but more serious.

For older adults, the picture changes. The autonomic nervous system—your body’s autopilot for things like blood pressure—doesn’t work quite as smoothly as you age. Add in multiple medications (especially blood pressure meds and diuretics) and some chronic dehydration (common as people get older), and you’ve got a recipe for more frequent dizzy spells when standing up. Some older folks develop something called carotid sinus hypersensitivity, where even turning their head or wearing a tight collar can trigger a drop in heart rate or blood pressure.

Heart-related causes become much more common with age too. Irregular heartbeats like atrial fibrillation, problems with the heart’s electrical system, or structural issues like a stiff aortic valve or weakened heart muscle can all lead to fainting. And let’s not forget medications—beta-blockers, vasodilators, and certain antidepressants—can all lower blood pressure enough to cause problems.

When Should You Worry?

Here’s where we need to get serious for a second. Most fainting episodes aren’t dangerous, but some are red flags that need immediate attention.

Get emergency help if fainting comes with chest pain, a racing or pounding heartbeat, or trouble breathing—these could mean something’s wrong with your heart. Also, if there are any neurological symptoms like sudden confusion, trouble speaking, weakness on one side of your body, or difficulty understanding people, then you need to rule out things like stroke or seizure right away.

Even without those scary symptoms, if you’re fainting repeatedly or can’t figure out why it’s happening, you should definitely see a doctor. Recurrent fainting can point to underlying issues that are worth catching early—both for safety (falling and hitting your head is no joke) and for quality of life.

How Doctors Figure It Out

When you go to see a doctor about fainting, they’re playing detective. They’ll want to know everything: What were you doing when it happened? What did you feel beforehand? Did anyone see you faint—and if so, what did they observe? How did you feel afterward? They’ll also ask about your family history (especially sudden cardiac deaths) and what medications you’re taking.

The physical exam usually includes checking your blood pressure and heart rate while you’re lying down and then again when you stand up—this can reveal orthostatic hypotension (that fancy term for your blood pressure dropping when you stand). They’ll listen to your heart, check your neurological function, and look for any obvious problems.

Almost everyone gets an electrocardiogram (EKG)—that test where they stick electrodes on your chest to measure your heart’s electrical activity. Depending on what they find, you might get blood work to check for things like anemia, blood sugar problems, or electrolyte imbalances. An ultrasound of your heart (echocardiogram) might be ordered if they suspect structural heart disease.

If you keep fainting or if there’s concern about your heart, they might want continuous monitoring. This could be anything from wearing a Holter monitor for 24 hours to having a tiny device implanted under your skin that can record your heart rhythm for weeks or even longer. There’s also something called a tilt table test, where they literally tilt you upward on a table to see if it triggers fainting—sounds medieval but it’s useful for diagnosing vasovagal syncope.

Living With It: What You Can Do

The good news is that for most types of fainting, there’s a lot you can do to prevent it from happening again.

If you have the common vasovagal type, learning to recognize those warning signs is huge. Once you feel them coming on, you can do what’s called “counter-pressure maneuvers”—crossing your legs and tensing them, squeezing your hands together really hard, or tensing your arm muscles. These actions help keep your blood pressure up and can stop you from fainting.

Lifestyle changes make a real difference too. Stay hydrated—seriously, drink more water than you think you need. Avoid your known triggers if you can identify them. When you’ve been sitting or lying down, stand up slowly in stages rather than popping right up. Some people benefit from compression stockings (yeah, they’re not glamorous, but they work). Your doctor might even tell you to eat more salt, which is probably the only time a healthcare provider will ever tell you to do that.

For orthostatic hypotension, the management is similar—hydrate, rise slowly, maybe do some calf muscle exercises. Your doctor will also review your medications to see if anything can be adjusted or eliminated.

If your fainting is related to a heart problem, treatment gets more specific and serious. This could mean medications to control heart rhythm, procedures to fix abnormal electrical pathways in your heart, or even implanting a pacemaker or defibrillator. The treatment depends entirely on what specific problem you have.

No matter what’s causing your fainting, regular follow-up with your doctor is important. They need to see if treatments are working, adjust things if necessary, and catch any new issues early.

The Bottom Line

Fainting is super common, but it’s also something you shouldn’t try to diagnose yourself. While most episodes are harmless vasovagal responses to stress or dehydration, some can signal serious heart problems or other conditions that need treatment. If you’re frequently fainting, talk to a doctor—especially if it happens during exercise, or if it comes with other concerning symptoms.

With the right evaluation and management, most people who deal with syncope can get their episodes under control and get back to a normal life. It might take some trial and error to figure out what works for you, but the effort is worth it for both your safety and peace of mind.

For any medical condition always consult with your physician to verify specific treatment recommendations, as individual circumstances can vary significantly. This article is for information and isn’t a substitute for medical advice from your own doctor.

250 Years Strong!

The Eagle, Globe, and Anchor

How the Marine Corps Found Its Symbol

Few military emblems carry as much history and pride as the Marine Corps’ Eagle, Globe, and Anchor, better known as the EGA or simply as the emblem. New recruits and officer candidates work intensely to earn the right to wear this symbol. It is a source of immense pride for every Marine who achieves that distinction.

When I entered the Corps, I encountered World War II veterans who affectionately called the EGA the “Birdie on the Ball.” But only Marines can take such liberties—outsiders who use the term risk causing offense.

The emblem is instantly recognizable, yet few realize its deep historical roots or appreciate the transformations it has undergone to become the symbol every Marine wears today.

From Anchors to Eagles: The Early Years (1776–1868)

At its inception in 1776, the Continental Marines lacked any formal insignia. Some Marines, predominantly officers, adopted maritime icons such as the fouled anchor—an anchor entwined with rope—often emblazoned on buttons or hat plates. This design echoed the British Royal Navy and underscored their naval identity, but it was never standardized.

Uniform innovations began in the early 1800s. By 1804, Marines were using brass eagles mounted on square plates. During the War of 1812, octagonal plates appeared, embossed with eagles, anchors, drums, shields, and flags. Later designs were simplified to feature metal letters “U.S.M.” (United States Marines), reflecting the shift towards a national identity.

The example below is an officer’s coat button circa 1805-1820.

A more distinctive step came in 1821: the Corps adopted an eagle perched on a fouled anchor encircled by 13 stars, a motif featured on buttons for nearly four decades. However, similar symbols were also used by the Army and Navy, making it less than unique.

Following the Civil War, Marine Corps leadership under Brigadier General Jacob Zeilin, the seventh Commandant, sought a truly unique insignia for the service.

The Zeilin Board and the Birth of the Modern EGA (1868)

On November 12, 1868, Zeilin established a board of officers “To decide and report upon the various devices of cap ornaments of the Marine Corps.” They wasted no time: by November 19, the Secretary of the Navy, Gideon Welles, had approved the new emblem.

The board drew inspiration from the British Royal Marines’ “Globe and Laurel” emblem.

The American version added a few important touches:

  • Globe showing the Western Hemisphere: Representing the Corps’ defense of the Americas and a global presence.
  • Fouled anchor: Honoring the Corps’ naval origins.
  • Eagle: Symbolizing national service and pride.

Zeilin described the new emblem as representing the Corps’ “readiness to serve anywhere, by sea or land.”

At the same time, a distinct emblem was also created for Marine Corps musicians, still seen today on the formal red and gold uniforms of the U.S. Marine Band—“The President’s Own”.

The Motto and Later Refinements

The Latin motto, Semper Fidelis (“Always Faithful”), was introduced in 1883 under Commandant Charles McCawley, replacing previous mottoes such as Fortitudine (“With Fortitude”). Semper Fidelis became central to the Marine Corps’ ethos.

The emblem saw many variations over the decades. Initial designs featured a crested eagle—borrowed from European heraldry. Semper Fidelis appeared on a scroll held in the eagle’s beak on some versions of the emblem.

Only in 1954, with President Dwight D. Eisenhower’s Executive Order 10538, did the American bald eagle with a scroll officially become part of the emblem. This finalized the design used today.

Officer and Enlisted Differences

Since 1868, design distinctions have marked officer and enlisted EGA emblems. Officers’ original emblems were elaborate—frosted silver hemispheres with gold-plated Americas, crowned by a solid silver eagle. Enlisted emblems were brass, emphasizing practicality.

Modern officers wear a multi-piece, high-relief insignia with fine rope detailing, while enlisted Marines use a one-piece emblem. Notably, officers’ globes omit Cuba to strengthen the emblem structurally.  A running joke among enlisted personnel is that officers couldn’t find Cuba on a map.

Before WWII, officers often purchased insignia from jewelers like Bailey, Banks & Biddle, resulting in stylistic inconsistency. One museum curator quipped, “World War I eagles looked like fat turkeys.” Eventually, standardization brought the crisp, clean look seen today.

A Legacy That Endures

From 18th-century anchors to the refined Eagle, Globe, and Anchor of today, the emblem tracks the Corps’ evolution from shipboard security to a global expeditionary force. Over centuries, its form has varied—engraved by jewelers, stamped for wartime, and cast in silver for dress blues—but its meaning remains constant.

Every Marine who earns the EGA joins a tradition stretching back 250 years, defined by courage, loyalty, and the enduring promise to remain Always Faithful.

Tech Savvy Seniors, Part 1: Leveraging Technology to Improve Health in Older Adults

Introduction

Advances in technology have created significant opportunities to improve healthcare in general and for senior citizens in particular. Digital health technologies, including telehealth, smartphone applications, and wearable devices, have become increasingly prevalent, particularly since the COVID-19 pandemic. These technologies offer older adults opportunities to overcome barriers to healthcare access and enhance their ability to manage health conditions independently. In this article we will present a general overview of healthcare technology as it applies to senior citizens, along with a brief look at a few of the apps available. In Part 2 we’ll look at specific wearable devices, including smartphones and smartwatches, as well as dedicated health-monitoring equipment.

Digital Health Adoption and Benefits

Many older adults are adopting digital health technologies to maintain communication with healthcare providers and to manage their health conditions. Telehealth, for instance, has become a vital tool, allowing older adults to consult with healthcare professionals remotely, thus reducing the need for travel and exposure to potential health risks. Additionally, smartphone apps and wearable devices enable continuous monitoring of vital signs and provide reminders for medication, contributing to better disease management.

Too Old to Use?

Despite the benefits, ageism remains a barrier to the widespread adoption of digital health technologies for some older adults. Many healthcare professionals hold outdated beliefs that older adults are unable or unwilling to use these technologies, ignoring the fact that many of their patients are part of the generation that pioneered the digital revolution. This has, on occasion, led to their exclusion from health services and clinical trials that utilize digital health, creating a “digital health divide”. Overcoming these biases is crucial to ensuring that older adults can fully benefit from technological advancements in healthcare.

Enhancing Memory and Socialization

Regular use of the internet and digital platforms can improve cognitive functioning and memory skills, potentially reducing the risk of dementia. Engaging in online activities such as learning a new language, learning new technological skills, or even online puzzles can keep the brain active and sharp.  Also, technology can help mitigate social isolation—a common issue among older adults—facilitating communication with family and friends and enabling participation in online communities and interest groups.

Promoting Independence and Accessibility

Technology has significantly enhanced the independence of older adults, particularly those with mobility or vision challenges. Online shopping and ride-sharing apps allow older adults to manage daily tasks without relying on others. Voice-activated technologies and personal monitoring devices provide additional support, ensuring safety and independence at home.

Challenges and Future Directions

Many older adults lack user-friendly devices, and many areas of the country still lack reliable broadband Internet access.

While many seniors have experience with technology, there are many others who lack sufficient familiarity to utilize it successfully. Older adults often have lower levels of self-confidence or knowledge related to using digital health tools. This can be exacerbated by physical and mental deficits, such as poor vision, hearing loss, and cognitive impairments, which make using digital tools challenging.

Some older adults may not perceive digital health technologies as useful or trustworthy. Concerns about privacy and security, as well as a lack of information about the benefits of e-health, can deter engagement.

Barriers are more pronounced among older adults from socioeconomically disadvantaged groups. These groups often face additional challenges in accessing and using digital health technologies due to cost or regional availability. Many have significant trust issues that inhibit their use of new methods.

Addressing these barriers requires targeted efforts to improve digital literacy, provide accessible and affordable technology, challenge ageist perceptions within the healthcare system, and build trust.

Useful Apps

There are a growing number of apps designed to help older adults manage their healthcare more effectively. Here is a small sample of common apps that can be particularly useful:

MediSafe: designed for medication management, allowing users to set up medication schedules and receive reminders. It also provides warnings about potential drug interactions and allows family members to monitor medication adherence.

GoodRx: helps users compare drug prices at different pharmacies and provides coupons to help reduce prescription costs, making it easier to manage expenses related to chronic conditions.

Abridge: records conversations during doctor’s appointments, highlights medical terms, and provides definitions, helping users better understand and recall medical advice.

Pill Monitor: helps users schedule medication reminders and keep track of their medication intake, which can be shared with healthcare providers.

ShopWell: assists with dietary management by helping users create nutritious shopping lists tailored to their health needs, promoting healthy eating habits.

MyChart: provides access to personal health records and allows users to view test results, schedule appointments, and communicate with healthcare providers.

SilverSneakers GO: promotes physical fitness by providing workout programs tailored for older adults, managing class schedules, and tracking progress.

These are just a few of the many apps designed to be user-friendly and cater to the specific needs of seniors, helping them maintain their health and independence.

Conclusion

The adoption of digital health technologies by older adults holds great promise for improving healthcare outcomes, reducing costs, and enhancing quality of life. By addressing ageism and ensuring accessibility, we can bridge the digital health divide and support older adults in achieving healthier, more independent lives. As technology continues to evolve, it will play an increasingly vital role in geriatric care and the promotion of healthy aging. In Part 2 we will get into greater detail about what’s available, what works, and what’s hype.

Why History Still Matters: Lessons for Today’s World

Recently I was reading an article about the poor state of historical knowledge in the United States, and I decided to repost my first article from when I started blogging almost five years ago.  It seems very little has changed.

“Study the past if you would define the future.” —Confucius. I particularly like this quotation. It is similar to the more modern version: those who don’t learn from the past are doomed to repeat it. However, I much prefer the former because it reads more as advice or instruction, while the latter sounds like a dire warning. Though I suspect, given the current state of the world, a dire warning is in order.

But regardless of whether it comes in the form of advice or warning, people today do not seem to grasp the importance of studying the past. The knowledge of history in our country is woeful. The lack of emphasis on the teaching of history in general, and American history in particular, is shameful. While it is tempting to blame this on a lack of interest on the part of the younger generation, I find people my own age also have very little appreciation of the events that shaped our nation, the world, and their lives. Without this understanding, how can we evaluate what is currently happening and understand what we must do to come together as a nation and as a world?

I have always found history to be a fascinating subject. Biographies and nonfiction historical books remain among my favorite reading. In college I always added one or two history courses every semester to raise my grade point average. Even in college I found it strange that many of my friends hated history courses and took only the minimum. At the time, I didn’t realize just how serious this lack of historical perspective was to become.

Several years ago, I became aware of just how little historical knowledge most people possess. At the time, Jay Leno was still doing his late-night show, which included a segment called Jaywalking. During that segment he would ask people on the street somewhat obscure questions to which he could expect unusual and generally humorous answers. On one show, on the 4th of July, he asked, “From what country did the United States declare independence on the 4th of July?” and of course no one knew the answer.

My first reaction was that he must have gone through dozens of people to find the four or five who did not know the answer. The next day at work, the 5th of July, I asked several people, all of whom were college graduates, the same question. I did not get a single correct answer, though one person at least realized, “I think I should know this.” When I told my wife, a retired teacher, she wasn’t surprised. For a long time she had been concerned about the lack of emphasis on social studies and the arts in school curriculums. I too was now becoming seriously concerned about the direction of education in our country.

A lot of people are probably thinking, “So what? Who really cares what a bunch of dead people did 250 years ago?” But if we don’t know what they did and why they did it, how can we understand its relevance today? We have no way to judge which actions will support the best interests of society and which will ultimately be detrimental.

Failure to learn from and understand the past results in a me-centric view of everything. If you fail to understand how and why things have developed, then you certainly cannot understand what the best course forward will be. Attempting to judge all people and events of the past through your own personal prejudices leads only to continued and worsening conflict.

If you study the past, you will see that there has never been general agreement on anything. There were many disagreements, debates, and even a civil war over differences of opinion. Studying the past helps us understand that there are no perfect people who always do everything the right way at the right time. It helps us appreciate the good that people do while understanding the human weaknesses behind the things we consider faults today. In other words, we cannot expect anyone to be 100% perfect. A person may have accomplished many good and meaningful things, and those accomplishments should not be discarded because the person was also a human being with human flaws.

Understanding the past does not mean approving of everything that occurred, but neither does it mean condemning everything that does not fit twenty-first-century mores. Only by recognizing this, and by seeing what led to the disasters of the past, can we hope to avoid repeating the worst aspects of our history. History teaches lessons in compromise, involvement, and understanding. Failure to recognize that leads to strident argument and an unwillingness to cooperate with those who may differ in even the slightest way. Rather than creating the hoped-for perfect society, it simply leads to a new set of problems and a new group of grievances.

In sum, failure to study history is failure to prepare for the future. We owe it to ourselves and future generations to understand where we came from and how we can best prepare our country and the world we leave for them. They deserve nothing less than a full understanding of the past and a rational way forward. 

I’m going to close with a quote I recently came across:

  “Indifference to history isn’t just ignorant, it’s rude.”

                        —David McCullough

