Grumpy opinions about everything.

Category: History

Grumpy opinions about American history

Benjamin Franklin and Slavery: A Complicated Legacy


 
Few figures in American history are as celebrated — or as contradictory — as Benjamin Franklin. Founding Father, inventor, diplomat, and philosopher, Franklin is remembered for just about everything except the uncomfortable truth that he was also, for much of his life, a slave owner. His relationship with slavery is a study in the slow, painful moral evolution of a brilliant but flawed man — one who spent decades benefiting from the institution he would spend his final years fighting to abolish.
The Slaveowner
Franklin owned enslaved people from around 1735 until at least 1785, when he freed two enslaved people after his return from France. Over the course of his life, as many as seven named enslaved people lived in the Franklin household, including Peter, his wife Jemima, their son Othello, and George, John, and King.
Franklin’s complicity in slavery extended beyond personal ownership. As editor of the Pennsylvania Gazette, Franklin benefited financially from advertisements for runaway slaves and slave auctions that were paid for by slave owners and traders.  He also used his printing press to publish content that supported the slave trade and, as a British colonial agent, sought to have the British government accept Georgia’s slave code.  In short, slavery wasn’t just a private matter for Franklin — it was woven into his professional and financial life.  At the same time, he printed Quaker antislavery tracts, a sign that his professional role placed him at the intersection of both pro‑slavery commerce and early antislavery movements.
What little we know about how Franklin treated his enslaved people comes mostly from letters and financial records.  In part this is because northern slaveholders kept fewer detailed records of slave families, births, and deaths than large southern planters. His enslaved servants lived within his household and were integrated into domestic routines, a common arrangement in urban slavery that still left them legally and socially unfree.
When Franklin traveled to London in 1757, he brought two enslaved men, Peter and King, who lived and worked at 36 Craven Street. Peter remained with Franklin until their departure in 1762, but King ran away sometime in 1758 and was later found living in Suffolk, having been taken in by a Christian woman who taught him to read and write.  The fact that King fled at the first opportunity tells its own story about the nature of slavery, whatever Franklin’s personal demeanor may have been.
His Evolving Written Views
Franklin’s early writings on slavery were at best ambivalent and at worst openly racist. In his 1751 essay “Observations Concerning the Increase of Mankind,” Franklin argued that slave labor wasn’t economically efficient, in part because enslaved people pilfered from their owners, writing that “almost every Slave” was “by Nature a Thief.” His concern about slavery in this period was largely economic rather than moral — he worried that it hurt poor white laborers and enriched a wealthy elite, not that it was a profound violation of human dignity.
By the 1760s, something began to shift, prompted in part by a 1759 visit, arranged by his friend Samuel Johnson, to one of Dr. Bray’s schools for Black children. He also met Anthony Benezet, who had started a school in Philadelphia and would later co-found the Abolition Society. By 1763, Franklin wrote that African “shortcomings” were not inherent but came from lack of education, slavery, and negative environments — and that he saw no difference in learning ability between African and white children.
While in London in the 1760s, he supported Black education projects, and in 1770 he anonymously published “Conversations between an Englishman, a Scotchman, and an American,” a piece that criticized both the slave trade and the broader institution. In 1782 he circulated “A Thought Concerning the Sugar Islands,” condemning the African wars that fed the trade, the horrors of the Middle Passage, and the “numbers that die under the severities of slavery,” arguing that even sugar was morally tainted by blood.
By the late 1780s, Franklin’s language had become openly abolitionist. In 1787 he signed a public antislavery appeal declaring that the Creator had made “of one flesh, all the children of men,” and in 1789–1790 he wrote essays insisting that slavery was an “atrocious debasement of human nature.” He also argued that formerly enslaved people needed education, moral instruction, and employment to make the transition from bondage to full participation in civil society.
This was a meaningful intellectual leap for the era. Franklin was moving from a view of enslaved people as economic units toward recognizing their common humanity and the role that oppression itself played in creating the inequalities he had previously attributed to nature.
Franklin, the Constitutional Convention, and the Three-Fifths Compromise
By the time of the 1787 Constitutional Convention in Philadelphia, Franklin, then 81 years old, was a delegate from Pennsylvania. The Three-Fifths Compromise — which counted enslaved people as three-fifths of a person for purposes of congressional representation and taxation — was one of the most contentious issues at the Convention. The compromise was formally proposed by delegate James Wilson and seconded by Charles Pinckney.
Franklin’s documented role in the Three-Fifths Compromise itself was limited. His more direct contribution to the Convention’s structural debates was to the Great Compromise over proportional representation and spending, rather than to the question of how enslaved people would be counted.
Notably, just weeks before the Convention began, Franklin signed a public antislavery appeal stating that “the Creator of the world” had made “of one flesh, all the children of men.”  Yet he ultimately signed a Constitution that embedded protections for slavery, including the Three-Fifths Compromise and a provision preventing Congress from banning the slave trade until 1808. Franklin’s acquiescence reflected his broader pragmatic calculation, shared by many Northern founders, that preserving the Union required compromise with the slaveholding South, even at a terrible moral cost. This is partly speculative — Franklin left few direct written statements about his reasoning on this specific tradeoff at the Convention.  
The Abolitionist
Whatever compromises Franklin made at Philadelphia, the years that followed saw him embrace abolitionism with increasing conviction and urgency. In 1787, he began serving as President of the Pennsylvania Society for Promoting the Abolition of Slavery  — the oldest abolitionist organization in the country — which had originally formed in 1775 and was reorganized and incorporated by Pennsylvania in 1789.
In 1789, Franklin wrote and published several essays supporting abolition, including a public address dated November 9th of that year in which he called slavery an “atrocious debasement of human nature.”  He called for practical support for emancipated people, including education and employment — ideas that were radical for the time and would remain largely unaddressed for generations.
His final public act was perhaps his most consequential. On February 3, 1790, Franklin signed a petition to the first Congress on behalf of the Abolition Society, asking lawmakers to “devise means for removing the Inconsistency from the Character of the American People” and to “promote mercy and justice toward this distressed Race.”  The petition was immediately denounced by pro-slavery congressmen and referred to a committee, which ultimately concluded that the Constitution prevented Congress from acting on the matter until 1808.
Franklin died in April 1790, just weeks after these debates, leaving a legacy that combined early complicity in slavery with later, forceful advocacy for abolition and Black education. As part of his will, he directed that all remaining enslaved people in his household be freed upon his death, although it is unclear whether he still owned anyone at the time, and the provision may have been a symbolic declaration that he hoped others would follow. His life illustrates both the pervasiveness of slavery in colonial America — even among its most famous reformers — and the possibility, however belated, of moral and political transformation on the issue.
What to Make of It All
Franklin’s association with slavery resists easy conclusions. He spent roughly four and a half decades owning enslaved people, profiting from the slave trade through his newspaper, and diplomatically defending slavery when it served colonial interests. His evolution toward abolitionism was real, but it was also late — and driven partly by visits to schools for Black children and Quaker friendships rather than a spontaneous moral awakening.
At the same time, his final years represent one of the most prominent Founding Fathers publicly and passionately challenging the institution while other contemporaries remained silent or actively defended it. As historian David Waldstreicher has cautioned, Franklin’s antislavery credentials have sometimes been “remembered backwards” and exaggerated  — but that doesn’t mean the later evolution wasn’t genuine.
What Franklin’s story offers isn’t redemption so much as a realistic portrait of moral growth under the weight of self-interest, social norms, and political pragmatism. He was, as one observer put it, a man who showed himself to be “thoughtful, open, teachable” — eventually. The tragedy is how long it took, how few followed his lead, and how much damage was done in the meantime.
 
Illustration generated by author using ChatGPT.

Sources:
Benjamin Franklin House – Franklin and Slavery
https://benjaminfranklinhouse.org/education/benjamin-franklin-and-slavery/

Benjamin Franklin House – The Philadelphia Household 1735–1790
https://benjaminfranklinhouse.org/franklin-and-slavery-the-philadelphia-household-1735-1790/

Online Library of Liberty – Benjamin Franklin and Slavery, Part One
https://oll.libertyfund.org/publications/reading-room/2023-07-05-ealy-franklin-slavery-part-one

Benjamin Franklin Historical Society – Slavery and the Abolition Society
http://www.benjamin-franklin-history.org/slavery-abolition-society/

National Archives – Benjamin Franklin’s Anti-Slavery Petitions to Congress
https://www.archives.gov/legislative/features/franklin

Penn & Slavery Project – Benjamin Franklin
https://pennandslaveryproject.archives.upenn.edu/2025/07/09/benjamin-franklin/

Commonplace: The Journal of Early American Life – Benjamin Franklin, Slavery, and the Founders
https://commonplace.online/article/benjamin-franklin-slavery/

U.S. History – Ben Franklin and the Vexing Question of Race in America
https://www.ushistory.org/franklin/essays/franklin_race.htm

Wikipedia – Benjamin Franklin
https://en.wikipedia.org/wiki/Benjamin_Franklin

Wikipedia – Three-Fifths Compromise
https://en.wikipedia.org/wiki/Three-fifths_Compromise

Britannica – Three-Fifths Compromise
https://www.britannica.com/topic/three-fifths-compromise

U.S. Senate – Equal State Representation and the Great Compromise
https://www.senate.gov/about/origins-foundations/senate-and-constitution/equal-state-representation.htm

Wikipedia – Connecticut Compromise
https://en.wikipedia.org/wiki/Connecticut_Compromise

Teaching American History – The Constitutional Convention: The Three-Fifths Clause
https://teachingamericanhistory.org/document/the-constitutional-convention-the-three-fifths-clause/

From Minstrel Stage to State Law: The History of Jim Crow

Source: Library of Congress via Wikimedia Commons, public domain

Few phrases in American history carry as much weight as Jim Crow. It is shorthand for a century of systematic racial oppression — separate schools, separate water fountains, disenfranchisement and even terror. But the term itself has an origin that is both surprising and telling: it began not in a courthouse or a legislature, but on a minstrel stage in the 1820s, as the punchline of a song-and-dance act performed by a white man in blackface.

Jump Jim Crow: The Birth of a Slur

The story starts with Thomas Dartmouth “Daddy” Rice, a struggling white actor from the Lower East Side of Manhattan who would go on to become one of the most famous entertainers of his era. Around 1828, Rice developed a stage character he called Jim Crow — a singing, dancing, shuffling caricature of a Black man, performed in blackface makeup made from burnt cork and wearing ragged clothes. The precise inspiration for the character is disputed and has, as historians put it, been lost to legend. Rice claimed to have based it on a disabled enslaved stableman he observed singing and dancing to amuse himself, though the time, place, and truth of this claim have never been verified.

What is not disputed is the impact. Rice’s “Jump Jim Crow” routine was a sensation. By 1832 he had turned the character into his signature act, touring from Louisville to Cincinnati to Philadelphia to New York, playing to sold-out houses. He eventually took the show to London and Dublin, where audiences were equally enthralled. Rice became known as “Jim Crow Rice,” the self-proclaimed “Father of American Minstrelsy.”

The character Rice created was deliberately degrading — a dim-witted, lazy, buffoonish caricature built to confirm the worst racial stereotypes white audiences held about Black Americans and to reinforce a racial hierarchy that portrayed Black people as naturally suited to servitude and exclusion from the rights of citizenship. He and his imitators spread these images across the country. By 1838, the term “Jim Crow” had crossed from the stage into everyday speech as a collective racial slur for Black people. The minstrel show had given American racism a catchy name.

It is worth pausing on the cultural mechanics of what minstrelsy did. It was not just entertainment — it was propaganda. Rice and his many imitators, including the wildly popular Christy Minstrels (for whom Stephen Foster wrote songs), systematically presented Black Americans as subhuman, fundamentally inferior, and unworthy of social participation. These depictions reinforced racial prejudice among white audiences and softened their moral resistance to segregation and violence long before those policies were ever written into law.

Reconstruction and Its Violent Undoing

The Civil War ended in 1865 with the abolition of slavery, and the Reconstruction era that followed was a genuine, if incomplete, experiment in racial equality. The 13th, 14th, and 15th Amendments abolished slavery, granted citizenship, and extended voting rights to Black men. For a brief window, Black Americans served in state legislatures and held federal office across the South.

That window slammed shut in 1877, when President Rutherford B. Hayes withdrew the last federal troops from the South as part of a political compromise. Southern white Democrats — calling themselves “Redeemers” — systematically retook their state legislatures using violence, voter fraud, and paramilitary terror groups like the Ku Klux Klan, the White League, and the Red Shirts. Republican officeholders were run out of town or worse. Black voters were lynched as a warning to others.

Once back in power, Southern legislatures began passing the laws that would become known as Jim Crow — a name borrowed directly from Rice’s minstrel slur. The term had traveled from the stage to common speech and then into the legal codebook, and it carried the same racist assumptions embodied in the minstrel character: that Black Americans were inherently inferior and should occupy a subordinate social and legal position.

Beginning with Florida in 1887, states required railroads to maintain separate cars for Black and white passengers. Similar laws quickly spread segregation to schools, restaurants, hospitals, theaters, cemeteries, parks, waiting rooms, and virtually every other point of public contact between the races. Many facilities were marked with signs reading “White” and “Colored” to emphasize the exclusion of Black citizens.  The laws varied by state but shared a common goal—racial separation and political exclusion.

Plessy v. Ferguson: The Supreme Court Blesses Segregation

In 1892, a group of Black and mixed-race citizens in New Orleans organized a deliberate legal challenge to Louisiana’s Separate Car Act. They recruited Homer Plessy — a man who was seven-eighths white but legally classified as Black under Louisiana law — to board a whites-only train car and refuse to move. He was arrested on cue, and his case wound its way through the court system to the United States Supreme Court.

In Plessy v. Ferguson (1896), the Court ruled 7-1 that racial segregation was constitutional as long as the separate facilities were nominally equal. The lone dissenter, Justice John Harlan, wrote that the Constitution should be color-blind and warned prophetically that the decision would prove as pernicious as the Dred Scott ruling. He was right. The “separate but equal” doctrine became the legal foundation for a total apartheid system across the South — and, in many respects, well beyond it.

In practice, of course, the facilities were never equal. Black schools were chronically underfunded, with inferior buildings, used textbooks, and far lower teacher salaries. Black hospitals, where they existed at all, were understaffed and under-equipped. “Separate but equal” was a legal fiction that everyone understood as a polite cover for second-class citizenship.

Between 1890 and 1910, ten of the eleven former Confederate states rewrote constitutions or passed amendments to systematically disenfranchise Black voters through poll taxes, literacy tests administered by white registrars, grandfather clauses, and outright violent intimidation. African Americans who had briefly held political power during Reconstruction effectively disappeared from public life.

The Long Arm of Jim Crow: Life Under the System

For the better part of a century, Jim Crow was not merely a set of legal rules — it was a total social order enforced by custom, economic pressure, and the ever-present threat of violence. A Black man who looked a white man in the eye too long, or failed to step off the sidewalk, or was accused of anything at all, could find himself facing a lynch mob. Between Reconstruction and World War II, thousands of Black Americans — men, women and even children — were lynched, often publicly and festively, with onlookers smiling while posing for pictures with the victim. Law enforcement often did nothing and at times even actively participated.

The Great Migration — the mass movement of Black Americans from the South to Northern and Midwestern cities beginning around 1910 — was driven in large part by Jim Crow. Millions fled to Chicago, Detroit, New York, and Los Angeles in search of something approaching equal treatment. What they often found instead was a northern variant: informal segregation through redlining (systematic denial of mortgages in Black neighborhoods), discriminatory hiring, and residential segregation enforced by real estate covenants and social pressure.

The Legal Dismantling: Brown, the Civil Rights Act, and the Voting Rights Act

The legal edifice of Jim Crow began to crack in 1954, when the Supreme Court issued its unanimous decision in Brown v. Board of Education. Directly overturning Plessy, the Court held that segregated schools were inherently unequal and unconstitutional. The ruling did not end segregation — Southern states resisted fiercely, and desegregation proceeded at a glacial pace well into the 1970s — but Brown pulled the legal rug out from under “separate but equal.”

The civil rights movement of the late 1950s and early 1960s — marked by sit-ins, freedom rides, the Birmingham campaign, the March on Washington, and countless acts of individual courage in the face of brutal repression — built the political pressure that forced legislative action. The Civil Rights Act of 1964 prohibited discrimination in public accommodations and employment. The Voting Rights Act of 1965 dismantled the machinery of Black disenfranchisement in the South, sending federal examiners into states with histories of discrimination to ensure that Black citizens could actually register and vote. By 1965 the majority of the formal Jim Crow laws had been overturned.

The Vestiges: Jim Crow’s Long Shadow

Ending Jim Crow legally is not the same as ending its consequences — or, as some argue, its functional successors. Legal scholar Michelle Alexander, in her landmark 2010 book The New Jim Crow, argues that mass incarceration has emerged as a new system of racialized social control. Black Americans are incarcerated at roughly five times the rate of white Americans. A felony conviction — often stemming from drug offenses prosecuted far more aggressively in Black communities despite similar rates of drug use across races — carries a cascade of legal disabilities: loss of voting rights in most states, exclusion from public housing and many jobs, and ineligibility for certain federal benefits. Nearly 30% of Black men carry felony records.

The racial wealth gap stands as perhaps the most stubborn vestige of a century of legal exclusion. The median Black family holds roughly one-tenth the wealth of the median white family — a gap driven not by individual choices but by generations of legally enforced exclusion from homeownership, inheritance, and wealth-building opportunities. Redlining — the systematic denial of mortgages in Black neighborhoods by federally-backed banks — was official federal policy from the 1930s through the 1960s and left neighborhood segregation patterns that persist visibly today.

Voting rights remain contested territory. The Supreme Court’s 2013 ruling in Shelby County v. Holder gutted the preclearance requirement of the Voting Rights Act, which had required states with histories of discrimination to get federal approval before changing election laws. Since then, a wave of new restrictions — voter ID laws, aggressive voter roll purges, reduced polling locations, and expanded felony disenfranchisement — has fallen disproportionately on Black, Latino, and Indigenous voters. As of April 2024, 19 states require photo identification to vote, a requirement critics note functions as a modern poll tax for those who lack access to the required documents.

Residential segregation, too, endures. American neighborhoods and, by extension, their schools remain largely segregated by race — a product not of individual choice but of policies like redlining, racially restrictive housing covenants, and “white flight” that were actively promoted by governments at every level. These patterns mean that school quality, which is largely tied to local property tax revenues, continues to diverge sharply by race.

Conclusion

The arc of Jim Crow runs from the absurd to the tragic: from a white comedian in burnt-cork makeup doing a shuffling dance routine, to a legal apparatus that stripped millions of Americans of their dignity, their rights, and their lives for a century. The name itself tells you what the architects of segregation thought of their project — it was always a contemptuous slap in the face of Black Americans.

While the formal system was legally dismantled between 1954 and 1965 — which is rightly understood as a profound achievement — the legal end of Jim Crow is not the same as justice, and it is not the same as equality. The wealth gap, the incarceration gap, the voting access gap, and the education gap all bear the fingerprints of a system that successfully kept Black Americans subordinate for generations.

Understanding Jim Crow fully requires tracing it from its shameful origins in a minstrel song, through its deadly flowering as a legal regime, to its present-day aftereffects. The name may have disappeared from the law books, but the consequences have not, and some critics of the current administration see the dismantling of DEI as an indirect form of Jim Crow revival or neo-segregation. They believe that race-neutral language is being used to reproduce racial inequality without openly naming race. This is sometimes described as “Jim Crow 2.0” or “colorblind Jim Crow,” and it should be no more acceptable than its more blatant ancestor.

Sources

1. Wikipedia — Thomas D. Rice: https://en.wikipedia.org/wiki/Thomas_D._Rice

2. Wikipedia — Jim Crow (Character): https://en.wikipedia.org/wiki/Jim_Crow_(character)

3. Wikipedia — Jump Jim Crow: https://en.wikipedia.org/wiki/Jump_Jim_Crow

4. Jim Crow Museum, Ferris State University — Who Was Jim Crow?: https://jimcrowmuseum.ferris.edu/who/index.htm

5. Jim Crow Museum, Ferris State University — The Origins of Jim Crow: https://jimcrowmuseum.ferris.edu/origins.htm

6. Britannica — What Is the Origin of the Term “Jim Crow”?: https://www.britannica.com/story/what-is-the-origin-of-the-term-jim-crow

7. Britannica — Jim Crow Law: https://www.britannica.com/event/Jim-Crow-law

8. National Archives — Plessy v. Ferguson (1896): https://www.archives.gov/milestone-documents/plessy-v-ferguson

9. Wikipedia — Plessy v. Ferguson: https://en.wikipedia.org/wiki/Plessy_v._Ferguson

10. Wikipedia — Jim Crow Laws: https://en.wikipedia.org/wiki/Jim_Crow_laws

11. PBS — Jim Crow & Plessy v. Ferguson: https://www.pbs.org/tpt/slavery-by-another-name/themes/jim-crow/

12. Howard University School of Law — Reconstruction and Jim Crow Eras: https://library.law.howard.edu/civilrightshistory/blackrights/jimcrow

13. American Battlefield Trust — The Jim Crow Era: https://www.battlefields.org/learn/articles/jim-crow-era

14. Michelle Alexander — The New Jim Crow (Introduction Excerpt): https://newjimcrow.com/about/excerpt-from-the-introduction

15. Economic Policy Institute — Voter Suppression Rooted in Racism: https://www.epi.org/publication/rooted-racism-voter-suppression/

16. American Academy of Arts and Sciences — Somewhere Between Jim Crow and Post-Racialism: https://www.amacad.org/publication/daedalus/somewhere-between-jim-crow-post-racialism

17. Yale Macmillan — History of Minstrel Shows and Jim Crow: https://macmillan.yale.edu/glc/history-minstrel-shows-and-jim-crow

The Accidental Footnote: Heel Spurs, the Vietnam Draft, and American Inequality

If you’ve ever winced taking your first steps out of bed in the morning, you may have already made an involuntary acquaintance with heel spurs — or more precisely, with the condition that often travels with them. The term itself sounds alarming, and for a brief but colorful stretch of American political history, it became something far more charged than a footnote to podiatry. But before we get to the politics, it’s worth understanding what a heel spur actually is, because the medical reality is both more mundane and more complicated than the caricature.


What Exactly Is a Heel Spur?
A heel spur is a small bony outgrowth — technically called a calcaneal spur — that extends from the underside of the heel bone (the calcaneus). It forms at the spot where the plantar fascia — the thick ligament running the length of your foot from heel to toe — attaches to the heel bone. The spur is not, despite what the name implies, a sharp spike. It is typically smooth and rounded, though it can still cause irritation if it presses into surrounding soft tissue.


 
Heel spurs affect about 10% of the population, making them one of the more common foot conditions around, though most people who have one don’t know it. The spur develops gradually — usually over months or even years — as the body deposits calcium in response to chronic stress at that heel attachment point. Think of it less as damage and more as your skeleton’s attempt at reinforcement.

What Causes Them?
The underlying driver is repetitive mechanical stress on the foot. Heel spurs are particularly associated with strains on foot muscles and ligaments, stretching of the plantar fascia, and repeated small tears in the membrane covering the heel bone. Athletes who do a lot of running and jumping are especially prone.

But you don’t need to be an elite runner to develop one. Walking gait problems — particularly overpronation, where the foot rolls inward — place uneven stress on the heel with each step. Worn-out or poorly fitted shoes, which fail to absorb shock or support the arch, compound the problem. Obesity increases the mechanical load on the heel. Occupations that require prolonged standing or walking on hard surfaces put the plantar fascia under constant tension. And as people age, tendons and ligaments lose their elasticity, making the tissues more vulnerable to micro-tears and the subsequent bony repair response.

Heel spurs are also closely connected to a condition most people have heard of: plantar fasciitis. The two are related but not identical. Plantar fasciitis is inflammation of the plantar fascia itself, usually from overuse. A heel spur can develop as a downstream consequence of that inflammation — the body lays down extra bone in response to the ongoing stress at the fascia’s attachment point.

Symptoms — or the Lack Thereof
Here’s the part that surprises most people: the majority of heel spurs cause no symptoms at all, and many are discovered incidentally on X-rays taken for other reasons. Only about 5% of heel spurs are estimated to be symptomatic.

When a heel spur does produce symptoms, the experience is heavily intertwined with plantar fasciitis. The classic description is a sharp, stabbing pain on the bottom of the foot first thing in the morning, or after any prolonged rest. Many people compare it to stepping on a tack. Paradoxically, this pain often eases somewhat after walking around for a few minutes, only to return after extended time on the feet or after another rest. It’s that “worse in the morning” quality that tends to be the giveaway.

Other symptoms, when present, can include localized swelling, warmth, and tenderness along the front of the heel, as well as increased sensitivity on the underside of the foot. It’s worth noting that the pain associated with a heel spur is not generally thought to come from the bony spur itself, but from the irritation it causes in the surrounding soft tissue — tendons, ligaments, and bursae.

How Is It Diagnosed?
Diagnosis typically begins with a physical exam. Your doctor or podiatrist will ask about when the pain started, what activities preceded it, and what makes it better or worse. They’ll examine your foot for tenderness at specific points, assess your range of motion, check foot alignment, and press on key areas to locate the source of pain.

Imaging confirms the picture. An X-ray can clearly show the bony spur and is the most commonly used test. That said, the size of the spur on an X-ray doesn’t necessarily correspond to how much pain a patient is experiencing — a small spur can be quite painful while a large one may cause no trouble at all. In more complex cases, an MRI may be ordered to assess the soft tissues more closely and evaluate whether plantar fasciitis or another condition is also in play.

Treatment Options
The reassuring news is that the vast majority of cases resolve without surgery. More than 90% of patients improve with nonsurgical treatment. The catch is that conservative management requires patience — improvement typically takes weeks, and more stubborn cases can take months.

The cornerstone of treatment is rest and reducing the activities that provoke pain. This doesn’t necessarily mean completely stopping exercise; low-impact alternatives like swimming, cycling, or rowing allow you to stay active while giving the heel a break from impact. Icing the bottom of the foot after activity helps manage inflammation. Over-the-counter anti-inflammatory medications like ibuprofen or naproxen can provide relief, though they’re intended for short-term use.

Footwear matters enormously. Supportive shoes with good arch support, cushioning, and a slight heel rise reduce the strain on the plantar fascia. Custom orthotics with molded insoles designed to redistribute pressure across the foot are often recommended, particularly for people with gait abnormalities or flat feet. Physical therapy can be part of the treatment plan, focusing on stretching the calf muscles and plantar fascia, strengthening the foot’s intrinsic muscles, and correcting biomechanical issues.

For cases that don’t respond to these initial measures, the next tier of treatment includes corticosteroid injections to reduce inflammation at the spur site, and extracorporeal shockwave therapy — a non-invasive procedure that uses sound waves to stimulate healing in chronically inflamed tissue. Surgery is reserved for the minority of cases where conservative treatment fails after nine to twelve months. Possible complications include nerve pain, infection, scarring, and — with plantar fascia release — the risk of foot instability or stress fracture. Most orthopedic surgeons regard surgery as a last resort.

Are Heel Spurs Debilitating?
For most people, the honest answer is: no.  Heel spurs are a common condition with a favorable prognosis, especially with early diagnosis and appropriate management. Many people live with heel spurs for years without ever knowing it, and even those who develop pain typically find substantial relief with conservative treatment within four to eight weeks.
That said, the pain at its worst — particularly in conjunction with plantar fasciitis — can be genuinely disruptive to daily life. Athletes may find their training significantly limited. People who spend long hours on their feet at work may struggle with sustained discomfort. And a small percentage of patients do end up with prolonged, treatment-resistant pain that affects mobility. So, the more accurate framing might be: heel spurs have the potential to be significantly uncomfortable and functionally limiting during flare-ups, but with proper treatment most people recover well and return to normal activity.

Heel Spurs and the Vietnam-Era Draft
Which brings us to an improbable chapter in heel spur history. During the Vietnam War era, heel spurs became — for at least one famous case — a ticket out of military service. Understanding how that worked requires a brief detour into the draft system of the 1960s and 1970s, and what it meant to receive a medical deferment.
According to the National Archives, of the roughly 27 million American men eligible for military service between 1964 and 1973, about 15 million were granted deferments — mostly for education, and some for mental or physical problems — while only 2,215,000 were actually drafted into service; roughly another eight million volunteered. Some of those who later served had previously held deferments. The system was sprawling, complex, and — as was widely acknowledged even at the time — deeply unequal.
Roughly 60% of draft-eligible American men took some sort of action to avoid military conscription. There were many routes: college deferments, fatherhood, conscientious objector status (170,000 men were granted it), National Guard enlistment, and medical exemptions. Medical deferments covered a wide range of conditions — from serious chronic illness to conditions that, in a different context, most people would consider minor. Flat feet, poor eyesight, asthma, and yes, bone spurs all appeared on the list of potentially disqualifying ailments.
The system was known to favor men with access to money, education, and well-connected physicians. American forces in Vietnam were 55% working-class and 25% poor — reflecting those who didn’t have the means to navigate the deferment labyrinth. A working-class kid from rural West Virginia was far more likely to end up in the Mekong Delta than the son of a New York real estate developer.

The Most Famous Heel Spur in American History
Which leads, inevitably, to Donald Trump. As confirmed by Selective Service records obtained and reported by multiple news outlets, Trump received five Vietnam-era draft deferments — four for college attendance at Fordham and the Wharton School, and a fifth in 1968, recorded as a medical deferment for bone spurs in his heels. The medical classification left him disqualified for military service.

The circumstances surrounding the diagnosis have been contested ever since. Reporting by the New York Times included accounts from the daughters of a Queens podiatrist named Larry Braunstein, who alleged that their father had provided or vouched for the diagnosis as a professional favor to Trump’s father, Fred Trump — a landlord to whom Braunstein reportedly owed a debt of gratitude. Trump’s former lawyer Michael Cohen also testified that Trump had admitted to fabricating the injury. Trump himself has maintained that the diagnosis was legitimate, stating that a doctor “gave me a letter — a very strong letter — on the heels.” The underlying medical records that would resolve the dispute are, conveniently, not publicly available; most individual Selective Service medical records from that era were subsequently destroyed.

It’s worth noting that Trump’s pattern — using legal channels, including a medical deferment of questionable validity, to avoid Vietnam service — was not unique to him. Historians have pointed out that numerous prominent figures on both sides of the political aisle received deferments of various kinds, including Joe Biden (asthma), Dick Cheney (student deferments), Bill Clinton (navigated the ROTC system), and George W. Bush (National Guard). The heel spur episode became politically charged in part because of Trump’s later hawkish rhetoric and his outspokenness in questioning the military service of others — most notably Senator John McCain, who spent years as a prisoner of war in North Vietnam.

How Many People Got Heel Spur Deferments?
This is where the historical record hits a hard wall. No reliable statistics exist specifically for heel spur deferments. The Selective Service tracked broad categories — student deferments, hardship deferments, conscientious objector status, medical disqualifications — but it did not publish a breakdown by specific diagnosis, and most individual medical records from that era no longer exist.

What we can say is that bone spurs were a recognized medical disqualifier under Selective Service regulations, that medical deferments broadly were a commonly used — and commonly abused — avenue for avoiding service, and that the process was heavily influenced by access to sympathetic physicians. A man with means, connections, and a cooperative podiatrist had options that a man without those resources did not.

The honest answer, then, is that we don’t know how many men received deferments citing heel spurs specifically, and we almost certainly never will. The data either wasn’t tracked at that level of granularity or was long since destroyed. What we do know is that the condition became, for a time, a lens through which Americans examined something much larger: who serves, who doesn’t, and whether the systems meant to govern those decisions are applied fairly.

For most people, a heel spur is a manageable, if annoying, footnote in the story of their health. For at least one person, it became a footnote in the history of American politics.
 
Personal Note: I have heel spurs; I wish I’d known about them in 1967.
 
Images generated by author using AI.

Medical Sources
Cleveland Clinic — Heel Spurs overview
https://my.clevelandclinic.org/health/diseases/21965-heel-spurs
WebMD — Heel Spur Causes, Symptoms, Treatments, and Surgery
https://www.webmd.com/pain-management/heel-spurs-pain-causes-symptoms-treatments
Hackensack Meridian Health — Bone Spurs in the Heel: Symptoms and Recovery
https://www.hackensackmeridianhealth.org/en/healthier-you/2024/01/02/bone-spurs-in-the-heel-symptoms-and-recovery
OrthoArkansas — Heel Spurs
https://www.orthoarkansas.com/heel-spurs-orthoarkansas/
EmergeOrtho — Heel Bone Spurs: Causes, Symptoms, Treatment
https://emergeortho.com/news/heel-bone-spurs/
American Academy of Orthopaedic Surgeons — Plantar Fasciitis and Bone Spurs
https://orthoinfo.aaos.org/en/diseases–conditions/plantar-fasciitis-and-bone-spurs/
Vietnam Draft & Military Service Sources
History.com — 7 Ways Americans Avoided the Draft During the Vietnam War
https://www.history.com/articles/vietnam-war-draft-avoiding
Wikipedia — Draft Evasion in the Vietnam War
https://en.wikipedia.org/wiki/Draft_evasion_in_the_Vietnam_War
Wikipedia — Conscription in the United States
https://en.wikipedia.org/wiki/Conscription_in_the_United_States
Students of History — The Draft and the Vietnam War
https://www.studentsofhistory.com/vietnam-war-draft
University of Michigan — The Military Draft During the Vietnam War
https://michiganintheworld.history.lsa.umich.edu/antivietnamwar/exhibits/show/exhibit/draft_protests/the-military-draft-during-the-
Vietnam Veterans of America Chapter 310 — Vietnam War Statistics
https://www.vva310.org/vietnam-war-statistics
Vietnam Veterans of Foreign Wars — Fact vs. Fiction: The Vietnam Veteran
https://www.vvof.org/factsvnv.htm
New York City Vietnam Veterans Plaza — Interesting Facts About Vietnam
https://www.vietnamveteransplaza.com/interesting-facts-about-vietnam/
 
 
Medical Disclaimer
The information provided in this article is intended for general educational and informational purposes only and does not constitute medical advice. It should not be used as a substitute for professional medical advice, diagnosis, or treatment.
Always seek the guidance of a qualified healthcare provider with any questions you may have regarding a medical condition or treatment. Never disregard professional medical advice or delay seeking it because of something you have read here.
If you are experiencing a medical emergency, call 911 or your local emergency number immediately.
The author of this article is a licensed physician, but the views expressed here are solely those of the author and do not represent the official position of any hospital, health system, or medical organization with which the author may be affiliated.
 

Lady Liberty: The Statue We Think We Know

Collection of the author

Ask most Americans what the Statue of Liberty means and they’ll say the same thing: she welcomes immigrants. She is the “Mother of Exiles,” keeper of the golden door. That image is so deeply woven into the national identity that it has been quoted, protested, and debated for more than a century. The only problem is that it wasn’t what her creators originally intended.

The story begins in 1865, just weeks after the Civil War ended, at a dinner party near Versailles. The host was Édouard de Laboulaye, a French historian and president of the French Anti-Slavery Society. Laboulaye was one of France’s most passionate admirers of American democracy and was deeply moved by both Lincoln’s assassination and the abolition of slavery. He proposed that France present the United States with a colossal monument — one that would celebrate two things at once: the centennial of American independence and the end of slavery.

Sculptor Frédéric-Auguste Bartholdi was at that dinner and took the idea and ran with it. Early models from around 1870 show Lady Liberty with her right arm raised — familiar enough — but in her left hand she holds broken shackles, not a tablet. The anti-slavery message was unmistakable.

That symbolism didn’t entirely disappear from the final design — it just got moved. According to the National Park Service, Bartholdi placed a broken chain and shackle at the statue’s feet, hidden beneath the copper drapery. Most visitors never notice it. The left hand now holds a tablet inscribed with the date of the Declaration of Independence.

Why the change? NYU historian Edward Berenson points to the political climate of the 1880s. By the time the statue was dedicated in October 1886 — more than 20 years after Laboulaye’s dinner — Reconstruction had collapsed, Jim Crow was spreading, and the country was trying to paper over sectional wounds by quietly forgetting the war’s racial roots. Nobody at the dedication mentioned slavery. The abolitionist origins were simply buried.

The statue was formally named Liberty Enlightening the World and the message broadened toward Franco-American friendship and American liberty in general rather than emancipation specifically.

Then came Emma Lazarus, and the second transformation. In 1883, a fundraiser struggling to pay for the statue’s pedestal asked prominent writers to donate works for an auction. Lazarus, a Jewish-American poet, initially declined, as she was then deeply involved in aiding Jewish refugees fleeing widespread and organized violence in Russia. A friend persuaded her that the statue, sitting at the entrance to New York Harbor, would inevitably be seen as a beacon by arriving immigrants. She wrote “The New Colossus” — fourteen lines that reimagined Lady Liberty entirely, as a “Mother of Exiles” beckoning the world’s tired and poor to America’s golden door.

Not like the brazen giant of Greek fame,
With conquering limbs astride from land to land;
Here at our sea-washed, sunset gates shall stand
A mighty woman with a torch, whose flame
Is the imprisoned lightning, and her name
Mother of Exiles. From her beacon-hand
Glows world-wide welcome; her mild eyes command
The air-bridged harbor that twin cities frame.
“Keep, ancient lands, your storied pomp!” cries she
With silent lips. “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!”

Strangely, the poem then vanished. It played no role at the statue’s 1886 dedication. Lazarus died in 1887, before Ellis Island even opened. It wasn’t until 1903 that a friend had the entire poem cast onto a bronze plaque that was mounted inside the pedestal — not engraved on the outside, as many believe. By then, millions of immigrants had already sailed past the statue on their way to Ellis Island, and the association with immigration had taken hold in the public imagination.

The immigration meaning deepened through the early 20th century as the government used the statue’s image in campaigns to assimilate immigrant children, many of whom lived in ethnic enclaves in large eastern cities. Meanwhile, the abolitionist symbolism that had inspired the project in the first place faded almost entirely from public memory.

The broken shackles are still there, tucked under her robe, mostly invisible. Lady Liberty has been continuously reinterpreted for 160 years — by abolitionists, by a poet responding to a refugee crisis, by politicians, and by millions of people who looked up at that torch from the deck of a ship. Will Lady Liberty and her words continue to offer welcome and hope and be a source of national pride or is her meaning once again being rewritten as a symbol of American identity, dominance, and, perhaps, exclusion?

The Statue’s shackles and feet. National Park Service, Statue of Liberty NM, Public Domain

Sources

National Park Service – Statue of Liberty history: https://www.nps.gov/stli/learn/historyculture/statue-of-liberty.htm
Library of Congress – Dedication and speeches: https://www.loc.gov/item/ihas.200197394/
Cleveland, Grover – Dedication Address (1886): https://www.nps.gov/stli/learn/historyculture/cleveland.htm
Smithsonian Magazine – History of the Statue of Liberty: https://www.smithsonianmag.com/history/statue-liberty-180970340/

The Art of Rigging Democracy: A Close Look At Gerrymandering

Wikimedia Commons: Elkanah Tisdale (1771-1835) (often falsely attributed to Gilbert Stuart), public domain


Picture this: It’s 1812 in Massachusetts, and Governor Elbridge Gerry has just approved a redistricting plan that creates such a bizarrely shaped legislative district that when a local newspaper editor saw it on a map, he thought it looked like a salamander. The editor sketched wings and claws onto the district, and someone quipped that it looked more like a “Gerry-mander” than a salamander. The term stuck, and more than two centuries later, we’re still dealing with the same problem that inspired that joke.
What Gerrymandering Actually Means
At its core, gerrymandering is the practice of drawing electoral district boundaries to give one political party or group an unfair advantage over its opponents. It’s a form of political manipulation that allows those in power to essentially choose their voters, rather than letting voters choose their representatives. Think of it as a sophisticated form of gaming the system—perfectly legal in many cases, but profoundly anti-democratic in spirit.
The mechanics are surprisingly straightforward. There are two main techniques: “cracking” and “packing.” Cracking involves splitting up concentrations of opposing voters across multiple districts so they can’t form a majority anywhere. Packing does the opposite—cramming as many opposing voters as possible into a few districts so they waste their votes winning by huge margins in just a couple of places, leaving the rest of the districts safely in your column. These techniques can be deployed together to engineer a decisive partisan advantage.
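
To make the mechanics concrete, here is a deliberately toy sketch in Python, with invented numbers, showing how the same statewide vote share can yield very different seat counts depending on whether one party’s supporters are packed or cracked. The districts, vote totals, and party labels are hypothetical, not drawn from any real map.

# Toy illustration of packing vs. cracking (all numbers invented).
# A hypothetical state: 5 districts of 50 voters each, and 100 of the
# 250 voters (40%) support Party A. The statewide vote never changes;
# only the way the map groups those voters does.

def seats_won(district_a_votes, district_size=50):
    # Count districts in which Party A holds a majority.
    return sum(1 for a in district_a_votes if a > district_size / 2)

packed = [50, 50, 0, 0, 0]      # Party A crammed into two districts
cracked = [20, 20, 20, 20, 20]  # the same 100 voters spread thinly everywhere

print(seats_won(packed))   # 2 seats, won with huge wasted surpluses
print(seats_won(cracked))  # 0 seats, despite identical statewide support

In a real gerrymander the two techniques are combined: pack the opposition into a handful of overwhelming wins, crack whoever is left, and the map-drawing party comfortably carries everything else.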
A History of Creative Mapmaking
The practice didn’t start with Gerry, of course. The Founding Fathers—for all their lofty rhetoric about representative democracy—weren’t above putting their thumbs on the electoral scales. But gerrymandering really came into its own in the 20th century, as advances in census data and statistics, combined with newly available computing power, made it possible to draw districts with surgical precision.
The 2010 redistricting cycle marked a watershed moment. Single-party control of the redistricting process gave partisan line drawers free rein to craft some of the most extreme gerrymanders in American history, often down to the level of individual city blocks. Republicans, having won control of many state legislatures in the 2010 midterms, used sophisticated computer modeling to create maps that locked in their advantages for a decade. Democrats did the same where they had the power, though Republicans controlled more state legislatures and thus wielded greater gerrymandering capability overall.
The 2024-2025 Gerrymandering Wars
Here’s where things get really interesting—and deeply concerning. The situation has exploded into what some are calling “gerrymandering wars” following the 2020 census and a critical Supreme Court decision in 2019. In Rucho v. Common Cause, the Supreme Court ruled that partisan gerrymandering constitutes a non-justiciable “political question” unsuited to federal court intervention. Translation: the federal courts won’t stop partisan gerrymandering because they claim there’s no objective standard to measure it.
This opened the floodgates. The Brennan Center estimates that gerrymandering gave Republicans an advantage of around 16 House seats in the 2024 race to control Congress compared to fair maps. But here’s the kicker: we’re not even done with this decade’s redistricting.
In an unprecedented move, President Donald Trump has pushed Republican state lawmakers to further gerrymander their states’ congressional maps, prompting Democratic state lawmakers to respond in kind. In August 2025, during a special session, Texas’s legislature passed a redistricting plan that weakens electoral opportunities for Black and Hispanic voters.  California has threatened to respond with its own gerrymander, creating a tit-for-tat dynamic that could spiral out of control.
North Carolina provides perhaps the most dramatic example. After the state supreme court reversed its position on policing partisan gerrymandering, the Republican-controlled legislature redrew the map, and in the 2024 election three Democratic-held districts flipped to Republicans—enough to give control of the U.S. House to the GOP by a slim margin.
The situation has gotten so extreme that both parties are now openly engaging in mid-decade redistricting—something that traditionally only happened after each ten-year census. California, Missouri, North Carolina, Ohio, Texas and Utah have all adopted new congressional maps in 2025, with new maps also appearing possible in Florida, Maryland and Virginia.
The Racial Dimension
It’s crucial to note that gerrymandering comes in two flavors: partisan and racial. While partisan gerrymandering is currently legal thanks to the Supreme Court, racial gerrymandering—drawing districts specifically to dilute the voting power of racial minorities—violates the Voting Rights Act of 1965. The line between the two can get blurry, though, since partisan voting patterns often correlate with race.
In May 2025, a federal court ruled that Alabama’s 2023 congressional map not only violates Section 2 of the Voting Rights Act but was enacted by the Alabama Legislature with racially discriminatory intent. Similar battles are playing out in Louisiana, Mississippi, and other states. The legal landscape here is complex, with courts sometimes walking a tightrope between ensuring fair representation for communities of color and avoiding the creation of what could be challenged as racial gerrymanders.
What Can Be Done About It?
The most popular reform proposal is the creation of independent redistricting commissions—bodies of citizens (not politicians) who draw district maps according to neutral criteria.  Currently, several states including Colorado, Michigan, Ohio and Virginia use redistricting commissions to draw congressional and state legislative maps, ranging from political commissions with elected officials to completely independent commissions that bar all elected officials from serving as commissioners.
Do they work? The evidence is mixed but generally positive. According to a Redistricting Report Card published in partnership with the Princeton Gerrymandering Project, the states that had some form of commission drew “B+” maps on average, while states where partisans controlled the process drew “D+” maps. California’s independent commission is often held up as the gold standard, though it’s not perfect—even fairly drawn maps can produce lopsided results due to how voters cluster geographically.
Another proposal focuses on clear, enforceable criteria: compactness, contiguity, respect for existing political boundaries, and transparency in the mapping process. Advances in statistical analysis also make it possible to compare proposed maps against thousands of neutral alternatives to detect extreme outliers, a method increasingly discussed in academic and legal circles.
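
As a rough illustration of that outlier-comparison idea, the sketch below, written in Python with entirely invented precincts and vote counts, builds a crude ensemble of randomly drawn plans and asks how unusual a hypothetical enacted map would be. Real ensemble analyses sample contiguity-respecting maps from actual precinct geography; this toy version only conveys the comparison logic.

# Crude ensemble-comparison sketch (all data invented for illustration).
import random

random.seed(1)
# (Party A votes, Party B votes) in 40 imaginary precincts.
PRECINCT_VOTES = [(random.randint(200, 800), random.randint(200, 800)) for _ in range(40)]

def seats(plan, votes):
    # Count districts won by Party A under a given precinct-to-district assignment.
    totals = {}
    for precinct, district in enumerate(plan):
        a, b = votes[precinct]
        ta, tb = totals.get(district, (0, 0))
        totals[district] = (ta + a, tb + b)
    return sum(1 for a, b in totals.values() if a > b)

def random_plan(n_precincts=40, n_districts=8):
    # Naive "neutral" plan: equal-size districts, random assignment (ignores geography).
    plan = [i % n_districts for i in range(n_precincts)]
    random.shuffle(plan)
    return plan

ensemble = [seats(random_plan(), PRECINCT_VOTES) for _ in range(5000)]
enacted = 7  # pretend the enacted map hands Party A 7 of 8 seats
share = sum(s >= enacted for s in ensemble) / len(ensemble)
print(f"Fraction of neutral plans at least as favorable to Party A: {share:.3%}")

If only a tiny fraction of the neutral ensemble performs as well for one party as the enacted map does, that is the statistical signature of an extreme outlier, the kind of evidence increasingly presented in court challenges to partisan maps.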
Federal legislation has been proposed repeatedly. The Redistricting Reform Act of 2025 would prohibit states from mid-decade redistricting and would require every state to adopt nonpartisan, independent redistricting commissions. Similar provisions were included in the “For the People Act” that Democrats passed in the House in 2021, but the bill died in the Senate. Getting such legislation through Congress would require bipartisan cooperation, which seems unlikely given that both parties see gerrymandering as a political weapon and believe they can’t afford to unilaterally disarm.
Some reformers advocate for more radical solutions. Proportional representation systems, where the share of votes equals the share of seats, would end boundary-drawing battles altogether and make democracy more representative. Under such a system, if Democrats win 60% of the vote in a state, they’d get roughly 60% of that state’s congressional seats. It’s intuitive and fair, but it would require a fundamental restructuring of American electoral systems—something that’s probably not politically feasible in the near term.
State courts have emerged as a potential backstop. At least 10 state supreme courts have found that state courts can decide cases involving allegations of partisan gerrymandering, even though federal courts won’t touch them.  This means that state constitutional provisions against gerrymandering could provide meaningful protection—though as North Carolina demonstrated, state court compositions can change, and with them, their willingness to police gerrymandering.
The Bottom Line
Gerrymandering represents a fundamental tension in American democracy: How do we draw districts fairly when the people drawing them have every incentive to rig the game in their favor? The problem has ancient roots but ultra-modern manifestations, powered by big data and sophisticated computer modeling that would make Elbridge Gerry’s head spin.
The current moment feels particularly precarious. With Republicans holding a razor-thin majority in the House and midterm elections traditionally unfavorable for the party in power, the Republicans’ action amounts to a preemptive move to retain control of Congress. The Democrats’ threatened response could trigger an escalating cycle of partisan map manipulation that further entrenches our political divisions and makes elections less responsive to actual voter preferences.
Independent commissions offer a promising path forward, but they’re not a silver bullet. They work better than partisan control, but they can’t eliminate all the inherent challenges of translating votes into seats through geographic districts. More ambitious reforms like proportional representation could solve the problem more completely, but they face enormous political and practical obstacles.
For now, gerrymandering remains what that newspaper editor saw in 1812: a monstrous distortion of democratic principles, hiding in plain sight on our electoral maps. The question is whether we have the political will to slay the beast, or whether we’ll keep feeding it for another two centuries.
 
Sources:
Brennan Center for Justice – Gerrymandering Explained
https://www.brennancenter.org/our-work/research-reports/gerrymandering-explained
 
Brennan Center for Justice – How Gerrymandering Tilts the 2024 Race for the House
https://www.brennancenter.org/our-work/research-reports/how-gerrymandering-tilts-2024-race-house
 
American Constitution Society – America’s Gerrymandering Crisis
https://www.acslaw.org/expertforum/americas-gerrymandering-crisis-time-for-a-constructive-redistricting-framework/
 
ACLU – Court Cases on Gerrymandering
https://www.aclu.org/court-cases?issue=gerrymandering
 
Stateline – State Courts and Gerrymandering
https://stateline.org/2025/12/22/as-supreme-court-pulls-back-on-gerrymandering-state-courts-may-decide-fate-of-maps/
 
Campaign Legal Center – Do Independent Redistricting Commissions Work?
https://campaignlegal.org/update/do-independent-redistricting-commissions-really-prevent-gerrymandering-yes-they-do
 
RepresentUs – End Partisan Gerrymandering
https://represent.us/policy-platform/ending-partisan-gerrymandering/
 
Senator Alex Padilla – Redistricting Reform Act of 2025
https://www.padilla.senate.gov/newsroom/press-releases/watch-padilla-lofgren-introduce-legislation-to-establish-independent-redistricting-commissions-end-mid-decade-redistricting-nationwide/
 
Protect Democracy – How to End Gerrymandering
https://protectdemocracy.org/work/how-to-end-gerrymandering/

The Evolution of the English Language: From Anglo-Saxon Roots to a Global Tongue

English is a beautifully messy language—shameless in its borrowing and relentless in its evolution. It resists the tidy logic that might make a grammarian’s life easier, and that resistance is part of what makes its history so compelling. The English we speak today is the product of centuries of invasion, migration, cultural collision, and literary ambition—a language built in layers, like geological strata laid down over time.

To see how English grew from an obscure Germanic dialect into a global lingua franca, it helps to trace three broad phases: Old English, Middle English, and Modern English. Each stage was shaped by different historical forces, from Germanic migration and Viking settlement to the Norman Conquest, the Renaissance, the printing press, and ultimately the worldwide reach of the British Empire and the United States.

Anglo-Saxon Foundations

The story begins on the European mainland. When Roman authority collapsed in Britain in the early fifth century, Germanic-speaking peoples from what is now northern Germany, Denmark, and the Netherlands moved into the island. The Angles, Saxons, and Jutes arrived in waves, bringing closely related West Germanic dialects that gradually developed into Old English, often called Anglo-Saxon.

Old English was thoroughly Germanic in both grammar and vocabulary. It was a highly inflected language: case endings marked whether a noun was subject, object, or possessive, and nouns had grammatical gender. Verbs were conjugated with a complexity that feels foreign to most modern English speakers. Much of the core vocabulary of modern English—words such as water, house, bread, child, earth, life, and death—dates back to this early period and still carries that Germanic stamp.

The language of Beowulf, composed between the eighth and early eleventh centuries, is virtually unreadable today without specialized training. Its famous opening line, “Hwæt! We Gardena in geardagum,” is technically English, but it feels closer to a foreign language. Old English used letters such as þ (thorn) and ð (eth) and relied on grammatical structures that later disappeared.

Nor was Old English a single uniform tongue. It existed as a cluster of regional dialects including Northumbrian, Mercian, Kentish, and West Saxon. Under King Alfred the Great in the late ninth century, Wessex became the leading political power in England and a center of learning. Alfred sponsored translations of important Latin works into Old English, most often in the West Saxon dialect. As a result, most surviving Old English texts come from that dialect, giving us only a partial view of the linguistic diversity of early England.

Latin and Celtic Influences

Even before the Anglo-Saxons arrived in Britain, Latin had begun to influence their speech through contact with the Roman world. Early Latin loanwords include street (from strata), wall (from vallum), and wine (from vinum).

A second wave of Latin influence arrived with the Christianization of England beginning in 597, when Augustine of Canterbury established a mission in Kent. Christianity introduced vocabulary connected with religion, learning, and administration—words such as church, bishop, monk, school, altar, and verse.

By contrast, the Celtic languages spoken by the native Britons left a surprisingly small mark on English vocabulary. Their influence survives most clearly in place names—for example Thames, Avon, and Dover—and in landscape terms such as combe (valley) and tor (rocky hill). Why Celtic languages left relatively few everyday words in English remains one of the lingering puzzles of linguistic history.

Vikings and the Norse Contribution

Beginning in the late eighth century, Scandinavian raiders and settlers—collectively known as Vikings—began attacking and eventually settling parts of England. By the ninth century much of northern and eastern England had become part of the Danelaw, where Old English speakers lived alongside speakers of Old Norse.

Because Old Norse and Old English were closely related Germanic languages, speakers could often roughly understand each other. Over time, however, sustained contact produced deep linguistic blending. English absorbed many Norse-derived words that now feel completely native, including sky, skin, skill, skirt, egg, leg, window, husband, call, take, give, get, want, and die.

Perhaps the most striking Norse contribution lies in the pronouns they, them, and their, which replaced earlier Old English forms. When a language adopts core pronouns from another language, it signals unusually intense and prolonged contact.

Many linguists also believe that contact with Norse speakers helped accelerate the simplification of English grammar. In bilingual communities, speakers often reduce complex inflectional endings that make communication difficult. As a result, English gradually moved away from the elaborate grammatical endings of Old English and toward a system that relied more heavily on word order.

The Norman Transformation

The Norman Conquest of 1066 transformed English more dramatically than any other single event in its history. When William of Normandy defeated King Harold at the Battle of Hastings and became king of England, he brought with him a French-speaking aristocracy.

For several centuries after the conquest, French dominated the language of power—the court, the law, the church hierarchy, and much of government administration. English remained the everyday language of the population but lost prestige in elite circles.

French vocabulary poured into English in areas associated with authority and culture. Law gained terms such as justice, court, judge, jury, prison, crime, and verdict. Government absorbed parliament, sovereign, minister, authority, tax, and treasury. Military language adopted army, navy, soldier, captain, defense, and siege.

Even the language of food reflects this social divide. The animals in the field kept their Old English names—cow, sheep, pig, and deer—while the meat served at noble tables took French names: beef, mutton, pork, and venison.

The Rise of Middle English

Over time, French dominance gradually weakened. The loss of Normandy in 1204 encouraged English nobles to identify more strongly with England itself. Later, the Black Death (1348–1350) reshaped English society by elevating the economic importance of English-speaking laborers and craftsmen.

During the fourteenth century, English returned as the language of all social classes. The language that emerged—Middle English—looked very different from Old English. Most grammatical endings disappeared, grammatical gender vanished, and sentence structure shifted toward the familiar subject-verb-object order.

At the same time, English vocabulary became a rich mixture of Germanic and Romance elements. This layering produced sets of near-synonyms with different levels of formality: ask (Germanic), question (French), and interrogate (Latin).

The most famous literary figure of this period was Geoffrey Chaucer, whose Canterbury Tales demonstrated that English could rival French and Latin as a vehicle for sophisticated literature. Chaucer wrote in the London dialect, which was gaining prominence due to the city’s political and commercial importance. Though not yet standardized, London English gradually became the foundation of later written English.

Printing and the Great Vowel Shift

William Caxton established England’s first printing press in 1476, and this technological revolution had far-reaching consequences for the language. Printing created a need for standardized spelling and grammar, since texts would now be distributed widely rather than copied by hand in local scriptoria. Caxton himself struggled with the problem of dialect variation, complaining about the difficulty of choosing forms that all English readers could understand. Over time, the conventions adopted by London printers became the de facto standard. Printing also hastened the disappearance of the letters þ (thorn) and ð (eth), which were difficult to reproduce in the type available to early printers.

At the same time, English pronunciation underwent a dramatic change known as the Great Vowel Shift, which occurred roughly between 1400 and 1700. Long vowel sounds moved upward in the mouth, transforming the pronunciation of many common words. For example, “name” once sounded closer to nah-muh, while “mouse” sounded more like moose. 

The causes of the Great Vowel Shift remain debated—theories range from the social upheaval following the Black Death to the influence of French-accented English—but its effects were enormous. Because spelling had been largely fixed by printing before the shift was complete, written forms such as knife and through still reflect pronunciations that no longer exist.

Renaissance Expansion

The English Renaissance of the sixteenth and seventeenth centuries unleashed another flood of new vocabulary, much of it borrowed from Latin and Greek. Scholars and writers introduced thousands of words connected to science, philosophy, and literature, including democracy, encyclopedia, atmosphere, thermometer, criticism, and educate.

Critics derided the new coinages as “inkhorn terms”—pretentious, unnecessary words invented by scholars dipping their quills in inkhorns. Some of these attacked words, like “perpetrate” and “contemplate,” survived; others, like “ingent” (enormous), did not.

Two towering cultural works further shaped English during this era: Shakespeare’s plays and the King James Bible (1611). Shakespeare popularized countless words and expressions—among them assassination, lonely, eventful, and phrases like “break the ice” and “wild goose chase.” The King James Bible, widely read for centuries, left deep marks on English rhythm and idiom.

Dictionaries and Standardization

By the eighteenth century, many writers wanted to standardize and regulate English. The most influential effort was Samuel Johnson’s Dictionary of the English Language (1755), which became the dominant reference work of its era.

In the United States, Noah Webster’s American Dictionary of the English Language (1828) promoted simplified spellings such as color instead of colour and center instead of centre. Webster viewed spelling reform as part of America’s broader cultural independence from Britain.

English Goes Global

From the seventeenth through the early twentieth centuries, the British Empire spread English across the globe. Along the way, the language absorbed vocabulary from many other languages. Hindi contributed words such as jungle and shampoo, Arabic added algebra and alcohol, and Malay gave English bamboo and ketchup.

As English took root in different regions, new varieties emerged—American, Australian, Canadian, Indian, Nigerian, Singaporean, and many others. Linguists today increasingly recognize these as legitimate forms of English rather than deviations from a single standard.

English in the Digital Age

In the twentieth and twenty-first centuries, mass media and digital communication have accelerated linguistic change. Radio, film, television, and the internet spread slang, accents, and new expressions around the world with unprecedented speed.

English continues to absorb new words from science, technology, business, and online culture. Brand names become verbs; internet slang becomes everyday speech. Today more than a billion people speak English as a first or second language, making it the most widely used language in human history.

A Language Still Evolving

The history of English reminds us that language is not a fixed monument but a living system shaped by human interaction. Its vocabulary is like an archaeological site, where almost every common word carries traces of earlier eras.

English has never been “pure,” and attempts to purify it have always failed. Its strength lies in its openness—its ability to borrow, adapt, and reinvent itself. From the heroic poetry of Beowulf to Shakespeare’s theater, from the King James Bible to the language of the internet, English continues to grow through the voices of those who use it.

And if history is any guide, the English spoken a few centuries from now will sound just as surprising to us as Chaucer’s language once did.

Illustration generated by author using ChatGPT.

Sources

Baugh, Albert C. and Thomas Cable. A History of the English Language (6th edition). Routledge, 2012. This remains the standard academic textbook on the subject and covers every period and influence discussed above.

Crystal, David. The Cambridge Encyclopedia of the English Language (3rd edition). Cambridge University Press, 2019. An accessible and richly illustrated reference covering the structure and history of English.

McCrum, Robert, Robert MacNeil, and William Cran. The Story of English (3rd revised edition). Penguin, 2003. A popular history that accompanied the PBS television series, excellent for general readers.

Mugglestone, Lynda (ed.). The Oxford History of English (2nd edition). Oxford University Press, 2012. A collection of essays by specialists covering English from its earliest origins to the present day.

Bede, The Venerable. Ecclesiastical History of the English People. Penguin Classics, 1990 (translated by Leo Sherley-Price). The primary early source on the Anglo-Saxon migrations.

Townend, Matthew. “Contacts and Conflicts: Latin, Norse, and French.” In The Oxford History of English, edited by Lynda Mugglestone, 2012. A detailed treatment of the major external influences on English.

Online resource: The British Library’s “Evolving English” exhibit materials are available at https://www.bl.uk/learning/langlit/evolvingenglish/

Online resource: Durkin, Philip. “Borrowed Words: A History of Loanwords in English.” Oxford University Press, 2014. Summary and excerpts available at https://global.oup.com/academic/product/borrowed-words-9780199574995

The Easter Bunny: A Surprisingly Serious History

How a German hare hopped its way into American Easter tradition

Every Easter morning, children across America hunt for eggs left by a rabbit. It’s a charming ritual—and a deeply strange one, when you stop to think about it. Rabbits don’t lay eggs. They don’t carry baskets. Yet here we are, every spring, maintaining the fiction with great enthusiasm. Where did this tradition come from? The answer turns out to be a lot more interesting than you might expect.

The story starts in Germany. The earliest documented reference to an Easter Hare—called the “Osterhase” in German—appears in 1678, in a medical text by the physician Georg Franck von Franckenau. In the German tradition, the Osterhase was specifically a hare, not a rabbit, and its job was straightforward: deliver colored eggs to well-behaved children. Naughty children got nothing. This moral dimension—gift delivery tied to good behavior—should sound familiar. The Easter Bunny was, in a sense, an early version of Santa Claus.

The tradition crossed the Atlantic in the 1700s, carried by German Protestant immigrants who settled in Pennsylvania. Their children knew the Osterhase (sometimes rendered as “Oschter Haws” in Pennsylvania Dutch dialect) and kept up the custom of leaving out nests—made from caps and bonnets—for the hare to fill with eggs. Over time, the nests became baskets, the simple colored eggs became candy and chocolate, and the moral judgment quietly dropped away. By the 20th century, the Easter Bunny had transformed from selective gift-giver into universal children’s benefactor.

But why eggs at all? Eggs entered the Easter story long before Germany. For ancient Romans, they symbolized new life and fertility, and the custom of giving dyed eggs as spring gifts predates Christianity. The Christian tradition added another layer: during the Lenten fasting period eggs were a forbidden food. By Easter Sunday, people had a surplus of accumulated eggs and plenty of reason to celebrate, so they cooked, decorated, and shared them. The emergence from the shell became a visual metaphor for resurrection, and the symbolism stuck.

Rabbits and hares had their own long history as symbols of fertility and springtime. Some writers have linked the Easter Bunny to an ancient Anglo-Saxon goddess named Eostre—from whose name we may get the word “Easter”—and they claim the hare was her sacred animal. It’s a compelling story. It’s also largely unsupported by evidence. The Oxford Dictionary of English Folklore notes that the only historical source mentioning Eostre is the medieval scholar Bede, and Bede says nothing about hares. The goddess-and-hare connection appears to be modern folklore dressed up as ancient tradition.

What is better documented is that hares held symbolic significance across many early cultures. Neolithic burial sites in Europe include hares interred alongside humans, suggesting ritual importance. Hares are conspicuous breeders—they produce multiple litters each year and nest above ground, making their reproductive activity visible in a way that rabbits’ underground burrows do not. For pre-modern peoples marking the return of spring, the hare was a living advertisement for new life.

The combination of egg symbolism and hare symbolism wasn’t a deliberate design decision by any single culture or institution. It was a gradual collision—two powerful images of renewal fusing together over centuries of seasonal celebration. The church absorbed local spring customs rather than eliminating them, allowing pagan associations with fertility and rebirth to persist beneath a Christian overlay. The result is the hybrid tradition we have today.

Today’s Easter Bunny is genuinely a global figure, though not always a rabbit. In Australia, the role is played by the Easter Bilby, an endangered marsupial that conservationists have promoted as a local alternative since the 1990s. Switzerland has an Easter Cuckoo. Parts of Germany have an Easter Fox. Each region adapted the basic concept of a spring gift-bringer to fit its own wildlife and folklore.

The commercial Easter Bunny we know—the chocolate molded figure, the pastel basket, the branded plush toy—is largely a product of the late 19th and 20th centuries, shaped by the same forces that turned Saint Nicholas into Santa Claus. Candy manufacturers, greeting card companies, and department stores found in Easter a spring counterpart to the Christmas retail season, and the Easter Bunny was the obvious mascot.

None of that diminishes what the tradition actually does. The Easter Bunny survived precisely because its meaning kept evolving. It began as a moral enforcer in 17th-century Germany, became a community ritual for immigrant families in Pennsylvania, and eventually became a child’s-eye-view celebration of spring available to secular and religious families alike. The rabbit never needed to make logical sense. It only needed to mark the moment the world turns green again—and every civilization, it seems, finds a way to celebrate that.

Illustration generated by author using ChatGPT.

Sources:

  • Bede, De Temporum Ratione (8th century)
    https://sourcebooks.fordham.edu/basis/bede-reckoning.asp
  • Encyclopaedia Britannica — Easter holiday origins
    https://www.britannica.com/topic/Easter-holiday
  • Catholic Encyclopedia — Lent and fasting traditions
    https://www.newadvent.org/cathen/09152a.htm
  • Smithsonian Magazine — History of Easter Eggs
    https://www.smithsonianmag.com/arts-culture/the-history-of-the-easter-egg-180971982/
  • History.com — Easter Symbols and Traditions
    https://www.history.com/topics/holidays/easter-symbols
  • Library of Congress — Easter traditions in early America
    https://blogs.loc.gov/folklife/2016/03/easter-on-the-farm/
  • National Geographic — Where Did the Easter Bunny Come From?
    https://www.nationalgeographic.com/history/article/easter-bunny-origins
  • American Folklife Center, Library of Congress
    https://www.loc.gov/folklife/
  • National Confectioners Association — Easter candy statistics
    https://www.nationalconfectioners.org/blog/seasonal-easter-candy-data/
  • Smithsonian — How holidays became commercial traditions
    https://www.smithsonianmag.com/history/the-surprising-history-of-holiday-shopping-180964949/
  • Oxford Companion to the Year — Ronald Hutton
    https://global.oup.com/academic/product/the-stations-of-the-sun-9780192854483
  • University of Pennsylvania Religious Studies overview of seasonal festivals
    https://www.penn.museum/sites/expedition/easter/

Native Americans in the Revolutionary War: Choosing Sides in a Conflict Not Their Own

The American Revolution wasn’t just a showdown between colonists and the British Crown. For the more than 80 distinct Native American nations living east of the Mississippi River, the conflict posed an existential threat — one that would reshape their world no matter who won. They faced an agonizing choice: stay neutral in what many viewed as a family dispute within the British Empire, or pick a side and hope that alliance might help preserve their lands and sovereignty.

Most tribes that chose a side supported the British, and their reasoning was sound. The Proclamation of 1763 had attempted to block colonial settlement west of the Appalachians, and Native leaders correctly recognized that an independent America, freed from British constraints, would accelerate land seizures at a terrifying pace. As Mohawk leader Joseph Brant warned in 1775, independence for the colonists would likely mean disaster for indigenous peoples across the continent. History would prove him right.

The Patriots’ Native Allies

Still, several tribes made the difficult calculation to support the Revolutionary cause. The most significant were the Oneida and Tuscarora nations of the Iroquois Confederacy, along with the Stockbridge-Mohican people of Massachusetts and New York. Smaller contingents from the Catawba, Delaware, Maliseet, Pequot, Narragansett, Niantics, and Montauks also fought alongside colonial forces.

The Stockbridge-Mohican had a relatively clear-cut situation: surrounded by colonial settlements in western Massachusetts, they found neutrality essentially impossible. They had already developed cultural and trade ties with their English neighbors, and they bet that loyalty might protect their remaining land rights in the new nation. They were among the very first Native people to take up arms, with members serving as minutemen at Lexington and Concord in April 1775 and fighting at Bunker Hill that June.

The Oneida’s decision was more complex. Unlike tribes facing immediate frontier pressure, they had some geographic breathing room. Their choice reflected relationships built with colonial missionaries and traders, but also a calculated gamble: that an American victory might better respect their territorial claims than continued British rule. In 1776, Congress formally authorized General Washington to recruit Stockbridge Indians, and the Oneida soon became crucial assets — not just as fighters, but as scouts who knew the terrain intimately, and as diplomats attempting to keep other tribes neutral.

Combat Contributions

Native Americans who fought for the Patriots contributed far beyond their numbers. Historian Pekka Hämäläinen has argued that proportionally more Indians than New Englanders served in Patriot forces during the war. Their most consequential military moment came at the Battle of Oriskany on August 6, 1777 — one of the bloodiest engagements of the entire conflict.

At least 60 Oneida warriors fought alongside New York militia against a combined British, Loyalist, and Mohawk force. Warrior Han Yerry, his wife Tyonajanegen, and their son all distinguished themselves that day. According to contemporary accounts, Han Yerry killed nine enemy fighters before a bullet disabled his gun hand, forcing him to continue with his tomahawk; Tyonajanegen fought on horseback with pistols throughout the battle. The engagement fractured the Iroquois Confederacy permanently and helped prevent British forces from reinforcing General Burgoyne before the decisive American victory at Saratoga two months later.

Perhaps the Oneida’s most vital — and least celebrated — contribution came during the winter of 1777-78 at Valley Forge. When Washington’s army faced starvation, Oneida Chief Shenandoah dispatched warriors carrying several hundred bushels of white corn. An Oneida woman named Polly Cooper made the 200-mile journey from Fort Stanwix and stayed at Valley Forge, teaching the starving soldiers how to properly cook the corn so it was actually digestible. Washington personally met with Oneida leaders to express his gratitude, presenting each with a wampum belt. It was a quiet act of generosity that may have saved the Continental Army.

The Oneida continued fighting throughout the war — at the Battle of Barren Hill in May 1778, where scouts stayed behind to allow Lafayette’s troops to escape a British trap; at the Battle of Monmouth; and in numerous northern campaigns. Ten Oneida soldiers earned officers’ commissions in the Continental Army, one rising to lieutenant colonel. Some even served as spies, gathering intelligence deep in enemy territory at enormous personal risk.

The Bitter Aftermath

And then came the betrayal. The 1783 Treaty of Paris, which ended the war, was negotiated without a single Native American representative and made no provisions whatsoever for protecting indigenous lands or sovereignty. Britain simply handed over all territory east of the Mississippi to the new United States — without consulting any Native nation — treating indigenous homelands as British property to dispose of at will.

Even the tribes that had fought for the American cause found that wartime promises evaporated in peacetime. The Oneida, whose contributions had been genuinely critical, faced immediate pressure to cede their territories. By 1788, New York State had leveraged the Oneida into surrendering approximately 5.5 million acres, leaving them with just 300,000. Between 1785 and 1846, New York forced the Oneida to sign 26 additional treaties, stripping away nearly everything that remained.

In 1794, Congress did formally acknowledge the service of the Oneida, Tuscarora, and Stockbridge with the Treaty of Canandaigua, providing $5,000, a new church, and some mills. But the treaty also required the tribes to relinquish all other claims for compensation — effectively closing the books on their wartime losses. Historians estimate the Oneida lost nearly a third of their population during and immediately after the war through combat casualties, displacement, and the destruction of their villages and food stores. The Stockbridge-Mohican, similarly dispossessed, largely migrated west to present-day Wisconsin by the early 19th century.

The Larger Picture

British-allied tribes fared no better. When Britain ceded its eastern territories, it abandoned all its Native allies without protection or compensation. Joseph Brant’s Mohawk lost nearly all their land, though the British eventually granted Brant’s followers about 810,000 hectares along the Grand River in present-day Ontario — land where the Six Nations Reserve still exists today.

The pattern was consistent across tribes, regardless of which side they chose: the Revolution was a catastrophe for virtually every Native American nation. Those who supported the Patriots made contributions that were real, substantial, and in some cases decisive. The Oneida at Oriskany, the Stockbridge minutemen at Lexington, Polly Cooper at Valley Forge — these weren’t footnotes. They were participants in the founding of a nation that would spend the next century systematically dispossessing them.

The Revolution shattered longstanding indigenous alliances, set precedents for how the new United States would treat Native peoples, and demonstrated that for Native Americans, the choice between British and American sides was ultimately a choice between two different roads to the same devastating destination: the loss of their lands, their sovereignty, and their way of life. It’s a chapter of the founding era that deserves far more attention than it typically gets.

Illustration generated by the author using ChatGPT.

Sources

Oneida Nation — Revolutionary War contributions: https://www.oneida-nsn.gov/our-ways/history/

Treaty of Canandaigua (1794): https://www.onondaganation.org/history/1794-treaty-of-canandaigua/

Stockbridge-Mohican history: https://www.mohican.com/history/

Battle of Oriskany: https://www.nps.gov/orpi/index.htm

Pekka Hämäläinen — Native American roles in the Revolution: https://www.hup.harvard.edu/books/9780674248717

Proclamation of 1763: https://www.britannica.com/event/Proclamation-of-1763

Treaty of Paris (1783) and Native lands: https://avalon.law.yale.edu/18th_century/paris.asp

Six Nations Reserve, Ontario: https://www.sixnations.ca/

The Marble Statue Problem: Why Half the Story Is No Story at All

A Commentary on Selective American History

There is a version of American history that looks spectacular. Founding Fathers on horseback, industrialists building steel empires from nothing, pioneers pushing west into open lands. It is the kind of history that gets carved into marble, hoisted onto pedestals, and taught as national mythology. Clean. Inspiring. Incomplete. And right now, there is a visible push by some politicians, curriculum reformers, and commentators to make that marble-statue version the only version — to scrub away what one American Historical Association report called the “inconvenient” truths that complicate the picture. What we lose in that scrubbing is not just accuracy. We lose the full human story of this country, and with it, the lessons that might be useful today.

The selective telling is not new, but its current form has new energy. In recent years, legislation has been introduced across multiple states to restrict how teachers discuss slavery, Indigenous displacement, immigration history, and the treatment of women and the poor. The argument is usually dressed up as national unity and pride. But the practical effect is something else: a history curriculum where triumph and innovation are permissible but suffering and exploitation are edited out.

Historians surveying American teachers in 2024 found this impulse reflected in the classroom as well — students arriving with what teachers described as a “marble statues” version of history absorbed from earlier grades, one that freezes the Founders and other heroes in idealized civic memory, stripped of contradiction. The pitch is usually framed as morale: kids need pride and self-esteem, not “division.” But the practical effect is a kind of historical editing that turns real people—enslaved Americans, Native communities, women, immigrants, and the poor—into background scenery rather than participants with agency, suffering, and claims on the national memory.

You can see the argument playing out in education policy and curriculum fights. The “patriotic education” push associated with the federal 1776 Commission is a clear example: it cast some approaches to teaching slavery and racism as inherently “anti-American,” and it encouraged a narrative that stresses national ideals while softening the lived realities that contradicted those ideals. 

Historians’ organizations have answered back that this kind of narrowing doesn’t create unity so much as it creates amnesia.  At the state level, controversies over how to describe or contextualize slavery—down to euphemisms and selective framing—keep resurfacing, because controlling the vocabulary controls the moral takeaway.  Florida’s education standards went so far as to compare slavery with job training.

The tension between celebratory and critical history also appears in how we interpret national symbols. The Statue of Liberty, now widely read as a welcoming beacon for immigrants, was originally conceived in significant part as a commemoration of the end of slavery in the United States and of the nation’s centennial. Over time, its antislavery meaning was overshadowed by a more comfortable story about voluntary immigration and opportunity as official imagery and public campaigns recast the statue to fit new national needs. This shift did not merely “add” an interpretation; it obscured the connection between American liberty and Black emancipation, pushing aside the reality that millions arrived in chains rather than by choice.

The deeper problem isn’t that Americans disagree about the past—healthy societies argue about meaning all the time. The problem is when disagreement becomes a one-way ratchet: complexity gets labeled “bias,” and only a feel-good storyline qualifies as “neutral.” That’s not neutral. That’s a choice to privilege certain experiences as representative and treat others as “inconvenient.”

Nowhere does this distortion show up more clearly than in how Americans tend to celebrate the industrialists of the late 19th and early 20th centuries — the Gilded Age titans who built railroads, steel mills, and oil empires. Andrew Carnegie, John D. Rockefeller, J.P. Morgan, Cornelius Vanderbilt: these men are frequently held up as models of American ambition and ingenuity, visionaries who transformed a post-Civil War nation into the world’s dominant industrial power. And they did do that. But the marble-statue version stops there, and stopping there is where the dishonesty begins.

Look at what powered that industrial machine: coal. And look at who powered coal. The men — and children — who went underground every day to dig it out of the earth under conditions that were, by any modern standard, a form of institutionalized violence. Between 1880 and 1923, more than 70,000 coal miners died on the job in the United States. That is not a rounding error; it is a small city’s worth of human lives, consumed by an industry that knew the dangers and chose profits over protection. Cave-ins, gas explosions, machinery accidents, and the slow suffocation of black lung took miners in ones and twos on ordinary days, and in mass casualties during what miners grimly called “explosion season” — when dry winter air made methane and coal dust especially volatile. Three major mine disasters in the first decade of the 1900s killed 201, 362, and 239 miners respectively, the latter two occurring within two weeks of each other.

And those were the adults. In the anthracite coal fields of Pennsylvania alone, an estimated 20,000 boys were working as “breaker boys” in 1880 — children as young as eight years old, perched above chutes and conveyor belts for ten hours a day, six days a week, picking slate and impurities out of rushing coal with bare hands. The coal dust was so thick at times it obscured their view. Photographer Lewis Hine documented these children in the early 1900s specifically because he understood that seeing them — their coal-blackened faces, their missing fingers, their flat eyes — was the only way to make comfortable Americans confront the total cost of the industrial miracle. Pennsylvania passed a law in 1885 banning children under twelve from working in coal breakers. The law was routinely ignored; employers forged age documents and desperate families went along with it because the wages, however meager, kept families from starving.

Coal mining is a representative case study because the work was both essential and punishing, and because the labor conflicts were not metaphorical—they were sometimes literally armed. In the coalfields, many miners lived in company towns where the company controlled the housing and the local economy. Some workers were paid in “scrip” redeemable only at the company store, a system that locked families into dependency and debt. When union organizing surged, the backlash could be violent. West Virginia’s Mine Wars culminated in the Battle of Blair Mountain in 1921—widely described as the largest labor uprising in U.S. history—where thousands of miners confronted company-aligned forces and state power. The mine owners deployed heavy machine guns and hired private pilots to drop aerial bombs on the miners.

If you zoom out, this pattern wasn’t limited to coal. The Triangle Shirtwaist Factory fire in 1911 became infamous partly because locked doors and poor safety practices trapped workers—mostly young immigrant women—leading to 146 deaths in minutes. 

When workers tried to organize for better pay and safer conditions, the response from the industrialists and their allies was not negotiation. It was force. Henry Clay Frick, chairman at Carnegie Steel, cut worker wages in half while increasing shifts to twelve hours, then hired the Pinkerton Detective Agency — effectively a private army — to break the strike that followed at Homestead, PA in 1892. During the Great Railroad Strike of 1877, when workers walked off the job across the country, state militias were called in. In Maryland, militia fired into a crowd of strikers, killing eleven. In Pittsburgh, twenty more were killed with bayonets and rifle fire. A railroad executive of the era, asked about hungry striking workers, reportedly suggested they be given “a rifle diet for a few days” to see how they liked it. Throughout this period the federal government largely sided with capital against labor.

This is the part of the story that the marble-statue version leaves out — and not because it is marginal. The labor movement that emerged from these battles shaped virtually every protection American workers have today: the eight-hour workday, child labor laws, workplace safety regulations, the right to organize. These were not gifts handed down by generous industrialists. They were won through strikes, suffering, and in some cases, death. Ignoring that history does not honor the industrialists. It dishonors the workers.

The same pattern runs through every thread of American history that is currently under pressure. The story of westward expansion is incomplete without the story of Native displacement and the deliberate destruction of Indigenous cultures. The story of American agriculture is incomplete without the story of enslaved labor and the systems of racial control that followed emancipation. The story of American prosperity is incomplete without the story of immigrant communities channeled into the most dangerous, lowest-paid work and then told to be grateful for the opportunity. Women’s history, for most of American history, was not considered history at all. In each case, leaving out the difficult chapter does not produce a cleaner story. It produces a false one.

The argument for the marble-statue version is usually that complexity is demoralizing — that children need heroes, that citizens need pride, that a nation cannot function if it is constantly relitigating its worst moments. There is something in that concern worth taking seriously. History taught purely as a catalog of grievances is not good history either. But the answer to that problem is not to swap one distortion for another. Good history holds both: the genuine achievement and the genuine cost. Mark Twain understood this when he coined “The Gilded Age” — a title that means literally covered in a thin layer of gold over something much cheaper underneath. That phrase has been in the American vocabulary for 150 years because it captures something true about how surfaces can deceive.

A country that cannot look honestly at its own history is a country that will keep repeating the parts it refuses to examine. The enslaved deserve to be in the story. Indigenous people deserve to be in the story. Women deserve to be in the story. The breaker boys deserve to be in the story. The miners killed by the thousands deserve to be in the story. The workers shot by militias while asking for a living wage deserve to be in the story. Not because the story should only be about suffering, but because they were there — and because understanding what they faced, and what they fought for, and what they eventually changed, is how the story makes sense.

Illustration generated by author using ChatGPT.

Sources

American Historical Association. “American Lesson Plan: Curricular Content.” 2024.
https://www.historians.org/teaching-learning/k-12-education/american-lesson-plan/curricular-content/

Brewminate. “Replaceable Lives and Labor Abuse in the Gilded Age: Labor Exploitation and the Human Cost in America’s Gilded Age.” 2026.
https://brewminate.com/replaceable-lives-and-labor-abuse-in-the-gilded-age/

Bureau of Labor Statistics. “History of Child Labor in the United States, Part 1.” 2017.
https://www.bls.gov/opub/mlr/2017/article/history-of-child-labor-in-the-united-states-part-1.htm

Energy History Project, Yale University. “Coal Mining and Labor Conflict.”
https://energyhistory.yale.edu/coal-mining-and-labor-conflict/

Hannah-Jones, Nikole, et al. “A Brief History of Slavery That You Didn’t Learn in School.” New York Times Magazine. 2019.
https://www.nytimes.com/interactive/2019/08/14/magazine/slavery-capitalism.html

Investopedia. “The Gilded Age Explained: An Era of Wealth and Inequality.” 2025.
https://www.investopedia.com/terms/g/gilded-age.asp

MLPP Pressbooks. “Gilded Age Labor Conflict.”
https://mlpp.pressbooks.pub/ushistory2/chapter/chapter-1/

Princeton School of Public and International Affairs. “Princeton SPIA Faculty Reflect on America’s Past as 250th Anniversary Approaches.” 2026.
https://spia.princeton.edu/

USA Today. “Millions of Native People Were Enslaved in the Americas. Their Story Is Rarely Told.” 2025.
https://www.usatoday.com/

Wikipedia. “Breaker Boy.”
https://en.wikipedia.org/wiki/Breaker_boy

Wikipedia. “Robber Baron (Industrialist).”
https://en.wikipedia.org/wiki/Robber_baron_(industrialist)

America250 (U.S. Semiquincentennial Commission). “America250: The United States Semiquincentennial.”
https://www.america250.org/

Bunk History (citing Washington Post reporting). “The Statue of Liberty Was Created to Celebrate Freed Slaves, Not Immigrants.”
https://www.bunkhistory.org/

Upworthy. “The Statue of Liberty Is a Symbol of Welcoming Immigrants. That’s Not What She Was Originally Meant to Be.” 2026.
https://www.upworthy.com/

Henry Knox vs Joseph Plumb Martin: A Case Study in Officer Privilege After the Revolution

Last week I looked at how poorly Revolutionary War veterans were treated in general. This week I’d like to take a look at a specific example — the contrast between how generals like Henry Knox and common soldiers like Joseph Plumb Martin fared after the Revolutionary War. It perfectly illustrates the class divide I discussed in my previous post. These two men served in the same army, helped win the same independence, and endured the same war’s hardships — though Martin’s were by far the greater. Their post-war experiences couldn’t have been more different—and in a bitter twist, Knox’s prosperity came partly at Martin’s expense.
Knox’s Golden Parachute
Henry Knox entered the war as a Boston bookseller of modest means whose military knowledge was gained from reading rather than formal training. He rose to become Washington’s chief of artillery and a major general. When the war ended, Knox received benefits that set him up for life—or should have.
As an officer who served until the war’s end, Knox received the 1783 commutation payment: five years’ full pay in the form of government securities bearing six percent annual interest. This came after Knox himself helped lead the officer corps in pressuring Congress for payment during the near-mutiny known as the Newburgh Conspiracy in early 1783. In total, 2,480 officers received these commutation certificates.
But Knox’s real windfall came from his marriage and his government connections. His wife Lucy came from a wealthy Loyalist family—her grandfather was Brigadier General Samuel Waldo, who’d gained control of a massive land patent in Maine in the 1730s. When Lucy’s family fled to England, she became the sole heir to approximately 576,000 acres known as the Waldo Patent.
Knox used his position as the first Secretary of War (earning $3,000 annually in 1793) and his wartime connections to expand his land holdings and business ventures. He was able to ensure that his wife’s family lands were passed to her, rather than being seized by the government, as the holdings of many Loyalists were. Knox was firmly positioned on the creditor side of the equation, and his political connections helped shield him from the harsh economic reality faced by common soldiers.
He also acquired additional property in the Ohio Valley and engaged in extensive land speculation. He ran multiple businesses: timber operations, shipbuilding, brick-making, quarrying, and extensive real estate development.
After retiring from government in 1795, he built Montpelier, a magnificent three-story mansion in Thomaston, Maine, described as having “beauty, symmetry and magnificence” unequaled in Massachusetts. (My wife and I visited a reconstruction of his mansion this past summer and I can personally testify as to how elaborate a home it was.)
Martin’s Broken Promises
Joseph Plumb Martin’s story typifies the experience of the roughly 80,000-90,000 common soldiers who did most of the fighting. Martin enlisted at age 15 in 1776 and served seven years—fighting at Brooklyn, White Plains, Monmouth, surviving Valley Forge, and digging trenches at Yorktown. He rose from private to sergeant.
When Martin mustered out, he received certificates of indebtedness instead of actual pay—IOUs that depreciated rapidly. Unlike Knox, enlisted men received no pension, no commutation payment, nothing beyond those nearly worthless certificates. Martin, like many veterans, sold his certificates to speculators at a fraction of their face value just to survive.
After teaching briefly in New York, Martin settled in Maine in the early 1790s.  Based on the promise of a land bounty from Massachusetts, Martin and other “Liberty Men” each claimed 100 acres in Maine, assuming that Loyalist lands would be confiscated and sold cheaply to the current occupants or, perhaps, even treated as vacant lands they could secure by clearing and improving.
Martin married Lucy Clewley in 1794 and started farming. He’d fought for independence and now just wanted to build a modest life in the belief that the country he had fought for would stand by its promises.
When Former Comrades Became Adversaries
Here’s where the story takes a dark turn. In 1794, Henry Knox—Martin’s former commanding general—asserted legal ownership of Martin’s 100-acre farm. Knox claimed the land was part of the Waldo Patent. Martin and other settlers argued they had the right to farm the land they’d improved, especially as it should be payment for their Revolutionary service.
The dispute dragged on for years, with some veterans even forming a guerrilla group called the “White Indians” who attacked Knox’s surveyors. But Knox had wealth, lawyers, and political connections. In 1797, the legal system upheld Knox’s claim. Martin’s farm was appraised at $170—payable over six years in installments.
To put that in perspective, when Martin finally received a pension in 1818—twenty-one years later—it paid only $96 per year. And to get even that meager pension, Martin had to prove he was destitute. The $170 Knox demanded represented nearly two years of the pension Martin wouldn’t receive for another two decades.
Martin begged Knox to let him keep the land. There’s no evidence Knox even acknowledged his letters. By 1811, Martin had lost more than half his farm. By 1818, when he appeared before the Massachusetts General Court with other veterans seeking their long-promised pensions, he owned nothing.
The Irony of “Fair Treatment”
Knox claimed he treated settlers on his Maine lands fairly, though he used intermediaries to evict those who couldn’t pay rent or whom he considered to be squatters. The settlers disagreed so strenuously that they once threatened to burn Montpelier to the ground.
The situation’s bitter irony is hard to overstate. Knox had been one of the officers who organized the Society of the Cincinnati in 1783, ostensibly to support widows and orphans of Revolutionary War officers. He’d helped lead the push for officer commutation payments by threatening Congress during the Newburgh affair. Yet when common soldiers like Martin—men who’d literally dug the trenches that won the siege at Yorktown—needed help, Knox showed no mercy.
The Numbers Tell the Story
Let’s compare their situations side by side:
Henry Knox:
• Officer commutation: Five years’ full pay in securities with 6% interest
• Secretary of War salary: $3,000 per year (1793)
• Land holdings: 576,000+ acres in Maine, plus Ohio Valley properties
• Housing: Three-story mansion with extensive outbuildings
• Businesses: Multiple ventures in timber, ships, bricks, quarrying, real estate
• Death: 1806, in debt from failed business ventures but having lived in luxury
Joseph Plumb Martin:
• Enlisted pay: Mostly unpaid certificates sold at a loss to speculators
• Pension: None until 1818, then $96 per year (had to be destitute to qualify)
• Land holdings: Started with 100 acres, lost almost all of it to Knox by 1818
• Housing: Small farmhouse, struggling to farm 8 of his original 100 acres
• Income: Subsistence farming, served as town clerk for modest pay
• Death: 1850 at age 89, having struggled financially his entire post-war life
A Memoir Born of Frustration
In 1830, at age 70, Martin published his memoir anonymously. The full title captured his experience: “A Narrative of Some of the Adventures, Dangers, and Sufferings of a Revolutionary Soldier.” He published it partly to support other veterans fighting for their promised benefits and possibly hoping to earn some money from sales.
The book didn’t sell. It essentially disappeared until a first edition was rediscovered in the 1950s and republished in 1962. Today it’s considered one of the most valuable primary sources we have for understanding what common soldiers experienced during the Revolution. Historians praise it precisely because it’s not written by someone like Washington, Knox, or Greene—it’s the voice of a regular soldier.
When Martin died in 1850, a passing platoon of U.S. Light Infantry stopped at his house and fired a salute to honor the Revolutionary War hero. But that gesture of respect came long after the country should have helped Martin when he needed it.
The Broader Pattern
Knox wasn’t unusual among officers, nor was Martin unusual among enlisted men. This was the pattern: officers with education, connections, and capital leveraged their wartime service into political positions, land grants, and business opportunities. Common soldiers received promises, waited decades for minimal pensions, and often lost what little property they had to the very elites who’d commanded them.
It’s worth noting that Knox’s business ventures eventually failed. He died in debt in 1806, having borrowed extensively to fund his speculations. His widow Lucy had to gradually sell off land to survive. But Knox still lived eleven years in a mansion, engaged in enterprises of his choosing, and died surrounded by family on his comfortable estate. Martin outlived him by forty-four years, spending most of them in poverty.
The story of Knox and Martin isn’t one of villainy versus heroism. Knox was a capable general who genuinely contributed to winning independence. Martin was a dedicated soldier who did the same. But the system they operated within distributed the benefits of that shared victory in profoundly unequal ways, and Knox—whether intentionally or not—used that system to take what little the soldiers who’d fought under his command still had. This was not corruption in the modern sense; it was the predictable outcome of a system that rewarded status, education, and proximity to power. Knox’s experience illustrates a broader truth of the post-Revolutionary period: independence redistributed political sovereignty, but economic security flowed upward, not downward.
When we talk about how Continental Army veterans were treated, this is what it looked like on the ground: the officer who led the charge for officer pensions living in a mansion on 600,000 acres, while the sergeant who dug the trenches at Yorktown lost his 100-acre farm and had to prove he was destitute to get $96 a year, decades too late to matter. This will always be a black mark on American history.
 
Illustrations generated by author using ChatGPT.

Personal note: I spent 12 years on active duty, both as an officer and an enlisted man. I’m proud of my service and I’m proud of the people who have served our country. I do not write this in order to condemn our history. I write it in order to make us aware that we need to always support the common people who contribute vitally to our national success and are seldom recognized.

Sources
Martin, Joseph Plumb. “A Narrative of a Revolutionary Soldier: Some of the Adventures, Dangers and Sufferings of Joseph Plumb Martin”
Originally published anonymously in 1830 at Hallowell, Maine as “A narrative of some of the adventures, dangers, and sufferings of a Revolutionary soldier, interspersed with anecdotes of incidents that occurred within his own observation.” The memoir fell into obscurity until a first edition copy was discovered in the 1950s and donated to Morristown National Historical Park. Republished by Little, Brown in 1962 under the title “Private Yankee Doodle” (edited by George F. Scheer). Current edition published 2001. This firsthand account by a Continental Army private who served seven years provides invaluable insight into the common soldier’s experience during the war and the struggles veterans faced afterward, including Martin’s own land dispute with Henry Knox.  I highly recommend this book to anyone with an interest in ordinary people and their role in history.
 
American Battlefield Trust – The Newburgh Conspiracy
https://www.battlefields.org/learn/articles/newburgh-conspiracy
 
Maine Memory Network – Henry Knox: Land Dealings
https://thomaston.mainememory.net/page/735/display.html
 
World History Encyclopedia – Henry Knox
https://www.worldhistory.org/Henry_Knox/
 
Maine: An Encyclopedia – Knox, Henry
https://maineanencyclopedia.com/knox-henry/
 
American Battlefield Trust – Joseph Plumb Martin: Voice of the Common American Soldier
https://www.battlefields.org/learn/articles/joseph-plumb-martin
 
Wikipedia – Joseph Plumb Martin
https://en.wikipedia.org/wiki/Joseph_Plumb_Martin
 
Note on Additional Context: While these were the primary sources directly used in this article, the discussion also drew on information from my earlier Revolutionary War veterans article about the general treatment of enlisted soldiers, pension systems, and the class disparities in how benefits were distributed after the war.
