If you’ve ever winced taking your first steps out of bed in the morning, you may have already made an involuntary acquaintance with heel spurs — or more precisely, with the condition that often travels with them. The term itself sounds alarming, and for a brief but colorful stretch of American political history, it became something far more charged than a footnote to podiatry. But before we get to the politics, it’s worth understanding what a heel spur actually is, because the medical reality is both more mundane and more complicated than the caricature.
What Exactly Is a Heel Spur?
A heel spur is a small bony outgrowth — technically called a calcaneal spur — that extends from the underside of the heel bone (the calcaneus). It forms at the spot where the plantar fascia — the thick ligament running the length of your foot from heel to toe — attaches to the heel bone. The spur is not, despite what the name implies, a sharp spike. It is typically smooth and rounded, though it can still cause irritation if it presses into surrounding soft tissue.
Heel spurs affect about 10% of the population, making them one of the more common foot conditions around, though most people who have one don’t know it. The spur develops gradually — usually over months or even years — as the body deposits calcium in response to chronic stress at that heel attachment point. Think of it less as damage and more as your skeleton’s attempt at reinforcement.
What Causes Them?
The underlying driver is repetitive mechanical stress on the foot. Heel spurs are particularly associated with strains on foot muscles and ligaments, stretching of the plantar fascia, and repeated small tears in the membrane covering the heel bone. Athletes who do a lot of running and jumping are especially prone.
But you don’t need to be an elite runner to develop one. Walking gait problems — particularly overpronation, where the foot rolls inward — place uneven stress on the heel with each step. Worn-out or poorly fitted shoes, which fail to absorb shock or support the arch, compound the problem. Obesity increases the mechanical load on the heel. Occupations that require prolonged standing or walking on hard surfaces put the plantar fascia under constant tension. And as people age, tendons and ligaments lose their elasticity, making the tissues more vulnerable to micro-tears and the subsequent bony repair response.
Heel spurs are also closely connected to a condition most people have heard of: plantar fasciitis. The two are related but not identical. Plantar fasciitis is inflammation of the plantar fascia itself, usually from overuse. A heel spur can develop as a downstream consequence of that inflammation — the body lays down extra bone in response to the ongoing stress at the fascia’s attachment point.
Symptoms — or the Lack Thereof
Here’s the part that surprises most people: the majority of heel spurs cause no symptoms at all, and many are discovered incidentally on X-rays taken for other reasons. Only about 5% of heel spurs are estimated to be symptomatic.
When a heel spur does produce symptoms, the experience is heavily intertwined with plantar fasciitis. The classic description is a sharp, stabbing pain on the bottom of the foot first thing in the morning, or after any prolonged rest. Many people compare it to stepping on a tack. Paradoxically, this pain often eases somewhat after walking around for a few minutes, only to return after extended time on the feet or after another rest. It’s that “worse in the morning” quality that tends to be the giveaway.
Other symptoms, when present, can include localized swelling, warmth, and tenderness along the front of the heel, as well as increased sensitivity on the underside of the foot. It’s worth noting that the pain associated with a heel spur is not generally thought to come from the bony spur itself, but from the irritation it causes in the surrounding soft tissue — tendons, ligaments, and bursae.
How Is It Diagnosed?
Diagnosis typically begins with a physical exam. Your doctor or podiatrist will ask about when the pain started, what activities preceded it, and what makes it better or worse. They’ll examine your foot for tenderness at specific points, assess your range of motion, check foot alignment, and press on key areas to locate the source of pain.
Imaging confirms the picture. An X-ray can clearly show the bony spur and is the most commonly used test. That said, the size of the spur on an X-ray doesn’t necessarily correspond to how much pain a patient is experiencing — a small spur can be quite painful while a large one may cause no trouble at all. In more complex cases, an MRI may be ordered to assess the soft tissues more closely and evaluate whether plantar fasciitis or another condition is also in play.
Treatment Options
The reassuring news is that the vast majority of cases resolve without surgery. More than 90% of patients improve with nonsurgical treatment. The catch is that conservative management requires patience — improvement typically takes weeks, and more stubborn cases can take months.
The cornerstone of treatment is rest and reducing the activities that provoke pain. This doesn’t necessarily mean completely stopping exercise; low-impact alternatives like swimming, cycling, or rowing allow you to stay active while giving the heel a break from impact. Icing the bottom of the foot after activity helps manage inflammation. Over-the-counter anti-inflammatory medications like ibuprofen or naproxen can provide relief, though they’re intended for short-term use.
Footwear matters enormously. Supportive shoes with good arch support, cushioning, and a slight heel rise reduce the strain on the plantar fascia. Custom orthotics with molded insoles designed to redistribute pressure across the foot are often recommended, particularly for people with gait abnormalities or flat feet. Physical therapy can be part of the treatment plan, focusing on stretching the calf muscles and plantar fascia, strengthening the foot’s intrinsic muscles, and correcting biomechanical issues.
For cases that don’t respond to these initial measures, the next tier of treatment includes corticosteroid injections to reduce inflammation at the spur site, and extracorporeal shockwave therapy — a non-invasive procedure that uses sound waves to stimulate healing in chronically inflamed tissue. Surgery is reserved for the minority of cases where conservative treatment fails after nine to twelve months. Possible complications include nerve pain, infection, scarring, and — with plantar fascia release — the risk of foot instability or stress fracture. Most orthopedic surgeons regard surgery as a last resort.
Are Heel Spurs Debilitating?
For most people, the honest answer is: no. Heel spurs are a common condition with a favorable prognosis, especially with early diagnosis and appropriate management. Many people live with heel spurs for years without ever knowing it, and even those who develop pain typically find substantial relief with conservative treatment within four to eight weeks. That said, the pain at its worst — particularly in conjunction with plantar fasciitis — can be genuinely disruptive to daily life. Athletes may find their training significantly limited. People who spend long hours on their feet at work may struggle with sustained discomfort. And a small percentage of patients do end up with prolonged, treatment-resistant pain that affects mobility. So, the more accurate framing might be: heel spurs have the potential to be significantly uncomfortable and functionally limiting during flare-ups, but with proper treatment most people recover well and return to normal activity.
Heel Spurs and the Vietnam-Era Draft
Which brings us to an improbable chapter in heel spur history. During the Vietnam War era, heel spurs became — for at least one famous case — a ticket out of military service. Understanding how that worked requires a brief detour into the draft system of the 1960s and 1970s, and what it meant to receive a medical deferment.
According to the National Archives, of the roughly 27 million American men eligible for military service between 1964 and 1973, about 15 million were granted deferments — mostly for education, and some for mental or physical problems — while only 2,215,000 were actually drafted into service; another eight million volunteered. Some of those who later served had previously held deferments. The system was sprawling, complex, and — as was widely acknowledged even at the time — deeply unequal.
Roughly 60% of draft-eligible American men took some sort of action to avoid military conscription. There were many routes: college deferments, fatherhood, conscientious objector status (granted to some 170,000 men), National Guard enlistment, and medical exemptions. Medical deferments covered a wide range of conditions — from serious chronic illness to conditions that, in a different context, most people would consider minor. Flat feet, poor eyesight, asthma, and yes, bone spurs all appeared on the list of potentially disqualifying ailments.
The system was known to favor men with access to money, education, and well-connected physicians. American forces in Vietnam were 55% working-class and 25% poor — drawn disproportionately from those who didn’t have the means to navigate the deferment labyrinth. A working-class kid from rural West Virginia was far more likely to end up in the Mekong Delta than the son of a New York real estate developer.
The Most Famous Heel Spur in American History
Which leads, inevitably, to Donald Trump. As confirmed by Selective Service records obtained and reported by multiple news outlets, Trump received five Vietnam-era draft deferments — four for college attendance at Fordham and the Wharton School, and a fifth in 1968, recorded as a medical deferment for bone spurs in his heels. The medical classification left him disqualified for military service.
The circumstances surrounding the diagnosis have been contested ever since. Reporting by the New York Times included accounts from the daughters of a Queens podiatrist named Larry Braunstein, who alleged that their father had provided or vouched for the diagnosis as a professional favor to Trump’s father, Fred Trump — a landlord to whom Braunstein reportedly owed a debt of gratitude. Trump’s former lawyer Michael Cohen also testified that Trump had admitted to fabricating the injury. Trump himself has maintained that the diagnosis was legitimate, stating that a doctor “gave me a letter — a very strong letter — on the heels.” The underlying medical records that would resolve the dispute are, conveniently, not publicly available; most individual Selective Service medical records from that era were subsequently destroyed.
It’s worth noting that Trump’s pattern — using legal channels, including a medical deferment of questionable validity, to avoid Vietnam service — was not unique to him. Historians have pointed out that numerous prominent figures on both sides of the political aisle received deferments of various kinds, including Joe Biden (asthma), Dick Cheney (student deferments), Bill Clinton (navigated the ROTC system), and George W. Bush (National Guard). The heel spur episode became politically charged in part because of Trump’s later hawkish rhetoric and his outspokenness in questioning the military service of others — most notably Senator John McCain, who spent years as a prisoner of war in North Vietnam.
How Many People Got Heel Spur Deferments?
This is where the historical record hits a hard wall. No reliable statistics exist specifically for heel spur deferments. The Selective Service tracked broad categories — student deferments, hardship deferments, conscientious objector status, medical disqualifications — but it did not publish a breakdown by specific diagnosis, and most individual medical records from that era no longer exist.
What we can say is that bone spurs were a recognized medical disqualifier under Selective Service regulations, that medical deferments broadly were a commonly used — and commonly abused — avenue for avoiding service, and that the process was heavily influenced by access to sympathetic physicians. A man with means, connections, and a cooperative podiatrist had options that a man without those resources did not.
The honest answer, then, is that we don’t know how many men received deferments citing heel spurs specifically, and we almost certainly never will. The data either wasn’t tracked at that level of granularity or was long since destroyed. What we do know is that the condition became, for a time, a lens through which Americans examined something much larger: who serves, who doesn’t, and whether the systems meant to govern those decisions are applied fairly.
For most people, a heel spur is a manageable, if annoying, footnote in the story of their health. For at least one person, it became a footnote in the history of American politics.
Personal Note: I have heel spurs; I wish I’d known about them in 1967. Images generated by author using AI.
Medical Sources
Cleveland Clinic — Heel Spurs overview: https://my.clevelandclinic.org/health/diseases/21965-heel-spurs
WebMD — Heel Spur Causes, Symptoms, Treatments, and Surgery: https://www.webmd.com/pain-management/heel-spurs-pain-causes-symptoms-treatments
Hackensack Meridian Health — Bone Spurs in the Heel: Symptoms and Recovery: https://www.hackensackmeridianhealth.org/en/healthier-you/2024/01/02/bone-spurs-in-the-heel-symptoms-and-recovery
OrthoArkansas — Heel Spurs: https://www.orthoarkansas.com/heel-spurs-orthoarkansas/
EmergeOrtho — Heel Bone Spurs: Causes, Symptoms, Treatment: https://emergeortho.com/news/heel-bone-spurs/
American Academy of Orthopaedic Surgeons — Plantar Fasciitis and Bone Spurs: https://orthoinfo.aaos.org/en/diseases–conditions/plantar-fasciitis-and-bone-spurs/
Vietnam Draft & Military Service Sources
History.com — 7 Ways Americans Avoided the Draft During the Vietnam War: https://www.history.com/articles/vietnam-war-draft-avoiding
Wikipedia — Draft Evasion in the Vietnam War: https://en.wikipedia.org/wiki/Draft_evasion_in_the_Vietnam_War
Wikipedia — Conscription in the United States: https://en.wikipedia.org/wiki/Conscription_in_the_United_States
Students of History — The Draft and the Vietnam War: https://www.studentsofhistory.com/vietnam-war-draft
University of Michigan — The Military Draft During the Vietnam War: https://michiganintheworld.history.lsa.umich.edu/antivietnamwar/exhibits/show/exhibit/draft_protests/the-military-draft-during-the-
Vietnam Veterans of America Chapter 310 — Vietnam War Statistics: https://www.vva310.org/vietnam-war-statistics
Vietnam Veterans of Foreign Wars — Fact vs. Fiction: The Vietnam Veteran: https://www.vvof.org/factsvnv.htm
New York City Vietnam Veterans Plaza — Interesting Facts About Vietnam: https://www.vietnamveteransplaza.com/interesting-facts-about-vietnam/
Medical Disclaimer The information provided in this article is intended for general educational and informational purposes only and does not constitute medical advice. It should not be used as a substitute for professional medical advice, diagnosis, or treatment. Always seek the guidance of a qualified healthcare provider with any questions you may have regarding a medical condition or treatment. Never disregard professional medical advice or delay seeking it because of something you have read here. If you are experiencing a medical emergency, call 911 or your local emergency number immediately. The author of this article is a licensed physician, but the views expressed here are solely those of the author and do not represent the official position of any hospital, health system, or medical organization with which the author may be affiliated.
Ask most Americans what the Statue of Liberty means and they’ll say the same thing: she welcomes immigrants. She is the “Mother of Exiles,” keeper of the golden door. That image is so deeply woven into the national identity that it has been quoted, protested, and debated for more than a century. The only problem is that it wasn’t what her creators originally intended.
The story begins in 1865, just weeks after the Civil War ended, at a dinner party near Versailles. The host was Édouard de Laboulaye, a French historian and president of the French Anti-Slavery Society. Laboulaye was one of France’s most passionate admirers of American democracy and was deeply moved by both Lincoln’s assassination and the abolition of slavery. He proposed that France present the United States with a colossal monument — one that would celebrate two things at once: the centennial of American independence and the end of slavery.
Sculptor Frédéric-Auguste Bartholdi was at that dinner and took the idea and ran with it. Early models from around 1870 show Lady Liberty with her right arm raised — familiar enough — but in her left hand she holds broken shackles, not a tablet. The anti-slavery message was unmistakable.
That symbolism didn’t entirely disappear from the final design — it just got moved. According to the National Park Service, Bartholdi placed a broken chain and shackle at the statue’s feet, hidden beneath the copper drapery. Most visitors never notice it. The left hand now holds a tablet inscribed with the date of the Declaration of Independence.
Why the change? NYU historian Edward Berenson points to the political climate of the 1880s. By the time the statue was dedicated in October 1886 — more than 20 years after Laboulaye’s dinner — Reconstruction had collapsed, Jim Crow was spreading, and the country was trying to paper over sectional wounds by quietly forgetting the war’s racial roots. Nobody at the dedication mentioned slavery. The abolitionist origins were simply buried.
The statue was formally named Liberty Enlightening the World, and its message broadened toward Franco-American friendship and American liberty in general rather than emancipation specifically.
Then came Emma Lazarus, and the second transformation. In 1883, a fundraiser struggling to pay for the statue’s pedestal asked prominent writers to donate works for an auction. Lazarus, a Jewish-American poet, initially declined, as she was then deeply involved in aiding Jewish refugees fleeing widespread and organized violence in Russia. A friend persuaded her that the statue, sitting at the entrance to New York Harbor, would inevitably be seen as a beacon by arriving immigrants. She wrote “The New Colossus” — fourteen lines that reimagined Lady Liberty entirely, as a “Mother of Exiles” beckoning the world’s tired and poor to America’s golden door.
Not like the brazen giant of Greek fame,
With conquering limbs astride from land to land;
Here at our sea-washed, sunset gates shall stand
A mighty woman with a torch, whose flame
Is the imprisoned lightning, and her name
Mother of Exiles. From her beacon-hand
Glows world-wide welcome; her mild eyes command
The air-bridged harbor that twin cities frame.
“Keep, ancient lands, your storied pomp!” cries she
With silent lips. “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!”
Strangely, the poem then vanished. It played no role at the statue’s 1886 dedication. Lazarus died in 1887, before Ellis Island even opened. It wasn’t until 1903 that a friend had the entire poem cast onto a bronze plaque that was mounted inside the pedestal, not engraved on the outside as many believe. By then, millions of immigrants had already sailed past the statue on their way to Ellis Island, and the association with immigration had taken hold in the public imagination.
The immigration meaning deepened through the early 20th century as the government used the statue’s image in campaigns to assimilate immigrant children, many of whom lived in ethnic enclaves in large eastern cities. Meanwhile, the abolitionist symbolism that had inspired the project in the first place faded almost entirely from public memory.
The broken shackles are still there, tucked under her robe, mostly invisible. Lady Liberty has been continuously reinterpreted for 160 years — by abolitionists, by a poet responding to a refugee crisis, by politicians, and by millions of people who looked up at that torch from the deck of a ship. Will Lady Liberty and her words continue to offer welcome, hope, and national pride, or is her meaning once again being rewritten, this time as a symbol of American identity, dominance, and, perhaps, exclusion?
The Statue’s shackles and feet. National Park Service, Statue of Liberty NM, Public Domain
Wikimedia Commons: Elkanah Tisdale (1771-1835) (often falsely attributed to Gilbert Stuart), public domain
Picture this: It’s 1812 in Massachusetts, and Governor Elbridge Gerry has just approved a redistricting plan that creates such a bizarrely shaped legislative district that when a local newspaper editor saw it on a map, he thought it looked like a salamander. The editor sketched wings and claws onto the district, and someone quipped that it looked more like a “Gerry-mander” than a salamander. The term stuck, and more than two centuries later, we’re still dealing with the same problem that inspired that joke.
What Gerrymandering Actually Means
At its core, gerrymandering is the practice of drawing electoral district boundaries to give one political party or group an unfair advantage over its opponents. It’s a form of political manipulation that allows those in power to essentially choose their voters, rather than letting voters choose their representatives. Think of it as a sophisticated form of gaming the system—perfectly legal in many cases, but profoundly anti-democratic in spirit.
The mechanics are surprisingly straightforward. There are two main techniques: “cracking” and “packing.” Cracking involves splitting up concentrations of opposing voters across multiple districts so they can’t form a majority anywhere. Packing does the opposite—cramming as many opposing voters as possible into a few districts so they waste their votes winning by huge margins in just a couple of places, leaving the rest of the districts safely in your column. These techniques can be deployed together to engineer a decisive partisan advantage.
A History of Creative Mapmaking
The practice didn’t start with Gerry, of course. The Founding Fathers—for all their lofty rhetoric about representative democracy—weren’t above putting their thumbs on the electoral scales. But gerrymandering really came into its own in the 20th century, as advances in census data and statistics, combined with newly available computing power, made it possible to draw districts with surgical precision.
The 2010 redistricting cycle marked a watershed moment. Single-party control of the redistricting process gave partisan line drawers free rein to craft some of the most extreme gerrymanders in American history, often down to the level of individual city blocks. Republicans, having won control of many state legislatures in the 2010 midterms, used sophisticated computer modeling to create maps that locked in their advantages for a decade. Democrats did the same where they had the power, though Republicans controlled more state legislatures and thus wielded greater gerrymandering capability overall.
The 2024-2025 Gerrymandering Wars
Here’s where things get really interesting—and deeply concerning. The situation has exploded into what some are calling “gerrymandering wars” following the 2020 census and a critical Supreme Court decision in 2019. In Rucho v. Common Cause, the Supreme Court ruled that partisan gerrymandering constitutes a non-justiciable “political question” where federal court intervention is unsuitable. Translation: the federal courts won’t stop partisan gerrymandering because they claim there’s no objective standard to measure it. This opened the floodgates. The Brennan Center estimates that gerrymandering gave Republicans an advantage of around 16 House seats in the 2024 race to control Congress compared to fair maps. But here’s the kicker: we’re not even done with this decade’s redistricting.
In an unprecedented move, President Donald Trump has pushed Republican state lawmakers to further gerrymander their states’ congressional maps, prompting Democratic state lawmakers to respond in kind. In August 2025, during a special session, Texas’s legislature passed a redistricting plan that weakens electoral opportunities for Black and Hispanic voters. California has threatened to respond with its own gerrymander, creating a tit-for-tat dynamic that could spiral out of control.
North Carolina provides perhaps the most dramatic example. After the state supreme court reversed its position on policing partisan gerrymandering, the Republican-controlled legislature redrew the map, and after the 2024 election, three Democratic districts flipped to Republicans—enough to give control of the U.S. House to the GOP by a slim margin.
The situation has gotten so extreme that both parties are now openly engaging in mid-decade redistricting—something that traditionally only happened after each ten-year census. California, Missouri, North Carolina, Ohio, Texas and Utah have all adopted new congressional maps in 2025, with new maps also appearing possible in Florida, Maryland and Virginia.
The Racial Dimension
It’s crucial to note that gerrymandering comes in two flavors: partisan and racial. While partisan gerrymandering is currently legal thanks to the Supreme Court, racial gerrymandering—drawing districts specifically to dilute the voting power of racial minorities—violates the Voting Rights Act of 1965. The line between the two can get blurry, though, since partisan voting patterns often correlate with race.
In May 2025, a federal court ruled that Alabama’s 2023 congressional map not only violates Section 2 of the Voting Rights Act but was enacted by the Alabama Legislature with racially discriminatory intent. Similar battles are playing out in Louisiana, Mississippi, and other states. The legal landscape here is complex, with courts sometimes walking a tightrope between ensuring fair representation for communities of color and avoiding the creation of what could be challenged as racial gerrymanders.
What Can Be Done About It?
The most popular reform proposal is the creation of independent redistricting commissions—bodies of citizens (not politicians) who draw district maps according to neutral criteria. Currently, several states including Colorado, Michigan, Ohio and Virginia use redistricting commissions to draw congressional and state legislative maps, ranging from political commissions with elected officials to completely independent commissions that bar all elected officials from serving as commissioners.
Do they work? The evidence is mixed but generally positive. According to a Redistricting Report Card published by the Princeton Gerrymandering Project, the states that had some form of commission drew “B+” maps on average, while states where partisans controlled the process drew “D+” maps. California’s independent commission is often held up as the gold standard, though it’s not perfect—even fairly drawn maps can produce lopsided results due to how voters cluster geographically.
Another proposal focuses on clear, enforceable criteria: compactness, contiguity, respect for existing political boundaries, and transparency in the mapping process. Advances in statistical analysis also make it possible to compare proposed maps against thousands of neutral alternatives to detect extreme outliers, a method increasingly discussed in academic and legal circles.
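To make that ensemble idea concrete, here is a deliberately simplified Python sketch. Every number in it is invented, and the precincts have no geography; real analyses generate contiguous, population-balanced plans, typically with Markov chain sampling. What survives in this toy version is the core statistical move: construct a packed-and-cracked plan, compare it against thousands of randomly drawn alternatives, and ask whether it is an extreme outlier.

```python
import random
from statistics import mean

# Toy data only: real ensemble analyses sample geography-aware plans
# (contiguous, population-balanced districts). Here a "district" is just
# a group of precinct indices, so only the statistical logic carries over.

random.seed(42)

NUM_PRECINCTS = 120
NUM_DISTRICTS = 12
SIZE = NUM_PRECINCTS // NUM_DISTRICTS  # precincts per district

# Hypothetical electorate: Party A's vote share in each precinct, centered near 50%.
share_a = [random.gauss(0.50, 0.12) for _ in range(NUM_PRECINCTS)]

def seats_for_a(plan):
    """Seats Party A wins: districts where its average precinct share tops 50%."""
    return sum(1 for district in plan if mean(share_a[p] for p in district) > 0.5)

def neutral_plan():
    """A 'blind' alternative: shuffle precincts and cut them into equal districts."""
    order = random.sample(range(NUM_PRECINCTS), NUM_PRECINCTS)
    return [order[i:i + SIZE] for i in range(0, NUM_PRECINCTS, SIZE)]

def packed_and_cracked_plan(packed=2):
    """Pack Party A's strongest precincts into a few throwaway districts,
    then spread (crack) the rest so A falls just short everywhere else."""
    ranked = sorted(range(NUM_PRECINCTS), key=lambda p: share_a[p], reverse=True)
    packed_districts = [ranked[i:i + SIZE] for i in range(0, packed * SIZE, SIZE)]
    rest = ranked[packed * SIZE:]
    cracked = NUM_DISTRICTS - packed
    cracked_districts = [rest[i::cracked] for i in range(cracked)]
    return packed_districts + cracked_districts

# Build an ensemble of neutral plans and see where the proposed plan falls.
ensemble = [seats_for_a(neutral_plan()) for _ in range(5000)]
proposed = seats_for_a(packed_and_cracked_plan())

outlier_rank = sum(s <= proposed for s in ensemble) / len(ensemble)
print(f"Neutral plans: Party A averages {mean(ensemble):.1f} of {NUM_DISTRICTS} seats")
print(f"Packed-and-cracked plan: Party A wins {proposed} seats")
print(f"Fraction of neutral plans at least as unfavorable to A: {outlier_rank:.4f}")
```

On synthetic data like this, the rigged plan typically leaves the targeted party with far fewer seats than almost any neutral plan does, and that lopsided position in the distribution is the kind of signature outlier analysts look for in real maps.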
Federal legislation has been proposed repeatedly. The Redistricting Reform Act of 2025 would prohibit states from mid-decade redistricting and would require every state to adopt nonpartisan, independent redistricting commissions. Similar provisions were included in the “For the People Act” that Democrats passed in the House in 2021, but it died in the Senate. Getting such legislation through Congress would require bipartisan cooperation, which seems unlikely given that both parties see gerrymandering as a political weapon and believe they can’t afford to disarm unilaterally.
Some reformers advocate for more radical solutions. Proportional representation systems, where the share of votes equals the share of seats, would end boundary-drawing battles altogether and make democracy more representative. Under such a system, if Democrats win 60% of the vote in a state, they’d get roughly 60% of that state’s congressional seats. It’s intuitive and fair, but it would require a fundamental restructuring of American electoral systems—something that’s probably not politically feasible in the near term.
State courts have emerged as a potential backstop. At least 10 state supreme courts have found that state courts can decide cases involving allegations of partisan gerrymandering, even though federal courts won’t touch them. This means that state constitutional provisions against gerrymandering could provide meaningful protection—though as North Carolina demonstrated, state court compositions can change, and with them, their willingness to police gerrymandering.
The Bottom Line
Gerrymandering represents a fundamental tension in American democracy: How do we draw districts fairly when the people drawing them have every incentive to rig the game in their favor? The problem has ancient roots but ultra-modern manifestations, powered by big data and sophisticated computer modeling that would make Elbridge Gerry’s head spin.
The current moment feels particularly precarious. With Republicans’ razor-thin majority in the House and midterm elections traditionally being unfavorable for the party in power, the Republicans’ action amounts to a preemptive move to retain control of Congress. The Democrats’ threatened response could trigger an escalating cycle of partisan map manipulation that further entrenches our political divisions and makes elections less responsive to actual voter preferences.
Independent commissions offer a promising path forward, but they’re not a silver bullet. They work better than partisan control, but they can’t eliminate all the inherent challenges of translating votes into seats through geographic districts. More ambitious reforms like proportional representation could solve the problem more completely, but they face enormous political and practical obstacles.
For now, gerrymandering remains what that newspaper editor saw in 1812: a monstrous distortion of democratic principles, hiding in plain sight on our electoral maps. The question is whether we have the political will to slay the beast, or whether we’ll keep feeding it for another two centuries.
American Constitution Society – America’s Gerrymandering Crisis https://www.acslaw.org/expertforum/americas-gerrymandering-crisis-time-for-a-constructive-redistricting-framework/
ACLU – Court Cases on Gerrymandering https://www.aclu.org/court-cases?issue=gerrymandering
Stateline – State Courts and Gerrymandering https://stateline.org/2025/12/22/as-supreme-court-pulls-back-on-gerrymandering-state-courts-may-decide-fate-of-maps/
Campaign Legal Center – Do Independent Redistricting Commissions Work? https://campaignlegal.org/update/do-independent-redistricting-commissions-really-prevent-gerrymandering-yes-they-do
RepresentUs – End Partisan Gerrymandering https://represent.us/policy-platform/ending-partisan-gerrymandering/
Senator Alex Padilla – Redistricting Reform Act of 2025 https://www.padilla.senate.gov/newsroom/press-releases/watch-padilla-lofgren-introduce-legislation-to-establish-independent-redistricting-commissions-end-mid-decade-redistricting-nationwide/
English is a beautifully messy language—shameless in its borrowing and relentless in its evolution. It resists the tidy logic that might make a grammarian’s life easier, and that resistance is part of what makes its history so compelling. The English we speak today is the product of centuries of invasion, migration, cultural collision, and literary ambition—a language built in layers, like geological strata laid down over time.
To see how English grew from an obscure Germanic dialect into a global lingua franca, it helps to trace three broad phases: Old English, Middle English, and Modern English. Each stage was shaped by different historical forces, from Germanic migration and Viking settlement to the Norman Conquest, the Renaissance, the printing press, and ultimately the worldwide reach of the British Empire and the United States.
Anglo-Saxon Foundations
The story begins on the European mainland. When Roman authority collapsed in Britain in the early fifth century, Germanic-speaking peoples from what is now northern Germany, Denmark, and the Netherlands moved into the island. The Angles, Saxons, and Jutes arrived in waves, bringing closely related West Germanic dialects that gradually developed into Old English, often called Anglo-Saxon.
Old English was thoroughly Germanic in both grammar and vocabulary. It was a highly inflected language: case endings marked whether a noun was subject, object, or possessive, and nouns had grammatical gender. Verbs were conjugated with a complexity that feels foreign to most modern English speakers. Much of the core vocabulary of modern English—words such as water, house, bread, child, earth, life, and death—dates back to this early period and still carries that Germanic stamp.
The language of Beowulf, composed between the eighth and early eleventh centuries, is virtually unreadable today without specialized training. Its famous opening line, “Hwæt! We Gardena in geardagum,” is technically English, but it feels closer to a foreign language. Old English used letters such as þ (thorn) and ð (eth) and relied on grammatical structures that later disappeared.
Nor was Old English a single uniform tongue. It existed as a cluster of regional dialects including Northumbrian, Mercian, Kentish, and West Saxon. Under King Alfred the Great in the late ninth century, Wessex became the leading political power in England and a center of learning. Alfred sponsored translations of important Latin works into Old English, most often in the West Saxon dialect. As a result, most surviving Old English texts come from that dialect, giving us only a partial view of the linguistic diversity of early England.
Latin and Celtic Influences
Even before the Anglo-Saxons arrived in Britain, Latin had begun to influence their speech through contact with the Roman world. Early Latin loanwords include street (from strata), wall (from vallum), and wine (from vinum).
A second wave of Latin influence arrived with the Christianization of England beginning in 597, when Augustine of Canterbury established a mission in Kent. Christianity introduced vocabulary connected with religion, learning, and administration—words such as church, bishop, monk, school, altar, and verse.
By contrast, the Celtic languages spoken by the native Britons left a surprisingly small mark on English vocabulary. Their influence survives most clearly in place names—for example Thames, Avon, and Dover—and in landscape terms such as combe (valley) and tor (rocky hill). Why Celtic languages left relatively few everyday words in English remains one of the lingering puzzles of linguistic history.
Vikings and the Norse Contribution
Beginning in the late eighth century, Scandinavian raiders and settlers—collectively known as Vikings—began attacking and eventually settling parts of England. By the ninth century much of northern and eastern England had become part of the Danelaw, where Old English speakers lived alongside speakers of Old Norse.
Because Old Norse and Old English were closely related Germanic languages, speakers could often roughly understand each other. Over time, however, sustained contact produced deep linguistic blending. English absorbed many Norse-derived words that now feel completely native, including sky, skin, skill, skirt, egg, leg, window, husband, call, take, give, get, want, and die.
Perhaps the most striking Norse contribution lies in the pronouns they, them, and their, which replaced earlier Old English forms. When a language adopts core pronouns from another language, it signals unusually intense and prolonged contact.
Many linguists also believe that contact with Norse speakers helped accelerate the simplification of English grammar. In bilingual communities, speakers often reduce complex inflectional endings that make communication difficult. As a result, English gradually moved away from the elaborate grammatical endings of Old English and toward a system that relied more heavily on word order.
The Norman Transformation
The Norman Conquest of 1066 transformed English more dramatically than any other single event in its history. When William of Normandy defeated King Harold at the Battle of Hastings and became king of England, he brought with him a French-speaking aristocracy.
For several centuries after the conquest, French dominated the language of power—the court, the law, the church hierarchy, and much of government administration. English remained the everyday language of the population but lost prestige in elite circles.
French vocabulary poured into English in areas associated with authority and culture. Law gained terms such as justice, court, judge, jury, prison, crime, and verdict. Government absorbed parliament, sovereign, minister, authority, tax, and treasury. Military language adopted army, navy, soldier, captain, defense, and siege.
Even the language of food reflects this social divide. The animals in the field kept their Old English names—cow, sheep, pig, and deer—while the meat served at noble tables took French names: beef, mutton, pork, and venison.
The Rise of Middle English
Over time, French dominance gradually weakened. The loss of Normandy in 1204 encouraged English nobles to identify more strongly with England itself. Later, the Black Death (1348–1350) reshaped English society by elevating the economic importance of English-speaking laborers and craftsmen.
During the fourteenth century, English returned as the language of all social classes. The language that emerged—Middle English—looked very different from Old English. Most grammatical endings disappeared, grammatical gender vanished, and sentence structure shifted toward the familiar subject-verb-object order.
At the same time, English vocabulary became a rich mixture of Germanic and Romance elements. This layering produced sets of near-synonyms with different levels of formality: ask (Germanic), question (French), and interrogate (Latin).
The most famous literary figure of this period was Geoffrey Chaucer, whose Canterbury Tales demonstrated that English could rival French and Latin as a vehicle for sophisticated literature. Chaucer wrote in the London dialect, which was gaining prominence due to the city’s political and commercial importance. Though not yet standardized, London English gradually became the foundation of later written English.
Printing and the Great Vowel Shift
William Caxton established England’s first printing press in 1476, and this technological revolution had far-reaching consequences for the language. Printing created a need for standardized spelling and grammar, since texts would now be distributed widely rather than copied by hand in local scriptoria. Caxton himself struggled with the problem of dialect variation, complaining about the difficulty of choosing forms that all English readers could understand. Over time, the conventions adopted by London printers became the de facto standard. Additionally, the need to create type for the printing press led to the dropping of the letters þ (thorn) and ð (eth) that were difficult to replicate in lead.
At the same time, English pronunciation underwent a dramatic change known as the Great Vowel Shift, which occurred roughly between 1400 and 1700. Long vowel sounds moved upward in the mouth, transforming the pronunciation of many common words. For example, “name” once sounded closer to nah-muh, while “mouse” sounded more like moose.
The causes of the Great Vowel Shift remain debated—theories range from the social upheaval following the Black Death to the influence of French-accented English—but its effects were enormous. Spellings had been largely fixed by printing before the vowel shift was complete, so written words such as knife and through still reflect pronunciations that no longer exist.
Renaissance Expansion
The English Renaissance of the sixteenth and seventeenth centuries unleashed another flood of new vocabulary, much of it borrowed from Latin and Greek. Scholars and writers introduced thousands of words connected to science, philosophy, and literature, including democracy, encyclopedia, atmosphere, thermometer, criticism, and educate.
Critics derided the new coinages as “inkhorn terms”—pretentious, unnecessary words invented by scholars dipping their quills in inkhorns. Some of the words under attack, like “perpetrate” and “contemplate,” survived; others, like “ingent” (enormous), did not.
Two towering cultural works further shaped English during this era: Shakespeare’s plays and the King James Bible (1611). Shakespeare popularized countless words and expressions—among them assassination, lonely, eventful, and phrases like “break the ice” and “wild goose chase.” The King James Bible, widely read for centuries, left deep marks on English rhythm and idiom.
Dictionaries and Standardization
By the eighteenth century, many writers wanted to standardize and regulate English. The most influential effort was Samuel Johnson’s Dictionary of the English Language (1755), which became the dominant reference work of its era.
In the United States, Noah Webster’s American Dictionary of the English Language (1828) promoted simplified spellings such as color instead of colour and center instead of centre. Webster viewed spelling reform as part of America’s broader cultural independence from Britain.
English Goes Global
From the seventeenth through the early twentieth centuries, the British Empire spread English across the globe. Along the way, the language absorbed vocabulary from many other languages. Hindi contributed words such as jungle and shampoo, Arabic added algebra and alcohol, and Malay gave English bamboo and ketchup.
As English took root in different regions, new varieties emerged—American, Australian, Canadian, Indian, Nigerian, Singaporean, and many others. Linguists today increasingly recognize these as legitimate forms of English rather than deviations from a single standard.
English in the Digital Age
In the twentieth and twenty-first centuries, mass media and digital communication have accelerated linguistic change. Radio, film, television, and the internet spread slang, accents, and new expressions around the world with unprecedented speed.
English continues to absorb new words from science, technology, business, and online culture. Brand names become verbs; internet slang becomes everyday speech. Today more than a billion people speak English as a first or second language, making it the most widely used language in human history.
A Language Still Evolving
The history of English reminds us that language is not a fixed monument but a living system shaped by human interaction. Its vocabulary is like an archaeological site, where almost every common word carries traces of earlier eras.
English has never been “pure,” and attempts to purify it have always failed. Its strength lies in its openness—its ability to borrow, adapt, and reinvent itself. From the heroic poetry of Beowulf to Shakespeare’s theater, from the King James Bible to the language of the internet, English continues to grow through the voices of those who use it.
And if history is any guide, the English spoken a few centuries from now will sound just as surprising to us as Chaucer’s language once did.
Illustration generated by author using ChatGPT.
Sources
Baugh, Albert C. and Thomas Cable. A History of the English Language (6th edition). Routledge, 2012. This remains the standard academic textbook on the subject and covers every period and influence discussed above.
Crystal, David. The Cambridge Encyclopedia of the English Language (3rd edition). Cambridge University Press, 2019. An accessible and richly illustrated reference covering the structure and history of English.
McCrum, Robert, Robert MacNeil, and William Cran. The Story of English (3rd revised edition). Penguin, 2003. A popular history that accompanied the PBS television series, excellent for general readers.
Mugglestone, Lynda (ed.). The Oxford History of English (2nd edition). Oxford University Press, 2012. A collection of essays by specialists covering English from its earliest origins to the present day.
Bede, The Venerable. Ecclesiastical History of the English People. Penguin Classics, 1990 (translated by Leo Sherley-Price). The primary early source on the Anglo-Saxon migrations.
Townend, Matthew. “Contacts and Conflicts: Latin, Norse, and French.” In The Oxford History of English, edited by Lynda Mugglestone, 2012. A detailed treatment of the major external influences on English.
Online resource: The British Library’s “Evolving English” exhibit materials are available at https://www.bl.uk/learning/langlit/evolvingenglish/
Online resource: Durkin, Philip. “Borrowed Words: A History of Loanwords in English.” Oxford University Press, 2014. Summary and excerpts available at https://global.oup.com/academic/product/borrowed-words-9780199574995
How a German hare hopped its way into American Easter tradition
Every Easter morning, children across America hunt for eggs left by a rabbit. It’s a charming ritual—and a deeply strange one, when you stop to think about it. Rabbits don’t lay eggs. They don’t carry baskets. Yet here we are, every spring, maintaining the fiction with great enthusiasm. Where did this tradition come from? The answer turns out to be a lot more interesting than you might expect.
The story starts in Germany. The earliest documented reference to an Easter Hare—called the “Osterhase” in German—appears in 1678, in a medical text by the physician Georg Franck von Franckenau. In the German tradition, the Osterhase was specifically a hare, not a rabbit, and its job was straightforward: deliver colored eggs to well-behaved children. Naughty children got nothing. This moral dimension—gift delivery tied to good behavior—should sound familiar. The Easter Bunny was, in a sense, an early version of Santa Claus.
The tradition crossed the Atlantic in the 1700s, carried by German Protestant immigrants who settled in Pennsylvania. Their children knew the Osterhase (sometimes rendered as “Oschter Haws” in Pennsylvania Dutch dialect) and kept up the custom of leaving out nests—made from caps and bonnets—for the hare to fill with eggs. Over time, the nests became baskets, the simple colored eggs became candy and chocolate, and the moral judgment quietly dropped away. By the 20th century, the Easter Bunny had transformed from selective gift-giver into universal children’s benefactor.
But why eggs at all? Eggs entered the Easter story long before Germany. For ancient Romans, they symbolized new life and fertility, and the custom of giving dyed eggs as spring gifts predates Christianity. The Christian tradition added another layer: during the Lenten fasting period, eggs were a forbidden food. By Easter Sunday, people had a backlog of eggs to use up and good reason to celebrate. They cooked, decorated, and shared them. The emergence from the shell became a visual metaphor for resurrection, and the symbolism stuck.
Rabbits and hares had their own long history as symbols of fertility and springtime. Some writers have linked the Easter Bunny to an ancient Anglo-Saxon goddess named Eostre—from whose name we may get the word “Easter”—and they claim the hare was her sacred animal. It’s a compelling story. It’s also largely unsupported by evidence. The Oxford Dictionary of English Folklore notes that the only historical source mentioning Eostre is the medieval scholar Bede, and Bede says nothing about hares. The goddess-and-hare connection appears to be modern folklore dressed up as ancient tradition.
What is better documented is that hares held symbolic significance across many early cultures. Neolithic burial sites in Europe include hares interred alongside humans, suggesting ritual importance. Hares are conspicuous breeders—they produce multiple litters each year and nest above ground, making their reproductive activity visible in a way that rabbits’ underground burrows do not. For pre-modern peoples marking the return of spring, the hare was a living advertisement for new life.
The combination of egg symbolism and hare symbolism wasn’t a deliberate design decision by any single culture or institution. It was a gradual collision—two powerful images of renewal fusing together over centuries of seasonal celebration. The church absorbed local spring customs rather than eliminating them, allowing pagan associations with fertility and rebirth to persist beneath a Christian overlay. The result is the hybrid tradition we have today.
Today’s Easter Bunny is genuinely a global figure, though not always a rabbit. In Australia, the role is played by the Easter Bilby, an endangered marsupial that conservationists have promoted as a local alternative since the 1990s. Switzerland has an Easter Cuckoo. Parts of Germany have an Easter Fox. Each region adapted the basic concept of a spring gift-bringer to fit its own wildlife and folklore.
The commercial Easter Bunny we know—the chocolate molded figure, the pastel basket, the branded plush toy—is largely a product of the late 19th and 20th centuries, shaped by the same forces that turned Saint Nicholas into Santa Claus. Candy manufacturers, greeting card companies, and department stores found in Easter a spring counterpart to the Christmas retail season, and the Easter Bunny was the obvious mascot.
None of that diminishes what the tradition actually does. The Easter Bunny survived precisely because its meaning kept evolving. It began as a moral enforcer in 17th-century Germany, became a community ritual for immigrant families in Pennsylvania, and eventually became a child’s-eye-view celebration of spring available to secular and religious families alike. The rabbit never needed to make logical sense. It only needed to mark the moment the world turns green again—and every civilization, it seems, finds a way to celebrate that.
Illustration generated by author using ChatGPT.
Sources:
Bede, De Temporum Ratione (8th century) https://sourcebooks.fordham.edu/basis/bede-reckoning.asp
The American Revolution wasn’t just a showdown between colonists and the British Crown. For the more than 80 distinct Native American nations living east of the Mississippi River, the conflict posed an existential threat — one that would reshape their world no matter who won. They faced an agonizing choice: stay neutral in what many viewed as a family dispute within the British Empire, or pick a side and hope that alliance might help preserve their lands and sovereignty.
Most tribes that chose a side supported the British, and their reasoning was sound. The Proclamation of 1763 had attempted to block colonial settlement west of the Appalachians, and Native leaders correctly recognized that an independent America, freed from British constraints, would accelerate land seizures at a terrifying pace. As Mohawk leader Joseph Brant warned in 1775, independence for the colonists would likely mean disaster for indigenous peoples across the continent. History would prove him right.
The Patriots’ Native Allies
Still, several tribes made the difficult calculation to support the Revolutionary cause. The most significant were the Oneida and Tuscarora nations of the Iroquois Confederacy, along with the Stockbridge-Mohican people of Massachusetts and New York. Smaller contingents from the Catawba, Delaware, Maliseet, Pequot, Narragansett, Niantics, and Montauks also fought alongside colonial forces.
The Stockbridge-Mohican had a relatively clear-cut situation: surrounded by colonial settlements in western Massachusetts, neutrality was essentially impossible. They had already developed cultural and trade ties with their English neighbors, and they bet that loyalty might protect their remaining land rights in the new nation. They were among the very first Native people to take up arms, with members serving as minutemen at Lexington and Concord in April 1775 and fighting at Bunker Hill that June.
The Oneida’s decision was more complex. Unlike tribes facing immediate frontier pressure, they had some geographic breathing room. Their choice reflected relationships built with colonial missionaries and traders, but also a calculated gamble: that an American victory might better respect their territorial claims than continued British rule. In 1776, Congress formally authorized General Washington to recruit Stockbridge Indians, and the Oneida soon became crucial assets — not just as fighters, but as scouts who knew the terrain intimately, and as diplomats attempting to keep other tribes neutral.
Combat Contributions
Native Americans who fought for the Patriots contributed far beyond their numbers. Historian Pekka Hämäläinen has argued that proportionally more Indians than New Englanders served in Patriot forces during the war. Their most consequential military moment came at the Battle of Oriskany on August 6, 1777 — one of the bloodiest engagements of the entire conflict.
At least 60 Oneida warriors fought alongside New York militia against a combined British, Loyalist, and Mohawk force. Warrior Han Yerry, his wife Tyonajanegen, and their son all distinguished themselves that day. According to contemporary accounts, Han Yerry killed nine enemy fighters before a bullet disabled his gun hand, forcing him to continue with his tomahawk; Tyonajanegen fought on horseback with pistols throughout the battle. The engagement fractured the Iroquois Confederacy permanently and helped prevent British forces from reinforcing General Burgoyne before the decisive American victory at Saratoga two months later.
Perhaps the Oneida’s most vital — and least celebrated — contribution came during the winter of 1777-78 at Valley Forge. When Washington’s army faced starvation, Oneida Chief Shenandoah dispatched warriors carrying several hundred bushels of white corn. An Oneida woman named Polly Cooper made the 200-mile journey from Fort Stanwix and stayed at Valley Forge, teaching the starving soldiers how to properly cook the corn so it was actually digestible. Washington personally met with Oneida leaders to express his gratitude, presenting each with a wampum belt. It was a quiet act of generosity that may have saved the Continental Army.
The Oneida continued fighting throughout the war — at the Battle of Barren Hill in May 1778, where scouts stayed behind to allow Lafayette’s troops to escape a British trap; at the Battle of Monmouth; and in numerous northern campaigns. Ten Oneida soldiers earned officers’ commissions in the Continental Army, one rising to lieutenant colonel. Some even served as spies, gathering intelligence deep in enemy territory at enormous personal risk.
The Bitter Aftermath
And then came the betrayal. The 1783 Treaty of Paris, which ended the war, contained no Native American representatives and made no provisions whatsoever for protecting indigenous lands or sovereignty. Britain simply handed over all territory east of the Mississippi to the new United States — without consulting a single Native nation — treating indigenous homelands as British property to dispose of at will.
Even the tribes that had fought for the American cause found that wartime promises evaporated in peacetime. The Oneida, whose contributions had been genuinely critical, faced immediate pressure to cede their territories. By 1788, New York State had pressured the Oneida into surrendering approximately 5.5 million acres, leaving them with just 300,000. Between 1785 and 1846, New York forced the Oneida to sign 26 additional treaties, stripping away nearly everything that remained.
In 1794, Congress did formally acknowledge the service of the Oneida, Tuscarora, and Stockbridge with the Treaty of Canandaigua, providing $5,000, a new church, and some mills. But the treaty also required the tribes to relinquish all other claims for compensation — effectively closing the books on their wartime losses. Historians estimate the Oneida lost nearly a third of their population during and immediately after the war through combat casualties, displacement, and the destruction of their villages and food stores. The Stockbridge-Mohican, similarly dispossessed, largely migrated west to present-day Wisconsin by the early 19th century.
The Larger Picture
British-allied tribes fared no better. When Britain ceded its eastern territories, it abandoned all its Native allies without protection or compensation. Joseph Brant’s Mohawk lost nearly all their land, though the British eventually granted Brant’s followers about 810,000 hectares along the Grand River in present-day Ontario — land where the Six Nations Reserve still exists today.
The pattern was consistent across tribes, regardless of which side they chose: the Revolution was a catastrophe for virtually every Native American nation. Those who supported the Patriots made contributions that were real, substantial, and in some cases decisive. The Oneida at Oriskany, the Stockbridge minutemen at Lexington, Polly Cooper at Valley Forge — these weren’t footnotes. They were participants in the founding of a nation that would spend the next century systematically dispossessing them.
The Revolution shattered longstanding indigenous alliances, set precedents for how the new United States would treat Native peoples, and demonstrated that for Native Americans, the choice between British and American sides was ultimately a choice between two different roads to the same devastating destination: the loss of their lands, their sovereignty, and their way of life. It’s a chapter of the founding era that deserves far more attention than it typically gets.
Illustration generated by the author using ChatGPT.
There is a version of American history that looks spectacular. Founding Fathers on horseback, industrialists building steel empires from nothing, pioneers pushing west into open lands. It is the kind of history that gets carved into marble, hoisted onto pedestals, and taught as national mythology. Clean. Inspiring. Incomplete. And right now, there is a visible push by some politicians, curriculum reformers, and commentators to make that marble-statue version the only version — to scrub away what one American Historical Association report called the “inconvenient” truths that complicate the picture. What we lose in that scrubbing is not just accuracy. We lose the full human story of this country, and with it, the lessons that might be useful today.
The selective telling is not new, but its current form has new energy. In recent years, legislation has been introduced across multiple states to restrict how teachers discuss slavery, Indigenous displacement, immigration history, and the treatment of women and the poor. The argument is usually dressed up as national unity and pride. But the practical effect is something else: a history curriculum where triumph and innovation are permissible but suffering and exploitation are edited out.
Historians surveying American teachers in 2024 found this impulse reflected in the classroom as well — students arriving with what teachers described as a “marble statues” version of history absorbed from earlier grades, one that freezes the Founders and other heroes in idealized civic memory, stripped of contradiction. The pitch is usually framed as morale: kids need pride and self-esteem, not “division.” But the result is a kind of historical editing that turns real people—enslaved Americans, Native communities, women, immigrants, and the poor—into background scenery rather than participants with agency, suffering, and claims on the national memory.
You can see the argument playing out in education policy and curriculum fights. The “patriotic education” push associated with the federal 1776 Commission is a clear example: it cast some approaches to teaching slavery and racism as inherently “anti-American,” and it encouraged a narrative that stresses national ideals while softening the lived realities that contradicted those ideals.
Historians’ organizations have answered back that this kind of narrowing doesn’t create unity so much as it creates amnesia. At the state level, controversies over how to describe or contextualize slavery—down to euphemisms and selective framing—keep resurfacing, because controlling the vocabulary controls the moral takeaway. Florida’s education standards went so far as to compare slavery with job training.
The tension between celebratory and critical history also appears in how we interpret national symbols. The Statue of Liberty, now widely read as a welcoming beacon for immigrants, was originally conceived in significant part as a commemoration of the end of slavery in the United States and of the nation’s centennial. Over time, its antislavery meaning was overshadowed by a more comfortable story about voluntary immigration and opportunity as official imagery and public campaigns recast the statue to fit new national needs. This shift did not merely “add” an interpretation; it obscured the connection between American liberty and Black emancipation, pushing aside the reality that millions arrived in chains rather than by choice.
The deeper problem isn’t that Americans disagree about the past—healthy societies argue about meaning all the time. The problem is when disagreement becomes a one-way ratchet: complexity gets labeled “bias,” and only a feel-good storyline qualifies as “neutral.” That’s not neutral. That’s a choice to privilege certain experiences as representative and treat others as “inconvenient.”
Nowhere does this distortion show up more clearly than in how Americans tend to celebrate the industrialists of the late 19th and early 20th centuries — the Gilded Age titans who built railroads, steel mills, and oil empires. Andrew Carnegie, John D. Rockefeller, J.P. Morgan, Cornelius Vanderbilt: these men are frequently held up as models of American ambition and ingenuity, visionaries who transformed a post-Civil War nation into the world’s dominant industrial power. And they did do that. But the marble-statue version stops there, and stopping there is where the dishonesty begins.
Look at what powered that industrial machine: coal. And look at who powered coal. The men — and children — who went underground every day to dig it out of the earth under conditions that were, by any modern standard, a form of institutionalized violence. Between 1880 and 1923, more than 70,000 coal miners died on the job in the United States. That is not a rounding error; it is a small city’s worth of human lives, consumed by an industry that knew the dangers and chose profits over protection. Cave-ins, gas explosions, machinery accidents, and the slow suffocation of black lung took miners in ones and twos on ordinary days, and in mass casualties during what miners grimly called “explosion season” — when dry winter air made methane and coal dust especially volatile. Three major mine disasters in the first decade of the 1900s killed 201, 362, and 239 miners respectively, the latter two occurring within two weeks of each other.
And those were the adults. In the anthracite coal fields of Pennsylvania alone, an estimated 20,000 boys were working as “breaker boys” in 1880 — children as young as eight years old, perched above chutes and conveyor belts for ten hours a day, six days a week, picking slate and impurities out of rushing coal with bare hands. The coal dust was so thick at times it obscured their view. Photographer Lewis Hine documented these children in the early 1900s specifically because he understood that seeing them — their coal-blackened faces, their missing fingers, their flat eyes — was the only way to make comfortable Americans confront the total cost of the industrial miracle. Pennsylvania passed a law in 1885 banning children under twelve from working in coal breakers. The law was routinely ignored; employers forged age documents and desperate families went along with it because the wages, however meager, kept families from starving.
Coal mining is a representative case study because the work was both essential and punishing, and because the labor conflicts were not metaphorical—they were sometimes literally armed. In the coalfields, many miners lived in company towns where the company controlled the housing and the local economy. Some workers were paid in “scrip” redeemable only at the company store, a system that locked families into dependency and debt. When union organizing surged, the backlash could be violent. West Virginia’s Mine Wars culminated in the Battle of Blair Mountain in 1921—widely described as the largest labor uprising in U.S. history—where thousands of miners confronted company-aligned forces and state power. The mine owners deployed heavy machine guns and hired private pilots to drop aerial bombs on the miners.
If you zoom out, this pattern wasn’t limited to coal. The Triangle Shirtwaist Factory fire in 1911 became infamous partly because locked doors and poor safety practices trapped workers—mostly young immigrant women—leading to 146 deaths in minutes.
When workers tried to organize for better pay and safer conditions, the response from the industrialists and their allies was not negotiation. It was force. Henry Clay Frick, chairman at Carnegie Steel, slashed worker wages while pushing shifts to twelve hours, then hired the Pinkerton Detective Agency — effectively a private army — to break the strike that followed at Homestead, Pennsylvania, in 1892. During the Great Railroad Strike of 1877, when workers walked off the job across the country, state militias were called in. In Maryland, militia fired into a crowd of strikers, killing eleven. In Pittsburgh, twenty more were killed with bayonets and rifle fire. A railroad executive of the era, asked about hungry striking workers, reportedly suggested they be given “a rifle diet for a few days” to see how they liked it. Throughout this period the federal government largely sided with capital against labor.
This is the part of the story that the marble-statue version leaves out — and not because it is marginal. The labor movement that emerged from these battles shaped virtually every protection American workers have today: the eight-hour workday, child labor laws, workplace safety regulations, the right to organize. These were not gifts handed down by generous industrialists. They were won through strikes, suffering, and in some cases, death. Ignoring that history does not honor the industrialists. It dishonors the workers.
The same pattern runs through every thread of American history that is currently under pressure. The story of westward expansion is incomplete without the story of Native displacement and the deliberate destruction of Indigenous cultures. The story of American agriculture is incomplete without the story of enslaved labor and the systems of racial control that followed emancipation. The story of American prosperity is incomplete without the story of immigrant communities channeled into the most dangerous, lowest-paid work and then told to be grateful for the opportunity. Women’s history, for most of American history, was not considered history at all. In each case, leaving out the difficult chapter does not produce a cleaner story. It produces a false one.
The argument for the marble-statue version is usually that complexity is demoralizing — that children need heroes, that citizens need pride, that a nation cannot function if it is constantly relitigating its worst moments. There is something in that concern worth taking seriously. History taught purely as a catalog of grievances is not good history either. But the answer to that problem is not to swap one distortion for another. Good history holds both: the genuine achievement and the genuine cost. Mark Twain understood this when he coined “The Gilded Age” — a title that literally means a thin layer of gold laid over something much cheaper underneath. That phrase has been in the American vocabulary for 150 years because it captures something true about how surfaces can deceive.
A country that cannot look honestly at its own history is a country that will keep repeating the parts it refuses to examine. The enslaved deserve to be in the story. Indigenous people deserve to be in the story. Women deserve to be in the story. The breaker boys deserve to be in the story. The miners killed by the thousands deserve to be in the story. The workers shot by militias while asking for a living wage deserve to be in the story. Not because the story should only be about suffering, but because they were there — and because understanding what they faced, and what they fought for, and what they eventually changed, is how the story makes sense.
Sources
Brewminate. “Replaceable Lives and Labor Abuse in the Gilded Age: Labor Exploitation and the Human Cost in America’s Gilded Age.” 2026. https://brewminate.com/replaceable-lives-and-labor-abuse-in-the-gilded-age/
Hannah-Jones, Nikole, et al. “A Brief History of Slavery That You Didn’t Learn in School.” New York Times Magazine. 2019. https://www.nytimes.com/interactive/2019/08/14/magazine/slavery-capitalism.html
Investopedia. “The Gilded Age Explained: An Era of Wealth and Inequality.” 2025. https://www.investopedia.com/terms/g/gilded-age.asp
Princeton School of Public and International Affairs. “Princeton SPIA Faculty Reflect on America’s Past as 250th Anniversary Approaches.” 2026. https://spia.princeton.edu/
USA Today. “Millions of Native People Were Enslaved in the Americas. Their Story Is Rarely Told.” 2025. https://www.usatoday.com/
America250 (U.S. Semiquincentennial Commission). “America250: The United States Semiquincentennial.” https://www.america250.org/
Bunk History (citing Washington Post reporting). “The Statue of Liberty Was Created to Celebrate Freed Slaves, Not Immigrants.” https://www.bunkhistory.org/
Upworthy. “The Statue of Liberty Is a Symbol of Welcoming Immigrants. That’s Not What She Was Originally Meant to Be.” 2026. https://www.upworthy.com/
Last week I looked at how poorly Revolutionary War veterans were treated in general. This week I’d like to take a look at a specific example — the contrast between how generals like Henry Knox and common soldiers like Joseph Plumb Martin fared after the Revolutionary War. It perfectly illustrates the class divide I discussed in my previous post. These two men served in the same army, helped win the same independence, and endured similar hardships—although Martin endured far greater hardship. Their post-war experiences couldn’t have been more different—and in a bitter twist, Knox’s prosperity came partly at Martin’s expense.
Knox’s Golden Parachute
Henry Knox entered the war as a Boston bookseller of modest means whose military knowledge was gained from reading rather than formal training. He rose to become Washington’s chief of artillery and a major general. When the war ended, Knox received benefits that set him up for life—or should have. As an officer who served until the war’s end, Knox received the 1783 commutation payment: five years’ full pay in the form of government securities bearing six percent annual interest. This came after Knox himself helped lead the officer corps in pressuring Congress for payment during the near-mutiny known as the Newburgh Conspiracy in early 1783. In total, 2,480 officers received these commutation certificates.
But Knox’s real windfall came from his marriage and his government connections. His wife Lucy came from a wealthy Loyalist family—her grandfather was Brigadier General Samuel Waldo, who’d gained control of a massive land patent in Maine in the 1730s. When Lucy’s family fled to England, she became the sole heir to approximately 576,000 acres known as the Waldo Patent. Knox used his position as the first Secretary of War (earning $3,000 annually in 1793) and his wartime connections to expand his land holdings and business ventures. He was able to ensure that his wife’s family lands passed to her rather than being seized by the government, as the holdings of many Loyalists were. Knox was firmly positioned on the creditor side of the equation, and his political connections helped shield him from the harsh economic reality faced by common soldiers.
He also acquired additional property in the Ohio Valley and engaged in extensive land speculation. He ran multiple businesses: timber operations, shipbuilding, brick-making, quarrying, and extensive real estate development. After retiring from government in 1795, he built Montpelier, a magnificent three-story mansion in Thomaston, Maine, described as having “beauty, symmetry and magnificence” unequaled in Massachusetts. (My wife and I visited a reconstruction of his mansion this past summer, and I can personally testify to how elaborate a home it was.)
Martin’s Broken Promises
Joseph Plumb Martin’s story represents the experience of the roughly 80,000-90,000 common soldiers who did most of the fighting. Martin enlisted at age 15 in 1776 and served seven years—fighting at Brooklyn, White Plains, and Monmouth, surviving Valley Forge, and digging trenches at Yorktown. He rose from private to sergeant. When Martin mustered out, he received certificates of indebtedness instead of actual pay—IOUs that depreciated rapidly. Unlike Knox, enlisted men received no pension, no commutation payment, nothing beyond those nearly worthless certificates. Martin, like many veterans, sold his certificates to speculators at a fraction of their face value just to survive.
After teaching briefly in New York, Martin settled in Maine in the early 1790s. Based on the promise of a land bounty from Massachusetts, Martin and other “Liberty Men” each claimed 100 acres in Maine, assuming that Loyalist lands would be confiscated and sold cheaply to the current occupants or, perhaps, even treated as vacant lands they could secure by clearing and improving. Martin married Lucy Clewley in 1794 and started farming. He’d fought for independence and now just wanted to build a modest life in the belief that the country he had fought for would stand by its promises.
When Former Comrades Became Adversaries
Here’s where the story takes a dark turn. In 1794, Henry Knox—Martin’s former commanding general—asserted legal ownership of Martin’s 100-acre farm. Knox claimed the land was part of the Waldo Patent. Martin and other settlers argued they had the right to farm the land they’d improved, and that it ought to count as payment for their Revolutionary service. The dispute dragged on for years, with some veterans even forming a guerrilla group called the “White Indians” who attacked Knox’s surveyors. But Knox had wealth, lawyers, and political connections. In 1797, the legal system upheld Knox’s claim. Martin’s farm was appraised at $170—payable over six years in installments.
To put that in perspective, when Martin finally received a pension in 1818—twenty-one years later—it paid only $96 per year. And to get even that meager pension, Martin had to prove he was destitute. The $170 Knox demanded represented nearly two years of the pension Martin wouldn’t receive for another two decades. Martin begged Knox to let him keep the land. There’s no evidence Knox even acknowledged his letters. By 1811, Martin had lost more than half his farm. By 1818, when he appeared before the Massachusetts General Court with other veterans seeking their long-promised pensions, he owned nothing.
The Irony of “Fair Treatment”
Knox claimed he treated settlers on his Maine lands fairly, though he used intermediaries to evict those who couldn’t pay rent or whom he considered squatters. The settlers disagreed so strenuously that they once threatened to burn Montpelier to the ground. The situation’s bitter irony is hard to overstate. Knox had been one of the officers who organized the Society of the Cincinnati in 1783, ostensibly to support widows and orphans of Revolutionary War officers. He’d helped lead the push for officer commutation payments by threatening Congress during the Newburgh affair. Yet when common soldiers like Martin—men who’d literally dug the trenches that won the siege at Yorktown—needed help, Knox showed no mercy.
The Numbers Tell the Story
Let’s compare their situations side by side:
Henry Knox:
∙ Officer commutation: Five years’ full pay in securities with 6% interest
∙ Secretary of War salary: $3,000 per year (1793)
∙ Land holdings: 576,000+ acres in Maine, plus Ohio Valley properties
∙ Housing: Three-story mansion with extensive outbuildings
∙ Businesses: Multiple ventures in timber, ships, bricks, quarrying, real estate
∙ Death: 1806, in debt from failed business ventures but having lived in luxury
Joseph Plumb Martin:
∙ Enlisted pay: Mostly unpaid certificates sold at a loss to speculators
∙ Pension: None until 1818, then $96 per year (had to be destitute to qualify)
∙ Land holdings: Started with 100 acres, lost almost all of it to Knox by 1818
∙ Housing: Small farmhouse, struggling to farm 8 of his original 100 acres
∙ Income: Subsistence farming, served as town clerk for modest pay
∙ Death: 1850 at age 89, having struggled financially his entire post-war life
A Memoir Born of Frustration
In 1830, at age 70, Martin published his memoir anonymously. The full title captured his experience: “A Narrative of Some of the Adventures, Dangers, and Sufferings of a Revolutionary Soldier.” He published it partly to support other veterans fighting for their promised benefits, and perhaps in the hope of earning some money from sales. The book didn’t sell. It essentially disappeared until a first edition was rediscovered in the 1950s and republished in 1962. Today it’s considered one of the most valuable primary sources we have for understanding what common soldiers experienced during the Revolution. Historians praise it precisely because it’s not written by someone like Washington, Knox, or Greene—it’s the voice of a regular soldier.
When Martin died in 1850, a passing platoon of U.S. Light Infantry stopped at his house and fired a salute to honor the Revolutionary War hero. But that gesture of respect came long after the country should have helped Martin when he needed it.
The Broader Pattern
Knox wasn’t unusual among officers, nor was Martin unusual among enlisted men. This was the pattern: officers with education, connections, and capital leveraged their wartime service into political positions, land grants, and business opportunities. Common soldiers received promises, waited decades for minimal pensions, and often lost what little property they had to the very elites who’d commanded them.
It’s worth noting that Knox’s business ventures eventually failed. He died in debt in 1806, having borrowed extensively to fund his speculations. His widow Lucy had to gradually sell off land to survive. But Knox still lived eleven years in a mansion, engaged in enterprises of his choosing, and died surrounded by family on his comfortable estate. Martin outlived him by forty-four years, spending most of them in poverty.
The story of Knox and Martin isn’t one of villainy versus heroism. Knox was a capable general who genuinely contributed to winning independence. Martin was a dedicated soldier who did the same. But the system they operated within distributed the benefits of that shared victory in profoundly unequal ways, and Knox—whether intentionally or not—used that system to take what little they had from the soldiers who’d fought under his command. This was not corruption in the modern sense; it was the predictable outcome of a system that rewarded status, education, and proximity to power.
Knox’s experience illustrates a broader truth of the post-Revolutionary period: independence redistributed political sovereignty, but economic security flowed upward, not downward. When we talk about how Continental Army veterans were treated, this is what it looked like on the ground: the officer who led the charge for officer pensions living in a mansion on 600,000 acres, while the sergeant who dug the trenches at Yorktown lost his 100-acre farm and had to prove he was destitute to get $96 a year, decades too late to matter. This will always be a black mark on American history.
Illustrations generated by author using ChatGPT.
Personal note: I spent 12 years on active duty, both as an officer and an enlisted man. I’m proud of my service and I’m proud of the people who have served our country. I do not write this in order to condemn our history. I write it in order to make us aware that we need to always support the common people who contribute vitally to our national success and are seldom recognized.
Sources
Martin, Joseph Plumb. “A Narrative of a Revolutionary Soldier: Some of the Adventures, Dangers and Sufferings of Joseph Plumb Martin.” Originally published anonymously in 1830 at Hallowell, Maine, as “A narrative of some of the adventures, dangers, and sufferings of a Revolutionary soldier, interspersed with anecdotes of incidents that occurred within his own observation.” The memoir fell into obscurity until a first edition copy was discovered in the 1950s and donated to Morristown National Historical Park. Republished by Little, Brown in 1962 under the title “Private Yankee Doodle” (edited by George F. Scheer). Current edition published 2001. This firsthand account by a Continental Army private who served seven years provides invaluable insight into the common soldier’s experience during the war and the struggles veterans faced afterward, including Martin’s own land dispute with Henry Knox. I highly recommend this book to anyone with an interest in ordinary people and their role in history.
American Battlefield Trust – The Newburgh Conspiracy https://www.battlefields.org/learn/articles/newburgh-conspiracy
Maine Memory Network – Henry Knox: Land Dealings https://thomaston.mainememory.net/page/735/display.html
World History Encyclopedia – Henry Knox https://www.worldhistory.org/Henry_Knox/
Maine: An Encyclopedia – Knox, Henry https://maineanencyclopedia.com/knox-henry/
American Battlefield Trust – Joseph Plumb Martin: Voice of the Common American Soldier https://www.battlefields.org/learn/articles/joseph-plumb-martin
Wikipedia – Joseph Plumb Martin https://en.wikipedia.org/wiki/Joseph_Plumb_Martin
Note on Additional Context: While these were the primary sources directly used in this article, the discussion also drew on information from my earlier Revolutionary War veterans article about the general treatment of enlisted soldiers, pension systems, and the class disparities in how benefits were distributed after the war.
When I first started hearing debates about Critical Race Theory, I thought: these people can’t possibly be talking about the same thing. There seemed to be no common ground—even the words they were using seemed to have different meanings.
Critical Race Theory (CRT) has become one of the most contested intellectual concepts in contemporary American culture. Originally developed in law schools during the 1970s and 1980s, CRT has evolved into a broad analytical method of examining how race and racism operate in society. Understanding its origins, core principles, and the political debates surrounding it requires examining both its academic foundations and its journey into public consciousness.
Origins and Early Development
Legal scholars who were dissatisfied with the slow pace of racial progress following the Civil Rights Movement laid the groundwork for CRT. The early figures included Derrick Bell, often considered the father of CRT, along with Alan Freeman, Richard Delgado, Kimberlé Crenshaw, and Cheryl Harris. These scholars were frustrated that despite landmark legislation like the Civil Rights Act of 1964 and the Voting Rights Act of 1965, racial inequality persisted across American institutions.
The intellectual roots of CRT can be traced to Critical Legal Studies, a movement that challenged traditional legal scholarship’s claims of objectivity and neutrality. However, CRT scholars felt that Critical Legal Studies failed to adequately address race and racism. They drew inspiration from various sources, including the work of civil rights lawyers like Charles Hamilton Houston, sociological insights about institutional racism, and postmodern critiques of knowledge and power.
Derrick Bell’s groundbreaking work in the 1970s laid a crucial foundation. His “interest convergence” theory, presented in his analysis of Brown v. Board of Education, argued that advances in civil rights occur only when they align with white interests. This insight became central to CRT’s understanding of how racial progress unfolds in American society.
Core Elements and Principles
Critical Race Theory encompasses several key tenets that distinguish it from other approaches to studying race and racism.
First, CRT posits that race is not biologically real; it’s a human invention to justify unequal treatment. It also holds that racism is not merely individual prejudice, but a systemic feature of American society embedded in legal, political, and social institutions. This “structural racism” perspective emphasizes how seemingly neutral policies and practices can perpetuate racial inequality.
Second, CRT challenges the traditional civil rights approach that emphasizes color-blindness and incremental reform. Instead, CRT scholars argue that color-blind approaches often mask and perpetuate racial inequities. They advocate for race-conscious policies and a more aggressive approach to dismantling systemic racism.
Third, CRT emphasizes the importance of lived experience in the form of storytelling and narrative. Scholars use personal narratives, historical accounts, and counter-stories to challenge dominant narratives about race and racism. This methodological approach reflects CRT’s belief that experiential knowledge from communities of color provides crucial insights often overlooked by traditional scholarship.
Fourth, CRT introduces the concept of intersectionality, a term coined by legal scholar Kimberlé Crenshaw. This framework examines how multiple forms of identity and oppression—including race, gender, class, and sexuality—intersect and compound each other’s effects.
Finally, CRT is explicitly activist-oriented with a goal of creating new norms of interracial interaction. Unlike purely descriptive academic theories, CRT aims to understand racism in order to eliminate it. This commitment to social transformation distinguishes CRT from more traditional academic approaches.
Evolution and Expansion
Since its origins in legal studies, CRT has expanded into numerous disciplines including education, sociology, political science, and ethnic studies. In education, scholars like Gloria Ladson-Billings and William Tate applied CRT frameworks to understand racial disparities in schooling. This educational application of CRT examines how school policies, curriculum, and practices contribute to achievement gaps and educational inequality.
Conservative Perspectives
Conservative critics of CRT raise several concerns about the theory and its applications. They argue that CRT’s emphasis on systemic racism is overly deterministic and fails to account for individual differences and the significant progress made in racial equality since the Civil Rights era. Many conservatives contend that CRT promotes a victim mentality that undermines personal responsibility and achievement.
From this perspective, CRT’s race-conscious approach is seen as divisive and potentially counterproductive. Critics argue that emphasizing racial differences rather than common humanity perpetuates division and resentment. They often prefer color-blind approaches that treat all individuals equally regardless of race.
Conservative critics also express concern about CRT’s application in educational settings, arguing that it introduces inappropriate political content into classrooms and may cause students to feel guilt or shame based on their racial identity. Some argue that CRT-influenced curricula amount to indoctrination rather than education.
Additionally, some conservatives view CRT as fundamentally un-American, arguing that its critique of American institutions and emphasis on systemic oppression undermines national unity and patriotism. They contend that CRT presents an overly negative view of American history and society.
Some conservatives go further, calling CRT a form of “anti-American radicalism.” They believe it rejects Enlightenment values—reason, objectivity, and universal rights—in favor of ideology and emotion. Others criticize CRT’s reliance on narrative and lived experience, arguing that it substitutes storytelling for empirical evidence.
Liberal Perspectives
Supporters of CRT argue that it provides essential tools for understanding persistent racial inequalities that other approaches fail to explain adequately. They contend that CRT’s focus on systemic racism accurately describes how racial disparities continue despite formal legal equality.
To them, CRT isn’t about blaming individuals; it’s about recognizing how systems work. Advocates say that color-blind policies often perpetuate inequality because they ignore how race has historically shaped opportunity. They see CRT as empowering marginalized communities to tell their stories and as pushing America closer to its own ideals of justice and equality.
Liberal and progressive thinkers see CRT as a reality check—a necessary tool for understanding and dismantling systemic racism. They argue that laws and policies that seem neutral can still produce racially unequal outcomes—for example, disparities in school funding, or redlining in housing (denying loans or insurance based on neighborhood rather than individual qualifications).
From this perspective, CRT’s race-conscious approach is necessary because color-blind policies have proven insufficient to address entrenched racial inequities. Supporters argue that acknowledging and directly confronting racism is more effective than pretending race doesn’t matter.
Liberal defenders of CRT emphasize its scholarly rigor and empirical grounding, arguing that criticism often mischaracterizes or oversimplifies the theory. They point out that CRT is primarily an analytical framework used by scholars and graduate students, not a curriculum taught to elementary school children, as some critics suggest. Progressive educators also note that much of what critics call “CRT in schools” is really teaching about historical facts—slavery, segregation, civil-rights struggles—not law-school theory. They argue that banning CRT is less about protecting students and more about suppressing uncomfortable conversations about race and history.
Supporters also argue that CRT’s emphasis on storytelling and lived experience provides valuable perspectives that have been historically marginalized in academic discourse. They see this as democratizing knowledge production rather than abandoning scholarly standards.
Furthermore, many on the left argue that attacks on CRT represent attempts to silence discussions of racism and maintain the status quo. They view criticism of CRT as part of a broader backlash against racial justice efforts.
Why It Matters
You don’t have to buy every part of CRT to see why it struck a nerve. It forces us to ask uncomfortable but important questions: Why do some inequalities persist even after laws change? How do institutions carry the weight of history?
Whether you agree or disagree with CRT, it’s hard to deny that it has shaped how Americans talk about race. The theory challenges us to look beyond personal prejudice and ask how systems distribute power and privilege. Its critics, in turn, remind us that any theory of justice must preserve individual rights and shared civic values.
The real challenge may be learning to hold both ideas at once: that racism can be systemic, and that individuals should still be treated as individuals. CRT’s greatest value—and its greatest controversy—comes from forcing that tension into the open.
Sources:
JSTOR Daily. “What Is Critical Race Theory?” https://daily.jstor.org/what-is-critical-race-theory/ (Accessed December 3, 2025)
Harvard Law Review Blog. “Derrick Bell’s Interest Convergence and the Permanence of Racism: A Reflection on Resistance.” https://harvardlawreview.org/blog/2020/08/derrick-bells-interest-convergence-and-the-permanence-of-racism-a-reflection-on-resistance/ (March 24, 2023)
Bell, Derrick A., Jr. “Brown v. Board of Education and the Interest-Convergence Dilemma.” Harvard Law Review, Vol. 93, No. 3 (January 1980), pp. 518-533.
Columbia Law School. “Kimberlé Crenshaw on Intersectionality, More than Two Decades Later.” https://www.law.columbia.edu/news/archive/kimberle-crenshaw-intersectionality-more-two-decades-later
Crenshaw, Kimberlé. “Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics.” 1989.
Britannica. “Richard Delgado | American legal scholar.” https://www.britannica.com/biography/Richard-Delgado
Wikipedia. “Critical Race Theory.” https://en.wikipedia.org/wiki/Critical_race_theory (Updated December 31, 2025)
Delgado, Richard and Jean Stefancic. “Critical Race Theory: An Introduction.” New York University Press, 2001 (2nd edition 2012, 3rd edition 2018).
Teachers College Press. “Critical Race Theory in Education.” https://www.tcpress.com/critical-race-theory-in-education-9780807765838
American Bar Association. “A Lesson on Critical Race Theory.” https://www.americanbar.org/groups/crsj/publications/human_rights_magazine_home/civil-rights-reimagining-policing/a-lesson-on-critical-race-theory/
NAACP Legal Defense and Educational Fund. “What is Critical Race Theory, Anyway? | FAQs.” https://www.naacpldf.org/critical-race-theory-faq/ (May 6, 2025)
The illustration was generated by the author using Midjourney.
When we picture the American Revolution, we often imagine Continental soldiers in blue coats facing off against British redcoats—but this image leaves out thousands of crucial participants. Between 5,000 and 8,000 Black men fought for the Patriot cause, while an estimated 20,000 joined the British forces. Their stories reveal the war’s profound contradictions and the complex choices Black Americans faced when white colonists fought for “liberty” while holding hundreds of thousands of people in bondage. Their participation reflected the Revolution’s central paradox: a war waged in the name of liberty within a society deeply dependent on slavery.
The irony wasn’t lost on anyone at the time. As Abigail Adams wrote in 1774, “it always appeared a most iniquitous scheme to me to fight ourselves for what we are daily robbing and plundering from those who have as good a right to freedom as we have”.
For most Black participants, the key question was which side offered the clearest path out of bondage rather than abstract allegiance to King or Congress. The tension between revolutionary rhetoric and the reality of slavery shaped every decision Black Americans made about which side to support. This dynamic meant that enslaved people frequently escaped to British forces, while free Blacks (especially in New England) were more likely, though not exclusively, to enlist with the Patriots, where they already had tenuous civic footholds.
The British Offer: “Liberty to Slaves”
In November 1775, Virginia’s royal governor Lord Dunmore made a move that sent shockwaves through the colonies. With his military position deteriorating and his forces dwindling, Dunmore issued a proclamation offering freedom to any enslaved person who abandoned their Patriot masters and joined British forces. The proclamation declared “all indented servants, Negroes, or others (appertaining to rebels) free, that are able and willing to bear arms”.
The response was immediate. Within a month, an estimated 300 Black men had enlisted in what Dunmore called the “Royal Ethiopian Regiment,” eventually growing to about 800 men. Their uniforms were emblazoned with the provocative words “Liberty to Slaves.” The name “Ethiopian” wasn’t random—it referenced ancient associations of Ethiopia with wisdom and nobility. These soldiers saw action at the Battle of Kemp’s Landing, where—in a moment rich with symbolic meaning—one previously enslaved soldier captured his former master, militia colonel Joseph Hutchings.
Dunmore’s promise came with devastating costs. The regiment’s only other major battle was the disastrous British defeat at Great Bridge in December 1775. Far worse was the disease that ravaged the Black soldiers’ ranks. As the Virginia Gazette reported in March 1776, “the jail distemper rages with great violence on board Lord Dunmore’s fleet, particularly among the negro forces”. Disease ultimately killed more of Dunmore’s recruits than combat, as was common among all armies of the time. By 1776, Dunmore was forced to flee Virginia, taking only about 300 survivors with him.
The Patriot Response: Reluctant Acceptance
The Continental Army’s relationship with Black soldiers was complicated from the start. Black men fought at Lexington and Concord. They also distinguished themselves at Bunker Hill, where Black patriot Salem Poor performed so heroically that fourteen officers petitioned the Massachusetts legislature to recognize his “brave and gallant” service.
But in November 1775, just days after Dunmore’s Proclamation, George Washington—himself a Virginia slaveholder—banned the recruitment of all Black men. The ban didn’t last long. The British continued recruiting Black soldiers, and Washington faced a simple reality: he desperately needed troops. By early 1778, after the brutal winter at Valley Forge had decimated his forces, Washington grudgingly allowed states to enlist Black soldiers. Rhode Island led the way with legislation that promised immediate freedom to any “able-bodied negro, mulatto, or Indian man slave” who enlisted, with the state compensating slaveholders for their “property”.
The result was the 1st Rhode Island Regiment, which became known as the “Black Regiment.” Of its roughly 225 soldiers, about 140 were Black or Native American men. The regiment fought at the Battle of Rhode Island in August 1778, where they held their position against repeated British and Hessian charges—a performance that earned them, according to Major General John Sullivan, “a proper share of the day’s honors”. They went on to fight at Yorktown, where they stood alongside southern militiamen whose peacetime job had been hunting runaway slaves.
Throughout the Continental Army, Black soldiers generally served in integrated units. One French officer estimated that a quarter of Washington’s army was Black—though historians believe 10 to 15 percent is more accurate. As one historian noted, “In the rest of the Army, the few blacks who served with each company were fully integrated: They fought, drilled, marched, ate and slept alongside their white counterparts.”
Naval service—on both sides—was often more racially integrated than the army. Black men served as sailors, gunners, and marines in the Royal Navy and the Continental Navy. Maritime labor traditions had long been more flexible on race, and skill mattered more than status.
Free Blacks in northern towns could enlist much like white common citizens, sometimes motivated by pay, local patriotism, and the hope that visible service would strengthen claims to equal rights after the war. Enslaved men rarely chose independently; Patriot masters often enlisted them as substitutes to avoid service, while Loyalist masters sometimes allowed or forced them to join British units. In both cases emancipation promises were unevenly honored.
Some enslavers freed men in advance of service, others promised manumission afterward and reneged, while still others simply collected bounties or commutation while trying to retain control over Black veterans. On the British side, imperial policy also vacillated, with some officers fully supporting freedom for Black refugees tied to rebel masters, and others quietly returning runaways to Loyalist owners or exploiting them as unpaid labor.
The Promise and the Betrayal
As the war ended, the gulf between British and American treatment of their Black allies became stark. In 1783, as British forces prepared to evacuate New York, General George Washington demanded the return of all formerly enslaved people as “property” under the Treaty of Paris. British commander Sir Guy Carleton refused. Instead, he created the “Book of Negroes”—a ledger documenting about 3,000 Black Loyalists who were granted certificates of freedom and evacuated to Nova Scotia, England, Germany, and British territories.
The Book provides glimpses of individual journeys. Boston King, who had escaped slavery in South Carolina to join the British, was evacuated with his wife Violet to Nova Scotia. Their entry simply notes Violet as a “stout wench”—a reminder that even their liberators viewed them through racist lenses. Harry Washington, who had escaped from George Washington’s Mount Vernon plantation, also reached Nova Scotia and later became a leader in the resettlement to Sierra Leone.
Nova Scotia proved no paradise. Black Loyalists received inferior land—rocky and infertile compared to what white Loyalists received. They faced discrimination, exploitation, and broken promises about land grants. By 1792, nearly 1,200 Black Loyalists—about half of those in Nova Scotia—accepted an offer to resettle in Sierra Leone, where they founded Freetown.
For Black Patriots, the outcome was often worse. While some white soldiers received up to 100 acres of land and military pensions from Congress, Black soldiers who had been promised freedom often received nothing beyond freedom—and some didn’t even get that. As one historian put it, they were “dumped back into civilian society”. In June 1784, thirteen veterans of the Rhode Island Regiment had to hire a lawyer just to petition for their back pay. The state responded with an act that classified them as “paupers, who heretofore were slaves” and ordered towns to provide charity.
Lieutenant Colonel Jeremiah Olney, who commanded the Rhode Island Regiment after Christopher Greene’s death, spent years advocating for his former soldiers—fighting attempts to re-enslave them and supporting their pension claims. Some soldiers, like Jack Sisson, finally received pensions decades later in 1818—forty years after they’d enlisted, and often too late. Many died before seeing any recognition.
Even more cruelly, many Black soldiers who had been promised freedom by their masters were returned to slavery after the war. Some remained enslaved for a few years until their owners honored their promises; others remained enslaved permanently, having fought for a freedom they would never experience.
It is plausible that the widespread participation of Black soldiers subtly accelerated Northern emancipation by making slavery harder to justify ideologically, even as Southern resistance hardened.
The Larger Meaning
The American Revolution was the last time the U.S. military would be significantly integrated until President Truman’s Executive Order 9981 in 1948. In 1792, Congress passed legislation limiting military service to “free, able-bodied, white male citizens”—a restriction that would last for generations.
Yet the Revolutionary War period saw more enslaved people gain their freedom than any other time before the Civil War. Historian Gary Nash estimates that between 80,000 and 100,000 enslaved people escaped throughout the thirteen colonies during the war—not all joined the military, but the war created opportunities for flight that many seized.
As historian Edward Countryman notes, the Revolution forced Americans to confront a question that Black Americans had been raising all along: “What does the revolutionary promise of freedom and democracy mean for African Americans?” The white founders failed to answer that question satisfactorily, but the thousands of Black soldiers who fought—on both sides—had already answered it with their lives. They understood that liberty was worth fighting for, even when the people promising it had no intention of extending it to everyone.
Image generated by author using ChatGPT.
Sources
“African Americans in the Revolutionary War,” Wikipedia.
Museum of the American Revolution, “Black Patriots and Loyalists” and “Black Founders: Black Soldiers and Sailors in the Revolutionary War.”
Gilder Lehrman Institute, “African American Patriots in the Revolution.”
National Archives blog, “African Americans and the American War for Independence.”
Douglas R. Egerton, Death or Liberty: African Americans and Revolutionary America (individual stories on both Patriot and Loyalist sides).
Edward Countryman, The American Revolution.
Gary B. Nash, The Forgotten Fifth: African Americans in the Age of Revolution.
Alan Gilbert, Black Patriots and Loyalists: Fighting for Emancipation in the War for Independence.
DAR, Forgotten Patriots – African American and American Indian Patriots in the Revolutionary War: A Guide to Service, Sources, and Studies.
NYPL LibGuide, “Black Experience of the American Revolution”
American Battlefield Trust, “10 Facts: Black Patriots in the American Revolution.”
Massachusetts Historical Society, “Revolutionary Participation: African Americans in the American Revolution.”
Fraunces Tavern Museum, “Enlistment of Freed and Enslaved Blacks in the Continental Army.”
American Independence Museum, “African-American Soldiers’ Service During the Revolutionary War.”
Mount Vernon, “Dunmore’s Proclamation and Black Loyalists” and “The Ethiopian Regiment.”
American Battlefield Trust, “Lord Dunmore’s Ethiopian Regiment”
Lord Dunmore’s Proclamation (1775), in transcription with context at Gilder Lehrman, Encyclopedia Virginia, and Mount Vernon.
“Book of Negroes” (1783 evacuation ledger of Black Loyalists to Nova Scotia; digital copies and discussions via BlackPast and Dictionary of Canadian Biography).
Boston King, “Memoirs of the Life of Boston King, a Black Preacher,” Methodist Magazine (1798)
1st Rhode Island Regiment, World History Encyclopedia
The Accidental Footnote: Heel Spurs, the Vietnam Draft, and American Inequality
By John Turley
On April 17, 2026
In Commentary, History, Medicine, Politics
If you’ve ever winced taking your first steps out of bed in the morning, you may have already made an involuntary acquaintance with heel spurs — or more precisely, with the condition that often travels with them. The term itself sounds alarming, and for a brief but colorful stretch of American political history, it became something far more charged than a footnote to podiatry. But before we get to the politics, it’s worth understanding what a heel spur actually is, because the medical reality is both more mundane and more complicated than the caricature.
What Exactly Is a Heel Spur?
A heel spur is a small bony outgrowth — technically called a calcaneal spur — that extends from the underside of the heel bone (the calcaneus). It forms at the spot where the plantar fascia — the thick ligament running the length of your foot from heel to toe — attaches to the heel bone. The spur is not, despite what the name implies, a sharp spike. It is typically smooth and rounded, though it can still cause irritation if it presses into surrounding soft tissue.
Heel spurs affect about 10% of the population, making them one of the more common foot conditions around, though most people who have one don’t know it. The spur develops gradually — usually over months or even years — as the body deposits calcium in response to chronic stress at that heel attachment point. Think of it less as damage and more as your skeleton’s attempt at reinforcement.
What Causes Them?
The underlying driver is repetitive mechanical stress on the foot. Heel spurs are particularly associated with strains on foot muscles and ligaments, stretching of the plantar fascia, and repeated small tears in the membrane covering the heel bone. Athletes who do a lot of running and jumping are especially prone.
But you don’t need to be an elite runner to develop one. Walking gait problems — particularly overpronation, where the foot rolls inward — place uneven stress on the heel with each step. Worn-out or poorly fitted shoes, which fail to absorb shock or support the arch, compound the problem. Obesity increases the mechanical load on the heel. Occupations that require prolonged standing or walking on hard surfaces put the plantar fascia under constant tension. And as people age, tendons and ligaments lose their elasticity, making the tissues more vulnerable to micro-tears and the subsequent bony repair response.
Heel spurs are also closely connected to a condition most people have heard of: plantar fasciitis. The two are related but not identical. Plantar fasciitis is inflammation of the plantar fascia itself, usually from overuse. A heel spur can develop as a downstream consequence of that inflammation — the body lays down extra bone in response to the ongoing stress at the fascia’s attachment point.
Symptoms — or the Lack Thereof
Here’s the part that surprises most people: the majority of heel spurs cause no symptoms at all, and many are discovered incidentally on X-rays taken for other reasons. Only about 5% of heel spurs are estimated to be symptomatic.
When a heel spur does produce symptoms, the experience is heavily intertwined with plantar fasciitis. The classic description is a sharp, stabbing pain on the bottom of the foot first thing in the morning, or after any prolonged rest. Many people compare it to stepping on a tack. Paradoxically, this pain often eases somewhat after walking around for a few minutes, only to return after extended time on the feet or after another rest. It’s that “worse in the morning” quality that tends to be the giveaway.
Other symptoms, when present, can include localized swelling, warmth, and tenderness along the front of the heel, as well as increased sensitivity on the underside of the foot. It’s worth noting that the pain associated with a heel spur is not generally thought to come from the bony spur itself, but from the irritation it causes in the surrounding soft tissue — tendons, ligaments, and bursae.
How Is It Diagnosed?
Diagnosis typically begins with a physical exam. Your doctor or podiatrist will ask about when the pain started, what activities preceded it, and what makes it better or worse. They’ll examine your foot for tenderness at specific points, assess your range of motion, check foot alignment, and press on key areas to locate the source of pain.
Imaging confirms the picture. An X-ray can clearly show the bony spur and is the most commonly used test. That said, the size of the spur on an X-ray doesn’t necessarily correspond to how much pain a patient is experiencing — a small spur can be quite painful while a large one may cause no trouble at all. In more complex cases, an MRI may be ordered to assess the soft tissues more closely and evaluate whether plantar fasciitis or another condition is also in play.
Treatment Options
The reassuring news is that the vast majority of cases resolve without surgery. More than 90% of patients improve with nonsurgical treatment. The catch is that conservative management requires patience — improvement typically takes weeks, and more stubborn cases can take months.
The cornerstone of treatment is rest and reducing the activities that provoke pain. This doesn’t necessarily mean completely stopping exercise; low-impact alternatives like swimming, cycling, or rowing allow you to stay active while giving the heel a break from impact. Icing the bottom of the foot after activity helps manage inflammation. Over-the-counter anti-inflammatory medications like ibuprofen or naproxen can provide relief, though they’re intended for short-term use.
Footwear matters enormously. Supportive shoes with good arch support, cushioning, and a slight heel rise reduce the strain on the plantar fascia. Custom orthotics with molded insoles designed to redistribute pressure across the foot are often recommended, particularly for people with gait abnormalities or flat feet. Physical therapy can be part of the treatment plan, focusing on stretching the calf muscles and plantar fascia, strengthening the foot’s intrinsic muscles, and correcting biomechanical issues.
For cases that don’t respond to these initial measures, the next tier of treatment includes corticosteroid injections to reduce inflammation at the spur site, and extracorporeal shockwave therapy — a non-invasive procedure that uses sound waves to stimulate healing in chronically inflamed tissue. Surgery is reserved for the minority of cases where conservative treatment fails after nine to twelve months. Possible complications include nerve pain, infection, scarring, and — with plantar fascia release — the risk of foot instability or stress fracture. Most orthopedic surgeons regard surgery as a last resort.
Are Heel Spurs Debilitating?
For most people, the honest answer is: no. Heel spurs are a common condition with a favorable prognosis, especially with early diagnosis and appropriate management. Many people live with heel spurs for years without ever knowing it, and even those who develop pain typically find substantial relief with conservative treatment within four to eight weeks.
That said, the pain at its worst — particularly in conjunction with plantar fasciitis — can be genuinely disruptive to daily life. Athletes may find their training significantly limited. People who spend long hours on their feet at work may struggle with sustained discomfort. And a small percentage of patients do end up with prolonged, treatment-resistant pain that affects mobility. So, the more accurate framing might be: heel spurs have the potential to be significantly uncomfortable and functionally limiting during flare-ups, but with proper treatment most people recover well and return to normal activity.
Heel Spurs and the Vietnam-Era Draft
Which brings us to an improbable chapter in heel spur history. During the Vietnam War era, heel spurs became — for at least one famous case — a ticket out of military service. Understanding how that worked requires a brief detour into the draft system of the 1960s and 1970s, and what it meant to receive a medical deferment.
According to the National Archives, of the roughly 27 million American men eligible for military service between 1964 and 1973, about 15 million were granted deferments — mostly for education, and some for mental or physical problems — while only 2,215,000 were actually drafted into service; roughly another eight million volunteered. Some of those who later served had previously held deferments. The system was sprawling, complex, and — as was widely acknowledged even at the time — deeply unequal.
Roughly 60% of draft-eligible American men took some sort of action to avoid military conscription. There were many routes: college deferments, fatherhood, conscientious objector status (roughly 170,000 men were granted it), National Guard enlistment, and medical exemptions. Medical deferments covered a wide range of conditions — from serious chronic illness to conditions that, in a different context, most people would consider minor. Flat feet, poor eyesight, asthma, and yes, bone spurs all appeared on the list of potentially disqualifying ailments.
The system was known to favor men with access to money, education, and well-connected physicians. American forces in Vietnam were 55% working-class and 25% poor — reflecting those who didn’t have the means to navigate the deferment labyrinth. A working-class kid from rural West Virginia was far more likely to end up in the Mekong Delta than the son of a New York real estate developer.
The Most Famous Heel Spur in American History
Which leads, inevitably, to Donald Trump. As confirmed by Selective Service records obtained and reported by multiple news outlets, Trump received five Vietnam-era draft deferments — four for college attendance at Fordham and the Wharton School, and a fifth in 1968, recorded as a medical deferment for bone spurs in his heels. That medical classification exempted him from military service.
The circumstances surrounding the diagnosis have been contested ever since. Reporting by the New York Times included accounts from the daughters of a Queens podiatrist named Larry Braunstein, who alleged that their father had provided or vouched for the diagnosis as a professional favor to Trump’s father, Fred Trump — a landlord to whom Braunstein reportedly owed a debt of gratitude. Trump’s former lawyer Michael Cohen also testified that Trump had admitted to fabricating the injury. Trump himself has maintained that the diagnosis was legitimate, stating that a doctor “gave me a letter — a very strong letter — on the heels.” The underlying medical records that would resolve the dispute are, conveniently, not publicly available; most individual Selective Service medical records from that era were subsequently destroyed.
It’s worth noting that Trump’s pattern — using legal channels, including a medical deferment of questionable validity, to avoid Vietnam service — was not unique to him. Historians have pointed out that numerous prominent figures on both sides of the political aisle received deferments of various kinds, including Joe Biden (asthma), Dick Cheney (student deferments), Bill Clinton (an ROTC arrangement), and George W. Bush (National Guard service). The heel spur episode became politically charged in part because of Trump’s later hawkish rhetoric and his willingness to question the military service of others — most notably Senator John McCain, who spent years as a prisoner of war in North Vietnam.
How Many People Got Heel Spur Deferments?
This is where the historical record hits a hard wall. No reliable statistics exist specifically for heel spur deferments. The Selective Service tracked broad categories — student deferments, hardship deferments, conscientious objector status, medical disqualifications — but it did not publish a breakdown by specific diagnosis, and most individual medical records from that era no longer exist.
What we can say is that bone spurs were a recognized medical disqualifier under Selective Service regulations, that medical deferments broadly were a commonly used — and commonly abused — avenue for avoiding service, and that the process was heavily influenced by access to sympathetic physicians. A man with means, connections, and a cooperative podiatrist had options that a man without those resources did not.
The honest answer, then, is that we don’t know how many men received deferments citing heel spurs specifically, and we almost certainly never will. The data either wasn’t tracked at that level of granularity or was long since destroyed. What we do know is that the condition became, for a time, a lens through which Americans examined something much larger: who serves, who doesn’t, and whether the systems meant to govern those decisions are applied fairly.
For most people, a heel spur is a manageable, if annoying, footnote in the story of their health. For at least one person, it became a footnote in the history of American politics.
Personal Note: I have heel spurs; I wish I’d known about them in 1967.
Images generated by author using AI.
Medical Sources
Cleveland Clinic — Heel Spurs overview
https://my.clevelandclinic.org/health/diseases/21965-heel-spurs
WebMD — Heel Spur Causes, Symptoms, Treatments, and Surgery
https://www.webmd.com/pain-management/heel-spurs-pain-causes-symptoms-treatments
Hackensack Meridian Health — Bone Spurs in the Heel: Symptoms and Recovery
https://www.hackensackmeridianhealth.org/en/healthier-you/2024/01/02/bone-spurs-in-the-heel-symptoms-and-recovery
OrthoArkansas — Heel Spurs
https://www.orthoarkansas.com/heel-spurs-orthoarkansas/
EmergeOrtho — Heel Bone Spurs: Causes, Symptoms, Treatment
https://emergeortho.com/news/heel-bone-spurs/
American Academy of Orthopaedic Surgeons — Plantar Fasciitis and Bone Spurs
https://orthoinfo.aaos.org/en/diseases--conditions/plantar-fasciitis-and-bone-spurs/
Vietnam Draft & Military Service Sources
History.com — 7 Ways Americans Avoided the Draft During the Vietnam War
https://www.history.com/articles/vietnam-war-draft-avoiding
Wikipedia — Draft Evasion in the Vietnam War
https://en.wikipedia.org/wiki/Draft_evasion_in_the_Vietnam_War
Wikipedia — Conscription in the United States
https://en.wikipedia.org/wiki/Conscription_in_the_United_States
Students of History — The Draft and the Vietnam War
https://www.studentsofhistory.com/vietnam-war-draft
University of Michigan — The Military Draft During the Vietnam War
https://michiganintheworld.history.lsa.umich.edu/antivietnamwar/exhibits/show/exhibit/draft_protests/the-military-draft-during-the-
Vietnam Veterans of America Chapter 310 — Vietnam War Statistics
https://www.vva310.org/vietnam-war-statistics
Vietnam Veterans of Foreign Wars — Fact vs. Fiction: The Vietnam Veteran
https://www.vvof.org/factsvnv.htm
New York City Vietnam Veterans Plaza — Interesting Facts About Vietnam
https://www.vietnamveteransplaza.com/interesting-facts-about-vietnam/
Medical Disclaimer
The information provided in this article is intended for general educational and informational purposes only and does not constitute medical advice. It should not be used as a substitute for professional medical advice, diagnosis, or treatment.
Always seek the guidance of a qualified healthcare provider with any questions you may have regarding a medical condition or treatment. Never disregard professional medical advice or delay seeking it because of something you have read here.
If you are experiencing a medical emergency, call 911 or your local emergency number immediately.
The author of this article is a licensed physician, but the views expressed here are solely those of the author and do not represent the official position of any hospital, health system, or medical organization with which the author may be affiliated.