
The Evolution of the English Language: From Anglo-Saxon Roots to a Global Tongue

English is a beautifully messy language—shameless in its borrowing and relentless in its evolution. It resists the tidy logic that might make a grammarian’s life easier, and that resistance is part of what makes its history so compelling. The English we speak today is the product of centuries of invasion, migration, cultural collision, and literary ambition—a language built in layers, like geological strata laid down over time.

To see how English grew from an obscure Germanic dialect into a global lingua franca, it helps to trace three broad phases: Old English, Middle English, and Modern English. Each stage was shaped by different historical forces, from Germanic migration and Viking settlement to the Norman Conquest, the Renaissance, the printing press, and ultimately the worldwide reach of the British Empire and the United States.

Anglo-Saxon Foundations

The story begins on the European mainland. When Roman authority collapsed in Britain in the early fifth century, Germanic-speaking peoples from what is now northern Germany, Denmark, and the Netherlands moved into the island. The Angles, Saxons, and Jutes arrived in waves, bringing closely related West Germanic dialects that gradually developed into Old English, often called Anglo-Saxon.

Old English was thoroughly Germanic in both grammar and vocabulary. It was a highly inflected language: case endings marked whether a noun was subject, object, or possessive, and nouns had grammatical gender. Verbs were conjugated with a complexity that feels foreign to most modern English speakers. Much of the core vocabulary of modern English—words such as water, house, bread, child, earth, life, and death—dates back to this early period and still carries that Germanic stamp.

The language of Beowulf, composed between the eighth and early eleventh centuries, is virtually unreadable today without specialized training. Its famous opening line, “Hwæt! We Gardena in geardagum,” is technically English, but it feels closer to a foreign language. Old English used letters such as þ (thorn) and ð (eth) and relied on grammatical structures that later disappeared.

Nor was Old English a single uniform tongue. It existed as a cluster of regional dialects including Northumbrian, Mercian, Kentish, and West Saxon. Under King Alfred the Great in the late ninth century, Wessex became the leading political power in England and a center of learning. Alfred sponsored translations of important Latin works into Old English, most often in the West Saxon dialect. As a result, most surviving Old English texts come from that dialect, giving us only a partial view of the linguistic diversity of early England.

Latin and Celtic Influences

Even before the Anglo-Saxons arrived in Britain, Latin had begun to influence their speech through contact with the Roman world. Early Latin loanwords include street (from strata), wall (from vallum), and wine (from vinum).

A second wave of Latin influence arrived with the Christianization of England beginning in 597, when Augustine of Canterbury established a mission in Kent. Christianity introduced vocabulary connected with religion, learning, and administration—words such as church, bishop, monk, school, altar, and verse.

By contrast, the Celtic languages spoken by the native Britons left a surprisingly small mark on English vocabulary. Their influence survives most clearly in place names—for example Thames, Avon, and Dover—and in landscape terms such as combe (valley) and tor (rocky hill). Why Celtic languages left relatively few everyday words in English remains one of the lingering puzzles of linguistic history.

Vikings and the Norse Contribution

Beginning in the late eighth century, Scandinavian raiders and settlers—collectively known as Vikings—attacked and eventually settled parts of England. By the late ninth century much of northern and eastern England had become part of the Danelaw, where Old English speakers lived alongside speakers of Old Norse.

Because Old Norse and Old English were closely related Germanic languages, speakers could often roughly understand each other. Over time, however, sustained contact produced deep linguistic blending. English absorbed many Norse-derived words that now feel completely native, including sky, skin, skill, skirt, egg, leg, window, husband, call, take, give, get, want, and die.

Perhaps the most striking Norse contribution lies in the pronouns they, them, and their, which replaced earlier Old English forms. When a language adopts core pronouns from another language, it signals unusually intense and prolonged contact.

Many linguists also believe that contact with Norse speakers helped accelerate the simplification of English grammar. In bilingual communities, speakers often reduce complex inflectional endings that make communication difficult. As a result, English gradually moved away from the elaborate grammatical endings of Old English and toward a system that relied more heavily on word order.

The Norman Transformation

The Norman Conquest of 1066 transformed English more dramatically than any other single event in its history. When William of Normandy defeated King Harold at the Battle of Hastings and became king of England, he brought with him a French-speaking aristocracy.

For several centuries after the conquest, French dominated the language of power—the court, the law, the church hierarchy, and much of government administration. English remained the everyday language of the population but lost prestige in elite circles.

French vocabulary poured into English in areas associated with authority and culture. Law gained terms such as justice, court, judge, jury, prison, crime, and verdict. Government absorbed parliament, sovereign, minister, authority, tax, and treasury. Military language adopted army, navy, soldier, captain, defense, and siege.

Even the language of food reflects this social divide. The animals in the field kept their Old English names—cow, sheep, pig, and deer—while the meat served at noble tables took French names: beef, mutton, pork, and venison.

The Rise of Middle English

Over time, French dominance gradually weakened. The loss of Normandy in 1204 encouraged English nobles to identify more strongly with England itself. Later, the Black Death (1348–1350) reshaped English society by elevating the economic importance of English-speaking laborers and craftsmen.

During the fourteenth century, English reestablished itself as the language of all social classes, reclaiming the domains of law, government, and literature. The language that emerged—Middle English—looked very different from Old English. Most grammatical endings disappeared, grammatical gender vanished, and sentence structure shifted toward the familiar subject-verb-object order.

At the same time, English vocabulary became a rich mixture of Germanic and Romance elements. This layering produced sets of near-synonyms with different levels of formality: ask (Germanic), question (French), and interrogate (Latin).

The most famous literary figure of this period was Geoffrey Chaucer, whose Canterbury Tales demonstrated that English could rival French and Latin as a vehicle for sophisticated literature. Chaucer wrote in the London dialect, which was gaining prominence due to the city’s political and commercial importance. Though not yet standardized, London English gradually became the foundation of later written English.

Printing and the Great Vowel Shift

William Caxton established England’s first printing press in 1476, and this technological revolution had far-reaching consequences for the language. Printing created a need for standardized spelling and grammar, since texts would now be distributed widely rather than copied by hand in local scriptoria. Caxton himself struggled with the problem of dialect variation, complaining about the difficulty of choosing forms that all English readers could understand. Over time, the conventions adopted by London printers became the de facto standard. Printing also helped retire the old letters: ð (eth) had already faded from use during the Middle English period, and þ (thorn), the last survivor, disappeared because the continental typefaces printers imported lacked it, forcing compositors to substitute th or y (the source of the pseudo-archaic “ye” in “ye olde”).

At the same time, English pronunciation underwent a dramatic change known as the Great Vowel Shift, which occurred roughly between 1400 and 1700. Long vowel sounds moved upward in the mouth, transforming the pronunciation of many common words. For example, “name” once sounded closer to nah-muh, while “mouse” sounded more like moose. 

The causes of the Great Vowel Shift remain debated—theories range from the social upheaval following the Black Death to the influence of French-accented English—but its effects were enormous. Spelling had been largely fixed by printing before the shift ran its course, so many written forms preserve pronunciations that no longer exist: the vowels of name and mouse still wear their medieval spellings, just as the silent k in knife and gh in through recall consonants that have since fallen away.

Renaissance Expansion

The English Renaissance of the sixteenth and seventeenth centuries unleashed another flood of new vocabulary, much of it borrowed from Latin and Greek. Scholars and writers introduced thousands of words connected to science, philosophy, and literature, including democracy, encyclopedia, atmosphere, thermometer, criticism, and educate.

Critics derided the new coinages as “inkhorn terms”—pretentious, unnecessary words invented by scholars dipping their quills in inkhorns. Some of the words that drew this scorn, like “perpetrate” and “contemplate,” survived; others, like “ingent” (enormous), did not.

Two towering cultural works further shaped English during this era: Shakespeare’s plays and the King James Bible (1611). Shakespeare popularized countless words and expressions—among them assassination, lonely, eventful, and phrases like “break the ice” and “wild goose chase.” The King James Bible, widely read for centuries, left deep marks on English rhythm and idiom.

Dictionaries and Standardization

By the eighteenth century, many writers wanted to standardize and regulate English. The most influential effort was Samuel Johnson’s Dictionary of the English Language (1755), which became the dominant reference work of its era.

In the United States, Noah Webster’s American Dictionary of the English Language (1828) promoted simplified spellings such as color instead of colour and center instead of centre. Webster viewed spelling reform as part of America’s broader cultural independence from Britain.

English Goes Global

From the seventeenth through the early twentieth centuries, the British Empire spread English across the globe. Along the way, the language absorbed vocabulary from many other languages. Hindi contributed words such as jungle and shampoo, Arabic added algebra and alcohol, and Malay gave English bamboo and ketchup.

As English took root in different regions, new varieties emerged—American, Australian, Canadian, Indian, Nigerian, Singaporean, and many others. Linguists today increasingly recognize these as legitimate forms of English rather than deviations from a single standard.

English in the Digital Age

In the twentieth and twenty-first centuries, mass media and digital communication have accelerated linguistic change. Radio, film, television, and the internet spread slang, accents, and new expressions around the world with unprecedented speed.

English continues to absorb new words from science, technology, business, and online culture. Brand names become verbs; internet slang becomes everyday speech. Today more than a billion people speak English as a first or second language, making it the most widely used language in human history.

A Language Still Evolving

The history of English reminds us that language is not a fixed monument but a living system shaped by human interaction. Its vocabulary is like an archaeological site, where almost every common word carries traces of earlier eras.

English has never been “pure,” and attempts to purify it have always failed. Its strength lies in its openness—its ability to borrow, adapt, and reinvent itself. From the heroic poetry of Beowulf to Shakespeare’s theater, from the King James Bible to the language of the internet, English continues to grow through the voices of those who use it.

And if history is any guide, the English spoken a few centuries from now will sound just as surprising to us as Chaucer’s language once did.

Illustration generated by author using ChatGPT.

Sources

Baugh, Albert C. and Thomas Cable. A History of the English Language (6th edition). Routledge, 2012. This remains the standard academic textbook on the subject and covers every period and influence discussed above.

Crystal, David. The Cambridge Encyclopedia of the English Language (3rd edition). Cambridge University Press, 2019. An accessible and richly illustrated reference covering the structure and history of English.

McCrum, Robert, Robert MacNeil, and William Cran. The Story of English (3rd revised edition). Penguin, 2003. A popular history that accompanied the PBS television series, excellent for general readers.

Mugglestone, Lynda (ed.). The Oxford History of English (2nd edition). Oxford University Press, 2012. A collection of essays by specialists covering English from its earliest origins to the present day.

Bede, The Venerable. Ecclesiastical History of the English People. Penguin Classics, 1990 (translated by Leo Sherley-Price). The primary early source on the Anglo-Saxon migrations.

Townend, Matthew. “Contacts and Conflicts: Latin, Norse, and French.” In The Oxford History of English, edited by Lynda Mugglestone, 2012. A detailed treatment of the major external influences on English.

Online resource: The British Library’s “Evolving English” exhibit materials are available at https://www.bl.uk/learning/langlit/evolvingenglish/

Online resource: Durkin, Philip. “Borrowed Words: A History of Loanwords in English.” Oxford University Press, 2014. Summary and excerpts available at https://global.oup.com/academic/product/borrowed-words-9780199574995

Hay Fever: The Allergy That Has Nothing to Do with Hay

Let’s get one thing out of the way up front: hay fever has almost nothing to do with hay, and it doesn’t cause a fever. The name stuck after a popular 19th-century theory that the smell of summer hay was making people sick. Turns out, the culprit is invisible and far more pervasive — tiny airborne particles that your immune system, for reasons we can’t entirely explain, decides to treat like the enemy. The official medical term is allergic rhinitis, but most of us just call it hay fever, seasonal allergies, or, in the depths of pollen season, a personal nightmare.

If you’ve ever spent a spring morning sneezing your way through a box of tissues or rubbed your eyes until they looked like you’d been crying all night, you already know what this feels like. What you might not know is why it happens, what exactly sets it off, and — most importantly — what you can do about it. Let’s dig in.

What Is Hay Fever, Exactly?

Hay fever is, at its core, an overreaction by your immune system. When you breathe in certain particles — pollen, dust, animal dander — your body may misidentify them as a threat. In response, it releases a chemical called histamine, which is supposed to help fight off invaders but instead triggers a cascade of miserable symptoms: sneezing, congestion, a runny nose, itchy eyes, and general stuffiness. None of this is actually doing anything useful. Your immune system is essentially deploying the cavalry against a dandelion.

According to the Cleveland Clinic, roughly 20% of Americans have allergic rhinitis, and a 2021 study found that more than 81 million people reported seasonal allergy symptoms that year alone. So, if you’re one of us, you are not alone.

Hay fever comes in two main varieties. Seasonal allergic rhinitis is what most people picture — the spring sneezing, the summer eye-rubbing, the early fall misery. Perennial allergic rhinitis, on the other hand, is the year-round version, driven by indoor allergens that don’t take the winter off. Either way, the underlying mechanism is the same: your immune system picking a fight with something that poses no real danger.

What Triggers It?

The list of potential triggers is longer than you might expect, but they fall into a few main categories.

Pollen is the classic offender and the one most associated with the “hay fever” label. But not all pollen is created equal. According to the American College of Allergy, Asthma and Immunology (ACAAI), seasonal hay fever is most commonly triggered by wind-carried pollen from trees, grasses, and weeds. Crucially, it’s not flower pollen — those heavy, colorful grains are carried by insects and rarely make it into your airway. The sneaky offenders are the plain-looking plants whose lightweight pollen drifts for miles. Tree pollens tend to peak in spring, grasses in early summer, and ragweed in late summer through early fall.

Hot, dry, and windy days are the worst for pollen exposure. A cool, rainy day provides some relief — rain washes pollen out of the air, at least temporarily. As noted by MedlinePlus (National Library of Medicine), pollen counts are highest during those breezy, sunny mornings when everything is blooming.

Beyond pollen, a range of indoor allergens can trigger perennial symptoms year-round. Dust mites — microscopic creatures that live in bedding, carpets, and upholstered furniture — are among the most common. Pet dander (the tiny flecks of skin that cats, dogs, and other animals shed) is another major culprit. Mold spores, which thrive in damp environments, can trigger symptoms both indoors and outdoors. And unpleasantly, cockroach droppings and saliva are also recognized as allergens. The ACAAI notes that perennial symptoms tend to worsen in winter, when people spend more time indoors with windows closed and allergens concentrated.

You may also notice that some non-allergic irritants make things worse, such as cigarette smoke, strong perfumes, cleaning sprays, or exhaust fumes. They do not cause hay fever on their own, but they can irritate already sensitive noses and eyes.

There’s also a lesser-known category: occupational rhinitis. If your symptoms are worse at work and better on weekends, you might be reacting to something in your workplace environment — cleaning chemicals, dust, fumes, or other irritants. This is worth discussing with a doctor if you notice a pattern.

The so-called “hygiene hypothesis” suggests that overly clean environments may predispose the immune system to overreact when it finally does encounter a trigger. The idea remains debated, but it is widely discussed in the immunology literature.

How Does It Feel?

The symptoms of hay fever overlap enough with the common cold that it can be genuinely hard to tell the two apart at first. The key difference is that hay fever is not contagious, doesn’t come with a true fever, and tends to linger as long as you’re exposed to the trigger rather than resolving in a week or two like a cold.

Typical symptoms include sneezing (sometimes in rapid-fire bursts), a runny or stuffed-up nose, itchy and watery eyes, an itchy throat or roof of the mouth, and post-nasal drip. More severe cases can cause fatigue, reduced concentration, and disrupted sleep. According to Harvard Health Publishing, the congestion can also lead to secondary complications like sinus infections or ear infections, since swelling can block the passages that normally drain those areas.

For people with asthma, hay fever can be an especially unwelcome companion. The same inflammation that irritates the nasal passages can travel through the airways and worsen breathing problems. The NCBI/InformedHealth.org notes that hay fever symptoms can sometimes “move down” into the lungs and develop into allergic asthma over time — one more reason to take persistent symptoms seriously.

What Can You Do About It?

The good news is that hay fever is manageable, even if it isn’t curable. Treatment generally falls into three strategies: avoidance, medication, and — for more serious cases — immunotherapy.

Avoidance sounds obvious but is easier said than done and takes some planning. Staying indoors on high-pollen days (especially in the morning when counts peak), keeping windows closed, using air conditioning instead of window fans, and showering after being outside can all reduce your exposure. For dust mite allergies, encasing pillows and mattresses in allergen-blocking covers and washing bedding in hot water regularly can make a noticeable difference. The ACAAI also suggests wearing wraparound sunglasses outdoors to limit the amount of pollen that reaches your eyes.

Medications are the backbone of hay fever treatment for most people. Antihistamines work by blocking the histamine response — they’re widely available over the counter and work well for mild-to-moderate symptoms. Older antihistamines (like diphenhydramine, the active ingredient in Benadryl) can cause drowsiness; newer ones like cetirizine (Zyrtec) and loratadine (Claritin) are much less sedating for most people. These make life tolerable for me in the fall and spring. When I was younger, there were days when I wouldn’t venture outside because of the unpleasant symptoms.

Nasal corticosteroid sprays are considered the most effective single treatment for allergic rhinitis by most clinical guidelines. According to MedlinePlus, they work best when used consistently rather than just on symptom days, and many brands — including fluticasone (Flonase) and budesonide (Rhinocort) — are now available without a prescription. Harvard Health advises starting these sprays a week or two before your expected allergy season begins for maximum effectiveness.

Decongestants can help with nasal stuffiness, but nasal spray decongestants (like oxymetazoline) should not be used for more than three days in a row, as they can cause a rebound effect that makes congestion worse. Oral decongestants don’t carry that risk but can raise blood pressure and heart rate, so they’re not appropriate for everyone.

Leukotriene inhibitors — most commonly montelukast (Singulair) — offer another option. These prescription medications work differently from antihistamines and steroids, blocking a different arm of the allergic response. They’re less effective than corticosteroid sprays on their own but can be useful in combination. Antihistamine eye drops are also available for people whose main complaint is itchy, watery eyes.

For people with persistent or severe symptoms that don’t respond well to medications, allergen immunotherapy may be the answer. This is the long game: regular, gradually increasing doses of the allergen itself, either through allergy shots (subcutaneous immunotherapy) or sublingual tablets and drops placed under the tongue. According to the Australasian Society of Clinical Immunology and Allergy (ASCIA), treatment typically runs three to five years and should be overseen by an allergy specialist. It doesn’t cure the allergy, but it can meaningfully reduce the severity of symptoms and lower your dependence on daily medications.

Finally, simple saline nasal rinses are worth mentioning. They’re not glamorous, but rinsing the nasal passages with saltwater (using a neti pot or squeeze bottle) can physically flush out allergens and thin mucus. They’re safe, inexpensive, and effective enough that clinical guidelines recommend them as a complementary strategy. Personally, I’ve found them unpleasant to use, though many of my patients swear by them.

A Final Word

Hay fever is one of those conditions that can feel like a minor inconvenience until it’s not — until it’s disrupting your sleep, tanking your productivity, and making you dread the most beautiful days of the year. The encouraging news is that modern medicine has a pretty good toolkit for managing it. If over-the-counter antihistamines and nasal sprays aren’t cutting it, that’s worth a conversation with your doctor. Allergy testing can pinpoint your specific triggers, and from there, a targeted treatment plan can make a real difference.

There’s something ironic about hay fever: the very environments we associate with health—fresh air, blooming trees, green landscapes—can provoke the body into a defensive overreaction. Understanding that paradox is the first step toward managing it effectively.

In the meantime, maybe check the pollen count before you plan that picnic.

As always, this article is for information only. Consult your health care provider regarding your individual care.

Illustration generated by the author using ChatGPT.

Sources

Cleveland Clinic: Allergic Rhinitis (Hay Fever) — https://my.clevelandclinic.org/health/diseases/8622-allergic-rhinitis-hay-fever

American College of Allergy, Asthma & Immunology (ACAAI): Hay Fever — https://acaai.org/allergies/allergic-conditions/hay-fever/

MedlinePlus (National Library of Medicine): Allergic Rhinitis — https://medlineplus.gov/ency/article/000813.htm

Harvard Health Publishing: Hay Fever (Allergic Rhinitis) — https://www.health.harvard.edu/a_to_z/hay-fever-allergic-rhinitis-a-to-z

NCBI / InformedHealth.org: Overview of Hay Fever — https://www.ncbi.nlm.nih.gov/books/NBK279488/

Australasian Society of Clinical Immunology and Allergy (ASCIA): Allergic Rhinitis — https://www.allergy.org.au/patients/allergic-rhinitis-hay-fever-and-sinusitis/allergic-rhinitis-or-hay-fever

The Easter Bunny: A Surprisingly Serious History

How a German hare hopped its way into American Easter tradition

Every Easter morning, children across America hunt for eggs left by a rabbit. It’s a charming ritual—and a deeply strange one, when you stop to think about it. Rabbits don’t lay eggs. They don’t carry baskets. Yet here we are, every spring, maintaining the fiction with great enthusiasm. Where did this tradition come from? The answer turns out to be a lot more interesting than you might expect.

The story starts in Germany. The earliest documented reference to an Easter Hare—called the “Osterhase” in German—appears in 1682, in a medical text by the physician Georg Franck von Franckenau. In the German tradition, the Osterhase was specifically a hare, not a rabbit, and its job was straightforward: deliver colored eggs to well-behaved children. Naughty children got nothing. This moral dimension—gift delivery tied to good behavior—should sound familiar. The Easter Bunny was, in a sense, an early version of Santa Claus.

The tradition crossed the Atlantic in the 1700s, carried by German Protestant immigrants who settled in Pennsylvania. Their children knew the Osterhase (sometimes rendered as “Oschter Haws” in Pennsylvania Dutch dialect) and kept up the custom of leaving out nests—made from caps and bonnets—for the hare to fill with eggs. Over time, the nests became baskets, the simple colored eggs became candy and chocolate, and the moral judgment quietly dropped away. By the 20th century, the Easter Bunny had transformed from selective gift-giver into universal children’s benefactor.

But why eggs at all? Eggs entered the Easter story long before Germany. For ancient Romans, they symbolized new life and fertility, and the custom of giving dyed eggs as spring gifts predates Christianity. The Christian tradition added another layer: during the Lenten fasting period, eggs were a forbidden food. By Easter Sunday, households had a surplus of accumulated eggs and every reason to celebrate, so they cooked, decorated, and shared them. The emergence from the shell became a visual metaphor for resurrection, and the symbolism stuck.

Rabbits and hares had their own long history as symbols of fertility and springtime. Some writers have linked the Easter Bunny to an ancient Anglo-Saxon goddess named Eostre—from whose name we may get the word “Easter”—and they claim the hare was her sacred animal. It’s a compelling story. It’s also largely unsupported by evidence. The Oxford Dictionary of English Folklore notes that the only historical source mentioning Eostre is the medieval scholar Bede, and Bede says nothing about hares. The goddess-and-hare connection appears to be modern folklore dressed up as ancient tradition.

What is better documented is that hares held symbolic significance across many early cultures. Neolithic burial sites in Europe include hares interred alongside humans, suggesting ritual importance. Hares are conspicuous breeders—they produce multiple litters each year and nest above ground, making their reproductive activity visible in a way that rabbits’ underground burrows do not. For pre-modern peoples marking the return of spring, the hare was a living advertisement for new life.

The combination of egg symbolism and hare symbolism wasn’t a deliberate design decision by any single culture or institution. It was a gradual collision—two powerful images of renewal fusing together over centuries of seasonal celebration. The church absorbed local spring customs rather than eliminating them, allowing pagan associations with fertility and rebirth to persist beneath a Christian overlay. The result is the hybrid tradition we have today.

Today’s Easter Bunny is genuinely a global figure, though not always a rabbit. In Australia, the role is played by the Easter Bilby, an endangered marsupial that conservationists have promoted as a local alternative since the 1990s. Switzerland has an Easter Cuckoo. Parts of Germany have an Easter Fox. Each region adapted the basic concept of a spring gift-bringer to fit its own wildlife and folklore.

The commercial Easter Bunny we know—the chocolate molded figure, the pastel basket, the branded plush toy—is largely a product of the late 19th and 20th centuries, shaped by the same forces that turned Saint Nicholas into Santa Claus. Candy manufacturers, greeting card companies, and department stores found in Easter a spring counterpart to the Christmas retail season, and the Easter Bunny was the obvious mascot.

None of that diminishes what the tradition actually does. The Easter Bunny survived precisely because its meaning kept evolving. It began as a moral enforcer in 17th-century Germany, became a community ritual for immigrant families in Pennsylvania, and eventually became a child’s-eye-view celebration of spring available to secular and religious families alike. The rabbit never needed to make logical sense. It only needed to mark the moment the world turns green again—and every civilization, it seems, finds a way to celebrate that.

Illustration generated by author using ChatGPT.

Sources:

  • Bede, De Temporum Ratione (8th century)
    https://sourcebooks.fordham.edu/basis/bede-reckoning.asp
  • Encyclopaedia Britannica — Easter holiday origins
    https://www.britannica.com/topic/Easter-holiday
  • Catholic Encyclopedia — Lent and fasting traditions
    https://www.newadvent.org/cathen/09152a.htm
  • Smithsonian Magazine — History of Easter Eggs
    https://www.smithsonianmag.com/arts-culture/the-history-of-the-easter-egg-180971982/
  • History.com — Easter Symbols and Traditions
    https://www.history.com/topics/holidays/easter-symbols
  • Library of Congress — Easter traditions in early America
    https://blogs.loc.gov/folklife/2016/03/easter-on-the-farm/
  • National Geographic — Where Did the Easter Bunny Come From?
    https://www.nationalgeographic.com/history/article/easter-bunny-origins
  • American Folklife Center, Library of Congress
    https://www.loc.gov/folklife/
  • National Confectioners Association — Easter candy statistics
    https://www.nationalconfectioners.org/blog/seasonal-easter-candy-data/
  • Smithsonian — How holidays became commercial traditions
    https://www.smithsonianmag.com/history/the-surprising-history-of-holiday-shopping-180964949/
  • Ronald Hutton — The Stations of the Sun: A History of the Ritual Year in Britain
    https://global.oup.com/academic/product/the-stations-of-the-sun-9780192854483
  • University of Pennsylvania Religious Studies overview of seasonal festivals
    https://www.penn.museum/sites/expedition/easter/
