
I’ve been spending a lot of time recently researching and writing about the 250th anniversary of the American Revolution, and I keep asking myself, “What’s up with the wigs?” Have you ever wondered why the Founding Fathers look so impossibly fancy in their portraits? Well, you can thank a French king and a syphilis epidemic. The elaborate wigs worn by early American leaders weren’t just fashion statements—they were complex social symbols that said everything about who you were, what you could afford, and how seriously you wanted to be taken.
Where It All Started
The wig craze didn’t begin in America. It started across the Atlantic when France’s King Louis XIII went bald prematurely in the 1600s and decided to cover it up with a wig. But it was his son, Louis XIV, who really kicked things into high gear. When the Sun King started losing his hair, he commissioned elaborate wigs that became the epitome of aristocratic style. European nobility, desperate to emulate French sophistication, quickly followed suit.

The practice also had a less glamorous origin story. Syphilis was rampant in 17th-century Europe, and one of its unfortunate side effects was hair loss. Wigs conveniently covered up this telltale symptom while also hiding the sores and blemishes that came with the disease.
Europe in the 1600s and 1700s also had frequent outbreaks of lice and other parasites. Shaving one’s natural hair short and wearing a wig—which could be cleaned, boiled, or deloused more easily—became a practical solution. Powdering helped keep wigs fresh and masked odors.
By the time the fashion crossed the ocean to colonial America in the early 1700s, wigs had become standard attire for anyone with social pretensions.
Status on Your Head
In colonial America, your wig announced your place in society before you even opened your mouth. The most expensive and elaborate wigs featured long, flowing curls that cascaded past the shoulders—these full-bottomed wigs could cost the equivalent of several months’ wages for an average worker. Wealthy merchants, successful plantation owners, and colonial officials wore these statement pieces to project authority and refinement.
Professional men like doctors, lawyers, and clergy typically wore more modest styles. The “tie wig” gathered hair at the back with a ribbon, while the “bob wig” featured shorter hair that ended around the neck. These styles were practical enough for men who actually had to work, but still formal enough to command respect. Even the style of curl mattered—tight curls suggested conservatism and tradition, while looser waves indicated a more progressive outlook.
Working-class men generally couldn’t afford real wigs. Some wore simple caps or went bareheaded, while others might invest in a cheap wig made from horsehair or goat hair for special occasions. The quality difference was obvious—human hair wigs, especially those made from blonde or white hair, were luxury items that only the wealthy could obtain.
Many men who did not wear wigs but still wanted the fashionable look would grow their own hair long, pull it back into a queue (a ponytail), and powder it. George Washington is a good example: portraits show his natural hair powdered white, not a wig.
The Daily Reality of Wig Life
Maintaining these hairpieces was no joke. Owners had to powder their wigs regularly with starch powder, often scented with lavender or orange, to achieve that distinctive white or gray color that signaled refinement. The powder got everywhere, which is why men often wore special dressing gowns during the powdering process.
Wigs required regular cleaning and restyling by professionals called peruke makers or wigmakers. These craftsmen commanded good money in colonial cities, advertising their services alongside other luxury trades. The hot, humid summers in places like Virginia and South Carolina made wig-wearing particularly miserable, but fashion demanded sacrifice.
The Revolutionary Shift
By the time of the American Revolution, attitudes toward wigs were already changing. The shift happened for several interconnected reasons, and it reflected broader transformations in American society.
First, the Revolutionary War itself promoted practical thinking. Military officers found elaborate wigs impractical in the field, and the democratic ideals of the Revolution made aristocratic European fashions seem pretentious. Many younger revolutionaries, including Thomas Jefferson, stopped wearing wigs as a political statement against Old World affectation.

A young Jefferson with a wig
Second, France—the original source of wig fashion—underwent its own revolution in 1789. As French revolutionaries literally beheaded the aristocracy, powdered wigs became associated with the despised nobility. What had once symbolized sophistication now suggested tyranny and excess.
In Great Britain, Parliament introduced a tax on hair powder in 1795 as part of Prime Minister William Pitt the Younger’s revenue-raising measures. The law required anyone who used hair powder to purchase an annual certificate costing one guinea (a little over $200 in today’s money). This contributed to the growing sense that wigs were an unnecessary extravagance. Meanwhile, changing ideals of masculinity emphasized natural simplicity over artificial ornamentation.
By the early 1800s, the wig had largely disappeared from everyday American life. A new generation of leaders, including Andrew Jackson, proudly displayed their natural hair. The transition happened remarkably quickly—within a single generation, wigs went from essential to absurd. By the 1820s, anyone still wearing a powdered wig looked hopelessly outdated, clinging to a world that no longer existed.
The Legacy
Today, elaborate wigs survive primarily in British courtrooms, where some judges still wear them in formal proceedings—a deliberate echo of legal tradition. The powdered wigs of the Founding Fathers remain iconic, instantly recognizable symbols of early American history, even though the men who wore them were already abandoning the fashion by the time they built the new nation.

When Evidence Isn’t Enough: The Crisis of Science in Public Life
By John Turley
On September 23, 2025
In Commentary, Politics
While I would never call myself a scientist, as a physician my whole professional life is built on trust in science. I am distressed that so many people have chosen to abandon that trust in favor of misinformation.
Throughout history, scientific discovery has been humanity’s most reliable guide to progress. From the germ theory of disease to space exploration, science has reshaped how we live and what we believe possible. Yet in recent years, the very foundation of this methodical pursuit—evidence, observation, and experimentation—has come under sustained political, cultural, and economic attack. This struggle is often described as “the war on science,” a phrase that captures how debates once rooted in policy have shifted into battles over truth itself.
The numbers tell a stark story. The National Science Foundation has terminated roughly 1,040 grants that would have awarded $739 million to researchers, and it has awarded only 52 undergraduate research grants in 2025, compared to about 200 annually since 2015. The proposed cuts are staggering: the administration’s budget request for fiscal year 2026 would fund the NSF at about $4 billion, a 55% reduction from what Congress appropriated for 2025.
At the heart of the conflict lies mistrust. Science requires patience, since answers evolve as new data emerge. But in a world driven by instant communication and ideological certainties, that evolving nature is often cast as contradiction or weakness. Critics dismiss changing conclusions not as hallmarks of rigorous inquiry, but as evidence of unreliability. The result is a dangerous fracture: science depends on trust in evidence, while many segments of society increasingly place their trust in ideology, anecdote, or even outright falsehoods.
Climate change is one of the most visible fronts in this battle. Virtually every major scientific body worldwide affirms that human activities are driving global warming. Yet climate scientists are routinely accused of bias or conspiracy, their data questioned, and their motives impugned. What is often overlooked in the controversy is not the complexity of climate systems—scientists have long acknowledged uncertainties—but the political and economic interests threatened by the solutions science prescribes. When climate scientists publish evidence of global warming, their research doesn’t just describe weather patterns—it challenges powerful industries built on fossil fuels.
Public health provides another stark example. During the COVID-19 pandemic, scientific guidance became subject to fierce political polarization. Masking policies, vaccine safety, and even simple social distancing rules morphed into partisan symbols rather than matters of medical evidence. Scientists found themselves vilified, their professional debates distorted into talking points. The losers in this exchange were not the scientists themselves but the broader public, denied clear trust in institutions that are dedicated to safeguarding health.
Underlying these conflicts are powerful currents. Some industries resist regulation by casting doubt on findings that threaten profit. Certain political movements thrive on skepticism of expertise, channeling populist distrust of “elites” toward scientists. And in the swirl of social media, misinformation spreads more rapidly than peer-reviewed studies, eroding the influence of evidence before consensus can take hold.
What makes this particularly concerning is the timing. America’s main scientific and technological rivals are rising fast. Federal research and development funding as a share of GDP has declined for decades, and the lead the U.S. once enjoyed over China in R&D expenditure has largely been erased.
While the war on science is often treated as a distinctly modern dilemma, born of political polarization, mass media, and cultural distrust of expertise, its roots stretch back centuries. Galileo was silenced for challenging religious dogma. Early physicians were scorned when they argued that invisible germs, not miasmas or curses, caused disease. During the Enlightenment of the 17th and 18th centuries, thinkers faced their own version of this struggle—a battle between dogma and reason, authority and evidence, tradition and discovery. In every case, vested interests—whether theological, cultural, or economic—feared the disruption that scientific truth carried. Understanding those earlier conflicts provides valuable context for our challenges today.
The stakes today, however, feel higher. Our era’s challenges—climate change, pandemics, artificial intelligence, genetic engineering—demand unprecedented reliance on scientific understanding. To wage war on science is, in effect, to wage war on our own best chance for survival and responsible progress. If truth becomes negotiable, then evidence loses meaning, and with it, the possibility of reasoned self-government. That is why the war on science cannot be dismissed as a technical squabble—it is a philosophical contest echoing the Enlightenment battles that shaped modern civilization.
Ultimately, the struggle is less about data than about values. Do we commit to curiosity, openness, and the willingness to change our minds? Or do we cling to certainties that soothe but endanger us in the end? The war on science will not be won by scientists alone. It can only be resolved if society restores trust in evidence as the most reliable compass we have—however unsettling the direction it may point. There may be alternative opinions but there are no alternative facts.