The key finding in the study, though, wasn’t that more people had allergies; that was an accepted observation already. It was who had them and who didn’t. The author, epidemiologist David Strachan, reported that people then in their twenties, who had been part of a huge, lifelong study of British children born in 1958, seemed less likely to have hay fever if they had grown up with older siblings. The implication was that the older sibs—who would have been leaving the house, going to school, and running around outdoors with friends while the toddlers stayed home—were exposing the younger kids to something they brought home. It was a protection that wouldn’t have been available to an eldest or only child—people who, in this original research, had higher rates of hay fever than younger siblings did.
The possibility that early exposure to something prevented later trouble was intuitively appealing, and it led to a cascade of research associating allergies, eczema, and asthma with hygienic modern living. Many observational studies reported that allergies and asthma were less likely in people whose childhoods were spent outside cities, who were put in day care as infants, or who grew up with pets or were raised on farms—leading overall to a conclusion that messy, dirty premodern life was healthier for a growing child.
This led to a backlash—a sense that parents desperate to avoid allergies were neglecting basic cleanliness—and to a reframing of the hygiene hypothesis. Version 2.0, formulated by Rook in 2003, proposes that the source of allergies isn’t a lack of infections, but rather deprivation of contact with environmental organisms that were our evolutionary companions over millennia. Rook called this the “old friends” hypothesis, suggesting that exposure to those organisms allowed our immune systems to learn the difference between pathogens and inoffensive fellow travelers.
While this rethink was occurring, lab science was acquiring the tools to characterize the microbiome, the films of bacteria and fungi that occupy the external and internal surfaces of everything in the world, including us. That helped recast the exposures that kids received in those observational studies—to animals, other children, dung, dander, and dust—not as infectious threats, but as opportunities to stock their microbiomes with a diverse array of organisms.
And that recognition in turn led to Version 3.0, the hygiene hypothesis as it exists now. Renamed the “disappearing microbiota” hypothesis and reformulated 10 years ago by microbiologist Stanley Falkow (who died in 2018) and physician-researcher Martin J. Blaser, this iteration proposes that our microbiomes train and regulate our immune systems. It also warns that our microbial diversity is becoming depleted, and thus less protective, because of the impact of antibiotics, antiseptics, and poor diets, among other threats.
That’s a quick tour of the contention that a lack of exposure—to childhood infections, environmental bacteria, and other opportunities to recharge microbial diversity—lets immune systems fall out of balance with their surroundings. It’s an idea that today is broadly accepted in pediatrics and immunology, though the surviving proponents of the various versions might disagree over details. But what does it mean for our immune systems as we emerge from combating Covid-19? The hypothesis can’t say exactly what will happen, because so far researchers have data only on the prevalence of viral infections, not on other types of exposures. But that data is provocative.
In the southern hemisphere, where flu season overlaps the northern hemisphere’s summer, there was “virtually no influenza circulation” in 2020, according to a CDC report in September. The agency hasn’t yet published its final report on the US experience with the flu this winter, but the World Health Organization reported last month that flu activity remained “below baseline” throughout the northern hemisphere.