You’re trying to think of a treat to send to school for your son’s birthday. However, it can’t contain peanuts, milk, strawberries, or tree nuts because several children in his class have food allergies. Recently, food allergies seem to be on the rise. Is this true? If so, why is this happening?
Food allergies are a significant public health concern. According to the nonprofit Food Allergy Research and Education (FARE), a food allergy reaction sends someone to an emergency room every 3 minutes in the United States. In some cases, the result is fatal.
An allergy occurs when the immune system fights substances in the environment, usually proteins, that it should see as harmless. Instead, it treats them as invaders and triggers a reaction to attack them. Reactions vary from person to person. They can also differ from one episode to the next in the same person.
A minor reaction can involve skin redness, hives, itching, swelling, and gastrointestinal upset (e.g., vomiting and diarrhea). A severe reaction, known as anaphylaxis, includes wheezing, dizziness, a drop in blood pressure, and difficulty breathing. Anaphylaxis can lead to death if not treated promptly. Symptoms appear within minutes to hours after eating the triggering food.
A concerning trend is that there’s been an increase in the number of people with food allergies in the past several decades. Multiple peer-reviewed studies indicate the rate of food allergies worldwide has increased from 3% of the population in 1960 to about 7% in 2018. One evaluation of hospital admissions data found anaphylaxis cases on the rise globally. In the U.S., hospital visits for food allergies increased threefold from 1993 to 2006.
Children are more likely to be affected than adults. Between 6% and 8% of children are thought to have food allergies, compared with less than 3% of adults. Research from the Centers for Disease Control and Prevention (CDC) shows that food allergies in children increased by roughly 50% between 1997 and 2011. This means that 1 in 13 children (about two students in every classroom) has a food allergy. The agency estimates that more than 40% of children with a food allergy have experienced a severe reaction, including anaphylaxis.
Statistics show that the number of medical procedures required due to anaphylaxis increased almost fivefold between 2007 and 2016. Data from Grand View Research shows that allergy diagnosis and treatment make up a nearly $26 billion market.
The range of foods to which people are allergic has expanded as well. There used to be two or three main food allergies, with peanuts and eggs being the most common. Now, scientists have identified more than 170 allergenic foods. However, about 90% of allergic reactions are caused by just eight foods: milk, eggs, peanuts, tree nuts (e.g., walnuts, almonds, pine nuts, Brazil nuts, and pecans), soy, wheat, fish, and shellfish (crustaceans and mollusks).
One theory for the rise in food allergies is simply that we're more aware of them. Allergy specialists disagree: diagnostic methods haven't changed enough to account for the increase. Instead, they're looking at various factors in how we live, particularly in industrialized societies. The increased sensitivity to foods is believed to be related to Western lifestyles, because the evidence shows lower food allergy rates in developing countries. Even within developed countries, there's a difference between urban and rural areas.
A concept that supports this idea is the “hygiene hypothesis.” It was first postulated in 1989 and has been debated ever since. Many physicians and organizations, including the Food and Drug Administration (FDA), are taking a closer look at it. The hygiene hypothesis holds that a lack of exposure to infectious agents early in childhood creates a scenario in which the immune system mistakes a food protein for an invading germ.
Our gut hosts a microbiota of bacteria that helps our body survive. It’s so substantial that the microbiota outnumber the 27 to 37 trillion cells in our body by around three times, meaning each of us carries roughly 100 trillion bacteria. When we’re born, our body comes into contact with a wide array of bacteria in the air, on the ground, and in our diet, which helps populate our gut microbiota. When we’re not exposed to these bacteria, our bodies are at a disadvantage.
The lack of exposure is the result of Western society’s obsession with fighting germs—it’s quite common for people to want to disinfect everything. This means the microbiota we’re exposed to in our homes is changing because it bears no relation to the outside world in which we evolved.
Parasitic infections are usually fought by the same immune mechanisms involved in allergies. With fewer parasites to fight, the immune system turns against things that should be harmless. Another contributor is the increasing use of antibiotics to treat infections. Most antibiotics are broad-spectrum, so they kill bacteria indiscriminately, including the healthy ones in our gut.
New research released this year by scientists at Yale University supports the notion that overuse of hygiene products and antibiotics may have driven the increase in food allergies in the Western world over the past 30 years. Besides the absence of natural microbial exposure, the researchers say that unnatural substances in the modern environment, such as processed foods and environmental chemicals like dishwashing detergent, play a role in disrupting our internal food quality control.
Our internal food quality control works like this: if something smells or tastes bad, we don’t eat it. There are similar sensors in the gut. For instance, if we consume toxins, they are detected and expelled. That gut response is part of the immune system’s way of neutralizing a threat, and it’s the same mechanism triggered in food allergies. The research indicates that the lack of natural threats, such as parasites, makes the immune system hypersensitive and more likely to respond to harmless proteins found in certain food groups.
A different model is the “dual allergen exposure” theory, which suggests food allergy development comes down to the balance between the timing, dose, and form of exposure. When most of these food allergies started appearing in the 1990s, people got very worried about introducing foods, such as peanuts, into babies’ diets. This led to guidance from medical professionals that said, “Don’t give these foods to your baby until they’re three years old.” However, the advice wasn’t based on any evidence.
Scientists now know that parents should be doing the opposite: introducing allergenic foods to babies as early as possible. The reasoning is that just because an infant doesn’t eat peanuts doesn’t mean they won’t encounter people who have. If the child hasn’t eaten peanuts, contact through the skin alone can trigger a response from their immune system.
Experts phrase it this way: “Through the skin allergies begin; through the diet allergies can stay quiet.” This is why they recommend introducing a diverse range of foods starting around three or four months of age because it’s a window of opportunity to establish tolerance. The theory is that eating allergenic foods during this period “trains” the gut’s immune system to tolerate bacteria and foreign substances, like new foods.
One study found that introducing peanuts to children between 4 and 11 months of age lowered their chances of having a peanut allergy at the age of five by 80%. The World Health Organization (WHO) recommends exclusive breastfeeding for the first six months before introducing other foods, but advises “to not delay the introduction of allergenic foods such as peanut and egg beyond that, as this may increase the risk of allergy, particularly in kids with eczema.”
Scientists are also looking to the environment as a reason for the explosion in food allergies. One sign that food allergies vary with environment is the near absence of peanut allergies in countries where the population barely eats peanuts. Also, migrants show a higher prevalence of asthma and food allergies in their adopted country than in their country of origin.
Researchers have noticed that food allergy prevalence seems to coincide with the availability of sunlight. Sunlight helps our body produce vitamin D, which performs an essential role in developing immunoregulatory mechanisms. In urban and suburban areas, people are spending more and more time indoors, depriving themselves of this crucial nutrient. If we do go outside, we cover our exposed skin with sunscreen. All of this has led to the vitamin D deficiency rate almost doubling in the U.S. in just over a decade.
According to climatologists, 2000-2009 was the hottest decade on record, and the average annual temperature could rise by 10°F in the coming decades. This will lengthen the growing season for plants, increasing pollen and allergen counts for more extended periods and affecting rural and suburban communities alike. A different environmental element is the prenatal impact on allergy development, which remains unknown. In 2008, a wide range of studies failed to find a definite link between prenatal diet and food allergies.
Given the rise of allergies in children, it makes sense that more adults would have allergies, too, since those children grow up. However, this doesn’t wholly account for the prevalence of food allergies among adults. Per a study published in JAMA Network Open, as of 2019, more than 10% (26 million) of adults in the U.S. are estimated to have a food allergy. One in four adults with a food allergy reported developing their first allergy in adulthood. There hasn’t been much research into why this occurs.
However, a 2015-2016 survey of more than 40,000 adults across the country provides some data. Among adults with food allergies, 51.1% have had a severe reaction, and 38.3% report at least one reaction that required emergency care. Research also shows that 19% of adults think they have a food allergy but don’t know for sure. Only 5% have a doctor-confirmed diagnosis, and 24% of adults with a food allergy report a current epinephrine prescription.
Among adult food allergies, shellfish tops the list, affecting 7.2 million people. Other allergens are still widespread: milk affects 4.7 million people, peanuts 4.5 million, tree nuts 3 million, finfish 2.2 million, eggs and wheat 2 million each, soy 1.5 million, and sesame 0.5 million. As for causes of adult-onset food allergies, researchers are focusing on hormone changes, like those that occur during pregnancy, genetic factors, and environmental triggers.
Diagnosing a food allergy is challenging. Self-reporting isn’t reliable because three to four times as many people think they have a food allergy as actually do. This is partly because food intolerances are confused with food allergies, since some symptoms overlap. Often, this leads people to self-diagnose without confirmation from an expert, such as an allergist. It’s essential to get a proper diagnosis so you know how to manage your condition daily. Self-diagnosis can limit your diet unnecessarily and prevent you from receiving appropriate care, like epinephrine auto-injectors that can save your life in the event of a severe allergic reaction.
The other issue is that there aren’t optimal diagnostic tools since there isn’t one skin or blood test that can check for all possible allergens. The “gold-standard” food allergy test is the oral food challenge. It involves feeding a small amount of the food in question to the potentially affected person in a clinical setting. Slowly, they consume increasing doses up to a full serving. If the patient has a reaction during the test, it’s stopped, and they’re treated.
The test is time-consuming, costly, and comes with risks, so it’s not an ideal way to find an answer. For children, experts recommend getting tested once a year to see if they’ve outgrown their food allergies. Many do, especially those with a milk or egg allergy. For peanuts, about 1 in 5 children outgrow the allergy.
Unfortunately, there isn’t currently a cure for food allergies. Instead, the condition is managed by avoiding the offending foods, which is challenging and a burden on social and family lives. Also, individuals and families need to have an emergency treatment plan in case of exposure. This means many patients and families live with fear and anxiety.
One treatment currently being studied and yielding promising results is immunotherapy, or desensitization. It entails consuming minuscule but increasing amounts of allergenic foods, has been shown to reduce sensitivity in allergic patients, and can protect against accidental exposure. However, it’s important to note that this is an ongoing treatment: when individuals stop taking it regularly, their allergic response returns to what it was before the process. New research is looking at the use of probiotics and drug treatments.
The rise in food allergies, and the deaths they cause, highlights the importance of clear and accurate labeling. Allergy experts hope that the food industry is paying attention to this data. Not only do companies need to be careful with labeling, but restaurants also have to be cautious with serving. Patients with food allergies rely on others to keep them safe, which means other people must have a general understanding of food allergies.
Many factors have played a role in how our immune systems respond to food products. We need to understand better what they are while improving our diagnostic capabilities and treatment offerings. Even if you’re not personally impacted by a food allergy, you probably know someone who is. We each play a role in ensuring that food is safe for everyone!