At a recent dinner with friends, while we waited to order our meals, a discussion arose about how we have changed our diets over the past few years. Several of us no longer drink colas, others have decaffeinated themselves, and one or two are lactose-free. We chuckled at what a challenge it is to invite people for dinner. The query of “What can I bring?” has become “Are there any food restrictions?”
I find it odd how our bodies have become sensitive or intolerant to food and food additives as we mature. I also find it interesting that in response to those sensitivities, we have reverted to “purer” preparations, to the extent that we can find the unadulterated ingredients. That is definitely easier when you are the “Master Chef,” but when you rely on others—well, that is a very different scenario.
While regulation of food in the United States dates from early colonial times, it took until 1906 for the Food and Drugs Act, also known as the Wiley Act, to be enacted. Harvey Washington Wiley, Chief Chemist of the Bureau of Chemistry in the Department of Agriculture, was the powerhouse behind this law. Wiley believed unsafe foods were a greater public health crisis than adulterated or misbranded drugs. Moreover, he opposed chemical additives to foods, which he viewed as unnecessary adulterants.1
Interestingly, Upton Sinclair’s The Jungle, an exposé of the revolting state of the meatpacking industry, is credited as the precipitating force behind this meat inspection and comprehensive food and drug law.1 The Wiley Act banned interstate commerce in adulterated and misbranded food and drugs, and further prohibited the addition of any ingredients that would substitute for the food, conceal damage, pose a health hazard, or constitute a filthy or decomposed substance. Prior to that, basic elements of food protection were absent.
Despite these inroads, however, concerns about food and drug safety continued, and in 1938, President Franklin D. Roosevelt signed the Food, Drug, and Cosmetic Act into law.2 This corrected abuses in food packaging and quality, and it mandated legally enforceable food standards. The first food standards issued under the 1938 act were for canned tomato products; since the 1960s, about half of our food supply has been subject to a standard.
Almost 100 years after the establishment of the Wiley Act, we continued to be plagued with concerns about our food and the contents therein. To address these concerns, in 2004, the passage of the Food Allergen Labeling and Consumer Protection Act required the labeling of any product that contains a protein derived from any of the following foods that, as a group, account for the vast majority of food allergies: peanuts, soybeans, cow’s milk, eggs, fish, crustacean shellfish, tree nuts, and wheat.3 This was an important move, as studies indicate that more than 11 million Americans have one or more food allergies, which are considered a component of chemical intolerance.
The term chemical intolerance (CI) is used to describe the loss of prior, natural tolerance to common foods and drugs that occurs in certain individuals.4 In population-based surveys, participants report a 2% to 13% prevalence of CI.5 Researchers have also found that patients with CI had an increased incidence of poorer functional states and a tendency toward increased use of the health care system, compared with persons without CI.4
Food additives are chemicals used to enhance the flavor, color, or shelf life of food. Although they are now carefully regulated by federal authorities and various international organizations to ensure that foods are safe to eat and are accurately labeled, in my opinion, they continue to be the most concealed and dangerous sources of CI. The CI recognized as food allergy can be a potentially serious immune response to eating specific foods.
The incidence of allergies to food, or food additives, is on the rise. In children younger than 18 alone, there was an 18% increase in the prevalence of reported food allergy between 1997 and 2007.6 Often after dining out, those with CI or food allergies suffer for days with gastrointestinal, atopic, cardiovascular, or respiratory symptoms. Anyone who has ever tried to identify exactly what, why, or how they became ill after eating knows how frustrating and sickening it is to go through the process. They also know that as little as one taste of an offending substance can send them to bed for a day—or worse, to the emergency department (ED).
I never cease to be amazed at the carelessness of some food preparers. As you can imagine, I am outraged that people with food allergies or intolerance seem to be viewed as “picky eaters.” Yes, we are picky: We choose not to be ill after eating in your establishment! I was asked once if I “couldn’t just pick out” the allergen in my dish. My response was “Sure, right after you pick out where the LifeFlight helicopter can land after I eat this!”
You have the backdrop; now back to our dinner escapades. Our waiter informed us of the daily specials. We listened carefully to his presentation about each dish and the chef’s preparation of it. When he asked if we had any questions, three of us posed queries related to our individual food restrictions. And so began the tribulations. Our meals arrived containing the very items or preparations that each of us had specifically said were taboo. Thankfully, keen eyes and a great sense of smell intervened, sparing us the guaranteed illness, or evening in the ED, that would have followed had we trusted the kitchen crew.
Food allergies and intolerance are no joke. Those of us with food allergies or intolerance are ever vigilant about reading labels and informing others about our food restrictions. It is imperative that others who prepare food for us be as attentive. Knowing what ingredients are in each dish is important, but also knowing what is in the “base” of how the dish is prepared is critical to preventing dining disasters. That, my friends, is the responsibility of the cook and the servers.
Hold the mushrooms and coconut, please! If you’d like to share your dining disaster, please send it to NPEditor@qhc.com.
1. FDA History—Part 1: The 1906 Food and Drugs Act and Its Enforcement. Available at: www.fda.gov/AboutFDA/WhatWeDo/History/Origin/ucm054819.htm.
2. FDA History—Part 2: The 1938 Food, Drug, and Cosmetic Act. Available at: www.fda.gov/AboutFDA/WhatWeDo/History/Origin/ucm054826.htm.
3. About FDA: Significant Dates in U.S. Food and Drug Law History. Available at: www.fda.gov/AboutFDA/WhatWeDo/History/Milestones/ucm128305.htm.
4. Katerndahl DA, Bell IR, Palmer RF, Miller CS. Chemical intolerance in primary care settings: prevalence, comorbidity, and outcomes. Ann Fam Med. 2012;10(4):357-365.
5. Caress SM, Steinemann AC. Prevalence of multiple chemical sensitivities: a population-based study in the southeastern United States. Am J Public Health. 2004;94(5):746-747.
6. Branum AM, Lukacs SL. Food allergy among US children: trends in prevalence and hospitalizations. NCHS data brief, no 10. Hyattsville, MD: National Center for Health Statistics; 2008.