I have come to realize, with both running this site and its forums, that restrictive eating disorder behaviors are normative in our society, not aberrant.
Even though the majority of people in our society have no neurobiological underpinning for misidentifying food as a threat (as happens for those with eating disorders), we are all striving to do so nonetheless.
We are all eating disorder wannabes.
With all these eating disorder wannabes (your family, your friends, your doctor, your therapist, your dietician, your treatment teams, your casual acquaintances, strangers in the street) clamoring to compliment you on your thinness; recommending you “stop gaining now”; or perhaps letting you know it’s time to “slow your weight restoration so you don’t become unhealthy”, the overriding message for you in recovery these days is: “You just need to not take things to extremes.”
The MinnieMaud Guidelines approach to recovery not only flies in the face of the glut of unscientific approaches to recovery from restrictive eating disorders, it is also a profoundly disturbing approach for the population at large. The socio-culturally abhorrent nature of this model of recovery renders my efforts to persuade others of its scientific basis rather depressing. It creates a decided going-nowhere feel for me.
Dr. Semmelweis' experience with ingrained, yet unscientific, medical practice is as alive today as it was back in the 1840s.
Semmelweis was a Hungarian physician who was able to prove, through experiment, that the failure of medical interns to wash their hands was directly correlated with the incidence of puerperal fever (childbed fever) in maternity wards of the day. By instituting a strict procedure of having all interns who had performed autopsies wash their hands before they entered the Maternity Department at the hospital in Vienna, Semmelweis achieved an immediate drop in the incidence of fatal puerperal fever from 10% to 1-2% [Semmelweis Society International, 2013].
In fact, the tendency to reject new evidence because it contradicts an established norm is called the Semmelweis reflex [ibid].
Semmelweis became increasingly outraged that the medical establishment did not implement his statistical proof of the necessity of hand washing to save lives. He postulated that some kind of autopsy-related particles were involved in the incidence of puerperal fever, but in the absence of germ theory (that micro-organisms are responsible for many human illnesses), he had evidence that was compelling yet unpersuasive to the establishment of the day.
He resorted to publicly denouncing his colleagues as murderers in numerous letters. In 1865 he was committed by his family to a mental asylum and forcibly restrained and beaten. The injuries became infected and he died of septicemia two weeks later.
Within two years of Semmelweis’ death, Joseph Lister, on the heels of Louis Pasteur’s newly-minted germ theory, was able to apply chemical solutions that would stop the spread of bacteria (Listerine®, anyone?). Too bad Semmelweis was not around to enjoy any vindication.
Well, maybe not vindication after all:
“Each year in the United States, more people die from bacterial infections they caught while receiving care in their own local, community and private hospitals than from: AIDS, breast cancer and automobile accidents combined. It’s a national disgrace that annually, more than 99,000 people never survive the infections they accidentally catch while being cared for as patients in American hospitals.”
[Safe Care Campaign, 2013]
In fact, a study of hand hygiene compliance published in 2009 in the American Journal of Medical Quality indicates compliance remains unsatisfactory throughout both intensive care and non-intensive care units in hospitals across the country. Even after intervention and training, most ICUs and non-ICUs barely get above compliance rates of 50%, with nurses improving in compliance far more than doctors, and continuing misuse of gloves being a foundational aspect of continued poor hand hygiene compliance [ibid; J Randle et al., 2006; D Pittet et al., 2000; E Girou et al., 2004].
Nutrition as Religion
My genetics predispose me, and my family members, to many unpleasant and occasionally life-threatening chronic conditions. There isn’t a single one of us who has not attempted all manner of dietary changes, supplements, nutrient removals and additions to attempt to alleviate our symptoms.
Why do we think being unconcerned with what we eat will make us ill and kill us, yet being concerned and vigilant will make us healthy and long lived? Because it is common sense of course!
But common sense can be common nonsense and parsing the sense from the nonsense is not easy.
I don’t believe being unconcerned with what we eat is necessarily the best model to adopt. However, our vigilant attention drives us to ignore that food is always greater than the sum of its constituent parts.
As you know, I have Michael Pollan’s books listed in the reading section on this site. I enjoy his writing and his equivocal stance on many things, but he swims as much in the cultural waters of the U.S. as any of us—in fact he swims in a particular socioeconomic pond that is not terribly big or inclusive.
What I believe Pollan does have right is that American culture lacks a food culture. This absence of food culture has degraded and reduced connection, cooking and communion to become nutrients, socioeconomic moralizing and pervasive loneliness.
And the American empire has naturally spread its approach to food across the globe.
In a society where connection, cooking and communion are absent, one’s own mortality becomes even more daunting. It is as if nutrition labels have replaced all other totems or talismans of comfort that were interwoven in the world’s religions to allow us to have a ready reminder of solace and comfort.
How many of you are aware that you likely extract more nutrients from food if the food is culturally relevant to you? [L Hallberg et al., 1977]. Sadly there has been little follow up in the scientific community on this topic. The extent to which food functions as a stressor as opposed to a pleasure varies across cultures. Not surprisingly, viewing food consumption as a health-oriented endeavor is more stressful and correlates with increased ill health [P Rozin et al., 1999].
Nutrition as Class
In 2010, 46.8% of Americans self-identified as working class [General Social Survey, 2010]. However, when the differentiation offered on the survey is between middle class and lower class (rather than working class), the majority will self-identify as middle class.
There are various models for identifying socioeconomic status in our society. I use the Beeghley model: Super-rich (less than 1%), Rich (5%), Middle Class (46%), Working Class (40%) and the Poor (12%).
The share of annual income spent on food in the United States is 6.8%. By comparison, the share of annual income spent on food in Croatia is 26.1%. The percentage of children malnourished in both countries is less than 5% [Washington State University Magazine, Fall 2011]. In fact, Americans spend the smallest share of their income on food of any country; most European countries are at 10% of annual income and above.
In 1963, Americans spent roughly a third of their income on food [M Orshansky, 1963]. The shift to extremely cheap food happened in the 1970s with massive increases in corn production alongside goosing the entire petroleum production system to process and transport it all. The Green Revolution was sprouted not from soil, but from Oil.
On the surface, massively reducing the share of annual income that any one family had to spend to feed itself should have resulted in the eradication of poverty and the upward mobility of the working class and poor towards the middle and rich classes in America and the rest of the world.
In practice, it has rendered the middle class, working class and the poor all progressively less resilient. Our moralistic focus on the utter fallacy that is the ‘obesity’ epidemic has rendered us completely oblivious to the real danger of cheap food.
Journalist Mark Bittman skillfully argued last year in his New York Times piece Is Junk Food Really Cheaper? that junk food is not cheaper than a “nutritious” meal. He, along with folks like Marion Nestle and of course Michael Pollan, reinforces the prevailing attitude that eating ultra-processed foods is a choice, and not even an economic one at that.
“The fact is that most people can afford real food. Even the nearly 50 million Americans who are enrolled in the Supplemental Nutrition Assistance Program (formerly known as food stamps) receive about $5 per person per day, which is far from ideal but enough to survive. So we have to assume that money alone doesn’t guide decisions about what to eat.”[M Bittman, September 24, 2011]
There is nothing more irritating than listening to our own narrow socioeconomic enclave obliviously speak to one another about $5 per day being enough to survive. I highly doubt Bittman has attempted it, and I most certainly won’t by choice.
He goes on to say: “The core problem is that cooking is defined as work, and fast food is both a pleasure and a crutch.” Well, cooking is work when you have to do it on top of two jobs, long bus rides, shift work and raising a family. I would hope that anyone actually accomplishing all of that would still have the energy to swat Bittman across the face for his sanctimonious superiority. Just try to get to a Safeway without a car at 8 pm almost anywhere in the U.S. you’d care to name.
The real danger of cheap food, and frankly it’s all cheap food whether you step into the Safeway, Waitrose, Aldi, Coop, SPAR, Woolworths, Tesco, Wellcome, Jumbo, Whole Foods or McDonald’s, is that there is no room to make any further family budget cuts, and yet the vast majority of us are increasingly compelled to do so in our current economic reality.
If anyone wants to joust with me on people getting sick-obese because of junk food and eating in unrestrictive ways, then they best be ready to explain how all these sick-obese people are otherwise getting sufficient sleep; not unduly stressed or worried about shelter and keeping their heads above the financial waterline; not exposed to myriad obesogenic endocrine disruptors in their store receipts, soaps, laundry detergents, air ‘fresheners’, lotions and potions; and that these ‘gluttonous lazy pigs’ have also never dieted in their lives either.
“When presented with negative outcomes, people often engage in counterfactual thinking, imagining various ways that events might have been different.”
[R Janoff-Bulman et al., 1985].
Likely some of you are aware of the literature surrounding the groundswell of both system justification and victim blaming that occurred across the U.S. in the wake of Hurricane Katrina (2005). Both system justification and victim blaming arise automatically as a result of counterfactual thinking.
“We propose that the social system was indirectly threatened for the public when inadequate relief efforts exposed governmental shortcomings, called into question the legitimacy of agency leadership, and highlighted racial inequality in America. In response to such system threats, both victims and observers (e.g., the general public, commentators, policy makers) are known to engage in various forms of system justification, including direct defense of the status quo, victim blaming, stereotyping, and internalization of inequality. These processes can reduce emotional distress and restore perceived legitimacy to the system, but they may have a number of troubling consequences for the storm victims in their efforts to return to normalcy.”
[JL Napier et al., 2006]
We are busily engaging in counterfactual thinking to reduce our emotional distress and restore perceived legitimacy to our systems of food production, preparation and consumption, while ensuring very troubling consequences for both those with restrictive eating disorders and those dealing with adipositis (inflammatory obesity).
Oh, and I’ll be using that term adipositis from now on with due credit to Dr. Carl Nathan.