This is a series of articles on one of my favorite topics. You can read Part 1 here.
In last Saturday’s piece, I explored how the devaluation of cooking, a seemingly innocent concession to modern notions of hyper-convenience, helped de-ritualize eating as a group affair and train us, as cultural actors, to take an aristocratic view of food as a panoply of eating delights. Just as the English baron and his family never once cooked a meal for themselves, we, too, can opt out of cooking very easily thanks to modern food processing and the spread of fast food. Even working-class Americans can outsource some ‘cooking’ at minimal expense (compared to other items in their monthly budget).
Devaluing cooking as optional does not make us obese directly, but it allows us to view food only as something we eat and eat in a state of abundant choice and sensory overload. If you have ever lived in a rural peasant society or even visited such a village for one weekend, you immediately understand how precious food is, how sacred cooking is (because it’s the cheapest way to feed a group), and how every meal is an act of group love in a state of scarcity.
Food in America is profane, cheap, and over-abundant. Ironically, we treat it with more respect and gratitude when we cook as hosts or on the holidays (i.e., not often). Whether as a cause or as a result, we, therefore, have no symbolic or ritual barriers to overconsumption. The gates are wide, wide open to stuffing and munching.
Regarding our food, most adults have no effective gatekeeper other than their own habits. We have no ritual food disburser or wrist-slapping enforcer who makes us endure hunger until the next meal. Older Americans reading this will remember such enforcement as once commonplace.
All of the above would prove critical in the 21st-century rise of our obese adult population, especially our chronically ill, lower-life-expectancy, morbidly obese population.
The Advent of Ungatekept Snacking
In most literate, agriculture-based cultures, the emphasis on eating cooked food rendered between-meal snacking trivial or non-existent (less so for pre-agricultural, horticultural, and tribal foraging societies). The hearth-cooked shared meal has, to some extent, always had a sacred quality, more so if it was staged only once per day.
When most of one’s calories, if not all, flowed through cooked meals (true in much of post-medieval Western history if my research sources are accurate), women had enormous social power over what and when you ate. This was true even for the household patriarch, who was always a cultural victim of learned helplessness in the kitchen. You may even know some men, probably older men, who exhibit this very learned helplessness. Today, you may also have met women who share this learned state of being. Ironies of feminism abound in a culture favoring maximal autonomy.
The English word “snack,” in its modern usage, only dates back to the 18th century, coinciding suspiciously with the rise of women working in factories in England. If fewer women are at home during the day, cooking gets concentrated (early or late), leaving more ‘open time’ for the hungry (unemployed men and teen boys) to rummage for edible leftovers from the last meal (in an era before refrigeration).
The eighteenth-century “snacker” appears to have just wanted a meal but had nothing but leftover scraps of stale bread or whatever. This was snacking as an inferior meal. And yes, hunger was no doubt the only motive. Stale bread? Dried out, cooked meat? Come on. This is not the stuff of intentional desire.
This early modern snacking sparked what would later become a mass consumer behavior worldwide: casual, solitary snacking on impulse.
Snack marketers study “cravings,” and their R&D colleagues spend loads of time trying to satisfy them. Your cravings, that is. Imagery, your private sensory memories, and marketing language combine to seduce modern urbanites to snack as soon as the sensation or desire to do so occurs. Or as quickly as you can swing into the QuikTrip to buy a snack.
With most women employed outside the home, modern refrigeration, and long-shelf-life foods of all kinds that require either no cooking or mere reheating, who exactly needs a ‘home cook?’
We don’t. We fired her.
Wait. No. We have her (or him) on speed dial like an Instacart gig worker, but one who doesn’t get paid.
Whenever anyone finally starts earning a W-2 income, coming home to loads of unpaid domestic work is very annoying (maximally so when married with kids at home). “Where is the help?” remains a simmering question in millions of households built around the tiny, alienated nuclear family. When women fully entered the modern workforce in the 1970s (in unprecedented numbers compared to the era of peak factory labor), tens of millions confronted this infuriating question.
What does this have to do with obesity?
Well, if the traditional gatekeeper is now out of the home (or preoccupied with a laptop) most of the day, the number of de facto hours in which other people can simply grab food and eat unsurveilled goes way up. And they need to if they want to eat.
While parents of young kids still attempt to gatekeep their children’s diet, the effort is pretty permissive in most households, as any food industry market researcher will tell you, while sitting on a mountain of survey and observational evidence the public cannot access. It’s not just you and your kids.
If your home has no kids, there is little gatekeeping of food intake at all anymore, because meals no longer account for all of our caloric intake, and because many of us prepare and eat meals alone anyway. Modern Americans gatekeep their diets as an individual responsibility.
As dietary gatekeeping weakened or vanished from American homes because of modern food storage, food technology, and female employment trends, snacking became almost an inevitable social development.
It. Is. Just. Too. Easy. And. Cheap.
However, removing the gatekeeper and flooding our homes with cheap snack food is not enough to drive the subtle excess consumption that adds pounds of adipose fat, year after year.
Something else was required. And, ironically, that something else connects right back to the current rise of GLP-1 hormone-boosting drugs for weight loss.
Snacking to Banish the Hunger Sensation
When I first entered the business world in 2003, I did in-home interviews and shop-along tours with Americans for corporate clients. In these two to three-hour explorations of American foodways, I quickly noticed something odd that was tangential to the stated corporate research agenda.
Most adults I interviewed in the early 2000s believed that merely feeling hungry, a sensation we have all experienced, is something we should promptly get rid of by eating a snack. They did not put it this way directly to me, because it sounds gluttonous or ridiculous. I inferred this social truth because hundreds kept insisting that they snacked in midafternoon because “I felt hungry” and some version of “didn’t want to wait until dinner” or “couldn’t make it until dinner.” The surface explanation was usually about “lasting until dinner” or “performing at one’s best” until work ended, etc.
But for millennia, humans have happily deferred eating for various reasons. Agricultural humans did not have time to forage all day, so they waited, hungry, for cooking-centered rituals of commensality and food sharing. And you waited to eat if it was not a locally appropriate mealtime.
Interviewees’ use of the auxiliary verb can/could (i.e., “I couldn’t make it until dinner”) baffles anyone who knows human physiology, or any hunger-relief worker who has aided truly starving people in refugee camps. Humans can last a surprisingly long time without food, often for weeks. There is no biological reason to react to a hunger signal faster than the next culturally appointed mealtime. Confining food intake to meals in a cooking-centered culture is very realistic. Let’s face it, Americans have three mealtimes from which to choose, spaced no more than five to six hours apart (when not skipped). It’s not the end of the world to wait five hours for more food. Seriously.
Yet, here I had perfectly mature, college-educated adults sitting in their living rooms and insisting on camera that they “had” to snack in the afternoons (two hours before dinner) or even in the mornings (one hour before lunch!). Had to do it, James. Had to. Wouldn’t make it to lunch. Wouldn’t make it to dinner. I’d crap out. I’d fade. Wouldn’t make it through the meeting. I’d zone out. Lose it. Couldn’t think.
It’s not empirically clear, but I suspect this notional standard—extinguishing hunger signals more or less immediately—spread in American culture during the 1990s. This cultural belief did not exist in the 1980s when I was growing up. Only teenage males were thrown predinner snacks to quell angry bellies. Sometimes. My wife remembers being told angrily to wait for dinner by her father: “You won’t die!”
Mealtime was sacred. Food was guarded. If this panicked need to extinguish hunger the moment we sense it wasn’t widespread in the 1980s, but it was by the 2000s, then something happened in between.
I suspect that the cultural banishment of hunger pangs among adults and children correlated with a general intensification of the definition of middle-class comfort in America. It is part of a longer-term trend toward extinguishing all manner of physical pain with OTC painkillers, prescription opioids, and THC. It’s not my place here to accuse Americans of becoming “royal wussbags,” as we called our cowardly peers in the 1980s. I do not believe that snacking to prevent between-meal hunger is about being too wimpy to wait until dinner. It’s simply a cheap luxury in the land of processed, mass-distributed calories. It costs little to extinguish hunger instantly, so . . . we do it. It’s just so easy when no one guards the sacredness of mealtime. It’s something the anxious and stressed can control.
I’m confident that all those helpless eighteenth-century men who couldn’t find a woman to cook them a meal and ate snacks of stale bread and cold meat would pass out from ecstasy at the array of snack foods available to us now.
More tragically, millions of Americans are now set up to fail. As cultural beings, they have internalized a truly aristocratic notion that hunger pangs should be extinguished, are surrounded by cheap, high-calorie snack food (inside and outside the home), and are left to manage their own diets as individuals with 100% accountability.
The debate rages over whether a BMI of 30-40 is a death sentence, although plenty of evidence shows increases in chronic illness and in the medical costs of remaining alive.1 And fat acceptance is on the rise culturally, in large part because most of us are chubby now (unlike in most of human history). A BMI of 40 or higher, morbid obesity, is not controversial in scientific and medical circles. It IS a slow-rolling death sentence affecting 8% of American adults, or 20-21 million human beings.2
Yet, America's sheer population-level prevalence of excess body fat is unprecedented in human history. What many Americans who have never lived in Africa or Asia do not understand is that, traditionally, humans have associated visible obesity with power and wealth. I encountered this belief alive and well in urban India in the late 1990s. This ancient human ‘prejudice’ links to the fact that being skinny was normal for most of human history, until one's final years, when physical mobility declines naturally (but meal servings at the hearth do not).
Today, the ‘skinny’ with normal weight (24 BMI or lower) represent only 18% of adults, mostly young folks.3 Some are just manual laborers. Others live in very small lifestyle subcultures hell-bent on body image as a form of elite social status. I know many of these folks due to my own post-grad social network.
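For readers who want to place themselves among the BMI bands this essay references, the standard formula is simply weight in kilograms divided by height in meters squared. Here is a minimal sketch in Python; the category cutoffs follow the conventional clinical bands (normal under 25, obese 30-40, morbidly obese 40 and up), and the function names are my own, not anything official.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def category(b: float) -> str:
    """Map a BMI value onto the conventional clinical bands referenced above."""
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    if b < 40:
        return "obese"
    return "morbidly obese"

# Example: a 70 kg person who is 1.75 m tall
print(round(bmi(70, 1.75), 1))   # 22.9
print(category(bmi(70, 1.75)))   # normal
```

Note that these cutoffs are population-level screening conventions, not individual diagnoses, which is part of why the 30-40 range remains contested while 40+ is not.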
These skinny folks are becoming the new aristocrats, now that accumulating body fat is a mass-market possibility.
If you are enjoying this essay, my new book goes even deeper into this story. I’d really appreciate your pre-order to help successfully tickle the Amazon algorithm.
Thanks in advance!
Here is a great resource on the empirical state of obesity in America, if you want to go deeper: The State of Obesity 2020: Better Policies for a Healthier America (issue report, with a special feature on food insecurity and its connection to obesity). https://www.tfah.org/wp-content/uploads/2020/09/TFAHObesityReport_20.pdf
2020 US Census data was used for this calculation.
BMI segments by % of population - https://www.niddk.nih.gov/health-information/health-statistics/overweight-obesity