What’s “processed”?

Suppose you want to eat less processed food. Given how and what most Americans eat, that impulse is probably a good one. But once you go beyond the obvious (cheese curls, sugar cereal, hot dogs), you find yourself down the rabbit hole. What about that bottle of salad dressing you use to perk up your unprocessed salad? Is hot sauce ok? What boxed cereal can you eat? You start squinting over ingredients lists, blocking the grocery aisle with your empty cart. You accept an invitation to a potluck and sit horror-struck by the potential dangers lurking in the dishes, feeling your appetite slipping away like blue cheese dressing off a greasy wing. To bolster your flagging courage, you read endless blog posts about why the things you’ve given up are killing other people’s children. You develop an evangelical zeal, gnawed by the fear that your friends will make fun of you the moment you step out of the room. You begin to wonder if you should get new friends.

And then you throw up your hands and dive into a bag of Doritos.

Now, I am the last person to advocate eating most of the food available in American supermarkets. I make my own jam and pickles, I bake bread, I cook practically every meal from scratch, I shop at farmers’ markets. After twenty years of living and eating like this, industrially processed food no longer really tastes like food. Forget health concerns; it just isn’t particularly satisfying.

But having lived this way for twenty years — and having put a great deal of thought into it during that time, and having done a lot of research on how foods were historically prepared — I’m painfully aware that any notion of purity about this business is foolishness. Cooking is, after all, processing, and humans have been doing that for what, fifty thousand years? We’ve been grinding grain into meal for five thousand years, and we’ve been processing and selling food commercially (mainly as grain, oil, and spices) for probably four thousand. I can, if I try, justify the natural origins of practically any edible substance — or find fault with the freshest of fruits. (What the heck is “food-grade wax”?)

Obviously, any sane and sensible person is going to draw a line somewhere. But any line we draw will to some extent be arbitrary; any principle we set will inevitably include some things that seem thoroughly unnatural and exclude others we can’t manage without. I’m going to consider some possible standards, suggest an alternative that’s (you won’t be surprised to learn) largely historical, show how difficult it is to apply even that comparatively objective standard — and then draw some conclusions about navigating this mess sensibly. It’s a long piece, but hit-and-run easy answers are exactly what we need to avoid.

A note: For the sake of clarity, in the discussion that follows, I’m going to use Processed, capital-P, to denote big-bad-industrial-processed foods, as opposed to the sort of processing I do when I peel and seed a tomato and cook it down into sauce. I’m not going to pretend that the latter isn’t processing, and I don’t have a word for the former.


A diagram from Jehu Hollingsworth’s 1851 patent application for a “mill for grinding and bolting” grain. (View full size.)

What, first of all, is the yardstick for measuring Processing? Is Processing a question of who made the product, or in what sort of kitchen? If it was made by someone you know, with his own hands in his own kitchen, is that ok? What if it was made in rented space in a commercial kitchen and sold at a farmers’ market? Still ok? What if that vendor has some success, sets up her own kitchen, and hires a few employees? What if she goes regional? What if she licenses it to a bigger company, which produces it in a factory? Let’s assume that the product itself, its ingredients and procedures, has not changed. At what point does the product become Processed? It’s not clear, and I don’t see how anyone could draw a line that isn’t in some way arbitrary. A tomato from your garden is fine, a can of condensed tomato soup is not, and in between are a lot of difficult decisions.

Or is Processing about fundamentally transforming ingredients, making them utterly unrecognizable as the whole foods from which they came? But applesauce is then questionable, and it only has one ingredient. Even the best homemade sausage is a good way from being recognizable as a pig. Let alone cassoulet, or lasagna, or a pot of soup, in which flavors combine inextricably. And would you eat a souffléed potato but not puffed rice? What if the rice was puffed at home, and the potato souffléed in a commercial kitchen?

One could argue that the transformation is only bad if ingredients are broken down chemically and then recombined — making corn syrup from corn, or extracting lecithin from soy to use in packaged cookies. We might say that food is not Processed if it’s made entirely from whole foods — though we’d have to allow foods made from whole foods as ingredients, so that we could include, say, tomato juice or maple syrup. But then a can of all-natural condensed soup would pass muster, and condensed soup seems obviously Processed. And is the traditional production of hominy — treating corn with lye, a dangerous chemical — therefore bad Processing?

One standard I’ve seen, which looks at first like common sense, is that a food is not Processed “if the food in question could be made at home by a competent cook out of whole-food ingredients.” But could and competent are fuzzy. There are foods made in factories that one could make at home if one were independently wealthy and could afford the time and equipment required, but that no person, anywhere, ever made before there were factories, and there are others that were so expensive to process by hand that only the very wealthy could afford to eat them.

I’d like to qualify that standard by insisting that the alleged Unprocessed food must actually have been made at home or on a community scale in some culture, somewhere on the globe, at some point in human history, in essentially the form you’d be buying at the supermarket. Practically anything could be made at home with enough labor, persistence, equipment, and money, but did anybody? And would you, realistically? In other words, is the food in question genuinely and historically a product of culture and agriculture, or is it rather a product of mechanized industry?

The historical part of that question is key, because it gives us something like an objective standard. And applying it honestly yields some surprising results. The blogger who set the “made-from-whole-foods-at-home” standard thought that chocolate, crackers, and “good cereals” were fine, but look a little more closely at where they come from.

In a fair bit of research on the history of baking I have found no significant evidence that ordinary people ever baked their own crackers, not as we’ve come to think of them, not for everyday use. Eighteenth-century English and Anglo-American cooks prepared twice-baked breads like rusk that stored well and served for snacking, and they baked beaten biscuits that kept for weeks and resembled what later came to be known as oyster crackers. But the crisp, crunchy, bite-sized cracker was made first by commercial bakeries and only afterwards imitated by home bakers: recipes for that kind of thing don’t show up until after various sorts of industrially-baked crackers were already widely available. (The technique for making soda crackers, though easily imitated at home, was thought by health reformers of the 1840s to be dangerously Processed!) And there are few brands of crackers now on the market that you could reasonably emulate at home. You cannot, for example, make Triscuits, and it would take considerable skill even to make something resembling Carr’s Table Water Crackers. So even though you can make crackers at home, if you’re eschewing Processed foods, nearly all boxed crackers are out.

I’m not sure what “good cereals” would be. Most boxed and bagged cereals are inherently Processed: corn flakes, for example, require fascinatingly complex industrial manufacturing. Ditto grape nuts, whatever their marketing. (Let alone Frosted Flakes!) None of these cereals existed until the late nineteenth century, when they could be made in factories.

Could you make granola at home? Rolled oats — whole oat groats, steamed and pressed flat — are a product of an industrial food system. They didn’t exist prior to the steel roller mill for which they’re named, first built around 1870. Wheat germ, a common enrichment and apparent natural food, is cleanly separable from wheat kernels only in industrial roller mills.

In theory, you could turn a cacao bean into edible chocolate in your own kitchen. Before industrial-scale processing, though, only the rich could afford to eat chocolate, which suggests that in practice, you are not going to be able to make it at home. Moreover, chocolate wasn’t available in all the forms it is today, nor was the quality nearly as good as even the cheapest grocery-store bag of chocolate chips. The fine creamy texture of a really good chocolate bar is attainable only with machine power and precision. Chocolate, I’m afraid, is very much Processed.

And that’s only the beginning. Even the most basic ingredients of the modern kitchen have questionable origins:


Flour
You could, with a bit of knowledge and a few simple tools, raise a crop of wheat, harvest it, and thresh and winnow it. You could then grind the grain with a mortar and pestle. With a modest amount of social organization and engineering capability you could build a mill, where for a fee people could take their grain to be ground by rotating stones powered by water or wind. This is, of course, processing. Modern roller mills use different technologies, but they work on the same principle. Is bagged flour then ok? If not, where’s the line?

If you wanted to get rid of some of the bran — since millstones don’t grind bran very finely, and it contributes to spoilage — you could bolt the flour through tightly woven cloth. That, too, is processing, and it takes more work (and finer cloth) than most people could afford before the industrial age — hence white flour was valuable stuff.

Ditto cornmeal. Lacking millstones, many American Indians saved labor by hulling corn with lye or lime and then pounding it into a paste, rather than pounding it dry into flour.


Sugar
Nobody ever made granulated sugar at home. It was always a large-scale commercial process, even in the Middle Ages. You could, in theory, make it yourself, but you would have to be independently wealthy to afford the time and tools. That means not only white sugar but brown sugar, organic brown sugar, turbinado sugar, and, yes, even blackstrap molasses are straight out. All shelf-stable products of the sugar cane are unavoidably Processed. Although turbinado sugar is marketed as “raw,” its name comes from the turbines used to separate sugar from molasses: it, too, is processed. And blackstrap molasses, the darling of health-food proponents for the last half-century, requires more processing than cane syrup, because nearly all of the sugar has to be crystallized out by repeated centrifuging.

Honey and maple syrup are about your only options for sweeteners. Some people seem to think that agave nectar is Unprocessed, but the commercial manufacturing process looks pretty complicated to me. And I think, if we’re being honest with ourselves, that buying a commercial product that could theoretically be made at home from cactus juice when in fact you live in the humid east and don’t have a cactus growing within a thousand miles of your kitchen, and then claiming some kind of purity, is just a little bit disingenuous.

Baking soda and baking powder
Fundamentally industrial products. Alkali leaveners like baking soda weren’t around until the late eighteenth century, first as a by-product of the potash industry, and they were never manufactured at home except by amateur chemists experimenting at their kitchen tables. Cream of tartar was scraped from the insides of wine casks, where it precipitated, so it could, in theory, be made at home, but it isn’t any use for leavening without an alkali to react with. Chemical leavening, as a rule, is Processed. For baking you’ll have to be content with…
Yeast
Except that active dry yeast is Processed, too. Before the late nineteenth century, industrial processing, and refrigerated railcars, the only yeast available for purchase was liquid brewer’s yeast taken from the bottom of beer barrels. Even old-fashioned yeast cakes were the result of Processing. If you want to keep your bread pure, then, you’ll need to take up homebrewing, or else cultivate wild yeast and make sourdough.
Oil and shortening
Olive oil and palm oil pass the make-at-home test, but that’s about it for cooking oils. Peanut, corn, soy, and canola oils are all inventions of the industrial age. Where neither olives nor palm were grown, people traditionally used animal fats for cooking. (There are a few exceptions, but they’re rare.)
Vinegar
Wine, cider, and malt vinegar are fine, but not distilled vinegar. While you could build a still and distill vinegar, I don’t believe anyone actually did this until it was done industrially. It simply wasn’t worth the effort.

You can most certainly make beer and wine at home. You can set up a still, too, but I’d note that the refining processes that permitted the manufacture of fine spirits weren’t around until the nineteenth century. In historical terms, then, good vodka and gin are Processed. (Whiskey is in, though.) That fact also makes vanilla extract questionable: you can make it at home by steeping vanilla beans in vodka, but you need a fairly well-refined vodka if you want to get a clean flavor. Even homemade vanilla extract might therefore require an ingredient that is, if we’re being fair, Processed.

I’ve read that if you strain cheap vodka a few times through coffee filters (or through layers of cloth), it’s almost as good as Ketel One. I can’t confirm this, because I don’t like vodka enough to bother trying it.

I could go on, but I think that’s enough.

My point here is not to say that you should or should not avoid Processed foods, or to tell you what you ought to eat or not eat or where you ought to draw a line. I do think, though, that the historical and cultural standard is a good one, for a few reasons. First, it makes us think about the real world of human capability and community, past and present, rather than focusing on our own foodie fantasies and desires. Second, it’s a standard that is actually attainable by a human community, living sustainably from its own foodshed without massive extractions of resources.

And third, contrarily, it forces us to face the incredible distance between that way of life and the way we actually live. Even something as simple as yeast, which humans have harnessed for thousands of years, which was cultured, nurtured, shared, traded, and when necessary bought and sold within a community, village, or town, is now available only in an industrial form from distant factories. Nor can we buy flour from the local gristmill — and buying the grain mill attachment for our KitchenAid mixers makes us just as reliant on distant factories, and our food just as industrially processed, as if we simply bought a bag of flour.

One thing that ought to be clear from thinking about food this way is that we simply cannot go it alone: the standard to which we ought to hold our food is not attainable by individuals working in their home kitchens. If we shield ourselves from industrial production and try to cook and eat alone, we’ll be forced to give up practically all the fruits of civilization. Bread, for example, is straight out; we’ll be back to eating porridge three meals a day. We need communities.

And any sort of hard, honest thinking about what we eat ought to lead to a few simple conclusions that, on consideration, are not simple at all:

  1. To be responsible human beings, we must choose thoughtfully what to consume and not to consume. On the other hand,
  2. The complexity of the world we live in makes many of those choices, if not exactly arbitrary, then at least continually open to question. We can’t possibly avoid all potentially negative consequences of our actions. We can’t even know all the potentially negative consequences of our actions: we’re forced to make our choices without sufficient knowledge. As a result,
  3. It’s quite easy to rationalize ourselves into choices that permit us to do what we want — like eating chocolate when we won’t eat bottled mayonnaise. And even if we avoid that trap,
  4. Whatever we choose, and however thoughtfully, we are not pure. Even a subsistence economy would have its issues, and none of us is living in one. And so none of us has any business feeling self-righteous about our choices.

In short, the ideal is fuzzy, the reality almost infinitely complicated, and the choices difficult and uncertain — but we must choose. That’s the human condition, and I think the best way to approach it is with more humility and less panic: to identify core principles, recognize when they conflict, do our best to sort them out, and not be too hard on ourselves when, inevitably, we fail to do so perfectly — and certainly not on those around us. Because, you know, what we happen to be putting into our bodies at any given moment really is not the most important thing in the world. Despite the amount of digital ink people like me spill over it.