Storage: A Possible Neolithic Agent of Disease

When humans started transitioning toward agrarian ways of life around 10,000 years ago, it wasn't just the types of food that changed. The shift wasn't merely toward more grains and less meat; it was a fundamental change in the food system. True hunter-gatherers live day to day, not storing any food for later use. Horticulturalists began manipulating the forest so they could keep living stores of plants like cassava, and also started fermenting various plant and animal foods. As our species moved into the arctic (LATER ON in our evolutionary history, contrary to the polar-pushing claims of some popular "paleo diet" authors) and started living in more marginal areas in general, we developed smoking and salting as methods of preservation.*

But with agrarianism came the widespread processing and long-term storage of foods, particularly grains and legumes. 

This opened humans up to all kinds of new vulnerabilities: rancidity, molds, and bacterial contamination during storage. We've largely forgotten about these things because science has eliminated so many extreme acute examples. When was the last time you heard about someone getting ergotism? Ergotism is caused by a fungus (Claviceps purpurea) that grows on rye. In the past, a contaminated harvest could terrorize entire towns. A lovely description from the year 857: "a great plague of swollen blisters consumed the people by a loathsome rot, so that their limbs were loosened and fell off before death."

While antinutrients and rogue plant proteins tend to get most of the blame for the poor health of ancient agrarian populations, such contamination probably played a role as well. In developing countries, aflatoxins, a type of mycotoxin, remain a serious health issue. They are something to keep in mind when parsing epidemiological data from agrarian cultures.

While few people in the US seem to be suffering from gangrene because of mold, whether even low levels of contamination and rancidity are an issue remains debatable. Regulatory agencies have different standards for what counts as spoiled. In Europe, the limit for patulin, a toxin produced by Penicillium expansum that causes immune system damage in animal studies, is 10 µg/kg in apple juice intended for children. The action level for the US FDA is 50 µg/kg.

As for rancidity, I notice many industrial food producers add antioxidants like vitamin E to oils vulnerable to oxidative damage, so someone is clearly aware it's a problem. But I suspect a lot of restaurants exhaust those antioxidants anyway by reusing the same oil for frying over and over. In animal models at least, feeding oxidized fat is a reliable way to induce inflammation, and there is mounting evidence that oxidized fats are a health threat for humans as well.

And possibly because these types of spoilage are relatively evolutionarily novel, most humans seem unable to detect them by taste or smell alone. This is made worse when one is used to consuming sub-par food. The Chicago Tribune ran an article today noting that many immigrants find US peanut butter tastes rancid, though to most of us it tastes delicious. One study even found that 44% of Americans preferred the taste of rancid olive oil.

Many people I've talked to report that they feel fine eating "bad" foods in Europe. I've had that experience myself, and it's intriguing. Perhaps it's a testament to the EU's stricter standards.

When I think about the diet I eat now vs. the diet I ate in the past, one thing that stands out is that almost all of my food now lives in the fridge or freezer rather than in my cabinets. My cabinets are mostly full of tea and underused kitchen appliances. Of course, solve one problem and another pops up: there is a hypothesis that Crohn's disease could be caused by bacteria that thrive at fridge temperatures.

*which have their own particular health risks that could make up an entire post