This blog is about the intersection between evolutionary biology and food. But also about practical applications, sustainable agriculture, and general tasty things.
Hands down the best health book I read this year was The Definitive H.P. Lovecraft: 67 Tales of Horror in One Volume. Despite being about fictional creatures of terror from unholy abysses, I learned quite a bit from Lovecraft's depiction of the universe. The humans in Lovecraft's stories are baptized into the knowledge that the universe is older and more incomprehensible than they could have ever imagined. While the monstrosities and sublime ancient temples are quite terrifying, what is even more terrifying to the humans in the stories is their realization of how little they can ever really know. Those that get a taste of the mysteries often only do so at a very high price.
They called up some image from deep cells and tissues whose retentive functions are wholly primal and awesomely ancestral
I'm not sure I have any sort of particular cause in terms of diet anymore. It's gotten to the point where I'm just interested in the Paleolithic and not really very concerned with arguing about whether or not a potato is safe to eat.
Wouldn't it be nice if our tidy little narratives worked out? The ones in which Homo sapiens sapiens is the protagonist and you can trace his illustrious evolution neatly through the ages. And he fits rather nicely into your romantic stories about hunters and mammoths, so you can tell people that this is their heritage.
But in reality you don't get your nice story. Instead, you get ages and ages of dust and bones, in which every little shred of a skeleton is a prized, but dim, glimpse into ages long past.
In my anthropology class last year, one of the skull casts that caught my attention was the Kabwe skull, which is estimated to be between 125,000 and 300,000 years old. Not quite Homo sapiens, the skull has some features of modern humans and some of Neanderthals. Homo rhodesiensis? Homo heidelbergensis? Homo sapiens rhodesiensis? Anthropologists could argue about it all day. Either way, this person died a miserable death. As far as I know, it is the first known incidence of dental infection in a hominid, and the infection was bad enough to put some ugly holes in the bone and eventually kill the individual.
There is only so much you can tell from bones, which leaves lots of room for people to make stuff up. Stable isotope analysis seems quite promising, as it can potentially tell you the source of protein in the diet, but only that, and the isotopes are subject to interpretation. For example, Lierre Keith in her error-ridden Vegetarian Myth claims that stable isotope analysis showed Australopithecus africanus ate meat, but in reality the data only said that the protein was from carbon-13 enriched foods, which could include grasses and sedges as well. Later investigations revealed that the carbon-13 was more likely from grasses and sedges, but the data is up for interpretation. Before you tear up your lawn to make dinner, it might be worth remembering that Australopithecus africanus is only thought to be a possible human ancestor and was quite a bit different from a modern human.
That said, stable isotope analysis puts to bed the idea that early Homo sapiens were getting their protein from the Paleolithic equivalent of tofu or the idea that Neanderthals definitely only ate meat (turns out that some ate fish too...maybe).
"Maybe," "later investigations revealed," "thought to be": these are phrases that should give you pause whenever you encounter stable isotopes being used to argue about ancient diets. Have I confused you? Good; now you are less vulnerable to the abuse of bones in the name of various causes one way or another.
It can be used to estimate the trophic level and origin of the protein, but it cannot tell you whether the person ate a teeny tiny aurochs steak and then 17 potatoes or whether they only ate mammoth. It cannot tell you the percentage of protein in the diet. It cannot tell you how many grams of protein. That information was lost when the person died.
Then there is the use (and mainly misuse) of animal bones and modern data from wild game species to argue various things about ancient diets. I read the latest such paper, Man The Fat Hunter, with absolute glee because it uses many of the same questionable methods yet comes to a conclusion opposite that of many past papers, which overemphasize protein. The questionable method is taking bones of animals possibly consumed by ancient humans, plugging them into an equation with the modern wild game data, and then saying this or that about the amount of fat or protein in an ancient diet. This paper features elephants, which is great, since elephants are very fatty, but unfortunately their presence or absence in bone assemblages is not a food diary. There is no way to know how often elephants were eaten, so there is no way to make an even sort-of accurate conclusion about %elephant and therefore %elephant fat in the diet. Whether or not the hominids in question were able to cook is also a point of contention.
One good thing about the paper is that it does try to address one issue: ceilings. In this case, the paper mentions possible ceilings for protein consumption and fiber consumption that could be used to build diet-estimating equations. Unfortunately, they are quite hard to determine, as they are affected by human genetic variation, culture, and environment. For example, there is possibly a ceiling on the consumption of raw plant materials based on gut morphology (though if you have only skeletons you can only speculate on this) and toxins, but that ceiling can be raised with access to cooking and processing. To complicate matters further, their food sources may have been things you haven't even thought about eating. You can try to figure it out based on local paleobotany and starch microfossils, which can be hard to read. Once you've established that a microfossil on an ancient tooth is possibly Bromus secalinus, you might be able to figure out a little about how it was processed based on microfossil shape and local conditions, and if you have a rich lab you might be able to collect it and do a full nutritional analysis, but you still have no idea how much of the diet it made up.
And what is the protein ceiling? It depends on the rest of the diet, an individual's health, and possibly genetics. Modern genetics adds some depth to the picture. For example, the fact that genetic adaptations for a starch-based diet seem to be part of fairly recent selective sweeps may give us a clue that Paleolithic human ancestors probably weren't eating mainly starch, but statistical genetics is in its infancy.
But genetic variation can add more confusion if we are talking about what to eat now. Many "paleo" dieters have learned the hard way that they carry alleles for hemochromatosis, which means they can over-accumulate iron, which has some pretty nasty effects. It would be interesting to know where this came from, as it clearly would be a liability if an ancient human ate a meat-based diet, but ultimately whether or not Paleolithic hominids carried such alleles in high frequency is irrelevant to the millions of men (and some women) who are at risk. This represents a ceiling for them, though it can be modified through modern medical treatment.
Normal is of limited use if you are on the end of the bell curve: this is where personalized medicine and self-experimentation are important.
So it's not completely true that we have no idea what Paleolithic hominids ate. We do have some good clues, but reconstructing the diet is pretty hard. That doesn't stop people from trying, but their results rest on some pretty shaky ground.
My own method, which is about as accurate as some of these equations, is to observe the fact that a medallion of relatively lean wild boar goes absolutely perfectly with a seared hazelnut crust and a dollop of mashed celeriac or potatoes cooked in broth. Maybe there is a reason that dishes pairing a protein with a bed of delicious carbs AND fat (but not overpowered by them) are so appealing to so many? Who knows.
A problem with reconstructing diets from the past is that people often fail to fathom the amount of information and cultural diversity that has been lost. Lost to cultural change, to habitat change, or simply to nature's rising oceans and lava flows.
Often you only have pale glimpses of what was lost in the form of archeological remains or the writings of passing travelers who probably did not realize that they were witnessing things that few can even imagine today.
When most people today think of the arctic or an ice age, they think of people clad in skins subsisting on woolly mammoth. But the truth is that arctic peoples of the past and of today rely on a huge variety of plants as well. I have written about the excellent book Plants That We Eat, which describes the amazing and diverse plant foods of the Inuit. Most of their plant foods are leaves and berries, but they also collect tiny roots from the stores of mice, which provide a small amount of starch.
It turns out that Arctic cultures further south probably exploited starches more extensively in the past. In Siberia the starchy bulbs of a flower were called "sarana," but as this interesting paper shows, the word probably applies to several types of flower bulbs, mainly in the lily (Liliaceae) family.
Like John D. Speth's excellent book, the paper relies extensively on sources written in German, many of which have not yet been translated to English. I was already aware of the use of lily bulbs among the Native Americans of North America, but was not aware that Siberians ate them as well.
Apparently, sarana was eaten by many Siberian tribes: the Shor, Tofalar, Tuva, Altai, Buryat, Selkup, Itelmen, Aleut, Evenki, Ket, and Khanti are mentioned in the paper. Of course, all these different peoples had very different lifestyles. Some, like the Buryat and Evenki, were nomadic pastoralists, and others, like the Itelmen and Aleut, were closer to hunter-gatherers. Use of sarana varied by region: it was a staple in some and more of a treat in others.
The accounts of travelers in the area mention that sarana was:
It was mainly gathered by women, who made special tools to dig it out. When it was too cold to dig it out, they could also find large high-quality stores in vole (or other rodent) nests, making sure to leave something in return so that the voles would survive the winter and be able to harvest again next year. Georg Wilhelm Steller, who witnessed this in the 1700s, noted that it resembled a form of trade.
Sarana bulbs could also be steamed and served with berries. According to Krasheninnikov this was the best and foremost dish in Kamchatka. In his view, it was “both sweet and sour at the same time” and it filled the stomach well. “It can be consumed every day, which makes one almost forget the lack of bread”,
says Krasheninnikov (1819: II: 314)... The taste of cooked sarana has been compared to sweet or baked chestnut. Adolph Erman found the taste of sarana delicious. He describes sarana bulbs as excellent food (Erman 1848: III: 161). According to Karl von Ditmar, who calls it “pagan food”, the taste is similar to potato… Bread did not belong to the traditional diet of northern Eurasia. Ditmar correctly observed that the local people did not even miss bread. Bread was (and still is) in comparison extremely important in the European diets and was only partly replaced by potatoes in the 19th century. The lack of bread, potatoes and other familiar food seems to have bothered many of the travellers in Siberia. They were not capable of enjoying the local diet except for some dishes. The boiled bulbs of sarana and other plants were seen as more or less exotic, “pagan”, disgusting, strange or, in rare cases, surprisingly tasty. In general the travellers held a distanced attitude towards local food, which made them unable to correctly estimate the significance of sarana for the Kamchatkan diet.
In many areas of Siberia, game is pretty low in fat. If you've ever tried to eat mainly fish and lean game, it's very much understandable why sarana was so worth the trouble.
It's also understandable why such traditions have died out, as there are many flower bulbs that are quite poisonous and gathering them was probably a skill passed down through the generations.
Unfortunately, many traditions like these died out before people could really study them, which is a real shame. I've met arctic people who believe that wheat bread is a "traditional" food. But the remnants cast doubt on the idea that arctic or ice age diets were just a bunch of big game.
I've noticed that Dr. Rosedale has all of a sudden become a guru to low-carbers and even to the "paleo" movement. Sometimes I have a sudden desire to go back to school and enroll in normal nutrition classes so I can be reminded that the vast majority of scientists don't think that way. But not all doctors are scientists, though the public often seems to think they are. So Rosedale would consider me a diabetic:
Also, there is really no totally safe level of blood sugar that will not cause non-enzymatic glycation or damage. The thresholds for diagnosing diabetes are arbitrary numbers. As such, I consider most everybody to have diabetes; just different degrees. Is the intake of 100 g daily of starch/sugar terrible? No, compared to the vast majority of diets being consumed. Can one hundred grams starch be tolerated? Yes, by most. Can it be called healthy? That depends on if one means healthy or healthier than what is unhealthy. My answer would be no.
Despite the fact that I'm probably more leptin- and insulin-sensitive than anyone I know, I guess I'm diabetic. He said on Paleohacks that the Kitavans would benefit from his diet, which strikes me as arrogant considering his diet has only been shown in a single small study to improve the markers of sick people, and the final numbers were nothing impressive, particularly compared to raw vegans, starch-eating horticulturalists like the Kitavans, or small-scale fish-eating farmers. Either way, viewing starch as poison doesn't make sense evolutionarily or physiologically.
Maybe he will soften as he realizes that paleoanthropology does not support his conclusions. Paleoanthropology supports the idea that humans thrived on MANY different diets. The fat or starch content of ancient diets is incredibly controversial. Rosedale says it doesn't matter because ancient diets only optimized for reproduction, but again, that is not true. One of the unique things that makes us human is the presence of very old "elders." There are many anthropological theories and debates regarding their importance, most notably the "grandmother hypothesis." It's quite clear to me that humans have been evolutionarily selected to thrive at older ages.
The human digestive system is neither that of a carnivore nor an herbivore. It's something special, a tool for taking over the world by thriving on many different diets. Our metabolism is equally flexible.
And either way, if a truly starch-free low-carb diet were something magical that made people live an especially long, healthy life, the early explorers would have found the polar regions full of such people at first contact. But many explorers actually noted that the Inuit and other circumpolar tribes seemed to age faster.
When mainstream media folks are criticizing us for being a monolith, it doesn't make sense for us to embrace someone who fits that word to a T.
If you want to follow the biochem debate between Rosedale and Paul Jaminet, this is a good starting point, but my point is that it doesn't make sense evolutionarily.
Edit: Chris Masterjohn comments.
I know I've been blogging about public school lunch a lot lately, but I think it's a great example of everything that is wrong with the food system in America and what will happen if we allow the government to be in charge of more and more of the food system.
A few months ago I noticed that the USDA school lunch standards were restricting starchy vegetables (like potatoes), corn, and legumes. I neglected to blog about it, I suppose because I was so busy. Had the USDA read Gary Taubes?
Nope. Apparently, it was based on a report by the Institute of Medicine that recommended restricting them not for any problems with the foods themselves, but in order to encourage schools to try other vegetables.
That seems a little sanctimonious to me, like something a moralistic grandmother would do. In the meantime, I don't see any USDA restrictions on fried food or chocolate milk.
If you read the IOM recs, it's clear they are unable to think about food holistically. They are not seeing that it's not the potatoes or the chicken that's the problem; it's the fact that they are often breaded and/or fried. If only our problem were overeating potatoes and lima beans! And we could solve that by eating them less and making sure we get our chalk-colored fortified water, I mean 1% milk. It's laughable once you think about it.
But never fear: government guidelines are often a combination of paternalism and lobbying interests. This time, I guess, they canceled each other out and the USDA abandoned the plan.
No, it's not about potatoes, but about potato products:
Eating more potato chips and French fries is likely to lead to a bigger weight gain over the years than the weight change associated with eating more of other foods, new research indicates.
Apparently these are worse than cakes and sugary foods:
Marion Nestle, New York University professor of nutrition and public health, expressed surprise that potato products were linked with more weight gain than desserts like cake, cookies and doughnuts, which contribute the most calories to the American diet, other research shows. She says she suspects people who eat potato chips and fries also tend to eat too much in general, making these foods markers for a diet leading to weight gain.
Personally, I'm not surprised. The difference is in the industrial vegetable oil. While many baked goods do contain some vegetable oil, I suspect fried potato products contain much more.
More science on the study at Gene Expression. As with all self-reported population studies, serious problems loom. I suspect "whole grain" eaters are just health-conscious individuals; most people who think they are eating whole grains are eating processed foods masquerading as them. Same goes for yogurt, the food of weight-conscious women, which is often spiked with tons of sugar.
What boggles my mind is that schools still serve fries and baked goods. Pretty much anyone who knows anything about nutrition, from low-fat vegan gurus like Joel Fuhrman to moderates like Marion Nestle to low-carb gurus like Jeff Volek, could agree that fried foods and sugary foods are bad for you. Yet they are on the menu EVERY DAY in most public schools. We should stop arguing about meatless Mondays and local vegetables and start focusing on the sort of trash we are shoveling into our children's mouths. If we can't stop these foods from being served to our most vulnerable populations, that's just sad.
Another hypothesis is that a lack of SCFAs is behind such diseases of civilization. A SCFA called butyrate provides some insight here. Butyrate is the preferred fuel of the colonic epithelial cells and also plays a major role in the regulation of cell proliferation and differentiation (Wong, de Souza, Kendall, Emam, & Jenkins, 2006). Lower-than-normal levels have been found in patients with several diseases, notably types of colitis and inflammatory bowel disease. Studies show such diseases can be treated through application of butyrate in the colon. That, and the fact that some studies show complete remission through bacteriotherapy transplants, points to these diseases being caused by disturbed populations of gut bacteria. Interestingly, these diseases are common in captive populations of apes and unheard of in wild apes (McKenna et al., 2008).
Bacteria affect butyrate production, but so do dietary inputs. Certain fibers produce more butyrate than others in humans; whether this differs between primates would be an interesting avenue of research (Smith, Yokoyama, & German, 1998).
Figure 1: Butyrate production in response to fiber
Interestingly, one of the top producers is something known as "resistant starch." Resistant starch represents the growing nuance in the understanding of fiber, since it is a starch that acts like a fiber in serving as a bacterial substrate. It first showed up on the scientific radar when scientists found that low rates of colon cancer were found not just in populations with high-fiber diets, but also in those with high-starch diets (O'Keefe, Kidd, Espitalier-Noel, & Owira, 1999)1. Researchers found that a particular starch resisted digestion and ended up being fermented by colonic flora. They called this resistant starch, and it is found mostly in cooked starches, some raw starches like green bananas, and some rough unprocessed grains and seeds. The first of these, from cooked starches, is termed type III and is a major part of the diets of many foraging populations who consume pounded and cooked starches like cassava, taro, true yam, and sago palm.
Whether humans are better adapted to certain types of resistant starch remains unexplored, but it could account for some inconsistent results in studies that used type I resistant starch, mostly found in grains and seeds that would probably have been relatively uncommon in our ancestral diet. These studies have shown poor results, and others with promising results are marred by high drop-out rates due to unpleasant gastrointestinal side effects (Rinne et al., 2005; de Vrese & Marteau, 2007; Vuksan et al., 2007). Whether some populations would do better on this type of starch than others would be an interesting investigation, but very few cultures consume large amounts of unmilled seeds and grains.
What type of starch we are best adapted to is interesting because the role of starch in human evolution is so controversial. Richard Wrangham has suggested that utilization of cooked starches was one of the dietary quality innovations that fed our rapidly expanding expensive brain tissue as it evolved towards hominid size (Wrangham, 2003). Recent analysis throws a wrench in that theory because it suggests habitual use of fire came after encephalization, about 300,000 years ago (Roebroeks & Villa, 2011). However, this does not mean that such cooked starches did not change humans, even if it reduces their significance in human evolution.
The burgeoning field of archeological starch grain analysis has transformed our view of hominids once thought to be mostly carnivorous. Microfossils on Neanderthal teeth from around 44,000 years ago show evidence of the consumption of many roots and tubers, some of which show evidence of cooking (Henry, Brooks, & Piperno, 2010). The full impact of the adoption of cooked starches on the human body has not been fully elucidated. One promising adaptation is the starch-digesting salivary amylase gene, AMY1 (Perry et al., 2007). Chimpanzees and bonobos have only two copies of this gene; humans have as many as 10, though copy number varies quite heavily by population, from 2 to 10, correlated with the importance of starch in the diet. Molecular genetic evidence places the origin of divergence on this gene at about 200,000 years ago, about the time when habitual fire use became common. Further genetic analysis shows that adaptations to root and tuber starch as a major source of calories may account for variation in human folic acid metabolism, since folic acid is usually low in starchy vegetables (Hancock et al., 2010).
Another relatively unexplored avenue of research would be whether butyrate in the diet itself has led to decreased reliance on colonic fermentation for butyrate in cultures that consume large amounts of dietary butyrate. The major source of butyrate in food is the milk fat of grazing animals (Smith et al., 1998).
It is most common in the modern diet in butter, at about 3%. It is possible that pastoral cultures consume substantial amounts of exogenous butyrate. So far there have been few studies on oral consumption of butyrate in humans. Animal studies have been inconclusive, with some showing positive effects and some showing negative effects, which is complicated by the fact that butyrate ingested orally is also present in the small intestine, where it may play different roles (Sengupta, Muir, & Gibson, 2006; Wächtershäuser & Stein, 2000). A small study found orally administered butyrate had a positive effect on symptoms of Crohn's disease, but the method of administration was pills rather than food (Di Sabatino et al., 2005).
Another potential source of butyrate is fermented foods. Some fermented foods like ogi, a pounded fermented starch, contain measurable levels (Hesseltine, 1979). Fermented foods are worth examining evolutionarily because they represent another human dietary innovation in improving food quality. Fermentation increases the bioavailability of nutrients, breaks down starches, and reduces levels of anti-nutritional factors and toxins (Mugula, 2003). It is unknown how long humans have been purposefully fermenting food. Fermentation naturally occurs in the wild, and many wild animals are known to indulge in such foods to the point of drunkenness (Dudley, 2002). Spontaneous fermentation and consumption of such foods by wild primates is unfortunately not well studied. However, fermentation is practiced by almost every known culture to some extent, with the largest diversity in fermented foods among African farmers (Dirar, 1993). It is estimated that fermented foods make up one-third of the diet of humans worldwide (van Hylckama Vlieg, Veiga, Zhang, Derrien, & Zhao, 2011). Exogenous fermentation may substitute for the reduced fermentative ability of the human gut.
1. The researchers concluded that colon cancer risk was increased with meat consumption. I will remain skeptical until they do studies on other cultures that eat relatively low-fiber and high-meat diets like the Masai and Siberian cultures for example.
Di Sabatino, A., Morera, R., Ciccocioppo, R., Cazzola, P., Gotti, S., Tinozzi, F. P., et al. (2005). Oral butyrate for mildly to moderately active Crohnʼs disease. Alimentary pharmacology & therapeutics, 22(9), 789-94. doi: 10.1111/j.1365-2036.2005.02639.x.
Dirar, H. A. (1993). The indigenous fermented foods of the Sudan: a study in African food and ... (p. 552). CAB International. Retrieved May 9, 2011, from http://books.google.com/books?id=J-ogAQAAIAAJ&pgis=1.
Dudley, R. (2002). Fermenting fruit and the historical ecology of ethanol ingestion: is alcoholism in modern humans an evolutionary hangover? Addiction (Abingdon, England), 97(4), 381-8. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/11964055.
Hancock, A. M., Witonsky, D. B., Ehler, E., Alkorta-Aranburu, G., Beall, C., Gebremedhin, A., et al. (2010). In Light of Evolution IV: The Human Conditions Sackler Colloquium: Human adaptations to diet, subsistence, and ecoregion are due to subtle shifts in allele frequency. Proceedings of the National Academy of Sciences of the United States of America, 107(Supplement_2), 8924-8930. doi: 10.1073/pnas.0914625107.
Henry, A. G., Brooks, A. S., & Piperno, D. R. (2010). Microfossils in calculus demonstrate consumption of plants and cooked foods in Neanderthal diets (Shanidar III, Iraq; Spy I and II, Belgium). Proceedings of the National Academy of Sciences of the United States of America, 1-6. doi: 10.1073/pnas.1016868108.
Hesseltine, C. W. (1979). Some important fermented foods of Mid-Asia, the Middle East, and Africa. Journal of the American Oil Chemists’ Society, 56(3), 367-374. Springer Berlin / Heidelberg. doi: 10.1007/BF02671501.
van Hylckama Vlieg, J. E., Veiga, P., Zhang, C., Derrien, M., & Zhao, L. (2011). Impact of microbial transformation of food on health: from fermented foods to fermentation in the gastro-intestinal tract. Current Opinion in Biotechnology, 22(2), 211-219. doi: 10.1016/j.copbio.2010.12.004.
McKenna, P., Hoffmann, C., Minkah, N., Aye, P. P., Lackner, A., Liu, Z., et al. (2008). The macaque gut microbiome in health, lentiviral infection, and chronic enterocolitis. PLoS pathogens, 4(2), e20. doi: 10.1371/journal.ppat.0040020.
Mugula, J. (2003). Microbiological and fermentation characteristics of togwa, a Tanzanian fermented food. International Journal of Food Microbiology, 80(3), 187-199. doi: 10.1016/S0168-1605(02)00141-1.
OʼKeefe, S. J., Kidd, M., Espitalier-Noel, G., & Owira, P. (1999). Rarity of colon cancer in Africans is associated with low animal product consumption, not fiber. The American journal of gastroenterology, 94(5), 1373-80. doi: 10.1111/j.1572-0241.1999.01089.x.
Perry, G. H., Dominy, N. J., Claw, K. G., Lee, A. S., Fiegler, H., Redon, R., et al. (2007). Diet and the evolution of human amylase gene copy number variation. Nature genetics, 39(10), 1256-60. doi: 10.1038/ng2123.
Rinne, M. M., Gueimonde, M., Kalliomäki, M., Hoppu, U., Salminen, S. J., & Isolauri, E. (2005). Similar bifidogenic effects of prebiotic-supplemented partially hydrolyzed infant formula and breastfeeding on infant gut microbiota. FEMS immunology and medical microbiology, 43(1), 59-65. doi: 10.1016/j.femsim.2004.07.005.
Roebroeks, W., & Villa, P. (2011). On the earliest evidence for habitual use of fire in Europe. Proceedings of the National Academy of Sciences of the United States of America, 1018116108-. doi: 10.1073/pnas.1018116108.
Sengupta, S., Muir, J. G., & Gibson, P. R. (2006). Does butyrate protect from colorectal cancer? Journal of gastroenterology and hepatology, 21(1 Pt 2), 209-18. doi: 10.1111/j.1440-1746.2006.04213.x.
Smith, J., Yokoyama, W., & German, J. B. (1998). Butyric Acid from the Diet: Actions at the Level of Gene Expression. Critical Reviews in Food Science and Nutrition, 38(4), 259-297. doi: 10.1080/10408699891274200.
de Vrese, M., & Marteau, P. R. (2007). Probiotics and Prebiotics: Effects on Diarrhea. J. Nutr., 137(3), 803S-811S. Retrieved May 9, 2011, from http://jn.nutrition.org/cgi/content/abstract/137/3/803S.
Vuksan, V., Whitham, D., Sievenpiper, J. L., Jenkins, A. L., Rogovik, A. L., Bazinet, R. P., et al. (2007). Supplementation of conventional therapy with the novel grain Salba (Salvia hispanica L.) improves major and emerging cardiovascular risk factors in type 2 diabetes: results of a randomized controlled trial. Diabetes care, 30(11), 2804-10. doi: 10.2337/dc07-1144.
Wong, J. M. W., de Souza, R., Kendall, C. W. C., Emam, A., & Jenkins, D. J. A. (2006). Colonic health: fermentation and short chain fatty acids. Journal of Clinical Gastroenterology, 40(3), 235-43. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/16633129.
Wrangham, R. (2003). “Cooking as a biological trait.” Comparative Biochemistry and Physiology - Part A: Molecular & Integrative Physiology, 136(1), 35-46. doi: 10.1016/S1095-6433(03)00020-5.
Wächtershäuser, A., & Stein, J. (2000). Rationale for the luminal provision of butyrate in intestinal diseases. European Journal of Nutrition, 39(4), 164-71. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/11079736.
On Friday I was reading Food and Western Disease, and in the intro Dr. Lindeberg talks about why some starches like yams might be OK, since humans have such a long history of eating them in our ancestral African homeland, whereas potatoes are a "New World" crop and humans have only been there since, at the earliest, 35,000 years ago. The problem is that when I see people saying their yam dishes are paleo and potatoes aren't... they are not really eating yams, but sweet potatoes, which are Ipomoea, a NEW WORLD crop. Maybe Ipomoea has fewer anti-nutrients, but from a botanical viewpoint anti-nutrient composition is independent of ancestral history. There are many foods we have been eating a very, very long time, like cycad, that are far more toxic than cultivated vegetables. Humans have bred vegetables to be less toxic in most cases.
This is "New World"
True yam (Dioscorea) is quite different from the sweet potato. Most yams are quite large, with scratchy skin and white flesh, though purple yam (ube) exists too.
I bought one at a local market recently. They are quite cheap here because they are popular with many immigrant populations and are imported in large quantities. But after I bought it, it languished on the counter for some time because it was kind of intimidating: it just looked large and grizzled. I decided to first try it made by the experts: Nigerians. Luckily there is a Nigerian restaurant near me.
Nigerians typically serve yam as "fufu," which is boiled yam pounded into a dough, though there are cassava and corn fufus too. The fufu I had was pretty bland and starchy, so I don't know if I'll be seeking out African yam in the future. I really liked the fermented cassava fufu, though. It had a wonderful sour taste.
I was going to write about how all the paleo folks eating sweet potatoes and thinking they are so authentic are a bunch of posers, but then I realized I might be endangering all the paleo manhood. You see, yams actually have much scarier plant chemicals than sweet potatoes or potatoes. For example, researchers looking into the unusual incidence of twins in some Nigerian towns found that chemicals in yams might be to blame:
In 1996 Ugwonali went to Nigeria on a Downs fellowship to begin his research. After analyzing age, socioeconomic factors and other variables, Ugwonali focused on diet. Demographic and scientific studies conducted in the early 1970s pointed to white yams as the culprit in the mystery of multiple births in southwestern Nigeria. Ugwonali interviewed people about their eating habits and made his own observations. “We suspected environmental factors,” he said. “The only factor that ended up being different from the ones we controlled was yams.” In laboratories at Yale and in Nigeria, he fed rats a diet of yams and saw the average size of their litters double from about four to about nine.
“Our hypothesis is that yams act as anti-estrogens,” he said, noting that he hasn’t investigated the precise chemical link between yams and fertility and has yet to isolate an anti-estrogen from yams. Anti-estrogens fool the brain into thinking there is insufficient estrogen, causing it to release more of a hormone called gonadotrophin and increase the ovulation rate, he said.
The most "paleo-sounding" yam, the wild yam (Dioscorea villosa), is even more questionable, since in folk medicine its extracts are thought to be estrogenic and are used for "bust enhancing." In some studies they have been shown to change sex hormone concentrations (though other studies show they are ineffective):
After yam ingestion, there were significant increases in serum concentrations of estrone (26%), sex hormone binding globulin (SHBG) (9.5%), and near significant increase in estradiol (27%).
So yeah boys, keep eatin those "paleo" "yams." They taste much better and won't give you man boobs.
I spent this weekend in King of Prussia, Pennsylvania at the Weston A. Price Foundation's Wise Traditions conference with John Durant and Allison Bojarski. I live-Tweeted it, but here is also a list of things I learned:
And a bonus:
11. The government isn't going to fix the food system, and in its blundering it will destroy many small farmers and food businesses. Wow, it was scary seeing a documentary called Farmageddon, which recounted military-style raids on FARMS. It was weird being in the same room as many of the people I wrote my senior food law thesis on, like Linda Faillace and Mark McAfee. I was very glad to pay $4 at breakfast for bone broth because it supported the Farm-to-Consumer Legal Defense Fund. But I still don't feel sad about not going to law school, because the whole thing is just too depressing for me.
Great, a new pop-sci treatment of an anthropology paper that your Aunt Maude will forward to you with the implication that you should eat her whole wheat pancakes next time you visit. The article portrays this as some kind of groundbreaking research that totally changes our view of the Paleolithic.
So what's the deal with this study? Now that I've wormed my way into academia again somehow, I read the paper. The researchers found something that looks like a mortar and pestle, with some evidence of starch residues.
The title says flour, but that's not the good old white flour your Aunt Maude is thinking of. Of the nine species mentioned, one is a seed; the rest are roots and rhizomes. That ground starch has been used by humans since the Upper Paleolithic is not really news. Richard Wrangham, the famous anthropologist who wrote Catching Fire, has been writing about the role of cooked starch in the Upper Paleolithic for quite some time. In the Upper Paleolithic, starch might have spurred population increases that eventually led to early settlements like Gobekli Tepe. And there has been selection for genes like AMY1, which allow for better starch digestion.
I also think isotope studies are a little more accurate than a few, as the paper itself admits, "poorly preserved" plant remains. And the isotope evidence is that the protein in the Paleolithic diet was mostly animal protein.
Find the whole wheat...
I've had cattail and it's not bad, though it's a pain in the ass to gather and process. If you want something similar, chestnuts are another starchy paleo-ish food, which by coincidence I ate today. So if it makes you feel more authentic, have some yams or chestnuts alongside your steak. But steak is king.
Whenever an article about the paleo diet is published in a major newspaper, at least one commenter expresses dismay that paleo dieters don't realize humans are adapted to grains and milk. That's a misconception on several levels. First of all, plenty of us are educated enough to know that genetic adaptations can occur rapidly. I remember in high school when I first read The Beak of the Finch, which is about the finches of the Galapagos Islands and how their populations respond genetically, and rapidly, to changes in the environment. It takes down the myth that evolution is slow and can't be observed.
In that case, why are we still talking about what our ancestors ate as if it matters? Well, so far the evidence is that some adaptations have occurred in some populations in response to Neolithic foods. Genetic evidence shows that most of the population in modern societies is descended from agriculturalists who had been farming for several thousand years. Clearly, our ancestors were very much able to survive on diets of grains and dairy.
I was just reading the scientific paper Demeter's Legacy, which is free online and a fascinating read. Yes, there are two major genetic adaptations in agriculturalist populations: one improves the digestion of starch, the other of dairy. Great, we can eat these foods and reproduce. Yay. But that doesn't mean we are completely adapted to them. There are plenty of foods that are digestible for everyday needs but damaging in the long term. It's up to us to do the research and figure out whether foods are really worth it. I ate bread for most of my life and felt OK, but life for me is not just about surviving; it's about thriving. It's important to remember that even though adaptations have occurred, the vast majority of our genes were forged before agriculture.
And for people descended from more recent hunter-gatherers, neolithic foods are even more devastating.
I created a list, which I am still adding foods to, that outlines some pros and cons of various foods from the paleo viewpoint. I think foods should be judged on their merits, and there is no "one true" paleo diet. There can't be, since last time I checked I couldn't get wild antelope at the grocery store. It's about learning from the wisdom of the past and choosing food based on those principles, not reenactment.