CBR1100XX.org Forum

Keto Info Week 8/27


spEEdfrEEk


CATEGORY: diets/vegetarian

TECHNICAL: **

SUMMARY:

This is part two of one of the most profound documents

I had ever read at the time I found it. It is written by Ward

Nicholson who has done a tremendous amount of research into

human diets based on evolution. As I said in the first part,

Nicholson initially practiced a type of diet known as the

"hygienic" diet -- which is a strict vegetarian/vegan diet in

which everything is consumed raw and unprocessed. I'm quite

sure that you too will get as much out of this document as I

did. And, after you've had a chance to read through it (and the

remaining parts), I bet you too will find his argument pretty

convincing. And, of course, that argument is that human beings

could never have evolved the way we did if we had been

vegetarians/vegans or fruitarians.

Possibly the most profound statement, and one that I've

repeated, is that most people have forgotten that modern drugs only

mask the symptoms of an illness. A real cure can be sought

through returning to a natural, evolution-based human diet.

The discussion on the use of fire for cooking is pretty interesting

too.

-------------------------------------------------------------

Part 2 of our Visit with Ward Nicholson

Fire And Cooking In Human Evolution,

Rates Of Genetic Adaptation To Change,

Hunter-Gatherers, And Diseases In The Wild

Health & Beyond: Ward, in Part 1 of our interview, you discussed the

extensive evidence showing that primitive human beings as well as almost

all of the primates today have included animal foods such as flesh or

insects in their diets. Why haven't Natural Hygienists and other

vegetarians looked into all this information?

Ward Nicholson: My guess is that: (1) Most aren't aware that

paleoanthropologists have by now assembled a considerable amount of data

about our evolutionary past related to diet. But more importantly, I think

it has to do with psychological barriers, such as: (2) Many Hygienists

assume they don't have to look because the subjective "animal model" for

raw-food naturalism makes it "obvious" what our natural diet is, and

therefore the paleontologists' evidence must be in error, or

biased by present cultural eating practices. Or: (3) They don't want to

look, perhaps because they're afraid of what they might see.

I think in spite of what most Natural Hygienists will tell you, they

are really more wedded to certain specific details of the Hygienic system

that remain prevalent (i.e., raw-food vegetarianism, food-combining, etc.)

than they are truly concerned with whether those details follow logically

from underlying Hygienic principles. The basic principle of Natural

Hygiene is that the body is a self-maintaining, self-regulating,

self-repairing organism that naturally maintains its own health when it is

given food and other living conditions appropriate to its natural

biological adaptation.

In and of itself, this does not tell you what foods to eat. That has

to be determined by a review of the best evidence we have available. So

while the principles of Hygiene as a logical system do not change, our

knowledge of the appropriate details that follow from those principles may

and probably will change from time to time--since science is a process of

systematically elucidating more "known" information from what used to be

unknown. Thus the accuracy of our knowledge is to some extent time-based,

dependent on the accumulation of evidence to provide a more inclusive view

of "truth" which unfortunately is probably never absolute, but--as far as

human beings are concerned--relative to the state of our knowledge.

Science simply tries to bridge the knowledge gap. And a hallmark of

closing the knowledge gap through scientific discovery is openness to

change and refinements based on the accumulation of evidence.

Open-mindedness is really openness to change. Just memorizing details

doesn't mean much in and of itself. It's how that information is

organized, or seen, or interpreted, or related to, that means something.

What's interesting to me is that the evolutionary diet is not so

starkly different from the Hygienic diet. Much of it validates important

elements of the Hygienic view. It is very similar in terms of getting

plenty of fresh fruits and veggies, some nuts and seeds, and so forth,

except for the addition of a smaller amount of flesh and other animal

foods (at least compared to the much larger role of plant foods) in

the diet. That is the one exception. We have actually done fairly well in

approximating humanity's "natural" or "original" diet, except we have been

in error about this particular item, and gotten exceedingly fundamentalist

about it when there is nothing in the body of Hygienic principles

themselves that would outlaw meat if it's in our evolutionary adaptation.

But for some reason, even though Natural Hygiene does not rest on any

"ethical" basis for vegetarianism (officially at least), this particular

item seems to completely freak most Hygienists out. Somehow we have made a

religion out of dietary details that have been the hand-me-downs of past

Hygienists working with limited scientific information. They did the best

they could given the knowledge they had available to them then, and we

should be grateful for their hard work. But today the rank and file of

Natural Hygiene has largely forgotten Herbert Shelton's rallying cry, "Let

us have the truth, though the heavens fall."

Natural Hygiene was alive and vital in Shelton's time because he was

actively keeping abreast of scientific knowledge and aware of the need to

modify his previous views if scientific advances showed them to be

inadequate. But since Shelton retired from the scene, many people in the

mainstream of Hygiene have begun to let their ideas stagnate and become

fossilized. The rest of the dietary world is beginning to pass us by in

terms of scientific knowledge.

As I see it, there remain only two things Natural Hygiene grasps that

the rest of the more progressive camps in the dietary world still don't:

(1) An understanding of the fundamental health principle that outside

measures (drugs, surgery, etc.) never truly "cure" degenerative health

problems. In spite of the grandiose hopes and claims that they do, and the

aura of research breakthroughs, their function is really to serve as

crutches, which can of course be helpful and may truly be needed in some

circumstances. But the only true healing is from within by a body that has

a large capacity, within certain limits, to heal and regenerate itself

when given all of its essential biological requirements--and nothing more

or less which would hamper its homeostatic functioning. The body's

regenerative (homeostatic) abilities are still commonly unrecognized today

(often classed as "unexplained recoveries" or--in people fortunate enough

to recover from cancer--as "spontaneous remission") because the population

at large is so far from eating anything even approaching a natural diet

that would allow their bodies to return to some kind of normal health,

that it is just not seen very often outside limited pockets of people

seriously interested in approximating our natural diet. And the other

thing is: (2) Hygienists are also keenly aware of the power of fasting to

help provide ideal conditions under which such self-healing can occur.

But the newer branch of science called "Darwinian medicine" is slowly

beginning (albeit with certain missteps) to grasp the principle of self

healing, or probably more correctly, at least the understanding that

degenerative diseases arise as a result of behavior departing from what

our evolutionary past has adapted us to. They see the negative side of how

departing from our natural diet and environment can result in degenerative

disease, but they do not understand that the reverse--regenerating health

by returning to our pristine diet and lifestyle, without drugs or other

"crutches"--is also possible, again, within certain limits, but those

limits are less than most people believe.

In some ways, though, Hygiene now resembles a religion as much as it

does science, because people seem to want "eternal" truths they can grab

onto with absolute certainty. Unfortunately, however, knowledge does not

work that way. Truth may not change, but our knowledge of it certainly

does as our awareness of it shifts or expands. Once again: The principles

of Hygiene may not change, but the details will always be subject to

refinement.

Speaking of such details subject to refinement, I know you've been

sitting on some very suggestive evidence to add further fuel to the

fire-and-cooking debate now raging between the raw-foodist and

"conservative-cooking" camps within Hygiene. Please bring us up-to-date on

what the evolutionary picture has to say about this.

I'd be happy to. But

before we get into the evolutionary viewpoint, I want to back up a bit

first and briefly discuss the strange situation in the Hygienic community

occurring right now over the raw foods vs. cooking-of-some-starch-foods

debate. The thing that fascinates me about this whole brouhaha is the way

the two sides justify their positions, each of which has a strong point,

but also a telling blind spot.

Now since most Natural Hygienists don't have any clear, science-based

picture of what behavior was natural in our evolutionary past, the

"naturalistic" model used by many Hygienists to argue for eating all foods

raw does so on a subjective basis--i.e., what I have called "the animal

model for raw-food naturalism." The idea being that we are too blinded

culturally by modern food practices involving cooking, and to be more

objective we should look at the other animals--none of whom cook their

food--so neither should we. Now it's true the "subjective raw-food

naturalists" are being philosophically consistent here, but their blind

spot is they don't have any good scientific evidence from humanity's

primitive past to back up their claim that total raw-foodism is the most

natural behavior for us--that is, using the functional definition based on

evolutionary adaptation I have proposed if we are going to be rigorous and

scientific about this.

Now on the other hand, with the doctors it's just the opposite story.

In recent years, the Natural Hygiene doctors and the ANHS (American

Natural Hygiene Society) have been more and more vocal about what they say

is the need for a modest amount of cooked items in the diet--usually

starches such as potatoes, squashes, legumes, and/or grains. And their

argument is based on the doctors' experience that few people they care for

do as well on raw foods alone as they do with the supplemental addition of

these cooked items. Also, they argue that there are other practical

reasons for eating these foods, such as that they broaden the diet

nutritionally, even if one grants that some of those nutrients may be

degraded to a degree by cooking. (Though they also say the assimilation of

some nutrients is improved by cooking.) They also point out these

starchier foods allow for adequate calories to be eaten while avoiding the

higher levels of fat that would be necessary to obtain those calories if

extra nuts and avocados and so forth were eaten to get them.

So we have those with wider practical experience arguing for the

inclusion of certain cooked foods based on pragmatism. But their blind

spot is in ignoring or attempting to finesse the inconsistency their

stance creates with the naturalist philosophy that is the very root of

Hygienic thinking. And again, the total-raw-foodists engage in just the

opposite tactics: being philosophically consistent in arguing for all-raw

foods, but being out of touch with the results most other people in the

real world besides themselves get on a total raw-food diet, and attempting

to finesse that particular inconsistency by nit-picking and fault-finding

other implementations of the raw-food regime than their own. (I might

interject here, though we'll cover this in more depth later, that although

it's not true for everyone, experience of most people in the Natural

Hygiene M2M supports the view that the majority do in fact do better when

they add some cooked foods to their diet.)

Now my tack as both a realist and someone who is also interested in

being philosophically consistent has been: If it is true that most people*

do better with the inclusion of some of these cooked items in their diet

that we've mentioned--and I believe it is, based on everything I have seen

and heard--then there must be some sort of clue in our evolutionary past

why this would be so, and which would show why it might be natural for us.

The question is not simply whether fire and cooking are "natural" by

some subjective definition. It's whether they have been used long enough

and consistently enough by humans during evolutionary time for our bodies

to have adapted genetically to the effects their use in preparing foods

may have on us. Again, this is the definition for "natural" that you have

to adopt if you want a functional justification that defines "natural"

based on scientific validation rather than subjectivity.

So the next question is obvious: How long have fire and cooking been

around, then, and how do we know whether that length of time has been long

enough for us to have adapted sufficiently?

Let's take the question one

part at a time. The short answer to the first part of the question is that

fire was first controlled by humans anywhere from about 230,000 years ago

to 1.4 or 1.5 million years ago, depending on which evidence you accept as

definitive.

The earliest evidence for control of fire by humans, in the form of

fires at Swartkrans, South Africa and at Chesowanja, in Kenya, suggests

that it may possibly have been in use there as early as about 1.4 or 1.5

million years ago.[100] However, the interpretation of the physical

evidence at these early sites has been under question in the

archaeological community for some years now, with critics saying these

fires could have been wildfires instead of human-made fires. They suggest

the evidence for human control of fire might be a misreading of other

factors, such as magnesium-staining of soils, which can mimic the results

of fire if not specifically accounted for. For indisputable evidence of

fire intentionally set and controlled by humans, the presence of a hearth

or circle of scorched stones is often demanded as conclusive proof,[101]

and at these early sites, the evidence tying the fires to human control is

based on other factors.

At the other end of the timescale, these same critics who are only

willing to consider the most unequivocal evidence will still admit that at

least by 230,000 years ago[102] there is enough good evidence at at least

one site to establish that fire was under human control by this time. At

this site, called Terra Amata, an ancient beach location on the French

Riviera, stone hearths are found at the center of what may have been huts;

and more recent sources may put the site's age at possibly 300,000 years

old rather than 230,000.[103]

Somewhat further back--from around 300,000 to 500,000 years ago--more

evidence has been accumulating recently at sites in Spain and France[104]

that looks as if it may force the ultraconservative paleontologists to

concede their 230,000-year-ago date is too stingy, but we'll see.

And then there is Zhoukoudian cave in China, one of the most famous

sites connected with Homo erectus, where claims that fire may have been

used as early as 500,000 to 1.5 million years ago have now largely been

discredited due to the complex and overlapping nature of the evidence left

by not just humans, but hyenas and owls who also inhabited the cave. (Owl

droppings could conceivably have caught fire and caused many of the

fires.) Even after discounting the most extreme claims, however, it does

seem likely that at least by 230,000 to 460,000 years ago humans were

using fire in the cave[105], and given scorching patterns around the teeth

and skulls of some animal remains, it does appear the hominids may have

done this to cook the brains (not an uncommon practice among

hunting-gathering peoples today).[106]

The most recent excavation with evidence for early use of fire has

been within just the last couple of years in France at the Menez-Dregan

site, where a hearth and evidence of fire has been preliminarily dated to

approximately 380,000 to 465,000 years. If early interpretations of the

evidence withstand criticism and further analysis, the fact that a hearth

composed of stone blocks inside a small cave was found with burnt

rhinoceros bones close by has provoked speculation that the rhino may have

been cooked at the site.[107]

Now of course, the crucial question for us isn't just when the

earliest control of fire was, it's at what date fire was being used

consistently--and more specifically for cooking, so that more-constant

genetic selection pressures would have been brought to bear. Given the

evidence available at this time, most of it would probably indicate that

125,000 years ago is the earliest reasonable estimate for widespread

control.*[108] Another good reason it may be safer to base adaptation to

fire and cooking on the figure of 125,000 years ago is that more and more

evidence is indicating modern humans today are descended from a group of

ancestors who were living in Africa 100,000-200,000 years ago, who then

spread out across the globe to replace other human groups.[109] If true,

this would probably mean the fire sites in Europe and China are those of

separate human groups who did not leave descendants that survived to the

present. Given that the African fire sites in Kenya and South Africa from

about 1.5 million years ago are under dispute, then, widespread usage at

125,000 years seems the safest figure for our use here.

One thing we can say about the widespread use of fire that was probably in place by

125,000 years ago, however, is that it would almost certainly have

included the use of fire for cooking.* Why can this be assumed? It has to

do with the sequence for the progressive stages of control over fire that

would have had to have taken place prior to fire usage becoming

commonplace. And the most interesting of these is that fire for cooking

would almost inevitably have been one of the first uses it was put to by

humans, rather than some later-stage use.*

The first fires on earth occurred approximately 350 million years

ago--the geological evidence for fire in remains of forest vegetation

being as old as the forests themselves.[110] It is usual to focus only on

fire's immediately destructive effects to plants and wildlife, but there

are also benefits. In response to occasional periodic wildfires, for

example, certain plants and trees known as "pyrophytes" have evolved, for

whose existence periodic wildfires are essential. Fire revitalizes them by

destroying their parasites and competitors, and such plants include

grasses eaten by herbivores as well as trees that provide shelter and food

for animals.[111]

Fires also provide other unintended benefits to animals as well. Even

at the time a wildfire is still burning, birds of prey (such as falcons

and kites)--the first types of predators to appear at fires--are attracted

to the flames to hunt fleeing animals and insects. Later, land-animal

predators appear when the ashes are smouldering and dying out to pick out

the burnt victims for consumption. Others, such as deer and bovine animals,

appear after that to lick the ashes for their salt content. Notable as

well is that most mammals appear to enjoy the heat radiated at night at

sites of recently burned-out fires.[112]

It would have been inconceivable, therefore, that human beings, being

similarly observant and opportunistic creatures, would not also have

partaken of the dietary windfall provided by wildfires they came across.

And thus, even before humans had learned to control fire purposefully--and

without here getting into the later stages of control over fire--their

early passive exposures to it would have already introduced them, like the

other animals, to the role fire could play in obtaining edible food and

providing warmth.

So if fire has been used on a widespread basis for cooking since

roughly 125,000 years ago, how do we know if that has been enough time for

us to have fully adapted to it? To answer that, we have to be able to

determine the rate at which the genetic changes constituting evolutionary

adaptation take place in organisms as a result of environmental or

behavioral change--which in this case means changes in food intake.

The two sources for estimates of rates at which genetic change takes

place are from students of the fossil record and from population

geneticists. Where the fossil record is concerned, Niles Eldredge, along

with Stephen Jay Gould, two of the most well-known modern evolutionary

theorists, estimated the time span required for "speciation events" (the

time required for a new species to arise in response to evolutionary

selection pressures) to be somewhere within the range of "five to 50,000

years."[113] Since this rough figure is based on the fossil record, it

makes it difficult to be much more precise than that range. Eldredge also

comments that "some evolutionary geneticists have said that the estimate

of five to 50,000 years is, if anything, overly generous."[114] Also

remember that this time span is for changes large enough to result in a

new species classification. Since we are talking here about changes

(digestive changes) that may or may not be large enough to result in a new

species (though changes in diet often are in fact behind the origin of new

species), it's difficult to say from this particular estimate whether we

may be talking about a somewhat shorter or longer time span than that for

adaptation to changes in food.

Fortunately, however, the estimates from the population geneticists

are more precise. There are even mathematical equations to quantify the

rates at which genetic change takes place in a population, given

evolutionary "selection pressures" of a given magnitude that favor

survival of those individuals with a certain genetic trait.[115] The

difficulty lies in how accurately one can numerically quantify the

intensity of real-world selection pressures. However, it turns out there

have been two or three actual examples where it has been possible to do so

at least approximately, and they are interesting enough I'll mention a

couple of them briefly here so people can get a feel for the situation.

The most interesting of these examples relates directly to our

discussion here, and has to do with the gene for lactose tolerance in

adults. Babies are born with the capacity to digest lactose via production

of the digestive enzyme lactase. Otherwise they wouldn't be able to make

use of mother's milk, which contains the milk sugar lactose. But sometime

after weaning, this capacity is normally lost, and there is a gene that is

responsible. Most adults--roughly 70% of the world's population

overall--do not retain the ability to digest lactose into adulthood[116]

and this outcome is known as "lactose intolerance." (Actually this is

something of a misnomer, since adult lactose intolerance would have been

the baseline normal condition for virtually everyone in the human race up

until Neolithic (agricultural) times.[117]) If these people attempt to

drink milk, then the result may be bloating, gas, intestinal distress,

diarrhea, etc.[118]

However--and this is where it gets interesting--those population

groups that do retain the ability to produce lactase and digest milk into

adulthood are those descended from the very people who first began

domesticating animals for milking during the Neolithic period several

thousand years ago.[119] (The earliest milking populations in Europe,

Asia, and Africa began the practice probably around 4,000 B.C.[120]) And

even more interestingly, in population groups where cultural changes have

created "selection pressure" for adapting to certain behavior--such as

drinking milk in this case--the rate of genetic adaptation to such changes

significantly increases. In this case, the time span for widespread

prevalence of the gene for lactose tolerance within milking population

groups has been estimated at approximately 1,150 years[121]--a very short

span of time in evolutionary terms.
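
[A minimal sketch in Python of the textbook single-locus selection
recursion can make this kind of rate arithmetic concrete. The starting
frequency, selection coefficient, and 25-year generation length below are
illustrative assumptions, not figures from the studies cited; the point is
only that under strong cultural selection an advantageous gene can spread
through most of a population within roughly a thousand years, consistent
with the estimate just mentioned.]

def allele_frequency_trajectory(p0, s, generations):
    """Haploid selection recursion: carriers of the favored allele have
    relative fitness 1 + s; returns the frequency after each generation."""
    freqs = [p0]
    p = p0
    for _ in range(generations):
        mean_fitness = p * (1 + s) + (1 - p)   # population mean fitness
        p = p * (1 + s) / mean_fitness         # frequency in the next generation
        freqs.append(p)
    return freqs

# Illustrative assumptions only: allele starts at 5% frequency, carriers
# enjoy a 10% survival/reproduction advantage, generations span ~25 years.
trajectory = allele_frequency_trajectory(p0=0.05, s=0.10, generations=50)
print(f"frequency after 50 generations (~1,250 years): {trajectory[-1]:.2f}")
# prints roughly 0.86 -- i.e., most of the population now carries the allele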

There is a very close correlation between the 30% of the world's

population who are tolerant to lactose and the earliest human groups who

began milking animals. These individuals are represented most among

modern-day Mediterranean, East African, and Northern European groups, and

emigrants from these groups to other countries. Only about 20% of white

Americans in general are lactose intolerant, but among other groups the

rates are higher: 90-100% among Asian-Americans (as well as Asians

worldwide), 75% of African-Americans (most of whom came from West Africa),

and 80% of Native Americans. 50% of Hispanics worldwide are lactose

intolerant.[122]

Now whether it is still completely healthy for the 30% of the world's

population who are lactose tolerant to be drinking animal's milk--which is

a very recent food in our evolutionary history--I can't say. It may well

be that, beyond the ability to produce lactase, there are other factors

involved in successfully digesting and making use of milk without health

side-effects--I haven't looked into that particular question yet. But for our

purposes here, the example does powerfully illustrate that genetic

adaptations for digestive changes can take place with much more rapidity

than was perhaps previously thought.*

Another interesting example of the spread of genetic adaptations

since the Neolithic has been two specific genes whose prevalence has been

found to correlate with the amount of time populations in different

geographical regions have been eating the grain-based high-carbohydrate

diets common since the transition from hunting and gathering to Neolithic

agriculture began 10,000 years ago. (These two genes are the gene for

angiotensin-converting enzyme--or ACE--and the one for apolipoprotein B,

which, if the proper forms are not present, may increase one's chances of

getting cardiovascular disease.)[123]

In the Middle East and Europe, rates of these two genes are highest

in populations (such as Greece, Italy, and France) closer to the Middle

Eastern "fertile crescent" where agriculture in this part of the globe

started, and lowest in areas furthest away, where the migrations of early

Neolithic farmers with their grain-based diets took longest to reach

(i.e., Northern Ireland, Scotland, Finland, Siberia). Closely correlating

with both the occurrence of these genes and the historical rate of grain

consumption are corresponding rates of deaths due to coronary heart

disease. Those in Mediterranean countries who have been eating

high-carbohydrate grain-based diets the longest (for example since

approximately 6,000 B.C. in France and Italy) have the lowest rates of

heart disease, while those in areas where dietary changes due to

agriculture were last to take hold, such as Finland (perhaps only since

2,000 B.C.), have the highest rates of death due to heart attack.

Breast cancer rates in Europe are also higher in countries that

have been practicing agriculture for the least amount of time.[124]

Whether grain-based diets eaten by people whose ancestors only began

doing so recently (and who therefore lack the appropriate gene) are actually

causing these health problems (and are not simply correlated by coincidence)

is at this point a hypothesis under study. (One study with chickens,

however--who in their natural environment eat little grain--has shown much

less atherosclerosis on a high-fat, high-protein diet than on a low-fat,

high-carbohydrate diet.[125]) But again, and importantly, the key point

here is that genetic changes in response to diet can be more rapid than

perhaps once thought. The difference in time since the advent of Neolithic

agriculture between countries with the highest and lowest incidences of

these two genes is something on the order of 3,000-5,000 years,[126]

showing again that genetic changes due to cultural selection pressures for

diet can force more rapid changes than might occur otherwise.

Now we should also look at the other end of the time scale for some

perspective. The Cavalli-Sforza population genetics team that has been one

of the pioneers in tracking the spread of genes around the world due to

migrations and/or interbreeding of populations has also looked into the

genes that control immunoglobulin types (an important component of the

immune system). Their estimate here is that the current variants of these

genes were selected for within the last 50,000-100,000 years, and that

this time span would be more representative for most groups of genes. They

also feel that in general it is unlikely gene frequencies for most groups

of genes would undergo significant changes in time spans of less than

about 11,500 years.[127]

However, the significant exception they mention--and this relates

especially to our discussion here--is where there are cultural pressures

for certain behaviors that affect survival rates.[128] And the two

examples we cited above: the gene for lactose tolerance (milk-drinking)

and those genes associated with high-carbohydrate grain consumption, both

involve cultural selection pressures that came with the change from

hunting and gathering to Neolithic agriculture. Again, cultural selection

pressures for genetic changes operate more rapidly than any other kind.

Nobody yet, at least so far as I can tell, really knows whether or

not the observed genetic changes relating to the spread of milk-drinking

and grain-consumption are enough to confer a reasonable level of

adaptation to these foods among populations who have the genetic changes,

and the picture seems mixed.* Rates of gluten intolerance (gluten is a

protein in certain grains such as wheat, barley, and oats that makes dough

sticky and conducive to bread-baking) are lower than for lactose

intolerance, which one would expect given that milk-drinking has been

around for less than half the time grain-consumption has. Official

estimates of gluten intolerance range from 0.3% to 1% worldwide depending

on population group.[129] Some researchers, however, believe that gluten

intolerance is but the tip of the iceberg of problems due to grain

consumption (or more specifically, wheat). Newer research seems to suggest

that anywhere from 5% to as much as 20-30% of the population with certain

genetic characteristics (resulting in what is called a "permeable

intestine") may absorb incompletely digested peptide fragments from wheat

with adverse effects that could lead to a range of possible diseases.[130]

We have gone a little far afield here getting some kind of grasp on

rates of genetic change, but I think it's been necessary for us to have a

good sense of the time ranges involved. So to bring this back around to

the question of adaptation to cooking, it should probably be clear by this

point that given the time span involved (likely 125,000 years since fire

and cooking became widespread), the chances are very high that we are in

fact adapted to the cooking of whatever foods were consistently cooked.* I

would include in these some of the vegetable foods, particularly the

coarser ones like starchy root vegetables such as yams, which have long

been thought to have been cooked,[131] and perhaps others, as well as meat,

from what we know about the fossil record.

What about the contention by raw-food advocates that cooking foods

results in pyrolytic by-products that are carcinogenic or otherwise toxic

to the body, and should be avoided for that reason?

It's true cooking introduces some toxic byproducts, but it also

neutralizes others.[132] In addition, the number of such toxins created is

dwarfed by the large background level of natural toxins (thousands)[133]

already present in plant foods from nature to begin with, including some

that are similarly carcinogenic in high-enough doses. (Although only a few

dozen have been tested so far,[134] half of the naturally occurring

substances in plants known as "nature's pesticides" that have been tested

have been shown to be carcinogenic in trials with rats and mice.[135])

Nature's pesticides appear to be present in all plants, and though only a

few are found in any one plant, 5-10% of a plant's total dry weight is

made up of them.[136]

[The reason "nature's pesticides" occur throughout the plant kingdom

is because plants have had to evolve low-level defense mechanisms against

animals to deter overpredation. On one level, plants and animals are in a

continual evolutionary "arms race" against each other. Fruiting plants, of

course, have also evolved the separate ability to exploit the fact that

certain animals are attracted to the fruit by enabling its seeds to be

dispersed through the animals' feces.]

We have a liver and kidneys for a reason, which is that there have

always been toxins in natural foods that the body has had to deal with,

and that's one reason why these organs evolved. There are also a number of

other more general defenses the body has against toxins. These types of

defenses make evolutionary sense given the wide range of toxic elements in

foods the body has had to deal with over the eons. [Not clear enough in

the original version of the interview is the point that a wide range of

GENERAL defenses might therefore be reasonably expected to aid in

neutralizing or ejecting toxins even of a type the body hadn't necessarily

seen before, such as those that might be introduced by cooking practices.]

Such mechanisms include the constant shedding of surface-layer cells of

the digestive system, many defenses against oxygen free-radical damage,

and DNA excision repair, among others.[137]

The belief that a natural diet is, or can be, totally toxin-free is

basically an idealistic fantasy--an illusion of black-and-white thinking

not supported by real-world investigations. The real question is not

whether a diet is completely free of toxins, but whether we are adapted to

process what substances are in our foods--in reasonable or customary

amounts such as encountered during evolution--that are not usable by the

body. Again, the black-and-white nature of much Hygienic thinking obscures

here what are questions of degrees rather than absolutes.

Also, and I know raw-foodists generally don't like to hear this, but

there has long been evidence cooking in fact does make foods of certain

types more digestible. For example, trypsin inhibitors (themselves a type

of protease inhibitor) which are widely distributed in the plant kingdom,

particularly in rich sources of protein, inhibit the ability of digestive

enzymes to break down protein. (Probably the best-known plants containing

trypsin inhibitors are legumes and grains.) Research has shown the effect

of most such protease inhibitors on digestion to be reduced by

cooking.[138] And it is this expansion of the range of

utilizable foods in an uncertain environment that was the evolutionary

advantage that helped bring cooking about and enhanced survival.*

I want to make clear that I still believe the largest component of

the diet should be raw (at least 50% if not considerably more), but there

is provision in the evolutionary picture for reasonable amounts of cooked

foods of certain types, such as at the very least, yams, probably some

other root vegetables, the legumes, some meat, and so forth. (With meat,

the likelihood is that it was eaten raw when freshly killed, but what

could not be eaten would likely have been dried or cooked to preserve it

for later consumption, rather than wasting it.) Whether or not some foods

like these can be eaten raw if one has no choice or is determined enough

to do so is not the real question. The question is what was more expedient

or practical to survival and which prevailed over evolutionary time.

A brief look at the Australian Aborigines might be illustrative

here.* What data is available since the aborigines were first encountered

by Europeans shows that inland aborigines in the desert areas were subject

to severe food shortages and prolonged droughts.[139] This of course made

emphasizing the most efficient use of whatever foods could be foraged

paramount. Estimates based on studies of aborigines in northern Australia

are that they processed roughly half of their plant foods, but that no

food was processed unnecessarily, any such preparation being done only to

make a food edible, more digestible, or more palatable.[140] In general

food was eaten as it was collected, according to its availability during

the seasons--except during times of feasts--with wastage being rare, such

a pattern being characteristic of feast-and-famine habitats. Some food,

however, was processed for storage and later retrieval (usually by

drying), including nuts and seeds, though some may also have been ground and

baked into cakes instead before being buried in the ground or stored in dry

caches.[141]

Fresh foods such as fruits, bulbs, nectar, gums, flowers, etc., were

eaten raw when collected. Examples of foods that were prepared before

consumption include the cooking of starchy tubers or seeds, grinding and

roasting of seeds, and cooking of meat.[142]

That these practices were necessary to expand the food supply and not

merely induced by frivolous cultural practices, as raw-foodists often

tend to theorize, can be seen in the fact that after colonization by

Europeans, aborigines were not above coming into missions during droughts

to get food.[143]

But the more interesting and more pressing question, to my mind, is

not whether we are adapted to cooking of certain foods, which seems very

likely,* but how much we have adapted to the dietary changes since the

Neolithic agricultural transition, given the 10,000 years or less it's

been underway. At present, the answer is unclear, although in general, we

can probably say there just hasn't been enough time for full adaptation

yet. Or if there has been, perhaps only for people descended from certain ancestral groups

with the longest involvement with agriculture. My guess (and it is just a

guess) would be that we are still mostly adapted to a Paleolithic diet,

but for any particular individual with a given ancestral background,

certain Neolithic foods such as grains, perhaps even modest amounts of

certain cultured milk products such as cheese or yogurt (ones more easily

digested than straight milk) for even fewer people, might be not only

tolerated, but helpful. Especially where people are avoiding flesh

products (flesh being our primary animal food adaptation), these animal

byproducts may be helpful,* which Stanley Bass's work with mice and his

mentor Dr. Gian-Cursio's work with Hygienic patients seem to show, as Dr.

Bass has discussed previously here in H&B (in the April and June 1994

issues).

How are we to determine an optimum diet for ourselves, then, given

that some genetic changes may be more or less complete in

different population groups?

I think what all of this points to is the need to be careful in

making absolute black-or-white pronouncements about invariant food rules

that apply equally to all. It is not as simple as saying that if we aren't

sure we are fully adapted to something, we should just eliminate it from the diet

to be safe. Because adaptation to a food does not necessarily mean just

tolerance for that food, it also means that if we are in fact adapted to

it, we would be expected to thrive better with some amount of that food in

our diet. Genetic adaptation cuts both ways.

This is why I believe it is important for people to experiment

individually. Today, because of the Neolithic transition and the rates at

which genetic changes are being discovered to take place, it is apparent

humanity is a species in evolutionary transition. Due to the unequal flow

and dissemination of genes through a population during times like these,

it is unlikely we will find uniform adaptation across the population, as

we probably would have during earlier times. This means it is going to be

more likely right now in this particular historical time period that

individuals will be somewhat different in their responses to diet. And as

we saw above (with the two genes ACE and apolipoprotein-B) these genetic

differences may even confound attempts to replicate epidemiological

dietary studies from one population to another unless these factors are

taken into account.*

So while it is important to look for convergences among different

lines of evidence (evolutionary studies, biochemical nutritional studies,

epidemiological studies and clinical trials, comparative anatomy from

primate studies, and so forth), it is well to consider how often the

epidemiological studies, perhaps even some of the biochemical studies,

reverse themselves or come back with conflicting data. It usually takes

many years--even decades--for their import to become clear based on the

lengthy scientific process of peer review and replication of experiments

for confirmation or refutation.

So my advice is: don't be afraid to experiment. Unless you have

specific allergies or strong food intolerances and whatnot, the body is

flexible enough by evolution to handle short-term variations in diet from

whatever an optimal diet might be anyway. If you start within the general

parameters we've outlined here and allow yourself to experiment, you have

a much better chance of finding the particular balance among these factors

that will work for you. If you already have something that works well for you,

that's great. If, however, you are looking for improvements, given the

uncertainties we've talked about above, it's important to look at any

rigid assumptions you may have about the "ideal" diet, and be willing to

challenge them through experimentation. In the long run, you only have

yourself to benefit by doing so.

Ward, despite the evolutionary picture you've presented here, there

are still objections that people have about meat from a biochemical or

epidemiological standpoint. What about T. Colin Campbell's China Study for

example?

Good point. Campbell's famous study, to my mind, brings up one of the

most unremarked-upon recent conflicts in epidemiological data that has

arisen. In his lecture at the 1991 ANHS annual conference, reported on in

the national ANHS publication Health Science, Campbell claimed that the

China Study data pointed to not just high fat intake, but to the protein

in animal food, as increasing cholesterol levels. (High cholesterol levels

in the blood are now widely thought to be the biggest single

factor responsible for increased rates of atherosclerosis--clogged blood

vessels--and coronary heart disease.) According to him, the lower the

level of animal protein in the diet (not just the lower the level of fat)

the lower the cholesterol level in the blood. He believes that animal food

is itself the biggest culprit, above and beyond just fat levels in

food.[144]

Yet as rigorous as the study is proclaimed to be, I have to tell you

that Campbell's claim that animal protein by itself is the biggest culprit

in raising blood cholesterol is contradicted by studies of modern-day

hunter-gatherers eating considerable amounts of wild game in their diet

who have very low cholesterol levels comparable to those of the China

study. One review of different tribes studied showed low cholesterol

levels for the Hadza of 110 mg/dl (eating 20% animal food), San Bushmen

120 (20-37% animal), Aborigines 139 (10-75% animal), and Pygmies at 106,

considerably lower than the now-recommended safe level of below 150.[145]

Clearly there are unaccounted-for factors at work here yet to be studied

sufficiently.

One of them might be the difference in composition between the levels

of fat in domesticated meat vs. wild game: on average five times as much

in the former as in the latter. On top of that, the proportion of

saturated fat in domesticated meat compared to wild game is also five

times higher.[146]

Other differences between these two meat sources are that significant

amounts of EPA (an omega-3 fatty acid thought to perhaps help prevent

atherosclerosis) are found in wild game (approx. 4% of total fat), while

domestic beef for example contains almost none.[147] This is important

because the higher levels of EPA and other omega-3 fatty acids in wild

game help promote a low overall dietary ratio of omega-6 vs. omega-3 fatty

acids for hunter-gatherers--ranging from 1:1 to 4:1--compared to the high

11:1 ratio observed in Western nations. Since omega-6 fatty acids may have

a cancer-promoting effect, some investigators are recommending lower

ratios of omega-6 to omega-3 in the diet which would, coincidentally, be

much closer to the evolutionary norm.[148]

Differences like these may go some way toward explaining the similar

blood cholesterol levels and low rates of disease in both the rural

Chinese eating a very-low-fat, low-animal-protein diet, and in

hunter-gatherers eating a low-fat, high-animal-protein diet. Rural Chinese

eat a diet of only 15% fat and 10% protein, with the result that saturated

fats only contribute a low 4% of total calories. On the other hand, those

hunter-gatherer groups approximating the Paleolithic norm eat diets

containing 20-25% fat and 30% protein, yet the contribution of saturated

fat to total caloric intake is nevertheless a similarly low 6% of total

calories.[149]
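
[A quick back-of-the-envelope sketch, in Python, of the arithmetic these
figures imply. The "roughly a quarter of dietary fat is saturated" fraction
used below is an assumption chosen only to be consistent with the percentages
quoted above; the point is to illustrate how a higher-fat, higher-protein
forager diet can still end up with a similarly low saturated-fat share of
calories when most of its fat comes from lean wild game.]

def saturated_calorie_share(fat_share, saturated_fraction_of_fat):
    """Fraction of total calories contributed by saturated fat."""
    return fat_share * saturated_fraction_of_fat

# Assumed inputs: ~27% of dietary fat saturated in both cases; 15% of
# calories from fat for rural Chinese vs. ~22.5% for hunter-gatherers.
chinese = saturated_calorie_share(fat_share=0.15, saturated_fraction_of_fat=0.27)
foragers = saturated_calorie_share(fat_share=0.225, saturated_fraction_of_fat=0.27)
print(f"rural Chinese:    {chinese:.0%} of calories from saturated fat")   # ~4%
print(f"hunter-gatherers: {foragers:.0%} of calories from saturated fat")  # ~6%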

What about the contention that high-protein diets promote calcium

loss in bone and therefore contribute to osteoporosis?

The picture here is

complex and modern studies have been contradictory. In experimental

settings, purified, isolated protein extracts do significantly increase

calcium excretion, but the effect of increased protein in natural foods

such as meat is smaller or nonexistent.[150] Studies of Eskimos have shown

high rates of osteoporosis among those eating an almost all-meat diet[151] (less than

10% plant intake[152]) but theirs is a recent historical aberration not

typical of the evolutionary Paleolithic diet thought to have averaged 65%

plant foods and 35% flesh.* Analyses of numerous skeletons from our

Paleolithic ancestors have shown development of high peak bone mass and

low rates of bone loss in elderly specimens compared to their Neolithic

agricultural successors whose rates of bone loss increased considerably

even though they ate much lower-protein diets.[153] Why, nobody knows for

sure, though it is thought that the levels of phosphorus in meat reduce

excretion of calcium, and people in Paleolithic times also ate large

amounts of fruits and vegetables[154] with an extremely high calcium

intake (perhaps 1,800 mg/day compared to an average of 500-800 for

Americans today[155]) and led extremely rigorous physical lives, all of

which would have encouraged increased bone mass.[156]

Okay, let's move on to the hunter-gatherers you mentioned earlier.

I've heard that while some tribes may have low rates of chronic

degenerative disease, others don't, and may also suffer higher rates of

infection than we do in the West.

This is true. Not all "hunter-gatherer"

tribes of modern times eat diets in line with Paleolithic norms. Aspects

of their diets and/or lifestyle can be harmful just as modern-day

industrial diets can be. When using these people as comparative models,

it's important to remember they are not carbon copies of Paleolithic-era

hunter-gatherers.[157] They can be suggestive (the best living examples we

have), but they are a mixed bag as "models" for behavior, and it is up to

us to keep our thinking caps on.

We've already mentioned the Eskimos above as less-than-exemplary

models. Another example is the Masai tribe of Africa who are really more

pastoralists (animal herders) than hunter-gatherers. They have low

cholesterol levels ranging from 115 to 145,[158] yet autopsies have shown

considerable atherosclerosis.[159] Why? Maybe because they deviate from

the Paleolithic norm of 20-25% fat intake due to their pastoralist

lifestyle by eating a 73% fat diet that includes large amounts of milk

from animals in addition to meat and blood.*[160] Our bodies do have

certain limits.

But after accounting for tribes like these, why do we see higher

rates of mortality from infectious disease among other hunter-gatherers

who are eating a better diet and show little incidence of degenerative

disease?

There are two major reasons I know of. First, most modern-day tribes

have been pushed onto marginal habitats by encroaching civilization.[161]

This means they may at times experience nutritional stress resulting from

seasonal fluctuations in the food supply (like the aborigines noted above)

during which relatively large amounts of weight are lost while they remain

active. The study of "paleopathology" (the study of illnesses in past

populations from signs left in the fossil record) shows that similar

nutritional stress experienced by some hunter-gatherers of the past was

not unknown either, and at times was great enough to have stunted their

growth, resulting in "growth arrest lines," marks that form in human bone

under conditions of nutritional deprivation. Such nutritional stress is

most likely for hunter-gatherers in environments where either the number

of food sources is low (exposing them to the risk of undependable supply),

or where food is abundant only seasonally.[162]

Going without food--or fasting while under conditions of total rest

as hygienists do as a regenerative/recuperative measure--is one thing, but

nutritional stress or deprivation while under continued physical stress is

unhealthy and leaves one more susceptible to pathologies including

infection.[163]

The second potential cause of higher rates of infection is the less

artificially controlled sanitary conditions (one of the areas where modern

civilization is conducive rather than destructive to health)--due to less

control over the environment by hunter-gatherers than by modern

civilizations. Creatures in the wild are in frequent contact with feces

and other breeding grounds for microorganisms such as rotting fruit and/or

carcasses, to which they are exposed by skin breaks and injuries, and so

forth.[164] Contrary to popular Hygienic myth, animals in the wild eating

natural diets in a natural environment are not disease-free, and large

infectious viral and bacterial plagues in the past and present among wild

animal populations are known to have occurred. (To cite one example,

rinderpest plagues in the African Serengeti occurred in the 1890s and

again around 1930, 1960, and 1982 among buffalo, kudu, eland, and

wildebeest.[165])

It becomes obvious when you look into studies of wild animals that

natural diet combined with living in natural conditions is no guarantee of

freedom from disease and/or infection. Chimpanzees, our closest living

animal relatives, for instance, can and do suffer bouts in the wild from a

spectrum of ailments very similar to those observed in human beings:

including pneumonia and other respiratory infections (which occur more

often during the cold and rainy season), polio, abscesses, rashes,

parasites, diarrhea, even hemorrhoids on occasion.[166] Signs of

infectious disease in the fossil record have also been detected in remains

as far back as the dinosaur age, as have signs of immune system mechanisms

to combat them.[167]

One of the conclusions to be drawn from this is that artificial

modern conditions are not all bad where health is concerned. Such

conditions as "sanitation" due to hygienic measures, shelter and

protection from harsh climatic extremes and physical trauma, professional

emergency care after potentially disabling or life-threatening accidents,

elimination of the stresses of nomadism, plus protection from seasonal

nutritional deprivation due to the modern food system that Westerners like

ourselves enjoy today all play larger roles in health and longevity than

we realize.[168]

Also, I would hope that the chimp examples above might persuade

hygienists not to feel so guilty or inevitably blame themselves when they

occasionally fall prey to acute illness. We read of examples in the

Natural Hygiene M2M which sometimes seem to elicit an almost palpable

sense of relief among others when the conspiracy of silence is broken and

they find they aren't the only ones. I think we should resist the tendency

to always assume we flubbed the dietary details. In my opinion it is a

mistake to believe that enervation need always be seen as simply the

instigator of "toxemia" which is then held to always be the incipient

cause of any illness. It seems to me you can easily have "enervation"

(lowered energy and resistance) without toxemia, and that that in and of

itself can be quite enough to upset the body's normal homeostasis

("health") and bring on illness. (Indeed I have personally become ill once

or twice during the rebuilding period after lengthy fasts when overworked,

a situation in which it would be difficult to blame toxemia as the cause.)

The examples of modern-day hunter-gatherers as well as those of chimps

should show us that you can eat a healthy natural diet and still suffer

from health problems, including infectious disease, due to excessive

stresses--what we would call "enervation" in Natural Hygiene.

Ward, we still have some space here to wrap up Part 2. Given the

research you've done, how has it changed your own diet and health

lifestyle? What are you doing these days, and why?

I would say my diet right now* is somewhere in the neighborhood of

about 85% plant and 15% animal, and overall about 60% raw and 40% cooked

by volume. A breakdown from a different angle would be that by volume it

is, very roughly, about 1/4 fruit, 1/4 starches (grains/potatoes, etc.),

1/4 veggies, and the remaining quarter divided between nuts/seeds and

animal products, with more of the latter than the former. Of the animal

foods, I would say at least half is flesh (mostly fish, but with

occasional fowl or relatively lean red meat thrown in, eaten about 3-5

meals per week), the rest composed of varying amounts of eggs, goat

cheese, and yogurt.

Although I have to admit I am unsure about the inclusion of dairy

products on an evolutionary basis given their late introduction in our

history, nevertheless, I do find that the more heavily I am exercising,

the more I find myself tending to eat them. To play it safe, what dairy I

do eat is low- or no-lactose cultured forms like goat cheese and yogurt.*

Where the grains are concerned, so far I do not experience the kind

of sustained energy I like to have for distance running without them, even

though I am running less mileage than I used to (20 miles/week now as

opposed to 35-40 a few years ago). The other starches such as potatoes,

squash, etc., alone just don't seem to provide the energy punch I need.

Again, however, I try to be judicious by eating non-gluten-containing

grains such as millet, quinoa, or rice, or else use sprouted forms of

grains, or breads made from them, that eliminate the gluten otherwise

present in wheat, barley, oats, and so forth.*

In general, while I do take the evolutionary picture heavily into

account, I also believe it is important to listen to our own bodies and

experiment, given the uncertainties that remain.

Also, I have to say that I find exercise, rest, and stress management

as important as diet in staying energetic, healthy, and avoiding acute

episodes of ill-health. Frankly, my experience is that once you reach a

certain reasonable level of health improvement based on your dietary

disciplines, and things start to level out--but maybe you still aren't

where you want to be--most further gains are going to come from paying

attention to these other factors, especially today when so many of us are

overworked, over-busy, and stressed-out. I think too many people focus too

exclusively on diet and then wonder why they aren't getting any further

improvements.

Diet only gets you so far. I usually sleep about 8-10 hours a night,

and I very much enjoy vigorous exercise, which I find is necessary to help

control my blood-sugar levels, which are still a weak spot for me. The

optimum amount is important, though. A few years ago I was running every

day, totaling 35-40 miles/week and concentrating on hard training for

age-group competition, and more prone to respiratory problems like colds,

etc. (not an infrequent complaint of runners). In the last couple of

years, I've cut back to every-other-day running totaling roughly 20 miles

per week. I still exercise fairly hard, but a bit less intensely than

before, I give myself a day of rest in between, and the frequency of colds

and so forth is now much lower.

I am sure people will be curious here, Ward: What were some of the

improvements you noticed after adding flesh foods to your diet?

Well, although I expected it might take several months to really

notice much of anything, one of the first things was that within about 2

to 3 weeks I noticed better recovery after exercise--as a distance runner

I was able to run my hard workouts more frequently with fewer rest days or

easy workouts in between. I also began sleeping better fairly early on,

was not hungry all the time anymore, and maintained weight more easily on

lesser volumes of food. Over time, my stools became a bit more well

formed, my sex drive increased somewhat (usually accompanies better energy

levels for me), my nervous system was more stable and not so prone to

hyperreactive panic-attack-like instability as before, and in general I

found I didn't feel so puny or wilt under stress so easily as before.

Unexpectedly, I also began to notice that my moods had improved and I was

more "buoyant." Individually, none of these changes was dramatic, but as a

cumulative whole they have made the difference for me. Most of these

changes had leveled off after about 4-6 months, I would say.

Something else I ought to mention here, too, was the effect of this

dietary change on a visual disturbance I had been having for some years

prior to the time I embarked on a disciplined Hygienic program, and which

continued unchanged during the two or three years I was on the traditional

vegetarian diet of either all-raw or 80% raw/20% cooked. During that time

I had been having regular episodes of "spots" in my visual field every

week or so, where "snow" (like on a t.v. set) would gradually build up to

the point it would almost completely obscure my vision in one eye or the

other for a period of about 5 minutes, then gradually fade away after

another 5 minutes. As soon as I began including flesh in my diet several

times per week, these started decreasing in frequency and over the 3 years

since have almost completely disappeared.

What problems are you still working on?

I still have an ongoing tussle with sugar-sensitivity due to the huge

amounts of soft drinks I used to consume, and have to eat fruits

conservatively. I also notice that I still do not hold up under stress and

the occasional long hours of work as well as
