CBR1100XX.org Forum

Posts posted by spEEdfrEEk

  1. All good ideas.

I can set the idle higher and prevent it from stalling, but it takes an abnormally high idle to do so.

    Levers are stock and adjusted pretty well.

I'll bleed the clutch and check that first. Then

    try the chemtool (never used it before) and see

    if that helps. If so, I'll prolly overhaul the carbs.

    I hope it's not a bearing issue. That would mean

    more work than I have the time for right now..

    TJ

  2. year of bike?

    list of mods?

    '97.

    Relevant mods: DynoJet Stage 1, K&N filter, Jardine RT1 slip-ons.

    Thing has been in perfect tune for years. (did the dyno runs myself at a local dealership)

    This quirk is fairly recent, like within the last month or so. I'm just hoping it's not gearbox or counterbalancer related (or anything cryptic like that).

    I don't see how it could be valve clearance, and it's definitely not a CCT issue. (no rattles or clacking, etc.)

    TJ

  3. I'm getting a slight "lugging" of the engine in neutral.

    Doesn't happen as long as the clutch is in. If I pull

    the clutch in any gear (including neutral) everything

    is hunky dory at idle.

    However, if I'm in neutral and I don't squeeze the

    clutch, the motor will start to lug a bit. It will even

    stall if the engine is cold.

    Thoughts?

    TJ

  4. Just found this on another board. I don't think it's for me but holy crap this guy is an animal. Check out some of his demo vids.

    http://www.rosstraining.com/

    Yikes!

    He has the worst form, ever, on the one-arm military press with dumbbell.

    You're making a big mistake if you let your spine get into that sort of position. Your spine should always be in lordosis if you intend to hoist anything over your head (IMO)

    :icon_cool: TJ :icon_cool:

  5. Anyone tried this workout?

    http://www.ast-ss.com/max-ot/max-ot_intro.asp

    The principles sound good though I'm sure some things work for some and not for others. I was considering giving it a try but I'm not sure just yet.

    BTW you have to sign up to see the workout but I think you can read some of the info without signing up. The sign-up process is very easy and free and I haven't gotten any spam as a result so you don't have to worry about that.

    I can't tell by looking just at that page. I'd actually want to see the recommendations.

    Bear in mind, a lot of these questions are already answered by the sports medicine community.

    As I've mentioned before, most success comes from training holistically and with periodization. (emphasis on recovery and nutrition as well)

    :icon_cool: TJ :icon_cool:

  6. Stay away from grains in general. There's no particular "safe" grain or

    even "safer" grain.

    Grains are the most allergenic substances (barring dairy) that you can

    actually consume.

    They wreak havoc on your immune system (in subtle ways) and have been implicated as a cause of several degenerative diseases: arthritis, diabetes, etc.

    Wheat is one of the only monoploid things on the planet -- how's that for unnatural?

    The only reason wheat, corn, and soy are so popular today is not because

    they're healthy -- it's because they are cash cows that can be grown almost

    anywhere in the world and turned into any type of processed food product.

    :icon_cool: TJ :icon_cool:

  7. There are so many mixed opinions out there on this stuff. Anyone have any knowledge of its health benefits? Anyone use this?

    Coconut oil is one of the best oils you can possibly cook with.

    It maintains its saturated state even when raised to high temperatures (like frying) without denaturing.

    It is MUCH safer to cook with than the unsaturated (often hydrogenated) oils like canola and safflower.

    The medium chain triglycerides that are in coconut oil are super metabolic --

    they give you almost as much energy (as easily) as glucose.

    That's why I drink 1/4 cup of pure coconut milk every day 2 hours before I

    hit the gym to lift.

    The benefits of coconut oil are many..

    :icon_cool: TJ :icon_cool:

  8. This is a protein-only diet with no fat. (Stillman's)

    That's actually a dangerous signal to send your body.

    The last thing you want it to do is to adapt to metabolizing proteins

    (amino acids).

    1) it's hard on the kidneys

    2) it leads to catabolism and lower resting metabolic rate

    You should eat a diet higher in fat if you want your body to get

    better at metabolizing fats. (train it to do what you want it to do)

    Do a web search for something called "The Fat Fast".

    That's the closest thing you're going to find to something that "jumpstarts"

    fat metabolism.

    :icon_cool: TJ :icon_cool:

  9. Where would one get "natural" ephedra from?

    Best sources are from roots such as Ma-Huang.

    Twin Lab makes supplements that have the correct proportion of Ephedra & Caffeine (from Ma-Huang and Guarana). You can add a baby aspirin to that..

    By the way, you won't "gain back the weight" when you stop taking the stack.

    The stack doesn't "make you lose weight". It repairs your own natural thermogenesis (or thermogenic response).

    The best way to maintain it, after stopping the stack, is to always expose yourself to the cold whenever you can.

    This includes skipping the heater and big warm jackets in the winter time.

    I do my best to go without warm clothing in the winter -- unless the temp drops below freezing.

    :icon_cool: TJ :icon_cool:

  10. ...And down almost 14lbs. (my weight seems to fluctuate a half-pound from day to day)

    Don't weigh yourself every day -- the results are misleading.

    Weigh yourself once a week at exactly the same time in the morning (before

    eating breakfast).

    That will give you the most consistent results.

    I don't use a scale much anymore, but when I did, it was always at 8am Saturday morning. (for instance)

    :icon_cool: TJ :icon_cool:

  11. I'm actually up about 10 lbs..

    Why?

    Trying to put on a bit more mass. I lost a ton of weight

    (and muscle) trying to get back into motocross racing.

    Went from a lean 200 down to 183.

    Big mistake.

    Lost quite a bit of strength and it still left me too heavy

    to compete with the 150 lb. guys in the 250 cc vet

    class.

    Oh well..

    So now I'm full steam ahead on a growth program. I plan

    to take it up to about 215 and cut to 210.

    The heaviest I've ever been was 217 when I tried a stint in full-on bodybuilding (normally I lift for strength -- not size).

    Bad idea -- even though I was lean, my blood pressure went way up.

    Yes, it's true -- muscle and fat weight put similar stresses on

    the cardiovascular system.

    So now, it's back to my usual -- strength as a focus, but

    keep it under 210 lbs. total.

    :icon_cool: TJ :icon_cool:

  12. Apparently she makes her own "stacker"..... every day she takes a baby aspirin, a 200mg caffeine pill, and a Primatene tablet...... (yes, that's a pill for asthma)

    Won't be very effective. The ephedra that actually makes the ECA stack work is

    not the same as the synthetic ephedrine that's in asthma stuff..

    Tell her to get some natural ephedra -- or even try some yohimbine, which has a

    better thermogenic effect on women.

    :icon_cool: TJ :icon_cool:

  13. Not sure if I want to go MWF and then try to do cardio on T/Th or just go 4 or 5 days of strength (and do cardio whenever) so that I can squeeze more stuff in.

    If you're trying to get lean while getting stronger then put the cardio (short 10-15 min

    sessions) after your lifting.

    Why?

    1) That cardio will help deplete the last of your glucose/glycogen stores and you'll run mostly on free fatty acids (or ketones) afterwards

    2) Taking the day _completely_ off from activity in between lifting days will help you recover

    faster and make more progress gaining strength.

    :icon_cool: TJ :icon_cool:

  14. Atkins also says no alcohol but I stuck by my Bacardi Diets (zero carbs) and the pounds still came off. I found it to be a very effective appetite suppressant. :icon_biggrin:

    One of the _original_ low-carb diets was done by an individual who had a strong propensity for hard liquor.

    I can't remember the guy's name, but I'll try to dig up the article. I think it dated from the 1860s or so..

    :icon_cool: TJ :icon_cool:

  15. Do you have any information about milk?

    Yep, it's pretty much as bad as you're thinking it is.

    I FINALLY made a commitment to cut out the gallon of skim milk I drink every day.

    Wise choice.

    I feel like it's the root of some of my respiratory problems.

    There's good research out there that implies dairy can cause mucus to form in the lungs, etc.

    Some believe that's what caused Flo-Jo's early demise.

    I decided to replace the protein I lost from the milk elimination by supplementing egg protein (from powder)

    Egg protein is absolutely the best. It has the highest bio-availability,

    and that's what we need when we try to recover from sessions at the gym.

    What is your recommendation for calcium supplementation???

    There are companies now selling calcium mixes in containers similar to

    the protein powders.

    It's mostly marrow, which is the BEST way to get calcium. It's

    how paleo-folks did it.

    What's most bio-available?

    I thought about Sardines...

    You got it.. Sardines. Anything with real bone is the best. The

    marrow powder is good when you get tired of the little fish :lol:

    Sorry so slow bud, I've been busier than I can even comment on.

    8) TJ 8)

  16. CATEGORY: diets/vegetarian

    TECHNICAL: **

    SUMMARY:

    This is part two of one of the most profound documents

    I had ever read at the time I found it. It is written by Ward

    Nicholson who has done a tremendous amount of research into

    human diets based on evolution. As I said in the first part,

    Nicholson initially practiced a type of diet known as the

    "hygienic" diet -- which is a strict vegetarian/vegan diet in

    which everything is consumed raw and unprocessed. I'm quite

    sure that you too will get as much out of this document as I

    did. And, after you've had a chance to read through it (and the

    remaining parts), I bet you too will find his argument pretty

    convincing. And, of course, that argument is that human beings

    could never have evolved the way we did if we had been

    vegetarians/vegans or fruitarians.

    Possibly the most profound statement, and one that I've repeated, is that most people have forgotten that modern drugs only mask the symptoms of an illness. A real cure can be sought by reverting to the kind of natural diet humans evolved on.

    The discussion on the use of fire for cooking is pretty interesting

    too.

    -------------------------------------------------------------

    Part 2 of our Visit with Ward Nicholson

    Fire And Cooking In Human Evolution,

    Rates Of Genetic Adaptation To Change,

    Hunter-Gatherers, And Diseases In The Wild

    Health & Beyond: Ward, in Part 1 of our interview, you discussed the

    extensive evidence showing that primitive human beings as well as almost

    all of the primates today have included animal foods such as flesh or

    insects in their diets. Why haven't Natural Hygienists and other

    vegetarians looked into all this information?

    Ward Nicholson: My guess is that: (1) Most aren't aware that

    paleoanthropologists have by now assembled a considerable amount of data

    about our evolutionary past related to diet. But more importantly, I think

    it has to do with psychological barriers, such as: (2) Many Hygienists

    assume they don't have to look because the subjective "animal model" for

    raw-food naturalism makes it "obvious" what our natural diet is, and

    therefore the paleontologists' evidence must therefore be in error, or

    biased by present cultural eating practices. Or: (3) They don't want to

    look, perhaps because they're afraid of what they might see.

    I think in spite of what most Natural Hygienists will tell you, they

    are really more wedded to certain specific details of the Hygienic system

    that remain prevalent (i.e., raw-food vegetarianism, food-combining, etc.)

    than they are truly concerned with whether those details follow logically

    from underlying Hygienic principles. The basic principle of Natural

    Hygiene is that the body is a self-maintaining, self-regulating,

    self-repairing organism that naturally maintains its own health when it is

    given food and other living conditions appropriate to its natural

    biological adaptation.

    In and of itself, this does not tell you what foods to eat. That has

    to be determined by a review of the best evidence we have available. So

    while the principles of Hygiene as a logical system do not change, our

    knowledge of the appropriate details that follow from those principles may

    and probably will change from time to time--since science is a process of

    systematically elucidating more "known" information from what used to be

    unknown. Thus the accuracy of our knowledge is to some extent time-based,

    dependent on the accumulation of evidence to provide a more inclusive view

    of "truth" which unfortunately is probably never absolute, but--as far as

    human beings are concerned--relative to the state of our knowledge.

    Science simply tries to bridge the knowledge gap. And a hallmark of

    closing the knowledge gap through scientific discovery is openness to

    change and refinements based on the accumulation of evidence.

    Open-mindedness is really openness to change. Just memorizing details

    doesn't mean much in and of itself. It's how that information is

    organized, or seen, or interpreted, or related to, that means something.

    What's interesting to me is that the evolutionary diet is not so

    starkly different from the Hygienic diet. Much of it validates important

    elements of the Hygienic view. It is very similar in terms of getting

    plenty of fresh fruits and veggies, some nuts and seeds, and so forth,

    except for the addition of the smaller role of flesh and other amounts of

    animal food (at least compared to the much larger role of plant foods) in

    the diet. That's the one exception. We have actually done fairly well in

    approximating humanity's "natural" or "original" diet, except we have been

    in error about this particular item, and gotten exceedingly fundamentalist

    about it when there is nothing in the body of Hygienic principles

    themselves that would outlaw meat if it's in our evolutionary adaptation.

    But for some reason, even though Natural Hygiene is not based on any

    "ethical" basis for vegetarianism (officially at least), this particular

    item seems to completely freak most Hygienists out. Somehow we have made a

    religion out of dietary details that have been the hand-me-downs of past

    Hygienists working with limited scientific information. They did the best

    they could given the knowledge they had available to them then, and we

    should be grateful for their hard work. But today the rank and file of

    Natural Hygiene has largely forgotten Herbert Shelton's rallying cry, "Let

    us have the truth, though the heavens fall."

    Natural Hygiene was alive and vital in Shelton's time because he was

    actively keeping abreast of scientific knowledge and aware of the need to

    modify his previous views if scientific advances showed them to be

    inadequate. But since Shelton retired from the scene, many people in the

    mainstream of Hygiene have begun to let their ideas stagnate and become

    fossilized. The rest of the dietary world is beginning to pass us by in

    terms of scientific knowledge.

    As I see it, there remain only two things Natural Hygiene grasps that

    the rest of the more progressive camps in the dietary world still don't:

    (1) An understanding of the fundamental health principle that outside

    measures (drugs, surgery, etc.) never truly "cure" degenerative health

    problems. In spite of the grandiose hopes and claims that they do, and the

    aura of research breakthroughs, their function is really to serve as

    crutches, which can of course be helpful and may truly be needed in some

    circumstances. But the only true healing is from within by a body that has

    a large capacity, within certain limits, to heal and regenerate itself

    when given all of its essential biological requirements--and nothing more

    or less which would hamper its homeostatic functioning. The body's

    regenerative (homeostatic) abilities are still commonly unrecognized today

    (often classed as "unexplained recoveries" or--in people fortunate enough

    to recover from cancer--as "spontaneous remission") because the population

    at large is so far from eating anything even approaching a natural diet

    that would allow their bodies to return to some kind of normal health,

    that it is just not seen very often outside limited pockets of people

    seriously interested in approximating our natural diet. And the other

    thing is: (2) Hygienists are also keenly aware of the power of fasting to

    help provide ideal conditions under which such self-healing can occur.

    But the newer branch of science called "Darwinian medicine" is slowly

    beginning (albeit with certain missteps) to grasp the principle of self

    healing, or probably more correctly, at least the understanding that

    degenerative diseases arise as a result of behavior departing from what

    our evolutionary past has adapted us to. They see the negative side of how

    departing from our natural diet and environment can result in degenerative

    disease, but they do not understand that the reverse--regenerating health

    by returning to our pristine diet and lifestyle, without drugs or other

    "crutches"--is also possible, again, within certain limits, but those

    limits are less than most people believe.

    In some ways, though, Hygiene now resembles a religion as much as it

    does science, because people seem to want "eternal" truths they can grab

    onto with absolute certainty. Unfortunately, however, knowledge does not

    work that way. Truth may not change, but our knowledge of it certainly

    does as our awareness of it shifts or expands. Once again: The principles

    of Hygiene may not change, but the details will always be subject to

    refinement.

    Speaking of such details subject to refinement, I know you've been

    sitting on some very suggestive evidence to add further fuel to the

    fire-and-cooking debate now raging between the raw-foodist and

    "conservative-cooking" camps within Hygiene. Please bring us up-to-date on

    what the evolutionary picture has to say about this.

    I'd be happy to. But

    before we get into the evolutionary viewpoint, I want to back up a bit

    first and briefly discuss the strange situation in the Hygienic community

    occurring right now over the raw foods vs. cooking-of-some-starch-foods

    debate. The thing that fascinates me about this whole brouhaha is the way

    the two sides justify their positions, each of which has a strong point,

    but also a telling blind spot.

    Now since most Natural Hygienists don't have any clear picture of the

    evolutionary past based on science for what behavior is natural, the

    "naturalistic" model used by many Hygienists to argue for eating all foods

    raw does so on a subjective basis--i.e., what I have called "the animal

    model for raw-food naturalism." The idea being that we are too blinded

    culturally by modern food practices involving cooking, and to be more

    objective we should look at the other animals--none of whom cook their

    food--so neither should we. Now it's true the "subjective raw-food

    naturalists" are being philosophically consistent here, but their blind

    spot is they don't have any good scientific evidence from humanity's

    primitive past to back up their claim that total raw-foodism is the most

    natural behavior for us--that is, using the functional definition based on

    evolutionary adaptation I have proposed if we are going to be rigorous and

    scientific about this.

    Now on the other hand, with the doctors it's just the opposite story.

    In recent years, the Natural Hygiene doctors and the ANHS (American

    Natural Hygiene Society) have been more and more vocal about what they say

    is the need for a modest amount of cooked items in the diet--usually

    starches such as potatoes, squashes, legumes, and/or grains. And their

    argument is based on the doctors' experience that few people they care for

    do as well on raw foods alone as they do with the supplemental addition of

    these cooked items. Also, they argue that there are other practical

    reasons for eating these foods, such as that they broaden the diet

    nutritionally, even if one grants that some of those nutrients may be

    degraded to a degree by cooking. (Though they also say the assimilation of

    some nutrients is improved by cooking.) They also point out these

    starchier foods allow for adequate calories to be eaten while avoiding the

    higher levels of fat that would be necessary to obtain those calories if

    extra nuts and avocadoes and so forth were eaten to get them.

    So we have those with wider practical experience arguing for the

    inclusion of certain cooked foods based on pragmatism. But their blind

    spot is in ignoring or attempting to finesse the inconsistency their

    stance creates with the naturalist philosophy that is the very root of

    Hygienic thinking. And again, the total-raw-foodists engage in just the

    opposite tactics: being philosophically consistent in arguing for all-raw

    foods, but being out of touch with the results most other people in the

    real world besides themselves get on a total raw-food diet, and attempting

    to finesse that particular inconsistency by nit-picking and fault-finding

    other implementations of the raw-food regime than their own. (I might

    interject here, though we'll cover this in more depth later, that although

    it's not true for everyone, the experience of most people in the Natural

    Hygiene M2M supports the view that the majority do in fact do better when

    they add some cooked foods to their diet.)

    Now my tack as both a realist and someone who is also interested in

    being philosophically consistent has been: If it is true that most people*

    do better with the inclusion of some of these cooked items in their diet

    that we've mentioned--and I believe it is, based on everything I have seen

    and heard--then there must be some sort of clue in our evolutionary past

    why this would be so, and which would show why it might be natural for us.

    The question is not simply whether fire and cooking are "natural" by

    some subjective definition. It's whether they have been used long enough

    and consistently enough by humans during evolutionary time for our bodies

    to have adapted genetically to the effects their use in preparing foods

    may have on us. Again, this is the definition for "natural" that you have

    to adopt if you want a functional justification that defines "natural"

    based on scientific validation rather than subjectivity.

    So the next question is obvious: How long have fire and cooking been

    around, then, and how do we know whether that length of time has been long

    enough for us to have adapted sufficiently?

    Let's take the question one

    part at a time. The short answer to the first part of the question is that

    fire was first controlled by humans anywhere from about 230,000 years ago

    to 1.4 or 1.5 million years ago, depending on which evidence you accept as

    definitive.

    The earliest evidence for control of fire by humans, in the form of

    fires at Swartkrans, South Africa and at Chesowanja, in Kenya, suggests

    that it may possibly have been in use there as early as about 1.4 or 1.5

    million years ago.[100] However, the interpretation of the physical

    evidence at these early sites has been under question in the

    archaeological community for some years now, with critics saying these

    fires could have been wildfires instead of human-made fires. They suggest

    the evidence for human control of fire might be a misreading of other

    factors, such as magnesium-staining of soils, which can mimic the results

    of fire if not specifically accounted for. For indisputable evidence of

    fire intentionally set and controlled by humans, the presence of a hearth

    or circle of scorched stones is often demanded as conclusive proof,[101]

    and at these early sites, the evidence tying the fires to human control is

    based on other factors.

    At the other end of the timescale, these same critics who are only

    willing to consider the most unequivocal evidence will still admit that at

    least by 230,000 years ago[102] there is enough good evidence at at least

    one site to establish fire was under control at this time by humans. At

    this site, called Terra Amata, an ancient beach location on the French

    Riviera, stone hearths are found at the center of what may have been huts;

    and more recent sources may put the site's age at possibly 300,000 years

    old rather than 230,000.[103]

    Somewhat further back--from around 300,000 to 500,000 years ago--more

    evidence has been accumulating recently at sites in Spain and France[104]

    that looks as if it may force the ultraconservative paleontologists to

    concede their 230,000-year-ago date is too stingy, but we'll see.

    And then there is Zhoukoudian cave in China, one of the most famous

    sites connected with Homo erectus, where claims that fire may have been

    used as early as 500,000 to 1.5 million years ago have now largely been

    discredited due to the complex and overlapping nature of the evidence left

    by not just humans, but hyenas and owls who also inhabited the cave. (Owl

    droppings could conceivably have caught fire and caused many of the

    fires.) Even after discounting the most extreme claims, however, it does

    seem likely that at least by 230,000 to 460,000 years ago humans were

    using fire in the cave[105], and given scorching patterns around the teeth

    and skulls of some animal remains, it does appear the hominids may have

    done this to cook the brains (not an uncommon practice among

    hunting-gathering peoples today).[106]

    The most recent excavation with evidence for early use of fire has

    been within just the last couple of years in France at the Menez-Dregan

    site, where a hearth and evidence of fire has been preliminarily dated to

    approximately 380,000 to 465,000 years. If early interpretations of the

    evidence withstand criticism and further analysis, the fact that a hearth

    composed of stone blocks inside a small cave was found with burnt

    rhinoceros bones close by has provoked speculation that the rhino may have

    been cooked at the site.[107]

    Now of course, the crucial question for us isn't just when the

    earliest control of fire was, it's at what date fire was being used

    consistently--and more specifically for cooking, so that more-constant

    genetic selection pressures would have been brought to bear. Given the

    evidence available at this time, most of it would probably indicate that

    125,000 years ago is the earliest reasonable estimate for widespread

    control.*[108] Another good reason it may be safer to base adaptation to

    fire and cooking on the figure of 125,000 years ago is that more and more

    evidence is indicating modern humans today are descended from a group of

    ancestors who were living in Africa 100,000-200,000 years ago, who then

    spread out across the globe to replace other human groups.[109] If true,

    this would probably mean the fire sites in Europe and China are those of

    separate human groups who did not leave descendants that survived to the

    present. Given that the African fire sites in Kenya and South Africa from

    about 1.5 million years ago are under dispute, then, widespread usage at

    125,000 years seems the safest figure for our use here.


    One thing we can say about the widespread use of fire probable by

    125,000 years ago, however, is that it would almost certainly have

    included the use of fire for cooking.* Why can this be assumed? It has to

    do with the sequence for the progressive stages of control over fire that

    would have had to have taken place prior to fire usage becoming

    commonplace. And the most interesting of these is that fire for cooking

    would almost inevitably have been one of the first uses it was put to by

    humans, rather than some later-stage use.*

    The first fires on earth occurred approximately 350 million years

    ago--the geological evidence for fire in remains of forest vegetation

    being as old as the forests themselves.[110] It is usual to focus only on

    fire's immediately destructive effects to plants and wildlife, but there

    are also benefits. In response to occasional periodic wildfires, for

    example, certain plants and trees known as "pyrophytes" have evolved, for whose existence such fires are essential. Fire revitalizes them by

    destroying their parasites and competitors, and such plants include

    grasses eaten by herbivores as well as trees that provide shelter and food

    for animals.[111]

    Fires also provide other unintended benefits to animals as well. Even

    at the time a wildfire is still burning, birds of prey (such as falcons

    and kites)--the first types of predators to appear at fires--are attracted

    to the flames to hunt fleeing animals and insects. Later, land-animal

    predators appear when the ashes are smouldering and dying out to pick out

    the burnt victims for consumption. Others, such as deer and bovine animals,

    appear after that to lick the ashes for their salt content. Notable as

    well is that most mammals appear to enjoy the heat radiated at night at

    sites of recently burned-out fires.[112]

    It would have been inconceivable, therefore, that human beings, being

    similarly observant and opportunistic creatures, would not also have

    partaken of the dietary windfall provided by wildfires they came across.

    And thus, even before humans had learned to control fire purposefully--and

    without here getting into the later stages of control over fire--their

    early passive exposures to it would have already introduced them, like the

    other animals, to the role fire could play in obtaining edible food and

    providing warmth.

    So if fire has been used on a widespread basis for cooking since

    roughly 125,000 years ago, how do we know if that has been enough time for

    us to have fully adapted to it? To answer that, we have to be able to

    determine the rate at which the genetic changes constituting evolutionary

    adaptation take place in organisms as a result of environmental or

    behavioral change--which in this case means changes in food intake.

    The two sources for estimates of rates at which genetic change takes

    place are from students of the fossil record and from population

    geneticists. Where the fossil record is concerned, Niles Eldredge, along

    with Stephen Jay Gould, two of the most well-known modern evolutionary

    theorists, estimated the time span required for "speciation events" (the

    time required for a new species to arise in response to evolutionary

    selection pressures) to be somewhere within the range of "five to 50,000

    years."[113] Since this rough figure is based on the fossil record, it

    makes it difficult to be much more precise than that range. Eldredge also

    comments that "some evolutionary geneticists have said that the estimate

    of five to 50,000 years is, if anything, overly generous."[114] Also

    remember that this time span is for changes large enough to result in a

    new species classification. Since we are talking here about changes

    (digestive changes) that may or may not be large enough to result in a new

    species (though changes in diet often are in fact behind the origin of new

    species), it's difficult to say from this particular estimate whether we

    may be talking about a somewhat shorter or longer time span than that for

    adaptation to changes in food.

    Fortunately, however, the estimates from the population geneticists

    are more precise. There are even mathematical equations to quantify the

    rates at which genetic change takes place in a population, given

    evolutionary "selection pressures" of a given magnitude that favor

    survival of those individuals with a certain genetic trait.[115] The

    difficulty lies in how accurately one can numerically quantify the

    intensity of real-world selection pressures. However, it turns out there

    have been two or three actual examples where it has been possible to do so

    at least approximately, and they are interesting enough I'll mention a

    couple of them briefly here so people can get a feel for the situation.

    The most interesting of these examples relates directly to our

    discussion here, and has to do with the gene for lactose tolerance in

    adults. Babies are born with the capacity to digest lactose via production

    of the digestive enzyme lactase. Otherwise they wouldn't be able to make

    use of mother's milk, which contains the milk sugar lactose. But sometime

    after weaning, this capacity is normally lost, and there is a gene that is

    responsible. Most adults--roughly 70% of the world's population

    overall--do not retain the ability to digest lactose into adulthood[116]

    and this outcome is known as "lactose intolerance." (Actually this is

    something of a misnomer, since adult lactose intolerance would have been

    the baseline normal condition for virtually everyone in the human race up

    until Neolithic (agricultural) times.[117]) If these people attempt to

    drink milk, then the result may be bloating, gas, intestinal distress,

    diarrhea, etc.[118]

    However--and this is where it gets interesting--those population

    groups that do retain the ability to produce lactase and digest milk into

    adulthood are those descended from the very people who first began

    domesticating animals for milking during the Neolithic period several

    thousand years ago.[119] (The earliest milking populations in Europe,

    Asia, and Africa began the practice probably around 4,000 B.C.[120]) And

    even more interestingly, in population groups where cultural changes have

    created "selection pressure" for adapting to certain behavior--such as

    drinking milk in this case--the rate of genetic adaptation to such changes

    significantly increases. In this case, the time span for widespread

    prevalence of the gene for lactose tolerance within milking population

    groups has been estimated at approximately 1,150 years[121]--a very short

    span of time in evolutionary terms.
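    [Aside on the math here: the spread of lactase persistence can be
    illustrated with the standard one-locus haploid selection model,
    p' = p(1+s)/(1+sp), where p is the allele frequency and s is the relative
    fitness advantage of carriers. Below is a minimal Python sketch. The 5%
    starting frequency, the 25-year generation time, and the 10% advantage
    are assumptions chosen for illustration, not figures from the interview;
    the point is only that a modest, culturally driven advantage sustained
    over the ~46 generations in 1,150 years takes a rare allele to majority
    frequency.]

        # One-locus haploid selection model: p' = p*(1+s) / (1 + s*p)
        def allele_frequency_after(p0, s, generations):
            """Iterate the selection recurrence for a number of generations.

            p0          -- starting allele frequency (assumed: 0.05)
            s           -- relative fitness advantage of carriers (assumed: 0.10)
            generations -- number of generations to simulate
            """
            p = p0
            for _ in range(generations):
                # Carriers reproduce at relative rate (1 + s);
                # the population's mean fitness is (1 + s*p).
                p = p * (1 + s) / (1 + s * p)
            return p

        generations = 1150 // 25   # ~1,150 years at an assumed 25 years/generation = 46
        p_final = allele_frequency_after(p0=0.05, s=0.10, generations=generations)
        print(f"allele frequency after {generations} generations: {p_final:.2f}")
        # Prints ~0.81 -- from rare (5%) to a large majority inside ~1,150
        # years, consistent with the rapid spread described above.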

    There is a very close correlation between the 30% of the world's

    population who are tolerant to lactose and the earliest human groups who

    began milking animals. These individuals are represented most among

    modern-day Mediterranean, East African, and Northern European groups, and

    emigrants from these groups to other countries. Only about 20% of white

    Americans in general are lactose intolerant, but among sub-groups the

    rates are higher: 90-100% among Asian-Americans (as well as Asians

    worldwide), 75% of African-Americans (most of whom came from West Africa),

    and 80% of Native Americans. 50% of Hispanics worldwide are lactose

    intolerant.[122]

    Now whether it is still completely healthy for the 30% of the world's

    population who are lactose tolerant to be drinking animal's milk--which is

    a very recent food in our evolutionary history--I can't say. It may well

    be there are other factors involved in successfully digesting and making

    use of milk without health side-effects other than the ability to produce

    lactase--I haven't looked into that particular question yet. But for our

    purposes here, the example does powerfully illustrate that genetic

    adaptations for digestive changes can take place with much more rapidity

    than was perhaps previously thought.*

    Another interesting example of the spread of genetic adaptations

    since the Neolithic involves two specific genes whose prevalence has been

    found to correlate with the amount of time populations in different

    geographical regions have been eating the grain-based high-carbohydrate

    diets common since the transition from hunting and gathering to Neolithic

    agriculture began 10,000 years ago. (These two genes are the gene for

    angiotensin-converting enzyme--or ACE--and the one for apolipoprotein B,

    which, if the proper forms are not present, may increase one's chances of

    getting cardiovascular disease.)[123]

    In the Middle East and Europe, rates of these two genes are highest

    in populations (such as Greece, Italy, and France) closer to the Middle

    Eastern "fertile crescent" where agriculture in this part of the globe

    started, and lowest in areas furthest away, where the migrations of early

    Neolithic farmers with their grain-based diets took longest to reach

    (i.e., Northern Ireland, Scotland, Finland, Siberia). Closely correlating

    with both the occurrence of these genes and the historical rate of grain

    consumption are corresponding rates of deaths due to coronary heart

    disease. Those in Mediterranean countries who have been eating

    high-carbohydrate grain-based diets the longest (for example since

    approximately 6,000 B.C. in France and Italy) have the lowest rates of

    heart disease, while those in areas where dietary changes due to

    agriculture were last to take hold, such as Finland (perhaps only since

    2,000 B.C.), have the highest rates of death due to heart attack.

    Statistics on breast cancer rates in Europe also are higher for countries

    who have been practicing agriculture the least amount of time.[124]

    Whether grain-based diets eaten by people whose ancestors only began eating them recently (and who therefore lack the appropriate gene) are actually causing these health problems (and not simply correlated by coincidence) is at this point a hypothesis under study. (One study with chickens,

    however--who in their natural environment eat little grain--has shown much

    less atherosclerosis on a high-fat, high-protein diet than on a low-fat,

    high-carbohydrate diet.[125]) But again, and importantly, the key point

    here is that genetic changes in response to diet can be more rapid than

    perhaps once thought. The difference in time since the advent of Neolithic

    agriculture between countries with the highest and lowest incidences of

    these two genes is something on the order of 3,000-5,000 years,[126]

    showing again that genetic changes due to cultural selection pressures for

    diet can force more rapid changes than might occur otherwise.

    Now we should also look at the other end of the time scale for some

    perspective. The Cavalli-Sforza population genetics team that has been one

    of the pioneers in tracking the spread of genes around the world due to

    migrations and/or interbreeding of populations has also looked into the

    genes that control immunoglobulin types (an important component of the

    immune system). Their estimate here is that the current variants of these

    genes were selected for within the last 50,000-100,000 years, and that

    this time span would be more representative for most groups of genes. They

    also feel that in general it is unlikely gene frequencies for most groups

    of genes would undergo significant changes in time spans of less than

    about 11,500 years.[127]

    However, the significant exception they mention--and this relates

    especially to our discussion here--is where there are cultural pressures

    for certain behaviors that affect survival rates.[128] And the two

    examples we cited above: the gene for lactose tolerance (milk-drinking)

    and those genes associated with high-carbohydrate grain consumption, both

    involve cultural selection pressures that came with the change from

    hunting and gathering to Neolithic agriculture. Again, cultural selection

    pressures for genetic changes operate more rapidly than any other kind.

    Nobody yet, at least so far as I can tell, really knows whether or

    not the observed genetic changes relating to the spread of milk-drinking

    and grain-consumption are enough to confer a reasonable level of

    adaptation to these foods among populations who have the genetic changes,

    and the picture seems mixed.* Rates of gluten intolerance (gluten is a

    protein in certain grains such as wheat, barley, and oats that makes dough

    sticky and conducive to bread-baking) are lower than for lactose

    intolerance, which one would expect given that milk-drinking has been

    around for less than half the time grain-consumption has. Official

    estimates of gluten intolerance range from 0.3% to 1% worldwide depending

    on population group.[129] Some researchers, however, believe that gluten

    intolerance is but the tip of the iceberg of problems due to grain

    consumption (or more specifically, wheat). Newer research seems to suggest

    that anywhere from 5% to as much as 20-30% of the population with certain

    genetic characteristics (resulting in what is called a "permeable

    intestine") may absorb incompletely digested peptide fragments from wheat

    with adverse effects that could lead to a range of possible diseases.[130]

    We have gone a little far afield here getting some kind of grasp on

    rates of genetic change, but I think it's been necessary for us to have a

    good sense of the time ranges involved. So to bring this back around to

    the question of adaptation to cooking, it should probably be clear by this

    point that given the time span involved (likely 125,000 years since fire

    and cooking became widespread), the chances are very high that we are in

    fact adapted to the cooking of whatever foods were consistently cooked.* I

    would include in these some of the vegetable foods, particularly the

    coarser ones such as starchy root vegetables like yams, which have long been thought to have been cooked,[131] and perhaps others, as well as meat,

    from what we know about the fossil record.

    What about the contention by raw-food advocates that cooking foods

    results in pyrolytic by-products that are carcinogenic or otherwise toxic

    to the body, and should be avoided for that reason?

    It's true cooking introduces some toxic byproducts, but it also

    neutralizes others.[132] In addition, the number of such toxins created is

    dwarfed by the large background level of natural toxins (thousands)[133]

    already present in plant foods from nature to begin with, including some

    that are similarly carcinogenic in high-enough doses. (Although only a few

    dozen have been tested so far,[134] half of the naturally occurring

    substances in plants known as "nature's pesticides" that have been tested

    have been shown to be carcinogenic in trials with rats and mice.[135])

    Nature's pesticides appear to be present in all plants, and though only a

    few are found in any one plant, 5-10% of a plant's total dry weight is

    made up of them.[136]

    [The reason "nature's pesticides" occur throughout the plant kingdom

    is because plants have had to evolve low-level defense mechanisms against

    animals to deter overpredation. On one level, plants and animals are in a

    continual evolutionary "arms race" against each other. Fruiting plants, of

    course, have also evolved the separate ability to exploit the fact that

    certain animals are attracted to the fruit by enabling its seeds to be

    dispersed through the animals' feces.]

    We have a liver and kidneys for a reason: there have always been toxins in natural foods that the body has had to deal with, and that's one reason these organs evolved. There are also a number of

    other more general defenses the body has against toxins. These types of

    defenses make evolutionary sense given the wide range of toxic elements in

    foods the body has had to deal with over the eons. [Not clear enough in

    the original version of the interview is the point that a wide range of

    GENERAL defenses might therefore be reasonably expected to aid in

    neutralizing or ejecting toxins even of a type the body hadn't necessarily

    seen before, such as those that might be introduced by cooking practices.]

    Such mechanisms include the constant shedding of surface-layer cells of

    the digestive system, many defenses against oxygen free-radical damage,

    and DNA excision repair, among others.[137]

    The belief that a natural diet is, or can be, totally toxin-free is

    basically an idealistic fantasy--an illusion of black-and-white thinking

    not supported by real-world investigations. The real question is not

    whether a diet is completely free of toxins, but whether we are adapted to

    process what substances are in our foods--in reasonable or customary

    amounts such as encountered during evolution--that are not usable by the

    body. Again, the black-and-white nature of much Hygienic thinking obscures

    here what are questions of degrees rather than absolutes.

    Also -- and I know raw-foodists generally don't like to hear this -- there has long been evidence that cooking does in fact make certain types of foods more digestible. For example, trypsin inhibitors (themselves a type

    of protease inhibitor) which are widely distributed in the plant kingdom,

    particularly in rich sources of protein, inhibit the ability of digestive

    enzymes to break down protein. (Probably the best-known plants containing

    trypsin inhibitors are legumes and grains.) Research has shown the effect

    of most such protease inhibitors on digestion to be reduced by

    cooking.[138] And it is this ability to expand the range of utilizable foods in an uncertain environment that was the evolutionary advantage that helped bring cooking about and enhanced survival.*

    I want to make clear that I still believe the largest component of

    the diet should be raw (at least 50% if not considerably more), but there

    is provision in the evolutionary picture for reasonable amounts of cooked

    foods of certain types, such as at the very least, yams, probably some

    other root vegetables, the legumes, some meat, and so forth. (With meat,

    the likelihood is that it was eaten raw when freshly killed, but what

    could not be eaten would likely have been dried or cooked to preserve it

    for later consumption, rather than wasting it.) Whether or not some foods

    like these can be eaten raw if one has no choice or is determined enough

    to do so is not the real question. The question is what was more expedient

    or practical to survival and which prevailed over evolutionary time.

    A brief look at the Australian Aborigines might be illustrative

    here.* What data is available since the aborigines were first encountered

    by Europeans shows that inland aborigines in the desert areas were subject

    to severe food shortages and prolonged droughts.[139] This of course made

    emphasizing the most efficient use of whatever foods could be foraged

    paramount. Estimates based on studies of aborigines in northern Australia

    are that they processed roughly half of their plant foods, but that no

    food was processed unnecessarily, any such preparation being done only to

    make a food edible, more digestible, or more palatable.[140] In general

    food was eaten as it was collected, according to its availability during

    the seasons--except during times of feasts--with wastage being rare, such

    a pattern being characteristic of feast-and-famine habitats. Some food,

    however, was processed for storage and later retrieval (usually by

    drying), including nuts and seeds, but may also have been ground and baked

    into cakes instead, before burying in the ground or storing in dry

    caches.[141]

    Fresh foods such as fruits, bulbs, nectar, gums, flowers, etc., were

    eaten raw when collected. Examples of foods that were prepared before

    consumption include the cooking of starchy tubers or seeds, grinding and

    roasting of seeds, and cooking of meat.[142]

    That these practices were necessary to expand the food supply, and not merely induced by frivolous cultural practices as raw-foodists often theorize, can be seen in the fact that after colonization by

    Europeans, aborigines were not above coming into missions during droughts

    to get food.[143]

    But the more interesting and more pressing question, to my mind, is

    not whether we are adapted to cooking of certain foods, which seems very

    likely,* but how much we have adapted to the dietary changes since the

    Neolithic agricultural transition, given the 10,000 years or less it's

    been underway. At present, the answer is unclear, although in general, we

    can probably say there just hasn't been enough time for full adaptation

    yet. Or if so, only for people descended from certain ancestral groups

    with the longest involvement with agriculture. My guess (and it is just a

    guess) would be that we are still mostly adapted to a Paleolithic diet,

    but for any particular individual with a given ancestral background,

    certain Neolithic foods such as grains, perhaps even modest amounts of

    certain cultured milk products such as cheese or yogurt (ones more easily

    digested than straight milk) for even fewer people, might be not only

    tolerated, but helpful. Especially where people are avoiding flesh products, which are our primary animal-food adaptation, these animal

    byproducts may be helpful,* which Stanley Bass's work with mice and his

    mentor Dr. Gian-Cursio's work with Hygienic patients seems to show, as Dr.

    Bass has discussed previously here in H&B (in the April and June 1994

    issues).

    How are we to determine an optimum diet for ourselves, then, given

    that some genetic changes may be more or less complete or incomplete in

    different population groups?

    I think what all of this points to is the need to be careful in

    making absolute black-or-white pronouncements about invariant food rules

    that apply equally to all. It is not as simple as saying that if we aren't

    sure we are fully adapted to something to just eliminate it from the diet

    to be safe. Because adaptation to a food does not necessarily mean just

    tolerance for that food; it also means that if we are in fact adapted to

    it, we would be expected to thrive better with some amount of that food in

    our diet. Genetic adaptation cuts both ways.

    This is why I believe it is important for people to experiment

    individually. Today, because of the Neolithic transition and the rates at

    which genetic changes are being discovered to take place, it is apparent

    humanity is a species in evolutionary transition. Due to the unequal flow

    and dissemination of genes through a population during times like these,

    it is unlikely we will find uniform adaptation across the population, as

    we probably would have during earlier times. This means it is going to be

    more likely right now in this particular historical time period that

    individuals will be somewhat different in their responses to diet. And as

    we saw above (with the two genes ACE and apolipoprotein-B) these genetic

    differences may even confound attempts to replicate epidemiological

    dietary studies from one population to another unless these factors are

    taken into account.*

    So while it is important to look for convergences among different

    lines of evidence (evolutionary studies, biochemical nutritional studies,

    epidemiological studies and clinical trials, comparative anatomy from

    primate studies, and so forth), it is well to consider how often the

    epidemiological studies, perhaps even some of the biochemical studies,

    reverse themselves or come back with conflicting data. It usually takes

    many years--even decades--for their import to become clear based on the

    lengthy scientific process of peer review and replication of experiments

    for confirmation or refutation.

    So my advice is: don't be afraid to experiment. Unless you have

    specific allergies or strong food intolerances and whatnot, the body is

    flexible enough by evolution to handle short-term variations in diet from

    whatever an optimal diet might be anyway. If you start within the general

    parameters we've outlined here and allow yourself to experiment, you have

    a much better chance of finding the particular balance among these factors

    that will work for you. If you already have something that works well for you,

    that's great. If, however, you are looking for improvements, given the

    uncertainties we've talked about above, it's important to look at any

    rigid assumptions you may have about the "ideal" diet, and be willing to

    challenge them through experimentation. In the long-run, you only have

    yourself to benefit by doing so.

    Ward, despite the evolutionary picture you've presented here, there

    are still objections that people have about meat from a biochemical or

    epidemiological standpoint. What about T. Colin Campbell's China Study for

    example?

    Good point. Campbell's famous study, to my mind, brings up one of the

    most unremarked-upon recent conflicts in epidemiological data that has

    arisen. In his lecture at the 1991 ANHS annual conference, reported on in

    the national ANHS publication Health Science, Campbell claimed that the

    China Study data pointed to not just high fat intake, but to the protein

    in animal food, as increasing cholesterol levels. (High cholesterol levels

    in the blood are now widely thought by many to be the biggest single

    factor responsible for increased rates of atherosclerosis--clogged blood

    vessels--and coronary heart disease.) According to him, the lower the

    level of animal protein in the diet (not just the lower the level of fat)

    the lower the cholesterol level in the blood. He believes that animal food

    is itself the biggest culprit, above and beyond just fat levels in

    food.[144]

    Yet as rigorous as the study is proclaimed to be, I have to tell you

    that Campbell's claim that animal protein by itself is the biggest culprit

    in raising blood cholesterol is contradicted by studies of modern-day

    hunter-gatherers eating considerable amounts of wild game in their diet

    who have very low cholesterol levels comparable to those of the China

    study. One review of different tribes studied showed low cholesterol

    levels for the Hadza of 110 mg/dl (eating 20% animal food), San Bushmen

    120 (20-37% animal), Aborigines 139 (10-75% animal), and Pygmies at 106,

    considerably lower than the now-recommended safe level of below 150.[145]

    Clearly there are unaccounted-for factors at work here yet to be studied

    sufficiently.

One of them might be the difference in composition between the levels of fat in domesticated meat vs. wild game: on average five times as much in the former as in the latter. On top of that, the proportion of saturated fat in domesticated meat compared to wild game is also five times higher.[146]

Another difference between these two meat sources is that significant amounts of EPA (an omega-3 fatty acid thought to perhaps help prevent atherosclerosis) are found in wild game (approx. 4% of total fat), while domestic beef, for example, contains almost none.[147] This is important

    because the higher levels of EPA and other omega-3 fatty acids in wild

    game help promote a low overall dietary ratio of omega-6 vs. omega-3 fatty

    acids for hunter-gatherers--ranging from 1:1 to 4:1--compared to the high

    11:1 ratio observed in Western nations. Since omega-6 fatty acids may have

    a cancer-promoting effect, some investigators are recommending lower

    ratios of omega-6 to omega-3 in the diet which would, coincidentally, be

    much closer to the evolutionary norm.[148]

    Differences like these may go some way toward explaining the similar

    blood cholesterol levels and low rates of disease in both the rural

    Chinese eating a very-low-fat, low-animal-protein diet, and in

    hunter-gatherers eating a low-fat, high-animal-protein diet. Rural Chinese

    eat a diet of only 15% fat and 10% protein, with the result that saturated

    fats only contribute a low 4% of total calories. On the other hand, those

    hunter-gatherer groups approximating the Paleolithic norm eat diets

    containing 20-25% fat and 30% protein, yet the contribution of saturated

    fat to total caloric intake is nevertheless a similarly low 6% of total

    calories.[149]

    What about the contention that high-protein diets promote calcium

    loss in bone and therefore contribute to osteoporosis? The picture here is

    complex and modern studies have been contradictory. In experimental

    settings, purified, isolated protein extracts do significantly increase

    calcium excretion, but the effect of increased protein in natural foods

such as meat is smaller or nonexistent.[150] Studies of Eskimos eating an almost all-meat diet[151] (less than 10% plant intake[152]) have shown high rates of osteoporosis, but theirs is a recent historical aberration not

    typical of the evolutionary Paleolithic diet thought to have averaged 65%

    plant foods and 35% flesh.* Analyses of numerous skeletons from our

    Paleolithic ancestors have shown development of high peak bone mass and

    low rates of bone loss in elderly specimens compared to their Neolithic

    agricultural successors whose rates of bone loss increased considerably

    even though they ate much lower-protein diets.[153] Why, nobody knows for

    sure, though it is thought that the levels of phosphorus in meat reduce

    excretion of calcium, and people in Paleolithic times also ate large

    amounts of fruits and vegetables[154] with an extremely high calcium

    intake (perhaps 1,800 mg/day compared to an average of 500-800 for

    Americans today[155]) and led extremely rigorous physical lives, all of

    which would have encouraged increased bone mass.[156]

    Okay, let's move on to the hunter-gatherers you mentioned earlier.

    I've heard that while some tribes may have low rates of chronic

    degenerative disease, others don't, and may also suffer higher rates of

infection than we do in the West.

This is true. Not all "hunter-gatherer"

    tribes of modern times eat diets in line with Paleolithic norms. Aspects

    of their diets and/or lifestyle can be harmful just as modern-day

    industrial diets can be. When using these people as comparative models,

    it's important to remember they are not carbon copies of Paleolithic-era

    hunter-gatherers.[157] They can be suggestive (the best living examples we

    have), but they are a mixed bag as "models" for behavior, and it is up to

    us to keep our thinking caps on.

    We've already mentioned the Eskimos above as less-than-exemplary

    models. Another example is the Masai tribe of Africa who are really more

    pastoralists (animal herders) than hunter-gatherers. They have low

    cholesterol levels ranging from 115 to 145,[158] yet autopsies have shown

    considerable atherosclerosis.[159] Why? Maybe because they deviate from

    the Paleolithic norm of 20-25% fat intake due to their pastoralist

    lifestyle by eating a 73% fat diet that includes large amounts of milk

    from animals in addition to meat and blood.*[160] Our bodies do have

    certain limits.

    But after accounting for tribes like these, why do we see higher

    rates of mortality from infectious disease among other hunter-gatherers

    who are eating a better diet and show little incidence of degenerative

    disease?

    There are two major reasons I know of. First, most modern-day tribes

    have been pushed onto marginal habitats by encroaching civilization.[161]

    This means they may at times experience nutritional stress resulting from

    seasonal fluctuations in the food supply (like the aborigines noted above)

    during which relatively large amounts of weight are lost while they remain

    active. The study of "paleopathology" (the study of illnesses in past

    populations from signs left in the fossil record) shows that similar

    nutritional stress experienced by some hunter-gatherers of the past was

    not unknown either, and at times was great enough to have stunted their

growth, resulting in the "growth arrest lines" that form in human bone under conditions of nutritional deprivation. Such nutritional stress is

    most likely for hunter-gatherers in environments where either the number

    of food sources is low (exposing them to the risk of undependable supply),

    or where food is abundant only seasonally.[162]

    Going without food--or fasting while under conditions of total rest

    as hygienists do as a regenerative/recuperative measure--is one thing, but

    nutritional stress or deprivation while under continued physical stress is

    unhealthy and leaves one more susceptible to pathologies including

    infection.[163]

The second potential cause of higher rates of infection is less artificially controlled sanitary conditions (one of the areas where modern civilization is conducive rather than destructive to health), owing to hunter-gatherers having less control over their environment than modern civilizations have. Creatures in the wild are in frequent contact with feces

    and other breeding grounds for microorganisms such as rotting fruit and/or

    carcasses, to which they are exposed by skin breaks and injuries, and so

    forth.[164] Contrary to popular Hygienic myth, animals in the wild eating

    natural diets in a natural environment are not disease-free, and large

    infectious viral and bacterial plagues in the past and present among wild

    animal populations are known to have occurred. (To cite one example,

    rinderpest plagues in the African Serengeti occurred in the 1890s and

    again around 1930, 1960, and 1982 among buffalo, kudu, eland, and

    wildebeest.[165])

    It becomes obvious when you look into studies of wild animals that

    natural diet combined with living in natural conditions is no guarantee of

    freedom from disease and/or infection. Chimpanzees, our closest living

    animal relatives, for instance, can and do suffer bouts in the wild from a

    spectrum of ailments very similar to those observed in human beings:

    including pneumonia and other respiratory infections (which occur more

    often during the cold and rainy season), polio, abscesses, rashes,

    parasites, diarrhea, even hemorrhoids on occasion.[166] Signs of

    infectious disease in the fossil record have also been detected in remains

as far back as the dinosaur age, as have signs of immune system mechanisms

    to combat them.[167]

    One of the conclusions to be drawn from this is that artificial

    modern conditions are not all bad where health is concerned. Such

    conditions as "sanitation" due to hygienic measures, shelter and

    protection from harsh climatic extremes and physical trauma, professional

    emergency care after potentially disabling or life-threatening accidents,

    elimination of the stresses of nomadism, plus protection from seasonal

    nutritional deprivation due to the modern food system that Westerners like

    ourselves enjoy today all play larger roles in health and longevity than

    we realize.[168]

    Also, I would hope that the chimp examples above might persuade

    hygienists not to feel so guilty or inevitably blame themselves when they

    occasionally fall prey to acute illness. We read of examples in the

    Natural Hygiene M2M which sometimes seem to elicit an almost palpable

    sense of relief among others when the conspiracy of silence is broken and

    they find they aren't the only ones. I think we should resist the tendency

    to always assume we flubbed the dietary details. In my opinion it is a

    mistake to believe that enervation need always be seen as simply the

    instigator of "toxemia" which is then held to always be the incipient

    cause of any illness. It seems to me you can easily have "enervation"

    (lowered energy and resistance) without toxemia, and that that in and of

    itself can be quite enough to upset the body's normal homeostasis

    ("health") and bring on illness. (Indeed I have personally become ill once

    or twice during the rebuilding period after lengthy fasts when overworked,

    a situation in which it would be difficult to blame toxemia as the cause.)

    The examples of modern-day hunter-gatherers as well as those of chimps

    should show us that you can eat a healthy natural diet and still suffer

    from health problems, including infectious disease, due to excessive

    stresses--what we would call "enervation" in Natural Hygiene.

    Ward, we still have some space here to wrap up Part 2. Given the

    research you've done, how has it changed your own diet and health

    lifestyle? What are you doing these days, and why?

    I would say my diet right now* is somewhere in the neighborhood of

    about 85% plant and 15% animal, and overall about 60% raw and 40% cooked

    by volume. A breakdown from a different angle would be that by volume it

    is, very roughly, about 1/4 fruit, 1/4 starches (grains/potatoes, etc.),

    1/4 veggies, and the remaining quarter divided between nuts/seeds and

    animal products, with more of the latter than the former. Of the animal

    foods, I would say at least half is flesh (mostly fish, but with

    occasional fowl or relatively lean red meat thrown in, eaten about 3-5

    meals per week), the rest composed of varying amounts of eggs, goat

    cheese, and yogurt.

    Although I have to admit I am unsure about the inclusion of dairy

    products on an evolutionary basis given their late introduction in our

    history, nevertheless, I do find that the more heavily I am exercising,

    the more I find myself tending to eat them. To play it safe, what dairy I

    do eat is low- or no-lactose cultured forms like goat cheese and yogurt.*

    Where the grains are concerned, so far I do not experience the kind

    of sustained energy I like to have for distance running without them, even

    though I am running less mileage than I used to (20 miles/week now as

    opposed to 35-40 a few years ago). The other starches such as potatoes,

    squash, etc., alone just don't seem to provide the energy punch I need.

    Again, however, I try to be judicious by eating non-gluten-containing

    grains such as millet, quinoa, or rice, or else use sprouted forms of

    grains, or breads made from them, that eliminate the gluten otherwise

    present in wheat, barley, oats, and so forth.*

    In general, while I do take the evolutionary picture heavily into

    account, I also believe it is important to listen to our own bodies and

    experiment, given the uncertainties that remain.

    Also, I have to say that I find exercise, rest, and stress management

    as important as diet in staying energetic, healthy, and avoiding acute

    episodes of ill-health. Frankly, my experience is that once you reach a

    certain reasonable level of health improvement based on your dietary

    disciplines, and things start to level out--but maybe you still aren't

    where you want to be--most further gains are going to come from paying

    attention to these other factors, especially today when so many of us are

    overworked, over-busy, and stressed-out. I think too many people focus too

    exclusively on diet and then wonder why they aren't getting any further

    improvements.

    Diet only gets you so far. I usually sleep about 8-10 hours a night,

    and I very much enjoy vigorous exercise, which I find is necessary to help

    control my blood-sugar levels, which are still a weak spot for me. The

    optimum amount is important, though. A few years ago I was running every

    day, totaling 35-40 miles/week and concentrating on hard training for

    age-group competition, and more prone to respiratory problems like colds,

    etc. (not an infrequent complaint of runners). In the last couple of

    years, I've cut back to every-other-day running totaling roughly 20 miles

    per week. I still exercise fairly hard, but a bit less intensely than

    before, I give myself a day of rest in between, and the frequency of colds

    and so forth is now much lower.

    I am sure people will be curious here, Ward: What were some of the

    improvements you noticed after adding flesh foods to your diet?

    Well, although I expected it might take several months to really

    notice much of anything, one of the first things was that within about 2

    to 3 weeks I noticed better recovery after exercise--as a distance runner

    I was able to run my hard workouts more frequently with fewer rest days or

    easy workouts in between. I also began sleeping better fairly early on,

    was not hungry all the time anymore, and maintained weight more easily on

smaller volumes of food. Over time, my stools became a bit better

    formed, my sex drive increased somewhat (usually accompanies better energy

    levels for me), my nervous system was more stable and not so prone to

hyperreactive panic-attack-like instability, and in general I

    found I didn't feel so puny or wilt under stress so easily as before.

    Unexpectedly, I also began to notice that my moods had improved and I was

    more "buoyant." Individually, none of these changes was dramatic, but as a

    cumulative whole they have made the difference for me. Most of these

    changes had leveled off after about 4-6 months, I would say.

    Something else I ought to mention here, too, was the effect of this

    dietary change on a visual disturbance I had been having for some years

    prior to the time I embarked on a disciplined Hygienic program, and which

    continued unchanged during the two or three years I was on the traditional

    vegetarian diet of either all-raw or 80% raw/20% cooked. During that time

    I had been having regular episodes of "spots" in my visual field every

    week or so, where "snow" (like on a t.v. set) would gradually build up to

    the point it would almost completely obscure my vision in one eye or the

    other for a period of about 5 minutes, then gradually fade away after

    another 5 minutes. As soon as I began including flesh in my diet several

    times per week, these started decreasing in frequency and over the 3 years

    since have almost completely disappeared.

    What problems are you still working on?

    I still have an ongoing tussle with sugar-sensitivity due to the huge

    amounts of soft drinks I used to consume, and have to eat fruits

    conservatively. I also notice that I still do not hold up under stress and

    the occasional long hours of work as well as

  17. CATEGORY: diets/vegetarian

    TECHNICAL: **

    SUMMARY:

This document was one of the most profound documents I had ever read at the time I found it. It is written by a truly great scholar by the name of Ward Nicholson. Mr. Nicholson initially practiced a type of diet known as the "hygienic" diet -- which is a strict vegetarian/vegan diet in which everything is consumed raw and unprocessed. I don't want to give away all of the goods, because it really is an exceptional read, but I will say this: after reading the document, it's fairly obvious that human beings could never have evolved the way we did if we had been vegetarians/vegans or fruitarians.

This paper is only the first part of the 4-part interview with Mr. Nicholson. Since it's quite long, and most people will never wade through it, I want to go ahead and pull out some of the more interesting passages. In fact, these very passages were the ones that pushed me more towards a "paleo" type diet -- even though Nicholson's purpose was solely to disprove vegetarianism, and not to argue for evolutionary diets.

The first passage is my favorite. It was one of the first times I had heard anyone argue that the shift from hunter-gatherer diets to agrarianism was a negative. There's even some pro low-carb sentiment to be found in it..

    "In most respects, the changes in diet from hunter-gatherer times

    to agricultural times have been almost all detrimental, although

    there is some evidence we'll discuss later indicating that at least

    some genetic adaptation to the Neolithic has begun taking place in

    the approximately 10,000 years since it began. With the much heavier

    reliance on starchy foods that became the staples of the diet,

    tooth decay, malnutrition, and rates of infectious disease increased

    dramatically over Paleolithic times, further exacerbated by crowding

    leading to even higher rates of communicable infections."

The next excerpt is one that I have repeated to countless other people. Many argue that man today is much better off and healthier than in the past. I've even heard average life expectancy used as an indicator. When you look at this passage, ask yourself if we are more "physical", more "healthy", and more "robust" with today's technology -- or were we better off then:

    "Skeletal remains show that height decreased by four inches from the

    Late Paleolithic to the early Neolithic, brought about by poorer

    nutrition, and perhaps also by increased infectious disease causing

    growth stress, and possibly by some inbreeding in communities that

    were isolated."

    The next passage is one that I have quoted to people

    before. Many have asked me why certain fruits (sweet fruits like

    apples, grapes, etc..) aren't healthy. I always hear "but they're

    natural?!?!?". Well, when you examine the fruits of the past, they

    bear little resemblance to the man-made, altered, super-sweet,

    and seedless fruits we have today:

    "Fruit as defined by Walker in the article included tougher, less

    sugary foods, such as acacia tree pods. (Which laypeople like

    ourselves would be likely to classify as a "vegetable"-type food in

    common parlance). And although it was not clarified in the article,

    anyone familiar with or conscientious enough to look a little further

    into evolutionary studies of diet would have been aware that

    scientists generally use the terms "frugivore," "folivore,"

    "carnivore," "herbivore," etc., as categories comparing broad dietary

    trends, only very rarely as exclusivist terms, and among primates

    exclusivity in food is definitely not the norm."

Perhaps the biggest nail in the coffin of the "humans are vegetarians" issue comes from the fact that the apes we are closest to are also not completely "plant eaters":

    "A breakdown by feeding time for the chimps of Gombe showed their

    intake of foods to be (very roughly) 60% of feeding time for fruit,

    20% for leaves, with the other items in the diet varying greatly on a

    seasonal basis depending on availability. Seasonal highs could range

    as high as (approx.) 17% of feeding time for blossoms, 22--30% for

    seeds, 10--17% for insects, 2--6% for meat, with other miscellaneous

    items coming in at perhaps 4% through most months of the year.85

    Miscellaneous items eaten by chimps include a few eggs,86 plus the

    rare honey that chimps are known to rob from beehives (as well as

    the embedded bees themselves), which is perhaps the most highly

    prized single item in their diet,87 but which they are limited from

    eating much of by circumstances. Soil is also occasionally

    eaten--presumably for the mineral content according to researchers.88"

    Nicholson points out that a great deal of the animal

    foods in a chimp diet come from insects, which was something I had

    never considered before this paper. Take note of the honey comment

    too. I plan to form an argument over the next few months that

    sugars (from grain, or processed foods) are indeed an addictive

    drug, and that they are being put into our processed man-made foods

more and more because of their addictive properties (by the food

    industry).

    After all, how many people do *you* know who continue to

    eat foods they know are bad, *JUST* because they crave the taste

    uncontrollably? (and the sugars are having that effect). I will

    form the argument slowly over time though, just like I did with the

    "cancer is curable with diet" arguement..

    -------------------------------------------------------------

    Interview with Ward Nicholson

    Scholar and thinker Ward Nicholson lives and works in Wichita,

    Kansas, where he used to publish and coordinate what I considered the

    singularly BEST health publication available in the world at that time,

    The Natural Hygiene Many-to-Many. Below, you'll find the complete text of

    Mr. Nicholson's October 1996 interview in Health & Beyond, an interview

    that blew the socks off the traditional "Humans are by nature fruitarian"

    argument.

    We'll discuss two things with Mr. Nicholson in H&B. One of these

    consists of the ideas and conclusions Ward has reached about Hygienists'

    actual experiences in the real world (based on interacting with many

    Hygienists while coordinating the N.H. M2M)--experiences often at

    variance with what the "official" Hygienic books tell us "should" happen.

    And the other involves the meticulous research he has done tracking down

    what our human ancestors ate in the evolutionary past as known by modern

    science, in the interest of discovering directly what the "food of our

    biological adaptation" actually was and is--again in the real world

    rather than in theory. Given the recent death of T.C. Fry (September 6,

    1996), I consider Ward's analysis of special importance to those who

    continue to adhere strictly to the fruits, vegetables, nuts and seeds

diet. This month we'll tackle the question of humanity's primitive diet.

    In two subsequent issues, we'll wrap that topic up and delve into what

    Ward has learned from coordinating the Natural Hygiene M2M about

    Hygienists' experiences in real life.

You'll find that will be a recurring theme throughout our discussions with Mr. Nicholson: what really goes on in real life when you hear a full spectrum of stories from a range of Hygienists, as well as what science says about areas of Hygiene that, as you will find, have in some cases been poorly researched -- or not researched at all -- by previous Hygienic writers.

    Not everyone will agree with or appreciate what Mr. Nicholson has to

    say. But, as I've written more than once, I publish material in H&B that

    you won't find anywhere else, material and sound thinking that interests

    me and calls into question my ideas and my assumptions about building

    health naturally. In this series of three interviews, I guarantee Ward

    will challenge many of our mind sets. Mr. Nicholson has a lot of ground

    to cover, so without further ado, I happily present our controversial and

    articulate guest for this issue of H&B.

    Setting the Scientific Record Straight on Humanity's Evolutionary

    Prehistoric Diet and Ape Diets

    (Note: Ward has provided footnote numbers referencing the citations

    from which the scientific aspects of the discussion here have been

    sourced. Those of you who are interested may contact him and send $3 to

    receive a copy of all endnotes and bibliography after the last

installment of these interviews has been completed and published. The address for doing this is given at the end of this article.)

    Ward, why don't we start out with my traditional question: How was it

    that you became involved with Natural Hygiene?

    I got my introduction to Natural Hygiene through distance running,

    which eventually got me interested in the role of diet in athletic

    performance. During high school and college--throughout most of the

    1970s--I was a competitive distance runner. Runners are very concerned

    with anything that will improve their energy, endurance, and rate of

    recovery, and are usually open to experimenting with different regimens

    in the interest of getting ever-better results. Since I've always been a

    bookworm, that's usually the first route I take for teaching myself about

    subjects I get interested in. In 1974 or '75, I read the book Yoga and

    the Athlete, by Ian Jackson, when it was published by Runner's World

    magazine. In it, he talked about his forays into hatha yoga (the

    stretching postures) as a way of rehabilitating himself from running

    injuries he had sustained. He eventually got into yoga full-time, and

    from there, began investigating diet's effect on the body, writing about

    that too. At first I was more interested in Are Waerland (a European

Hygienist health advocate with a different slant from Shelton's), who was

    mentioned in the book, so I wrote Jackson for more information. But

    instead of giving me information about Waerland, he steered me in the

    direction of American Natural Hygiene, saying in his experience it was

    far superior.

    I was also fascinated with Jackson's experiences with fasting. He

    credited fasting with helping his distance running, and had a somewhat

    mind-blowing "peak experience" while running on his first long fast. He

    kept training at long distances during his fasts, so I decided that would

    be the first aspect of the Hygienic program I would try myself. Then in

    the meantime, I started frequenting health-food stores and ran across

    Herbert Shelton's Fasting Can Save Your Life on the bookracks, which as

    we all know, has been a very persuasive book for beginning Natural

    Hygienists. So to ease into things gradually, I started out with a few

    3-day "juice" fasts (I know some Hygienists will object to this language,

    but bear with me), then later two 8-day juice-diet fasts while I kept on

    running and working at my warehouse job (during college). These were

    done--in fact, all the fasts I've experienced have been done--at home on

    my own.

    Needless to say, I found these "fasts" on juices difficult since I

    was both working, and working out, at the same time. Had they been true

    "water" fasts, I doubt I would have been able to do it. I had been

    enticed by the promises of more robust health and greater eventual energy

    from fasting, and kept wondering why I didn't feel as great while fasting

    as the books said I would, with their stories of past supermen lifting

    heavy weights or walking or running long distances as they fasted. Little

    did I realize in my naiveté that this was normal for most fasters. At the

    time I assumed, as Hygienists have probably been assuming since time

    immemorial when they don't get the hoped-for results, that it was just

    because I "wasn't cleaned-out enough." So in order to get more

    cleaned-out, I kept doing longer fasts, working up to a 13-day true water

    fast, and finally a 25-day water fast over Christmas break my senior year

    in college. (I had smartened up just a little bit by this time and didn't

    try running during these longer fasts on water alone.) I also tried the

    Hygienic vegetarian diet around this time. But as the mostly raw-food

    diet negatively affected my energy levels and consequently my distance

    running performance, I lost enthusiasm for it, and my Hygienic interests

    receded to the back burner. I was also weary of fasting at this point,

    never having reached what I supposed was the Hygienic promised land of a

    total clean-out, so that held no further allure for me at the time.

    After college, I drifted away from running and got into doing hatha

    yoga for a couple of years, taught a couple of local classes in it, then

    started my own business as a typesetter and graphic designer. Things took

    off and during the mid to late 1980s, I worked 60 to 80 hours a week,

    often on just 5 to 6 hours of sleep a night, under extreme

    round-the-clock deadline pressures setting type at the computer for

    demanding advertising agency clients. I dropped all pretense of Hygienic

    living, with the exception of maintaining a nominally "vegetarian"

regime. This did not preclude me, however, from guzzling large amounts of

    caffeine and sugar in the form of a half-gallon or more of soft drinks

    per day to keep going.

    Eventually all this took its toll and by 1990 my nervous system--and

    I assume (in the absence of having gone to a doctor like most Hygienists

    don't!) probably my adrenals--were essentially just about shot from all

    the mainlining of sugar and caffeine, the lack of sleep, and the

    24-hour-a-day deadlines and accompanying emotional pressures. I started

    having severe panic or adrenaline attacks that would sometimes last

    several hours during which time I literally thought I might die from a

    heart attack or asphyxiation. The attacks were so debilitating it would

    take at least a full day afterwards to recover every time I had one.

    Finally, in late 1990/early 1991, after I had begun having one or

    two of these attacks a week, I decided it was "change my ways or else"

    and did a 42-day fast at home by myself (mostly on water with occasional

    juices when I was feeling low), after which I went on a 95%--100%

    raw-food Hygienic diet. The panic attacks finally subsided after the 5th

    day of fasting, and have not returned since, although I did come close to

    having a few the first year or two after the fast. Soon after I made the

    recommitment to Hygienic living, when I had about completed my 42-day

    fast, I called a couple of Hygienic doctors and had a few phone

    consultations. But while the information I received was useful to a

    degree with my immediate symptoms, it did not really answer my Hygienic

questions like I'd hoped, nor did it turn out to be of significant help in overcoming my health problems over the longer term. So in 1992 I decided

    to start the Natural Hygiene M2M to get directly in touch with Hygienists

    who had had real experience with their own problems, not just book

    knowledge, and not just the party line I could already get from

    mainstream Hygiene. With this new source of information and experience to

    draw on, among others, my health has continued to improve from the low it

    had reached, but it has been a gradual, trial-and-error process, and not

    without the occasional setback to learn from.

    One of the motivating factors here was that although fasting had

    been helpful (and continues to be), unfortunately during the time in

    between fasts (I have done three subsequent fasts on water of 11 days, 20

    days, and 14 days in the past five years), I just was not getting the

    results we are led to expect with the Hygienic diet itself. In fact, at

best, I was stagnating, and at worst I was developing new symptoms that, while mild, were headed in a disconcerting downhill direction. Over time, the

    disparity between the Hygienic philosophy and the results I was (not)

    getting started eating at me. I slowly began to consider through reading

    the experiences of others in the M2M that it was not something I was

    "doing wrong," or that I wasn't adhering to the details sufficiently, but

    that there were others who were also not doing so well following the

    Hygienic diet, try as they might. The "blame the victim for not following

    all the itty-bitty details just right" mentality began to seem more and

    more suspect to me.

    This leads us up to the next phase of your Hygienic journey, where

    you eventually decided to remodel your diet based on your exploration of

    the evolutionary picture of early human diets as now known by science.

    Coming from your Hygienic background, what was it that got you so

    interested in evolution?

    Well, I have always taken very seriously as one of my first

    principles the axiom in Hygiene that we should be eating "food of our

    biological adaptation." What is offered in Hygiene to tell us what that

    is, is the "comparative anatomy" line of reasoning we are all familiar

    with: You look at the anatomical and digestive structures of various

    animals, classify them, and note the types of food that animals with

    certain digestive structures eat. By that criterion of course, humans are

    said to be either frugivores or vegetarians like the apes are said to be,

    depending on how the language is used. Now at first (like any good

    upstanding Hygienist!) I did not question this argument because as far as

    it goes it is certainly logical. But nonetheless, it came to seem to me

    that was an indirect route for finding the truth, because as similar as

    we may be to the apes and especially the chimpanzee (our closest

    relative), we are still a different species. We aren't looking directly

    at ourselves via this route, we are looking at a different animal and

    basically just assuming that our diet will be pretty much just like

    theirs based on certain digestive similarities. And in that difference

    between them and us could reside errors of fact.

    So I figured that one day, probably from outside Hygiene itself,

    someone would come along with a book on diet or natural foods that would

    pull together the evidence directly from paleontology and evolutionary

    science and nail it down once and for all. Of course, I felt confident at

    that time it would basically vindicate the Hygienic argument from

    comparative anatomy, so it remained merely an academic concern to me at

    the time.

    And then one day several years ago, there I was at the bookstore

when out popped the words The Paleolithic Prescription[1] (by Boyd Eaton,

    M.D. and anthropologists Marjorie Shostak and Melvin Konner) on the spine

    of a book just within the range of my peripheral vision. Let me tell you

    I tackled that book in nothing flat! But when I opened it up and began

    reading, I was very dismayed to find there was much talk about the kind

    of lean game animals our ancestors in Paleolithic times (40,000 years

    ago) ate as an aspect of their otherwise high-plant-food diet, but

    nowhere was there a word anywhere about pure vegetarianism in our past

    except one measly paragraph to say it had never existed and simply wasn't

supported by the evidence.[2] I have to tell you that while I bought the

    book, red lights were flashing as I argued vociferously in my head with

    the authors on almost every other page, exploiting every tiny little

    loophole I could find to save my belief in humanity's original vegetarian

    and perhaps even fruitarian ways. "Perhaps you haven't looked far enough

    back in time," I told them inside myself. "You are just biased because of

    the modern meat-eating culture that surrounds us," I silently screamed,

    "so you can't see the vegetarianism that was really there because you

    aren't even looking for it!"

    So in order to prove them wrong, I decided I'd have to unearth all

    the scientific sources at the local university library myself and look at

    the published evidence directly. But I didn't do this at first--I stalled

    for about a year, basically being an ostrich for that time, sort of

    forgetting about the subject to bury the cognitive dissonance I was

    feeling.

    In the meantime, though, I happened to hear from a hatha yoga

    teacher I was acquainted with who taught internationally and was

    well-known in the yoga community both in the U.S. and abroad in the '70s

    and early '80s, who, along with his significant other, had been

    vegetarian for about 17 years. To my amazement, he told me in response to

    my bragging about my raw-food diet that he and his partner had

    re-introduced some flesh foods to their diet a few years previously after

    some years of going downhill on their vegetarian diets, and it had

    resulted in a significant upswing in their health. He also noted that a

    number of their vegetarian friends in the yoga community had run the same

    gamut of deteriorating health after 10--15 years as vegetarians since the

    '70s era.

    Once again, of course, I pooh-poohed all this to myself because they

    obviously weren't "Hygienist" vegetarians and none of their friends

    probably were either. You know the line of thinking: If it ain't Hygienic

    vegetarianism, by golly, we'll just discount the results as completely

    irrelevant! If there's even one iota of difference between their brand of

    vegetarianism and ours, well then, out the window with all the results!

    But it did get me thinking, because this was a man of considerable

    intellect as well as a person of integrity whom I respected more than

    perhaps anyone else I knew.

    And then a few months after that, I began noticing I was having

    almost continual semi-diarrhea on my raw-food diet and could not seem to

    make well-formed stools. I was not sleeping well, my stamina was sub-par

    both during daily tasks and exercise, which was of concern to me after

    having gotten back into distance running again, and so real doubts began

    creeping in. It was around this time I finally made that trip to the

    university library.

    And so what did you find?

    Enough evidence for the existence of animal flesh consumption from

    early in human prehistory (approx. 2--3 million years ago) that I knew I

could no longer ignore the obvious. For a while I simply could not believe

    that Hygienists had never looked into this. But while it was

    disillusioning, that disillusionment gradually turned into something

    exciting because I knew I was looking directly at what scientists knew

    based on the evidence. It gave me a feeling of more power and control,

    and awareness of further dietary factors I had previously ruled out that

    I could experiment with to improve my health, because now I was dealing

    with something much closer to "the actual" (based on scientific findings

    and evidence) as opposed to dietary "idealism."

    What kind of "evidence" are we talking about here?

    At its most basic, an accumulation of archaeological excavations by

    paleontologists, ranging all the way from the recent past of

    10,000--20,000 years ago back to approximately 2 million years ago, where

    ancient "hominid" (meaning human and/or proto-human) skeletal remains are

    found in conjunction with stone tools and animal bones that have cut

    marks on them. These cut marks indicate the flesh was scraped away from

    the bone with human-made tools, and could not have been made in any other

    way. You also find distinctively smashed bones occurring in conjunction

    with hammerstones that clearly show they were used to get at the marrow

for its fatty material.[3] Prior to the evidence from these earliest stone

    tools, going back even further (2--3 million years) is chemical evidence

    showing from strontium/calcium ratios in fossilized bone that some of the

diet from earlier hominids was also coming from animal flesh.[4]

    (Strontium/calcium ratios in bone indicate relative amounts of plant vs.

animal foods in the diet.[5]) Scanning electron microscope studies of the

    microwear of fossil teeth from various periods well back into human

    prehistory show wear patterns indicating the use of flesh in the diet

too.[6]

The consistency of these findings across vast eons of time shows that

    these were not isolated incidents but characteristic behavior of hominids

    in many times and many places.

    The evidence--if it is even known to them--is controversial only to

    Hygienists and other vegetarian groups--few to none of whom, so far as I

    can discern, seem to have acquainted themselves sufficiently with the

    evolutionary picture other than to make a few armchair remarks. To anyone

    who really looks at the published evidence in the scientific books and

    peer-reviewed journals and has a basic understanding of the mechanisms

    for how evolution works, there is really not a whole lot to be

    controversial about with regard to the very strong evidence indicating

    flesh has been a part of the human diet for vast eons of evolutionary

    time. The real controversy in paleontology right now is whether the

    earliest forms of hominids were truly "hunters," or more opportunistic

    "scavengers" making off with pieces of kills brought down by other

    predators, not whether we ate flesh food itself as a portion of our diet

or not.[7]

    Can you give us a timeline of dietary developments in the human line

    of evolution to show readers the overall picture from a bird's-eye view

    so we can set a context for further discussion here?

    Sure. We need to start at the beginning of the primate line long

    before apes and humans ever evolved, though, to make sure we cover all

    the bases, including the objections often made by vegetarians (and

    fruitarians for that matter) that those looking into prehistory simply

    haven't looked far enough back to find our "original" diet. Keep in mind

    some of these dates are approximate and subject to refinement as further

    scientific progress is made.

    65,000,000 to 50,000,000 B.C.: The first primates, resembling

    today's mouse lemurs, bush-babies, and tarsiers, weighing in at 2 lbs. or

less, and eating a largely insectivorous diet.[8]

50,000,000 to 30,000,000 B.C.: A gradual shift in diet for these primates, becoming mostly frugivorous by the middle of this period and mostly herbivorous towards the end of it, but with considerable variance between specific primate species as to lesser items in the diet, such as insects, meat, and other plant foods.[9]

30,000,000 to 10,000,000 B.C.: Fairly stable persistence of the above dietary pattern.[10]

Approx. 10,000,000 to 7,000,000 B.C.: Last common primate ancestor of both humans and the modern ape family.[11]

Approx. 7,000,000 B.C.: After the end of the previous period, a fork occurs branching into separate primate lines, including humans.[12] The most recent DNA evidence shows that humans are closely related to both gorillas and chimpanzees, but most closely to the chimp.[13] Most paleoanthropologists believe that after the split, flesh foods began to assume a greater role in the human side of the primate family at this time.[14]

    Approx. 4,500,000 B.C.: First known hominid (proto-human) from

    fossil remains, known as australopithecus ramidus--literally translating

    as "root ape" for its position as the very first known hominid, which may

    not yet have been fully bipedal (walking upright on two legs). Anatomy

    and dentition (teeth) are very suggestive of a form similar to that of

modern chimpanzees.[15]

    Approx. 3,700,000 B.C.: First fully upright bipedal hominid,

    australopithecus afarensis (meaning "southern ape," for the initial

    discovery in southern Africa), about 4 feet tall, first known popularly

from the famous "Lucy" skeleton.[16]

    3,000,000 to 2,000,000 B.C.: Australopithecus line diverges into

sub-lines,[17] one of which will eventually give rise to homo sapiens

    (modern man). It appears that the environmental impetus for this

    "adaptive radiation" into different species was a changing global climate

    between 2.5 and 2 million years ago driven by glaciation in the polar

regions.[18] The climatic repercussions in Africa resulted in a breakup of

    the formerly extensively forested habitat into a "mosaic" of forest

    interspersed with savanna (grassland). This put stress on many species to

adapt to differing conditions and availability of foodstuffs.[19] The

    different australopithecus lineages, thus, ate somewhat differing diets,

    ranging from more herbivorous (meaning high in plant matter) to more

    frugivorous (higher in soft and/or hard fruits than in other plant

    parts). There is still some debate as to which australopithecus lineage

    modern humans ultimately descended from, but recent evidence based on

    strontium/calcium ratios in bone, plus teeth microwear studies, show that

    whatever the lineage, some meat was eaten in addition to the plant foods

and fruits which were the staples.[20]

    2,000,000 to 1,500,000 B.C.: Appearance of the first "true humans"

    (signified by the genus homo), known as homo habilis ("handy man")--so

    named because of the appearance of stone tools and cultures at this time.

    These gatherer-hunters were between 4 and 5 feet in height, weighed

between 40 and 100 pounds, and still retained tree-climbing adaptations

(such as curved finger bones)[21] while subsisting on wild plant foods and

    scavenging and/or hunting meat. (The evidence for flesh consumption based

    on cut-marks on animal bones, as well as use of hammerstones to smash

them for the marrow inside, dates to this period.[22]) It is thought that

    they lived in small groups like modern hunter-gatherers but that the

social structure would have been more like that of chimpanzees.[23]

    The main controversy about this time period by paleoanthropologists

    is not whether homo habilis consumed flesh (which is well established)

    but whether the flesh they consumed was primarily obtained by scavenging

kills made by other predators or by hunting.[24] (The latter would indicate

    a more developed culture, the former a more primitive one.) While meat

    was becoming a more important part of the diet at this time, based on the

fact that the diets of modern hunter-gatherers--with their considerably advanced tool sets--have not been known to exceed 40% meat in tropical

    habitats like habilis evolved in, we can safely assume that the meat in

habilis' diet would have been substantially less than that.[25]

    1,500,000 to 230,000 B.C.: Evolution of homo habilis into the

    "erectines," a range of human species often collectively referred to as

    homo erectus, after the most well-known variant. Similar in height to

    modern humans (5--6 feet) but stockier with a smaller brain, hunting

    activity increased over habilis, so that meat in the diet assumed greater

    importance. Teeth microwear studies of erectus specimens have indicated

harsh wear patterns typical of meat-eating animals like the hyena.[26] No

    text I have yet read ventures any sort of percentage figure from this

    time period, but it is commonly acknowledged that plants still made up

    the largest portion of the subsistence. More typically human social

structures made their appearance with the erectines as well.[27]

The erectines were the first human ancestors to control and use fire.

    It is thought that perhaps because of this, but more importantly because

    of other converging factors--such as increased hunting and technological

    sophistication with tools--that about 900,000 years ago in response to

    another peak of glacial activity and global cooling (which broke up the

    tropical landscape further into an even patchier mosaic), the erectines

    were forced to adapt to an increasingly varied savanna/forest environment

    by being able to alternate opportunistically between vegetable and animal

foods to survive, and/or move around nomadically.[28]

    For whatever reasons, it was also around this time (dated to approx.

    700,000 years ago) that a significant increase in large land animals

    occurred in Europe (elephants, hoofed animals, hippopotamuses, and

    predators of the big-cat family) as these animals spread from their

    African home. It is unlikely to have been an accident that the spread of

the erectines to the European and Asian continents during and after this

    timeframe coincides with this increase in game as well, as they probably

followed them.[29]

    Because of the considerably harsher conditions and seasonal

    variation in food supply, hunting became more important to bridge the

    seasonal gaps, as well as the ability to store nonperishable items such

    as nuts, bulbs, and tubers for the winter when the edible plants withered

    in the autumn. All of these factors, along with clothing (and also

    perhaps fire), helped enable colonization of the less hospitable

    environment. There were also physical changes in response to the colder

    and darker areas that were inhabited, such as the development of lighter

    skin color that allowed the sun to penetrate the skin and produce vitamin

    D, as well as the adaptation of the fat layer and sweat glands to the new

climate.[30]

    Erectus finds from northern China 400,000 years ago have indicated

    an omnivorous diet of meats, wild fruit and berries (including

    hackberries), plus shoots and tubers, and various other animal foods such

as birds and their eggs, insects, reptiles, rats, and large mammals.[31]

    500,000 to 200,000 B.C.: Archaic homo sapiens (our immediate

    predecessor) appears. These human species, of which there were a number

    of variants, did not last as long in evolutionary time as previous ones,

    apparently due simply to the increasingly rapid rate of evolution

    occurring in the human line at this time. Thus they represent a

    transitional time after the erectines leading up to modern man, and the

    later forms are sometimes not treated separately from the earliest modern

forms of true homo sapiens.[32]

    150,000 to 120,000 B.C.: Homo sapiens neanderthalensis--or the

Neanderthals--begin appearing in Europe, reaching their height between 90,000

    and 35,000 years ago before becoming extinct. It is now well accepted

    that the Neanderthals were an evolutionary offshoot that met an eventual

    dead-end (in other words, they were not our ancestors), and that more

    than likely, both modern homo sapiens and Neanderthals were sister

species descended from a prior common archaic sapiens ancestor.[33]

    140,000 to 110,000 B.C.: First appearance of anatomically modern

humans (homo sapiens).[34] The last Ice Age also dates from this

    period--stretching from 115,000 to 10,000 years ago. Thus it was in this

    context, which included harsh and rapid climatic changes, that our most

recent ancestors had to flexibly adapt their eating and subsistence.[35]

    (Climatic shifts necessitating adaptations were also experienced in

tropical regions, though to a lesser degree.[36]) It may therefore be

    significant that fire, though discovered earlier, came into widespread

use around this same time[37] corresponding with the advent of modern human

    beings. Its use may in fact be a defining characteristic of modern

humans[38] and their mode of subsistence. (I'll discuss the timescale of

    fire and cooking at more length later.)

    130,000 to 120,000 B.C.: Some of the earliest evidence for seafoods

    (molluscs, primarily) in the diet by coastal dwellers appears at this

    time,39 although in one isolated location discovered so far, there is

    evidence going back 300,000 years ago.40 Common use of seafoods by

    coastal aborigines becomes evident about 35,000 years ago,41 but

    widespread global use in the fossil record is not seen until around

    20,000 years ago and since.42 For the most part, seafoods should probably

    not be considered a major departure, however, as the composition of fish,

    shellfish, and poultry more closely resembles the wild land-game animals

    many of these same ancestors ate than any other source today except for

commercial game farms that attempt to mimic ancient meat.[43]

    40,000 to 35,000 B.C.: The first "behaviorally modern" human

    beings--as seen in the sudden explosion of new forms of stone and bone

    tools, cave paintings and other artwork, plus elaborate burials and many

    other quintessentially modern human behaviors. The impetus or origin for

this watershed event is still a mystery.[44]

40,000 B.C. to 10--8,000 B.C.: Last period prior to the advent of agriculture in which human beings universally subsisted by hunting and gathering (also known as the "Late Paleolithic"--or "Stone Age"--period). Paleolithic peoples did process some of their foods, but these were simple methods that would have been confined to pounding, grinding, scraping, roasting, and baking.[45]

35,000 B.C. to 15--10,000 B.C.: The Cro-Magnons (fully modern pre-Europeans) thrive in the cold climate of Europe via big-game hunting, with meat consumption rising to as much as 50% of the diet.[46]

    25,000 to 15,000 B.C.: Coldest period of the last Ice Age, during

which global temperatures averaged 14°F cooler than they do today[47] (with

local variations as much as 59°F lower[48]), with an increasingly arid

    environment and much more difficult conditions of survival to which

plants, animals, and humans all had to adapt.[49] The Eurasian steppes just

    before and during this time had a maximum annual summer temperature of

only 59°F.[50]

    Humans in Europe and northern Asia, and later in North America,

    adapted by increasing their hunting of the large mammals such as

    mammoths, horses, bison and caribou which flourished on the open

grasslands, tundra, and steppes which spread during this period.[51]

    Storage of vegetable foods that could be consumed during the harsh

    winters was also exploited. Clothing methods were improved (including

    needles with eyes) and sturdier shelters developed--the most common being

    animal hides wrapped around wooden posts, some of which had sunken floors

and hearths.[52] In the tropics, large areas became arid. (In South Africa,

    for instance, the vegetation consisted mostly of shrubs and grass with

few fruits.[53])

    20,000 B.C. to 9,000 B.C.: Transitional period known as the

    "Mesolithic," during which the bow-and-arrow appeared,54 and gazelle,

antelope, and deer were being intensively hunted,[55] while at the same

    time precursor forms of wild plant and game management began to be more

    intensively practiced. At this time, wild grains, including wheat and

    barley by 17,000 B.C.--before their domestication--were being gathered

    and ground into flour, as evidenced by the use of mortars and pestles in

    what is now modern-day Israel. By 13,000 B.C. the descendants of these

    peoples were harvesting wild grains intensively, and it was only a small

    step from there to the development of agriculture.56 Game management

    through the burning-off of land to encourage grasslands and the increase

    of herds became widely practiced during this time as well. In North

    America, for instance, the western high plains are the only area of the

    current United States that did not see intensive changes to the land

    through extensive use of fire.57

    Also during this time, and probably also for some millennia prior to

    the Mesolithic (perhaps as early as 45,000 B.C.), ritual and

    magico-religious sanctions protecting certain wild plants developed,

    initiating a new symbiotic relationship between people and their food

    sources that became encoded culturally and constituted the first phase of

    domestication well prior to actual cultivation. Protections were accorded

    to certain wild food species (yams being a well-known example) to prevent

    disruption of their life cycle at periods critical to their growth, so

    that they could be profitably harvested later.58 Digging sticks for yams

    have also been found dating to at least 40,000 B.C.,59 so the use of these

    tubers in the diet considerably antedates that of grains.

    Foods known to be gathered during the Mesolithic period in the

    Middle East were root vegetables, wild pulses (peas, beans, etc.), nuts

    such as almonds, pistachios, and hazelnuts, as well as fruits such as

    apples. Seafoods such as fish, crabs, molluscs, and snails also became

    common during this time.60

    Approx. 10,000 B.C.: The beginning of the "Neolithic" period, or

    "Agricultural Revolution," i.e., farming and animal husbandry. The

    transition to agriculture was made necessary by gradually increasing

    population pressures due to the success of homo sapiens' prior hunting

    and gathering way of life. (Hunting and gathering can support perhaps one

    person per square 10 miles; Neolithic agriculture 100 times or more that

    many.61) Also, at about the time population pressures were increasing,

    the last Ice Age ended, and many species of large game became instinct

    (probably due to a combination of both intensive hunting and

    disappearance of their habitats when the Ice Age ended).62 Wild grasses

    and cereals began flourishing, making them prime candidates for the

    staple foods to be domesticated, given our previous familiarity with

    them.63 By 9,000 B.C. sheep and goats were being domesticated in the Near

    East, and cattle and pigs shortly after, while wheat, barley, and legumes

    were being cultivated somewhat before 7,000 B.C., as were fruits and

    nuts, while meat consumption fell enormously.64 By 5,000 B.C. agriculture

    had spread to all inhabited continents except Australia.65 During the

    time since the beginning of the Neolithic, the ratio of plant-to-animal

    foods in the diet has sharply increased from an average of probably

    65%/35% during Paleolithic times66 to as high as 90%/10% since the advent

    of agriculture.67
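    To make that carrying-capacity comparison concrete, here's a quick

    back-of-the-envelope sketch in Python. The ten-square-miles-per-forager

    figure and the 100x multiplier are simply the rough numbers cited above,

    and the example territory size is an arbitrary assumption of mine, so

    treat the output as an order-of-magnitude illustration only:

        # Rough carrying-capacity comparison using the figures cited above.
        # Assumption: ~1 forager per 10 sq. miles; early agriculture supports
        # roughly 100x that population density.
        FORAGER_SQ_MILES_PER_PERSON = 10   # hunting and gathering
        AGRICULTURE_MULTIPLIER = 100       # "100 times or more"

        forager_density = 1 / FORAGER_SQ_MILES_PER_PERSON          # people/sq. mile
        farmer_density = forager_density * AGRICULTURE_MULTIPLIER  # people/sq. mile

        area_sq_miles = 1000  # hypothetical example territory

        print(f"Foragers supported: {forager_density * area_sq_miles:.0f}")
        print(f"Farmers supported:  {farmer_density * area_sq_miles:.0f}")
        # -> roughly 100 foragers vs. 10,000 farmers on the same land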

    In most respects, the changes in diet from hunter-gatherer times to

    agricultural times have been almost all detrimental, although there is

    some evidence we'll discuss later indicating that at least some genetic

    adaptation to the Neolithic has begun taking place in the approximately

    10,000 years since it began. With the much heavier reliance on starchy

    foods that became the staples of the diet, tooth decay, malnutrition, and

    rates of infectious disease increased dramatically over Paleolithic times,

    further exacerbated by crowding leading to even higher rates of

    communicable infections.

    Skeletal remains show that height decreased by four inches from the

    Late Paleolithic to the early Neolithic, brought about by poorer

    nutrition, and perhaps also by increased infectious disease causing growth

    stress, and possibly by some inbreeding in isolated communities.

    Signs of osteoporosis and anemia--the latter almost non-existent in

    pre-Neolithic times--have been frequently noted in skeletal pathologies

    observed in the Neolithic peoples of the Middle East. It is known that

    certain kinds of osteoporosis which have been found in these skeletal

    remains are caused by anemia, and although the causes have not yet been

    determined exactly, the primary suspect is reduced levels of iron thought

    to have been caused by the stress of infectious disease rather than

    dietary deficiency, although the latter remains a possibility.68

    So have Hygienists really overlooked all the evidence you've compiled

    in the above timeline? Are you serious?

    It was a puzzle to me when I first stumbled onto it myself. Why

    hadn't I been told about all this? I had thought in my readings in the

    Hygienic literature that when the writers referred to our "original diet"

    or our "natural diet," that must mean what I assumed they meant: that not

    only was it based on comparative anatomy, but also on what we actually ate

    during the time the species evolved. And further, that they were at least

    familiar with the scientific evidence even if they chose to keep things

    simple and not talk about it themselves. But when I did run across and

    chase down a scientific reference or two that prominent Hygienists had at

    long last bothered to mention, I found to my dismay they had distorted the

    actual evidence or left out crucial pieces.

    Could you name a name or two here and give an example so people will

    know the kind of thing you are talking about?

    Sure, as long as we do it with the understanding I am not attempting

    to vilify anybody, and that we all make mistakes. The most recent one I'm

    familiar with is Victoria Bidwell's citation (in her Health Seeker's

    Yearbook69) of a 1979 science report from the New York Times,70 where she

    summarizes anthropologist Alan Walker's microwear studies of fossil teeth

    in an attempt to show that humans were originally exclusively, only,

    fruit-eaters.

    Bidwell paraphrases the report she cited as saying that "humans were

    once exclusively fruit eaters, eaters of nothing but fruit." And also that,

    "Dr. Walker and other researchers are absolutely certain that our

    ancestors, up to a point in relatively recent history, were

    fruitarians/vegetarians."71 But a perusal of the actual article being

    cited reveals that: The diet was said to be "chiefly" fruit, which was the

    "staple," and the teeth studied were those of a "fruit-eater," but the

    article is not absolutist in the way Bidwell painted it.

    Fruit as defined by Walker in the article included tougher, less

    sugary foods, such as acacia tree pods (which laypeople like

    ourselves would be likely to classify as a "vegetable"-type food in

    common parlance). And although it was not clarified in the article,

    anyone familiar with or conscientious enough to look a little further

    into evolutionary studies of diet would have been aware that

    scientists generally use the terms "frugivore," "folivore,"

    "carnivore," "herbivore," etc., as categories comparing broad dietary

    trends, only very rarely as exclusivist terms, and among primates

    exclusivity in food is definitely not the norm.

    The primate/hominids in the study were australopithecus and homo

    habilis--among the earliest in the human line--hardly "relatively recent

    history" in this context.

    The studies were preliminary, and Walker was cautious, saying he

    didn't "want to make too much of this yet"--and his caution proved to be

    well-warranted. I believe there was enough research material available by

    the late 1980s (Health Seeker's Yearbook was published in 1990) that had

    checking been done, it would have been found that while he was largely

    right about australopithecine species being primarily frugivores (using a

    very broad definition of "fruit"), later research like what we outlined in

    our timeline above has shown australopithecus also included small amounts

    of flesh, seeds, and vegetable foods, and that all subsequent species

    beginning with homo habilis have included significant amounts of meat in

    their diet, even if the diet of habilis probably was still mostly fruit

    plus veggies. There is more that I could nitpick, but that's probably

    enough. I imagine Victoria was simply very excited to see scientific

    mention of frugivorism in the past, and just got carried away in her

    enthusiasm. There are at least one or two similar distortions by others in

    the vegetarian community that one could cite (Viktoras Kulvinskas' 1975

    book Survival into the 21st Century,72 for instance, contains inaccuracies

    about ape diet and "fruitarianism") so I don't want to pick on her too

    much because I would imagine we've all done that at times. It may be

    understandable when you are unfamiliar with the research, but it points

    out the need to be careful.

    Overall, then, what I have been left with--in the absence of any

    serious research into the evolutionary past by Hygienists--is the

    unavoidable conclusion that Hygienists simply assume it ought to be

    intuitively obvious that the original diet of humans was totally

    vegetarian and totally raw. (Hygienists often seem impatient with

    scientists who can't "see" this, and may creatively embellish their

    research to make a point. Research that is discovered by Hygienists

    sometimes seems to be used in highly selective fashion only as a

    convenient afterthought to justify conclusions that have already been

    assumed beforehand.) I too thought for years that it was obvious, before

    realizing science had already found otherwise.

    The argument made is very similar to the "comparative anatomy"

    argument: Look at the rest of the animals, and especially look at the ones

    we are most similar to, the apes. They are vegetarians [this is now known

    to be false for chimps and gorillas and almost all the other great

    apes--which is something we'll get to shortly], and none of them cook

    their food. Animals who eat meat have large canines, rough rasping

    tongues, sharp claws, and short digestive tracts to eliminate the poisons

    in the meat before it putrefies, etc.

    In other words, it is a view based on a philosophy of "naturalism,"

    but without really defining too closely what that naturalism is. The

    Hygienic view of naturalism, then, simplistically looks to the rest of the

    animal kingdom as its model for that naturalism by way of analogy. This is

    good as a device to get us to look at ourselves more objectively from

    "outside" ourselves, but when you take it too far, it completely ignores

    that we are unique in some ways, and you cannot simply assume it or figure

    it all out by way of analogy only. It can become reverse anthropomorphism.

    (Anthropomorphism is the psychological tendency to unconsciously make

    human behavior the standard for comparison, or to project human

    characteristics and motivations onto the things we observe. Reverse

    anthropomorphism in this case would be saying humans should take specific

    behaviors of other animals as our own model where food is concerned.)

    When you really get down to nuts and bolts about defining what you

    subjectively think is "natural," however, you find people don't so easily

    agree about all the particulars. The problem with the Hygienic definition

    of naturalism--what we could call "the animal model for humans"--is that

    it is mostly a subjective comparison. (And quite obviously so after you

    have had a chance to digest the evolutionary picture, like what I

    presented above. Those who maintain that the only "natural" food for us is

    that which we can catch or process with our bare hands are grossly in

    error by any realistic evolutionary definition of what is natural,

    since stone tools for obtaining animals and cutting the flesh have been

    with us almost 2 million years now.) Not that there isn't value in doing

    this, and not that there may not be large grains of truth to it, but since

    it is in large part subjectively behavioral, there is no real way to test

    it fairly (which is required for a theory to be scientific), which means

    you can never be sure elements of it may not be false. You either agree to

    it, or you don't--you either agree to the "animal analogy" for raw-food

    eating and vegetarianism, or you have reservations about it--but you are

    not offering scientific evidence.

    So my view became, why don't we just look into the evolutionary

    picture as the best way to go straight to the source and find out what

    humans "originally" ate? Why fool around philosophizing and theorizing

    about it when thanks to paleoanthropologists we can now just go back and

    look? If we really want to resolve the dispute of what is natural for

    human beings, what better way than to actually go back and look at what we

    actually did in prehistory before we supposedly became corrupted by reason

    to go against our instincts? Why aren't we even looking? Are we afraid of

    what we might see? These questions have driven much of my research into

    all this.

    If we are going to be true dietary naturalists--eat "food of our

    biological adaptation" as the phrase goes--then it is paramount that we

    have a functional or testable way of defining what we are biologically

    adapted to. This is something that evolutionary science easily and

    straightforwardly defines: What is "natural" is simply what we are adapted

    to by evolution, and a central axiom of evolution is that what we are

    adapted to is the behavior our species engaged in over a long enough

    period of evolutionary time for it to have become selected for in the

    species' collective gene pool. This puts the question of natural behavior

    on a squarely concrete basis. I wanted a better way to determine what

    natural behavior in terms of diet was for human beings that could be

    backed by science. This eliminates the dilemma of trying to determine what

    natural behavior is by resorting solely to subjective comparisons with

    other animals as Hygienists often do.

    You mentioned the "comparative anatomy" argument that Hygienists look

    to for justification instead of evolution. Let's look at that a little

    more. Are you saying it is fundamentally wrong?

    No, not as a general line of reasoning in saying that we are similar

    to apes so our diets should be similar. It's a good argument--as far as it

    goes. But for the logic to be valid in making inferences about the human

    diet based on ape diet, it must be based on accurate observations of the

    actual food intake of apes. Idealists such as we Hygienists don't often

    appreciate just how difficult it is to make these observations, and do it

    thoroughly enough to be able to claim you have really seen everything the

    apes are doing, or capable of doing. You have to go clear back to field

    observations in the 1960s and earlier to support the contention that apes

    are vegetarians. That doesn't wash nowadays with the far more detailed

    field observations and studies of the '70s, '80s, and '90s. Chimp and

    gorilla behavior is diverse, and it is difficult to observe and draw

    reliable conclusions without spending many months and/or years of

    observation. And as the studies of Jane Goodall and others since have

    repeatedly shown, the early studies were simply not extensive enough to be

    reliable.73

    Science is a process of repeated observation and progressively better

    approximations of the "real world," whatever that is. It is critical then,

    that we look at recent evidence, which has elaborated on, refined, and

    extended earlier work. When you see anybody--such as apologists for

    "comparative anatomy" vegetarian idealism (or in fact anybody doing this

    on any topic)--harking back to outdated science that has since been

    eclipsed in order to bolster their views, you should immediately suspect

    something.

    The main problem with the comparative anatomy argument, then--at

    least when used to support vegetarianism--is that scientists now know that

    apes are not vegetarians after all, as was once thought. The comparative

    anatomy argument actually argues for at least modest amounts of animal

    flesh in the diet, based on the now much-more-complete observations of

    chimpanzees, our closest animal relatives with whom we share somewhere

    around 98 to 98.6% of our genes.74 (We'll also look briefly at the diets

    of other apes, but the chimpanzee data will be focused on here since it

    has the most relevance for humans.)

    Though the chimp research is rarely oriented to the specific kinds of

    numerical percentage figures we Hygienists would want to see, from what I

    have seen it would probably be fair to estimate that most chimpanzee

    populations get somewhere in the neighborhood of 5% of their diet on

    average (as a baseline) to perhaps 8--10% (as a seasonal high) in the form

    of animal food--which in their case includes birds' eggs and insects in

    addition to flesh--particularly insects, which are much more heavily

    consumed than flesh is.75

    There is considerable variation across different chimp populations in

    flesh consumption, which also fluctuates up and down considerably within

    populations on a seasonal basis as well. (And behavior sometimes differs

    as well: Chimps in the Taï population, in 26 of 28 mammal kills, were

    observed to break open the bones with their teeth and use tools to extract

    the marrow for consumption,76 reminiscent of early homo habilis.) One

    population has been observed to eat as much as 4 oz. of flesh per day

    during the peak hunting season, dwindling to virtually nothing much of the

    rest of the time, but researchers note that when it is available, it is

    highly anticipated and prized.77 It's hard to say exactly, but a

    reasonable estimate might be that on average flesh may account for about

    1--3% of the chimp diet.78

    Now of course, meat consumption among chimps is what gets the

    headlines these days,79 but the bulk of chimpanzees' animal food

    consumption actually comes in the form of social insects80 (termites,

    ants, and bees), which constitute a much higher payoff for the labor

    invested to obtain them81 than does catching the colobus monkeys that

    are often the featured flesh item for chimps. However, insect consumption

    has often

    been virtually ignored82 since it constitutes a severe blind spot for the

    Western world due to our cultural aversions and biases about it. And by no

    means is insect consumption an isolated occurrence among just some chimp

    populations. With very few exceptions, termites and/or ants are eaten

    about half the days out of a year on average, and during peak seasons are

    an almost daily item, constituting a significant staple food in the diet

    (in terms of regularity), the remains of which show up in a minimum of

    approximately 25% of all chimpanzee stool samples.83

    Again, while chimp researchers normally don't classify food intake by

    the kinds of volume or caloric percentages that we Hygienists would

    prefer for comparison purposes (the rigors of observing

    these creatures in the wild make it difficult), what they do record is

    illustrative. A chart for the chimps of Lopé in Gabon classified by

    numbers of different species of food eaten (caveat: this does not equate

    to volume), shows the fruit species eaten comprising approx. 68% of the

    total range of species eaten in their diets, leaves 11%, seeds 7%, flowers

    2%, bark 1%, pith 2%, insects 6%, and mammals 2%.84
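    As a quick arithmetic check on that chart, the quoted categories account

    for essentially the whole range: they sum to 99%, with the remainder due

    to rounding. A minimal sketch using the approximate percentages given

    above (the dictionary keys just restate the categories from the text):

        # Approximate Lopé species-count breakdown quoted above (percent of
        # the total range of SPECIES eaten -- not percent of dietary volume).
        lope_species_pct = {
            "fruit": 68, "leaves": 11, "seeds": 7, "flowers": 2,
            "bark": 1, "pith": 2, "insects": 6, "mammals": 2,
        }
        print(sum(lope_species_pct.values()))  # -> 99; the rest is rounding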

    A breakdown by feeding time for the chimps of Gombe showed their

    intake of foods to be (very roughly) 60% of feeding time for fruit, 20%

    for leaves, with the other items in the diet varying greatly on a seasonal

    basis depending on availability. Seasonal highs could reach approximately

    17% of feeding time for blossoms, 22--30% for seeds, 10--17% for

    insects, 2--6% for meat, with other miscellaneous items coming in at

    perhaps 4% through most months of the year.85 Miscellaneous items eaten by

    chimps include a few eggs,86 plus the rare honey that chimps are known to

    rob from beehives (as well as the embedded bees themselves), which is

    perhaps the most highly prized single item in their diet,87 but which they

    are limited from eating much of by circumstances. Soil is also

    occasionally eaten--presumably for the mineral content according to

    researchers.88

    For those who suppose that drinking is unnatural and that we should

    be able to get all the fluid we need from "high-water-content" foods, I

    have some more unfortunate news: chimps drink water too. Even the largely

    frugivorous chimp may stop 2--3 times per day during the dry season to

    stoop and drink water directly from a stream (but perhaps not at all on

    some days during the wet season), or from hollows in trees, using a leaf

    sponge if the water cannot be reached with their lips.89 (Or maybe that

    should be good news: If you've been feeling guilty or substandard for

    having to drink water in the summer months, you can now rest easy knowing

    your chimp brothers and sisters are no different!)

    An important observation that cannot be overlooked is the

    wide-ranging omnivorousness and the predilection for tremendous variety in

    chimpanzees' diet, which can include up to 184 species of foods, 40--60 of

    which may comprise the diet in any given month; one calculated average was

    13 different foods per day.90 Thus, even given the largely

    frugivorous component of their diets, it would be erroneous to infer (as

    many Hygienists may prefer to believe) that the 5% to possibly 8%

    or so of their diet that is animal foods (not to mention other foods) is

    insignificant, or could be thrown out or disregarded without

    consequence--the extreme variety in their diet being one of its defining

    features.

    Over millions of years of evolution, the wheels grind exceedingly fine.

  18. Caffeine is fine on a low carb diet. It's actually beneficial if taken

    in modest amounts. (it causes the release of free fatty acids which

    are then used for energy..)

    :cool: TJ :cool:

    So what's the deal with decaf only while on Atkins?

  19. Have a cup of coffee or some tea before getting active (while

    low-carb) and you'll lose even more.

    Caffeine causes the release of Free Fatty Acids (ffa) which will

    give you more energy and allow you to get rid of more adipose.

    :cool: TJ :cool:

  20. True story:

    When I'm very strict low-carb/paleo I usually sleep for

    4 hours, wake up for a while, then sleep another 4 hours each night.

    A few years back I discovered some paleo research that showed

    that hunter-gatherer humans had the same sleep patterns.

    IE: You eat a healthy paleo-low carb human diet -- you start

    behaving like a paleo-low carb human (and gain all of the health

    benefits too.)

    :cool: TJ :cool:
