An Essay by Jack Challem
Copyright © 1997 by Jack Challem, The Nutrition Reporter.
All rights reserved.
"Diet cures more than doctors."
From Proverbial Folklore, 1875
Through the ages, our views and vision of nutrition have too often been limited by what we knew, rather than by the limitless bounds of possibility and imagination. At nearly every milestone in vitamin research, scientists have assumed they've reached the pinnacle of knowledge, only to later realize that new discoveries force a revision of older beliefs. The lesson? Never close the door on the future.
These thoughts assume the importance of nutrition in health, an idea that in itself is not original. What is new is the recognition of just how fundamentally important nutrition really is in maintaining your health and vigor.
Our brief journey begins 3500 years ago, when the ancient Egyptians recognized that night blindness (caused by a lack of vitamin A) could be treated with specific foods. Despite this early wisdom, the role of nutrition was largely forgotten over the centuries. At different times and in different cultures, most diseases have been attributed to angry gods, witchcraft, spontaneous generation, bad air, bad humors (body fluids), or simple fate.
Today, thanks to the work of molecular biologists, it's clear that vitamins, minerals, amino acids, and other nutrients do their magic at the most basic levels of the body. We are, in many respects, what we eat.
One of the major milestones in nutritional medicine occurred in 1747, when the Scottish naval surgeon James Lind discovered that an unknown nutrient (vitamin C) in citrus foods prevented scurvy, a hemorrhagic disease characterized by spontaneous bleeding, loose teeth, pain, and lack of energy. It was a common and deadly disease - more British sailors were lost to scurvy than to war. In 1753, Lind published his A Treatise of the Scurvy. But his findings were largely ignored for another 40 years, during which time 100,000 British sailors died from scurvy.
In the 1860s, after Louis Pasteur demonstrated that many diseases were caused by microscopic organisms, medicine was swept off its feet by the idea. Indeed, infection was the leading cause of death until the early part of the 20th century. Even in the early 1900s, beri-beri and pellagra were still considered infectious diseases, not nutritional deficiencies.
This view was reflected in the authoritative Diet in Health and Disease, published in 1905. In the book, Julius Friedenwald, M.D. and John Ruhräh, M.D., wrote that beri-beri "is probably of microbic origin...Diet probably acts only as a predisposing factor, improper food tending to lower the general health of the individual."
The first major shift in the perception of nutrition resulted from the discoveries of Casimir Funk, a Polish chemist working at the Lister Institute in London. In 1911, he discovered what he termed "vitamines," later called simply vitamins. It quickly became clear that small amounts of vitamins could cure deadly diseases like beri-beri, xerophthalmia (a type of blindness), scurvy, and pellagra. This was a major medical advance because severe nutritional deficiencies were common and deadly afflictions.
Still, controversy surrounded the early days of vitamins. In his 1919 book, Eat and Be Healthy, Virgil MacMickle, M.D., of Portland, Ore., recognized how crucial nutrition was for health. He wrote that the "chemical substances of which the body is composed are very similar to those of the foods which nourish it. They are made up of the same chemical elements....the body can only get the materials from which it is made in the first place from foods..."
Reflecting the views of many medical men, however, MacMickle remained skeptical about vitamins, believing that the recently discovered vitamins A and B were the beginning and end of the subject - and even that the term vitamin would be dropped. MacMickle slammed the door on the future of discovery.
By the 1920s and 1930s, it became clear that small amounts of vitamins easily cured severe deficiency diseases. But lacking imagination, most researchers and physicians believed vitamins had little other value. It was a faulty belief, based on the idea that scurvy and pellagra were the first signs of vitamin deficiency, not the last signs before death. Unfortunately, it's a belief that many doctors still hold dear.
Through the Looking Glass
Nobel laureate Albert Szent-Györgyi, M.D., Ph.D., who had discovered vitamin C and the flavonoids, may have been the first scientist who attempted to raise the vitamin consciousness of his colleagues. In 1939, he gave a series of guest lectures at Vanderbilt University Medical School in Nashville, Tenn. Szent-Györgyi said it was impossible to predict how many more vitamins would be discovered, partly because so many were still being identified. Of greater interest to Szent-Györgyi was the difference between the "minimum daily doses" of vitamins needed to prevent deficiency diseases and optimal doses, which he referred to in Latin as the dosis optima quotidiana.
By the mid-1940s, whether or not he was aware of Szent-Györgyi's thinking, Evan V. Shute, M.D., of Canada had begun putting such ideas into practice. Shute and his colleagues were using large, optimal doses of vitamin E to treat patients with a variety of cardiovascular diseases. Around the same time, Frederick R. Klenner, M.D., of Reidsville, N.C., began to successfully treat a variety of viral diseases (including polio) with large doses of vitamin C. In 1952, Abram Hoffer, M.D., Ph.D., started treating schizophrenics with vitamin C and B3. This was the start of a new paradigm, a new way of thinking about vitamins. They could be used to treat something besides classical deficiency diseases.
Learning from History
From the discovery of vitamins in 1911 through the 1950s, nearly all doctors based their diagnoses of vitamin deficiencies on readily observable symptoms, such as the hemorrhaging of scurvy or the paralysis of beri-beri. During the 1940s and 1950s, a number of researchers began to lay the foundation for a new way of looking at vitamins.
Deoxyribonucleic acid (DNA) - which contains the biological blueprint of your body - was identified as the carrier of genetic information in 1944. A year later, Linus Pauling, Ph.D., developed the concept of "molecular disease" and laid much of the basis for modern molecular biology. It is on the level of molecules, within the 60 trillion cells of your body, that your body actually ages, becomes dysfunctional, and develops diseases.
Another milestone occurred late in 1954, when Denham Harman, M.D., Ph.D., conceived the free radical theory of aging - an idea that, 43 years later, medicine has all but embraced. Harman's idea was simple and elegant: free radicals - molecules with an unpaired electron - react with and damage DNA and other cell components. That, he proposed, is how aging begins, but it is not an inviolable process. Harman realized then that antioxidant nutrients, such as vitamins C and E, could neutralize free radicals.
Two years later, Roger Williams, Ph.D., published his concept of biochemical - and nutritional - individuality. Based on extensive anatomical, genetic, and biochemical data, Williams argued that people varied widely in their individual nutritional requirements. He saw the Minimum Daily Requirements (MDRs) and Recommended Dietary Allowances (RDAs) as attempts to create statistical norms of intake when no norms were really possible. The only reasonable approach, Williams explained, was to strive for optimal nutrition.
Pauling absorbed these lines of thinking and, in 1968, described the theoretical foundation for nutritional medicine. Writing in the prestigious journal Science (160:265-71), Pauling put molecular biology into understandable terms with relevance to the average person. In his concept of "orthomolecular" medicine, Pauling recommended that people use substances (nutrients) normal to biochemistry to straighten out the body's molecules. Physicians criticized Pauling for stepping out of his field, but molecular medicine had been his field since the 1940s. It was the doctors who weren't keeping up with advances in their own field.
Nutrition at a Crossroads
Today, nutritional medicine finds itself in a peculiar position. The research is unequivocal that free radicals damage DNA and that this deterioration leads to aging and many diseases, including cancer, heart disease, arthritis, and Alzheimer's. We know that antioxidant vitamins protect against DNA damage and that B vitamins and amino acids are essential for DNA repair and synthesis.
This remarkable research has come in large part from the work of cell and molecular biologists, people like Bruce Ames, Ph.D., and Lester Packer, Ph.D., at the University of California, Berkeley. Yet most conventional physicians believe almost religiously in a series of homilies: that people get all the nutrition they need in a balanced diet (really, an unproven hypothesis), that vitamins are needed in only small quantities (an idea unsupported by research), and that vitamin deficiencies reveal themselves in the symptoms of scurvy and other obvious deficiency diseases (a narrow view). They act as though molecular biology simply doesn't exist.
The irony of all this comes in the guise of high-tech genetic research and the weekly newspaper headlines that doctors have discovered genes for obesity, diabetes, heart disease, Alzheimer's disease, and dozens of other conditions. In a sense, these doctors are the charlatans who victimize people by offering the distant hope that some day scientists will fabricate replacement genes to prevent and cure these diseases. But if you stop to look at your genes, which are made of DNA, and what your DNA is made of, you eventually come back to vitamins and other nutrients as biological building blocks. The low-tech solution, eating well and taking some vitamins, is the cheaper and better solution.
A Future Path
So what has the history of vitamins taught us?
It pays to avoid the tunnel vision that has often dismissed the value of vitamins. A hundred years ago, no one knew what vitamins were. Now we do. Fifty years ago, the Shutes were castigated for using vitamin E to treat heart disease. Today, most cardiologists are astonished at its effectiveness. Ten years ago, everyone thought beta-carotene was the cat's meow, and no one thought about how other carotenoids might benefit health. Since then, we have learned that alpha-carotene, lycopene, and lutein are also important nutrients. We can only measure what we can measure; we can only know what we know at any given time.
When the first vitamins were discovered, no one had any concept of DNA or the genetic code. Today, we have a respectable but still incomplete understanding of the important nutrients in foods, and smart researchers are less inclined to dismiss nutrients they don't yet know much about. As for physicians, Pauling once told me, "If a doctor isn't up on something, he's down on it." These were profound words from a wise man. To recognize the potential of nutritional medicine, you have to stay "up" on the research and sometimes use a little imagination.
The future of nutritional medicine holds wondrous possibilities. The vast amount of research and the clinical experiences of many able researchers and physicians demonstrate this. Our view and vision of nutrition may be limited by what we know today, but we should leave the door open for what we have not yet discovered.
This article originally appeared in Let's Live magazine. The information provided by Jack Challem and The Nutrition Reporter newsletter is strictly educational and not intended as medical advice. For diagnosis and treatment, consult your physician.