Oltipraz is a drug that was originally used to treat intestinal worms. It was later found to prevent a broad variety of cancers (1). This was attributed to its ability to upregulate cellular detoxification and repair mechanisms.
Researchers eventually discovered that oltipraz acts by activating Nrf2, the same transcription factor activated by ionizing radiation and polyphenols (2, 3, 4). Nrf2 activation mounts a broad cellular protective response that appears to reduce the risk of multiple health problems.
A recent paper in Diabetologia illustrates this (5). Investigators put mice on a long-term refined high-fat diet, with or without oltipraz. These carefully crafted diets are very unhealthy indeed, and when fed to rodents they rapidly induce fat gain and something that looks similar to human metabolic syndrome (insulin resistance, abdominal adiposity, blood lipid disturbances). Adding oltipraz to the diet prevented the fat gain, insulin resistance and inflammatory changes that occurred in the refined high-fat diet group.
The difference in fasting insulin was remarkable. The mice taking oltipraz had 1/7 the fasting insulin of the refined high-fat diet comparison group, and 1/3 the fasting insulin of the low-fat comparison group! Yet their glucose tolerance was normal, indicating that they were not low on insulin due to pancreatic damage. The low-fat diet they used in this study was also refined, which is why the two control groups (high-fat and low-fat) didn't diverge more in body fatness and other parameters. If they had used a group fed unrefined rodent chow as the comparator, the differences between groups would have been larger.
This shows that in addition to preventing cancer, Nrf2 activation can attenuate the metabolic damage caused by an unhealthy diet in rodents. Oltipraz illustrates the power of the cellular hormesis response. We can exploit this pathway naturally using polyphenols and other chemicals found in whole plant foods.
Tuesday, March 1, 2011
Thursday, February 24, 2011
Polyphenols, Hormesis and Disease: Part II
In the last post, I explained that the body treats polyphenols as potentially harmful foreign chemicals, or "xenobiotics". How can we reconcile this with the growing evidence that at least a subset of polyphenols have health benefits?
Clues from Ionizing Radiation
One of the more curious things that has been reported in the scientific literature is that although high-dose ionizing radiation (such as X-rays) is clearly harmful, leading to cancer, premature aging and other problems, under some conditions low-dose ionizing radiation can actually decrease cancer risk and increase resistance to other stressors (1, 2, 3, 4, 5). It does so by triggering a protective cellular response, increasing cellular defenses out of proportion to the minor threat posed by the radiation itself. The ability of mild stressors to increase stress resistance is called "hormesis." Exercise is a common example. I've written about this phenomenon in the past (6).
The Case of Resveratrol
Resveratrol is perhaps the most widely known polyphenol, available in supplement stores nationwide. It's seen a lot of hype, being hailed as a "calorie restriction mimetic" and the reason for the "French paradox."* But there is quite a large body of evidence suggesting that resveratrol functions in the same manner as low-dose ionizing radiation and other bioactive polyphenols: by acting as a mild toxin that triggers a hormetic response (7). Just as in the case of radiation, high doses of resveratrol are harmful rather than helpful. This has obvious implications for the supplementation of resveratrol and other polyphenols. A recent review article on polyphenols stated that while dietary polyphenols may be protective, "high-dose fortified foods or dietary supplements are of unproven efficacy and possibly harmful" (8).
The Cellular Response to Oxidants
Although it may not be obvious, radiation and polyphenols activate cellular responses that are similar in many ways. Both activate the transcription factor Nrf2, which switches on genes involved in chemical detoxification and antioxidant defense** (9, 10, 11, 12). This is thought to occur because polyphenols, just like radiation, may temporarily increase the level of oxidative stress inside cells. Here's a quote from the polyphenol review article cited above (13):
We have found that [polyphenols] are potentially far more than 'just antioxidants', but that they are probably insignificant players as 'conventional' antioxidants. They appear, under most circumstances, to be just the opposite, i.e. prooxidants, that nevertheless appear to contribute strongly to protection from oxidative stress by inducing cellular endogenous enzymic protective mechanisms. They appear to be able to regulate not only antioxidant gene transcription but also numerous aspects of intracellular signaling cascades involved in the regulation of cell growth, inflammation and many other processes.

It's worth noting that this is essentially the opposite of what you'll hear on the evening news, that polyphenols are direct antioxidants. The scientific cutting edge has largely discarded that hypothesis, but the mainstream has not yet caught on.
Nrf2 is one of the main pathways by which polyphenols increase stress resistance and antioxidant defenses, including the key cellular antioxidant glutathione (14). Nrf2 activity is correlated with longevity across species (15). Inducing Nrf2 activity via polyphenols or by other means substantially reduces the risk of common lifestyle disorders in animal models, including cardiovascular disease, diabetes and cancer (16, 17, 18), although Nrf2 isn't necessarily the only mechanism. The human evidence is broadly consistent with the studies in animals, although not as well developed.
One of the most interesting effects of hormesis is that exposure to one stressor can increase resistance to other stressors. For example, long-term consumption of high-polyphenol chocolate increases sunburn resistance in humans, implying that it induces a hormetic response in skin (19). Polyphenol-rich foods such as green tea reduce sunburn and skin cancer development in animals (20, 21).
Chris Masterjohn first introduced me to Nrf2 and the idea that polyphenols act through hormesis. Chris studies the effects of green tea on health, which seem to be mediated by polyphenols.
A Second Mechanism
There is a place in the body where polyphenols are concentrated enough to be direct antioxidants: in the digestive tract after consuming polyphenol-rich foods. Digestion is a chemically harsh process that readily oxidizes ingested substances such as polyunsaturated fats (22). Oxidized fat is neither healthy when it's formed in the deep fryer, nor when it's formed in the digestive tract (23, 24). Eating polyphenol-rich foods effectively prevents these fats from being oxidized during digestion (25). One consequence of this appears to be better absorption and assimilation of the exceptionally fragile omega-3 polyunsaturated fatty acids (26).
What does it all Mean?
I think that overall, the evidence suggests that polyphenol-rich foods are healthy in moderation, and eating them on a regular basis is generally a good idea. Certain other plant chemicals, such as sulforaphane, found in cruciferous vegetables, and allicin, found in garlic, exhibit similar effects and may also act by hormesis (27). Some of the best-studied polyphenol-rich foods are tea (particularly green tea), blueberries, extra-virgin olive oil, red wine, citrus fruits, hibiscus tea, soy, dark chocolate, coffee, turmeric and other herbs and spices, and a number of traditional medicinal herbs. A good rule of thumb is to "eat the rainbow", choosing foods with a variety of colors.
Supplementing with polyphenols and other plant chemicals in amounts that would not be achievable by eating food is probably not a good idea.
* The "paradox" whereby the French eat a diet rich in saturated fat, yet have a low heart attack risk compared to other affluent Western nations.
** Genes containing an antioxidant response element (ARE) in the promoter region. ARE is also sometimes called the electrophile response element (EpRE).
Thursday, June 10, 2010
Nitrate: a Protective Factor in Leafy Greens
Cancer Link and Food Sources
Nitrate (NO3) is a molecule that has received a lot of bad press over the years. It was initially thought to promote digestive cancers, in part due to its ability to form carcinogens in the digestive tract. As it's used as a preservative in processed meats, and there is a link between processed meats and gastric cancer (1), nitrate was viewed with suspicion and a number of countries imposed strict limits on its use as a food additive.
But what if I told you that by far the greatest source of nitrate in the modern diet isn't processed meat-- but vegetables, particularly leafy greens (2)? And that the evidence specifically linking nitrate consumption to gastric cancer has largely failed to materialize? For example, one study found no difference in the incidence of gastric cancer between nitrate fertilizer plant workers and the general population (3). Most other studies in animals and humans have not supported the hypothesis that nitrate itself is carcinogenic (4, 5, 6). This, combined with recent findings on nitrate biology, has the experts singing a different tune in the last few years.
A New Example of Human Symbiosis
In 2003, Dr. K. Cosby and colleagues showed that nitrite (NO2; not the same as nitrate) dilates blood vessels in humans when infused into the blood (7). Investigators subsequently uncovered an amazing new example of human-bacteria symbiosis: dietary nitrate (NO3) is absorbed from the gut into the bloodstream and picked up by the salivary glands. It's then secreted into saliva, where oral bacteria use it as an energy source, converting it to nitrite (NO2). After swallowing, the nitrite is reabsorbed into the bloodstream (8). Humans and oral bacteria may have co-evolved to take advantage of this process. Antibacterial mouthwash prevents it.
Nitrate Protects the Cardiovascular System
In 2008, Dr. Andrew J. Webb and colleagues showed that nitrate in the form of 1/2 liter of beet juice (equivalent in volume to about 1.5 soda cans) substantially lowers blood pressure in healthy volunteers for over 24 hours. It also preserved blood vessel performance after brief oxygen deprivation, and reduced the tendency of the blood to clot (9). These are all changes that one would expect to protect against cardiovascular disease. Another group showed that in monkeys, the ability of nitrite to lower blood pressure did not diminish after two weeks, showing that the animals did not develop a tolerance to it on this timescale (10).
Subsequent studies showed that dietary nitrite reduces blood vessel dysfunction and inflammation (CRP) in cholesterol-fed mice (11). Low doses of nitrite also dramatically reduce tissue death in the hearts of mice exposed to conditions mimicking a heart attack, as well as protecting other tissues against oxygen deprivation damage (12). The doses used in this study were the equivalent of a human eating a large serving (100 g; roughly 1/4 lb) of lettuce or spinach.
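The serving equivalences quoted above are easy to sanity-check. Here's a minimal sketch (assuming a 355 mL US soda can and 453.6 g per pound; the dose figures themselves come from the cited studies):

```python
# Rough unit conversions behind the serving comparisons above.
# Assumed constants: 355 mL per US soda can, 453.6 g per pound.
SODA_CAN_ML = 355.0
GRAMS_PER_POUND = 453.6

beet_juice_ml = 500.0                 # 1/2 liter of beet juice
cans = beet_juice_ml / SODA_CAN_ML    # about 1.4 soda cans' worth

greens_g = 100.0                      # large serving of lettuce or spinach
pounds = greens_g / GRAMS_PER_POUND   # about 0.22 lb, i.e. roughly 1/4 lb

print(f"{cans:.1f} cans, {pounds:.2f} lb")
```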
Mechanism
Nitrite is thought to protect the cardiovascular system by serving as a precursor for nitric oxide (NO), one of the most potent anti-inflammatory and blood vessel-dilating compounds in the body (13). A decrease in blood vessel nitric oxide is probably one of the mechanisms of diet-induced atherosclerosis and increased clotting tendency, and it is likely an early consequence of eating a poor diet (14).
The Long View
Leafy greens were one of the "protective foods" emphasized by the nutrition giant Sir Edward Mellanby (15), along with eggs and high-quality full-fat dairy. There are many reasons to believe greens are an excellent contribution to the human diet, and what researchers have recently learned about nitrate biology certainly reinforces that notion. Leafy greens may be particularly useful for the prevention and reversal of cardiovascular disease, but are likely to have positive effects on other organ systems both in health and disease. It's ironic that a molecule suspected to be the harmful factor in processed meats is turning out to be one of the major protective factors in vegetables.
Saturday, November 1, 2008
Book Review: Dangerous Grains
Dangerous Grains is about the health hazards of gluten grains. It's co-written by James Braly, an M.D. who specializes in food allergies, and Ron Hoggan, a celiac patient who has written widely on the subject.
Celiac disease is a degeneration of the intestinal lining caused by exposure to gluten. Gluten sensitivity is a broader term that encompasses any of the numerous symptoms that can occur throughout the body when susceptible people eat gluten. The term gluten sensitivity includes celiac disease. Gluten is a protein found in wheat, its close relatives (kamut, spelt, triticale), barley and rye. Wheat is the most concentrated source.
Dangerous Grains is a good overview of the mountain of data on celiac disease and gluten sensitivity that few people outside the field are familiar with. For example, did you know:
- An estimated one percent of the U.S. population suffers from celiac disease.
- Approximately 12 percent of the US population suffers from gluten sensitivity.
- Gluten can damage nearly any part of the body, including the brain, the digestive tract, the skin and the pancreas. Sometimes gastrointestinal symptoms are absent.
- Both celiac and other forms of gluten sensitivity increase the risk of a large number of diseases, such as type 1 diabetes and cancer, often dramatically.
- The majority of people with gluten sensitivity are not diagnosed.
- Most doctors don't realize how common gluten sensitivity is, so they rarely test for it.
- Celiac disease and other symptoms of gluten sensitivity are easily reversed by avoiding gluten.
Dangerous Grains also discusses the opioid-like peptides released from gluten during digestion. Opioids are powerful drugs, such as heroin and morphine, that were originally derived from the poppy seed pod. They are strong suppressors of the immune system and quite addictive. There are no data that conclusively prove the opioid-like peptides in gluten cause immune suppression or addiction to wheat, but there are some interesting coincidences and anecdotes. Celiac patients are at an increased risk of cancer, particularly digestive tract cancer, which suggests that the immune system is compromised. Heroin addicts are also at increased risk of cancer. Furthermore, celiac patients often suffer from abnormal food cravings. From my reading, I believe that wheat causes excessive eating, perhaps through a drug-like mechanism, and many people report withdrawal-like symptoms and cravings after eliminating wheat.
I know several people who have benefited greatly from removing gluten from their diets. Anyone who has digestive problems, from gas to acid reflux, or any other mysterious health problem, owes it to themselves to try a gluten-free diet for a month. Gluten consumption has increased quite a bit in the U.S. in the last 30 years, mostly due to an increase in the consumption of processed wheat snacks. I believe it's partly to blame for our declining health. Wheat has more gluten than any other grain. Avoiding wheat and all its derivatives is a keystone of my health philosophy.
Another notable change that Sally Fallon and others have pointed out is that today's bread isn't made the same way our grandparents made it. Quick-rise yeast allows bread to be fermented for as little as 3 hours, whereas it was formerly fermented for 8 hours or more. The longer fermentation allowed the gluten to be partially broken down by the microorganisms in the dough. Some gluten-sensitive people report that they can eat well-fermented sourdough wheat bread without symptoms. I think these ideas are plausible, but they remain anecdotes to me at this point. Until research shows that gluten-sensitive people can do well eating sourdough wheat bread in the long term, I'll be avoiding it. I have no reason to believe I'm gluten sensitive myself, but through my reading I've been convinced that wheat, at least as we eat it today, is probably not healthy for anyone.
I'm not aware of any truly healthy traditional culture that eats wheat as a staple. As a matter of fact, white wheat flour has left a trail of destruction around the globe wherever it has gone. Polished rice does not have such a destructive effect, so it's not simply the fact that it's a refined carbohydrate. Hundreds, if not thousands of cultures throughout the world have lost their robust good health upon abandoning their traditional foods in favor of white flour and sugar. The medical and anthropological literature are peppered with these stories. I'm aware of one healthy culture that traditionally ate sourdough-fermented whole grain rye bread, the Swiss villagers of the Loetschental valley described in Nutrition and Physical Degeneration.
Overall, the book is well written and accessible to a broad audience. I recommend it to anyone who has health problems or who is healthy and wants to stay that way!
Saturday, July 26, 2008
The Inuit: Lessons from the Arctic
The Inuit (also called Eskimo) are a group of hunter-gatherer cultures who inhabit the arctic regions of Alaska, Canada and Greenland. They are a true testament to the toughness, adaptability and ingenuity of the human species. Their unique lifestyle has a lot of information to offer us about the boundaries of the human ecological niche. Weston Price was fascinated by their excellent teeth, good nature and overall robust health. Here's an excerpt from Nutrition and Physical Degeneration:
"In his primitive state he has provided an example of physical excellence and dental perfection such as has seldom been excelled by any race in the past or present...we are also deeply concerned to know the formula of his nutrition in order that we may learn from it the secrets that will not only aid in the unfortunate modern or so-called civilized races, but will also, if possible, provide means for assisting in their preservation."

Overall, the unique lifestyle and diet of the Inuit have a lot to teach us. First, that humans are capable of being healthy as carnivores. Second, that we are able to thrive on a high-fat diet. Third, that we are capable of living well in extremely harsh and diverse environments. Fourth, that the shift from natural foods to processed foods, rather than changes in macronutrient composition, is the true cause of the diseases of civilization.
The Inuit are cold-hardy hunters whose traditional diet consists of a variety of sea mammals, fish, land mammals and birds. They invented some very sophisticated tools, including the kayak, whose basic design has remained essentially unchanged to this day. Most groups ate virtually no plant food. Their calories came primarily from fat, up to 75%, with almost no calories coming from carbohydrate. Children were breast-fed for about three years, and had solid food in their diet almost from birth. As with most hunter-gatherer groups, they were free from chronic disease while living a traditional lifestyle, even in old age. Here's a quote from Observations on the Western Eskimo and the Country they Inhabit; from Notes taken During two Years [1852-54] at Point Barrow, by Dr. John Simpson:
These people [the Inuit] are robust, muscular and active, inclining rather to spareness [leanness] than corpulence [overweight], presenting a markedly healthy appearance. The expression of the countenance is one of habitual good humor. The physical constitution of both sexes is strong. Extreme longevity is probably not unknown among them; but as they take no heed to number the years as they pass they can form no guess of their own ages.
One of the common counterpoints I hear to the idea that high-fat hunter-gatherer diets are healthy is that exercise protects hunter-gatherers from the ravages of fat. The Inuit can help us get to the bottom of this debate. Here's a quote from Cancer, Disease of Civilization (1960, Vilhjalmur Stefansson):
"They are large eaters, some of them, especially the women, eating all the time..." ...during the winter the Barrow women stirred around very little, did little heavy work, and yet "inclined more to be sparse than corpulent" [quotes are the anthropologist Dr. John Murdoch, reproduced by Stefansson].
Another argument I sometimes hear is that the Inuit are genetically adapted to their high-fat diet, and that the same food would kill a European. This appears not to be the case. The anthropologist and arctic explorer Vilhjalmur Stefansson spent several years living with the Inuit in the early 20th century. He and his fellow Europeans and Americans thrived on the Inuit diet. American doctors were so incredulous that they challenged him and a fellow explorer to live on a diet of fatty meat only for one year, under the supervision of the American Medical Association. To the doctors' dismay, the two remained healthy, showing no signs of scurvy or any other deficiency (JAMA 1929;93:20–2).
Yet another amazing thing about the Inuit was their social structure. Here's Dr. John Murdoch again (quoted from Cancer, Disease of Civilization):
The women appear to stand on a footing of perfect equality with the men, both in the family and the community. The wife is the constant and trusted companion of the man in everything except the hunt, and her opinion is sought in every bargain or other important undertaking... The affection of parents for their children is extreme, and the children seem to be thoroughly worthy of it. They show hardly a trace of fretfulness or petulance so common among civilized children, and though indulged to an extreme extent are remarkably obedient. Corporal punishment appears to be absolutely unknown, and children are rarely chided or punished in any way.
Unfortunately, those days are long gone. Since adopting a modern processed-food diet, the health and social structure of the Inuit has deteriorated dramatically. This had already happened to most groups by Weston Price's time, and is virtually complete today. Here's Price:
In the various groups in the lower Kuskokwim seventy-two individuals who were living exclusively on native foods had in their 2,138 teeth only two teeth or 0.09 per cent that had ever been attacked by tooth decay. In this district eighty-one individuals were studied who had been living in part or in considerable part on modern foods, and of their 2,254 teeth 394 or 13 per cent had been attacked by dental caries. This represents an increase in dental caries of 144 fold.... When these adult Eskimos exchange their foods for our modern foods..., they often have very extensive tooth decay and suffer severely.... Their plight often becomes tragic since there are no dentists in these districts.
Modern Inuit also suffer from very high rates of diabetes and overweight. This has been linked to changes in diet, particularly the use of white flour, sugar and processed oils.
Overall, the unique lifestyle and diet of the Inuit have a lot to teach us. First, that humans are capable of being healthy as carnivores. Second, that we are able to thrive on a high-fat diet. Third, that we are capable of living well in extremely harsh and diverse environments. Fourth, that the shift from natural foods to processed foods, rather than changes in macronutrient composition, is the true cause of the diseases of civilization.
Wednesday, July 16, 2008
Sunscreen and Melanoma
Melanoma is the most deadly type of skin cancer, accounting for most skin cancer deaths in the US. As Ross pointed out in the comments section of the last post, there is an association between severe sunburn at a young age and later development of melanoma. Darker-skinned people are also more resistant to melanoma. The association isn't complete, however, since melanoma sometimes occurs on the soles of the feet and even in the intestine. This may be due to the fact that there are several types of melanoma, potentially with different causes.
Another thing that associates with melanoma is the use of sunscreen above a latitude of 40 degrees from the equator. In the Northern hemisphere, 40 degrees draws a line between New York City and Beijing. A recent meta-analysis consistently found that sunscreen users above 40 degrees are at a higher risk of melanoma than people who don't use sunscreen, even when differences in skin color are taken into account. Wearing sunscreen decreased melanoma risk in studies closer to the equator. It sounds confusing, but it makes sense once you know a little bit more about UV rays, sunscreen and the biology of melanoma.
The UV light that reaches the Earth's surface is composed of UVA (longer) and UVB (shorter) wavelengths. UVB causes sunburn, while they both cause tanning. Sunscreen blocks UVB, preventing burns, but most brands only weakly block UVA. Sunscreen allows a person to spend more time in the sun than they would otherwise, and attenuates tanning. Tanning is a protective response (among several) by the skin that protects it against both UVA and UVB. Burning is a protective response that tells you to get out of the sun. The result of diminishing both is that sunblock tends to increase a person's exposure to UVA rays.
It turns out that UVA rays are more closely associated with melanoma than UVB rays, and typical sunscreen fails to prevent melanoma in laboratory animals. It's also worth mentioning that sunscreen does prevent more common (and less lethal) types of skin cancer.
Modern tanning beds produce a lot of UVA and not much UVB, in an attempt to deliver the maximum tan without causing a burn. Putting on sunscreen essentially does the same thing: gives you a large dose of UVA without much UVB.
The authors of the meta-analysis suggest an explanation for the fact that the association changes at 40 degrees of latitude: populations further from the equator tend to have lighter skin. Melanin blocks UVA very effectively, and the pre-tan melanin of someone with olive skin is enough to block most of the UVA that sunscreen lets through. The fair-skinned among us don't have that luxury, so our melanocytes get bombarded by UVA, leading to melanoma. This may explain the incredible rise in melanoma incidence in the US in the last 35 years, as people have also increased the use of sunscreen. It may also have to do with tanning beds, since melanoma incidence has risen particularly in women.
In my opinion, the best way to treat your skin is to tan gradually, without burning. Use clothing and a wide-brimmed hat if you think you'll be in the sun past your burn threshold. If you want to use sunscreen, make sure it blocks UVA effectively. Don't rely on the manufacturer's word; look at the ingredients list. It should contain at least one of the following: titanium dioxide, zinc oxide, avobenzone (Parsol 1789), Mexoryl SX (ecamsule) or Tinosorb. It's best if it's also paraben-free.
Fortunately, as an external cancer, melanoma is easy to diagnose. If caught early, it can be removed without any trouble. If caught a bit later, surgeons may have to remove lymph nodes, which makes your face look like John McCain's. Later than that and you're probably a goner. If you have any questions about a growth, especially one with irregular borders that's getting larger, ask your doctor about it immediately!
Monday, July 14, 2008
How to Cause a Cancer Epidemic
A report came out recently showing that melanoma incidence has increased dramatically in the US since 1973, particularly among women. The authors suggested the rise could be due to increasing sun exposure, which I am highly skeptical of. The data they cite to support that idea are quite weak. I think the prevalence of vitamin D deficiency in this country suggests otherwise.
Melanoma is the most deadly form of skin cancer, and the only type that is commonly life-threatening. Its link to sun exposure is tenuous at best. For example, it often occurs on the least sun-exposed parts of the body, and its incidence is lower in outdoor workers.
What is the solution to rising melanoma incidence? Sunblock! Slather it on, ladies and gentlemen! No matter that we evolved outdoors! No matter that it may do nothing for melanoma incidence or mortality! No matter that you'll be vitamin D deficient! No matter that it contains known carcinogens! 30+ SPF, the more the better. Don't let one single deadly UV photon through.
The irony of all this is that if you believe the data on vitamin D, avoiding the sun would cause many more cancers than it would prevent, even if all melanoma were due to sun exposure.
Monday, July 7, 2008
Cancer in Other Non-Industrialized Cultures
In Cancer, Disease of Civilization (1960), Vilhjalmur Stefansson mentions a few cultures besides the Inuit in which large-scale searches never turned up cancer. Dr. Albert Schweitzer examined over 10,000 traditionally-living natives in Gabon (West Africa) in 1913 and did not find cancer. Later, cancer became common in the same population as they began "living more and more after the manner of the whites."
In Cancer, its Nature, Cause and Cure (1957), Dr. Alexander Berglas describes the search for cancer among natives in Brazil and Ecuador by Dr. Eugene Payne. He examined approximately 60,000 people over 25 years and found no evidence of cancer.
Sir Robert McCarrison conducted a seven year medical survey among the Hunza, in what is now Northern Pakistan. Among 11,000 people, he did not find a single case of cancer. Their diet consisted of soaked and sprouted grains and beans, fruit, vegetables, grass-fed dairy and a small amount of meat (including organs of course).
I no longer think of cancer as an inevitable risk of getting old, but as another facet of the disease of civilization.
Saturday, July 5, 2008
Mortality and Lifespan of the Inuit
One of the classic counter-arguments that's used to discredit accounts of healthy hunter-gatherers is the fallacy that they were short-lived, and thus did not have time to develop diseases of old age like cancer. While the life expectancy of hunter-gatherers was not as high as ours today, most groups had a significant number of elderly individuals, who sometimes lived to 80 years and beyond. Mortality came mostly from accidents, warfare and infectious disease rather than chronic disease.
I found a mortality table from the records of a Russian mission in Alaska (compiled by Veniaminov, taken from Cancer, Disease of Civilization), which recorded the ages of death of a traditionally-living Inuit population during the years 1822 to 1836. Here's a plot of the raw data:
Here's the data re-plotted in another way. I changed the "bin size" of the bars to 10 year spans each (rather than the bins above, which vary from 3 to 20 years). This allows us to get a better picture of the number of deaths over time. I took some liberties with the data to do this, breaking up a large bin equally into two smaller bins. I also left out the infant mortality data, which are interesting but not relevant to this post:
Excluding infant mortality, about 25% of their population lived past 60. Based on these data, the approximate life expectancy (excluding infant mortality) of this Inuit population was 43.5 years. It's possible that life expectancy would have been higher before contact with the Russians, since they introduced a number of nasty diseases to which the Inuit were not resistant. Keep in mind that the Westerners who were developing cancer alongside them probably had a similar life expectancy at the time. Here's the data plotted in yet another way, showing the number of individuals surviving at each age, out of the total deaths recorded:
It's remarkably linear. Here's the percent chance of death at each age:
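For readers who want to check the arithmetic, the re-binning and the life-expectancy estimate described above come down to a few lines of code. This is only a sketch: the bin counts below are invented placeholders for illustration, not the numbers from the mission records, so the outputs won't match the 43.5-year figure quoted above.

```python
# Hypothetical deaths-by-age-bin counts (placeholders, NOT Veniaminov's data).
# Each key is a (start, end) age span; infant deaths (age < 1) are excluded,
# matching how the data are treated in the post.
deaths = {
    (1, 10): 30,
    (11, 20): 25,
    (21, 30): 22,
    (31, 40): 20,
    (41, 50): 18,
    (51, 60): 15,
    (61, 70): 12,
    (71, 80): 6,
}

total = sum(deaths.values())

# Approximate life expectancy at age 1: weight each bin's midpoint
# by the number of deaths falling in that bin.
life_expectancy = sum(((lo + hi) / 2) * n for (lo, hi), n in deaths.items()) / total

# Share of the population (excluding infants) surviving past age 60.
past_60 = sum(n for (lo, hi), n in deaths.items() if lo > 60) / total

print(f"life expectancy ~ {life_expectancy:.1f} years")
print(f"surviving past 60: {past_60:.1%}")
```

With the actual mission counts plugged in, the midpoint-weighted mean is what yields the roughly 43.5-year figure, and summing the counts from the oldest bins downward reproduces the survival curve.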
In the next post, I'll briefly summarize cancer data from several traditionally-living cultures other than the Inuit.
Friday, July 4, 2008
Cancer Among the Inuit
I remember coming across a table in the book Eat, Drink and Be Healthy (by Dr. Walter Willett-- you can skip it) a few years back. Included were data taken from Dr. Ancel Keys' "Seven Countries Study". It showed the cancer rates for three industrialized nations: the US, Greece and Japan. Although specific cancers differed, the overall rate was remarkably similar for all three: about 90 cancers per 100,000 people per year. Life expectancy was also similar, with Greece leading the pack by 4 years (the data are from the 60s).
The conclusion I drew at the time was that lifestyle did not affect the likelihood of developing cancer. It was easy to see from the same table that heart disease was largely preventable, since the US had a rate of 189 per 100,000 per year, compared to Japan's 34. Especially since I also knew that Japanese-Americans who eat an American diet get heart disease just like European-Americans.
I fell prey to the same logic that is so pervasive today: the idea that you will eventually die of cancer if no other disease gets you first. It's easy to believe, since the epidemiology seems to tell us that lifestyle doesn't affect overall cancer rates very much. There's only one little glitch... those epidemiological studies compare the sick to the sicker.
Here's the critical fact that modern medicine seems to have forgotten: hunter-gatherers and numerous non-industrial populations throughout the world have vanishingly small cancer rates. This fact was widely accepted in the 19th century and the early 20th, but has somehow managed to fade into obscurity. I know it sounds unbelievable, but allow me to explain.
I recently read Cancer, Disease of Civilization by Vilhjalmur Stefansson (thanks Peter). It really opened my eyes. Stefansson was an anthropologist and arctic explorer who participated in the search for cancer among the Canadian and Alaskan Inuit. Traditionally, most Inuit groups were strictly carnivorous, eating a diet of raw and cooked meat and fish almost exclusively. Their calories came primarily from fat, roughly 80%. They alternated between seasons of low and high physical activity, and typically enjoyed an abundant food supply.
Field physicians in the arctic noted that the Inuit were a remarkably healthy people. While they suffered from a tragic susceptibility to European communicable diseases, they did not develop the chronic diseases we now view as part of being human: tooth decay, overweight, heart attacks, appendicitis, constipation, diabetes and cancer. When word reached American and European physicians that the Inuit did not develop cancer, a number of them decided to mount an active search for it. This search began in the 1850s and tapered off in the 1920s, as traditionally-living Inuit became difficult to find.
One of these physicians was Captain George B. Leavitt. He actively searched for cancer among the traditionally-living Inuit from 1885 to 1907. Along with his staff, he performed 50,000 examinations a year for the first 15 years, and 25,000 a year thereafter. He did not find a single case of cancer. At the same time, he was regularly diagnosing cancers among the crews of whaling ships and other Westernized populations. It's important to note two relevant facts about Inuit culture: first, their habit of going shirtless indoors. This would make visual inspection for external cancers very easy. Second, the Inuit generally had great faith in Western doctors and would consult them even for minor problems. Therefore, doctors in the arctic had ample opportunity to inspect them for cancer.
A study was published in 1934 by F.S. Fellows in the U.S. Treasury's Public Health Reports entitled "Mortality in the Native Races of the Territory of Alaska, With Special Reference to Tuberculosis". It contained a table of cancer deaths for several Alaskan regions, all of them Westernized to some degree. However, some were more Westernized than others. In descending order of Westernization, the percent of deaths from cancer were as follows:
Keep in mind that all four of the Inuit populations in this table were somewhat Westernized. It's clear that cancer incidence tracks well with Westernization. By "Westernization", what I mean mostly is the adoption of European food habits, including wheat flour, sugar, canned goods and vegetable oil. Later, most groups also adopted Western-style houses, which incidentally were not at all suited to their harsh climate.
In the next post, I'll address the classic counter-argument that hunter-gatherers were free of cancer because they didn't live long enough to develop it.
Thursday, July 3, 2008
Cancer and the Immune System
My understanding of cancer has changed radically over the past few months. I used to think of it as an inevitable consequence of aging, a stochastic certainty. The human body is made of about 50 trillion cells, many of which replicate their DNA and divide regularly. It's only a matter of time until one of those cells randomly accumulates the wrong set of mutations, and loses the molecular brakes that restrict uncontrolled growth.
Strictly speaking, the idea is correct. That is how cancer begins. However, there's another check in place that operates outside the cancer cell itself: the immune system. A properly functioning immune system can recognize and destroy cancerous cells before they become dangerous to the organism. In fact, your immune system has probably already controlled or destroyed a number of them in your lifetime.
I recently read a fascinating account of some preliminary findings from the lab of Dr. Zheng Cui at Wake Forest University. His group took blood samples from 100 people and purified a type of immune cell called the granulocyte. They then evaluated the granulocytes' ability to kill cervical cancer cells in a cell culture dish. They found that it varied dramatically from one individual to another. One person's granulocytes killed 97% of the cancer cells in 24 hours, while another person's killed 2%.
They found some important trends. Granulocytes from people over 50 years old had a reduced ability to kill cancer cells, as did granulocytes from people with cancer. This raises the possibility that cancer is not simply the result of getting too old, but a very specific weakening of the immune system.
The most important finding, however, was that the granulocytes' kung-fu grip declined dramatically during the winter months. Here's Dr. Cui:
"Nobody seems to have any cancer-killing ability during the winter months from November to April."
Hmm, I wonder why that could be?? Vitamin D anyone??
In the next post, I'll talk about cancer in non-industrialized cultures.