When practiced appropriately, science can shed a lot of light on some pretty powerful concepts in human health. To date we have sequenced the human genome, developed novel pharmaceutical interventions for diseases that once took a large toll on society, and begun to understand that we carry a vast number of bacteria that have a dramatic impact on the way we function. However, many of the health concerns we see today are not being addressed appropriately. Comparing a pharmaceutical drug to no treatment, or placebo, makes perfect sense when you are dealing with a bacterial or viral pathogen, but does it really make sense when the true problem is a poor lifestyle?
When performing science, you are not really trying to prove anything; you are trying to disprove the null hypothesis. The null hypothesis is the default position, and in the case of a pharmaceutical intervention against a pathogen it typically means giving some of the subjects a placebo pill to simulate doing nothing. The problem is that, at this point, there is a very large lifestyle component to the vast majority of the diseases we see today, including cardiovascular disease, some types of cancer, Alzheimer's disease, and the topic of this blog, Type 2 diabetes. If we know that a poor lifestyle largely contributes to these diseases, should the default position be taking a placebo pill and calling it a day, or would it make more sense for the default condition to be changing the lifestyle habits that cause the disease? I suppose that depends on how you look at what the pharmaceutical intervention is trying to accomplish.
Let's be honest: I've worked in the health and wellness field for 18 years, and when someone is told they need to change their lifestyle or a disease they have will progress, and they choose not to, they really aren't looking for a solution. What they are looking for is something that will allow them to continue their poor lifestyle while preventing the damage that lifestyle causes. If this is what we are testing for with pharmaceutical interventions for these lifestyle diseases, it only makes sense to have the null hypothesis be the condition under which the damage stops. In other words, the null hypothesis should be the lifestyle that causes no more damage. That way, when we compare the pharmaceutical intervention to the proper-lifestyle group, we have an idea of the extent to which the drug is actually subsidizing the poor behavior. When you compare the drug to doing absolutely nothing other than taking a sugar pill, what would you expect to happen? The drug comes out looking like the cure for something it's not, giving the false impression that the person has nothing to worry about.
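To make the comparison-group argument concrete, here is a minimal toy simulation. Every number in it is hypothetical and purely illustrative, not drawn from any real trial: it just shows how the same drug looks very different depending on whether the default condition it is measured against is a placebo pill or a corrected lifestyle.

```python
# Toy illustration only: three hypothetical trial arms with made-up
# fasting glucose values (mg/dL). Not real data.
import random
import statistics

random.seed(0)  # make the sketch reproducible

def arm(mean, sd=10, n=200):
    """Simulated fasting glucose readings for one trial arm."""
    return [random.gauss(mean, sd) for _ in range(n)]

placebo   = arm(160)  # poor lifestyle + sugar pill (hypothetical)
drug      = arm(130)  # poor lifestyle + drug (hypothetical)
lifestyle = arm(100)  # lifestyle corrected, no drug (hypothetical)

# Measured against placebo, the drug shows a large improvement;
# measured against the lifestyle arm, it still falls well short.
drug_vs_placebo   = statistics.mean(placebo) - statistics.mean(drug)
drug_vs_lifestyle = statistics.mean(drug) - statistics.mean(lifestyle)
```

With these invented numbers, the drug "wins" by roughly 30 points when the null condition is a sugar pill, yet sits roughly 30 points worse than the root-cause fix, which is exactly the gap the placebo-only design never reveals.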
So what lifestyle should be the default position? Since these lifestyle diseases are prevalent in Western society, it makes no sense to base the null hypothesis on the lifestyle of Westerners. These diseases are not at all prevalent in modern-day hunter-gatherers, who live a lifestyle more in line with the one under which we spent most of our time evolving. This has led many to believe that the null hypothesis, the default lifestyle condition to test against, should be the lifestyle we evolved under.
This poses a major problem. The vast majority of our science has been conducted with the Western lifestyle as the default condition, because that has been the default since clinical research became prominent. An easy way to understand this is through the RDAs we have established for nutrients: these RDAs are only relevant under a Western lifestyle, because a Western lifestyle is the condition under which they were tested. So it is improper to say that the RDA for Vitamin C is the amount of Vitamin C one needs to prevent scurvy. What the science actually shows is that the RDA for Vitamin C is the amount of Vitamin C one needs to prevent scurvy while living a Western lifestyle at the time of testing.
Assuming the way we currently live is the default condition can also lead us down the wrong path when we look at how the body works. It has been assumed that the liver is the primary site of blood glucose regulation. However, this may not actually be the case; it could be that the liver is the primary site of blood glucose regulation in those living a Western lifestyle. A study of blood glucose regulation in mice sheds some light on why this may be.
To better understand how blood glucose is regulated, researchers studied mice that were unable to regulate blood glucose via the liver. The liver is considered critical to blood glucose regulation; the conventional view is that without it we cannot stabilize blood glucose. As the story goes, we eat carbohydrates that are broken down into glucose, the liver stores that glucose as glycogen and releases it during fasting, and once glycogen is depleted, the liver makes new glucose from non-carbohydrate sources via gluconeogenesis.
In the fed state, blood glucose levels were identical between normal mice and mice that were unable to use their liver to regulate blood glucose. During fasting, blood glucose initially dropped and ketones rose in the liver-deficient mice, but within 30 hours both groups had similar blood glucose levels as gene expression in the kidneys and intestine shifted to favor glucose production by those organs in the absence of the liver (1). It is important to note that while blood glucose levels were lower in the mice that relied on the kidneys and intestines for glucose production, blood glucose never reached a critical state: the kidneys and intestines ran on ketones for approximately 50% of their energy and were able to provide enough glucose for survival. In addition, the study found that the kidneys and intestine make glucose from the amino acid glutamine, while the liver uses alanine and lactate.
While it may not seem like this study has real-world relevance, since most humans have functioning livers, it does highlight that the liver is not absolutely necessary for blood glucose regulation. The liver is certainly a large player: it can store approximately 100 g of glucose as glycogen and is capable of making glucose from other sources. But the assumption that the optimal way to regulate blood glucose is through consumption of carbohydrate and storage in the liver is flawed. At this point we really don't know for sure how to optimally regulate blood glucose, but if we look at some key aspects of the ancestral diet, some doubt creeps in.
In my next blog we'll take a look at how blood glucose may have been regulated by ancestral humans and how the drastic change in the way we regulate blood glucose now may be contributing to the increased prevalence of Type 2 diabetes.