While intermittent fasting may sound like another dieting craze, the practice of routinely going without food and drink for short periods of time has again been shown to lead to potentially better health outcomes.
In a new study, researchers at the Intermountain Healthcare Heart Institute in Salt Lake City found that cardiac catheterization patients who practiced regular intermittent fasting lived longer than patients who did not.
In addition, the study found that patients who practiced intermittent fasting were less likely to be diagnosed with heart failure.
“It’s another example of how we’re finding that regularly fasting can lead to better health outcomes and longer lives,” said Benjamin Horne, Ph.D., principal investigator of the study and director of cardiovascular and genetic epidemiology at the Intermountain Healthcare Heart Institute.
Findings from the study will be presented at the 2019 American Heart Association Scientific Sessions in Philadelphia on Saturday, November 16, 2019.
In the study, researchers asked 2,001 Intermountain patients undergoing cardiac catheterization from 2013 to 2015 a series of lifestyle questions, including whether or not they practiced routine intermittent fasting.
Researchers then followed up with those patients 4.5 years later and found that routine fasters had a greater survival rate than those who did not fast.
Because people who fast routinely also are known to engage in other healthy behaviors, the study also evaluated other parameters including demographics, socioeconomic factors, cardiac risk factors, comorbid diagnoses, medications and treatments, and other lifestyle behaviors like smoking and alcohol consumption.
Correcting statistically for these factors, long-term routine fasting remained a strong predictor of better survival and lower risk of heart failure, according to researchers.
The Intermountain Healthcare Heart Institute has the opportunity to closely study intermittent fasting because a large portion of its patients do it regularly: a significant portion of Utah’s population belongs to the Church of Jesus Christ of Latter-day Saints, whose members typically fast on the first Sunday of the month by going without food or drink for two consecutive meals, and thus not eating for a period of about a day.
While the study does not show that fasting is the cause of better survival, these real-world outcomes in a large population do suggest that fasting may be having an effect, and they urge continued study of the behavior.
“While many rapid weight loss fasting diets exist today, the different purposes of fasting in those diets and in this study should not be confused with the act of fasting,” said Dr. Horne.
“All proposed biological mechanisms of health benefits from fasting arise from effects that occur during the fasting period or are consequences of fasting.”
Dr. Horne has previously conducted studies about risk of diabetes and coronary artery disease in patients and found that rates are lower in patients who practice routine intermittent fasting.
Those studies were published in 2008 and 2012 and suggested that the decades-long development of those chronic diseases may be ameliorated by long-term routine fasting.
Why long-term intermittent fasting leads to better health outcomes is still largely unknown, though Dr. Horne said it could be a host of factors. Fasting affects a person’s levels of hemoglobin, red blood cell count, and human growth hormone; lowers sodium and bicarbonate levels; and activates ketosis and autophagy, all factors that contribute to better heart health and specifically reduce the risk of heart failure and coronary heart disease.
“With the lower heart failure risk that we found, which is consistent with prior mechanistic studies, this study suggests that routine fasting at a low frequency over two thirds of the lifespan is activating the same biological mechanisms that fasting diets are proposed to rapidly activate,” Dr. Horne noted.
Researchers speculate that fasting routinely over a period of years and even decades conditions the body to activate the beneficial mechanisms of fasting after a shorter length of time than usual.
Typically, it takes about 12 hours of fasting for the effects to be activated, but long-term routine fasting may cause that time to be shortened so that each routine faster’s daily evening/overnight fasting period between dinner and breakfast produces a small amount of daily benefit, they noted.
Further studies are ongoing that will address this question and others related to the possible mechanisms of fasting’s effects on the development of chronic disease and on survival.
Additional research will also examine potential psychological effects of fasting and potential effects on appetite and perception of hunger.
Fasting is not for everyone. Researchers caution that pregnant and lactating women, young children, and frail older adults should not fast.
People who have received an organ transplant, who have a suppressed immune system, who are experiencing acute or severe chronic infections, and those with eating disorders should also not fast.
People diagnosed with chronic diseases – especially those who take medications for diabetes, blood pressure, or heart disease – should not fast unless under the close care and supervision of a physician, because of the severe adverse effects that medications in combination with fasting can cause, such as hypoglycemia.
Effects of Sugars on “Homeostatic” Neural Systems
The homeostatic system, which regulates feeding patterns based on energy need, is composed of two antagonistic pathways.
The orexigenic pathway includes neuropeptide Y (NPY) and agouti-related protein (AgRP), which are known to stimulate food intake [28] and are produced in the arcuate nucleus (ARC) of the hypothalamus, a critical region involved in homeostatic energy balance [29]. In contrast, the anorexigenic pathway, including proopiomelanocortin (POMC) neurons produced in the ARC, has the opposite effect by inhibiting food intake [28].
Recent evidence suggests that sugar intake differentially affects these two opposing pathways. After a sucrose preload, mice consumed more chow and this behavioral change was accompanied by variations in NPY and AgRP. Immediately following the preload, mice showed reduced expression of NPY and AgRP in the ARC.
However, 30-60 minutes after the sucrose preload and right before the chow meal, mice showed a marked increase in both [30]. This suggests that sucrose consumption led to a temporary decrease in orexigenic peptides followed by activation of the orexigenic pathway, potentiating caloric consumption. In another recent study, mice maintained on a high fat diet and given limited access to sucrose-sweetened water (SSW) showed a down-regulation of POMC mRNA expression in the hypothalamus.
In addition, these mice consumed greater amounts of the high fat diet on days that the SSW was available, suggesting that this reduction in satiety signaling may have facilitated hyperphagia in this group [31]. Chronic limited consumption of a high sucrose diet has also been shown to lead to decreased activity of the anorexigenic oxytocin system in the hypothalamus, which has been associated with satiety and meal termination [32].
Recent data indicate that the type of sugar ingested plays an important role in satiety. One animal study comparing the effects of 24 h access to sucrose, glucose, fructose, or high-fructose corn syrup found that glucose led to a marked upregulation of the satiety-inducing hormone cholecystokinin (CCK) within the hypothalamus, while fructose resulted in a downregulation of this peptide [33]. This suggests that, relative to fructose, glucose may be more effective in eliciting satiety. This is in line with animal research showing that central administration of glucose inhibits food intake while fructose stimulates feeding [34]. Further, in humans, fructose ingestion has been shown to lead to lower levels of serum glucose, insulin, and glucagon-like peptide-1 (GLP-1), a hormone associated with increased satiety, than glucose ingestion [35].
Effects of Sugar on “Hedonic” Neural Systems
Given that sweet foods and beverages are generally considered pleasurable, the effects of caloric sweeteners on brain mechanisms associated with processing reward, such as the mesolimbic dopamine (DA) system and opioid systems, have been an area of intense research in recent years. One such study observed decreased striatal DA concentrations following prolonged access to a sucrose solution in high-sucrose drinking rats [36], a finding also reported by this group in response to chronic exposure to ethanol [37].
Expression of tyrosine hydroxylase (TH), an enzyme involved in DA synthesis, was also decreased in the striatum of high sucrose-drinking rats. Acute increases in DA release upon consumption of palatable food may, as the authors posit, initiate a negative feedback cycle, inhibiting DA synthesis, ultimately leading to both reduced TH expression and striatal DA concentrations.
It is important to note that while reduced DA content may reflect neuroadaptations due to prolonged sucrose consumption, reduced DA has been observed in the nucleus accumbens (NAc), a brain region associated with reward, of rats prone to obesity even prior to excessive weight gain [38].
Thus, it is also possible that reduced striatal DA concentrations may have predisposed the animals to excessive sucrose consumption, especially as only high-drinking rats were studied. Finally, in this study, high sucrose-drinking rats showed increased prolactin expression. Given the role of DA in inhibiting prolactin, reduced DA concentrations may have led to elevated prolactin.
Two recent studies have also explored the acute effects of sugar consumption on DA levels within the two subregions of the NAc, the shell and the core, given differential efferent projections from these regions. Rewarding substances such as drugs of abuse are known to elevate DA within the NAc shell and this response is thought to facilitate strong associations between the reward and related cues [39].
Using fast-scan cyclic voltammetry in food-restricted rats, Cacciapaglia, Saddoris [40] found that sucrose-related cues led to increased DA levels in both subregions of the NAc; however, DA levels were greater and sustained for longer in the shell. Increased DA levels were also observed in the NAc shell, but not the core, after lever pressing for sucrose. Together, these experiments implicate DA within the NAc shell, rather than the core, in sucrose reward.
Using microdialysis techniques in food-restricted animals, it has also been shown that while novel exposure to sucrose increases DA levels in the NAc shell, this effect wanes with repeated exposure, in contrast to what is seen with drugs of abuse [41]. Notably, rats trained to respond for sucrose did not show habituation of increased DA levels in the shell.
This group also noted elevated DA levels in the shell, but not the core, when animals responded for sucrose, as well as in response to sucrose-related cues during extinction. Interestingly, however, elevated DA levels were observed in both the shell and core regions when sucrose was delivered without the requirement of responding (“response non-contingent” sucrose feeding).
Given the established role of opioid signaling in hedonic processes [42], recent studies have also explored opioid involvement in the rewarding aspects of sugar consumption. Interestingly, Ostlund, Kosheleff [43] found no differences in sucrose intake during acquisition testing between mu-opioid receptor knockout (MOR KO) and control mice.
However, MOR KO mice showed fewer average bursts of sucrose licking when food deprived and attenuated licking behavior when sucrose concentrations were increased, indicating reduced sensitivity to these manipulations.
In a separate experiment, MOR KO mice displayed attenuated licking behavior in response to sucralose (a non-caloric sweetener) but not sucrose, extending the evidence that MOR signaling is involved in hedonic processing and raising the possibility that the caloric contribution of sucrose might explain why MOR KO mice did not show reduced levels of sucrose intake.
Alternatively, sucrose consumption may not be affected in these animals due to activation of other intact pathways associated with reinforcement, such as the mesolimbic DA pathway or other opioid receptors.
Interestingly, Castro and Berridge [44] recently identified a subregion located in the rostrodorsal quadrant of the medial shell of the NAc as a hedonic “hotspot,” as injections of mu-, kappa-, and delta-opioid receptor agonists in this area specifically (relative to the other three quadrants of the medial shell) led to greater intensity of positive hedonic reactions to sucrose.
A common behavioral marker of reward is the degree of craving a substance elicits. In animal models, craving can be assessed using a paradigm in which animals are trained to self-administer a rewarding substance and their responses are measured at two points during abstinence: very soon after the substance is removed and again after a prolonged period of abstinence.
During extinction, animals are motivated to respond either in the presence of or for the delivery of cues previously associated with the reward.
Enhanced responding for cues at the later time point in abstinence has been noted following exposure to drugs of abuse, such as cocaine, as well as sucrose, a phenomenon termed “incubation of craving” [45]. Recent work shows age-related differences in incubation of sucrose craving, with adult and adolescent, but not young adolescent, rats demonstrating greater responding after the extended extinction period [46].
These behavioral findings were accompanied by reductions in 2-amino-3-(3-hydroxy-5-methylisoxazol-4-yl) propionic acid/N-methyl-D-aspartate (AMPA/NMDA) ratios, a proxy of synaptic plasticity, in the NAc.
Though these data are correlational, taken together, this suggests that age-dependent reductions in synaptic plasticity during abstinence from sucrose may contribute to enhanced craving of sucrose.
This is in contrast to findings that show greater synaptic plasticity during incubation of craving for cocaine [47], suggesting different mechanisms underlying this phenomenon depending on the rewarding substance.
In an effort to dissociate the relative roles of the two monosaccharides that make up sucrose (fructose and glucose) in reward and/or satiety, Rorabaugh, Stratford [48] employed a 12-h intermittent access paradigm, similar to that used in our laboratory [49], to promote bingeing on isocaloric solutions of fructose, sucrose, and glucose. During the first hour of access, when bingeing is typically most robust, rats bingeing on glucose consumed significantly less than those given access to fructose or sucrose.
This may indicate that the glucose solution was perceived as less palatable than the other caloric sweeteners or, alternatively, that it was perceived as more reinforcing, so that less was needed to experience a similar rewarding effect.
As mentioned earlier, it is also possible that glucose may have been more satiating, resulting in less intake, given findings indicating this to be true in humans [50].
Interactions between “Homeostatic” and “Hedonic” Neural Systems
Recent evidence illustrates interactions between the homeostatic and hedonic systems in response to caloric sweetener intake. For example, prolonged fructose bingeing, elicited by an intermittent access paradigm, led to reduced neuronal activation (measured by c-Fos immunoreactivity [IR]) in the NAc shell and activated orexin neurons, which have been associated with both reward and satiety, in the lateral hypothalamic (LH)/perifornical area of rats [51].
It is postulated that this unusual pattern induced by fructose aligns with a feeding circuit proposed by the Kelley lab, in which the ventral pallidum (VP) forms a hyperphagic circuit that indirectly inhibits the NAc shell to activate the LH [52]. This pattern is seemingly unique to fructose ingestion and further study is needed to understand how this circuit may interact with more established pathways.
This study also found that pretreatment with an orexin 1 receptor antagonist reduced feeding in both fructose- and chow-bingeing rats, suggesting that orexin 1 signaling is involved in food intake that is motivated by caloric need as opposed to palatability.
However, only chow-bingeing rats showed reduced neuronal activation in the NAc shell, LH/perifornical area, or ventromedial hypothalamus in response to this manipulation [51].
Three recent studies approached this subject by introducing agents that typically act as homeostatic mechanisms into reward-related areas exogenously. In one such experiment, NPY increased the motivation to respond for sucrose when infused into the ventral tegmental area (VTA) or NAc and increased sucrose consumption when infused into the NAc or LH [53].
Interestingly, the effect of NPY in the VTA was attenuated following pretreatment with a DA receptor antagonist, suggesting that this effect is dependent on changes in DA signaling. In another study, injection of melanocortin receptor agonists, which customarily decrease food intake, into the VTA decreased sucrose and saccharin intake as well as overall food intake [54]. Another study found injection of orexin into the posterior VP, a region considered to be a “hedonic hotspot,” to enhance positive hedonic reactions to sucrose [55].
Given that the VP receives orexin projections from the LH, the authors propose that during negative energy balance, orexin projections may magnify the pleasure derived from food.
Though it remains unclear exactly how regulatory mechanisms like NPY, melanocortin, and orexin influence hedonic mechanisms under normal conditions, these findings offer compelling evidence of interactions between these two systems, which are frequently conceptualized disparately.
Effects of Low Calorie Sweeteners on “Homeostatic” and “Hedonic” Neural Systems
Despite their widespread use, we are only beginning to understand the effects of low calorie sweeteners on the brain. Research does show that the human brain is capable of dissociating sweet taste from calories [56, 57]. Laboratory animal research is beginning to elucidate the effects of low calorie sweeteners on select homeostatic and hedonic neural systems and their effect on feeding behavior.
Both melanin-concentrating hormone (MCH) and orexin promote feeding [58–60]. A recent study measured phosphorylated cyclic AMP response element binding protein (pCREB), a marker of neural activity, in both MCH and orexin neurons of fasted rats in response to glucose, saccharin, or water. While only glucose reduced pCREB expression in MCH neurons in all rats, both glucose and saccharin, but not water, significantly reduced pCREB expression in orexin neurons of female rats [61].
Similarly, binge consumption of either sucrose or saccharin leads to reduced orexin mRNA expression in the LH of mice [62].
It should be noted that although low calorie sweeteners do not provide calories, their consumption can lead to gastric distension, which has been shown to lower IR expression of orexin in the LH [63], making it important for future studies to control for this, perhaps using paired water intake.
Notably, reduced sucrose- and saccharin-bingeing have been observed following treatment with an orexin receptor 1 antagonist [62], which appears inconsistent with the notion that orexin receptor 1 signaling mediates feeding driven by caloric need versus palatability mentioned earlier [51].
Several studies have investigated whether the caloric contribution of sweeteners influences their rewarding properties. For example, Aoyama et al. [64] assessed responding for a saccharin-related cue in rats during prolonged abstinence from the solution as a measure of craving and seeking behavior.
Indeed, responding was significantly greater after the longer period of abstinence, demonstrating that saccharin is capable of eliciting an “incubation of craving,” similar to what has been seen with sucrose [65] and cocaine [66].
In fact, there was no difference between the magnitude of the incubation of craving for saccharin and sucrose [67].
Additionally, similar to sucrose, limited access to saccharin has been shown to induce excessive binge eating [62]. In food-restricted mice, preference for a non-caloric blend of saccharin and sucralose surpassed that for fructose, but not sucrose and glucose [68].
Taken together, these studies provide behavioral evidence that sweet taste, independent of caloric content, is sufficiently rewarding to motivate feeding and seeking behavior.
Under conditions of caloric deficit, recent studies demonstrate an important role for the post-ingestive effects of caloric sweeteners in food reward and preference. While both sucrose- and saccharin-related cues evoked a sharp increase in DA within the NAc core of food-restricted rats, both sucrose-related cues and sucrose consumption resulted in a significantly greater DA response than saccharin [69].
In a recent study conducted in ad libitum and food deprived rats, saccharin and sucrose led to different responses based on physiological state. Unsurprisingly, food deprived rats significantly increased responding for sucrose compared to saccharin, whereas non-food deprived rats showed comparable efforts to obtain sucrose or saccharin [70].
Consistent with this, habituation of DA in the NAc was seen in response to both types of sweeteners in non-food deprived animals, whereas habituation was only seen in response to saccharin among food deprived animals [70].
In one study using intragastric infusion of glucose or saccharin in awake, fasted rats during functional magnetic resonance imaging (fMRI), glucose led to greater blood oxygen level dependent (BOLD) activation in several brain regions, including key components of the mesolimbic DA pathways (e.g., the VTA and NAc), compared to saccharin.
Moreover, glucose, but not saccharin, evoked a BOLD response in the hypothalamus [71]. Thus, when bypassing the taste pathway via intragastric infusion and in a fasted state, a caloric sweetener led to more pronounced activation of both hedonic and homeostatic regions.
Although the focus of this review is on recent studies using animal models, human studies that are particularly relevant warrant discussion. Recent studies in humans suggest that repeated low calorie sweetener consumption alters brain responses to caloric sweeteners. Subjects who reported higher low calorie sweetener intake showed a reduced BOLD response in the amygdala in response to sucrose [72]. Additionally, Green and Murphy [73] found that, relative to non-diet soda drinkers, individuals who consumed diet soda regularly showed greater activation in the VTA as well as decreased activation in the right caudate in response to saccharin.
In contrast to these findings, Griffioen-Roose et al. [74] did not observe a difference in hedonic value, measured by both behavioral tasks and fMRI, between participants with repeated exposure to low calorie sweeteners and those with repeated exposure to sugar-sweetened beverages, suggesting that low calorie sweeteners do not modify reward value (though subjects who exclusively consumed “light” versions of foods and beverages were excluded).
Finally, conditioning with low calorie sweeteners or sugar-sweetened beverages led to similar reports of expected fullness following consumption, leading the authors to conclude that low calorie sweeteners may, in fact, be advantageous for weight management.
To this point, recent meta-analyses show that although observational, prospective studies show a small positive association between low-calorie sweetener use and BMI, randomized controlled trials suggest slight but significant benefits of low-calorie sweetener substitution for weight loss [75,76].