Authors: Siim Land and James DiNicolantonio
Topics: Nutrition
All information is attributed to the authors, except where we may have misunderstood a concept and summarized it incorrectly. These notes are for reference only, and we always suggest reading from the original source.
Introduction
Chapter 1: How the Minerals Drive the Body’s Essential Processes
Chapter 2: Minerals Needed by the Body
Chapter 3: Superoxide Anions Drive Chronic Diseases and Minerals Are the Antidote
Chapter 4: Calcium, Magnesium, Hard Water and Your Heart
Chapter 5: Taking the Waters: Mineral Waters with Magnesium and Calcium
Chapter 6: Magnesium, Calcium, and Phosphorus: Softening Up the Arteries and Hardening the Bones
Chapter 7: The History and Importance of Copper in the Diet
Chapter 8: Getting the Right Amount of Copper, Zinc, and Iron
Chapter 9: Zinc for the Immune and Endocrine System
Chapter 10: Hypothyroidism and Hyperthyroidism: The Sodium-Selenium-Iodine Connection
Chapter 11: Potassium, Sodium, and Hypertension
Chapter 12: Boron and Other Possibly Essential Trace Minerals
Chapter 13: Sulfur, Glutathione, and Organosulfur Compounds
Chapter 14: Chromium and Blood Sugar Management
Chapter 15: Manganese and Molybdenum: Hidden Players of the Body’s Antioxidant Defense
Chapter 16: Eating for the Minerals and Preventing Deficiencies
The criteria for essential nutrients were established in the 1960s and 1970s. They included the following requirements:
There are two kinds of minerals: macrominerals, which you need rather large amounts of, typically more than 100 milligrams (mg) per day, and trace minerals, which are typically consumed in amounts less than 100 mg per day.
There are 17 essential minerals, although there is considerable debate over this number. In general, there are 7 essential macrominerals and 10 essential trace minerals. These essential minerals cannot be produced by the body and must be obtained from the diet in order for us to live. There are at least 5 more minerals that could be considered essential but have yet to be deemed as such.
7 Macrominerals (in general you need more than 100 mg of each per day):
Macromineral | Dietary Sources | RDA/AI in adults
10 Trace Minerals (in general you need less than 100 mg of each per day)
5 Possibly Essential Trace Minerals
Rheumatoid arthritis and other inflammatory disorders can cause levels of iron, zinc and selenium to drop in the blood and hence in the hair. High amounts of inflammation can increase copper levels in the blood and hair even when there is a copper deficiency.
The % of the U.S. population not meeting the RDA or AI for minerals is as follows:
Here are the Main Reasons for Widespread Mineral and Nutrient Deficiencies
The Overconsumption of Highly Refined Foods:
Soil Erosion, Fertilizers, Pesticides, Herbicides and Insecticides:
Heavy Metals Competing with Absorption:
Minerals Driving Energy Production
In total, there are up to 22 vitamins and minerals that support mitochondrial enzymes and are needed for energy production. Minerals also help to activate antioxidant enzymes that protect the mitochondria from oxidative stress. Deficiencies in these minerals can lead to reduced ATP production and mitochondrial degradation, and can accelerate mitochondrial aging.
Importantly, magnesium is needed as a cofactor in several electron transport chain complex subunits, including methylenetetrahydrofolate dehydrogenase 2 and pyruvate dehydrogenase phosphatase, and controls GLUT4 translocation to the cell membrane surface. GLUT4 is a glucose transporter that helps bring glucose into the cell for energy production. Thus, magnesium deficiency can contribute to a lack of ATP production in the mitochondria.
The inner mitochondrial membrane contains the electron transport chain (ETC), a series of 5 complexes that transfers electrons and protons across the membrane. This drives the creation of ATP by complex V (otherwise known as ATP synthase). Energy-carrying cofactors like NADH and FADH2 are used to deliver electrons to the electron transport chain for ATP production.
In the electron transport chain, NAD+ functions as an electron transfer molecule. NAD has two forms, NAD+ and NADH, which both govern electron transfer reactions:
There are 5 membrane-bound complexes in the mitochondrial electron transport chain – complex I, II, III, IV and V. They are all embedded inside the inner mitochondrial membrane. Here is an overview of their role and function:
Carbohydrates, proteins, fatty acids and ketones will ultimately be broken down into acetyl-CoA, which then delivers the acetyl group into the citric acid cycle (TCA or Krebs cycle). Acetyl-CoA is created from carbohydrates via glycolysis and from fatty acids via beta-oxidation. These pathways require zinc, magnesium and chromium, which ensure the capacity for energy expenditure and muscle performance.
The entry of acetyl-CoA into the TCA cycle needs magnesium and manganese; the cycle creates NADH and FADH2, which then feed into the electron transport chain (which needs iron and copper) to produce adenosine triphosphate (ATP), the energy currency of cells. All reactions where ATP is involved require magnesium ions. The magnesium ion is an integral part of the last enzyme in the respiratory chain, which initiates the reduction of molecular oxygen. As a component of membranes and nucleic acids, magnesium is present in the mitochondria.
Zinc, selenium, iron, copper, manganese, magnesium and iodine are all needed for proper thyroid functioning.
Magnesium:
Calcium:
Phosphorus:
Copper:
Chromium:
Iron:
Manganese:
Zinc:
Selenium:
Iodine:
Electrolytes and Minerals
The main electrolytes are sodium, potassium, magnesium, phosphate, calcium, chloride and bicarbonate. There are others like zinc, copper, iron, manganese, molybdenum and chromium. You obtain these minerals from food, water and supplements.
Electrolytes, especially sodium, help your body maintain normal fluid levels in the blood, within cells and outside of cells. How much fluid each of these compartments holds depends on the amount and concentration of electrolytes in it.
The Sodium-Potassium Pump moves sodium ions out and potassium ions into the cell. This is powered by magnesium and ATP. For every ATP that gets broken down, 3 sodium ions move out and 2 potassium ions move in.
Electrolyte imbalances can have many negative health consequences such as heart failure, arrhythmias, oxidative stress or even death.
Here are the reference ranges for electrolytes measured through blood:
To avoid electrolyte imbalances, you have to be obtaining enough minerals from food and liquids. You also need to avoid damage to your organs like your intestines, liver and kidneys so you don’t lose the ability to absorb, reabsorb and/or utilize these minerals.
Magnesium is the primary regulator of the sodium-potassium pump. However, a lack of sodium can lead to magnesium deficiency, as can a lack of vitamin B6 or selenium. Thus, salt, vitamin B6 and selenium also control the sodium-potassium pump indirectly because they control magnesium status in the body. In a magnesium-deficient state, calcium and sodium accumulate in the cell, promoting hypertension and cardiomyopathy. Magnesium protects against potassium loss, and muscle potassium levels won't normalize unless magnesium status is restored, even when serum potassium rises.
Minerals Needed by the Brain
Serotonin (5HT) regulates cognition, mood, and sleep in the central nervous system. Serotonin is both an inhibitory and excitatory neurotransmitter. In the enteric (intestinal) nervous system, serotonin affects digestion, motility and sensation. A serotonin deficiency in the brain causes symptoms of depression and decreased social exploration. Serotonin also gets converted into melatonin, the main sleep hormone, which also governs central antioxidant and repair processes during nocturnal sleep.
To make serotonin and melatonin, you need magnesium, calcium, iron, copper, cobalt and zinc. About 2% of the circulating amino acid tryptophan is converted into 5-hydroxytryptophan (5HTP) by tryptophan hydroxylase, an enzyme that uses 5-MTHF (the active form of folate), iron, calcium, and vitamin B3 as cofactors. 5HTP is further converted into serotonin by an enzyme called dopa decarboxylase, which uses magnesium, zinc, vitamin B6 and vitamin C as cofactors.
The magnesium ion is required for activating vitamin B6, which helps the conversion of serotonin into melatonin. Supplemental magnesium improves subjective measures of insomnia such as sleep onset latency and efficiency in the elderly. In patients with primary insomnia, administration of melatonin, zinc and magnesium improves the quality of sleep and quality of life.
Calcium helps tryptophan to be converted into 5-HTP by tryptophan hydroxylase, thus enabling melatonin production. A study found that fixing a calcium deficiency helped to regain normal REM sleep.
Potassium supplementation has a positive effect on sleep quality and slow-wave sleep. Since magnesium controls the levels of potassium in the body, this further highlights the importance of both magnesium and potassium for sleep.
Dopamine is both a hormone and a neurotransmitter that regulates mood, motivation, well-being and the feeling of reward. Imbalances in dopamine affect fatigue in multiple sclerosis and other neurological disorders. With aging, dopamine levels decrease, causing cognitive inflexibility and rigidity. Not producing enough dopamine may cause symptoms of depression, anxiety, apathy and promote pleasure seeking from harmful activities like drugs or alcohol.
Magnesium, as noted previously, is needed for creating neurotransmitters and hormones like serotonin, dopamine, norepinephrine and melatonin. Low magnesium intake is associated with depression in a near linear fashion. Magnesium supplementation of 450 mg per day was found to improve symptoms of depression and was comparable to the antidepressant drug imipramine. Magnesium can also be helpful in treating anxiety symptoms. Magnesium deficiency induces anxiety and HPA axis dysfunction. In bipolar disorder and mania, magnesium can stabilize mood. There is a clearly established link between the development of migraines and magnesium.
Iron is an essential cofactor for the synthesis of neurotransmitters and myelin. Deficient iron levels are associated with cognitive decline in the elderly. In toddlers, iron deficiency impairs brain morphology and cognitive development. However, high iron levels in the brain are associated with Alzheimer’s disease. That is because iron is a major source of reactive oxygen species and oxidation. In reality, the loss of the ability to utilize iron, which can occur with copper deficiency for example, may be driving many of these iron ‘overload’ issues.
Zinc is the second most concentrated metal in the brain after iron. It has an important role in synaptic transmission, nucleic acid metabolism and axonal transmission. There is an association between higher zinc levels during aging and healthy brain aging. In the elderly, zinc concentrations are higher in those with unimpaired cognitive function compared to those with memory impairment. Children with ADHD have reduced ADHD symptoms when given a zinc supplement for 12 weeks. And zinc imbalances may cause neuronal death, neurological disorders, stroke, epilepsy and Alzheimer’s disease.
Here are the most important organs and tissues that need various minerals:
Heart – related to cardiac function, cardiovascular disease risk, atherosclerosis, hypertension and stroke.
Gut, intestines, intestinal lining and the microbiome.
Liver – everything related to liver function and energy metabolism, which is controlled by the liver.
Eyes – everything related to vision, eye health, and preventing macular degeneration.
Collagen – everything related to the skin, tendons, ligaments and soft tissue.
Minerals for Managing Stress and Steroid Hormones
All nutrients and minerals follow a hormetic U-shaped curve.
Corticosteroids are a class of steroid hormones that are synthesized in the adrenal cortex from cholesterol. Common steroid hormones include aldosterone, testosterone, cortisol, cortisone, pregnenolone, progesterone and DHEA. Steroid hormones regulate metabolism, inflammation, body composition, immunity, stress adaptation, and recovery from injuries.
In order to make steroid hormones, vitamin D and DHEA, you need complexes II, III, and IV, which are dependent on selenium, copper, magnesium, and iron. You also need ferredoxin and ferredoxin reductase, which are iron-sulfur proteins (thus we need iron and sulfur), to produce steroid hormones. Anytime iron is needed in a reaction, copper is needed as well, because copper oxidizes ferrous iron (Fe2+) to ferric iron (Fe3+) so iron can move and be transported around the body. Thus, you need adequate levels of numerous minerals to produce sex hormones, corticosteroids and other steroid hormones.
Steroid hormone synthesis is regulated by trophic hormones like adrenocorticotropic hormone (ACTH) and luteinizing hormone (LH). They activate G protein-coupled receptors, raising intracellular cyclic AMP (cAMP) levels, which supports cAMP-dependent protein kinase (PKA) activity, protein synthesis, and protein phosphorylation. All of these processes assist in delivering cholesterol from the outer mitochondrial membrane to the inner one, which overcomes the rate-limiting step in producing steroid hormones.
Zinc deficiency is associated with lower testosterone in men and supplementation can improve testosterone levels. Magnesium also affects free and total testosterone levels.
Magnesium sulphate helps with the formation of cyclic AMP and governs its antiplatelet effects, thus affecting steroidogenesis. ATP is also required for steroid synthesis, and producing ATP requires iron, copper, manganese and magnesium.
Malnourished individuals are more vulnerable to infections and illness because their immune system lacks the nutrients to function properly.
One of the most impactful forms of oxidative stress is DNA damage that results in mutations and genomic instability. This may lead to cancer, chronic diseases and accelerated aging because the DNA replication mechanism stops working properly. Unrepaired DNA damage leads to the accumulation of misfolded proteins, inflammatory cytokines, and dysfunctional cells, which can spread inflammation and lead to accelerated aging. Cellular senescence occurs in response to DNA damage that results from exposure to reactive oxygen species (ROS) and free radicals. Deficiencies in DNA repair mechanisms increase the frequency of mutations and leave cells more vulnerable to malignancies. Reduced DNA repair protein activity is seen in early stages of cancer and is thought to contribute to the genetic instability of cancer.
Autophagy has a protective role against many metabolic and age-related diseases such as insulin resistance, heart disease, atherosclerosis, inflammation, Crohn’s, bacterial infections, neurodegeneration, gut health, fatty liver and aging in general.
Nutrient | Synergistic Relationship (SR) | Varies Based on Nutrient Levels (VBNL) | Antagonistic Relationship (AR)
Low micronutrient intakes can even accelerate the so-called degenerative diseases of aging. This is known as the triage theory, proposed by Bruce Ames. The idea is that if you are deficient in a particular micronutrient, it gets triaged to where it is needed most for survival while non-vital functions suffer, causing insidious diseases such as cancer, metabolic disease and vascular calcifications. Although virtually impossible to prove with long-term randomized controlled trials, this logic is consistent with evolutionary theory, wherein natural selection favors short-term survival and reproduction over long-term health and nutritional status.
Chromium, vanadium and magnesium are some of the most common deficiencies associated with diabetes.
Most free radicals are very reactive and they can cause oxidative damage. However, free radicals are also important for inducing hormesis, whereby a small exposure to free radicals makes our body more resilient to them in the future.
A reducing agent is a compound that can donate an electron to a free radical to neutralize it. For example, NADPH is the universal electron donor in the body and it helps to recycle or “reduce” oxidized glutathione back to its reduced or unoxidized form.
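To make the recycling step concrete (standard biochemistry, our addition rather than the authors’): the enzyme glutathione reductase uses NADPH to regenerate reduced glutathione (GSH) from its oxidized form (GSSG):

$$\text{GSSG} + \text{NADPH} + \text{H}^+ \rightarrow 2\,\text{GSH} + \text{NADP}^+$$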
Age-related impairments in the mitochondrial respiratory chain decrease ATP synthesis, damage DNA, and make the cells more susceptible to oxidative stress.
Superoxide and Peroxynitrite – Worst of the Free Radicals
Oxidative stress results from an imbalance in the proinflammatory molecules compared to the anti-inflammatory enzymes and antioxidants in the cell – more pro-oxidants than antioxidants means chronic inflammation and mitochondrial/tissue damage. Free radicals are implicated in serious health conditions like atherosclerosis, diabetes, cancer and neurodegeneration.
Harmful free radicals and reactive oxygen species (ROS) include the superoxide anion radical (O2•−), hydrogen peroxide (H2O2), singlet oxygen, the hydroxyl radical (OH•) and peroxyl radicals, as well as the second messenger nitric oxide (NO•), which can react with O2•− to form peroxynitrite (ONOO−). They can be created from internal sources during mitochondrial energy production or through exposure to external stimuli like pollution or poor diet.
The body has an entire grid of antioxidant systems that consists of multiple lines of defense:
To deal with environmental stress and oxidative damage, organisms have evolved systems of DNA damage response (DDR). This includes DNA repair mechanisms, damage tolerance and adaptation. The rate of DNA repair depends on many factors, such as cell type, age of cell and the surrounding environment. A cell that has accumulated too much DNA damage, or no longer repairs itself, can go into one of three states:
Having enough DHA in the cardiolipin of the mitochondria is extremely important for a cell’s ability to induce apoptosis and controlled cell death, including of cancer cells.
Cells with deletion of the essential autophagy gene Atg7 exhibit degradation and attenuated activation of checkpoint kinase 1 (Chk1) and diminished repair of DNA double-strand breaks by homologous recombination.
SIRT7 depletion causes impaired DNA repair and genome instability. One of the best activators of sirtuins is melatonin. Thus, maintaining good levels of melatonin production throughout the day, with things like morning light exposure, avoiding light at night and consuming an optimal amount of nutrients (so the body can synthesize melatonin) can help with sirtuin activation and DNA repair.
There are at least 169 enzymes involved in DNA repair pathways, including superoxide dismutase, glutathione, Nrf2 and others. They all require minerals to work, such as copper, zinc, selenium and magnesium.
Superoxide can combine with nitric oxide to form the highly reactive nitrogen species peroxynitrite (ONOO−). Even though other oxidants are formed in the body, like hydroxyl radicals, peroxynitrite has a much longer half-life, it can penetrate into multiple cells and it causes significant and relevant damage to the body. Peroxynitrite creates a burst of oxidative stress, mitochondrial dysfunction, inflammation, DNA damage, lipid peroxidation and necrosis, and it shuts down many enzymes. It also modifies tyrosine into nitrotyrosines, which are associated with atherosclerosis, myocardial ischemia, and inflammatory bowel disease. In fact, peroxynitrite is implicated in nearly every pathology, from hypertension and heart failure to diabetes and vascular aging. Thus, the production of superoxide anions and the subsequent formation of peroxynitrite can be viewed as one of the most harmful and relevant oxidative stressors that occurs within our bodies.
Nitric Oxide (NO) is an important signaling molecule that’s known for its benefits on the cardiovascular system and its antiviral effects. Nitric oxide (NO) suppresses platelet aggregation, lowers blood pressure, reduces blood clot formation, prevents blood vessel inflammation and improves the transport of lipids and cholesterol. It promotes blood flow through vasodilation and reduces the time these particles stay in your bloodstream. Reduced availability of nitric oxide has been implicated in the pathogenesis of hypertension and atherosclerosis.
Here are the nutrients needed for nitric oxide production and cardiovascular health:
Glutathione (GSH) is another internal antioxidant in the body. It protects against reactive oxygen species and free radicals like peroxides and lipid peroxides, as well as heavy metals. Glutathione promotes the regulation of nitric oxide by enhancing citrulline function. Without enough NADPH, your body can’t recharge glutathione after it becomes oxidized. This will put the brakes on all the detoxification systems. Glutathione is extremely important for protecting red blood cells from oxidative stress and its levels are highly dependent on magnesium.
Minerals for Preventing Premature Aging and Promoting Longevity
Premature aging is primarily caused by damage to tissues from numerous factors and reduced cellular repair. This leads to an accumulation of dysfunctional enzymes, proteins, and cellular membranes and if it reaches a certain threshold, it can present as disease and ultimately death. As you’ve just learned, the major cause of this damage in the body is from the activation of NADPH oxidase, which produces the harmful superoxide anion and the reactive nitrogen species peroxynitrite.
Inflammasomes Drive a Vast Range of Acute and Chronic Inflammatory Diseases
Inflammation is considered one of the main contributing factors to many chronic diseases like cardiovascular disease, cancer, autoimmunity and brain aging.
Inflammasomes, which are protein complexes that assemble in response to certain pro-inflammatory signals, exert pro-inflammatory and pro-apoptotic effects. They have been shown to have a pathogenic role in diabetes, neurodegenerative diseases, autoimmune disorders like rheumatoid arthritis, psoriasis, asthma, allergies and acne.
Inflammasomes mediate acute inflammatory conditions such as gout and the acute respiratory distress syndrome (ARDS) seen in COVID-19. Inflammasomes can also trigger a type of cell death called pyroptosis, which essentially spills the cell’s guts, releasing its inflammatory contents and driving many chronic inflammatory conditions.
One of the primary regulators of our body’s antioxidant systems is Nuclear Factor Erythroid 2-Related Factor 2 (Nrf2), a transcription factor that binds to DNA to express various genes. Nrf2 works by activating the antioxidant response element (ARE), which increases antioxidants like glutathione, NADPH, bilirubin and thioredoxin, enhancing cell protection, producing major anti-inflammatory changes and lowering oxidative stress.
How to Suppress NLRP3 Inflammasome Activation:
The active form of vitamin D, known as calcitriol, can also suppress MAP kinase, which is a major driver of inflammation during infections and sepsis, but vitamin D requires magnesium for its activation. It has been found that a serum vitamin D level of 15 ng/ml, which is considered insufficient in humans, does not suppress lipopolysaccharide (LPS, i.e., endotoxin)-induced inflammation, whereas LPS-induced inflammation was significantly inhibited at a vitamin D level of 30 ng/ml, with maximal inhibition at 50 ng/ml.
NADPH oxidase or NOX is a complex of enzymes bound to the cellular membrane. It senses the presence of oxygen and nutrients to balance the body’s ROS. Inhibiting NOX increases NADPH and combats oxidative stress. NOX proteins are involved in the inflammation of the vasculature. However, NOX also generates free radicals that destroy pathogens through a process called the respiratory burst.
Here are several ways to increase NADPH to promote the regeneration of antioxidant defenses in the body such as glutathione and thioredoxin:
Hard Water vs. Soft Water
Around 20% of your daily fluid intake comes from food, while the rest is provided by what you drink. Water hardness simply means the mineral content of the water. The softer the water, the higher the rate of cardiovascular disease.
Sodium, potassium and lithium are monovalent cations (single positively charged ions) and do not contribute to the hardness of water; only divalent cations (such as calcium or magnesium) do.
Most people in industrialized societies consume soft water, or water that lacks minerals, because soft water is cheaper and easier to use. It requires less soap, both for personal hygiene and when washing clothes or dishes; it causes less scaling of pipes; and it leaves fewer stains in pots, pans and enamel sinks. There is a counterpoint, however: over time, soft water does the opposite of depositing limescale, as it is more corrosive and can dissolve some of the metals found in the water distribution pipes, which can include copper, zinc and cadmium.
Masironi and Shaper noted that, “Soft waters could be carrying trace levels of toxic elements from pipes or soil into supply; hard waters could be protective due to their content in calcium and magnesium or in beneficial trace elements.”
Many researchers cite the lack of magnesium in the water supply as a significant factor. Magnesium deficiency is considered a principal driver of cardiovascular disease that increases the risk of heart disease substantially. This problem is magnified by the fact that about 50-75% of the population isn’t meeting the 350-420 mg RDA for magnesium. Magnesium regulates vascular smooth muscle cells, affecting blood pressure, calcification, atherosclerosis, kidney disease, arrhythmias and thrombosis (clots).
Challenging the Food-Mineral Hypothesis
Given that the majority of the population is already consuming too many calories, optimizing the water mineral content should be a matter of high priority.
A lower intake of minerals increases the absorption and the toxic effects of heavy metals. By not consuming mineral-rich water, we may therefore increase the risk of certain diseases and enhance the harmful effects of toxic heavy metals found in the environment, such as aluminum, cadmium, lead, mercury, and arsenic. We should bear in mind that calcium, which is contained in substantially greater amounts in hard water, may help protect us against lead and cadmium absorption.
To maximize the potential benefits from water that is rich in minerals, it should be used when cooking and preparing your foods too. Boiling pasta, rice and vegetables in the right water endows those foods with a higher mineral content, the proportion of water in your food being higher than you may at first imagine.
The Ratio of Minerals to Toxic Heavy Metals in Water is Just as Important as Absolute Values
High-calcium water may well have a double protective effect, containing decreased amounts of toxic heavy metals as well as reducing their degree of absorption into the body. Put simply, although from a practical point of view, you may be frustrated by the calcium rich ‘hard’ water furring-up the pipes and kitchen equipment such as coffee makers and kettles, it is precisely that calcium lining of those water pipes which blocks any leaching effect giving you double protection.
However, just getting more calcium, especially in high amounts from supplements or food fortification, is not always beneficial and could even be harmful. This does not appear to be the case with naturally high-calcium waters, as the intake of calcium is slower and doesn’t spike blood levels the way higher amounts of calcium from supplements and food fortification can. Indeed, a high intake of supplemental calcium is associated with increased risk of cardiovascular disease death in both men and women. Calcium supplementation without co-administered vitamin D increases the risk of having a heart attack (proper calcium handling also requires vitamin K and magnesium).
Extra calcium intake through supplementation results in higher circulating calcium, leading to calcification and calcium deposition in the coronary arteries and extra-skeletal muscle tissue.
It is estimated that less than 30% of the calcium ingested through food is absorbed, but if you consider the factors that affect your body’s absorption, the importance of water as a source of calcium may be greater than most of us appreciate. Consider the fact that the more calcium you take in at one time, the harder it is for your body to process it, which speaks for the steady ‘supplementation’ provided by your hard water supply, rather than occasionally swallowing calcium-rich pills.
Magnesium in Mineral Waters Provides Optimal Heart Health
The magnesium concentration in the hearts of subjects dying of heart disease is 24% lower than that of subjects dying from accidents. There is also a link between magnesium deficiency and sudden death: (1) sudden death is common in areas where community water supplies are magnesium deficient; (2) myocardial magnesium content is low in people who die of sudden death; (3) cardiac arrhythmias and coronary artery vasospasm can be caused by magnesium deficiency; and (4) intravenous magnesium reduces the risk of arrhythmia and death immediately after a heart attack. Thus, sufficient magnesium levels are strongly and negatively correlated with rates of sudden cardiac death even after adjusting for other risk factors.
In the body, magnesium has antithrombotic effects and reduces mortality in pulmonary thromboembolism. This suggests that magnesium has anticoagulant properties. Magnesium deficiency has been implicated in insulin resistance, type-2 diabetes, hyperglycemia, and hyperinsulinemia, all of which are considered risk factors of heart disease.
Just because your water is classified as hard does not mean there is necessarily ample magnesium in it; it may simply be high in calcium.
Other Relevant Minerals in Your Water Supply
Chromium has been shown to play an important role in glucose metabolism, which influences glucose tolerance. Chromium picolinate, specifically, has been shown to reduce insulin resistance and may even reduce the risk of cardiovascular disease and type-2 diabetes. A report on four meta-analyses of human studies found a significant reduction in fasting plasma glucose levels from chromium supplementation. A 2016 review covering six meta-analyses concluded that chromium decreases fasting blood glucose and HbA1c. Thus, you do not want to be deficient in chromium, because deficiency is associated with diabetes and hyperglycemia.
Lithium is the lightest solid element in the periodic table, and it is an essential trace mineral with many recommending an intake of 1 mg/day to meet requirements. In miniscule doses, lithium has been shown to have a number of health-related benefits. Observational studies in Japan have noted that low-dose lithium in drinking water is associated with better longevity.
Iodine – The regular consumption of iodine is important because it allows our bodies to make the all-important thyroid hormones. Low thyroid function leads to hypothyroidism, which can raise cholesterol, promote weight gain, and predispose you to metabolic syndrome. A telltale goiter (neck swelling) is one of the most visible signs of iodine deficiency.
Fluoride – The mineral fluorine presents us with a somewhat different situation, since many authorities around the world add fluoride to the water supply, usually citing studies which have shown how this measure can reduce the prevalence of tooth decay. Untreated dental caries can lead to weight gain, impair growth, increase the risk of infections, affect school performance and possibly lead to death. Adequate fluoride intake inhibits demineralization and bacterial activity in dental plaque.
Silicon is best known to most of us as the base material for semiconductor manufacturing and the creation of the computer chip, beginning with silica sand, which is made up of silicon dioxide. Silicon is the eighth most common element in the universe, and silicate minerals make up more than 90% of the earth’s crust, so it should be no surprise that trace amounts (mostly as silicon dioxide) are found in our drinking water.
Cadmium – In its natural state, cadmium is a lustrous and silver-white, malleable metal, often used together with chromium in the electro-plating of steel, but you can also find it in hard and soft drinking water, albeit in miniscule quantities.
Working in the late 1950s, Dr. Henry Schroeder was also among those carrying out research indicating that soft water was linked to higher levels of heart disease. He suggested that the action of soft water increased hypertension and pointed his finger particularly at higher cadmium levels. His four considerations were that:
This theory is supported by both experimental and clinical evidence, making cadmium something to watch out for, avoiding higher levels where you can. The wide use of cadmium in nickel-cadmium batteries, the coloration of plastics and various discarded electronic products has led to cadmium getting into water supplies in certain areas, a potential environmental hazard to be aware of.
A Geographical Advantage? Tap Water from the Western United States and from Southern Europe May Provide Greater Health Benefits
To the north of Europe, vast and incredibly old geological substrata underlie the surface rocks and topsoil, meaning that the underground water flows are naturally soft. This divide runs North-South, but a similar feature can also be found in North America, albeit this time running East-West. In both continents, associations with cardiovascular disease are consistent with the softness of the water, which led researchers to deduce (by elimination) that your latitude – how far North or South of the equator you are – plays little if any role in the matter. Instead, where you live in terms of the underlying local geology consistently showed similar relationships between water hardness and cardiovascular disease.
As well as the many heart related correlations, it appears that the rates of stones formed in the bladder or urinary tract are also higher in areas that drink soft water (rabbit study).
Water is just slightly acidic, and that’s down to its carbon dioxide content. Over time, this gives the water a cumulative, corrosive nature that can strip cadmium, lead and other harmful elements from your piping, depending of course on what that piping is made of. This in turn leads to adverse effects on your cardiovascular condition and represents a biologically plausible mechanism for how soft water supplies are associated with cardiovascular disease. That is in contrast to supplies of harder water, which contain calcium and, importantly, other minerals beneficial to your heart and health, such as magnesium.
The Vital Role Played by Magnesium
Chalkiness in the form of calcium carbonate will be what you most often identify when looking at the limescale build-up in your household water pipes, kettle or coffee machine. Yet it is the magnesium content of that scale which may be having the biggest influence on your heart and surprisingly, on your taste buds too.
“While high bicarbonate levels are bad,” Hendon explains, “high magnesium ion levels increase the extraction of coffee into water and improve the taste.” His research showed that sodium rich waters produced by water softeners didn’t help the taste but that using magnesium-rich water was best.
Magnesium is essential for heart muscle contraction and for oxidative phosphorylation in heart mitochondria. This is important because oxidative phosphorylation is the process by which adenosine triphosphate (ATP) is formed. ATP is the high-energy molecule that stores the energy we need to do just about everything we do, and it has been suggested that a magnesium-ATP complex is the true substrate for all reactions involving ATP.
Water Hardness and Magnesium – Their Role in Heart Muscle
Admittedly, it was not a large study, but in all, 64 Canadian males had died as the result of accidents and thus were considered representative of the general population when compared with victims who die from “natural causes”. Of those 64 men, 20 were residents of three different hard-water areas while 44 were residents of five soft-water areas. The mean magnesium concentrations in wet heart tissue for the hard- and soft-water residents were 222.3 µg/gram and 206.7 µg/gram, respectively (the difference being statistically significant at the 0.01 level). Another way of interpreting these results would suggest that although suffering heart disease per se may lead to a reduction in magnesium levels in the heart, drinking from a soft water supply that is lower in magnesium also seems to lead to a reduction in the heart’s magnesium content.
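To put the size of that difference in perspective (our arithmetic, not the study’s):

$$\frac{222.3 - 206.7}{206.7} \approx 7.5\%$$

so the hard-water residents carried roughly 7-8% more magnesium in their heart tissue than the soft-water residents.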
After calcium, sodium and potassium, magnesium is the fourth most commonly found mineral in the human body. Of the 25 grams of magnesium present in an average 70-kilogram human (155 pounds), you will find half of it in your bones, around a quarter in your muscle tissue and the rest distributed among soft tissue and blood. A lack of magnesium is to be taken seriously, as it contributes not only to heart problems but to numerous health conditions, the mineral being necessary for the efficient biochemical functioning of numerous metabolic pathways.
Higher sodium levels in residents’ tap water were also found to play a part in lowering the rates of cardiovascular, kidney and ischemic heart disease mortality.
The most common minerals found in spring water are calcium, sodium and magnesium. Older man-made water systems used copper for plumbing. Copper is important for energy production and it has antimicrobial properties, but it can also corrode easily and contribute to excess copper.
Historians nowadays think that lead poisoning from lead water pipes played a major role in the downfall of the Roman Empire.
Water is by far a more bioavailable source of minerals because they are dissolved and charged in the liquid, rather than bound to food particles. Different foods also have ingredients and compounds, such as phytates, fiber or phytonutrients, that will decrease the amount of minerals you will absorb.
History of Spa and Mineral Water Therapy
The Romans began ‘taking the waters’ in spa towns over 2,000 years ago, and ‘the right kind of water’ has been reputed to cure all sorts of ailments, whether you are bathing in it or drinking copious amounts. Galen, the Roman surgeon, promoted the effects of mineral water on various diseases. Romans also built spas across Europe in newly conquered lands to treat wounded soldiers and recover from physical exertion. During the Renaissance, Italian doctors began to associate the health benefits of spas with waters high in minerals. By the 17th century, spa treatment resorts and centers had been built across many regions of Europe, such as France, England, Italy, Germany, Austria and Eastern Europe.
Epsom’s salty springs possessed highly concentrated, mineral-rich waters, and that intensity of taste was due to the local rocks being richly endowed with magnesium and sulfate. While supplies lasted, these sometimes cloudy waters were used for bathing as well as consumption, sometimes in quantities of ‘several pints after another’ according to anecdotal tales.
Paracelsus believed all diseases arise from the poisoning of a combustible element (sulphur), a fluid element (mercury) or a solid element (salt). According to him, these three substances are at the root of all physiological processes in the body – metabolism (sulphur), genetics (mercury) and enzymatic reactions (salt/minerals). Salt controls the body’s liquids so that materials can be moved around. If there’s excessive mineralization, arthritis and kidney stone formation occur. Paracelsus figured out that some compounds that are poisonous in large amounts can be beneficial in smaller quantities, laying the foundation for the future science of hormesis.
NASA scientists deduced that as water pressure increases with depth, the legs and abdomen are lightly compressed, expelling blood and some interstitial fluid, the thin liquid layer which surrounds the body’s cells. The fluid is pushed up into blood vessels in the chest, producing a marked increase in available central blood volume, which leads to an increase of about 50% in cardiac output. There being no associated rise in blood pressure, the resistance of the rest of the body’s veins and arteries has to go down – a factor that could be of great significance when treating conditions of poor blood circulation or early stages of paralysis.
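The inference about falling vascular resistance follows from the standard hemodynamic relationship (our addition, not spelled out in the book):

$$\text{MAP} \approx \text{CO} \times \text{SVR}$$

where MAP is mean arterial pressure, CO is cardiac output and SVR is systemic vascular resistance. If CO rises by 50% while MAP stays constant, SVR must fall to about $1/1.5 \approx 67\%$ of its previous value, i.e. a roughly one-third drop in vascular resistance.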
Why Drink Water with Minerals?
There are many benefits to drinking water with minerals compared to regular plain water:
Improved Digestion and Gut Health – Stomach acid and digestion require salt and other minerals. Gastric juice is composed of hydrochloric acid, potassium and sodium chloride, which help to break down food and assimilate the nutrients. Other stomach cells produce bicarbonate that buffers against the acidity and regulates the pH.
May Help Chronic Diseases – A higher intake of calcium and magnesium, aligned with the RDA, may be protective against many chronic diseases, such as osteoporosis, hypertension, sudden cardiac death and cardiovascular disease.
Improve Lung and Respiratory Conditions – Aerosol therapy using “Dovolenskaya” mineral water improves bronchial drainage and inflammation in patients with occupational bronchopulmonary diseases. Inhaling aerosolized sulphurous mineral water has been shown to improve inflammation in 65% of studied subjects suffering from chronic inflammation of the upper respiratory airway. A meta-analysis of 13 clinical studies utilizing thermal water inhalations and irrigations concluded, “Thermal water applications with radon or sulphur can be recommended as additional nonpharmacological treatment in upper airway diseases.”
Skin Health – Natural mineral water intake improves skin hydration and reduces dryness. Purified thermal water can reduce skin irritation and transepidermal water loss. Spraying thermal spring water on the face after dermatological surgery, laser therapy or chemical peelings reduces local inflammatory symptoms and adverse effects associated with the procedure.
Rheumatoid Arthritis and Osteoarthritis – Minerals, especially magnesium and calcium, have a crucial role in bone health and density. Your bones are made of calcium and need minerals to maintain their integrity. Skeletal bone also acts as a sodium-rich reservoir that can be depleted during sodium deficiency, which has adverse effects on bone quality and fracture risk.
Maintain Kidney Health – The body’s fluid balance and electrolyte status are governed by the kidneys. Electrolyte imbalances are commonly seen in kidney disease. Dehydration can cause elevated levels of blood sodium, which is often seen in diseases associated with kidney problems like diabetes. Edema, which is characteristic to nephrotic syndrome, can cause low sodium levels in the blood due to fluid retention.
Cognition and Neurological Health – Drinking rosemary-infused water can improve cognition and cerebrovascular health. Mineral water consumption can increase the excretion of silicic acid and aluminum in Alzheimer’s disease.
Exercise Performance – Electrolytes, especially sodium and magnesium, are important for physical activity and muscle contraction. Sodium generates the action potentials of nerve and muscle cells by entering the cell. Physical exertion and exercise burn through minerals and electrolytes while increasing the demand for them. In professional sports, vitamin and mineral supplementation can provide a competitive advantage. Athletes should be well-hydrated with fluids and electrolytes before and after exercise for optimal results.
A blood sodium level of less than 135 mEq/L is considered low. This is called hyponatremia, which can be characterized by headaches, cramps, arrhythmia, and an increased risk of seizures. In certain cases, it is not a lack of sodium that is causing the problems but excess water in relation to the amount of sodium, which dilutes the sodium concentration. That is why overhydration can easily cause hyponatremia, which can lead to brain swelling and death.
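A rough dilution example (illustrative numbers, ours, ignoring renal compensation): serum sodium tracks the ratio of exchangeable sodium to total body water. If total body water rises from 40 L to 45 L while total sodium stays fixed, then

$$\frac{140\ \text{mEq/L} \times 40\ \text{L}}{45\ \text{L}} \approx 124\ \text{mEq/L}$$

so roughly 5 L of excess plain water could, in principle, push a normal sodium level of 140 mEq/L well below the 135 mEq/L hyponatremia threshold.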
Here are the most common causes of low sodium levels:
Hypernatremia refers to a blood sodium level above 145 mEq/L. This most commonly happens because of water restriction that elevates the ratio of sodium to water. That is why hypernatremia is seen in diseases like diabetes and with excessive use of loop diuretics.
Here are the most common causes of high sodium levels:
The body regulates and reduces the urinary excretion of minerals when you are very deficient, as a means of slowing down further depletion. The absorption of minerals is also lowered when you reach excessive amounts of them in the body, to prevent overload and toxicity. However, if you have a poor diet, metabolic syndrome, damaged kidneys or bad digestion, then your ability to absorb or reabsorb minerals is greatly diminished. There are additional things that can make you hold onto fewer minerals, like excess consumption of plain water without minerals or fasting (due to metabolic acidosis and mineral losses from bone). Bicarbonate has been shown to reduce acute acidosis and improve exercise performance.
Drink Mineral Spring Water: But Check the Label
Among commercial bottled waters, there is large variation in mineral content – magnesium ranges from 0-126 mg/L, sodium from 0-1,200 mg/L and calcium from 0-546 mg/L. Generally, European bottled waters have a higher mineral content than North American ones. In North America, the average concentration of magnesium is 2.5 mg/L, sodium 5 mg/L and calcium 8 mg/L. In Europe, it is 23.5 mg/L for magnesium, 20 mg/L for sodium and 115 mg/L for calcium.
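To gauge what these averages mean in practice (our arithmetic): drinking 2 L per day of an average European bottled water supplies about

$$2\ \text{L} \times 23.5\ \text{mg/L} = 47\ \text{mg}$$

of magnesium, roughly 11-13% of the 350-420 mg RDA, while the North American average (2.5 mg/L) supplies only about 5 mg. A water at the top of the range (126 mg/L) would provide 252 mg per 2 L, well over half the RDA.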
There are some American brands like Adobe Springs and Mendocino that have magnesium levels comparable to European sparkling waters. According to the International Bottled Water Association, variation in minerals between individual bottles of the same brand is less than 5%.
Sulfates are the 8th most common mineral in our bodies, with benefits for the antioxidant system. Getting the right amount of sulfate can help to support the joints, the nervous system and the cardiovascular system, and reduce inflammation. Unfortunately, nowadays our food contains less sulfur and fewer sulfates because of soil depletion. However, sulfate is also acidic, and if you are looking to reduce metabolic acidosis or improve athletic performance, you need to be careful not to overconsume waters that are high in sulfate and low in bicarbonate.
Sulfites are inorganic salts used for food preservation and in medications. You get sulfites mostly from processed foods, wine and processed meat. They can also inhibit the browning of fruit and vegetables. Some people, especially asthmatics, are overly sensitive to sulfites, which can cause gastrointestinal, cardiovascular, pulmonary and dermatological problems. Nausea, abdominal cramps, diarrhea, and urticaria are commonly reported. Although sulfites are carcinogenic in laboratory animals, the FDA considers them generally recognized as safe…
Carbon dioxide in the blood joins with water to create carbonic acid. This lowers the blood pH, and the nervous system responds by increasing your breathing rate, which is termed “respiratory compensation”. As CO2 is exhaled, the pH normalizes. Thus, CO2 is important for keeping your metabolic rate elevated and active.
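The underlying chemistry is the standard bicarbonate buffer equilibrium (added here for clarity):

$$\text{CO}_2 + \text{H}_2\text{O} \rightleftharpoons \text{H}_2\text{CO}_3 \rightleftharpoons \text{H}^+ + \text{HCO}_3^-$$

Rising CO2 pushes the equilibrium to the right, releasing H+ and lowering pH; exhaling CO2 pulls it back to the left, which is why a faster breathing rate can compensate for blood acidity.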
When drinking bottled mineral water, it is better to have it in glass bottles to avoid the plastics and heavy metals in cans and plastic bottles. Most plastics leach estrogen-mimicking agents called ‘xenoestrogens’.
Here are the benefits of mineral waters:
In essence, our bodies are walking bags of salt water that contain various electrolytes and minerals. Electrolytes are ancient components of optimal cellular functioning, and mineral waters have been used for centuries to improve health and treat certain conditions. The most frequent benefits of drinking mineral waters are better blood pressure and hydration. Spa therapy and thermal baths have more evidence for improving physical pains such as lower back pain, osteoarthritis and fibromyalgia. It requires almost no effort to implement mineral-rich waters into your diet. Theoretically, you could get all the minerals you need from food, but as we’ve seen from research, adding mineral water and slowly consuming small amounts throughout the day provides a more bioavailable source of minerals. Thus, it is a prudent strategy for optimal results.
Magnesium and calcium are particularly relevant to cardiovascular health, bone health and blood pressure. It is known that calcium build-up in the arteries contributes to atherosclerosis and heart disease risk. Calcium supplementation has been associated with increased CVD mortality. High calcium intake from supplements can upset the balance with magnesium. Low magnesium is associated with an increased risk of cardiovascular disease, whereas high magnesium status is associated with a lower risk.
Magnesium intake is also inversely associated with the risk of metabolic syndrome and diabetes. Magnesium also antagonizes the inflammation caused by increased intracellular calcium, which is why it is often called ‘nature’s calcium blocker’.
However, increased dietary calcium intake has been found to be linked with a reduced incidence of atherosclerosis. High intakes of calcium from food do not appear to increase vascular calcification either. Whole foods that contain calcium also bring some magnesium, which keeps the two in balance. Vitamin K1, and especially K2, also helps keep calcium in check by directing calcium into the bones instead of the arteries.
Very little calcium from supplements will end up in the bones and high doses of calcium can lead to calcium overload in the blood and arterial calcifications. In older adults, osteoporosis co-exists with higher rates of arterial calcification. Low bone mineral density is linked with vascular calcification. Hypercalcemia (high calcium in the blood) is associated with worse rheumatoid arthritis disease markers. However, vitamin K, magnesium and vitamin D are linked with better bone mineral density.
Magnesium Deficiency and Calcium Overload
Magnesium is the 4th most common cation (positively charged ion) in the human body, the 2nd most common intracellular cation and the most common intracellular divalent cation. It is required for over 300 enzymes in the human body. The main functions of magnesium include regulation of sodium, calcium and potassium levels, ATP generation, modulation of inflammation, DNA/RNA protein synthesis and neurotransmitter production.
Extracellular magnesium, concentrated primarily in the serum and red blood cells, contributes only ~1% of total body magnesium. The human body contains roughly 25 grams of magnesium, with 50-60% of it in bones and the remainder in soft tissue. Of that amount, 90% is bound and 10% is free. In the blood, 32% of magnesium is bound to albumin and 55% is free.
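Working through the authors’ numbers (our arithmetic): of the ~25 grams of total body magnesium, the ~1% that is extracellular amounts to only

$$25\ \text{g} \times 0.01 = 0.25\ \text{g}$$

which is why a serum test samples only a tiny sliver of the body’s magnesium and can look normal even while tissues are depleted.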
Here is how magnesium affects the pathogenesis of cardiovascular disease:
Magnesium supplementation in patients with coronary heart disease has been found to decrease platelet-dependent thrombosis, improve flow-mediated vasodilation and increase VO2 max.
Women with osteoporosis have been found to have lower serum magnesium compared to those with osteopenia and healthy individuals. A short-term study on 20 postmenopausal women saw that 30 days of 290 mg/d of magnesium citrate suppressed bone turnover, indicating reduced bone loss.
Here are the magnesium-dependent enzymes, functions and consequences that may occur with a deficit:
Magnesium-Dependent Enzymes/Proteins | Function | Consequences of Deficit
Some studies indicate that 180-320 mg/d is enough to maintain a positive magnesium balance, but 107 mg/d is not. A study on postmenopausal women showed they were able to maintain a positive magnesium balance by getting 399 mg of magnesium per day on a 2,000 kcal diet, whereas ~100 mg of magnesium per 2,000 kcal was inadequate. However, these figures assume an already homeostatic magnesium status, which most people do not have, as well as the absence of disease states that increase the demand for magnesium.
Postmenopausal women on a low magnesium intake of 100 mg/d develop atrial fibrillation and elevated glucose levels.
Many studies find that at least 300 mg of magnesium needs to be supplemented in addition to the diet to increase serum magnesium levels. Thus, the RDA of 350-420 mg/d may not be adequate to reach that effect.
About 20-40% of dietary magnesium gets absorbed by the body. It is commonly thought that phytate-rich foods like beans, grains and legumes will lead to magnesium deficiency by binding magnesium and preventing its absorption. However, urinary magnesium excretion will decrease to compensate for a reduction in magnesium intake. Consuming 322 mg/d of magnesium on a high-fiber diet has been noted to result in a negative magnesium balance, but that may be due to the inadequate magnesium intake itself.
A vitamin B6 deficiency will increase magnesium excretion and promote a negative magnesium balance. Combining vitamin B6 with magnesium increases its absorption rate and helps to drive magnesium into the cell. Increased protein and fructo-oligosaccharide consumption appear to improve magnesium absorption.
In physiological intracellular concentrations, magnesium competes with calcium for binding with calmodulin and other calcium-binding proteins, which is how magnesium downregulates nuclear factor kappa-beta activation and inflammation.
Magnesium absorption can also be inhibited by antibiotics, such as ciprofloxacin, levofloxacin and demeclocycline, thus, antibiotics should be taken at least 2 hours before or 4-6 hours after magnesium supplementation. Magnesium decreases the absorption of bisphosphonates, such as alendronate, used to treat osteoporosis. Thus, magnesium supplements and bisphosphonates should be separated by at least 2 hours.
Magnesium Excretion and Assessment
Magnesium excretion is mainly regulated by the kidneys, which excrete about 120 mg of magnesium every day through urine. Magnesium excretion increases during a surplus and falls to ~12 mg per day during deficits. When magnesium blood levels are low, magnesium will be pulled from muscles, organs and even bones.
Here are factors that contribute to magnesium deficiency and increase magnesium excretion:
Normal serum magnesium is considered to be 0.7-1.0 mmol/L, but the optimal range has been proposed to be >0.80 mmol/L. About 10-30% of a given population experiences subclinical magnesium deficiency based on serum magnesium levels below 0.80 mmol/L.
Muscle magnesium stores are a more accurate reflection of whole-body magnesium content than plasma; however, muscle biopsy is an invasive procedure.
Here are the signs and symptoms of magnesium deficiency:
An elevated retention of an intravenous or oral magnesium load is likely the best way to test for magnesium deficiency: it suggests that the body is trying to hold onto more magnesium because the tissues are depleted. However, the IV test assumes normal kidney function and the oral test assumes normal gastrointestinal and kidney function; without these, the measurements can be inaccurate. Measuring hair, bone or lymphocyte magnesium, or urinary or fecal magnesium excretion, may be easier and cheaper, but these are less reliable methods. For the most reliable assessment, multiple methods need to be used. The easiest way to identify magnesium deficiency is a low-normal blood level (<0.82 mmol/L), especially if the 24-hour urinary magnesium is also low (<80 mg/day). This is not 100% accurate, but it is highly suggestive of magnesium deficiency.
Healthy, magnesium-sufficient subjects retain just 2-8% of an IV magnesium load. On average, elderly people retain around 28% of an IV magnesium load, suggesting that many older individuals have a marginal magnesium deficiency.
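As an illustration (the load size here is hypothetical, ours): if a 2.4 g IV magnesium load were given, a healthy subject retaining 2-8% would hold onto roughly

$$2{,}400\ \text{mg} \times 0.02 \text{ to } 0.08 = 48\text{-}192\ \text{mg}$$

while an elderly subject retaining 28% would hold onto about 672 mg, a sign that depleted tissues are soaking up the load.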
Supplement overview:
When taking magnesium supplements, it is better to take smaller doses more frequently rather than large doses at once because of improved total absorption.
How to Restore Your Magnesium Levels: A 4 Step Plan
Calcium Overload and Calcification
Calcium is the most abundant mineral in the human body. It is required for muscle contraction, vasodilation and bone health. However, only 1% of total body calcium is needed for carrying out these roles. The remaining 99% is stored in bones and teeth for structural support. The skeleton and bones are a readily available source of calcium to meet this baseline requirement. When calcium intake is low or malabsorbed, the body will pull stored calcium from bones to maintain normal functioning, which can eventually lead to osteoporosis and fractures. Low calcium levels in the body can cause muscle cramps, convulsions, abnormal heartbeat and eventually death.
Calcium homeostasis is regulated by parathyroid hormone, calcitriol and calcitonin. When blood calcium levels drop, the parathyroid glands secrete parathyroid hormone (PTH), which stimulates the conversion of vitamin D in the kidneys into its active form calcitriol. This decreases urinary excretion of calcium but raises urinary excretion of phosphorus. Elevated PTH also promotes bone resorption or breakdown, which releases calcium and phosphorus into the serum from bones. Higher calcitriol concentrations increase intestinal absorption of calcium and phosphorus. As calcium levels normalize, PTH secretion stops and the thyroid gland secretes a peptide hormone called calcitonin. Calcitonin inhibits PTH, reduces bone resorption as well as calcium absorption and promotes urinary calcium excretion.
Osteoporosis occurs when bone resorption is chronically higher than formation, which promotes the risk of fractures.
Hyperparathyroidism is the most common cause of elevated calcium in the blood (hypercalcemia). Patients with kidney disease on high calcium intakes experience low bone turnover and PTH suppression. Parathyroid hormone suppression caused by high calcium intake is also thought to reduce magnesium absorption. Sweden is among the highest dairy- and calcium-consuming countries, yet it also has the highest rate of hip fractures among developed nations. Meta-analyses find that calcium supplementation modestly increases bone mineral density but does not reduce the incidence of fractures. Ultimately, the optimal level of calcium intake will also depend on the background intake of magnesium, vitamin D and vitamin K.
Extracellular calcium is constantly maintained at 1.25 mM by parathyroid hormone and 1,25-dihydroxyvitamin D (calcitriol) activity, except transiently for a 3-4-hour period after supplementation.
Intracellular calcium levels are maintained around 100 nM (nanomolar) and can rise up to 1 mcM (micromolar) upon activation, whereas extracellular calcium sits at 1.2 mM.
When intracellular calcium reaches 1 mcM, it is taken up by the mitochondria through a uniporter and released once levels drop below 1 mcM. Concentrations >1 mcM may inhibit mitochondrial respiration and trigger pro-cell death signals. The mitochondria can accumulate a lot of calcium, exceeding 1000 nmol/mg of mitochondrial protein. This ability is an intrinsic component of ATP production.
Heart mitochondria have a sodium-calcium exchanger in addition to the calcium uniporter. Intracellular calcium is kept at a balance by the influx and efflux of calcium by voltage-operated channels or agonist binding (glutamate, ATP, acetylcholine) via receptor-operated channels. Calcium can also be released from internal stores such as the endoplasmic reticulum.
Increased calcium intake decreases the gastrointestinal absorption of lead and can prevent it from being mobilized from bone during bone demineralization. Average dietary calcium intake of 900 mg/day and supplementation of 1,200 mg/d during pregnancy can reduce maternal lead concentrations by 8-14%. Similar results were found in the blood and breast milk.
Most calcium supplements contain some lead as they are typically sourced from oysters, shellfish, bone meal or dolomite. The FDA’s provisional total tolerable intake (PTTI) level for lead is set at 7.5 mcg/1,000 mg of elemental calcium. A review of 324 multivitamin/mineral supplements found lead exposure ranges from 1% to 4% of the PTTI.
Dairy, in particular, promotes insulin-like growth factor (IGF-1), which regulates cell proliferation and growth. Circulating IGF-1 levels are positively correlated with cancer risk, especially prostate cancer. Prostate cancer risk has been observed to be the highest in people who consume the most dairy and animal protein. However, it has also been found that individuals with a higher dairy/calcium consumption are more likely to engage in healthy lifestyle practices like exercise or seeking of medical help, which might mitigate the association with increased prostate cancer risk.
A recent meta-analysis estimated that a high calcium intake (1,300 mg/d) increases fat oxidation by 11% compared to a low intake of 488 mg/d. Dietary calcium may also bind to fat from food in the digestive tract and prevent its absorption. Vitamin D sufficiency could also help with lipolysis as vitamin D deficiency is associated with obesity.
It appears to be dairy foods that are associated with the greatest fat loss seen from increased calcium intake. Calcium may also regulate appetite and hunger levels, making the person eat fewer calories, and whole foods are superior to supplements.
It is thought that calcium supplements override the homeostatic regulation of serum calcium, causing hypercalcemia. Hypercalcemia promotes blood coagulation, vascular calcification and arterial stiffness, which all elevate the risk of CVD.
Phosphate has a role in the accumulation of calcium by activating the calcium uniporter and inhibiting calcium efflux. One of the contributing factors for arterial calcification is high serum phosphate, which has been shown to promote calcification in animal models and cell studies. In humans, high serum phosphate is associated with increased cardiovascular disease events. For every 1 mg/dL increase above normal in phosphorus levels, the risk of coronary artery calcification has been seen to increase by 21%. A low calcium to phosphorus ratio in diet has adverse health effects, starting with arterial calcification and ending with bone loss.
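As a rough sketch of how the 21%-per-mg/dL figure scales, here is a small Python example; the multiplicative compounding across multiple mg/dL is our simplifying assumption, not the study's reported model, and the 4.5 mg/dL upper-normal cutoff comes from the serum phosphate range given later in this chapter:

```python
def cac_relative_risk(serum_p_mg_dl: float, upper_normal: float = 4.5) -> float:
    """Relative risk of coronary artery calcification, assuming the 21%
    increase per 1 mg/dL above normal compounds multiplicatively
    (a simplifying assumption for illustration only)."""
    excess = max(0.0, serum_p_mg_dl - upper_normal)
    return 1.21 ** excess

print(cac_relative_risk(5.5))  # 1 mg/dL above normal -> ~1.21x risk
print(cac_relative_risk(6.5))  # 2 mg/dL above -> ~1.46x under this assumption
```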
Excessive supplemental calcium can contribute to calcium overload, increasing urinary calcium excretion and possibly soft tissue calcification.
Excess calcium converts smooth muscle cells into bone-forming cells or osteoblasts that promote arterial calcification. High calcium intake also suppresses kidney synthesis of calcitriol, which raises cardiovascular risk.
Calcium supplementation during pregnancy may reduce the risk of preeclampsia, in which the pregnant woman develops hypertension and proteinuria after 20 weeks’ gestation. Supplementing 1,000 mg/d of calcium may reduce the incidence of preeclampsia by 55%. However, it may only be effective in women with a low calcium intake (~314 mg/d) but not in those with more sufficient intakes (1,100 mg/d). There is an inverse relationship between calcium intake during pregnancy and the incidence of preeclampsia.
Intracellular calcium homeostasis is regulated by the mitochondria’s tightly regulated calcium transport system. The mitochondria also produce ATP, which generates reactive oxygen species as an inevitable by-product. In healthy subjects, this basal oxidative stress should be dealt with by the body’s endogenous antioxidant systems. However, during pathophysiological states the balance is offset, causing calcium dysregulation, oxidative stress and eventually cell death. In vitro studies find that calcium in the brain inhibits mitochondrial respiration at Complex I and thus reduces mitochondria-generated reactive oxygen species in a dose-dependent manner. Calcium can also inhibit free radical production in the presence of ATP and magnesium.
Calcium signaling also regulates autophagy, which is the recycling of cellular material that is beneficial in moderation but can also lead to cell death when in excess.
Activation of the NMDA-selective glutamate receptors causes a massive influx of calcium into neurons, leading to their death. This is the result of excess mitochondrial superoxide production that damages organelles. Beta-amyloid proteins involved in Alzheimer’s disease cause sporadic intracellular calcium signaling and calcium influx that results in neuronal death.
Here are the calcium-dependent enzymes, functions and consequences that may occur with a deficit:
Calcium-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Calcium Food Sources
The current RDA for calcium in women is 1,200 mg/d and for men 1,000 mg/d. The tolerable upper limit is 3,000 mg/d for 9-18-year-olds, 2,500 mg/d for adults 19-50 years old and 2,000 mg/d for 51+ years of age.
Healthy non-growing adults require 550-1,200 mg of calcium a day to maintain balance. For growing boys, the minimal intake to achieve maximal retention is 1,140 mg/d, and for growing girls it is 1,300 mg/d. Intakes greater than 1,400 mg/d achieve a positive calcium balance in people with normal renal function as well as in those with end-stage renal disease.
An intake of 2,000 mg/d has demonstrated a positive calcium balance of 450 mg/d in normal subjects and 750 mg/d in patients with mild chronic kidney disease (CKD). Thus, calcium retention is higher in people with renal dysfunction and/or kidney disease. As a result, they may also need to aim for the lower end of the RDA and be more cautious with calcium supplementation to prevent soft tissue calcification.
Females, especially adolescent girls, are generally less likely to get adequate amounts of calcium from food. Dairy contributes 75% of the calcium intake in the American diet. It is common for children to replace milk with sugary soft drinks during the most critical period of their peak bone mass development. Replacing milk with carbonated soft drinks high in phosphorus has been associated with greater bone mineral density loss and fracture risk. Soft drinks that contain phosphoric acid have been a major source of phosphorus over the past quarter century.
Calcium from food is not correlated with increased CVD risk, whereas supplemental calcium is. Among calcium supplement users, taking more than 1,400 mg/d of calcium total (from diet and supplements) is associated with higher all-cause mortality, including cardiovascular disease. Dietary calcium intake and the association with cardiovascular mortality follows a U-shaped pattern with intakes < 800 mg/d or > 1,200 mg/d sharply associated with an increased risk of cardiovascular mortality.
Supplemental calcium >1,000 mg/d with or without vitamin D increases the risk of myocardial infarction slightly. The same applies to kidney stones – calcium from food decreases the risk while calcium supplementation increases it. It is thought that acute large boluses of calcium from supplements elevate serum calcium in a way that leads to transient vascular calcification.
Eating calcium-rich foods should not contribute to cardiovascular disease, unless magnesium deficiency occurs, or high-dose calcium supplements are taken.
The recommended ratio of calcium to magnesium is ~2:1 both for daily dietary intake and supplementation. In Western countries, the ratio of calcium to magnesium is much higher than it is in China (3 vs 1.7).
A recent meta-analysis of prospective studies in Shanghai discovered that in people with a Ca/Mg ratio below 1.7, increased magnesium intake was associated with higher mortality, whereas in those with a ratio above 1.7, magnesium reduced mortality. Thus, high supplemental magnesium may not be advisable if calcium intake is low. Instead, it is better to aim for 500-600 mg/d of magnesium and 1,000-1,200 mg of calcium per day, which provides an optimal calcium to magnesium ratio of ~ 2:1. When calcium intake is 10 mg/kg/d, magnesium balance begins to decrease.
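A quick sketch of the ratio check described above, using the intakes the text suggests (function name is ours):

```python
def ca_mg_ratio(calcium_mg: float, magnesium_mg: float) -> float:
    """Daily dietary calcium-to-magnesium ratio."""
    return calcium_mg / magnesium_mg

# Intakes suggested in the text: 1,000-1,200 mg/d calcium, 500-600 mg/d magnesium.
ratio = ca_mg_ratio(1100, 550)  # -> 2.0, the ~2:1 target
# Per the Shanghai meta-analysis cited above, extra magnesium appeared
# harmful below a Ca/Mg ratio of 1.7 and helpful above it.
print(f"Ca/Mg ratio: {ratio:.1f}")
```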
The rate of gastrointestinal calcium absorption is about 10-30% for adults and as high as 60% for growing children. Antacids increase urinary calcium excretion and glucocorticoids promote calcium depletion.
Urinary calcium excretion is limited to 4 mg/kg of bodyweight per day. For a person of average weight, that equates to a limit of about 280-350 mg per day. Because of that, the kidneys are not able to resolve hypercalcemia during excess calcium intake. In the elderly and people with renal dysfunction, the arteries become a calcium sink because the kidneys are not able to excrete the extra calcium. Hypercalcemia also increases the risk of developing kidney stones.
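The 280-350 mg/day figure follows directly from the 4 mg/kg limit; a minimal sketch, assuming body weights of 70-87.5 kg:

```python
def urinary_ca_ceiling_mg(weight_kg: float, limit_mg_per_kg: float = 4.0) -> float:
    """Maximum daily urinary calcium excretion (~4 mg/kg/day per the text)."""
    return limit_mg_per_kg * weight_kg

# The 280-350 mg/day range quoted above corresponds to 70-87.5 kg body weight.
for kg in (70, 87.5):
    print(kg, "kg ->", urinary_ca_ceiling_mg(kg), "mg/day")
```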
Here are things that reduce calcium absorption and increase calcium demand:
Calcium itself decreases the absorption of some drugs, such as bisphosphonates (used for osteoporosis), fluoroquinolones and tetracycline antibiotics, levothyroxine, phenytoin (an anticonvulsant), and tiludronate disodium (for Paget’s disease). Consuming 600 mg of calcium with a meal cuts the absorption of zinc in half from that meal. Taking calcium supplements together with thiazide diuretics increases the risk of hypercalcemia because of increased calcium reabsorption by the kidneys. It can also promote abnormal heartbeat in people taking digoxin for heart failure.
Calcium supplementation is not recommended for those who do not need it, as dietary calcium comes in safer amounts and with other nutrients. However, it might be necessary for those who have difficulties consuming calcium-rich foods, such as people who are lactose intolerant, or for postmenopausal women. To avoid calcium overload and increased urinary excretion, it is better to take no more than 500 mg per meal and not exceed 1,500 mg/d. Supplementing calcium (600-2,000 mg/d for 2-5 years) has been associated with gastrointestinal disturbance, constipation, cramping, bloating and diarrhea. The most cost-effective calcium supplement is calcium carbonate. Calcium citrate is best for those who have low stomach acid or are on histamine-2 blockers or proton pump inhibitors.
Phosphorus and Health
Phosphorus is another essential mineral component of bones and teeth as well as DNA and RNA. Calcium and phosphorus make up hydroxyapatite, which is the main structural component of bones, the collagen matrix and tooth enamel. In humans, phosphorus makes up ~1-1.4% of fat-free mass (approximately 700 grams). Out of this amount, 85% is located in bones and teeth as hydroxyapatite, 14% exists in soft tissue and 1% is present extracellularly.
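The compartment arithmetic above can be made explicit with a short sketch, using the text's ~700 g total and 85/14/1% split:

```python
# Distribution of the body's ~700 g of phosphorus, per the text.
TOTAL_P_G = 700
compartments = {
    "bones/teeth (hydroxyapatite)": 0.85,
    "soft tissue": 0.14,
    "extracellular": 0.01,
}
for name, frac in compartments.items():
    print(f"{name}: ~{TOTAL_P_G * frac:.0f} g")
# -> ~595 g, ~98 g, ~7 g
```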
Inorganic phosphorus as phosphate (PO₄³⁻) is required for all known life forms. Phosphorus is a part of cell membranes, ATP and phospholipids. Phosphorylation is the addition of a phosphoryl group to an organic molecule, which affects many proteins and sugars in the body. The phosphorylation of carbohydrates is involved with many steps in glycolysis. Glucose phosphorylation is needed for insulin-dependent mTOR activity in the heart, regulating cardiac growth. Thus, without phosphorus, or phosphorylation, glycogen resynthesis and glycolysis do not work as well. Phosphorylation also affects amino acids and proteins like histidine, lysine, arginine, aspartic acid and glutamic acid.
Phosphorus deficiency can cause anorexia, muscle wasting, rickets, hyperparathyroidism, kidney tubule defects, and diabetic ketoacidosis. Genetic phosphorus disorders include X-linked hypophosphatemic rickets, which promotes rickets, osteomalacia, pseudofractures, dental damage and enthesopathy (mineralization of ligaments and tendons). Hypophosphatemic rickets may also be accompanied with hypercalciuria.
In early starvation, the body will shift from using carbs to using fat and protein for fuel, which can decrease metabolic rate by up to 20-25%. During refeeding, the increase in blood glucose increases insulin and decreases glucagon, which stimulates synthesis of glycogen, fat, and protein. This process requires phosphate, magnesium, and vitamin B1. Insulin stimulates the absorption of potassium, magnesium, and phosphate into the cells via the sodium-potassium ATPase pump, which also shuttles glucose into cells. This process decreases serum levels of these minerals because the body is already depleted of them.
Normal serum phosphate in adults is 2.5-4.5 mg/dL or 0.81-1.45 mmol/L. Hypophosphatemia is defined as a phosphate level below the lower end of the normal range, and hyperphosphatemia as a level above the high end. However, serum or plasma phosphate does not necessarily reflect whole-body phosphorus content.
Here are the phosphorus-dependent enzymes, functions and consequences that may occur with a deficit:
Phosphorus-Dependent Enzymes/Proteins: Function: Consequences of Deficit
The RDA for phosphorus is 700 mg for adults, 1,250 mg for teens 9-18 years of age, 500 mg for 1-8-year-olds and < 275 mg for infants. Tolerable upper intake levels (ULs) are set at 3,000-4,000 mg/d. The Estimated Average Requirement (EAR) for basal needs is deemed to be 580 mg/d for ages above 19. The RDA for phosphorus is being greatly exceeded, and actual intakes may be even higher than currently estimated. NHANES data from 2015-2016 show that average intakes are 1,189 mg/d for women, 1,596 mg/d for men, and 1,237 mg/d for children and teens. In 2013-2014, NHANES showed the average phosphorus intake from both food and supplements was 1,744 mg/d for men and 1,301 mg/d for women. However, these figures may not be accurate, as the surveys don’t take into account the phosphate additives in many foods.
High intake of phosphorus can contribute to calcium overload and magnesium deficiency. Cell studies find excess phosphorus can cause vascular calcification and endothelial dysfunction. A high intake of phosphorus (>1,400 mg/d) in relation to calcium keeps parathyroid hormone elevated constantly, which decreases bone mineral density. Over the long term, it can lead to an increased risk of skeletal fractures and cardiovascular disease mortality.
However, if calcium intake is adequate, a high intake of phosphorus doesn’t seem to have a negative effect on bone mineral content. Excess phosphorus, however, may promote the formation of calcium oxalate kidney stones.
The recommended ratio of Ca/P is 1-2:1. In the U.S., that ratio was about 1.2:1 between 1932-1939 but falls to roughly 1:4 in those who have replaced milk with soda. Increased phosphoric acid consumption from soft drinks may have a negative effect on bone health. Diets with a low Ca/P ratio (≤0.5) lead to increased parathyroid hormone and urinary calcium excretion.
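A small sketch of the Ca/P ratio check, with hypothetical intakes chosen to bracket the two scenarios mentioned above:

```python
def ca_p_ratio(calcium_mg: float, phosphorus_mg: float) -> float:
    """Daily dietary calcium-to-phosphorus ratio."""
    return calcium_mg / phosphorus_mg

# Hypothetical intakes for illustration:
milk_heavy = ca_p_ratio(1200, 1000)  # ~1.2, near the 1930s U.S. average
soda_heavy = ca_p_ratio(400, 1600)   # ~0.25, i.e. roughly 1:4
for r in (milk_heavy, soda_heavy):
    flag = "elevated PTH and urinary Ca loss likely" if r <= 0.5 else "within the 1-2:1 target"
    print(f"Ca/P = {r:.2f} -> {flag}")
```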
High dietary calcium and phosphate intake inhibit the absorption of magnesium. People eating refined low magnesium diets also tend to get a lot of phosphorus from those same foods. Since the early 1900s, the P/Mg ratio has increased from 1.2:1 to 7:1. Dairy, especially cheese has a high phosphorus to magnesium ratio. For example, cheddar cheese has a P/Mg ratio of 18 and Ca/Mg ratio of 26. Pumpkin seeds, on the other hand, have a P/Mg ratio of 0.35 and a Ca/Mg ratio of 0.21.
Phosphorus is found in many different types of foods, such as dairy, meats, fish, eggs, legumes, vegetables, grains and seeds. Infants fed cow-milk formulas have a 3x higher phosphorus intake and higher serum phosphate levels compared to those fed human milk. These levels seem to normalize after 6 weeks, but infants less than 1 month old may be at risk of hypocalcemia when fed exclusively high-phosphorus formulas.
Phosphate additives like phosphoric acid, sodium phosphate and sodium polyphosphate are added to processed foods for preservation, and they have been shown to reduce bone mineral density in humans. These additives can add 300-1,000 mg of phosphorus to the total daily intake, averaging 67 mg of phosphorus per serving. That is about 10-50% of the phosphorus intake in Western countries.
Absorption of phosphorus from whole foods is anywhere from 40-70% with animal foods being more bioavailable than plants. The absorption rate of phosphate additives added to foods is anywhere from 70-100%. Unenriched meat and cottage cheese are better choices than hard cheeses or sausages that tend to have added phosphate additives. In seeds and grains, phosphorus is stored as phytic acid. Because humans lack the phytase enzyme, phytic acid is not absorbed and it actually inhibits the absorption of other nutrients.
Calcium from foods and supplements may bind to phosphorus and reduce its absorption. A high intake of calcium (2,500 mg/d) can bind 0.61–1.05 grams of phosphorus. Replacing high phosphorus foods like animal protein and phosphate additives with more calcium foods can reduce serum phosphate levels.
Phosphorus homeostasis is regulated by the kidneys, bones and intestines. In kidney failure, the body cannot excrete phosphate efficiently and its levels rise. Phosphorus resorption by the kidneys also increases. Patients with moderate kidney dysfunction have higher serum phosphate levels (4.12 mg/dL) compared to those with normal kidney function (3.74 mg/dL). Higher phosphorus retention promotes chronic kidney disease (CKD) and bone disorders. Both high and low bone turnover are common in CKD.
In CKD patients, the benefit of controlling phosphorus levels by restricting phosphorus-containing foods, which are often high in protein, may not outweigh the harm of protein restriction and may lead to higher mortality. Instead, choosing protein foods with a higher calcium content, like cottage cheese or whey protein, might be a better approach.
Antacids can bind to phosphorus and over the long-term lead to hypophosphatemia. If the antacids contain calcium carbonate, they can also decrease the intestinal absorption of phosphorus. Laxatives that contain sodium phosphate can increase serum phosphate levels. When taken at high doses, laxatives that contain sodium phosphate may even lead to death, especially in those with kidney failure, heart disease or dehydration.
Conclusion on Magnesium, Calcium, and Phosphorus
Copper is a component of numerous enzymes in the body including cytochrome c oxidase in the electron transport chain. Copper helps with electron transport, energy production and oxygen transportation. It is also a very versatile catalyst for oxidation/reduction (redox) reactions.
As the Earth went from being anaerobic to aerobic due to the formation of oxygen from photosynthetic blue-green algae, the increase in atmospheric oxygen forced organisms to figure out a way to handle the highly reactive oxygen species such as hydroxyl radicals, peroxynitrite, superoxide anions and hydrogen peroxide. Thus, enzymes evolved like superoxide dismutase, which require copper, zinc and manganese in humans. In plants, copper is also needed for growth.
The human body contains roughly 1.4-2.1 mg of copper per kilogram of bodyweight. Thus, an average adult who weighs 70-90 kg has only about 1/8th to 1/10th of a single gram of copper.
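A quick check of this arithmetic (note that the upper end of the mg/kg range works out closer to 1/5th of a gram, so the text's fraction matches the lower end of the range):

```python
def total_body_copper_mg(weight_kg: float, mg_per_kg: float) -> float:
    """Total body copper from the per-kilogram concentration quoted above."""
    return weight_kg * mg_per_kg

# 1.4-2.1 mg/kg over 70-90 kg gives roughly 98-189 mg;
# "1/8th to 1/10th of a gram" (100-125 mg) sits at the lower end.
print(total_body_copper_mg(70, 1.4), total_body_copper_mg(90, 2.1))
```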
Low copper intake jeopardizes the immune response, which is not restored even after several weeks of high copper intake. A large part of the population is not meeting the RDA of 0.9 mg/day, not to mention the optimal intake for copper, which sits at around 2.6 mg/day.
The History and Ancient Utility of Copper
Copper vessels have been used for cooking food, which can be beneficial or harmful, depending on the dose. Excess copper can promote reactive oxygen species formation, whereas deficient copper promotes anemia, neutropenia and ischemic heart disease. However, even 10 mg per day of copper has been found to be safe for most individuals. Copper excess is typically due to a loss in the control of copper utilization in the body and not due to an excessive intake.
The antiseptic properties of copper are known to kill off bacteria and viruses like E. coli, Candida utilis, and poliovirus, thus improving wound healing and reducing the risk of sepsis. The Egyptians, Aztecs, and Romans used copper mixtures for many different wound healing and pathogen destroying purposes. Even China supposedly used copper coins, due to their antiseptic properties. Also, France used it to treat epileptic hysteria, anemia, neuralgia, hypochondria and nervous paralysis of various sorts, such as hemiplegia, paraplegia, amaurosis, hysteria with amenorrhea, beginning paralysis, complete hysteric paralysis with amenorrhea, neurosis of a hysteric nature and hysterical shaking.
The Modern Utilization of Copper
During the cholera epidemics, copper industry workers in France were protected from dying. The 16 copper workers reported to have died had been out of work in the industry at the time. Workers in non-copper manufacturing were 10-40 times more likely to die.
In regions of India and Nepal, copper was used for treating a variety of disorders such as venereal diseases, fevers, diarrhea, skin diseases (such as ringworm, eczema and leprosy), colic, hemorrhoids, indigestion, spleen, liver and blood diseases and wound healing of ulcers and sores.
The ancient Persians also used copper for treating boils, eye diseases and “yellow bile”.
In Switzerland, a physician named Köchlin used copper to treat syphilis, rickets, tuberculosis, cholera, boils, skin eruptions, chorea, ulcers, epilepsy, hypochondria, hysteria, melancholy and other afflictions. Köchlin also noted that copper improved male fertility.
Rademacher noted that copper cured neuralgias of the head, apoplexy, paralysis, angina (chest pain) and scarlatina (scarlet fever), pleurisy, dropsy, hematuria, acute and chronic rheumatism and asthma. Rademacher also used copper to kill and expel intestinal worms and utilized copper for treating numerous skin diseases such as eczema, itching, herpes and removal of warts.
Research nowadays reports that different copper chelates have anti-convulsant activity in animal models of epilepsy.
In 1930, de Nabias told the French Cancer Association that intramuscular injections of a copper preparation helped to expel tumor tissue. He also noticed that during the repair, necrotic cells were replaced by healthy ones. In 1959, Voisin saw adequate results in mice with adenocarcinoma with intravenous injections of colloidal copper. The growth of adenocarcinoma was inhibited in all but two cases with daily injections.
Kobert, in 1895, published a review paper covering the benefits of copper compounds for treating acute and chronic diarrhea, dysentery, cholera, adenitis, eczema, impetigo, tuberculosis, lupus, syphilis, anemias, chorea (a neurological disorder hallmarked by jerky involuntary movements) and facial neuralgia. Heilmeyer and Stüwe, in 1938, were the first to suggest that copper levels increase as a defense mechanism to combat diseases such as scarlet fever, diphtheria, tuberculosis, arthritis, malignant tumors, and lymphogranulomas. Fenz, in 1941, noted that copper was effective for patients with polyarthritis, acute and chronic rheumatoid arthritis and ankylosing polyarthritis.
With the discovery of corticosteroids, which are now common day medications to treat inflammation, the interest in copper compounds for treating rheumatic and arthritic conditions died off.
Copper Deficiency Anemia: The Copper/Iron Connection
Anemia is defined as a decrease in the oxygen-carrying capacity of the blood. Anemia is suggested when hemoglobin falls below the normal range of around 13.8-17.2 g/dL in adult men and 12.1-15.1 g/dL in adult women.
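A minimal sketch of the screening logic implied above, using the text's reference ranges; lab cutoffs vary, so treat these as illustrative:

```python
def suggests_anemia(hemoglobin_g_dl: float, sex: str) -> bool:
    """Flag hemoglobin below the normal ranges quoted in the text
    (13.8-17.2 g/dL for adult men, 12.1-15.1 g/dL for adult women)."""
    lower_normal = {"male": 13.8, "female": 12.1}[sex]
    return hemoglobin_g_dl < lower_normal

print(suggests_anemia(12.9, "male"))    # True: below the male range
print(suggests_anemia(12.9, "female"))  # False: within the female range
```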
As a result, you can experience chronic fatigue, shortness of breath, weakness, pale skin, lightheadedness and other problems. Both too much, as well as too little iron, can make people more susceptible to infections. Some amounts of iron are needed for fighting harmful bacteria; however, excess iron also promotes the growth of these pathogens.
Copper improves the ability of red blood cells to transport oxygen and nutrients, creating a ‘hypernutritive state’. Clinicians nowadays acknowledge that copper sulfate can increase the number of red blood cells. Copper can also protect against the oxidation of red blood cells, making them more resilient.
In the mid-1800s in France, it was common knowledge that chlorosis/anemia could be treated with exposure to copper. The essential role of copper in health was first discovered in 1928 when it was shown that rats fed a copper-deficient diet were unable to produce enough red blood cells.
Although copper was not a major component of hemoglobin, it was considered a catalyst needed for hemoglobin synthesis. Fixing a copper deficiency also improves erythropoietin unresponsiveness (erythropoietin is needed to make red blood cells) in anemic hemodialysis patients. In 1931, Cook and Spilles discovered that copper stimulates the release of stored iron in tissues during the creation of red blood cells. Thus, copper mobilizes iron, helping to shift iron storage into hemoglobin.
Iron and copper are absorbed in the upper small intestine, and hepcidin controls the absorption of iron by inhibiting iron export from the intestinal cells. Hepcidin expression is decreased on a copper-deficient diet, potentially leading to iron deficiency. In other words, a low copper diet can lead to low iron absorption and iron deficiency anemia.
Hephaestin and its homologue ceruloplasmin, the latter being the major copper-carrying protein in blood, are required for efficiently loading iron onto transferrin. Their synthesis and activity are positively correlated with cellular copper levels. Thus, a copper deficiency can lead to a reduction in hepcidin, hephaestin and ceruloplasmin, jeopardizing iron metabolism and restricting iron’s movement out of the gut and liver to areas where it may be needed in the body. Additionally, copper is needed for the synthesis of heme, which is a constituent of hemoglobin. Thus, in order to get iron where it needs to go in the body, it must be placed onto transferrin, which requires copper.
In 1966, Osaki and colleagues suggested to rename ceruloplasmin ‘serum ferroxidase’ because of its importance in iron metabolism. They showed how ceruloplasmin, which is the main copper-carrying protein in our blood, allowed iron to get into bone marrow by promoting iron transfer onto transferrin. Thus, numerous iron proteins need ceruloplasmin to work.
The potential consequences of copper deficiency:
Copper deficient diets can reduce cytochrome c oxidase activity and heme synthesis from iron and protoporphyrin. This reduction in cytochrome c oxidase limits the reduction of Fe3+ iron to Fe2+ iron to form heme. Thus, copper deficiency reduces iron utilization by the mitochondria.
Why You Need Copper
Cardiovascular Disease – Copper deficiencies are seen in animals and humans with a wide range of heart disease outcomes, such as aortic rupture, fissures, cardiac enlargement, abnormal lipid metabolism, and ischemic heart disease. Copper deficiency is one of the few nutrient shortages that elevates cholesterol, blood pressure and homocysteine, adversely affects the arteries, impairs glucose tolerance, and promotes thrombosis. People with hypertension tend to have lower copper than those with normal blood pressure. In men, copper deficiency can cause irregular heartbeat.
Inflammatory Conditions – Many times elevated copper levels in the blood indicate inflammation, which itself is a cardiovascular disease risk factor. That’s why it is more accurate to look at enzyme activity or leukocyte copper levels than just blood copper levels.
Oxidative Stress – Copper inhibits oxidative stress and reactive oxygen species (ROS) via superoxide dismutase (SOD). The superoxide anion causes lipid peroxidation in addition to other problems, which can be neutralized primarily by copper-zinc superoxide dismutase (Cu,Zn SOD). Excessive oxidative stress is a key contributor to the pathogenesis of many diseases, primarily by raising inflammation.
Mitochondrial Function – Copper is a co-factor for respiratory complex IV in the mitochondria. A copper deficiency impairs immature red blood cell bioenergetics, which alters the metabolic pathways, turning off the mitochondria and switching over to more glycolysis (sugar burning). As a result, there is a reduction in energy production and excessive lactate production from glycolysis. Lactic acidosis or the buildup of lactic acid is associated with several cancers and inflammatory diseases. During oxygen shortage, which can accompany copper deficiency anemia, lactic acid can begin to accumulate.
Metabolic Syndrome – Copper deficiency has a detrimental effect on your metabolic health. Metabolic syndrome is a condition in which at least three of the following five criteria are present: high blood pressure, central obesity, high fasting triglycerides, high blood sugar and low serum HDL cholesterol. It is associated with cardiovascular disease and type 2 diabetes by causing inflammation. Metabolic syndrome doubles the risk of cardiovascular disease and increases all-cause mortality by 1.5-fold.
Kidney Health – Ceruloplasmin protects the kidneys as it acts as an antioxidant. The development of kidney dysfunction may occur in part due to a lack of copper, with kidney function improving upon copper supplementation. One group of authors concluded, “Copper deficiency can worsen nephrotic syndrome by decreasing the ceruloplasmin activity, which protects the glomeruli.” Additionally, kidney damage can increase the loss of copper in the urine. Patients with kidney damage tend to develop moderate copper deficiency. An easy indication of copper loss is spillage of albumin (protein) into the urine. This is because most copper in the blood is bound to the protein ceruloplasmin. The urinary losses of copper can be 8-32 times higher than normal in those with kidney damage.
Reproductive Health – Both copper transport as well as ceruloplasmin’s ferroxidase activity are exhibited in the testicles, primarily helping in the processes of spermatogenesis. Thus, copper is important for the male reproductive system. Additionally, the copper-dependent cytochrome c oxidase has an integral role in energy metabolism in the testes, enabling the production of ATP needed for sperm motility. Essentially, if a man is deficient in copper, his sperm does not swim as well. On top of that, a copper deficiency can also make the sperm more vulnerable to oxidative stress and DNA damage. When couples are having difficulties getting pregnant, it might be due to inadequate copper status. Foods high in copper like liver, oysters and clams are also considered fertility-boosting foods.
Immune System – Micronutrients, including copper and iron, are needed for modulating immune function and reducing the severity of infections. A copper deficiency has been shown to cause thymus (one of the main organs that produces immune cells) atrophy, reducing the amount of circulating neutrophils and decreasing the ability of macrophages to kill pathogens. Repletion of copper appears to be able to reverse this. By causing iron deficiency, a copper deficiency can also reduce phagocytosis and lymphocyte proliferation. In fact, infants with copper deficiency have been documented to have recovery in their immune health with copper supplementation. Copper is also essential for the production of interleukin-2, which is a cytokine that helps us fight infections.
Here are the enzymes and functions that are dependent on copper and the consequences of a copper deficit:
Copper-Dependent Enzymes/Proteins: Function: Consequences of Deficit
In humans, copper levels are high in the organs during childhood and start declining throughout later stages of life. During old age, hepatic and aortic copper decreases substantially, which is inversely associated with the concentration of calcium in the arteries.
The functions of copper are listed below:
In conclusion, copper has a central role in energy production and energetic processes. Copper deficiencies can cause many health problems, including anemia, chronic oxidative stress, cardiovascular disease and metabolic syndrome. Now that we’ve covered the extensive history and origins of copper in medical practice, we shall turn to covering how to obtain the right amounts of copper in balance with the other minerals for optimal health.
“Iron deficiency anemia” is not necessarily a disease of iron deficiency. In fact, it can be caused by copper deficiency. Unfortunately, most doctors and medical practitioners are not aware of this as they try to fix symptoms of anemia with supplemental iron. Making matters worse, high iron intakes can block copper absorption, increasing copper requirements and causing copper deficiency in the process. On the flip side, chelation or binding of iron, has been shown to reverse copper deficiency.
There are many other factors that can cause an iron deficiency anemia, such as inflammation (as found in those with obesity or metabolic syndrome), bleeding, malnutrition, copper deficiency, infections, and lack of other nutrients but a lack of dietary iron tends to make all the headlines. On the flip side, insulin resistance can lead to liver iron overload and frequent blood donations can improve hyperinsulinemia.
The Copper-Iron Deficiency Anemia Epidemic
Increasing iron intake does not seem to eliminate iron deficiency anemia. It might work in some people who are severely deficient in iron, such as during pregnancy or infections, but most of the population is already getting plenty of iron. Unfortunately, these warnings are ignored by the grain lobbyists who push further increases in iron enrichment.
“Copper metabolism is altered in inflammation, infection, and cancer. In contrast to iron levels, which decline in serum during infection and inflammation, copper and ceruloplasmin levels rise.” This is why so many doctors fear “copper overload” and don’t take copper deficiency seriously, i.e., because they rarely see low copper levels on a blood draw.
The estimated safe and adequate copper intake in 1989 was set at 1.5-3.0 mg/day (Food and Nutrition Board also suggests this), which exceeded the previous required estimates of 1.2-2.0 mg/day. Why then is the current RDA set so low at just 0.9 mg at minimum? Even the World Health Organization recommends at least 1.3 mg a day. The reason is likely because if we kept the RDA at 1.2-3.0 mg/day, virtually no one in the United States would be meeting that requirement, and that would then require another mineral fortification.
Balance studies suggest that a daily intake of 0.8 mg leads to net copper loss whereas net gains are seen with 2.4 mg/day.
Men and women fed diets low in copper, around 1 mg/day, develop increases in blood pressure, cholesterol, glucose intolerance, and abnormal electrocardiograms.
Data from 10 dietary surveys shows that only 14% of typical diets in Belgium, Canada, the UK and the U.S. exceed 2 mg of copper a day. Only 3.2% of diets exceed 3 mg/day, 61% get less than 1.5 mg/day, and 1/3rd obtain less than 1 mg/day.
There is around 50-120 mg of copper in the human body. Copper is mainly stored in the liver, but it is also found in the heart, brain, kidneys and muscle. About 1/3rd of total body copper is in the liver. However, copper is present in almost every tissue of the body. The daily copper turnover is estimated to be around 1 mg through fecal loss. Thus, total body copper stores can be depleted within a few months if daily intake does not come close to covering this turnover. More importantly, this does not take into account copper losses in sweat, which can be quite significant. Overweight people appear to require more copper than normal-weight people, generally needing more than 1.23 mg/day. In obese children, copper levels in the blood are lower, but slightly higher in the plasma, than in controls. They also have lower iron levels, which may be due to inflammation caused by excess body fat. It is inflammation that raises copper and lowers iron in the blood.
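To see why stores can be depleted "within a few months", here is a rough order-of-magnitude sketch; it deliberately ignores sweat losses and homeostatic adaptation, both of which the text notes matter:

```python
def days_to_deplete_copper(body_stores_mg: float, daily_intake_mg: float,
                           daily_loss_mg: float = 1.0) -> float:
    """Rough depletion timeline given ~1 mg/day turnover (per the text).
    An order-of-magnitude illustration, not a physiological model."""
    net_loss = daily_loss_mg - daily_intake_mg
    if net_loss <= 0:
        return float("inf")  # intake covers turnover; no depletion
    return body_stores_mg / net_loss

# With ~100 mg of stores and an intake of 0.5 mg/day:
print(days_to_deplete_copper(100, 0.5))  # ~200 days, i.e. a few months
```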
Furthermore, certain medical conditions can also increase copper excretion. For example, patients with nephrotic syndrome have an 8-32-fold greater urinary loss of copper than normal patients. Fructose consumption also increases the need for copper and promotes copper deficiency, which increases oxidative stress, causes fatty liver and damages the intestine.
Things that are caused by copper deficiency and/or raise demand for or reduce copper levels:
The upper safe threshold is deemed to be 10 mg/day for adults. Taking 10 mg/day of copper for 12 weeks has not been seen to cause liver damage, but one case report found that a long-term intake of 60 mg/day resulted in acute liver failure. Intakes of 7-7.8 mg/day for 4-5 months in healthy male volunteers between the ages of 27-48 may cause some negative side effects, such as decreased antioxidant defense, suppressed immunity and increased excretion of copper. However, the body’s own homeostatic regulation of copper absorption automatically minimizes the retention of excess copper.
When copper intake was 1.6 mg/day, copper retention was 0.06 mg/day, but retention increased to 0.67 mg/day at intakes of 7.8 mg/day. So, as copper intake increases, fractional absorption slows to prevent overload. Very high and very low copper intakes exceed these regulatory mechanisms, causing copper retention or depletion, respectively. A low copper consumption for 6 weeks does not appear to significantly change copper absorption compared to a high copper consumption of 6 mg/day.
Usually, copper deficiency occurs during malnutrition, in patients receiving parenteral nutrition with insufficient amounts of copper, and when an individual ingests excess quantities of zinc on top of a low copper intake (zinc is a copper absorption antagonist, but zinc may not be an issue if the intake is 50 mg per day or less and copper intake is at 1 mg or above). Copper deficiency has also been seen in infants fed only cow’s milk, lightweight preterm infants and the elderly. Pregnant and lactating women and people with malabsorption diseases are also at risk of copper deficiency. Bile eliminates most copper, whereas the kidneys typically only eliminate ionic or loosely bound copper.
Copper Deficiency Testing
The main symptoms of copper deficiency include anemia, secondary iron deficiency, neutropenia, and bone abnormalities. Additional signs include hypopigmentation, growth impairment, immune system suppression, abnormal electrocardiograms and dysregulated glucose and cholesterol metabolism.
Symptoms of copper deficiency include:
Approximately 60-70% of symptomatic copper deficiency cases occur with anemia, and more than 50% of these anemias are normocytic.
C-reactive protein has been associated with an elevation of copper. Usual plasma copper tests are insensitive and not entirely accurate. Low copper levels in the liver would likely be a better determinant but a more practical strategy would be to measure leukocyte copper levels, which are more reflective of total body copper status and are depleted in those with significant atherosclerosis or coronary artery disease.
Unfortunately, measuring copper levels in the liver requires a biopsy and hence it is not very practical. Hair mineral analysis is a potential option to add to other measurements for an overall assessment of trace mineral status. Scalp hair remains isolated from other metabolic activities and is a unique indicator of elements concentrated in an individual at a given time point. Hair analysis has been used for successful assessment of heavy metal toxicity. Red blood cell copper may also be a better indicator of copper status versus serum levels and may reveal an undiagnosed pandemic of copper deficiency.
Best to Worst Ways to Diagnose Copper Deficiency:
Copper and Iron in Food
The RDA for iron is 8 mg for adult men and 18 mg for adult women. During pregnancy, the RDA increases to 27 mg. Normal ranges for serum ferritin are 20-250 ng/ml for adult males and 10-120 ng/ml for women. A serum ferritin below 30 suggests iron deficiency. Numbers above the reference range may indicate oxidative stress or just iron overload. People, especially children, who take supplements may overdose on iron. You can get iron poisoning from doses as low as 10-20 mg/kg.
Our bodies have a hard time eliminating iron. Losses only occur, at least in a significant amount, during bleeding, blood donation, menstruation in women or through chelation. That is why females are more susceptible to anemia and iron deficiency, whereas men are more prone to iron overload. Common iron chelators include coffee, spirulina, chlorella, green tea, beans, whole grains and vegetables. That is why, although things like spinach and beans contain high amounts of iron, it is harder to reach iron overload eating them because a large proportion of the iron gets chelated and/or not absorbed.
Iron and zinc are copper antagonists. Foods are not fortified with copper, which contributes to this epidemic of copper deficiency. Unsaturated fats (omega-6 seed oils) also lack copper because transition metals are removed by chelation to increase shelf-life. At the same time, foods high in copper, such as liver, oysters, lobster and crab, are eaten less frequently.
Vermont spring water appears to have the highest copper levels at 1.4 mg of copper per liter of water, whereas soft municipal tap water ranges from around 0.17-0.73 mg/liter. In other words, natural spring waters may be 5 to 10 times higher in copper than tap water.
Muscle meat is extremely low in copper but high in zinc and iron. Heme iron, as found in meat, is particularly well absorbed compared to non-heme iron. The highly bioavailable iron in cooked meat may also reduce the absorption of copper. Excess cystine and cysteine consumption enhances copper deficiency in rats. Foods high in cystine/cysteine are animal proteins like poultry, beef, pork, eggs, fish, etc., and in smaller amounts in lentils and oatmeal. These foods are also highest in iron. Zinc absorption is also greater when eating animal protein compared to plant foods, which inhibits copper absorption. This might be why vegetarians or vegans may need more zinc than omnivores but less copper.
Zinc/Copper Ratio
Zinc is a copper antagonist, which is why ingesting large doses of zinc can lead to copper deficiency by inhibiting its absorption and increasing its excretion. A high concentration of zinc in the small intestine triggers the expression of metallothioneins, which bind to copper.
The golden rule when supplementing with zinc is that, at a minimum, 1 mg of copper should be taken for every 40 mg of zinc.
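The 40:1 rule translates into a one-line calculation; a minimal sketch (the function name is ours):

```python
def min_copper_for_zinc_mg(zinc_mg: float) -> float:
    """Minimum copper to pair with supplemental zinc, per the 40:1 rule above."""
    return zinc_mg / 40.0

for zn in (20, 40, 80):
    print(f"{zn} mg zinc -> at least {min_copper_for_zinc_mg(zn):.1f} mg copper")
```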
High copper and phytate intake may be a protective factor against the Standard American Diet high in zinc and iron (phytate can also bind free iron). Phytic acid decreases the ratio of zinc to copper that gets absorbed in the intestinal tract. This is not to say that you need to eat fiber or phytic acid, but it’s important to balance the zinc/copper ratio, and that means sourcing more than just muscle meat and eggs.
Organ meats like liver have a much more appropriate iron-zinc to copper ratio than muscle meat. Per ounce of beef liver, you get 4.1 mg of copper or 4.5 times the RDA for copper of 0.9 mg. Thus, just a little bit of liver goes a long way. However, this is also why it is not recommended to consume more than 1 oz. of beef liver per day as total intakes of copper at or over 10 mg for several months could cause unwanted side-effects. Although liver is high in cholesterol, it may be an antidote to the hypercholesterolemic effects of meat, which may actually be due to the low copper content of muscle meat. Liver, heart, kidney, suet, etc. have the fat-soluble vitamins like K2, A, D as well as CoQ10. Per calorie, liver is one of the most nutrient dense foods in the world. It is also relatively low in calories compared to muscle meat (~130-150 vs ~150-200 calories/3.5 oz).
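The liver arithmetic above checks out as follows, a sketch using the text's figures:

```python
CU_PER_OZ_LIVER_MG = 4.1  # copper per ounce of beef liver, per the text
CU_RDA_MG = 0.9           # adult copper RDA

def rda_multiples_from_liver(ounces: float) -> float:
    """How many multiples of the copper RDA a given serving of liver provides."""
    return ounces * CU_PER_OZ_LIVER_MG / CU_RDA_MG

print(rda_multiples_from_liver(1))    # ~4.6x the RDA from a single ounce
print(rda_multiples_from_liver(0.5))  # ~2.3x from half an ounce
```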
Exercise stimulates the growth of muscle and bone, which contain much more zinc than copper (zinc/copper ratios of 56 and 120, respectively), so exercise tends to protect against a diet with a high zinc-to-copper ratio (i.e., more zinc gets directed towards muscle and bone growth). This may be why weightlifters are somewhat protected from their high intake of meat, as the zinc is needed for muscle and bone growth. This also suggests that weightlifters need more zinc in order to develop larger muscles and stronger bones. Thus, zinc requirements go up when building muscle, and this is likely why many weightlifters crave muscle meat (for its high zinc content).
The zinc to copper ratio of cow’s milk is 38, whereas that of human milk is between 4 and 6. This might explain the health benefits seen in populations that breastfeed for several years versus the United States, where children are typically switched over to cow’s milk at 9-12 months of age. A zinc/copper ratio of 38 leads to a copper deficit in infants and is clearly not healthy for infants or adults. Additionally, a higher zinc to copper ratio (as found in cow’s milk) positively correlates with mortality due to coronary heart disease in 47 cities in the United States.
Thus, an optimal intake of zinc in an adult can range anywhere from 20-80 mg/day, depending on the person and the situation. The optimal intake of copper, however, seems to sit at around 3 mg/day, providing an optimal Zn/Cu ratio of anywhere from 4-20:1.
Foods with a high Zinc:Copper Ratio (these foods must be balanced with appropriate amounts of copper from other food sources):
Foods with a low Zinc:Copper Ratio (these will need to be paired with foods higher in zinc):
The takeaway is that you shouldn’t make these high zinc:copper ratio foods the only ones that you eat in your diet. Additionally, you should not overconsume high copper containing foods either, like eating 3 oz. of liver every day. As a rule of thumb, eat around 0.5 to 1 oz. of liver/day for copper and at least 12 oz. of red meat for zinc, iron, B12 and protein.
Copper and Molybdenum
Another copper inhibitor is molybdenum (Mo). It is a mineral found in foods like pork, lamb, beef, beans, eggs and cereal grains in small amounts. Molybdenum and copper have been found to have antagonistic roles with each other. Indeed, a Mo/Cu ratio of 2.5/1 in animals can lead to copper deficiency and altered immune responses during infection.
Patients with type-2 diabetes have been shown to have higher serum and urine copper levels due to molybdenum removing it from tissues and eliminating it through urine. Those with severe nephropathy have a higher urine level of Cu/Mo. Thus, excess molybdenum, when combined with sulfur, may promote the excretion of copper through urine or bile via binding with protein-bound and free copper.
In animals, molybdenum overload has been shown to disrupt carbohydrate metabolism and increase oxidative stress by reducing copper-dependent enzymes like ceruloplasmin, superoxide dismutase, and myocardial cytochrome c oxidase. The glucose intolerance is a secondary symptom of copper deficiency.
Copper Absorption
Copper is absorbed in the gastrointestinal tract and is transported to the liver.
Liver damage, cirrhosis or fibrosis increases the demand for copper and decreases its absorption. Fibrate medications (such as bezafibrate, fenofibrate, gemfibrozil) may be able to restore low copper levels in NAFLD and increase concentrations of hepatic Cu-Zn SOD. Malabsorption diseases like celiac disease, short bowel syndrome and IBS also reduce copper absorption. Copper deficiency has been seen in patients with celiac disease, and it can cause anemia and thrombocytopenia. Gastric bypass surgery or colon removal can cause copper deficiency.
The oral bioavailability of copper can range from 12-75%, depending on the amount of copper in the diet; on the typical Western diet, average bioavailability is 30-40%. That number can be lower in the presence of excess zinc, molybdenum, iron or fructose/added sugars. The refining of grains and cereals can reduce their copper content by up to 45%.
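A small sketch of what those bioavailability figures mean for absorbed copper; the values come from the text, and the 3 mg/day intake is the optimal intake suggested earlier in the chapter:

```python
def absorbed_copper_mg(intake_mg: float, bioavailability: float) -> float:
    """Absorbed copper given fractional bioavailability (12-75% per the text)."""
    return intake_mg * bioavailability

# A 3 mg/day intake at the typical Western-diet range of 30-40%:
print(absorbed_copper_mg(3, 0.30), "-", absorbed_copper_mg(3, 0.40), "mg/day")
```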
Indiscriminate use of multivitamins that often contain zinc and iron can be problematic for people low in copper.
Copper absorption occurs in the stomach and the small intestine, but the most efficient absorption takes place in the ileum and duodenum. For optimal absorption, copper needs stomach acid to be liberated from food. During absorption, excess copper is transiently stored in the enterocytes bound to metallothionein. Once inside the enterocyte, cuprous ions bind to different copper chaperones to be distributed to various organelles and cell compartments. Thus, damage to the microvilli of enterocytes, which contain the micronutrient absorption transporters, will decrease copper absorption. It has been shown that the overconsumption of refined sugar and fructose damages the enterocytes, which will likely reduce copper absorption.
Approximately 0.5-2.5 mg of copper is excreted via feces every day. Biliary copper cannot be reabsorbed, as it is bound to bile salts and comes out in the feces. Thus, the body can get rid of excess copper through bile excretion, which is regulated by the liver (liver cells excrete excess copper into bile). Low blood copper levels, or low ferritin levels, lead to copper being released into the blood circulation. To prevent copper overload, surplus copper can also be excreted through the natural shedding of intestinal epithelial cells.
Copper absorption in the stomach is stimulated by an acidic pH and is inhibited by zinc and cadmium. Stomach acid frees up copper bound to food and facilitates peptic digestion, which releases copper from natural organic complexes. That is why proton pump inhibitors or other acid suppressing over the counter therapies or medications can reduce copper absorption as it reduces the release of copper from natural organic complexes in food.
Cadmium is a common toxic heavy metal that has accumulated in the environment due to pollution. Ironically, zinc helps to reduce the absorption of cadmium.
Vitamin C can enhance, or inhibit, copper absorption. If taken prior to (around 75 minutes before) or with copper, vitamin C reduces copper absorption by blocking the binding of copper to metallothionein. However, if vitamin C is taken after copper ingestion (75 minutes after), it increases ceruloplasmin and lysyl oxidase activity. Vitamin C also seems to help facilitate the transfer of copper ions into cells by reacting with the copper-carrying protein ceruloplasmin, reducing cupric copper (Cu2+) to cuprous copper (Cu+), making the bound copper atoms more freely available and facilitating their cross-membrane transport. Moreover, histidine may aid the transport of copper from albumin into certain cells. Additionally, glutathione appears to help facilitate the transfer of copper to metallothionein and superoxide dismutase. Thus, cysteine and glycine, which help to increase glutathione levels, may also help with copper transport in the body.
In large doses, vitamin C may decrease copper status by increasing iron and zinc absorption. However, there may not be enough vitamin C in whole foods like fruit and vegetables to inhibit copper absorption or deplete it.
Copper homeostasis is maintained in the liver. Importantly, the thyroid gland regulates more than 5% of all the genes expressed in the liver, including copper metabolism.
Only certain genetic disorders, such as Wilson’s disease (WD) or Menke’s disease (MD) are known to disturb copper metabolism in a life-threatening way, by affecting the copper transporting proteins ATP7A and ATP7B:
One of the biggest chelators of copper is glyphosate. Deficiencies in trace minerals like iron, molybdenum, copper and others are associated with celiac disease, which may be attributed in part to glyphosate’s ability to chelate these metals. Glyphosate is one of the most common pesticides in the U.S., used on almost all types of fruit and vegetables. Not only does it reduce the mineral content of the vegetation, but it also disrupts the microbiome, potentially promoting malabsorption diseases.
It is not difficult to obtain the optimal amount of copper from diet, which we deem to be around 3 mg/day. If you are already deficient in copper, then intakes of above 2.6 mg/day are likely needed for several weeks if not months. Dietary copper intake should not exceed 10 mg/day. However, excess copper will typically be excreted or its absorption will be reduced. The highest copper-containing foods are beef liver, oysters, shellfish and beans. They also have a lower zinc-copper ratio (except oysters and mollusks), which improves copper absorption. Generally, it is recommended to eat liver at least once a week (3 oz. at a time), or if eaten several times per week, 0.5 to 1 oz. at a time. Eating plant fibers, phytate and phytonutrients can help to chelate iron and maintain a better copper status, but they can also reduce zinc absorption. Thus, it is important to balance plant foods with animal foods. To not impose additional demands for copper on the body, avoid added sugars and fructose-sweetened beverages.
Zinc is another essential mineral needed for many processes in the body, including postnatal development, protein synthesis, wound healing and immune system functioning. You also need zinc for a proper sense of taste and smell. Zinc acts as a catalyst for over 300 enzymatic reactions and is required by over 1,000 transcription factors; DNA/RNA transcription and replication factors are zinc-dependent and will not work in the absence of zinc.
DNA synthesis is impaired by dietary zinc deficiency. Thus, zinc affects gene expression. Apoptosis, or programmed cell death, is regulated by zinc. However, in some instances, zinc can protect against apoptosis during gamma irradiation, hyperthermia and cold shock. In other words, zinc can make our cells more resilient to certain stressors. This is a good thing if you are fighting off an infection or are dealing with a lot of oxidative stress.
Deficiencies in zinc cause growth retardation, hormonal imbalances, impotence, delayed wound healing, frailty, hair loss, diarrhea, skin lesions and increased susceptibility to infections. Zinc deficiency during pregnancy is associated with an increased risk of premature delivery and abortion. Since the early 1960s, it has been noted that zinc deficiency can cause anemia, dwarfism, growth retardation and hypogonadism independent of iron deficiency.
Zinc deficiency is not widespread in Western countries, but it does affect many people in the developing world, and even in the West most people are not consuming optimal amounts of zinc. Stunted growth in childhood is typically caused by zinc deficiency, especially in low-income countries where the intake of animal foods tends to be lower.
Zinc and Health
Zinc helps to catalyze the calcification of bone. Zinc deficiency causes skeletal abnormalities by reducing parameters of calcium metabolism in the bone. Zinc is also needed for collagen synthesis and zinc deficiency can increase damage to the body’s soft tissue. Supplementation with arginine, glutamine, vitamin C and zinc has been shown to increase collagen synthesis during the first 2 days after inguinal hernia repair. Zinc also helps those with bed sores.
Zinc speeds up wound healing; wounds do not heal as fast with zinc deficiency. Oral zinc sulfate improves foot and leg ulcers, and zinc deficiency exacerbates ulcers by increasing oxidative stress in the skin. Zinc sulfide nanoparticles improve skin regeneration, and healing of the aorta requires zinc. Think of zinc as a healing mineral: our adamantium and the source of our healing power.
Zinc may help to protect against atherosclerosis.
Zinc is important for brain development, learning and plasticity, and zinc deficiencies are implicated in brain disorders. In malnourished schoolboys, zinc supplementation has been shown to increase cortical thickness significantly more than supplementation with all the essential amino acids, vitamins and minerals without zinc.
Zinc and other trace minerals like copper and selenium are needed for the production of thyroid hormones. On the flip side, you need adequate amounts of thyroid hormones to absorb zinc: hypothyroidism reduces zinc absorption in the intestine, which is why it can cause an acquired zinc deficiency. Zinc is required to activate the T3 receptor, hence zinc deficiency impairs T3 receptor action. Zinc deficiency is associated with hypothyroidism and severe alopecia, i.e., hair loss. Supplementing with zinc in those who are deficient improves thyroid function. Thus, it’s a vicious cycle: zinc deficiency impairs thyroid hormone function, which drives further zinc deficiency.
Zinc is important for glucose metabolism and insulin production. Zinc helps with the uptake of glucose into the cell and with insulin synthesis. Zinc-deficient rats and humans have impaired glucose tolerance. A 2012 systematic review and meta-analysis that included 22 studies concluded that zinc supplementation has a beneficial effect on glycemic control and lipids in patients with type-2 diabetes. In type-1 diabetes, a combination of zinc and vitamin A (10 mg zinc a day and 25,000 IU vitamin A every other day) can improve apolipoprotein-B/A1 ratios (essentially the (LDL+VLDL)/HDL ratio).
A zinc deficiency may lead to the following:
Here are the zinc-dependent enzymes, functions and consequences that may occur with a deficit
Zinc-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Zinc and the Immune System
Even a modest zinc deficiency can impair the functions of macrophages, neutrophils, natural killer cells and the complement system, reducing their cytotoxicity. Zinc is needed for the production of T cells. People with zinc deficiency show reduced natural killer cell cytotoxicity, which can be fixed with zinc supplementation.
Zinc is an essential cofactor for thymulin, a thymic hormone. Without zinc, thymulin cannot become active, and this impairs immunity. With the help of zinc, thymulin helps to differentiate immature T cells, preserving the balance between killer T cells and helper T cells. Autoimmune diseases with a T cell imbalance, such as rheumatoid arthritis, are associated with a zinc deficiency. Thymulin also modulates the release of cytokines.
Zinc is needed for the interaction between the natural killer cell inhibitory receptor p58 and major histocompatibility complex (MHC) class I molecules. A deficiency in zinc impairs natural killer cell activity, phagocytosis of macrophages, cytotoxicity of neutrophils and generation of the oxidative burst needed to kill pathogens. Zinc ions have been shown to stimulate the production of lymphocytes, which attack invading pathogens. The addition of zinc releases interleukin-1, interleukin-6, tumor necrosis factor-α and interferon gamma in human peripheral blood mononuclear cells, which help us fight infections.
Low zinc status has been associated with an increased risk of pneumonia, respiratory tract infections, diarrhea and other infectious diseases.
Zinc administration helps patients with hepatitis C virus-induced chronic liver disease. It is also noted that zinc deficiency is related to different autoimmune diseases such as type-1 diabetes, rheumatoid arthritis, multiple sclerosis, systemic lupus, autoimmune hepatitis, celiac disease and Hashimoto’s thyroiditis. A significant number of COVID-19 patients are zinc deficient, which is associated with worse COVID-19 outcomes. Zinc has been shown to inhibit the replication of viruses like SARS and arterivirus.
Zinc enables the calcium-binding protein calprotectin, released during neutrophil degranulation, to inhibit the reproduction of bacteria. Prophylactic zinc intake before LPS treatment in a porcine sepsis model has been shown to reduce the inflammatory response. Thus, zinc taken before an endotoxin challenge may be protective.
Extremely large doses of zinc (up to 400 mg/day) have been shown to impair immune function. In vitro, doses 7-8 times the physiological level inhibit T cell function and reduce interferon-alpha production.
Typical supplemental ratios for the general population are 15-20 mg of zinc per 1 mg of copper (or 30-40 mg per 2 mg of copper); in elderly populations, 40 mg of zinc with 1 mg of copper is usually used once to twice daily. In the AREDS study, 40 mg/1 mg of zinc/copper twice daily significantly lowered mortality by 27%, which was seemingly driven by a reduction in deaths from respiratory causes.
Zinc Absorption and Bioavailability
The human body contains about 2-4 grams of zinc, of which about 0.1% (1/1000 of the total pool) is depleted and renewed every day. Zinc’s biological half-life is 280 days. Up to 57% of the body’s zinc is located in muscle and 29% in bone. There is no dedicated storage site for zinc in the body, which is why it needs to be obtained from dietary sources on a regular basis.
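These turnover figures can be related with simple first-order (exponential) kinetics. A minimal sketch in Python; note that the quoted 0.1%/day turnover and the 280-day half-life imply somewhat different rates, presumably because different body pools of zinc turn over at different speeds. The 3-gram pool below is a midpoint assumption, not a figure from the book:

```python
import math

# Total body zinc pool: the text cites about 2-4 g; take the midpoint
pool_mg = 3000

# Daily fractional turnover: ~0.1% (1/1000) of the pool per day (from the text)
daily_turnover = 0.001

# Zinc that must be replaced from the diet each day
print(f"Daily replacement: ~{pool_mg * daily_turnover:.0f} mg")  # ~3 mg

# Under first-order kinetics, half-life = ln(2) / fractional rate
print(f"Half-life at 0.1%/day: ~{math.log(2) / daily_turnover:.0f} days")  # ~693 days

# Conversely, the cited 280-day half-life implies ~0.25%/day turnover
print(f"Turnover at 280-day half-life: {math.log(2) / 280:.4f} per day")  # ~0.0025
```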
Red meat avoidance is thought to contribute to iron and zinc deficiency, especially in young premenopausal women. Vegetarian diets, as well as a preference for poultry, fish and dairy over red meat, have been shown to increase the risk of zinc deficiency. The zinc requirement for vegetarians is at least 50% higher than for omnivores because zinc is not particularly bioavailable in plant-based foods. Thus, adult vegetarian males might need up to 16-20 mg/day and vegetarian females 12-14 mg/day of zinc.
Children in certain U.S. middle- and upper-income families have been seen to have impaired taste acuity, low hair concentrations of zinc and poor growth. This is attributed to “picky eating”, but technically, it is due to not eating zinc-rich animal foods like red meat. When these children are given zinc at 2 mg/kg of body weight, their taste acuity, appetite and growth rate improve.
Phytate, phosphate and other phytonutrients are able to chelate zinc as well as other minerals, affecting zinc bioavailability. In Korean adults, lower zinc bioavailability because of phytates is related to a higher risk of atherosclerosis. Foods high in phytates, like grains, bind zinc, iron and calcium and reduce their absorption, which is why diets high in phytates can cause zinc and iron deficiency.
The demonization of all grains, especially in those who consume large amounts of muscle meat but not organ meats, may be a leading contributor to copper deficiency. On the flip side, a high intake of grains and a low intake of animal protein can lead to dwarfism, as observed in Middle Eastern countries such as Turkey, Morocco, Tunisia, Iran and Egypt. The high phytate content of bread in some regions like Iran, where the diet is primarily made up of grains, contributes to zinc deficiency.
It has been observed that rising CO2 levels in the atmosphere can reduce the amount of zinc and iron in common crops, such as rice, wheat, peas and soybeans, with projections that a 10% drop in these minerals will occur by the end of the century. This puts developing countries, which are particularly dependent on these crops, in greater danger of zinc/iron deficiencies.
Supplementing 400 mcg of folic acid every other day for 4 weeks has been shown to reduce zinc excretion by 50%. However, polyphenols, such as resveratrol and quercetin, may enhance the uptake of zinc into certain cells via metallothionein.
Another inhibitor of zinc absorption is calcium, which enhances phytate’s ability to reduce zinc absorption. The ratios between phytate-zinc and phytate-zinc-calcium can predict the risk of zinc deficiency.
A high intake of other minerals like copper, magnesium, calcium, iron and nickel also reduces the absorption of zinc.
Ferrous iron (Fe2+) supplements can reduce zinc absorption and lower plasma zinc levels, which is why experts recommend taking iron supplements away from food. However, when zinc and iron are consumed together in a meal, this effect does not seem to occur.
Certain antibiotics can inhibit the absorption of zinc. Taking these either 2 hours before, or 4-6 hours after zinc intake, can minimize this effect. Zinc reduces the absorption of penicillamine, which is used to treat rheumatoid arthritis. Diuretics like chlorthalidone and hydrochlorothiazide increase zinc excretion by as much as 60%. ACE inhibitors, like lisinopril and other blood pressure lowering medications in this class that end in “pril” deplete zinc. Omeprazole, a common antacid/heartburn medication, has been found to lower serum zinc levels. This is because proton pump inhibitors reduce zinc absorption from food due to a reduction in stomach acid.
Inflammation and certain disease states also promote zinc excretion. Among supplemental forms, picolinate complexes are more easily absorbed by the body; zinc picolinate has been seen to improve zinc status in children with a genetic mutation that prevents them from absorbing zinc from cow’s milk.
Factors that increase the demand for zinc:
Supplemental doses of 80-150 mg/day of zinc can be immunosuppressive if not taken with copper. The lowest median effective dose of supplementary zinc is deemed to be 24 mg/day. Using zinc-containing nasal gels or sprays has been associated with a loss in the sense of smell.
Men taking over 100 mg/day of zinc experience a 2.9-fold increase in the risk for metastatic prostate cancer, which is likely due to copper deficiency.
Best to worst ways to measure zinc status and zinc deficiency:
Getting the Right Amount of Zinc from Diet
Since zinc is a copper antagonist, too much dietary zinc can lead to copper deficiency. Accordingly, too much zinc in relation to copper is associated with hypercholesterolemia and increased coronary heart disease mortality. At the same time, excess iron supplementation inhibits zinc and copper absorption, which can weaken the immune system and cause copper-deficient anemia. Thus, it is necessary to balance these three minerals – iron, zinc and copper.
Generally, most people should avoid iron supplements and focus on eating iron-, zinc- and copper-rich foods. Here are some guidelines for meeting those targets:
Zinc supplementation is not needed when you are getting enough of it from diet. Zinc supplementation is mostly beneficial for fixing deficiencies quickly or trying to help with an acute infection. Chronic intakes of zinc over 80-100 mg/day can promote copper depletion, especially if dietary copper intake is not concomitantly increased. The idea is to try and obtain anywhere from 20-80 mg of zinc, 8-18 mg of iron and 3 mg of copper each day. This will of course depend on the person.
Thyroid cells absorb iodine from food and combine it with the amino acid tyrosine, which is used to create thyroid hormones such as thyroxine (T4) and triiodothyronine (T3). Most of the effects of thyroid hormones are performed by the active thyroid hormone T3. There are also T0, T1 and T2, which act as hormone precursors and byproducts. Thyroid hormones, primarily T3, are then released into the bloodstream to affect your body temperature, growth, daily caloric needs, heart rate and metabolic rate.
The production of thyroid hormones requires certain minerals, in particular, iodine, selenium, sodium, zinc and magnesium. Only 20-50% of all iodine in the body is found in thyroid hormones or the thyroid gland, whereas the other 50-80% is concentrated in non-thyroid tissues.
Symptoms of hypothyroidism include chronic fatigue, weight loss plateaus or weight gain, low testosterone and sex drive, cognitive dysfunction, intolerance to cold, joint pain, constipation, hair loss, brittle skin, water retention, depression and high cholesterol levels.
Cellular uptake of the thyroid hormones T4 and T3 is dependent on energy (ATP), magnesium and sodium. A key transporter for thyroid hormone synthesis is the sodium-iodide symporter (NIS), which transports two sodium cations (Na+) and one iodide anion (I−) into the thyroid gland. Albumin also appears to have an essential role in thyroid hormone transportation.
Iodine, Hypothyroidism, and Goiter
Thyroid hormone levels are regulated by thyroid-stimulating hormone (TSH), also known as thyrotropin. TSH promotes the uptake of iodine by the thyroid and stimulates the synthesis and release of T4 and T3. Thyroid hormone production works in a negative feedback loop with TSH, which is why people with hypothyroidism have high TSH levels: their body is trying to make more thyroid hormones.
In the presence of iodine, T4 gets synthesized and is transformed into T3 based on needs. The formation of thyroid hormones is initiated by thyroglobulin (Tg), whose synthesis is stimulated by TSH when thyroid hormones are low. If iodine is overconsumed, the activity of the sodium-iodide symporter (NIS) decreases in order to slow down the synthesis of thyroid hormones.
Iodine is an essential trace mineral needed for thyroid functioning, thyroid hormone synthesis, physical and mental development and metabolism. It can only be obtained from diet or supplements. Low iodine status is one of the biggest risk factors for hypothyroidism. Iodine is the heaviest essential trace element, yet one of the least abundant, and it occurs in several oxidation states, such as iodide (I−) and iodate (IO3−).
Iodide bound to DHA and arachidonic acid in cell membranes protects them from oxidative damage. Indeed, iodide acts as an electron donor in the presence of hydrogen peroxide and peroxidases. Thus, iodine/iodide is important for protecting against lipid peroxidation and oxidative stress. In fact, it is thought that the high levels of iodine in algae played an important antioxidant role when they first started to produce oxygen 3 billion years ago.
Iodine contributes 65% of the molecular weight of T4 and 59% of T3. Only about 10-20% of T3 is secreted by the thyroid gland itself, whereas most of it is produced from T4. Up to 80% of T3 gets created outside of the thyroid gland through deiodination of T4 with the help of type I deiodinase in the liver and kidneys. Skeletal muscle can also initiate the conversion of T4 into T3.
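These percentages follow directly from molecular weights. A quick check using standard atomic and molecular masses (textbook values, not from the book; small rounding differences are expected):

```python
# Iodine's share of thyroid hormone mass, from standard molecular weights
M_IODINE = 126.90   # g/mol
M_T4 = 776.87       # thyroxine (C15H11I4NO4), carries 4 iodine atoms
M_T3 = 650.97       # triiodothyronine (C15H12I3NO4), carries 3 iodine atoms

t4_fraction = 4 * M_IODINE / M_T4
t3_fraction = 3 * M_IODINE / M_T3

print(f"Iodine in T4: {t4_fraction:.0%}")  # ~65%
print(f"Iodine in T3: {t3_fraction:.0%}")  # ~58-59%
```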
An iodine-replete adult contains about 25-50 mg of iodine out of which 20-50% is stored in the thyroid. The iodine gets transported into the thyroid as iodide by the sodium-iodide symporter (NIS). This NIS-mediated delivery of iodide into the follicular cells of the thyroid gland is the first step in the synthesis of thyroid hormones. That is why the concentration of iodine in the thyroid gland is 20-50 times greater than in the plasma. The remaining iodine that is left in the bloodstream will be excreted through urine.
The size of the thyroid gland is dependent on iodine intake, whereby an excessive or deficient intake of iodine can lead to goiter. Iodine-induced goiter was first described in the 19th century and it has also been documented in children. Excessive iodine intake is thought to contribute to goiter in Sudan, Ethiopia, Algeria and China. It can also lead to reversible hypothyroidism.
Hypothyroidism from excess iodine intake may happen because of thyroid autoimmunity, characterized by elevated anti-thyroid antibodies. Excess iodine is considered a risk factor for the development of thyroid autoimmune disease. Autoimmune thyroiditis and hypothyroidism are more common the older you get with antibodies peaking at 45-55 years of age. Urinary iodine levels > 300 μg/L are considered excessive in children and adults and levels > 500 μg/L are considered excessive in pregnant women.
Excess iodine can also cause hyperthyroidism, which happens after the iodine-deficient thyroid gland nodules escape the control of TSH and start overproducing thyroid hormones. Graves’ disease is the most common form of hyperthyroidism. Other autoimmune thyroid disorders include Hashimoto’s thyroiditis, post-partum thyroiditis and autoimmune arthritis. Hashimoto’s thyroiditis is associated with a significantly higher risk of papillary thyroid carcinoma. Thyroiditis refers to inflammation of the thyroid gland. Inflammation also promotes hypothyroidism by blocking T4 conversion to T3. Low thyroid individuals have a higher risk of hip fractures and non-spine fractures.
Here are the iodine-dependent enzymes, functions and consequences that may occur with a deficit in iodine:
Iodine-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Hypothyroidism is about 10 times more prevalent in women than in men because they have higher iodine requirements during lactation and pregnancy.
Here are the risk factors for developing hypothyroidism or hyperthyroidism:
A meta-analysis discovered that consumption of iodine-rich seafood (>300 mcg/day of iodine) decreased the risk of thyroid cancer. Molecular iodine (I2) at 3 mg/day has been proposed to be able to suppress benign and cancerous neoplasia due to its antioxidative effects. Radioactive iodine can increase the risk of thyroid cancer and its uptake is higher in iodine-deficient people. Thus, iodine-deficient people are at a higher risk of developing thyroid cancer.
There’s an interrelationship between thyroid dysfunction and hypogonadism or low testosterone in men. Free testosterone levels are reduced in subjects with hypothyroidism, and thyroid hormone treatment normalizes them. Testosterone has protective effects on thyroid autoimmunity by reducing titers of thyroid peroxidase and thyroglobulin antibodies. There is an association between erectile dysfunction and hypothyroidism. Hypothyroidism increases the risk of gaining weight and developing obesity, which can promote metabolic syndrome and other chronic diseases.
Hypothyroid patients have elevated lipoprotein(a) [Lp(a)], which is a risk factor for cardiovascular disease. Lp(a) attracts oxidized lipids and can cause inflammation. Those with hypothyroidism also have higher phospholipase A2 (Lp-PLA2), a pro-inflammatory enzyme that travels with LDL particles. C-reactive protein (CRP), one of the primary markers of inflammation, has been found to be elevated in patients with subclinical hypothyroidism, which increases cardiovascular disease risk.
The body needs thyroid hormones to use cholesterol and cholesterol is a precursor to testosterone and other steroid hormones like DHEA, pregnenolone and progesterone. Thyroid hormones influence cholesterol synthesis, catabolism, absorption and excretion by regulating cholesterol 7alpha-hydroxylase (CYP7A), the rate-limiting enzyme in the synthesis of bile acids. T3 stimulates lipoprotein lipase (LPL) to increase the breakdown of VLDL cholesterol. Furthermore, T3 increases the expression of LDL cholesterol receptors that enable LDL to enter cells to be used for its essential functions instead of staying in the bloodstream. T3 can protect LDL from becoming oxidized by mopping up free radicals.
Hyperthyroidism, or an over-active thyroid, has the opposite effect to hypothyroidism and is associated with low HDL and LDL levels. This is not ideal as very low cholesterol is associated with depression, memory loss and increased mortality.
Thyroid hormones are important for the development of a healthy heart, and abnormal thyroid hormone function can cause issues like cardiac hypertrophy and irregular heartbeat. Heart, endothelial and other cells have thyroid hormone receptors that mediate thyroid hormone action. Thyroid dysfunction can affect cardiac output, blood pressure, vascular resistance and heart rhythm.
Administration of iodide, the reduced form of iodine, has been shown to protect the heart tissue from myocardial ischemia reperfusion injury in mice. Smoking increases the risk of atherosclerosis, cardiovascular disease and autoimmune conditions like Graves’ disease. Importantly, smokers appear to have lower levels of thyroid stimulating hormone activity than non-smokers. Smoking is also associated with an increased risk of thyroid disease.
Food and Other Sources of Iodine
Many regions have low soil iodine content, especially those areas that are far inland from the coast. That is why iodine intake in the United States primarily comes from iodized salt, which is in the form of potassium iodide in North America but potassium iodate in other regions due to its better stability in hot/humid climates. Vaporized sea water coming as rainfall is not always able to enrich the soil with enough iodine. This is true even for some coastal areas like the Zanzibar Islands of Tanzania. Mountainous regions and river valleys have the most iodine-deficient soils in the world. Crops and animals grown on low iodine land and consuming low iodine foods become deficient in iodine, which leads to low iodine intakes and iodine deficiency in populations of that region.
Brown algae (seaweed) accumulate iodine at concentrations more than 30,000 times that of seawater. East-Asian countries like Japan and Korea include a lot of seaweed in their diets. The average iodine intake in Japan is 1.2 mg/day from eating seaweeds like kelp, nori and kombu.
According to the World Health Organization, based on estimates from 2002, the proportion of the population with insufficient iodine intake (measured as urinary iodine < 100 mcg/L) is as follows: Americas (9.8%), Western Pacific (24.0%), South-East Asia (39.8%), Africa (42.6%), Eastern Mediterranean (54.1%) and Europe (56.9%).
The Standard American Diet is typically low in iodine unless it includes things like pastured eggs, cranberries, milk, yogurt, shrimp, oysters, tuna/cod or seaweed.
Dairy and milk can also be a reliable source of iodine in many diets, including camel milk and goat milk. Vegan diets that lack seafood and dairy dramatically increase the risk of iodine deficiency unless seaweeds or two and a half medium sized potatoes are eaten on a daily basis.
Most commercial bread has extraordinarily little iodine, but it can be fortified with potassium iodate or calcium iodate. Pasta can contain iodine only when cooked in water with iodized salt or when made out of iodized dough. Most fruits and vegetables are a poor source of iodine.
Iodine is especially important during the first 1000 days of life, as infants in that timeframe are more vulnerable to hypothyroidism and developmental issues. Thyroid hormones are needed for the brain’s myelination and neuronal migration. An iodine deficiency during infancy can impair physical and neuromotor development. It also increases the risk of stillbirths, abortions and developmental abnormalities. Impaired thyroid function in iodine-deficient children is correlated with reduced insulin-like growth factor-1 (IGF-1) and IGF-binding protein-3 (IGFBP-3) and stunted growth.
Women in regions with high dietary iodine intake have higher amounts of iodine in their breast milk. In Korean preterm infants, subclinical hypothyroidism is associated with high iodine levels in breast milk. Expression of the sodium-iodide symporter (NIS) in mammary glands regulates iodine absorption and contributes to the presence of iodine in breast milk. In the thyroid, NIS is regulated by thyroid stimulating hormone, whereas in breast tissue prolactin, oxytocin and β-estradiol regulate its expression. Some environmental chemicals like perchlorate, pertechnetate, nitrate and thiocyanate can compete with iodine absorption by hijacking the sodium-iodide cotransporter.
Fluoride in drinking water may also inhibit the sodium-iodide symporter activity, causing iodine deficiency.
Infant formulas have to be fortified with iodine to mimic breast milk nutrition. Infant formulas in the U.S. are regulated to contain 5-76 mcg/100 kcal of iodine, whereas in the EU it is 15-29 mcg/100 kcal. If the mother is iodine deficient, her whole-body iodine stores will be more concentrated in her breast milk to safeguard infant dietary requirements. Prenatal supplements containing iodine are recommended during pregnancy and lactation, but if the mother is already consuming too much iodine, they may cause fetal complications.
To get the right amount of iodine, you would want to consume a diet that provides anywhere from 150-300 mcg/day of iodine. That ensures there is circulating iodine in your system in adequate quantities to fuel thyroid hormones.
Raw cruciferous vegetables contain goitrogenic compounds called glucosinolates that if overconsumed may inhibit iodine absorption and cause low thyroid function. Cruciferous vegetables include collard greens, broccoli, kale, cauliflower, cabbage and arugula. Soy, cassava, peanuts, corn, certain beans, millet, pine nuts and strawberries also contain goitrogens. Cooking lowers the amount of goitrogenic compounds, thus, if you consume these foods cooked, then it’s much less likely that there will be any issues.
Some amino acids can partially inhibit thyroid hormone transport by competing with the receptors. In rat small intestinal cells, tryptophan, phenylalanine and tyrosine inhibit T3 transportation. Leucine has also been found to have inhibitory effects on T3 and T4 uptake by pituitary cells. Large doses of amino acids in isolation, like a tryptophan supplement or a branched chain/essential amino acid supplement, may have a competitive effect, but this may only be an issue for an hour after consumption.
Certain medications can inhibit thyroid hormone uptake into certain tissues by competing with them due to structural similarities. These include certain antiarrhythmic drugs, calcium channel blockers (like nifedipine, verapamil and diltiazem), calmodulin antagonists and benzodiazepines. Some anti-inflammatory drugs (meclofenamic acid, mefenamic acid, fenclofenac, flufenamic acid and diclofenac), as well as diphenylhydantoin, inhibit T3 uptake. Medications that have goitrogenic effects include antibiotics like ciprofloxacin and rifampin, steroids, statins and others. However, diabetic drugs like metformin may reduce the goitrogenic effects of type-2 diabetes and obesity. ACE inhibitors such as benazepril, lisinopril and fosinopril are blood pressure medications that can elevate potassium levels when taken together with potassium iodide. Taking potassium iodide with potassium-sparing diuretics can also cause hyperkalemia (high potassium). Certain food chemicals and flavor-enhancers like perchlorate block the thyroid’s ability to absorb and use dietary iodine.
Chronic low-calorie dieting can cause irreversible damage to the thyroid. A 2016 study on the Biggest Loser competitors discovered that 6 years after the show the participants had regained 70% of the weight they initially lost and were burning 700 fewer calories per day compared to when they started the show. Despite gaining some lean muscle during the show, their metabolic rate didn’t increase due to the severe calorie restriction they experienced, which mimicked starvation. This is the result of crash dieting and yo-yo dieting, which inevitably disrupts thyroid function.
Prolonged fasting for 72 hours has been shown to decrease serum T3 by 30% and thyroid-stimulating hormone by 70% in healthy men. Similar results have been noted with 4-day fasts. Men typically see greater changes than women. However, T3 and reverse T3 return to prefasting levels after refeeding and returning to eating a normal diet. Thus, alterations in thyroid hormones are an acute adaptive mechanism to food scarcity to preserve energy. Having a lower metabolic rate also protects against muscle loss during fasting.
Fructose has been shown to reduce liver T4 uptake in humans and rats. This is thought to be caused by fructose raising lactic acid and uric acid levels, which consume ATP. Physiological stress has also been noted to reduce T3 and T4, and posttraumatic stress disorder is associated with higher rates of hypothyroidism. Thus, the synthesis and transport of thyroid hormones are energy-dependent and can be reduced by fasting, excessive fructose intake and magnesium deficiency.
Here are 12 things that lower thyroid hormones or thyroid hormone uptake:
Special salts, like sea salts, do not usually contain iodine, however, Himalayan and other pink rock salts typically do contain some naturally occurring iodine.
Abrupt intake of iodine after a period of deficiency, such as seen with salt iodization, can cause hyperthyroidism and raise thyroid antibodies, especially in young women. Introducing too much iodine when you have been deficient beforehand might trigger autoimmunity against the thyroid, so a more gradual re-introduction of iodine, from more natural sources (salts or foods naturally containing iodine), may be optimal.
Drinking water contains on average 1-10 mcg/L of iodine. High levels of iodine in groundwater have been reported in Algeria, Argentina, China, Denmark, Djibouti, Somalia, Ethiopia and Kenya. In some places like Somalia, drinking water is the only source of iodine. Long-term exposure to excess iodine from drinking water has been seen to cause hypothyroidism in children. Excess iodine from water is also thought to cause endemic goiter in some regions of China but not in others. Water purified with iodine has also been shown to cause thyroid dysfunction in a few cases of Nigerian workers and U.S. astronauts.
Excess iodine is more toxic in the presence of selenium deficiency, which can lead to damage and oxidation of proteins. In fact, supplemental selenium alleviates the toxic effects of excess iodine, such as thyroid damage and thyroid hormone dysfunction. Acute symptoms of iodine poisoning include burning of the mouth, vomiting, diarrhea, coma and pain in the stomach and throat. Some people are also hypersensitive towards iodine-containing foods and products, which makes them experience rashes and allergies. However, there are no confirmed reports of actual iodine allergy.
Assessing Iodine Status
Under normal circumstances, only 10% or less of circulating iodine gets taken up by the thyroid, but during chronic deficiency it can be more than 80%. The half-life of plasma iodine is about 10 hours, but this is reduced in iodine deficiency. Over 90% of iodine is absorbed in the small intestine and 90% of it is excreted within the first 24-48 hours. The half-life of T4 is 5 days and for T3 it is roughly 1.5-3 days. Urinary iodine concentration (UIC) is the most commonly used biomarker for measuring iodine status within a population. Intradiurnal rhythms will skew individual spot urine samples, which is why spot urinary iodine is not recommended for individual assessment.
Thyroid-stimulating hormone (TSH) is often used to assess hypothyroidism, but it is not fully accurate due to individual differences and diurnal variations. Generally, TSH levels start to rise when iodine intake drops below 100 mcg/day. The normal range for TSH is 0.45-4.5 mU/L, and 95% of the disease-free population in the U.S. has a TSH between 0.45 and 4.12 mU/L. T4 and T3 measure hormone levels but not necessarily thyroid hormone function (receptor activation and the subsequent effects thereafter). TSH is not accurate for assessing iodine status in adults, but it may be for newborns because they have a higher rate of iodine turnover than adults.
The Sodium-Selenium-Iodine (SSI) Connection
Selenium is an essential mineral that protects against oxidative damage, promotes DNA synthesis and is needed for metabolism. Selenium is contained in over two dozen selenoproteins in the body, many of which have roles in cell redox homeostasis and antioxidant defense. Selenium has two forms: inorganic (selenate and selenite) and organic (selenomethionine and selenocysteine) both of which are good dietary sources of selenium.
Selenoproteins are involved in thyroid hormone metabolism and antioxidant defense. Selenium is present in selenoproteins as selenocysteine, which is essential for their enzymatic activities. In addition to concentrating iodine, the thyroid gland also contains the highest concentration of selenium per gram of tissue in the body, owing to its high number of selenoproteins.
The main selenoproteins, namely glutathione peroxidase, thioredoxin reductase and deiodinases are expressed in the thyroid gland in large quantities. Here is how they affect thyroid function:
Thus, selenium is important not only for activating thyroid hormones but also for recycling them. In other words, iodine does not work without selenium and sodium, and a lack of any of these essential nutrients will worsen thyroid function.
Sodium-selenium therapy improves thyroid status and the T3/T4 ratio in cystic fibrosis and congenital hypothyroidism. The more impactful effect seems to be that selenium supplementation improves thyroid parameters by increasing deiodinase activity.
However, selenium supplementation without prior iodine replenishment may exacerbate hypothyroidism because there isn’t enough glutathione peroxidase to clean up the hydrogen peroxide that gets created during iodine-deficient hypothyroidism. Those unleashed free radicals will then begin to damage thyroid cells and interfere with thyroid health. Thyroid function has not been shown to change in healthy subjects receiving additional supplemental selenium if they have no parameters of hypothyroidism.
Selenium deficiency increases thyroid size and risk for goiter. Lower selenium levels are associated with increased thyroid volume and nodule formation. Thus, adequate selenium may protect against goiter as well as thyroid disease. The incidence of goiter in children is associated with the amount of selenium in the local soil – i.e., low selenium in soil, low selenium in food, selenium and iodine deficiency, hypothyroidism and goiter. Fixing iodine deficiency in children with goiter and selenium deficiency does not reduce the volume of goiter or improve thyroid function.
Selenium deficiency leads to the activation of oncogenes (cancer causing genes) due to an increased formation of reactive oxygen species. Thus, selenium might have a role in the prevention of cancer due to its antioxidant and immunomodulatory properties.
Chronic lymphocytic thyroiditis (CLT) can be improved with selenium at a dose of 200 mcg/day given for 3-12 months by decreasing anti-thyroperoxidase (TPO) antibodies. The suppression of antibodies appears to require doses above 100 mcg/day to maximize glutathione peroxidase. CLT is the most common autoimmune thyroid disease in which iodine status is adequate and where genetic factors of selenium deficiency are present. Giving 200 mcg of selenium/day as selenomethionine might help to reduce postpartum thyroiditis by reducing thyroid antibodies. Selenium given at 80 mcg/day for 12 months has also been observed to significantly lower anti-TPO antibodies in patients with Hashimoto’s thyroiditis with normal T4 and TSH levels.
Graves’ disease is also correlated with lower selenium levels and selenium is necessary for protecting against the increased oxidative stress that occurs during this condition. Selenium supplementation has also been shown to improve the quality of life in patients with Graves’ orbitopathy.
Selenium deficiency, in combination with an RNA viral infection called coxsackie virus, can lead to Keshan disease, which is a type of cardiomyopathy. Keshan disease originated from parts of China where people got on average 11 mcg/day of selenium. An intake of at least 20 mcg/day is needed to prevent Keshan disease. Observational studies have found an association between lower selenium in people with HIV and increased risk of cardiomyopathy and death. Selenium deficient women also have a higher chance of transmitting the virus to their offspring. Selenium supplementation can reduce hospitalization and HIV viral load.
Selenium supplementation in those who have a baseline deficiency likely provides significant benefits, whereas consuming over 300 mcg/day of selenium may potentially lead to negative consequences:
People with lower selenium status are more likely to experience cognitive decline and neurodegeneration because of increased oxidative stress. Low selenium in the elderly also correlates with worse memory. In France, supplementation with antioxidants, including selenium, has been shown to improve episodic memory and semantic fluency in subjects aged 45-60.
Selenium has detoxifying effects on several heavy metals, such as cadmium, mercury and lead. The mechanisms are mediated by glutathione peroxidases and other antioxidants that help to eliminate these toxic compounds. Mercury toxicity can inhibit selenoenzymes that are needed to combat oxidative stress. Thus, selenium might be a cornerstone mineral for preventing the damage that occurs from heavy metal toxicity.
Here are the selenium-dependent enzymes, functions and consequences that may occur with a deficit:
Selenium-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Getting Enough Selenium from the Diet
How much selenium is in food depends on the amount of selenium in the soil. The soil contains inorganic selenium that plants take up and convert into the organic form. Selenocysteine and selenite are reduced to hydrogen selenide, which is then converted to selenophosphate for selenoprotein biosynthesis. In animal and human tissues, selenium is in the form of selenomethionine and selenocysteine.
Ocean waters and rivers also contain quite a lot of selenium, which is why meat, fish and seafood are generally much higher in selenium than plants.
The RDA for selenium is 15 mcg for infants, 20-40 mcg for children and 55 mcg for adults. During pregnancy and lactation, the RDA for selenium increases to 60 and 70 mcg/day, respectively. Selenium toxicity or selenosis generally occurs if you exceed 400 mcg/day (although the actual level that induces toxicity is likely 800 mcg/day or higher).
Large doses of selenium in individuals without selenium deficiency may promote hyperglycemia and insulin resistance. Symptoms of too much selenium consumption include a metallic taste in the mouth, garlic odor, hair and nail loss and brittleness. Diarrhea, nausea, fatigue and skin lesions are also common.
The low selenium content of New Zealand’s diet was fixed by increasing the import of high-selenium wheat. Selenium levels are well known to be lower in smokers and the elderly.
Eating 2-3 Brazil nuts per day will meet daily requirements. Another option is consuming seafood 2-3 times per week, which also provides iodine, omega-3s and protein. The majority of selenium is absorbed in the small intestine (50-80%) and excreted by the kidneys (60%). Thus, your selenium status is also determined by the state of your digestive and excretory organs.
Here are things that raise the requirement for selenium:
Selenium supplements mostly come in the form of selenomethionine, selenium-enriched yeast, sodium selenite or sodium selenate. The human body can absorb up to 70-90% of selenomethionine but only 50% of selenium from selenite. Thus, selenomethionine is the best organic selenocompound for improving selenium status because it is incorporated into proteins in a non-specific way. However, selenomethionine needs to be converted (via selenocysteine) to hydrogen selenide (H2Se) beforehand, making it less efficient metabolically. Nevertheless, organic selenocompounds like selenomethionine cause less acute toxicity, which is why they are preferred in short-term therapy.
Naturally, selenomethionine is mostly found in animal tissue and protein, which is why meat and fish tend to be far more bioavailable sources of selenium.
Plasma or serum selenium levels at 8 mcg/dL or above are enough to meet the requirements for selenoprotein biosynthesis. The average serum selenium concentration in U.S. adults over 40 years old is 13.67 mcg/dL. Males tend to have higher selenium levels than females and whites more than African Americans. Blood and urine selenium reflect only recently consumed selenium. Measuring glutathione peroxidase can be an indirect measurement of selenium status. Selenoprotein P may be one of the best markers for measuring total body selenium status.
Other Minerals for Thyroid Health
There are also other minerals relevant to thyroid function, which are summarized below:
Eat mineral dense superfoods like liver, oysters, spinach and seaweed at least 1-2 times per week and foods like pastured meat and eggs on a daily basis. Pay attention to the inhibitors and chelators like phytates and goitrogens but deliberate avoidance is not necessary, unless you already have existing clinical hypothyroidism.
Most people are not aware that they have hypertension because they do not experience any visible symptoms, which is why it is called the “silent killer”. However, if you do experience symptoms of elevated blood pressure, it can include headache, nosebleed, arrhythmia, heart pounding, blurry vision, buzzing in the ears, fatigue, nausea, chest pain, anxiety and tremors.
Systolic blood pressure represents the pressure in blood vessels while the heart contracts or beats. Diastolic blood pressure is the pressure in the vessels while the heart rests between beats. Hypertension is diagnosed if your systolic blood pressure is ≥ 140 mmHg (millimeters of mercury) and/or your diastolic blood pressure is ≥ 90 mmHg on two different days.
There are many things that cause hypertension, such as obesity, aging, lack of exercise, excess alcohol consumption, smoking and electrolyte imbalances. Insulin resistance and metabolic syndrome are also associated with hypertension. Consumption of more fruit and vegetables has been consistently shown to be associated with lower blood pressure. Part of that is deemed to be because of a higher potassium content that regulates the body’s fluid volume and leads to blood vessel dilation. Consuming excess sodium, typically on top of a low potassium intake, is associated with an increased risk of hypertension.
High potassium intake may protect against stroke mortality, hypertension, arterial endothelial injury, cardiac hypertrophy and kidney damage.
Effects of Potassium on Health
Potassium is the most abundant intracellular cation (an ion with a positive charge) and helps to maintain intracellular fluid volume. The adult human body contains around 1,755 mg of potassium per kg of body weight. Of that amount, only 2% is located in the extracellular fluid; most is found in skeletal muscle. The concentration of potassium inside the cell is 30 times higher than outside the cell. This difference creates a transmembrane electrochemical gradient that is maintained by the sodium-potassium ATPase pump (Na+-K+ ATPase). The gradient of potassium across cell membranes determines the cells’ membrane potential. The resting membrane potential and the electrochemical differences across cell membranes are critical for normal cellular function, and the transfer of potassium ions across nerve membranes is essential for nerve function and transmission.
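The link between this 30-fold gradient and the membrane potential can be made concrete with the Nernst equation. A minimal sketch, assuming body temperature and the 30:1 gradient quoted above (actual intra- and extracellular concentrations vary by tissue):

```python
import math

R = 8.314    # gas constant, J/(mol*K)
T = 310.0    # body temperature, K (~37 C)
F = 96485.0  # Faraday constant, C/mol
z = 1        # charge of K+

# ~30-fold higher potassium inside the cell than outside (from the text)
k_out_over_k_in = 1 / 30

# Nernst equilibrium potential for potassium
e_k_volts = (R * T) / (z * F) * math.log(k_out_over_k_in)
print(f"E_K ~ {e_k_volts * 1000:.0f} mV")  # ~ -91 mV, near the typical resting potential
```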
Once activated, the sodium-potassium pump exchanges 2 extracellular potassium ions for 3 intracellular sodium ions. Other channels responsible for maintaining differences in cell membrane potential are the sodium-potassium-chloride (Cl−) symporter and the sodium-calcium (Ca) exchanger. The ATPase is also stimulated by activation of β-adrenergic and insulin receptors, which are stimulated by epinephrine/norepinephrine and insulin, respectively. Consequently, simultaneous stimulation of both the β-adrenergic and insulin receptors increases the influx of potassium into the cell but causes sodium to leave the cell via the Na+-K+ ATPase.
Triiodothyronine (T3) and thyroxine (T4) increase Na-K-ATPase subunits by increasing gene expression (although the effect is the opposite in the thyroid gland, where thyroid hormones have a negative feedback effect on the Na-K-ATPase). The Na-K-ATPase provides the sodium gradient used to absorb iodide via the sodium-iodide symporter, so reducing its activity reduces iodide transport into the thyroid gland and thereby thyroid hormone synthesis.
Insulin receptor substrate-1 (IRS1) and intracellular glucose transport proteins (GLUT4) facilitate the influx of glucose into the cell. Downstream signaling events, involving cyclic adenosine monophosphate (cAMP), protein kinase B (Akt) and IRS1-phosphatidylinositide-3-kinase (PI3-K) regulated pathways, also mediate the influx of glucose as well as potassium into the cell. The fastest potassium exchange occurs in the kidneys and the slowest in muscles and the brain.
Metabolic acidosis, or a drop in extracellular pH, increases the loss of intracellular potassium into the extracellular space via transporters that regulate pH in skeletal muscle. Organic acidosis, caused by an accumulation of lactic acid or uric acid, lowers intracellular pH, which stimulates the movement of hydrogen out of the cell via the sodium-hydrogen transporter and bicarbonate into the cell via the sodium-bicarbonate transporter. As a result, intracellular sodium increases, which maintains Na+-K+ ATPase activity limiting the loss of potassium from the cell. Thus, organic acidosis, say from lactic acid build up from excessive exercise, causes a much smaller loss of intracellular potassium than does metabolic acidosis.
Fixing low grade metabolic acidosis with potassium bicarbonate has been shown to increase 24-hour mean growth hormone secretion as well as provide a nitrogen sparing effect that helps to prevent muscle loss in postmenopausal women. In patients on bed rest, potassium bicarbonate (3,510 mg/day) and whey protein supplementation (0.6 g/kg of body mass per day) for 19 days has been noted to mitigate the loss of muscle function due to inactivity.
The alkaline theory has already been tested in several studies of patients with kidney disease, showing that consuming an alkaline diet (either via fruits/vegetables or sodium bicarbonate supplementation) over several months can improve metabolic acidosis and kidney function. In postmenopausal women, an alkaline treatment (potassium bicarbonate supplementation) improved calcium and phosphorus balance, reduced bone resorption, increased bone formation and reduced urinary nitrogen (i.e., muscle) loss. These findings have been confirmed in middle-aged and elderly men, in whom alkaline therapy (potassium bicarbonate) has a muscle protein-sparing effect.
Animal proteins, when broken down, release their sulfur-containing amino acids that provide an acidic load (sulfuric acid), which cannot simply be breathed out of the body.
Added sugars and grains also contribute to the dietary acid load and are typically overconsumed in the Western world. The lack of potassium/bicarbonate-forming foods in the western diet exacerbates the metabolic acidosis caused by eating the processed, industrialized foods, promoting osteoporosis, inflammation and kidney stones.
Bicarbonate-forming and potassium-rich foods are thought to improve bone health, thanks to regulating the acid-base homeostasis as well as their antioxidant content. Fruits and vegetables are the only foods that contain both potassium and bicarbonate-forming substances, giving them a negative potential renal acid load (PRAL). Eating acidic foods isn’t necessarily harmful as long as other variables like potassium/calcium/magnesium intake, bicarbonate-forming foods, metabolic and kidney health are taken into account.
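PRAL is commonly estimated from nutrient intakes using the published Remer-Manz regression coefficients. The sketch below uses that formula; the daily intake figures are hypothetical illustrations, not numbers from the book:

```python
def pral_meq_per_day(protein_g, phosphorus_mg, potassium_mg,
                     magnesium_mg, calcium_mg):
    """Potential renal acid load (Remer & Manz, 1995).
    Positive = net acid-forming; negative = net base-forming."""
    return (0.4888 * protein_g
            + 0.0366 * phosphorus_mg
            - 0.0205 * potassium_mg
            - 0.0263 * magnesium_mg
            - 0.0125 * calcium_mg)

# Hypothetical Western-style day: high protein/phosphorus, modest potassium
print(pral_meq_per_day(protein_g=100, phosphorus_mg=1400, potassium_mg=2500,
                       magnesium_mg=300, calcium_mg=900))   # ~ +30: acid-forming

# Hypothetical fruit/vegetable-heavy day: potassium and magnesium dominate
print(pral_meq_per_day(protein_g=60, phosphorus_mg=1100, potassium_mg=4500,
                       magnesium_mg=450, calcium_mg=900))   # ~ -46: base-forming
```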
Potassium, Sodium, and Blood Pressure
The low salt guidelines are not really warranted for most people and may actually cause other problems, such as insulin resistance, hypothyroidism, increased heart rate, heart failure, kidney damage, high triglycerides and cholesterol.
Chronic salt restriction may cause “internal starvation”, making the body raise insulin levels because insulin helps the kidneys retain sodium. That is why a high salt intake can be more dangerous if you are diabetic or have insulin resistance because your body is holding onto more sodium. On the flip side, healthy kidneys can handle excess sodium with ease. Salt sensitivity is primarily driven by insulin resistance and sympathetic overdrive.
The body needs sodium for nerve function, muscle contraction and cell metabolism. You would be fine without ever eating another gram of sugar, but you wouldn’t survive very long without salt.
Regular table salt (sodium chloride) can raise blood pressure significantly more than sodium bicarbonate or sodium acetate. Table salt is 100% sodium chloride, whereas many unrefined salts are anywhere from 92-97% sodium chloride, with the other 3-8% composed of other minerals like calcium, magnesium and iodine. Sodium bicarbonate has actually been shown to lower blood pressure slightly in hypertensive individuals.
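Because these percentages are by weight of sodium chloride, converting grams of salt to milligrams of sodium is simple mass arithmetic. A small sketch (atomic masses are standard values; the purity figures come from the text):

```python
# Sodium's mass fraction in sodium chloride, from atomic masses
NA, CL = 22.99, 35.45
NA_FRACTION = NA / (NA + CL)  # ~0.393

def sodium_mg(salt_g, nacl_purity=1.0):
    """Milligrams of sodium in a given weight of salt.
    nacl_purity: 1.0 for table salt, ~0.92-0.97 for unrefined salts."""
    return salt_g * 1000 * nacl_purity * NA_FRACTION

print(sodium_mg(5.0))                    # 5 g table salt: ~1967 mg sodium
print(sodium_mg(5.0, nacl_purity=0.94))  # 5 g unrefined salt: ~1849 mg sodium
```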
Modern diets that are high in salt wouldn’t be such a big issue if they also didn’t lack bicarbonate and potassium. Thus, it isn’t the dietary salt that’s the problem, it is the lack of the other alkaline minerals (calcium, magnesium and potassium) and base-forming substances/bicarbonate. Fruit and vegetables contain potassium phosphate, sulfate, citrate, bicarbonate and others, but not chloride, which can be added to salt substitutes and supplements (think magnesium chloride). Potassium bicarbonate has a natural sodium-excreting effect, reversing sodium chloride-induced increases in blood pressure.
Low potassium levels in the blood usually occur alongside low magnesium levels, both of which can promote cardiovascular complications. This is because low potassium levels can be caused by magnesium deficiency.
There are many ways potassium can help with hypertension and cardiovascular disease:
Potassium also has a beneficial effect on blood sugar management and insulin resistance. These problems may become a bigger issue for hypertensive people who are prescribed potassium excreting thiazide diuretics. Thiazide diuretics are the preferred pharmacological treatment for hypertension but have a tendency to reduce glucose tolerance and increase the risk of type 2 diabetes. Thiazide diuretics lower serum potassium, which may actually be caused by an increased urinary loss of magnesium, and diuretic-induced hypokalemia has been shown to reduce glucose tolerance by reducing insulin secretion.
Potassium is an essential activator of pyruvate kinase, which is the enzyme involved with the last step of glycolysis, whereby energy is produced from glucose. One of the binding sites of pyruvate is dependent on potassium ions.
Here are the potassium-dependent enzymes, functions and consequences that may occur with deficient potassium:
Potassium-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Because low potassium impairs calcium reabsorption by the kidneys, it can increase urinary calcium, which can cause kidney stones. Potassium citrate may prevent kidney stones from forming and it also helps to break down kidney stones. However, it is primarily citrate that seems to be responsible for this effect. Nevertheless, a low potassium intake is still a potential contributor to kidney stone formation.
You can ask your doctor to assess your kidney health and functioning using the below methods:
To improve kidney health, you want to eat a healthy diet, maintain an active lifestyle, consume an adequate amount of salt, potassium and magnesium, consume bicarbonate or fruits/vegetables, stay hydrated and avoid refined carbs/sugars/omega-6 seed oils. You should also avoid smoking and excessive alcohol intake.
Getting Potassium from Food and Supplements
Malabsorption states, such as those following bariatric surgery, and conditions like IBS and inflammatory bowel disease (ulcerative colitis and Crohn’s), can reduce potassium absorption.
In Nordic countries, fruit and vegetables contribute 17.5% of overall potassium intake, whereas in Greece they provide 39%. Vegetables and potatoes typically provide about 24.5% of the total potassium intake in the UK. In the U.S., potatoes contribute up to 19-20% of the total daily potassium intake. There is an inverse relationship between both raw, as well as cooked, vegetables and blood pressure, although the relationship is slightly greater with raw vegetables. Monocropping, and other non-regenerative farming methods, depletes the soil of potassium, reducing the potassium content of the food that is grown on it.
In 2019, it was decided that there was insufficient data to set an RDA for potassium, so an adequate intake was set at 3,400 mg/day for men and 2,600 mg/day for women. However, adequate intakes and RDAs are not optimal intakes. For primary prevention of hypertension, it has been recommended to get over 3,500 mg/day of potassium, which is above the adequate intake for both men and women.
Many “salt sensitive” people simply need more potassium, magnesium or calcium instead of sodium restriction. It’s very likely that the higher the magnesium intake, the less potassium is needed, as the former helps prevent potassium loss from the cell.
Hunter-gatherer tribes that eat only whole foods have been estimated to consume anywhere from 5,850-11,310 mg/day of potassium. Sodium intake was estimated at 1,131-2,500 mg/day, but this did not take into account sodium consumed from blood, interstitial fluids, salt licks and salty/brackish water, so it is an underestimate. Accounting for that, the potassium-to-sodium ratio of our ancestors likely sits somewhere between 2-3:1, though from the food-based estimates alone it could be as high as 10:1 in some forager groups. By contrast, industrialized societies that consume a lot of processed foods get about 2,100-2,730 mg/day of potassium and about 3,400 mg/day of sodium, a K/Na ratio of roughly 0.7:1.
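Dividing the intake estimates above reproduces these ratios. Note that the raw food-based midpoints give a forager ratio closer to 5:1, which the authors revise down to 2-3:1 once the uncounted sodium sources are considered:

```python
# Potassium-to-sodium intake ratios, using the estimates cited above (mg/day)
forager_k = (5850 + 11310) / 2   # midpoint of the quoted potassium range
forager_na = (1131 + 2500) / 2   # likely an underestimate (see text)

western_k = (2100 + 2730) / 2
western_na = 3400

print(f"Forager K/Na: ~{forager_k / forager_na:.1f}:1")  # ~4.7:1 before correction
print(f"Western K/Na: ~{western_k / western_na:.1f}:1")  # ~0.7:1
```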
Even fairly large daily doses of potassium do not seem to be an issue for those with normal kidney function. The ratio of dietary potassium to sodium also affects potassium concentrations.
Potatoes have even been shown to have a beneficial effect on blood pressure, lipid profiles and glycemic control, which may have something to do with their high potassium content. Consider eating lightly cooked and cooled potatoes: lightly cooking and cooling them preserves their resistant starch, helping to reduce the spike in glucose and insulin that occurs after you eat them. Perhaps more importantly, eating a steak with your potato will also reduce the glucose/insulin spike.
Using salt substitutes has its risks and benefits. On one hand, salt substitutes that have more potassium and less sodium can be great for increasing dietary potassium intake without the need for supplementation. However, it can also cause hyperkalemia and do harm in patients with chronic renal failure whose kidneys are working at near maximum capacity to excrete potassium. This effect is exacerbated further by sodium restriction.
Potassium Absorption and Excretion
Approximately 90% of the consumed potassium is lost in the urine and 10% of it is excreted in the stool. A small amount of potassium is lost through sweat. With zero potassium intake, serum potassium can reach deficient levels of 3.0-3.5 mmol/L in about a week. Potassium excretion is regulated by the kidneys in response to dietary potassium intake. Unless you are potassium deficient, excretion of potassium increases rapidly after consuming potassium.
The glomerulus of the kidneys freely filters potassium and 70-80% of it is reabsorbed in the proximal tubule and loop of Henle. The two major factors that contribute to potassium loss are the renal handling of sodium and mineralocorticoid activity. Reabsorption of potassium in the proximal tubule is mostly passive and happens in proportion to the reabsorption of solute and water. In the loop of Henle, a small amount of potassium is secreted in the descending limb, whereas reabsorption of potassium and sodium occurs in the thick ascending limb. Much of the potassium excretion/regulation happens in the late distal convoluted tubule (DCT) and the early connecting tubule of the kidney. Dietary potassium intake inhibits the sodium-chloride transporter in the distal convoluted tubule, reducing the reabsorption of salt by the kidneys. In other words, potassium consumption promotes salt excretion and helps people handle higher salt loads. Low potassium, however, causes salt retention and contributes to hypertension through activation of the sympathetic nervous system.
Aldosterone is a major mineralocorticoid that increases the excretion of potassium into the urine by increasing sodium reabsorption in the lumen. It stimulates epithelial sodium channels and Na+/K+-ATPase activity. Primary aldosteronism is associated with an increased risk of cardiovascular disease. The diurnal rhythm of aldosterone expression can affect renal potassium excretion, which should be kept in mind when taking urine samples. Aldosterone tends to be highest in the morning between 6-9 AM, together with cortisol, and drops in the afternoon. Salt restriction increases aldosterone, which promotes sodium reabsorption and potassium/magnesium excretion, creating the opposite of the desired effect. Normal or even excess sodium intake results in suppressed or delayed aldosterone activity. Thus, reducing sodium intake may not be the most effective long-term strategy for lowering blood pressure.
A normal serum potassium concentration is 3.6-5.0 mmol/L. Hypokalemia or low potassium levels occur below 3.6 mmol/L and hyperkalemia (elevated potassium) is above 5.0 mmol/L.
Hypokalemia and hyperkalemia are rare in healthy people with normal kidney function, but they can occur due to diarrhea, kidney failure or vomiting. Hypokalemia affects up to 1/5th of hospitalized patients because of the use of diuretics and other pharmaceuticals. Mild hypokalemia can cause constipation, muscle cramps, weakness and malaise. Moderate hypokalemia causes encephalopathy, glucose intolerance, paralysis, arrhythmias and dilute urine. Severe hypokalemia can be fatal due to the induction of arrhythmias. Magnesium deficiency can promote hypokalemia by increasing the excretion of potassium, and the two deficiencies are associated with each other; thus, fixing potassium deficiency often also requires correcting magnesium deficiency. Hyperkalemia can result from the use of ACE inhibitors, such as benazepril, and ARBs, such as losartan, because they reduce potassium excretion. Potassium-sparing diuretics can also increase the risk of hyperkalemia. The side effects of hyperkalemia include arrhythmias, palpitations, muscle pain and weakness. In severe cases, hyperkalemia can cause cardiac arrest and death.
Here are the things that increase potassium demand or excretion:
Here are the things that improve potassium status
Assessing potassium status is difficult because most of the body’s potassium is located in the cell. Blood potassium levels do not correlate accurately with tissue potassium stores. Muscle biopsies can be used to look at tissue potassium status but measuring net potassium retention and excretion can also be an indicator.
Ways to Optimize Potassium Intake without Causing Hyperkalemia:
In conclusion, potassium and sodium are important minerals for cardiovascular health and the prevention of hypertension and insulin resistance. Traditionally, all the blame has been put on sodium when really it is sugar that’s the problem. Insulin resistance and diabetes can cause sodium retention and accumulation and raise blood pressure. At the same time, salt restriction comes with an array of negative side-effects that perpetuate insulin resistance and cardiac complications. Thus, it is more important to fix the underlying cause of hypertension, i.e., metabolic syndrome and obesity, which is due to an overconsumption of refined carbohydrates and sugars. Keeping potassium intake high by eating more muscle meat, fish, potatoes, fruit and green leafy vegetables alongside other mineral-dense foods is a great way to prevent glucose intolerance and ensure a healthy sodium to potassium ratio. Reaching an intake of potassium of 4,000-4,500 mg/day may seem like a daunting task but it is possible if you replace the processed foods with whole animal foods and fresh produce.
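As a rough illustration of how whole foods can add up to that 4,000-4,500 mg/day target, here is a sketch with approximate per-serving potassium values (the figures are ballpark estimates for illustration only, not from the text; consult a food database such as USDA FoodData Central for precise numbers):

```python
# Approximate potassium per serving (mg); ballpark figures for illustration.
foods_mg = {
    "baked potato, medium, with skin": 900,
    "salmon, 6 oz": 650,
    "spinach, 1 cup cooked": 840,
    "steak, 8 oz": 700,
    "banana, 1 medium": 420,
    "avocado, 1/2": 490,
}

total = sum(foods_mg.values())
print(f"Total potassium: {total} mg/day")   # -> 4000 mg
print("Target reached:", total >= 4000)     # -> True
```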
Boron intake is associated with a reduced risk of cancer, improved brain function, bone mineralization and reduced inflammation. It also seems to have anti-inflammatory, anti-osteoporotic, anti-coagulating, anti-neoplastic and hypolipemic (blood lipid lowering) effects.
Boron is found in many foods, especially vegetation, as it is a structural component of plant cell walls, providing cell wall rigidity. In plants, boron also has important roles in nucleic acid, carbohydrate and protein metabolism.
Boron’s Effect on Health and Vitality
More than 90% of consumed boron is absorbed and distributed as boric acid, with 98% getting excreted through urine within 120 hours. Nearly 96% of the boron in an organism is uncharged boric acid [B(OH)3] and a small amount of it is the borate anion [B(OH)4]−. Boric acid or borate forms stabilizing ester complexes with many sugars important in energy production, such as ribose, helping to produce energy and combat fatigue. Because of that, boron can regulate ribose-containing enzymes, substrates and cofactors, such as S-adenosylmethionine (SAMe), diadenosine phosphates, NAD+, and the NAD+ metabolite cyclic ADP ribose (cADPR).
These molecules are involved in energy production, neurological function, cardiovascular health and bone formation. S-adenosylmethionine (SAMe) and diadenosine phosphates have a high affinity for boron. S-adenosylmethionine is one of the most common enzymatic substrates in the body, used primarily in methylation reactions. Boron also binds to oxidized NAD+ and cyclic ADP ribose, which can prevent the release of intracellular calcium.
Turkish men with a boron intake of 6 mg/day have significantly smaller prostate glands than men who consume 0.64-0.88 mg/day. Physiological concentrations of boric acid have been shown to inhibit prostate cancer cell growth in a dose-dependent manner through controlled apoptosis. Tumor suppressor p53 function requires the activation of activating transcription factor 4 (ATF4) and binding immunoglobulin protein (BiP), also known as glucose-regulated protein (GRP-78) or heat shock 70 kDa protein 5 (HSPA5), both of which get increased by physiological doses of boric acid. Boric acid inhibits cADPR, reducing endoplasmic reticulum Ca2+. This reduction in endoplasmic reticulum Ca2+ activates ATF4 and nuclear factor erythroid 2 like 2 (Nrf2), which increases antioxidant response element genes.
Supplemental doses of boron likely act as an Nrf2 booster, which activates our body’s overall antioxidant defense enzymes. This is why boron has been suggested to protect against oxidative stress and DNA damage. Boron compounds (such as boric acid and borax) can increase total glutathione levels, total antioxidant capacity and numerous antioxidant enzyme activities in red blood cells, such as superoxide dismutase, catalase, glutathione peroxidase, glutathione-S-transferase (GST) and glucose-6-phosphate dehydrogenase (G-6-PDH).
Boron might be useful for reducing symptoms of osteoarthritis by reducing inflammation. By binding to 6-phosphogluconate, boron inhibits the 6-phosphogluconate dehydrogenase enzyme and reduces the inflammatory response and reactive oxygen species. Boron also inhibits other inflammatory enzymes like lipoxygenase that triggers the inflammatory responses of prostaglandins, leukotrienes and thromboxanes. Supplementing with calcium fructoborate at 112 mg/day reduces LDL, triglycerides, total cholesterol, interleukin-6, monocyte chemoattractant protein-1, c-reactive protein and the pro-inflammatory cytokine IL-1beta. Boron intake helps to regulate osteoblast and osteoclast proteins. Boron deprivation inhibits protein synthesis, which can result in delayed wound healing and soft tissue repair.
Boron also controls calcium and magnesium metabolism, which are vital for bone health. Boron supplementation reduces calcium excretion. In postmenopausal women, a low boron diet (0.25 mg boron/2,000 kcal) can raise urinary calcium and magnesium excretion. At the same time, a low boron diet promotes hyperabsorption of calcium for compensation. That can cause cardiovascular and bone health complications. Supplementing with 3 mg of boron a day for 10 months in sedentary female college students reduces serum phosphorus, increases serum magnesium and may slow the loss in bone mineral density.
Boron affects the function of many hormones, including vitamin D. Supplementing 3 mg/day of boron after 63 days of boron deprivation (0.25 mg/day) raises 25-OH-vitamin D (calcidiol) levels in older men and women. Boron also increases the half-life of active vitamin D likely through inhibition of 24-hydroxylase, which is the primary enzyme that inactivates calcitriol.
Boron affects sex hormones like estrogen and testosterone. By inhibiting sex hormone binding globulin (SHBG), boron prevents testosterone from being bound up, resulting in higher free testosterone. Instead of increasing estrogen directly, boron inhibits its breakdown and enhances estrogen receptor beta activity. Both estrogen and testosterone levels have been shown to double in women and men, respectively, after an increase in boron intake in individuals who were previously on a low boron diet. In healthy men, supplementing with boron at 10 mg/d for only a week can raise free testosterone from 11.83 pg/mL to 15.18 pg/mL and decrease estrogen from 42.33 pg/mL to 25.81 pg/mL (although in some studies boron increases both testosterone and estradiol levels in men).
Here are the boron-dependent enzymes, functions and consequences that may occur with deficient boron intake:
Boron-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Boron Food Sources
Boron is found mostly in plant foods, such as legumes, vegetables, tubers and fruit. Alcoholic beverages like wine, beer and cider also have some boron. Personal-care products like toothpaste, lip balm, baby oil and deodorants also contain trace amounts of boron. Prunes are an excellent source, providing 1.5 to 3.0 mg of boron per 3 oz serving. A study found that eating 3 ounces of prunes a day for a year improved bone mineral density in postmenopausal women, but dried apples did not. Boron does not accumulate in fish or land animals, but it does in algae, seaweed and plants. Many people following an animal-based diet like to eat avocados to boost their boron intake (8 oz provides ~2 mg of boron).
Brazil, Japan and most of the U.S. have low boron in the soil because of high rainfall. California, Chile, Russia, China, Argentina, Turkey and Peru, however, have higher boron levels. Excessive levels of boron in the soil also reduce crop yields; one estimate suggested that up to 17% of the barley loss in Southern Australia was caused by boron toxicity. Borate deposits form in the ground when boric acid reacts with steam, which is why areas with hot springs and geysers tend to have more boron.
Urine, blood and other bodily fluids primarily contain boric acid. The absorption of boron occurs in the intestine, not through the skin. Boron homeostasis is regulated primarily by the kidneys and urinary excretion, similar to sodium. The excretion of boron happens mostly through urine, feces, bile or sweat. Up to 92-94% of ingested boron gets excreted in the urine within 96 hours of consumption. If dietary intake of boron is high, excretion is also higher, and vice versa.
Environmental boron exposure is not a threat to human health. However, accidental consumption of borax found in household chemicals and pesticides (at doses of 18 to 9,713 mg) causes nausea, gastrointestinal distress, vomiting, convulsions, diarrhea, rashes and vascular collapse. Oral doses of 88 grams of boric acid were not shown to be toxic. However, taking over 84 mg/kg of boron has caused gastrointestinal, cardiovascular and renal adverse effects, dermatitis and death.
In excess amounts, boron can also contribute to the development of goiter by competing with iodine absorption. High concentrations of boron are used to eliminate bacterial and fungal infections by causing mitochondrial dysfunction.
Supplemental boron usually comes in the form of sodium borate, sodium tetraborate, boron amino acid chelate, boron gluconate, boron glycinate, boron picolinate, boron ascorbate, boron aspartate, boron citrate and calcium fructoborate. Taking 10 mg of boron as sodium tetraborate for 7 days can reduce sex hormone binding globulin (SHBG), resulting in higher testosterone, lower high sensitivity CRP (hsCRP) and TNF-alpha 6 hours after supplementation. Boron supplementation does not seem to affect lean body mass or strength in young male bodybuilders already engaged in resistance training. Calcium fructoborate has been shown to reduce inflammation in osteoarthritis and cardiovascular disease.
When taking boron supplements, it might be worthwhile to take them away from foods that contain B vitamins and/or to eat more B-vitamin-rich foods overall, like beef, fish, eggs, vegetables and beans. Boron has a high affinity for riboflavin, and ingesting high doses of boron may lead to side effects due to riboflavin deficiency (dry skin, photosensitivity and inflammation of the mouth and tongue).
Other Possibly Essential Trace Minerals
Lithium – There is a lot of evidence to show that lithium benefits mental conditions like bipolar disorder, schizophrenia and depression. Lithium can also benefit cluster headaches. These benefits are partially due to increased serotonin transmission but also from enhanced folate and vitamin B12 transport.
Vanadium is another trace metal with many biological functions. In animals, vanadium deficiency causes low thyroid, depressed fertility and lactation.
Nickel is circumstantially essential for growth, reproduction and glucose regulation in animals. It also influences iron metabolism, increasing its concentrations in the liver. Vitamin B12 and folic acid are also connected to nickel status. These vitamins are important for regulating homocysteine and thus affecting cardiovascular disease risk.
Silicon is a mineral important for connective tissue, blood vessels and bones. Animals deprived of silicon show abnormal cartilage and collagen. Reports since the 1970s suggest deficient silicon intake contributes to hypertension, atherosclerosis, bone disorders, arthritis and Alzheimer’s disease. Silicon might have anti-atheroma activity, reducing the accumulation of plaque in the inner wall of arteries.
Cobalt – It is a key component of cobalamin or vitamin B12, which is an essential vitamin. Vitamin B12 deficiency can cause hyper-homocysteinemia, which is associated with cardiovascular disease and Alzheimer’s. Ruminants convert cobalt into vitamin B12 in their stomachs.
In addition to the ones already mentioned, there are many other trace minerals that play some biological role in the body (good or bad), such as bromine, germanium, rubidium, tin and others. Their requirements, however, are so small that there is no need to specifically try to increase intake. The toxic heavy metals aluminum, cadmium, arsenic and lead should be avoided, as many of us are getting too much due to environmental contamination. In regard to cadmium toxicity, oysters and scallops are the most contaminated; thus, their intake should be kept to a minimum.
Sulfur, after calcium and phosphorus, is the 3rd most abundant mineral in the human body. Elemental sulfur is non-toxic and soluble sulfate salts come packaged in things like Epsom salts, which are typically used in baths for helping with muscle recovery and muscle soreness. Burning sulfur (power plants) creates sulfur dioxide (SO2), which can be harmful to the eyes and lungs in large concentrations. When SO2 reacts with atmospheric water and oxygen, it produces sulfuric acid (H2SO4) and sulfurous acid (H2SO3), which make up acid rain. Sulfuric acid is also a powerful dehydrant that can be used to dehydrate sugar or organic tissue. Sulfuric acid can also be produced in the body by eating sulfur-rich amino acids found in animal protein.
Historically, sulfur has been used to treat skin conditions, dandruff, improve wound healing and protect against acute radiation. Today, organic sulfur has been seen to improve rosacea and psoriasis. Sulfur springs and water have been known to have therapeutic effects for centuries. Studies show that spa therapy with sulfur water improves osteoarthritis and inhaling aerosolized sulfur containing water improves respiratory conditions. Sulfur-containing foods like broccoli may protect against inflammatory conditions like vascular complications from diabetes and heart disease.
Effects of Sulfur on the Body
Next to hydrogen, CO2 (carbon dioxide) and nitrogen, sulfur was probably one of the first nutrients organisms used as a building block of life and for energy production. These organisms can be divided into sulfur reducing or sulfur oxidizing. Sulfur-reducing anaerobic bacteria are believed to trace back to 3.5 billion years ago, making them one of the oldest living microorganisms on Earth. They obtain energy through a process called sulfate respiration, during which the oxidation of organic compounds or molecular hydrogen gets coupled to the reduction of sulfate.
One of the most important roles of sulfur is serving as a precursor in glutathione synthesis and promoting detoxification. The sulfur amino acids (SAA) methionine and cysteine promote protein synthesis as well as glutathione synthesis. In plants, sulfur partitioning between glutathione and protein synthesis determines growth. Eating a sulfur-rich diet in humans increases glutathione levels and reduces oxidative stress.
Defects in Fe-S cluster biogenesis cause numerous mitochondrial issues, anemia and problems with fatty acid and/or glucose metabolism. Mammalian target of rapamycin (mTOR), the body’s master growth pathway, regulates iron metabolism through iron-sulfur cluster assembly. Prolonged elevation of iron-sulfur cluster assembly enzyme (ISCU) protein levels enhanced by mTOR can inhibit the iron-responsive element and iron-regulatory protein binding activities that are involved in iron uptake. Thus, sulfur is important for keeping iron levels in check and preventing iron overload. Proteins and amino acids that contain sulfur (which also tend to contain the most leucine) are the most mTOR-stimulating food sources and promote growth.
Alzheimer’s disease patients have low levels of sulfur and selenium. These reductions are implicated in the lower glutathione levels seen in these patients and the increased oxidative stress and neuroinflammation, which promote neurodegeneration. Sulfur is also a potent aluminum antagonist that can reduce aluminum accumulation. Aluminum is a neurotoxin and considered a causative factor in Alzheimer’s disease. High amounts of aluminum in the brain tissue are associated with familial Alzheimer’s disease and other neurological disorders.
Connective tissue, the skin, ligaments and tendons require sulfur for proper cross-linking and for extracellular matrix proteins like glycosaminoglycans (chondroitin sulfate, dermatan sulfate, keratan sulfate and heparan sulfate). Extracellular matrix proteins are highly sulfated; they strengthen the bone structure and retain moisture. Sulfur is needed to make bile acids for fat digestion and is a constituent of bone, teeth and collagen. The hair and nails are made of a sulfur-rich protein called keratin. Sulfur is also a component of insulin and hence is needed for blood glucose regulation.
Vitamin D3 sulfate is a water-soluble form of vitamin D3; unlike unsulfated vitamin D3, it can travel the bloodstream freely without needing to be carried in LDL lipoproteins. Vitamin D3 sulfate has less than 5% of the ability of vitamin D to mobilize calcium from bones and about 1% of the ability to stimulate calcium transport, raise serum phosphorus or cause bone calcification. The form of vitamin D in human milk and raw cow’s milk is vitamin D3 sulfate, but pasteurization destroys it.
Sunlight exposure triggers eNOS-mediated sulfate production, increasing sulfate availability for heparan sulfate proteoglycan (HSPG) synthesis, which buffers against glycation damage and coagulation. Hyperinsulinemia reduces the sulfurylation of cholesterol into cholesterol sulfate by lowering calcitriol bioavailability, thus contributing to the thrombosis seen in COVID-19. Platelets will respond to cholesterol sulfate but not to other forms of cholesterol or steroid sulfates under the same conditions. Like vitamin D3 sulfate, cholesterol sulfate is water-soluble and doesn’t have to be packed into LDL particles to be delivered to tissues. Cholesterol sulfate has been shown to inhibit enzymes involved in blood clotting, such as thrombin and plasmin, which may prevent coagulation in the arteries. Additionally, sulfate promotes oxygen delivery to cells dependent on oxidative phosphorylation. Thus, a lack of sulfur in the diet may worsen COVID-19 outcomes by increasing protein glycation and coagulation and reducing tissue oxygenation.
Sulfur is needed for transporting free fatty acids from chylomicrons/VLDL/LDL into the capillary endothelium. Lipoprotein lipase (LPL), the enzyme that governs the uptake of free fatty acids from lipoproteins into cells, works with heparan sulfate proteoglycans (HSPGs). Abnormal or dysfunctional LPL expression is associated with various conditions like atherosclerosis, hypertriglyceridemia, obesity, diabetes, heart disease, stroke, Alzheimer’s and chronic lymphocytic leukemia. A low sulfur intake could decrease the delivery of free fatty acids into tissues that express LPL, such as the heart, skeletal muscle and brown adipose tissue, leading to an increase in free fatty acids in the blood and a reduced energy supply to these tissues.
Glucose can enter cells through specific cholesterol-rich sites in the cell membrane called lipid rafts, which also control GLUT4’s actions. GLUT4 is a transporter that sits on our cellular membranes to bring glucose into muscle cells. Cholesterol sulfate in the cell membrane reduces the risk of oxidation. Overconsuming refined glucose and fructose can promote the formation of advanced glycation end products (AGEs), which are associated with cardiovascular disease, diabetes and aging.
Here are the sulfur-dependent enzymes, functions and consequences that may occur with deficient sulfur intake:
Sulfur-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Sulfur-Containing Compounds and Organosulfur Compounds
Sulfur exists in all cells and extracellular compartments as part of the amino acids cysteine and methionine. Covalent bonds of sulfhydryl groups between molecules form disulfide bridges that are required for the structure of proteins that govern the function of enzymes, insulin and other proteins.
Sulfur is also a component of chondroitin in bones and cartilage, thiamine (vitamin B1), biotin (vitamin B7), pantothenic acid (vitamin B5), S-adenosylmethionine (SAM-e), methionine, cysteine, homocysteine, cystathionine (an intermediate in the formation of cysteine), coenzyme A, glutathione, chondroitin sulfate, glucosamine sulfate, fibrinogen, heparin, metallothionein and inorganic sulfate.
With the exception of thiamine and biotin, all of these substances are synthesized using methionine or cysteine. The vast majority of the body’s sulfur demands are met through methionine and cysteine consumption. Glutathione, taurine, N-acetylcysteine, MSM (methylsulfonylmethane) and inorganic sulfate can spare the dietary need for methionine and cysteine.
Here is an overview of the effects of sulfur containing compounds:
Eating half an onion or more per day may lower the incidence of stomach cancer. In Northeast China, consumption of fresh cabbage and onions is inversely associated with brain cancer risk. More than one serving of garlic per week is inversely associated with the risk of colon cancer in postmenopausal women. Onion and garlic also possess antidiabetic, antibiotic, cholesterol-lowering and fibrinolytic effects. Garlic can lower mild hypertension and blood lipids, potentially reducing the risk of cardiovascular disease. Allium vegetables like garlic, shallots and onions can raise glutathione levels via their organosulfur compounds.
Sulforaphane (SFN) is another sulfur-containing compound with a long record of health benefits. Research has shown sulforaphane to be beneficial for managing Type 2 diabetes, blood pressure, Alzheimer’s disease, liver detoxification and function, LDL cholesterol, the immune system, bacterial and fungal infections, inflammation and brain-derived neurotrophic factor (BDNF), and it may reduce cancer risk. Many of these effects are mediated by increased glutathione production, as SFN promotes glutathione synthesis through activation of the Nrf2/ARE antioxidant defense system. Sulforaphane is produced when we eat cruciferous vegetables like broccoli, cauliflower, cabbage and Brussels sprouts. Eating cruciferous vegetables has been shown to raise glutathione S-transferase activity (a family of phase II detoxification enzymes), which conjugates glutathione to a wide variety of toxic compounds for their elimination.
A small amount of sulfur can be obtained from the organosulfur compounds found in foods like broccoli, garlic and onions. However, the actual content of sulfur in a food depends on the soil quality.
Whole foods appear to be a more bioavailable source of sulfur than supplements. A study found that sulfur-methyl-L-methionine (SMM) from kimchi was more bioavailable than SMM by itself and it led to better digestion. Fresh vegetables and plants also contain more organosulfur compounds than processed foods. Taurine, which spares cysteine and hence the utilization of sulfur in the body for other functions, is found in animal foods but not in plants.
Benefits of Hydrogen Sulfide (H2S)
Overproduction of H2S is implicated in cancer, whereas deficiency can cause vascular disease. Hydrogen sulfide is involved in multiple cell signaling processes, such as the regulation of reactive oxygen species, reactive nitrogen species, glutathione and nitric oxide:
Hydrogen sulfide has cardioprotective effects that prevent atherosclerosis and hypertension. Hydrogen sulfide has vasorelaxant effects and acts as an endogenous potassium-ATP channel opener. H2S stimulates endothelial cells to facilitate smooth muscle relaxation. Disrupted H2S production has been suggested to promote endothelial dysfunction.
H2S may have neuroprotective effects by increasing glutathione and suppressing oxidative stress. There is a high level of hydrogen sulfide in the brain and it may act as an endogenous neurotransmitter. H2S might be able to scavenge peroxynitrite and prevent oxidative stress, especially in the brain where there is low extracellular glutathione. In mice and rats, H2S is released from bound sulfur in neurons and astrocytes. Alzheimer’s patients have lower hydrogen sulfide levels in the brain.
Hydrogen sulfide is involved in cell-death and inflammatory signaling as an endogenous mediator. Elevation of H2S accompanies inflammation, infection and septic shock. Inhibiting H2S production contributes to the gastric injury caused by nonsteroidal anti-inflammatory drugs (NSAIDs).
Hydrogen sulfide has a beneficial effect on longevity and lifespan. Nematodes that live in an environment of 50 ppm H2S exhibit higher thermotolerance and a 70% longer lifespan. They also have higher antioxidant and stress response gene activity. In bacteria, endogenous hydrogen sulfide and nitric oxide protect against antibiotic-induced oxidative stress and death. In mice, H2S exposure inhibits cytochrome c oxidase and lowers metabolic rate and body temperature, putting them in a suspended hibernation-like state. Calorie restriction is one of the few known ways of reliably extending lifespan in virtually all species, and endogenous production of hydrogen sulfide appears to be essential for its benefits by mediating the enhanced stress resistance. However, humans who are chronically exposed to exogenous sources of hydrogen sulfide from working in an industrial setting show signs of accelerated aging.
Hydrogen sulfide is synthesized by three enzymes: cystathionine γ-lyase (CSE), cystathionine β-synthase (CBS) and 3-mercaptopyruvate sulfurtransferase (3-MST). CSE and CBS operate the trans-sulfuration pathway, during which a sulfur atom gets transferred from methionine to serine, forming a cysteine molecule. Thus, dietary amino acids like methionine and cysteine are the main substrates in the production of hydrogen sulfide. They are also needed for glutathione and taurine synthesis.
After consumption, methionine reacts with adenosine triphosphate (ATP) with the help of methionine adenosyltransferase to produce S-adenosylmethionine (AdoMet), the universal methyl donor for transmethylation. Methyl transfer from AdoMet creates S-adenosylhomocysteine (AdoHcy), which gets rapidly hydrolyzed into adenosine and homocysteine by AdoHcy hydrolase. Homocysteine can be either methylated back into methionine or used to synthesize cysteine and hydrogen sulfide. About 50% of the methionine taken up by hepatocytes gets regenerated; the remaining 50% is metabolized by the trans-sulfuration pathway.
The trans-sulfuration pathway is present in many tissues, such as the kidney, liver and pancreas. It does not appear to take place in the spleen, testes, heart and skeletal muscle. Inhibition of the trans-sulfuration pathway results in a 50% decrease in glutathione levels in cultured cells and tissues. Oxidative stress can activate the trans-sulfuration pathway and antioxidants downregulate it. The synthesis of glutathione from cysteine depends on ATP and thus magnesium.
Sulfates and Sulfites
Sulfites are sulfur-derived food preservatives added to packaged foods, hair colors, topical medications, eye drops, bleach and corticosteroids to prevent spoilage. They can also develop naturally during fermentation, such as in sauerkraut, kimchi, beer or wine. Thus, alcohol can be a significant source of sulfites, which in excess can promote inflammation and gastrointestinal problems. About 1% of people, and 3-10% of asthmatics, are sulfite sensitive; they experience swelling, rashes, nausea and possibly seizures or anaphylactic shock when consuming large amounts of sulfites. Some people are allergic to sulfur and sulfur-containing foods like garlic or eggs, causing them to experience swelling, rashes and low blood pressure.
Sulfate is a source of sulfur for amino acid synthesis. Sulfate-reducing bacteria in the gut can promote ulcerative colitis through hydrogen sulfide: patients with ulcerative colitis have elevated levels of fecal sulfide and sulfate-reducing bacteria, and sodium sulfate supplementation has been shown to stimulate the growth of these bacteria. Excess sulfide can overburden the detoxification systems, impairing butyrate oxidation and causing colonic epithelial inflammation.
Gastrointestinal absorption of sulfate happens in the stomach, small intestine and colon; the absorption process is sodium-dependent. Heat and cooking reduce the digestibility of cysteine because heating oxidizes cysteine into cystine, which is absorbed more poorly. More than 80% of oral sulfate taken as soluble sulfate salts like sodium sulfate or potassium sulfate gets absorbed, whereas insoluble sulfate salts like barium sulfate barely get absorbed. The sulfate that does not get absorbed in the upper gastrointestinal tract passes to the large intestine and colon, where it is either excreted in the feces, reabsorbed or reduced to hydrogen sulfide.
Getting Enough Sulfur-Containing Amino-Acids from the Diet
About 89% of sulfur amino acids are obtained as cystine, the oxidized disulfide of cysteine. Ingesting ample amounts of cysteine has a sparing effect on methionine: the safe minimum methionine requirement in the presence of excess dietary cysteine has been found to be 5.8-7.3 mg/kg/d, compared to 12.9-17.2 mg/kg/d in the absence of dietary cysteine (a 55-58% sparing effect).
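The quoted 55-58% sparing effect follows directly from those requirement figures; a quick check:

```python
# Methionine-sparing effect of dietary cysteine, computed from the
# requirement figures quoted above (mg/kg/day).
req_without_cys = (12.9, 17.2)  # no dietary cysteine
req_with_cys = (5.8, 7.3)       # excess dietary cysteine

for without, with_cys in zip(req_without_cys, req_with_cys):
    sparing = 1 - with_cys / without
    print(f"{sparing:.0%}")  # -> 55%, then 58%
```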
Sulfur amino acids are not stored in the body for the long term; they either get excreted through urine, oxidized into sulfate or stored as glutathione. Sulfur is excreted primarily as sulfate, with urinary sulfate reflecting the overall intake of sulfur from inorganic sources or amino acids. The 24-hour urinary excretion of urea, the final end-product of protein metabolism, tracks the 24-hour urinary excretion of sulfate because methionine and cysteine are the primary sources of the body’s sulfate stores. Urinary sulfate excretion is inversely associated with all-cause mortality in the general population, with higher excretion rates linked to lower mortality.
Many drugs used in the treatment of joint diseases and pain, like acetaminophen, require a lot of sulfate for their excretion. Up to 35% of acetaminophen is excreted conjugated with sulfate and 3% conjugated with cysteine; the rest is conjugated with glucuronic acid, a major component of the glycosaminoglycans essential for the integrity of cartilage. Thus, acetaminophen depletes the body of sulfur. Added methionine or cysteine can overcome the methionine deficiency induced by acetaminophen.
Current research suggests that the maximal effect of protein on muscle growth occurs at around 0.8-1.0 grams of protein per pound of total body weight, which would be roughly 1.0-1.2 grams per pound of lean body mass.
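A minimal sketch of that conversion between the two bases, assuming an illustrative 20% body fat and 180 lb body weight (both are assumptions for the example, not figures from the text):

```python
# Converting a per-pound-of-lean-mass protein target into grams/day.
# The 20% body fat and 180 lb body weight are illustrative assumptions.
def protein_target_g(total_lb: float, body_fat_pct: float,
                     g_per_lb_lbm: float = 1.0) -> float:
    """Daily protein (g) from a per-pound-of-lean-body-mass target."""
    lean_lb = total_lb * (1 - body_fat_pct / 100)
    return lean_lb * g_per_lb_lbm

grams = protein_target_g(total_lb=180, body_fat_pct=20, g_per_lb_lbm=1.0)
print(grams)        # -> 144.0 g/day
print(grams / 180)  # -> 0.8 g per pound of total body weight
```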
Glutathione status is compromised in various disease states, and an increased supply of sulfur amino acids can reverse many of these changes. A restriction in dietary sulfur amino acids slows the rate of glutathione synthesis and diminishes its turnover. Fruit and vegetables contribute up to 50% of dietary glutathione, whereas meat contributes about 25% in an average diet. Freshly prepared meat is a direct food source high in glutathione; fruit and vegetables are moderate sources; dairy and cereals are low in glutathione. Freezing foods maintains roughly the same amount of glutathione as fresh food, but other preservation methods, such as fermentation or canning, lead to extensive losses. Whey protein is also a good source of cysteine, and hence glutathione, and has been shown to lower oxidative stress. Milk thistle can raise glutathione levels and has antioxidant properties. Turmeric and curcumin can also promote glutathione synthesis and inhibit inflammation.
Excess methionine intake may lead to harmful side effects, such as brain lesions and retinal degeneration, especially if not balanced with glycine intake. High dietary methionine intake (5-6 g/d compared to the 1-1.2 g RDA) can also raise homocysteine levels, even on top of adequate B-vitamin intake.
Methionine restriction (MR) is thought to increase lifespan and slow aging. Additionally, these effects may occur without caloric restriction, which so far is the best-established way of lengthening life in many species. Restricting methionine in baker’s yeast extends lifespan, accompanied by an increase in autophagy; when autophagy genes were deleted, the lifespan extension was also prevented. It’s thought that restricting protein alone can produce the same life-extending effect as caloric restriction, without needing to restrict calories.
Methionine and cysteine are the most widespread sulfur amino acids that support glutathione and protein synthesis. A low intake of them can promote muscle wasting and inflammation.
In conclusion, sulfur is a central mineral for the body’s detoxification and antioxidant defense systems. It is needed for preventing oxidative stress-related conditions like atherosclerosis, cardiovascular disease, neurodegeneration, arthritis and aging in general.
Type 1 diabetes is an autoimmune disorder with a genetic basis wherein the pancreas produces almost no insulin. Without treatment, blood sugar can stay continuously elevated in Type 1 diabetics. Type 2 diabetes is where the body becomes resistant to the effects of insulin. In other words, the body can make insulin, but it doesn’t respond to it as well.
Diabetes increases the risk of cardiovascular disease and stroke 1.8- to 6-fold. Nearly 50% of diabetics die of cardiovascular disease. Diabetes is also the leading cause of kidney disease and kidney failure.
Diabetic retinopathy can cause blindness and impaired vision, whereas diabetic neuropathy in the limbs can lead to a lack of feeling in the extremities resulting in untreated wounds, which can lead to amputation. Furthermore, diabetes is associated with impaired cognition and neurodegenerative diseases like Alzheimer’s disease.
The first sign of Type 2 diabetes is hyperinsulinemia. However, this can only be picked up by measuring insulin levels after an oral glucose tolerance test, which most doctors do not order. Thus, by the time someone is diagnosed with impaired glucose tolerance and/or impaired fasting glucose, they have already lost ~50% of the beta-cells needed to produce insulin. Both Type 1 and Type 2 diabetes are diagnosed as a fasting blood glucose ≥ 7.0 mmol/L (126 mg/dL) or a plasma glucose ≥ 11.1 mmol/L (200 mg/dL) two hours after a glucose challenge. A glycated hemoglobin (HbA1c) of ≥ 48 mmol/mol (6.5%) is another diagnostic criterion. Symptoms of diabetes include dry mouth, increased thirst, fatigue, hair loss, blurred vision, peripheral neuropathy and frequent urination.
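The mg/dL figures in those cutoffs are simply unit conversions of the mmol/L values (glucose’s molar mass is ~180 g/mol, so mg/dL ≈ mmol/L × 18); a quick check:

```python
# Glucose unit conversion behind the diagnostic cutoffs above.
def mmol_to_mgdl(mmol_per_l: float) -> float:
    """Convert blood glucose from mmol/L to mg/dL (factor ~18)."""
    return mmol_per_l * 18.0

print(mmol_to_mgdl(7.0))   # fasting cutoff      -> 126.0 mg/dL
print(mmol_to_mgdl(11.1))  # 2-hour OGTT cutoff  -> 199.8 ~ 200 mg/dL
```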
Magnesium deficiency has been implicated in impaired pancreatic beta-cell function, reduced DNA repair capacity, insulin resistance, cardiovascular disease, Type 2 diabetes, osteoporosis, hyperglycemia and hyperinsulinemia. Copper, zinc, potassium and sodium are also needed for proper glucose metabolism.
Here are the main causes of Type 2 diabetes and/or impaired glucose tolerance:
Chromium Benefits
Chromium 3+, or trivalent chromium, was found to be the main active component of the ‘glucose tolerance factor’ (GTF) that alleviated glucose intolerance caused by high sucrose consumption in rats. In diabetic patients, chromium-enriched yeast can decrease fasting blood glucose and insulin. Chromium is required for the binding of insulin to the cell surface so it can exert its effects. However, chromium appears to require synergy with nicotinic acid, as part of the glucose tolerance factor, to lower blood sugar; GTF enhances the activity of insulin.
The most recent adequate intake (AI) for chromium in the U.S. has been set at 25-35 mcg/d for adults. To absorb roughly 1 mcg/d, given chromium’s low absorption rate of about 1-3%, you would have to ingest about 33-100 mcg/d. In 1980, the Estimated Safe and Adequate Daily Dietary Intake (ESADDI) for chromium was set at 50-200 mcg/d for adults. According to the World Health Organization, an approximate chromium intake of 33 mcg/d is likely to meet normal requirements. In Australia and New Zealand, the AI for chromium is 35 mcg/d for men, 25 mcg/d for women, 30 mcg/d for pregnant women and 45 mcg/d for lactating women.
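The 33-100 mcg/d range follows from that absorption arithmetic; a back-of-envelope check (the 1-3% absorption range is the assumption stated above):

```python
# Oral chromium needed to absorb ~1 mcg/day at an assumed 1-3% absorption.
def intake_needed_mcg(absorbed_target_mcg: float, absorption: float) -> float:
    """Required oral intake (mcg/d) for a given absorbed target and
    fractional absorption rate."""
    return absorbed_target_mcg / absorption

print(round(intake_needed_mcg(1.0, 0.03)))  # -> 33 mcg/d at 3% absorption
print(round(intake_needed_mcg(1.0, 0.01)))  # -> 100 mcg/d at 1% absorption
```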
Chromium supplementation does not appear to improve insulin sensitivity in healthy non-obese people.
The current understanding is that chromium binds to an oligopeptide called low-molecular-weight chromium-binding substance (LMWCr), or chromodulin, which activates the insulin receptor to mediate the actions of insulin. Chromodulin stimulates the insulin-dependent tyrosine kinase activity of the insulin receptor, functioning as an amplifier of insulin signaling, and carries chromium around the body as a second messenger for insulin signaling.
Because of its effects on insulin and glucose metabolism, chromium is thought to help with polycystic ovary syndrome (PCOS), which is characterized by insulin resistance and dyslipidemia. Chromium supplementation of 200-1,000 mcg/d for 8-24 weeks has not been found to have a significant effect on fasting glucose, but it did lower fasting insulin and bodyweight. Among diabetic PCOS patients, chromium supplementation improved HOMA-IR scores (a marker of insulin resistance). However, due to mixed results, more studies are needed.
By increasing insulin sensitivity, chromium supplementation has also been promoted as a way to increase muscle mass and athletic performance. With higher insulin action, nutrients would be stored and utilized faster, which is why bodybuilders use injectable insulin. The potential benefits of chromium on muscle anabolism are thought to be marginal; however, many athletes may be deficient in this nutrient. If not deficient, the effects appear negligible.
Athletes who train at an intense level and sweat for around 60 minutes per training session should consider supplementing with an extra 600 mcg of chromium picolinate (which has a 1.2% bioavailability, providing 7.2 mcg of absorbed chromium) or another chromium supplement with a similar bioavailability (such as brewer’s yeast chromium) on exercise days.
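The 7.2 mcg figure is simply the dose multiplied by the quoted bioavailability:

```python
# Absorbed chromium from 600 mcg of chromium picolinate at the quoted
# 1.2% bioavailability.
dose_mcg = 600
bioavailability = 0.012
print(round(dose_mcg * bioavailability, 1))  # -> 7.2 mcg absorbed
```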
A prospective study involving 3,648 subjects found an inverse association between the occurrence of metabolic syndrome and toenail chromium concentrations. Low toenail chromium concentration has also been associated with an increased risk of a heart attack. In Korean males, insulin resistance is associated with lower chromium hair concentrations and higher Ca/Mg ratio. Type 1 and Type 2 diabetes are correlated with blood chromium deficiency.
Here are the chromium-dependent enzymes, functions and consequences that may occur with deficient chromium intake
Chromium-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Chromium Foods
Chromium is found in many foods, starting with meat and ending with vegetables. Many medicinal plants like sand immortelle, foxglove, Alexandrian laurel, Greek valerian, marsh cudweed, adenostilis and lobelia have high amounts of chromium. However, not all chromium in foods has glucose tolerance factor (GTF) and thus total chromium content may not be a valid indicator of the insulin sensitizing benefits of a given food.
Foods with the highest amount of chromium as GTF are brewer’s yeast, black pepper, liver, cheese, bread and beef, whereas the lowest ones are skim milk, chicken breast, flour and haddock. Supplementation with 9 grams of Brewer’s yeast per day for 8 weeks can improve impaired glucose tolerance, reduce serum cholesterol and lower the insulin response to a glucose load in humans. Polyphenolic herbs and spices like cinnamon have also been shown to improve insulin sensitivity together with chromium.
The richest sources of chromium are mussels and oysters, probably because of environmental contact. Fresh water fish living in areas of stainless-steel manufacturing, chrome plating or rubber processing have nearly 5 times more chromium than normal. They also get more of the toxic hexavalent chromium, which causes mutagenic gene damage.
Refining grains and wheat reduces the amount of chromium by up to 8-fold. Peeling apples and eating them without the skin reduces their chromium content by 70% (from 1.4 mcg to 0.4 mcg per apple).
Human milk also contains small amounts of chromium, ranging from 0.25-60 mcg/L depending on the nutrient and lactation status of the mother. Infants of less than 6 months of age need less than 0.5 mcg/d of chromium. Finnish infants who are exclusively breast-fed and get 0.27 mcg/d of chromium do not show any abnormalities. Exposure to excess environmental chromium during pregnancy increases the risk of delivering low-birth-weight babies. Chromium levels in the body are higher at birth than later in life, which might explain why infants need much less chromium.
In deficiency, the body tends to shut down urinary/fecal losses and increase mineral absorption. Thus, someone may appear to be in positive mineral balance (absorbing more than they lose in urine and feces) when this actually indicates mineral deficiency. This is why you typically need to give an oral or IV loading dose of a mineral and measure how much gets excreted in the urine (at least for minerals that are regulated by the kidney) to determine deficiency: if the body excretes little of a high loading dose, this suggests deficiency, as the body is trying to hold onto the mineral.
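A sketch of that loading-test logic follows. The retention formula comes straight from the description above, but the specific dose, excretion value and 70% cutoff are illustrative assumptions, not clinical criteria:

```python
# Loading-test logic: high retention (low urinary recovery) of a loading
# dose suggests the body is holding onto the mineral, i.e. deficiency.
# Dose, excretion and the 70% cutoff below are illustrative assumptions.
def retention_fraction(dose_mg: float, urinary_excretion_mg: float) -> float:
    """Fraction of the loading dose retained by the body."""
    return (dose_mg - urinary_excretion_mg) / dose_mg

retained = retention_fraction(dose_mg=30.0, urinary_excretion_mg=6.0)
print(f"Retained: {retained:.0%}")  # -> 80%
print("Suggests deficiency" if retained > 0.70 else "Likely replete")
```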
Chromium can also be obtained from the use of stainless-steel cookware. Cooking acidic foods like tomato sauce for several hours promotes the leaching of nickel and chromium from the cookware into the food. Preparing fresh meat in a food processor equipped with stainless-steel blades nearly doubles its chromium content, and does so 10-fold for liver after 3 minutes of blending. Canned and processed foods may be higher in chromium than fresh foodstuffs for this reason. The exception is refined white sugar, which is incredibly low in chromium, whereas brown sugar and molasses are high in chromium. Consuming refined sugar in candies, pastries or other processed foods will thus contribute to the excretion of chromium, especially if it is not being replaced.
Vitamin C (ascorbic acid) and prostaglandin inhibitors, such as aspirin, increase chromium absorption, whereas antacids and oxalates decrease it. Taking 100 mg of vitamin C together with 1,000 mcg of chromium raises plasma chromium levels more than taking 1,000 mcg of chromium alone. High sugar and fructose intake increases chromium excretion, and simple sugars cause more chromium excretion than complex carbohydrates. Thus, hyperglycemia and insulin resistance will also promote chromium elevation in the blood and its excretion; diabetics show a 3-fold increase in urinary chromium excretion. Physical exercise, sweating and injury also promote chromium losses, as do infections such as viral illnesses.
Taking chromium together with insulin can trigger hypoglycemia because of the increased insulin sensitivity. The same applies to metformin or other blood sugar lowering medication.
It is recommended not to take levothyroxine or other thyroid medications with food or together with a chromium supplement. Instead, it is best to take levothyroxine before eating and chromium at least 2 hours after levothyroxine. Chromium supplementation has been shown to reverse corticosteroid-induced diabetes.
In conclusion, chromium is considered an essential nutrient, especially because of its effects on insulin, glucose and lipid metabolism. However, healthy individuals do not need to deliberately eat a high-chromium diet to prevent metabolic syndrome, as there are other more important variables, such as overall energy intake and body composition. Chromium has a more beneficial effect in people who have diabetes, insulin resistance or prediabetes; for them, chromium picolinate supplementation of 200-1,000 mcg/d may improve glycemic control and fasting insulin levels. An adequate consumption of 25-35 mcg/d from whole foods would probably be enough to prevent prediabetes in already healthy subjects, but optimally most people would benefit from 33-100 mcg/d. To obtain that amount, you can get the most chromium from animal foods, especially mussels, oysters and lobster, but nutritional yeast and potatoes also have quite a lot. The demand for chromium increases with diabetic comorbidities, exercise and sweating.
Manganese has its own superoxide dismutase – manganese superoxide dismutase (MnSOD) – which functions to protect our mitochondria and endothelium from oxidative stress.
Molybdenum is also an interesting mineral that plays a part in a lot of enzymatic reactions in the body, involving detoxification of sulfites, purines and alcohol.
Manganese, MnSOD and the Body’s Antioxidant Defense
Manganese (Mn) is an essential mineral involved in the function of many enzymes, the immune system, blood clotting, bone formation and metabolism. In nature, it is found in combination with iron. Deficiencies in manganese impair growth, lead to skeletal defects, reduce fertility and alter glucose tolerance. Manganese metalloenzymes include arginase, glutamine synthetase, phosphoenolpyruvate decarboxylase and MnSOD, all of which have important roles in antioxidant defense and metabolic reactions. Manganese superoxide dismutase deficiency triggers mitochondrial uncoupling and the Warburg effect, which may promote cancer and metastasis.
Manganese deficiency reduces bone mineral density, whereas supplementation improves bone health. Women with osteoporosis have been observed to have lower serum manganese levels than those without osteoporosis.
One of the primary functions of manganese is that it makes up Mn superoxide dismutase (MnSOD), which scavenges free radicals and reactive oxygen species (ROS) in the mitochondria, the endothelium and even atherosclerotic plaque.
Atherosclerosis is a disease of chronic inflammation that damages the blood vessel wall and oxidizes LDL cholesterol. Superoxide dismutase protects against the oxidation of lipids. Manganese protects against heart mitochondria lipid peroxidation in rats through MnSOD. MnSOD can reduce oxLDL-induced cell death of macrophages, inhibit LDL oxidation and protect against endothelial dysfunction. There is an association between decreased MnSOD and atherogenesis.
Insulin resistance can be thought of as a cellular antioxidant defense mechanism against oxidative stress in the absence of other antioxidant defense systems, such as manganese superoxide dismutase or glutathione. In other words, a lack of manganese may be a direct cause of insulin resistance.
Manganese is required for creating oxaloacetate from pyruvate via pyruvate carboxylase. In mammals, oxaloacetate is needed for gluconeogenesis, the urea cycle, neurotransmitter synthesis, lipogenesis and insulin secretion.
Oxaloacetate is also used to synthesize aspartate, which then gets converted into other amino acids: asparagine, methionine, lysine and threonine. During gluconeogenesis, pyruvate is converted first into oxaloacetate by pyruvate carboxylase; the oxaloacetate is then simultaneously decarboxylated and phosphorylated into phosphoenolpyruvate (PEP). This is the rate-limiting step in gluconeogenesis. Pyruvate carboxylase is expressed the most in gluconeogenic tissues, such as the liver, kidneys, pancreatic islets and lactating mammary glands.
Arginase is another manganese-containing enzyme and the final enzyme in the urea cycle, through which the body gets rid of ammonia. Arginase converts L-arginine into L-ornithine and urea, and manganese ions are required to stabilize the water molecule that hydrolyzes L-arginine. Arginase also interacts with nitric oxide synthase. Disorders in arginine metabolism are implicated in neurological impairment, dementia and hyperammonemia. Excess ammonia is not acutely harmful to humans, as it will either be converted into amino acids or excreted through urine; however, high ammonia is associated with upper gastrointestinal bleeding in cirrhosis and with kidney damage. Urea cycle disorders cause delirium, lethargy and even strokes in adults.
Manganese is a cofactor for glutamine synthetase (GS), an enzyme that forms glutamine from ammonia and glutamate. Glutamine is used by activated immune cells: it supports lymphocyte proliferation and helps lymphocytes and macrophages produce cytokines. Additionally, glutamine may help people with food hypersensitivities by reducing inflammation at the gut surface; it can thus protect against and repair leaky gut, improving immunity. Getting enough glutamine from the diet or a supplement may help protect intestinal epithelial tight junctions, which prevents intestinal permeability. Glutamine synthetase is found predominantly in the brain, astrocytes, kidneys and liver. In the brain, glutamine synthetase regulates the detoxification of ammonia and of glutamate, the excitatory neurotransmitter. Manganese gets into the brain with the help of the iron-carrying protein transferrin, but manganese doesn’t seem to compete with iron absorption.
Acute manganese toxicity causes manganism, characterized by neurological symptoms, mood swings, compulsive behavior and decreased response speed. Manganese overexposure can also impair cardiovascular function and heartbeat. Chronic exposure leads to a more permanent dysfunction that resembles Parkinson’s disease; unlike Parkinson’s, however, manganism does not appear to cause a loss of the sense of smell, and patients do not respond to L-DOPA treatment. The neurotoxic effects appear to be caused by disturbed iron and aluminum metabolism and by iron overload that causes oxidative stress, one of the main drivers of manganese-induced neurotoxicity. It is important to note that a lack of copper in the body leads to iron overload, which can lead to manganese overload in tissues. Thus, a lack of one mineral, such as copper, can lead to the harmful accumulation and dysregulation of other minerals in the body.
Manganese accumulation in the mitochondria can also cause mitochondrial dysfunction by inhibiting complexes I and II. After chronic manganese exposure, the highest accumulation is observed in the mitochondria of astrocytes and neurons compared to other organelles. Excess manganese gets exported from the mitochondrial matrix through sodium-independent mechanisms but is imported mainly by the calcium uniporter. Manganese inhibits calcium efflux, which increases the probability of the mitochondrial permeability transition associated with brain injury and stroke.
Manganese poisoning can occur from drinking contaminated water or from exposure to the fuel additive methylcyclopentadienyl manganese tricarbonyl (MMT) and the pesticide manganese ethylene-bisdithiocarbamate (Maneb). Permanganate (Mn7+) is much more toxic than Mn2+, with potassium permanganate having a lethal dose of 10 grams. Mining and processing manganese causes air and water pollution, which threatens the health of workers and local residents, especially in South Africa, China and Australia, where most of the world’s manganese is mined.
The olfactory tract is the most direct pathway for manganese to get into the brain. By using two zinc transporters ZIP8 and ZIP14, it can bypass the blood-brain barrier. Infants have an immature blood-brain barrier, making them more vulnerable to manganese toxicity. MRI studies show that manganese accumulates predominantly in the globus pallidus located in the basal ganglia. Dopamine oxidation by manganese causes oxidative stress. Chronic manganese exposure has been seen to reduce choline levels in the hypothalamus and thalamus.
Here are the manganese-dependent enzymes, functions and consequences that may occur with deficient manganese intake:
Manganese-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Manganese Food Sources
There is no established RDA for manganese. The estimated safe and adequate daily intake for manganese has been set at 2-5 mg/d. The adequate intake (AI) is 2.3 mg/d for adult males and 1.8 mg/d for adult females. Men on experimentally manganese-deficient diets (< 0.74 mg/d) have been shown to develop erythematous rashes. Women consuming < 1 mg of manganese per day experience altered mood and increased pain during the premenstrual phase of their cycle. During pregnancy or lactation, the demand for manganese increases to 2.0-2.6 mg/d. Children 1-3 years old should get 1.2 mg/d and children 4-8 years old 1.5 mg/d; at 9-13 years, boys should get 1.9 mg/d and girls 1.6 mg/d. Infants 6 months old or younger should get 0.003 mg/d, and 7-12-month-olds should get 0.6 mg/d. The tolerable upper intake levels for manganese have been set at 9-11 mg/d for adults and 2-9 mg/d for children 12 months of age or older.
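As a quick way to sanity-check an intake against these brackets, here is a minimal Python sketch encoding the adequate intakes above; the dictionary layout, age-bracket labels and helper name are illustrative, not an official reference.

# Minimal sketch: encode the manganese adequate intakes (AI) listed above
# and check a hypothetical daily intake against them. Values in mg/day.
MANGANESE_AI_MG = {
    ("infant", "0-6mo"): 0.003,
    ("infant", "7-12mo"): 0.6,
    ("child", "1-3y"): 1.2,
    ("child", "4-8y"): 1.5,
    ("boy", "9-13y"): 1.9,
    ("girl", "9-13y"): 1.6,
    ("male", "adult"): 2.3,
    ("female", "adult"): 1.8,
}

ADULT_UL_MG = (9, 11)  # adult tolerable upper intake range from the text

def check_intake(group, bracket, intake_mg):
    """Compare a daily manganese intake (mg) against the AI for a group."""
    ai = MANGANESE_AI_MG[(group, bracket)]
    status = "meets" if intake_mg >= ai else "is below"
    return f"{intake_mg} mg/d {status} the {ai} mg/d AI for {group} {bracket}"

print(check_intake("female", "adult", 1.0))   # below the 1.8 mg/d AI
print(check_intake("male", "adult", 2.5))     # meets the 2.3 mg/d AI
lo, hi = ADULT_UL_MG
print(f"Adult tolerable upper intake: {lo}-{hi} mg/d")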
Manganese is found in many foods, including grains, clams, oysters, legumes, vegetables, tea and spices. Among the most commonly consumed sources of dietary manganese are rice, nuts and tea. Dietary (non-experimental) manganese deficiency is quite rare, and toxicity occurs mostly due to environmental exposure. Some supplements are also fortified with manganese, in amounts ranging from 5 to 20 mg. The amount of manganese in human breast milk ranges from 3-10 mcg/L, while cow’s milk-based infant formulas contain 30-100 mcg/L. Soy-based formulas have the highest concentration of manganese (200-300 mcg/L), which might lead to manganese accumulation. The absorption of manganese is highest from human milk (8.2%), compared to soy formula (0.7%) and cow’s milk formula (3.1%).
Manganese absorption and influx are mostly mediated by the divalent metal transporter 1 (DMT1) and the transferrin receptor (TfR) on the cell surface, which also transport other divalent cations, such as iron and calcium. Iron deficiency can increase the risk of manganese poisoning because manganese absorption increases under low-iron states.
Liver damage impairs manganese excretion, causing high levels of manganese in the blood. Patients with portosystemic shunts (also known as liver shunts) and biliary atresia (narrowed or blocked bile ducts) exhibit hypermanganesemia, even without increasing dietary manganese intake.
Lymphocyte manganese is thought to be better than whole blood manganese for measuring manganese deficiency.
Most blood manganese gets distributed into soft tissue (~60%), with the remainder in the liver (30%), kidney (5%), pancreas (5%), colon (1%), bone (0.5%), urinary system (0.2%), brain (0.1%) and erythrocytes (0.02%). Divalent Mn 2+ is the predominant form of blood manganese and is complexed with different molecules, such as albumin (84% of total Mn 2+), hexahydrated ion (6%), bicarbonate (6%), citrate (2%) and transferrin (Tf) (1%), whereas almost all trivalent Mn 3+ is bound to transferrin. Mn 3+ is more reactive; it gets reduced to Mn 2+, and Mn 2+ can be oxidized to Mn 3+ by ceruloplasmin, the active form of copper. Mn 3+, typically encountered as manganese (III) fluoride, manganese (III) oxide or manganese (III) acetate, is more effective at inhibiting mitochondrial complex I. The reduction of Mn 3+ to Mn 2+ is mediated by ferrireductase, which helps to avoid oxidative stress.
Tissues with a high energy demand, like the brain and liver, as well as high-pigment regions, such as the retina and skin, have the highest concentrations of manganese. Approximately 40% of total body manganese is stored in the bones (1 mg/kg). The hand bone manganese levels of welders exposed to high-manganese environments have been found to be significantly higher than those of non-occupationally exposed individuals.
Molybdenum for Fixing Nitrogen and Detoxifying Toxins
In the molybdenum cofactor (Moco), molybdenum is complexed to a pterin, forming molybdopterin, which serves as a cofactor for many important enzymes in the body, including sulfite oxidase, xanthine oxidase, dimethyl sulfoxide (DMSO) reductase, aldehyde oxidase, carbon monoxide dehydrogenase (CODH), nitrate reductase and the mitochondrial amidoxime reducing component (mARC). These enzymes metabolize sulfur-containing amino acids and heterocyclic compounds, including purines and pyrimidines. Xanthine oxidase, aldehyde oxidase and mARC also help to metabolize drugs, alcohol and toxins. The only molybdenum enzyme that does not contain molybdopterin is nitrogenase, which fixes nitrogen (for protein synthesis) for all life forms. The cofactor for nitrogenase, called FeMoco (iron-molybdenum cofactor), converts atmospheric nitrogen into ammonia.
Genetic mutations causing molybdenum cofactor deficiency result in an absence of molybdopterin, which leads to the accumulation of sulfite and neurological damage due to a lack of sulfite oxidase. Some people with overt molybdenum deficiency may also have trouble dealing with sulfites in food because of a poorly functioning sulfite oxidase. Note that molybdenum cofactor deficiency is not the same as molybdenum deficiency.
Molybdenum cofactors cannot be obtained as a nutrient from dietary sources and need to be synthesized by the body (de novo biosynthesis). One of the enzymes that catalyzes this process is a radical SAM enzyme (an enzyme that uses iron-sulfur clusters to cleave S-adenosyl-L-methionine (SAM) to generate a radical). The other option for molybdopterin, besides pairing with molybdenum and sulfur, is to complex with tungsten (wolfram)-using enzymes, which require selenium for their action.
Here are the molybdenum-dependent enzymes, functions and consequences that may occur with deficient molybdenum intake:
Molybdenum-Dependent Enzymes/Proteins: Function: Consequences of Deficit
Molybdenum Food Sources
The RDA for molybdenum is 45 mcg/d for adults 19 years of age and older and 43 mcg/d for teenagers 14-18 years of age. During pregnancy and lactation, it is recommended to get 50 mcg/d, and children 9-13 years old should get 34 mcg/d. Infants from birth to 6 months of age should get 2 mcg/d and from 6 to 12 months of age 3 mcg/d. Children 1-3 years of age should get 17 mcg/d and children 4-8 years of age 22 mcg/d. The tolerable upper intake level (UL) for adults 19 years of age and older is 2,000 mcg/d. The UL for children 1-3 years old is 300 mcg/d, 4-8 years old 600 mcg/d, 9-13 years old 1,100 mcg/d and 14-18 years old 1,700 mcg/d. Consuming a reasonably high-molybdenum diet does not pose a threat to health because excess molybdenum is rapidly excreted through urine. The minimum molybdenum requirement for adults is deemed to be 22-25 mcg/d, and the estimated average requirement is 34 mcg/d.
Low soil concentrations of molybdenum in a band stretching from Iran to Northern China cause dietary molybdenum deficiency, which is associated with an increased rate of esophageal cancer. There is no evidence that excess molybdenum causes cancer in humans or animals.
Prolonged high intakes of molybdenum can raise uric acid levels because xanthine oxidase, a molybdenum-dependent enzyme, breaks down purines into uric acid.
Exposure to molybdenum in molybdenum-copper plants can raise serum bilirubin and decrease blood albumin/globulin ratios, which is interpreted to indicate liver dysfunction.
Foods high in molybdenum include legumes, beans, whole grains, nuts and liver; these are also the top sources of molybdenum in the U.S. diet. For teenagers and children, the top sources are milk and cheese. Molybdenum deficiency is quite rare because the mineral is found in many different foods.
Molybdenum is absorbed at a rate of 40-100% from dietary sources in adults. Soy contains relatively high amounts of molybdenum but is a less bioavailable source (56.7% absorption rate), whereas foods like kale are as bioavailable as other foods (86.1%). Dietary tungsten reduces the amount of molybdenum in tissues, and sodium tungstate is a competitive inhibitor of molybdenum.
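To see what those absorption rates imply in practice, here is a back-of-the-envelope Python sketch; the 40 mcg per-serving contents are placeholder values chosen for illustration, not measured figures.

# Sketch: absorbed molybdenum = content x absorption rate, using the rates
# quoted above. Per-serving contents are hypothetical, for illustration only.
ABSORPTION_RATE = {"soy": 0.567, "kale": 0.861}

def absorbed_mcg(food, content_mcg):
    """Estimate absorbed molybdenum (mcg) from a serving's total content."""
    return content_mcg * ABSORPTION_RATE[food]

rda_mcg = 45  # adult RDA from the section above
for food, content in [("soy", 40), ("kale", 40)]:  # assumed 40 mcg servings
    got = absorbed_mcg(food, content)
    print(f"{food}: ~{got:.1f} mcg absorbed ({got / rda_mcg:.0%} of the adult RDA)")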
The human body contains roughly 0.07 mg of molybdenum per kilogram of body weight. Molybdenum is stored in the liver, kidneys, adrenal glands and bones in the form of molybdopterin. It is also found in tooth enamel, helping to prevent tooth decay. The ability to synthesize molybdopterin, an iron-sulfur- and selenium-dependent process, seems to be necessary for retaining molybdenum in tissues.
The kidneys are the main organs regulating molybdenum levels in the body and promote its excretion through urine. Urinary excretion of molybdenum directly reflects dietary molybdenum intake: during low molybdenum intake, about 60% of consumed molybdenum gets excreted through urine, and about 90% is excreted when molybdenum intake is high. The average U.S. urinary molybdenum concentration is 69 mcg/L. In molybdenum cofactor deficiency and in the single reported case of dietary molybdenum deficiency, urinary sulfate and serum uric acid were low, while urinary xanthine, hypoxanthine and plasma methionine were increased. However, these indicators have not been associated with molybdenum intake in normal, healthy people.
Getting Enough Manganese and Molybdenum
Here is a brief overview of the daily requirements:
In conclusion, manganese and molybdenum are hidden players in the body’s antioxidant defense system that provide protection against free radicals, oxidative stress and inflammation. It is hard to become deficient in them, but many people are not getting optimal intakes, especially for manganese.
Optimal Daily Intake of Minerals
Both EPA and DHA can be created by the body by converting the essential omega-3 fatty acid alpha-linolenic acid (ALA) into EPA and DHA, so technically EPA and DHA are not essential nutrients. However, the conversion of ALA to EPA and DHA in the body is low, and taking preformed EPA and DHA has numerous health benefits. Hence, despite the fact that EPA and DHA are not considered ‘essential’, a lack of these nutrients in the diet can lead to poor health, especially during pregnancy, malnourishment, childhood growth and some diseases. This is likely to be the case for numerous ‘non-essential’ minerals as well. In fact, the situation is likely grimmer, as the body has no mechanism at all for producing minerals. Thus, both essential and non-essential minerals must be obtained through the diet on a regular basis to maintain optimal nutrient status and health.
7 Macrominerals
Mineral: Health Function: Risk of Deficiency/Excess
10 Trace Minerals
Mineral: Health Function: Risk of Deficiency/Excess
5 Possibly Essential Trace Minerals
Mineral: Health Function: Risk of Deficiency/Excess
Here is an overview of these same minerals, their recommended dietary sources and recommended daily intakes for adults
7 Macrominerals
Mineral: Recommended Dietary Sources: Optimal/Deficiency/Excess Intake
10 Trace Minerals
Mineral: Recommended Dietary Sources: RDA/Deficiency/Excess Intake
5 Possibly Essential Trace Minerals
Mineral: Recommended Dietary Sources: RDA/Deficiency/Excess Intake
Guidelines for Eating Superfoods
Not all foods are created equal in terms of their nutrient values, especially their mineral content. Some, like beef liver and pastured eggs, contain virtually all the nutrients your body needs, while others, like prunes, are an excellent source of primarily one thing: boron. Then there are many foods that are moderately good for getting a wide range of minerals, such as beans or potatoes. Regardless, there are specific “superfoods” that you may want to keep in your diet on a regular basis to cover your bases for some of the more common deficiencies. However, you shouldn’t eat things like liver, cruciferous vegetables, dairy or legumes in excess either, because they can cause imbalances with other minerals or reduce their absorption.
Here is a list of the more frequently cited superfoods, along with how much and how often to eat them:
Liver – Arguably the most nutrient-dense food on the planet is liver, whether from beef, lamb, pork, chicken or game. It is an excellent source of the commonly deficient minerals, especially copper, iron, chromium, molybdenum, selenium and zinc. However, because liver is so packed with vitamins and minerals, eating it in excess will lead to increased urinary excretion and overload of certain nutrients.
Heart – Although not as nutrient-dense as liver, heart is still packed with a lot of vitamins and minerals in comparison to regular muscle meat. The most notable nutrient in heart is CoQ10, a coenzyme involved in many mitochondrial processes that participates in the electron transport chain and ATP generation. It also has antioxidant properties and has been used in cardiovascular disease, including heart failure. Heart also contains high amounts of protein and amino acids, zinc, selenium and elastin. However, it is slightly higher in iron than regular red meat. Consuming about 0.5-1 oz. of heart daily or 2-3 oz. of heart two to three times per week would suffice.
Oysters/Clams/Mollusks – Just a single 3 oz. serving of oysters or mollusks can cover your entire weekly zinc demand (75-150 mg vs the 8-11 mg RDA; see the arithmetic in the sketch after this list). However, your body likely doesn’t absorb all of it in a single sitting and responds by increasing urinary excretion. Regardless, eating oysters/clams/mollusks every day is probably not a good idea, as it could lead to zinc overload, which reduces copper absorption. Thus, eating seafood like oysters once a week is sufficient to help meet your zinc RDA.
Red Meat/Beef/Pork/Chicken/Game – Meat is also an excellent source of many minerals, especially zinc, iron and sulfur. It is the best way to get all the sulfur amino acids, like methionine and cysteine, but for that same reason it can lead to a high methionine-to-glycine ratio.
Beans/Legumes/Lentils – Beans, legumes and lentils are among the most nutrient-dense foods in developing countries that don’t have as much access to meat. They are among the highest sources of plant-based protein, making them essential for any vegetarian/vegan diet.
Broccoli/Cauliflower/Cabbage – Cruciferous vegetables are great for increasing glutathione through sulforaphane and Nrf2. They are also good for getting more potassium, boron, chromium, calcium and magnesium. Unlike beans, broccoli and cauliflower do not contain phytates or other phytonutrients that chelate iron or zinc. However, their goitrogenic content does reduce iodine absorption if high amounts are consumed raw, which may cause goiter and hypothyroidism. To prevent that, you should not eat large amounts of these vegetables raw or in smoothies. Instead, cooking them (boiling, frying or steaming) reduces their goitrogenic properties without losing many micronutrients.
Kale/Spinach/Collard Greens – All kinds of greens like kale and spinach can have health benefits, especially in terms of their magnesium, potassium and calcium content. However, they can also harm the thyroid when eaten raw, which is why you should cook them beforehand. They are also high in oxalates, which can promote kidney stones and gastrointestinal distress in those who are susceptible; however, their high calcium content tends to make them fairly low in bioavailable oxalates. Citric acid (lemon juice) and increased calcium intake help to break down oxalates. So, if you are making a salad, adding some lemon juice or vinegar and eating it with some dairy can prevent potential negative effects.
Eggs – Similar to liver, eggs contain nearly all the vitamins and minerals your body needs, along with all the amino acids, both essential and non-essential. Because of that, eating 2-4 pastured eggs a day is an easy way to meet your daily sulfur requirements while simultaneously hitting a lot of the other minerals, such as iodine, magnesium, molybdenum, selenium, zinc and phosphorus. Avoid conventional eggs, as they have fewer omega-3s and other health-promoting nutrients. To avoid a high methionine/glycine ratio, either stick to around 4 eggs a day or ensure you are consuming additional sources of glycine, such as hydrolyzed collagen peptides.
Dairy/Cheese/Milk – One of the most bioavailable and common sources of calcium is dairy. Milk is probably more bioavailable than cheese and curds because it’s a liquid and, similar to mineral waters, minerals are absorbed better when consumed in liquid form. It is hard to get excess calcium (>1,500 mg/d) from dietary calcium alone. Regardless, you may want to add a little dairy/calcium to your larger meals because it also reduces the total absorption of fat, helping with body composition. Consuming more calcium from sardines (with the bones) or a glass of pastured milk with salads will also protect against oxalates. On a daily basis, you can meet your calcium requirements by drinking one glass of milk per meal or by having 3 oz of fish with the bones alongside 1.5-3 oz of cheese (see the tally in the sketch after this list). Many people do not tolerate dairy, so dairy consumption should be individualized.
Fish/Salmon/Sardines – Omega-3s are greatly beneficial for reducing inflammation and improving lipid profile. Fish itself is also a great source of other minerals, such as magnesium, iodine, selenium, zinc, manganese and potassium. You can also get calcium from the bones of small fish, like sardines or sprats.
Coffee/Tea – The most consumed beverages in the world after water are coffee and tea, which have a long history of culinary and recreational use. Research also finds these drinks have health benefits: there is evidence that habitual tea drinking has positive effects on brain efficiency and slows down neurodegeneration, and the polyphenols in coffee have been shown to reduce the risk of diabetes, Alzheimer’s, dementia and even liver cancer.
Chocolate/Cacao – One of the best-known superfoods of South America is cacao, found in chocolate. Chocolate does contain a significant amount of some minerals, such as magnesium, copper, iron and chromium.
Fruit/Berries/Juices – Fruits and berries are naturally high in potassium, which is hard to come by in other foods. You can also get citrate, which helps to buffer against the dietary acid load from meat and eggs. However, added fructose as a sweetener tends to promote the excretion of minerals. Added fructose can also induce insulin resistance, which places an additional demand for certain minerals, such as magnesium and chromium.
Shilajit (mumie, moomiyo or mummiyo), also called mineral pitch, is a black-colored substance consisting of paleohumus and vegetation fossils that is high in fulvic acid and has been used in Ayurvedic medicine for thousands of years. It is collected from steep rock faces at altitudes of 1,000-5,000 meters. Shilajit is not an actual food per se but rather an herb that can be taken as a supplement. Typical doses range from 200-2,000 mg/d.
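To make the zinc and calcium arithmetic from the oyster and dairy entries above concrete, here is a rough Python sketch; the zinc figures come from the text, while the per-item calcium contents are commonly cited approximations rather than values from the book.

# --- Zinc: one 3 oz serving of oysters vs the weekly RDA (figures from text) ---
serving_zinc_mg = (75, 150)           # zinc per 3 oz serving of oysters
rda_zinc_mg = (8, 11)                 # adult daily RDA range
weekly_need = (rda_zinc_mg[0] * 7, rda_zinc_mg[1] * 7)
print(f"Weekly zinc need: {weekly_need[0]}-{weekly_need[1]} mg; "
      f"one serving: {serving_zinc_mg[0]}-{serving_zinc_mg[1]} mg")

# --- Calcium: tally the strategies suggested above (assumed mg per item) ---
CALCIUM_MG = {
    "glass of milk": 300,             # ~8 oz glass, approximate value
    "cheese 1.5 oz": 300,             # approximate value
    "sardines with bones 3 oz": 325,  # approximate value
}

def daily_calcium(items):
    """Sum calcium (mg) over (item, servings) pairs."""
    return sum(CALCIUM_MG[name] * qty for name, qty in items)

print(daily_calcium([("glass of milk", 3)]))                 # ~900 mg/day
print(daily_calcium([("sardines with bones 3 oz", 1),
                     ("cheese 1.5 oz", 1)]))                 # ~625 mg/day
# Both stay well below the >1,500 mg/d excess threshold noted above.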
If you can’t tolerate organ meats by themselves, you can grind or mince them together with ground beef and make patties or pate.
Generally, cooking and overheating destroys some nutrients, which for goitrogens, lectins or phytates may actually be a good thing.
Lightly cooking and then cooling starches like potatoes and rice also increases their resistant starch content. Many studies show that resistant starch can improve insulin sensitivity, lower blood sugar, reduce appetite and help with digestion. Resistant starch also stimulates the bacteria in your gut to produce short-chain fatty acids (SCFAs) like acetic acid, propionic acid and butyric acid. The SCFAs feed the cells that line the colon and help with nutrient absorption. Some studies show that 15-30 grams of resistant starch per day for 4 weeks can improve insulin sensitivity by 33-50%.
Fresh foods tend to have more nutrients, as storage can reduce nutrient content. Freezing, however, can help reduce nutrient loss if the food is frozen right after harvest. Unfortunately, freezing vegetables deactivates myrosinase, the enzyme that creates sulforaphane. Regardless, you would still get plenty of potassium, vitamin C and sulfur from frozen vegetables. Organic foods also have more bioavailable nutrients, as pesticides and glyphosate can bind to minerals, reducing their presence in the food and their bioavailability once consumed.
The US Department of Agriculture’s Pesticide Data Program publishes an annual report on the most pesticide-rich foods. They divide it into The Dirty Dozen and The Clean 15. Here’s a list for the year 2018:
Fixing Mineral Deficiencies
About a third of the U.S. population is likely to be deficient in the below 10 minerals (estimated % not hitting RDA/AI or estimated % deficient):
Here is how to prevent your body from becoming deficient of essential minerals by protecting against their excretion or improving their absorption:
Limit Added Sugar and Refined Food Intake – Hyperglycemia and high sugar consumption place an additional burden on the liver and kidneys, which makes the body increase its excretion of certain minerals, namely magnesium, chromium and copper.
Fix Insulin Resistance and Improve Glycemic Control – Either the body is not producing enough insulin (as in type-1 diabetes), keeping blood sugar elevated for longer, or the cells are not responsive to the actions of insulin and don’t allow the entry of nutrients into the cell. In either case, hyperglycemia ensues, which makes you burn through minerals while increasing their excretion.
Improve Gut Health and Fix Malabsorption Conditions – Your gut is where most of the absorption of minerals from food occurs. Having a healthy gut is vital for assimilating and retaining nutrients. Many malabsorption conditions, such as intestinal permeability (leaky gut), IBS, Crohn’s and ulcerative colitis can reduce the absorption of certain minerals. If you have any gut condition, you will need to hit at least the RDA for magnesium, potassium, zinc, copper and selenium. If you already show signs of deficiencies in these minerals, you may need to increase your intake further in the short term.
Improve Liver and Kidney Health – Most metabolic processes are regulated by the liver and kidneys, which also determine the homeostatic balance and excretion of all minerals. Poor kidney function tends to promote the urinary loss of magnesium, chromium, manganese, zinc, copper and many others.
Avoid Heavy Metal Exposure – Environmental pollutants, especially heavy metals, also increase the excretion of some minerals and compete with their absorption. What’s more, mineral deficiencies like iron deficiency can increase the absorption of heavy metals, such as cadmium, lead and aluminum.
Avoid Drugs/Medications That Promote Mineral Loss – Pharmaceuticals tend to reduce the absorption of minerals and promote their excretion; antacids and diuretics affect magnesium and potassium the most. When you must take a prescription drug for a medical condition, look up which nutrients it may deplete and make sure you obtain enough of them through diet or supplementation.
Eat Mineral-Dense Foods Regularly – You should be eating things like liver, oysters and/or clams on a fairly regular basis, at least once a week. This way, your need for supplements will be greatly reduced, as you’ll be getting the nutrients from your food. It is not necessary to eat “superfoods” daily with every meal; you could also “microdose” (1 oz/d) foods like liver to spread your intake across the entire week. Large acute doses of minerals tend to make the body increase urinary excretion or reduce their absorption, and it is harder to catch up on deficiencies than to maintain a consistent intake of minerals from foods.
Add Some Mineral Waters to Your Diet – One of the best ways to get more magnesium and calcium into your diet is to drink mineral waters, which offer better bioavailability while providing other health benefits. Because it lacks calories, mineral water is one of the healthiest ways to simultaneously improve your mineral status and metabolic health. Consuming about one-third to two-thirds of your daily water intake as mineral water would contribute greatly to your daily mineral requirements (a rough estimate of that contribution follows after this list).
Supplement Your Deficiencies – Taking supplements is a quick way to overcome severe nutrient deficiencies that are causing health problems. However, they may also have negative side-effects. For example, taking an iron or zinc supplement will impair copper absorption. Likewise, a chromium supplement for someone who is already metabolically healthy may be just a waste of money. Thus, you should supplement only those minerals you are deficient or suboptimal in. First, test your mineral status and then consult with a medical professional about the appropriate course of action in terms of supplementation. Minor deficiencies can easily be fixed by improving diet or metabolic health.
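As a rough illustration of the mineral-water suggestion above, here is a small Python sketch; the 100 mg/L magnesium concentration, 2.5 L daily water intake and 400 mg/d target are assumed example values, not figures from the text.

# Magnesium contributed by replacing part of daily water with mineral water.
daily_water_l = 2.5                  # assumed total daily water intake
mg_per_l = 100                       # assumed magnesium-rich mineral water
target_mg = 400                      # approximate adult magnesium target

for fraction in (1/3, 2/3):          # the one-third to two-thirds range above
    contribution = daily_water_l * fraction * mg_per_l
    print(f"{fraction:.0%} of water as mineral water: ~{contribution:.0f} mg/d "
          f"magnesium ({contribution / target_mg:.0%} of a {target_mg} mg target)")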
People who exercise or sweat a lot due to either being physically more active, sunbathing, or taking saunas frequently are more prone to electrolyte imbalances. When you sweat you lose water, sodium, chloride, copper, chromium, selenium and iodine. Because sweating is an essential way for humans to regulate their body temperature, we can’t avoid it and thus may be prone to becoming deficient or at least suboptimal in different minerals.