Understanding the role of vitamin K-dependent proteins in vascular calcification

What if the process of arterial calcification was regulated from within the cells of the blood vessels themselves, and had nothing directly to do with what you ate or what circulated in the bloodstream, because calcification takes place not at the surface but inside the blood vessel wall?

What if the process of arterial calcification was actually a process by which muscle is transformed into bone, a process by which vascular smooth muscle cells transform themselves into bone cells which then actually build bone tissue within the blood vessel wall?

And what if apoptosis preceded calcification, what if cell death was what triggered the process of calcification, and it was the apoptotic bodies of dead vascular smooth muscle cells within the blood vessel wall that served as the nodes around which calcium crystals formed?

Would you not find this shocking? Would you not find it incredible that any of these could be true, let alone all of them? It’s not at all what we’ve been told by “health experts” and “health authorities” for more than half a century!

All of these statements are hard to believe. It is especially unbelievable that muscle cells can change into bone-building cells, and begin to grow bone tissue within the artery wall. It sounds surreal, kind of like science fiction. But it isn’t. All of it is true. All of this has been observed.

Interesting, you may think, but what does any of this have to do with vitamin K? Everything! It has everything to do with vitamin K.

How clever we are

The sophistication and precision of biochemical reactions and processes in animals and humans are mind blowing. Understanding how they work is a wonderfully noble endeavour that is certainly very fulfilling in its own right. In some cases though, it can be a matter of life and death. And in the case of the processes related to and regulated by vitamin K-dependent proteins, it definitely is.

This is not an exaggeration to push you to read on. It’s a statement of fact. And you’ll see how this is true by the time we finish. I believe it is essential for each one of us to understand the details of how things in our body work, and how they are related and connected, in order to appreciate their significance and their importance.

We are so clever. We can figure out such complicated things when we put our minds to it. Things like complex biochemical pathways, or long chains of enzymatic reactions that, one step at a time, transform molecules from one form into another. And it is this kind of cleverness that has enabled us to develop the hundreds of different types of medications we can find today in drug stores.

We have designed medications to address basically every symptom we can think of. If it’s a symptom we’ve had, it’s most likely a symptom that many others have or have had. And if many have the same or a similar symptom, we can be sure that at least one pharmaceutical company will have made a drug for it.

Warfarin was developed in the 1950s to prevent or at least suppress coagulation, and in so doing help prevent or at least reduce the number of strokes and heart attacks. Because so many people either suffer from, are susceptible to, or are at risk of cardiovascular disease, many people take warfarin.

And what I mean by many in this case is between 20 and 30 million prescriptions per year in the United States alone. The number peaked at 35 million in 2010 and dropped back to 20 million in 2015. That’s a lot of warfarin pills! You can see the stats here (http://clincalc.com/DrugStats/Drugs/Warfarin). Warfarin sits at 42nd in the top 50 drugs, just below aspirin at 39, insulin at 36, and ibuprofen at 34, as you can see here (http://clincalc.com/DrugStats/Top200Drugs.aspx).

Surely close to every household in the western world has, somewhere in a bathroom cupboard or drawer, a bottle of aspirin or ibuprofen. Given how close warfarin is to them in usage, it hardly needs saying that this anti-coagulant drug is in broad and widespread use.

Isn’t this great, though? Millions of people at risk of having blood clots that would possibly cause them a stroke or heart attack, protected by taking a little warfarin every day? Yes, I suppose in some ways, it is, if these people are actually at risk. But, unfortunately, with a drug like that, we can be pretty sure that most are taking it preventatively, as in, just in case. And this is a problem.

Warfarin works by disrupting the process that leads to the activation of coagulation factors. The blood’s ability to form clots quickly is one of its most vital functions, because without it we would simply bleed to death from a flesh wound. Evolutionarily, we would not have made it this far without this protection mechanism, which ensured that when we were wounded, the blood would immediately thicken and form clots at the surface of the open wound as fast as possible, to stop it from bleeding out of our body. The special proteins responsible for regulating coagulation are vitamin K-dependent proteins (VKDPs).

It has taken a long time to understand, first of all, that there wasn’t just vitamin K, but in fact two different kinds of vitamin K. It is also true that it has taken a long time to identify the major vitamin K-dependent proteins and figure out how they work. We are talking about 40 years from the 1950s to the 1990s. So, you really shouldn’t be surprised if you haven’t read or heard about this before.

But today, a lot has been understood through in vitro and in vivo observations, trials and studies both in animal models and in humans. And even though we will inevitably continue to deepen our understanding of the subtleties of the molecular mechanisms, the species, and the interactions involved in the life of cells and proteins in how they affect the state of our blood vessels and organs, this is a sketch of the picture we have at this stage.

Vitamin K-dependent proteins

There are about twenty identified VKDPs belonging to two classes: hepatic—those produced by the liver, and extra-hepatic—those produced in other tissues. Those from the first class are the most well-known and well-studied. They are the coagulation factors (II, VII, IX, and X) manufactured by the liver and activated within it before being pushed into the bloodstream and circulated throughout the body to maintain a healthy coagulation response in case it is ever needed. These are the ones targeted by warfarin. Naturally, since that drug has been around since the 1950s, the role and function of these vitamin-K dependent coagulation factors have also been known at least since that time.

The second class is less known and less studied but has, luckily for us, gained much more attention in the last two decades. It includes three very important proteins whose functions are essential in maintaining healthy blood vessels. But unlike the coagulation factors produced in the liver, these proteins are instead produced by the vascular smooth muscle cells and activated locally in the vasculature. These vascular health factors (we call them that by analogy with the coagulation factors, and to distinguish them from them) were identified much more recently, in the 1980s and 1990s. All are proteins that contain gamma-carboxyglutamic acid, abbreviated Gla.

Some important ones for us here are: osteocalcin, also called bone Gla protein, which took 30 years to be identified as an inhibitor of calcification, when it was discovered in vitro to prevent the precipitation of crystals in a supersaturated calcium solution (meaning that without it, calcium crystals would have inevitably formed spontaneously in the solution); growth arrest specific protein 6, which is involved in the regulation of cell proliferation and seems to inhibit premature cell death; and, most important in relation to soft tissue calcification, matrix Gla protein, abbreviated MGP.

Matrix Gla protein was originally isolated from bone, but it has been found to be expressed in several other tissues including kidney, lung, heart, and—most critically—vascular smooth muscle cells or VSMCs. It is now known to be the most potent inhibitor of calcification of blood vessels, and even though the liver does produce and secrete MGP into the bloodstream, only the MGP produced in the vasculature inhibits calcification.

Besides being produced in different tissues, another important difference between the two classes of VKDPs is that the liver-produced coagulation factors are phylloquinone—or vitamin K1-dependent, whereas the vascular smooth muscle cell-produced proteins are menaquinone—or vitamin K2-dependent. In light of the fact that it is rather hard to find vitamin K1 insufficiency with a diet that contains at least some green plant foods, while the exact opposite is true for vitamin K2 of which the western diet is practically devoid, this difference is highly significant.

Both vitamin K1 and K2 are absorbed in the second and third portions of the small intestine, the jejunum and ileum. K1 is delivered to the liver, whereas K2 is transported via LDL and HDL to other organs. K1 is mainly found in the liver, whereas K2 is preferentially stored in peripheral tissues, with the highest levels in the brain, aorta, pancreas, and fat tissues. This distribution attests to the importance of these essential vitamins.

While vitamin K1 and K2 are really two different vitamins with different functions, transport mechanisms, and distribution in the tissues, and while there are several differences between the vitamin K1-dependent and the vitamin K2-dependent proteins, these have one essential thing in common. This is, as their name says, that they are vitamin K-dependent. What this means is that all these proteins share the same enzymatic chain of activation, whether mediated by K1 or K2, that transforms them into their biologically active form, the form they need to have in order to do the things they are meant to do.

All VKDPs must be carboxylated in order to be activated. The process is complicated and not yet completely understood. We know that it targets the glutamic acid (Glu) residues on the protein, which must be made into gamma-carboxyglutamic acid (Gla). We also know that the process is mediated by the enzyme gamma-glutamyl carboxylase (GGC), and that vitamin K is the main co-factor that enables the enzyme to perform the activation. In the end, the process leads to the addition of a carbon dioxide molecule to the gamma-carbon of Glu, which transforms it into Gla. However, it is the reduced form of vitamin K that is required.

Vitamin K, whether it is the plant-based phylloquinone (K1) or the animal-based menaquinone (K2), enters the body through the diet in its non-reduced form. Reduction involves the addition of hydrogen in molecular form, H2, to make KH2. Transformations of this kind are generally done by enzymes, and so is this one. In this case the enzyme is vitamin K epoxide reductase (VKOR). Its action is essential because it is the reduced form KH2 that acts as the co-factor in the process of carboxylation.

The energy released by the oxidation of KH2 drives the addition of the carboxyl group onto the glutamic acid residues. The oxidised form of vitamin K, KO, can subsequently be reduced again to KH2. Thus vitamin K is first reduced, then oxidised to help push the carboxyl group onto the glutamic acid residue, and then reduced once more to start the whole cycle again. This cycle is called the vitamin K epoxide reductase or VKOR cycle.
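For readers who like to see a process spelled out step by step, the cycle just described can be sketched as a toy sequence of states in Python. This is purely illustrative: it encodes only the order of the steps given above (reduce, carboxylate, oxidise, reduce again), not any actual chemistry, and the residue labels are made up for the example.

```python
# Toy sketch of the VKOR cycle described above. Purely illustrative:
# it only encodes the sequence of states from the text, not chemistry.
def vkor_carboxylate(glu_residues):
    """Turn each Glu residue into a Gla residue, one turn of the cycle per residue."""
    gla_residues = []
    vitamin_k = "K"  # vitamin K arrives from the diet in its non-reduced form
    for residue in glu_residues:
        vitamin_k = "KH2"  # VKOR reduces K (or KO) to the hydroquinone KH2
        # GGC uses the energy of KH2's oxidation to carboxylate Glu -> Gla
        gla_residues.append(residue.replace("Glu", "Gla"))
        vitamin_k = "KO"   # KH2 ends up as the epoxide KO, ready to be reduced again
    return gla_residues

print(vkor_carboxylate(["Glu-1", "Glu-2", "Glu-3"]))
# ['Gla-1', 'Gla-2', 'Gla-3']
```

Each pass through the loop is one turn of the cycle: the same vitamin K molecule is reduced, spent on one carboxylation, and recycled.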

For this class of proteins, the VKDPs, activation through carboxylation means acquiring the structure and properties needed to bind calcium in order to transport it. You may recall from a previous chapter in the story of vitamin K2 that matrix Gla protein generally transports calcium out of soft tissues in order to prevent calcification, and bone Gla protein generally transports it into bones and teeth to prevent osteopenia, osteoporosis, and tooth decay.

The big red flag

Now you understand why, when, in our remarkable cleverness, we understood that the main coagulation factors depended on the action of these enzymes to be activated and rendered functional, we naturally concluded that the best way to prevent clot formation would be to prevent coagulation, and that this could be achieved by blocking these enzymes from doing what they are intended to do in a healthy organism. This is precisely what warfarin does.

And it does it well. Otherwise it wouldn’t have become as commonly used as it is. And we can be certain it has saved a lot of people much of the pain and possibly life-threatening conditions that a blood clot could have caused them. The problem is that the vascular health factors so critical for maintaining healthy blood vessels depend on the same enzymes for activation as do the coagulation factors. Preventing the carboxylation of coagulation factors prevents, in exactly the same way, the carboxylation of the vascular health factors.

This was only understood to be a major problem relatively recently. We first had to understand that there isn’t just one kind of vitamin K, but that there are two, and that they are very different in their functions. We had to understand that both vitamin K1 and vitamin K2-dependent proteins rely on the same enzymes to get activated. We had to understand the carboxylation process by which they are activated. And we had to understand that MGP, BGP, and Gas 6 are vitamin K-dependent proteins, that they are specifically vitamin K2-dependent, how they are activated, what they actually do in our veins and arteries, and what happens if they can’t do what they are designed to do.

A major red flag about anticoagulants and warfarin came up from what was seen in mice. The first part of the study was with MGP-knockout mice (mice in which the MGP-encoding gene was deactivated). They were observed to have stunted growth from the premature calcification of the epiphysis, the part at the end of bones, at the junction with the cartilage of the joint, which allows the bone to grow longer. As soon as the epiphysis calcifies, longitudinal growth stops. But this was the least severe of the problems that were observed.

The MGP-knockout mice very quickly developed severe arterial calcification and died highly prematurely, within 6 to 8 weeks, of strokes, heart attacks, and rupture of the aorta. Normal lab mice live 2 to 3 years, some even up to 4. So, in the least extreme case, an MGP-knockout mouse dying from aortic rupture at 2 months, instead of living a relatively short normal lifespan of 2 years, would be the equivalent of a human who would normally live to 72 dying at the age of 6!
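The proportional arithmetic behind that comparison is simple enough to check in a few lines of Python (the lifespans are the round figures used above):

```python
# Scale the mouse's age at death onto a human lifespan, proportionally.
mouse_death_months = 2       # MGP-knockout mouse dies at ~2 months
mouse_lifespan_months = 24   # normal mouse lifespan of ~2 years
human_lifespan_years = 72    # reference human lifespan used in the text

# Same fraction of life lived: (2 / 24) of a lifespan
human_equivalent = mouse_death_months * human_lifespan_years / mouse_lifespan_months
print(human_equivalent)  # 6.0 — dying at age 6 instead of 72
```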

Here is what severe coronary calcification looks like in humans:

severe_coronary_calcification

Severe coronary calcification in a patient with end-stage renal disease. We can see that these blood vessels are basically filled with bone tissue that appears bright white. (https://www.bmj.com/content/362/bmj.k3887)

It was also observed that although the liver did produce and release MGP into the bloodstream, it had no effect on the arteries. Only the tissue-specific, locally-produced MGP within the vascular smooth muscle cells was able to inhibit calcification.

To check these conclusions, a similar study was done on normal mice that were given vitamin K1 to ensure proper liver function and healthy coagulation, and warfarin to block all extra-hepatic MGP action in tissues. The result? Stunted growth, pervasive arterial calcification, and premature death from stroke, heart attack, and aortic rupture.

The conclusions were solid: matrix Gla protein is the organism’s primary protection against soft tissue and arterial calcification; liver MGP has no protective effect on arteries, and only VSMCs-produced MGP can inhibit calcification in the arteries; both vitamin K deficiency and disruptions of the action of the enzymes that activate MGP cause extensive soft tissue calcification; and only vitamin K2, not vitamin K1, can inhibit warfarin-induced calcification.

Going further

When this was understood, more attention began to be paid to matrix Gla protein, and many other details were elucidated through further investigations. It was found that MGP is an 84-amino acid protein with five Gla residues; that all of these Gla residues are produced by gamma-carboxylation, mediated by the enzyme gamma-glutamyl carboxylase that requires vitamin K2 as a cofactor; and that, to date, the only known function of Gla residues is to bind calcium ions and crystals (calcium apatite). It was discovered that the concentration of calcium and phosphate in extracellular fluids is high enough to trigger and sustain the growth of crystals, but that MGP and BGP prevent this from happening. And that MGP is required by VSMCs to maintain their elastic and contractile nature. And not just that.

MGP actually inhibits the transformation of VSMCs into bone cells by antagonising the action of Bone Morphogenetic Protein 2 (BMP2). It turns out that the muscle cells of the blood vessels have in them the potential either to stay smooth, elastic, contractile muscle cells, or to turn into osteoblast-like bone-building cells. BMP2 triggers that osteogenic gene expression in the VSMCs: it tells muscle cells of the blood vessels to transform into bone-building cells. And as if this wasn’t enough, BMP2 also induces apoptosis: it tells blood vessel muscle cells to commit suicide, which certainly helps the process, given that once dead, they can be used as seeds for calcium crystal formation, and thus promote a faster and more efficient calcification.

What induces expression of BMP2 in cells? Probably several things that we haven’t yet identified. But for now we know that BMP2 is stimulated by oxidative stress, chronic inflammation, and high blood sugar levels. The good news is that MGP protects against all of these effects by antagonising BMP2. So if there is enough MGP and enough vitamin K2, if there are no disruptions to the action of the vitamin K dependent enzymes by anticoagulants like warfarin, and if oxidative stress, inflammation, and blood sugar are kept low, then there is protection against calcification of the arteries and other soft tissues like the liver, kidneys, and heart.

Recap

Here we have it. We have now understood the role of vitamin K dependent proteins in vascular calcification. And although it was a little long and maybe somewhat arduous, all the details are clear. It is complicated. I won’t deny that. But I have strived to make it all as accessible as I could without diluting the mechanisms of action and relationship between the different players. Let’s recap to make sure you are left with the essential elements in mind.

Vitamin K-dependent proteins can be either vitamin K1- or vitamin K2-dependent. The dependence comes from the fact that vitamin K is required to activate the protein. This activation is the carboxylation in which a carbon dioxide is added to the glutamic acid residues along the protein. Carboxylation is mediated by the carboxylase (GGC), which requires the reduced form of vitamin K in order to oxidise it and get the energy to push the carbon dioxide molecule onto the glutamic acid residue. Vitamin K is reduced by the reductase (VKOR), which can do this over and over again in what is called the VKOR cycle.

Vitamin K1-dependent proteins are mostly liver-based coagulation factors. Vitamin K2-dependent proteins are mostly found outside the liver and are generally involved in inhibiting soft tissue calcification. The most important calcification-inhibiting VKDP is matrix Gla protein (MGP), which performs a wide range of tasks to maintain elastic, flexible, calcium-free blood vessel walls.

Calcification is triggered by the death of vascular smooth muscle cells. These dead muscle cells act as seeds for calcium apatite crystals to form. VSMCs can be induced to become osteoblast-like bone-building cells that then go on to stimulate the growth of bone tissue within the artery walls. This process is stimulated by bone morphogenetic protein 2 (BMP2), which is expressed under conditions of oxidative stress, inflammation, and hyperglycaemia.

To prevent and reverse calcification, the most important thing is to provide a good supply of vitamin K2 through diet and supplementation. Because it is essential in the activation of Gla proteins, but only through its role in the VKOR cycle, the amount of K2 is the rate-limiting factor. Hence more is better than less, and excess will simply remain unused without causing harm.

Naturally, matrix Gla protein needs to be available. Cells of tissues where calcification occurs (kidney, liver, heart, and blood vessels) secrete MGP. An interesting evolutionary self-protection adaptation mechanism. And here’s another: the amount of MGP that is produced by a cell depends on at least two factors that have been identified. One is the amount of calcium; the other is the amount of vitamin D3. In both cases, the more there is, the more MGP is produced.

So, vitamin D3 has the role of making calcium available but at the same time stimulates the production of MGP in order for the calcium to be available to the bones and not to the soft tissues. But for this, it relies on vitamin K2. This is why vitamin D3 without vitamin K2 leads to calcification: because MGP and BGP remain inactive and incapable of binding to the calcium ions to move them into bones and out of tissues. On the other hand, plenty of vitamin K2 would indeed activate the available MGP, but without enough vitamin D3 there might not be enough MGP to confer proper protection against calcification. This is a perfect example of the complementarity of action and function in essential micronutrients. There are certainly many more, but this one is particularly remarkable.

Final thoughts

I want to close on a final consideration. It is so easy and seems so natural for us to think in terms of this and that, good and bad, for and against, that our tendency is to look at everything in these terms. This is also true when we look at biochemical processes like the ones we have described and explored here. We naturally lean towards looking at the calcification inhibiting mechanisms as protective, and those that promote calcification and apoptosis as destructive.

But the reality is that cells, proteins, and enzymes don’t behave in these terms, they don’t think in these terms simply because they don’t think. They react biochemically to what they are exposed to, to the molecules and chemical messengers they encounter, to the quality of the liquids in which they bathe, to the characteristics of the environment in which they live, microsecond after microsecond, without any forethought or concern for the microsecond that will follow. The only guiding principle that can be used to lead us to understand why things happen the way they do is evolutionary adaptation to survive.

Having recognised this, we immediately see that the mechanisms that promote apoptosis of VSMCs, their subsequent transformation to osteoblast-like cells, and the growth of bone tissue within the artery walls that we refer to as arterial calcification, can only be a protection mechanism. A mechanism to protect the tissues and cells from the damaging effects of exposure to free radicals, inflammatory molecules, and glucose. Because, as we have seen, the process is reversible, it would be perfectly natural to undergo periods of calcification followed by periods during which the bone tissue is broken down and removed from our arteries and other soft tissues and organs when the circumstances allow it. Actually, we should say when the circumstances dictate it, because no matter what happens, it is always the circumstances—the environment—that dictate what is to happen.

What we can do, with the knowledge of what we have understood, is make choices about what we eat and drink, when and how much we eat, and how we live, sleep, and exercise. Choices that will shape or reshape, define or redefine the makeup of this internal environment of the body to always move us in the direction of optimal biochemistry, optimal physiology, optimal metabolism, and optimal health.

Everything that we explore together is always about just this. But sometimes the corrective action requires effort, sometimes even a lot of effort. In this case, however, it is as simple as can be, because it only requires us to supplement with vitamin K2 and possibly also D3. Of course, the last thing we want is a lifestyle that promotes the expression of BMP2 and the growth of bone tissue within our arteries. But supplementing with K2 and D3 together will in general bring only benefits. I know it was a very long-winded way to get to this, but now you understand why. That was—and is—the whole point of this blog, after all. I hope you enjoyed reading.

 

The information in this article comes primarily from the following papers: Molecular Mechanisms Mediating Vascular Calcification by Proudfoot and Shanahan (2006); Vitamin K-dependent Proteins, Warfarin, and Vascular Calcification by Danziger (2008); The Role of Vitamin K in Soft Tissue Calcification by Theuwissen, Smit, and Vermeer (2012).


Reversing calcification and the miracle of vitamin K2

Vitamin K2 is the only known substance that can stop and reverse soft tissue calcification.

If you didn’t stop at the end of that sentence to say Wow to yourself, you should keep reading.

Soft tissue calcification is one of the most serious health problems we face as individuals, as modern societies, and, on a global scale, as a species.  Cardiovascular disease—which leads to heart attacks and strokes, and accounts for nearly half of all deaths in industrialised countries—is a disease of soft tissue calcification: the calcification of our arteries.

Arthritis, from which basically everyone past the age of 40 suffers, and increasingly more with time and with age, is a disease of soft tissue calcification.  It is caused by the calcification of the cartilage in the joints:  the joints of the knees, but also of the shoulders; the joints of the hips, but also of the wrists; the joints of the elbows, but also of the feet and the toes; the cartilage between the vertebrae of the neck and the spine all the way down the back, but also of the hands and of the fingers.

Soft tissue calcification also causes kidney stones and kidney disease.  How many people above the age of 60 don’t have kidney problems?  Hardly any.  And how many young men and women in their 20s and 30s already have kidney stones and kidney dysfunction?  More and more every year.

Every one of the processes generally associated with ageing, from heart disease and stroke, to Alzheimer’s and dementia, to arthritis and kidney disease, to stiffness in the joints and muscles, but also to the wrinkling of the skin, is intimately linked to soft tissue calcification.

And now, let me repeat the sentence with which we opened:  Vitamin K2 is the only known substance that can stop and reverse soft tissue calcification.  It is really remarkable.

Maybe you didn’t know about calcification.  And so, maybe you are wondering why it is such a major and widespread problem, why it affects everyone no matter where we are or what we do.  It’s a good question.  But because we know that only vitamin K2 can prevent this from happening, we already have our answer:  soft tissue calcification is a major and widespread problem because our intake of vitamin K2 is inadequate to provide protection from calcification.

Naturally, the next question is why?  Why is our intake of vitamin K2 so inadequate?  If it is such a crucial essential nutrient, we would surely not be here as a species if intake had always been so inadequate.  Looking at things the other way around, if we are so dependent on adequate K2 intake for staying healthy, this must necessarily mean that we evolved having plenty of it in our food supply.  What’s so different now?

To answer this question with some level of detail—meaning with an explanation more extensive than just saying that it’s industrialisation that stripped our food supply of vitamin K2 as it has for all the essential nutrients to a greater or lesser extent—we have to understand what K2 is, how it’s made, and where it’s found in food.

The short answer is that K2 is found in the fat of pastured animals that graze on fresh green grass, and produced from vitamin K1 by certain kinds of bacteria in their gut.

The longer answer is that vitamin K2 is a family of compounds called menaquinones, ranging from MK-4 to MK-13 depending on their molecular structure.  These compounds are derived from their plant sister compound, vitamin K1, called phylloquinone, found in chlorophyll-rich plant foods.  Phylloquinone is consumed by the pastured animal, makes its way into their intestines, and there is transformed by the bacteria of the animal’s intestinal flora.  The resulting menaquinone is then stored in the fat cells of the animal, as well as in the fat of their milk if they are milk-producing.  Consuming these animal fats, in which vitamin K2 has been concentrated, provides this precious essential micronutrient.

If the grazing animal does not feed on green grass, they get no vitamin K1.  If they get no vitamin K1, their gut flora is not only compromised and negatively altered with respect to what it should be if they were consuming the grass they have evolved eating, but it produces no vitamin K2.  If their gut flora produces no vitamin K2, their fat and milk will contain no vitamin K2, and neither their offspring nor any person consuming products derived from the animal will get any vitamin K2.  Hence, no grass feeding, no vitamin K2 in the animal’s fat.


It is most natural that grass-eating animals should be grazing on fresh green grass in open pastures.  And yet, it is rather rare.  But without green grass, there is no vitamin K1.  And without vitamin K1 there can be no vitamin K2.

Maybe you’ve already thought ahead, and wondered: since it is bacteria that produce vitamin K2 from vitamin K1 in the guts of grazing animals, can’t we make vitamin K2 without the need for grass-fed animals to do it for us?  Yes, it is possible.  Fermented vegetables and dairy products like cheese can also contain vitamin K2.  In fact, in the case of cheese, there is a lot more in the finished hard cheese than in the milk used to make it.  The amount varies widely because it depends on the kind of bacteria.  For dairy products, hard cheeses like Gouda have the most, and for plant foods, even if fermented veggies have a little, the Japanese fermented soybean dish natto is the ultimate source of K2.

As we all know, pastured meat and dairy are not easy to come by in our modern world.  They’re actually quite hard to find.  Our supermarkets and food stores are flooded with industrially produced meat and dairy from animals that have never seen a blade of grass: grass-grazing animals living their entire lives indoors, in stalls, fed and fattened exclusively on grains, corn, and soybeans.  This is how we have stripped our food supply of vitamin K2, and this is why this is a modern phenomenon: most of our grandparents were still eating pastured meats and animal foods.

And if this wasn’t enough of a blow to vitamin K2 status, trans fats, which are formed when vegetable oils are hydrogenated to be made saturated and stable (for long shelf life), and which most of us consume in great quantities, contain a K2 analog called DHP (dihydrophylloquinone) that displaces the little K2 that might have found its way into our diet.

It is for all these reasons that soft tissue calcification is so widespread.  And you now have what you need to know in order to first stop the process by which your soft tissues are getting increasingly calcified, and then, in time, to remove the accumulated calcium from these tissues.  It’s simple: healthy grass-fed animals produce yellow butter, yellow yolks, and yellowish fat.  You need to eat plenty of pastured animal foods, making sure you eat the fat in which vitamin K2 is concentrated, and, to be sure you have enough to reverse the already present calcification, take K2 supplements.  And this might be enough for you.

If it is, you can head to your browser to find and order some K2 supplements (I currently get mine, at 500 mcg per tablet, from Phoenix Nutrition).  Also, we need to know that the two main forms of K2 are MK-4 (with a side chain of four isoprenoid units) and MK-7 (with seven).  The first is generally found in animal fats that haven’t been fermented, while the second is the product of bacterial fermentation.  Hence, meat and butter contain mostly MK-4, whereas natto, sauerkraut, and cheese contain mostly MK-7.

There is an important difference between these two forms of K2 in terms of their effects inside the body, which has to do with their half-life—not in the sense of radioactivity, but in the pharmacokinetic sense of how long they remain biologically active in the body.  MK-4 stays in circulation at therapeutic levels for only a number of hours, while MK-7 remains in circulation for between 24 and 48 hours.  Therefore, to cover all bases, we should eat grass-fed meat and butter, and take MK-7 supplements (I take 1000 mcg), always after a meal with plenty of fat to maximize absorption.
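The practical effect of this difference can be illustrated with a toy first-order elimination model.  The half-life numbers below are illustrative assumptions chosen only to match the rough time scales above (hours for MK-4, a day or more for MK-7), not measured pharmacokinetic values:

```python
# Toy first-order elimination model: fraction of a dose still circulating
# after t hours, given a biological half-life. The half-lives below are
# illustrative assumptions, not measured values.

def fraction_remaining(t_hours, half_life_hours):
    """Fraction of the initial amount left after t_hours."""
    return 0.5 ** (t_hours / half_life_hours)

mk4_half_life = 6.0   # hours (assumed, for illustration)
mk7_half_life = 30.0  # hours (assumed, for illustration)

for t in (6, 12, 24):
    print(f"after {t:2d} h: MK-4 {fraction_remaining(t, mk4_half_life):.2f}, "
          f"MK-7 {fraction_remaining(t, mk7_half_life):.2f}")
```

With these assumed numbers, almost nothing of an MK-4 dose is left after a day, while more than half of an MK-7 dose still is—which is why a once-daily MK-7 supplement keeps levels steady.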

If you are curious to find out more—if you want to know how menaquinone does this, how vitamin K2 works its miracles inside the body—then we need to take a closer look at the biochemistry of calcium metabolism.

There are three proteins found in bone matrix that undergo gamma-carboxylation via Vitamin K-dependent enzymes: matrix-gla-protein (MGP) (Price et al., 1983), osteocalcin (bone gla-protein, BGP) (Price et al., 1976), both of which are made by bone cells, and protein S (made primarily in the liver but also made by osteogenic cells) (Maillard et al., 1992) (Table V).  The presence of di-carboxylic glutamyl (gla) residues confers calcium-binding properties to these proteins.

MGP is found in many connective tissues and is highly expressed in cartilage.  It appears that the physiological role of MGP is to act as an inhibitor of mineral deposition.  MGP-deficient mice develop calcification in extraskeletal sites such as in the aorta (Luo et al., 1997).  Interestingly, the vascular calcification proceeds via transition of vascular smooth muscle cells into chondrocytes, which subsequently hypertrophy (El-Maadawy et al., 2003).  In humans, mutations in MGP have also been associated with excessive cartilage calcification (Keutel syndrome, OMIM 245150).

Whereas MGP is broadly expressed, osteocalcin is somewhat bone specific, although messenger RNA (mRNA) has been found in platelets and megakaryocytes (Thiede et al., 1994).  Osteocalcin-deficient mice are reported to have increased bone mineral density compared with normal (Ducy et al., 1996).  In human bone, it is concentrated in osteocytes, and its release may be a signal in the bone-turnover cascade (Kasai et al., 1994).  Osteocalcin measurements in serum have proved valuable as a marker of bone turnover in metabolic disease states.  Interestingly, it has been recently suggested that osteocalcin also acts as a hormone that influences energy metabolism by regulating insulin secretion, beta-cell proliferation, and serum triglyceride (Lee et al., 2007).

These are the first three paragraphs of the chapter Noncollagenous Bone Matrix Proteins in Principles of Bone Biology (3rd ed.), which I found on the web while searching for more information on the biochemical action of menaquinone.

And now, here is my simple explanation of how things work:

The players are the fat-soluble vitamins A, D, and K2;  three special proteins called osteocalcin, matrix gla protein, and protein S;  and an enzyme called vitamin K-dependent carboxylase.

First, vitamin D makes calcium available by allowing its absorption from the intestines into the bloodstream.  This is vital for life and health.  You know that severe vitamin D deficiency is extremely dangerous and develops into rickets, the disease that deforms bones.  Milder forms of vitamin D deficiency are much harder to detect without a blood test, but can and do lead to a huge spectrum of disorders and health problems.  However, without vitamin K2, ample or even just adequate levels of vitamin D will inevitably lead to increased soft tissue calcification.

Vitamins A and D make bone-building cells (osteoblasts) and teeth-building cells (odontoblasts) produce osteocalcin (also known as bone gla protein or BGP) and matrix gla protein (or MGP).  This is key because it is these proteins that will transport the calcium.

Vitamin K2, through the action of the vitamin K-dependent carboxylase enzyme, activates bone gla protein and matrix gla protein by carboxylating their glutamyl residues—a change in molecular structure that allows them to bind and transport calcium.

Once activated, bone gla protein brings calcium (and other minerals) into the bones;  and matrix gla protein takes calcium out of the soft tissues like smooth muscle cells of arteries, but also organs, cartilage, skeletal muscles, and skin.  Without this K2-dependent activation, BGP and MGP remain inactive, and the calcium accumulates in soft tissues all over the body.

What completes the act is that vitamin K2 also activates protein S, which oversees and helps the immune system clear out the material of arterial plaques that remains once the calcium making the plaques structurally stable has been taken out.  And, amazingly, protein S does this without triggering a large inflammatory response.

Even though it is quite straightforward when explained in this way, this understanding of vitamin K2 and its action in the body is really quite recent: from the last 20 years or so.  For one thing, it was only 10 years ago that Chris Masterjohn solved the 60-year-old mystery of Weston A. Price’s X-Factor, correctly identifying it for the first time as vitamin K2.  (You can read that for yourself here.)  And although some laboratory studies and experiments on vitamin K were done several decades ago, the majority are from the last 10 years (take a look at the references in Masterjohn’s paper).

We’ll stop here for now.  But we’ll come back to vitamin K2 because there are so many other amazing things it does for our health.

This article was inspired by Dr. Kate Rheaume-Bleue’s book entitled Vitamin K2 and the Calcium Paradox.


No more bipolar disorder?

Our world is replete with diseases of all sorts, illnesses of all kinds, ailments countless in numbers. Modern medicine views these in isolation, and therefore also attempts to treat them in isolation: we have a headache, we take an aspirin; we have high blood sugar, we take insulin injections; we have high cholesterol, we take statin drugs to disrupt the manufacturing of cholesterol in the liver; we have cancer, we are given toxic poisons that kill our cells in the hope that the cancer will be weakened; we have arthritis or multiple sclerosis, and we are given immune suppressants because it is thought that our own immune system has turned against us, attacking the very body it is intended to protect. We have no idea why, but this is what we do, and this is also what we believe we should be doing.

In psychiatry, we treat so-called mental illnesses. But because we are even more clueless in this realm of the subtle functioning of the brain and mind than we are of the subtle functioning of the body and its organs, we look for drugs that suppress the behaviours which are symptomatic of the “illness” we have been diagnosed with. It’s very simple: we take uppers and stimulants when we are down and low, and downers and sleeping pills when we are high and excited. Because we all do it, we think it’s perfectly normal.

When we take a close look, we see that there are no diseases, no illnesses, no ailments that are not caused by biochemical imbalances; we see that all of our health problems are rooted in problems in the biochemistry; and we see that the functioning of the body and the functioning of the mind cannot be considered independently, because they are both nothing other than the functioning of the whole body-mind.

Surely a most striking example of this is the now almost forgotten disease called pellagra. The name comes from the contraction of the Italian pelle (skin) and agra (sour), and was first used by Francesco Frapolli, who treated sufferers in Italy, where at the height of the epidemic more than 100 thousand were afflicted. But this wasn’t unique to Italy. The same was true in Spain and in France in the late 19th century. In the US, it reached epidemic proportions in the American south, where it was estimated that between 1906 and 1940 more than 3 million were affected, and more than 100 thousand actually died from it.

Can you imagine that? This many people—millions of people—in quite a restricted region, walking around in manic, delusional, paranoid states, seeing and hearing things, talking or even yelling to themselves and to others around them, completely incoherent and, in addition, covered in red, sore, flaking and bleeding skin on the arms, neck, and face? What a nightmare it must have been.

In all countries and all cases, pellagra was associated with poor nutrition, and more specifically with corn-based diets in which the maize was not treated with lime in the traditional way. Similarly, in all countries and all cases, it was found that a nutritious diet based on fresh animal foods very quickly resolved the problems that afflicted the sufferers of this disease. So, even in the late nineteenth century, they had figured out how to treat and prevent it. The thing is, though, they didn’t know why people got better when the corn and starches were replaced with meats and vegetables.

Pellagra would usually first manifest as skin problems: eczema- and psoriasis-like irritations and lesions. Then it brought about anxiety, depression, irritability and anger. And eventually, periods of full-blown mania, visual and auditory hallucinations, extreme fear, paranoia, bipolar and schizophrenic behaviours.


Now, if you know someone, if you have been close to someone diagnosed with bipolar disorder, with schizophrenia, with anxiety disorders, depression, or paranoia, you will immediately recognise in this list of symptoms those you saw in this person, surely to different degrees, and surely in the most extreme during a full blown crisis. Without a doubt, at least for bipolar disorder, these symptoms are all present, often simultaneously, and sometimes in close succession.

And do you know what pellagra is? It’s vitamin B3 deficiency.

Yes, pellagra, this terrible disease that caused such awful skin conditions and straight out madness in people, this disease that made these poor people act in ways indistinguishable from those of manic-depressives and schizophrenics, was a simple vitamin B3 deficiency.

When this was understood, niacin fortification was mandated, and the epidemic affecting millions of people in the southern United States was resolved almost instantly. After decades of rampant “mental illness” among so many—so much fear, so much anxiety, so much terror within families and communities, so much pain and suffering, and tens of thousands of deaths—a little added niacin ended this national disaster that was the pellagra epidemic almost overnight. The fact that you have most likely never heard of pellagra goes to show how effective niacin fortification has been in preventing it. But something else happened.

Following the introduction of niacin fortification, half the patients held in psychiatric wards were discharged. Just like that, they got better, and went home. There was at least one psychiatrist who noticed this remarkable coincidence: his name was Abram Hoffer. He wondered why so many got better, but also why only half. What about the other half? Could it be that they just need a little more niacin? Hoffer was an MD, a board-certified psychiatrist, and a biochemistry PhD. He was also the Director of Psychiatric Research for the province of Saskatchewan in Canada, a position he held from 1950, when he was hired and appointed by the department of public health, until 1967, when he opened a private practice.

What he did to check this hypothesis—that maybe more of the psychiatric patients were not mentally ill at all, but just in need of greater amounts of niacin—was to conduct a study. He chose schizophrenics because they are among the most difficult to treat, and also because, together with bipolar patients, they have many of the symptoms associated with pellagra. The results were stunning: 80% of the schizophrenics given B3 supplementation recovered. And these results aren’t anecdotal—the word often used in a pejorative or derogatory manner to dismiss important observations or evidence that fall outside the narrow realm of the conventionally accepted. These were the results of the first double-blind placebo-controlled nutrition study in the history of psychiatry.

What double-blind placebo-controlled means is that he took two equally sized groups of people diagnosed with schizophrenia and, randomly and blindly—on the patients’ end as well as on his own—gave half of them 3000 mg of flush-less niacin per day in three doses. (Niacin has a flushing effect that would be noticed, but either inositol hexanicotinate or niacinamide can be used instead.) He gave the other half a placebo: a pill that looked identical but contained no niacin or anything else that could have any significant effect on them (like powdered sugar or a starch of some kind). And at the end of the trial, when they looked at which patients got what, they found that 80% of the niacin-treated recovered, whereas none in the placebo group showed significant improvements.
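For readers who want the design spelled out, here is a minimal sketch of the randomisation step in code. The patient identifiers, group size, and seed are hypothetical; this is only an illustration of the principle, not Hoffer’s actual protocol:

```python
import random

def assign_groups(patient_ids, seed=42):
    """Randomly split patients into equal 'niacin' and 'placebo' arms.

    In a double-blind trial, this mapping is held by a third party:
    neither the patients nor the treating clinician sees it until
    the trial is over."""
    rng = random.Random(seed)
    shuffled = list(patient_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {pid: ("niacin" if i < half else "placebo")
            for i, pid in enumerate(shuffled)}

# Hypothetical cohort of 30 patients.
assignment = assign_groups([f"patient-{n:02d}" for n in range(30)])
print(sum(arm == "niacin" for arm in assignment.values()))  # prints 15
```

Only once the trial ends is the mapping revealed and the recovery rates of the two arms compared.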

Over the years, Hoffer treated thousands of people with remarkable success. With simple vitamin B3 supplementation he continued to successfully treat people suffering from schizophrenia, but also people suffering from attention deficit disorder (ADD), general psychosis, anxiety, depression, obsessive-compulsive disorder (OCD), and bipolar disorder. In fact, he considered pellagra, bipolar disorder, and schizophrenia to be the manifestation of niacin deficiency on different scales, and the sufferers to be niacin-dependent to different extents. Obviously, this is the only natural conclusion he could have drawn given how effectively niacin resolved psychiatric symptoms in these people, but also in light of the fact that each individual seemed to need somewhat different amounts to have these positive effects.

The expression niacin-dependent was used to emphasise that they needed to take it on a daily basis. Naturally, an essential vitamin is not only essential in the sense that it is absolutely needed, but also in the sense that it needs to be consumed regularly because it is not manufactured within the body. Deficiencies develop when the diet lacks in these essential nutrients, and grow more severe as time goes on. When the nutrients are then reintroduced, the deficiencies can be corrected. Some nutrients are abundant, some are rare. Some are easily absorbed, some are not. Some are more easily stored, and some cannot really be stored at all.

In addition, besides the fact that in any given population there is always—for the very same essential nutrient—a range of nutritional needs that varies between individuals based both on their genetic predispositions and on what they do, countless other factors influence and affect the amounts of essential nutrients that each one of us needs to be healthy. These include various kinds of injuries to the body-mind, and in particular to the gut where absorption of nutrients takes place, injuries that may have been incurred at one point or another from an infection, a virus, a bacterium, a bad diarrhoea we had when we were babies, a childhood disease we don’t even remember, and really anything that could have damaged a specific part of the intestine where a specific family of nutrients is absorbed.

Any such injury could result in a greatly increased need for a particular nutrient that, without knowing about it, could not be supplied in adequate amounts from diet alone, and would inevitably develop into a progressively more severe deficiency whose effects on the body-mind would eventually appear as dysfunctions that would, without a doubt, have physical as well as psychological or psychiatric manifestations. Why? Because there is no body that functions independently of the mind, and there is no mind that functions independently of the body. There is only this single body-mind.

Niacin and B vitamins in general are water-soluble. This means that we pee most of them out, and that we therefore need to have them every day, or nearly every day, in order to prevent the development of deficiencies. The experience from the last decades of the nineteenth century and the first five decades of the twentieth in Spain, Italy, France, and the US showed that a single vitamin deficiency, a simple niacin deficiency, could cause extreme symptoms that included severe psychiatric dysfunctions. It also showed that even very small amounts of B3 added to the otherwise nutrition-less white bread that was eaten as a staple could cure millions of pellagra sufferers, and prevent the disease from developing in the bulk of the population.

Unexpectedly, niacin-fortification coincided with a large number of psychiatric ward patients getting well enough to go home. This observation prompted a study with niacin supplementation which showed that in 80% of the schizophrenia patients treated with niacin, symptoms disappeared in the same way they had in pellagra sufferers, but with higher doses of niacin. It was also shown that a similarly high cure rate was seen in people suffering from ADD, psychosis, anxiety, depression, OCD, and—the point we wanted to emphasise in this article—bipolar disorder. In almost all cases, niacin supplementation resolved the dysfunctional behaviours and psychiatric symptoms. What varied were the amounts of vitamin B3 needed to achieve recovery, and the speed with which symptoms would come back upon interruption of the supplementation.

Therefore, whether you are among the lucky people who never were niacin deficient, among the lucky people who need little niacin, or among the less lucky ones who are deficient, who do need more of it than most, or who are suffering from anxiety or depression, schizophrenia or bipolar disorder, doesn’t it make sense to just start taking a little bit of extra B3 each day? Doesn’t it make sense to give your body-mind the amount of vitamin B3 it needs, recognising that for each one of us this amount may be different, and that for some it will be a lot more than for others, resting in the assurance that niacin supplementation is considered very safe, and that its main inconvenience, being water-soluble, is that we need to take it daily?

Given how inexpensive any form of niacin is, shouldn’t we be giving it in large amounts to every patient in every hospital, psychiatric ward, and medical institution? We should, but this will probably never happen. What we can do is take care of ourselves, of those people closest to us like our children and spouses, siblings and parents; of those people we care about like our friends and colleagues; and even of those people who are simple acquaintances who come to us for advice or just to share their concerns about a health issue. And one of the simplest and most effective things we can do to improve our own health and the health of those around us is by taking a little B3 supplement every day. It could just make you feel more relaxed, more focused, calm and at ease, as it does for me, or it could completely transform your world, bringing you from a state of hyper-anxious, paranoid, delusional and hallucinatory mania, back to a relaxed, helpful and trusting, conscientious and reasonable self, giving you the gift of your own life back to yourself.

Could it really be this simple and this amazingly miraculous? No more pellagra, no more schizophrenia, no more bipolar disorder, just with a little B3 supplementation? Well, maybe. You try it, and let us know.

Vitamin C is not vitamin C

Several years ago now, when I read The Calcium Lie, I found out that vitamin C and whole food vitamin C complex were not the same thing. I wasn’t surprised in the least because obviously this is surely the case for most supplements: an extract is not the whole food. But a few days ago, I saw a short video presentation that forced upon me the realisation that there is a huge functional difference between what is sold as vitamin C and the complex vitamin C molecule we find in whole foods.


The distinction may seem trivial at first—and it has on the whole clearly been missed—but it is rather important: ascorbic acid, which has been equated with and sold as vitamin C, is merely the substance that makes up the thin antioxidant shell protecting the many constituents of the vitamin C complex as it is found in food. Since ascorbic acid can be produced in a lab, whereas the whole vitamin C complex can only be extracted from real food, manufacturers have naturally done what was easiest: produce ascorbic acid and sell it as vitamin C.

This makes sense, of course, because none of the supplement manufacturers would be inclined to emphasise this point. It would be kind of like shooting themselves in the foot. But also because, given the proven biochemical and physiological value of antioxidants, it’s not a far stretch to convince oneself that the usefulness of vitamin C is, in fact, derived from the effects of the ascorbic acid shell. For this reason, when I read Dr Thompson’s comments on vitamin C, I made a point to pile on the red peppers, broccoli and lemons in our diet at home, but nonetheless kept on taking ascorbic acid supplements, and do to this day. This is about to change.

Dr. Darren Schmidt is an American chiropractor who works at the Nutritional Healing Center of Ann Arbor and, like most chiropractors, practices natural medicine, treating thousands of patients each year, most of them suffering from the same kinds of complaints, aches, pains and disorders as everywhere else. The talk was about heart disease: the number one killer in the US, and very prominent in all industrialised countries. To make it as clear and simple as possible and get the message across, he described how heart disease arises from the gradual filling of the coronary arteries supplying blood to the heart with arterial plaques that in time grow to block the way completely or almost completely, and how this leads to a heart attack. We covered this topic in detail in the article At the heart of heart disease.

The main point he wanted to get across is that plaques in the arteries and blood vessels develop because of an injury to the tissues lining the vessels, just like a scab does on the surface of the skin when we accidentally scratch, scrape or cut it, and that a well-functioning organism will fix that injury in the same way as it does the surface of the skin: the scab forms, the skin repairs itself underneath, and when it is healed, the scab falls off. Plaques are like scabs.

He explained that, fresh out of university in the early 90’s, he had heard someone at a conference speak of the work of a great pioneer in nutritional medicine of the first half of the twentieth century, Dr Royal Lee, a friend and colleague of that other great pioneer, Dr Weston Price. Dr Lee was the man who made the first food supplement, and the first concentrated whole food vitamin C supplement. In 1929 he founded the Vitamin Products Company, which later became Standard Process, Inc. Lee taught that this concentrated food in tablet form was like a pipe cleaner for arteries. Hearing this, the young chiropractor thought to himself: if it worked then, it should work now, and he began to prescribe vitamin C to all his heart disease patients. For a decade he prescribed vitamin C, and for a decade he failed to see significant improvements or any sign of reversal of atherosclerosis in his heart disease patients. But he had missed something.

Frustrated and disappointed, he looked again at the original research and writings of Drs Lee and Price about nutrition and disease, and in particular about vitamin C, and began prescribing only Standard Process vitamin C. What he found, invariably, was a quick improvement in his patients, whose chest pains and complaints would disappear, and who would gradually feel better and better. Since then, he has repeated this on thousands of people with such success that he now teaches what Dr Royal Lee taught almost a century ago: that the cure for heart disease, for disease of the arteries and atherosclerosis, is vitamin C. And that vitamin C is not ascorbic acid; it is whole food vitamin C complex.

Schmidt is neither handsome nor charismatic. He does not speak eloquently. He is far from refined in his choice of words and speaking style. He doesn’t come across as a brilliant doctor or scientist, and not even as a bright guy, really. But the clinical experience and observations on which his statements and claims are based are undeniably impressive, and clearly unambiguous in the information they convey: ascorbic acid has no effect on healing injured tissues and on allowing the body to clean up and remove the plaques from the arteries and blood vessels; whole food vitamin C complex does, and it does so remarkably well and efficiently in everyone who takes it.

The implication is that, other than providing antioxidant effects, ascorbic acid is useless for aiding and promoting the healing of tissues. In this case, the concern is the health of the arteries, but it’s not a far stretch to conclude that this applies to all injured tissues in general. What is needed is whole food vitamin C complex, which we eat in whole foods or take in supplements made from whole foods. Therefore, it’s a no-brainer: if you are interested in keeping your arteries clean and your heart and brain healthy and well-functioning for as long as possible, take a whole food vitamin C complex supplement, and pile on the vitamin C rich foods in your diet (superfoods highest in vitamin C include Camu Camu, Acerola and Goji; regular foods highest in C include bell peppers, broccoli, brussels sprouts, strawberries and kiwi).

There is one last crucial point to this story, and I was happily surprised to hear it mentioned during the presentation. It is something that is explained by Gary Taubes in Good Calories, Bad Calories, but that is very rarely heard or mentioned anywhere. Vitamin C enters cells through the same channel as sugar does. But for evolutionary reasons, glucose always takes precedence over it (and over all other nutrients). Therefore, as long as there is sugar to be shuttled into the cell, vitamin C stays out and waits: it does not enter the cell. So, what does he suggest for the diet? Can you guess? No sugars (simple carbohydrates), no starches (starchy carbohydrates, because they become sugars), lots of fat, adequate protein from healthy animal sources, and lots of green veggies. Sound familiar? And, of course, whole food vitamin C concentrated in supplement form.

Finally, I promise to write about these and other great pioneers of nutritional medicine. I feel that these people who were greatly ahead of their times and usually greatly suffered from it deserve more recognition than they get. They deserve more recognition than they ever will get. But still, I would like to do my part. I don’t know when, but I will.


The sun, our Earth, and the colour of your skin

Skin colour is the most obviously visible manifestation and expression of our evolutionary history, a history carried over thousands of generations and tens of thousands of years. What we have to understand is that each one of us—as an individual, a person, a self—has nothing to do with the colour of our skin, that the colour of our skin has nothing to do with us, and that we have no choice in the matter. What we must also understand is that to be optimally healthy, we have to live and eat in accordance with the colour of our skin and the information it carries about our ancestry. All of this is true for you, and it is true for everyone of every colour in the magnificent spectrum of human skin colours as it exists on the planet today. Let me explain why.


(Photo credit: Pierre David as published in this article of the Guardian)

The Sun, like every other star in the universe, formed from the gravitational collapse of a huge cloud of gas. This happened about 5 billion years ago. The planets, like planets everywhere in the universe, formed from the leftover debris that wasn’t used in making the Sun, and that remained orbiting around it in a large, flat accretion disk consisting of 99% hydrogen and helium gas and only 1% solid dust particles. In the blink of an eye, a million years or so, the disk was replaced by a large number of planetesimals. Another couple hundred million years or so, and the planets of our Solar system were formed.

Beyond the snow line—the radius from the Sun past which water can only exist as ice and where the temperature is below −120 °C—volatiles froze into crystals, and around massive icy cores formed the gas giants: Jupiter (the king, at 320 times the mass of the Earth), Saturn, Uranus and Neptune. Within the snow line formed the rocky planets: Mercury, Venus, Earth and Mars. About 4.5 billion years ago the Solar system was in place. It was in place, but not quite as we know it today. It was fundamentally different in several ways, especially in regards to what concerns us here, which is how the Earth was: a fast-spinning burning inferno of molten rock spewing out of volcanos everywhere and flowing all over the globe, completely devoid of water, oxygen, carbon and other volatile species.

The Earth formed more or less simultaneously with a very close neighbour about the size of Mars. Inevitably, soon after their formation, they collided. This apocalyptic encounter tilted the Earth off its original axis and destroyed the smaller planet which, in the collision, dumped its iron core into the Earth and expelled about a third of our planet into orbit. Most of the stuff rained back down, but some of the material clumped into larger and larger bodies that eventually resulted in the moon, our moon. When it formed, the moon was a lot closer—it would have looked twice as large as it does now—and the Earth was spinning approximately five times faster than it does today: a day back then would have lasted only about 5 hours. Because of the proximity between them, huge tidal forces would have deformed the liquid Earth on a continuous cycle driven by its super short 5-hour days. This would have heated the Earth tremendously by squeezing its insides from one side and then from the other, and caused massive volcanic activity all over the globe.

But this inelastic gravitational interaction, this drag of the moon on the Earth, worked—as it still does—to sap rotational energy from the Earth and transfer it to the smaller and far less rotationally energetic moon. This made, and continues to make, the Earth slow down and the moon speed up, and therefore drift out into a progressively larger orbit. The moon’s drag on the Earth continues to slow the Earth’s spin and enlarge the moon’s orbit, but at an ever slower rate, now 3.8 cm per year. This will continue until there is no more rotational energy to be transferred from the Earth to the moon, at which point we will be tidally locked with the moon, and not only will we always see the same side of the moon as we do today, but the moon will also always see the same side of the Earth. For what it’s worth, this will happen long after the Sun has come to the end of its life, and thus in more than 5 billion years. So, for now, this is definitely not a major issue.

Besides this important difference in the Earth's spin rate and its relationship with the moon, there were a lot of leftovers from the Sun's formation, clumped up into asteroids and comets whirling around in all sorts of regular and irregular orbits that swept across the Solar system, from its furthest reaches to the inner regions near the Sun and the rocky planets. The Heavy Bombardment lasted approximately 500 million years, from about 4.3 to 3.8 billion years ago. During this tumultuous early history of our Solar system, many of these asteroids and comets flying past the Earth and the other rocky inner planets were gravitationally captured and pulled in, crashing on the surface or just swooping down through the atmosphere, leaving behind all or some of their mostly volatile constituents: water and carbon compounds. The Earth was regularly struck by massive asteroids, and the energy dumped by the impacts would have made it a hellish place: no crust to speak of, only molten rock flowing everywhere, and volcanos spewing out noxious gases and spilling out more molten rock that merged into the already flowing streams of lava. Very inhospitable.

But those brutal hundreds of millions of years of bombardment by asteroids and comets brought water and carbon compounds to our planet. Given how hot it was, the water was in the atmosphere as vapour, and so were the carbon monoxide and dioxide as well as methane. These gases, however, were now bound to the planet gravitationally and couldn't escape back into space. Once the bulk of the randomly orbiting debris had been cleared out and incorporated into the various planets onto which it had fallen, the bombardment came to an end, and the Earth started cooling down. It is believed that the last major sterilising impact hit the Earth around 3.9 billion years ago.

A few thousand years of cooling allowed a thin crust to form. Further cooling then brought on thousands of years of rain that dumped most of the water vapour from the atmosphere onto the surface, forming vast planet-spanning oceans. The whole planet was at this point still extremely hot, but also extremely wet and humid, with the surface practically entirely underwater and active volcanos all over the place, but otherwise no mountains. Nevertheless, there would have been some slightly more elevated places, like the flanks of volcanos, that would have been dry at least some of the time, leaving spots where water could accumulate in ponds and stagnate. As soon as these conditions were present, around 3.8 billion years ago, the Earth saw its first microbial life emerge.

Claims for the earliest evidence of life at 3.8, 3.7 or 3.5 billion years ago are still controversial. But it is well established that hydrogen cyanide dissolved in water produces a diversity of essential biological molecules like urea, amino acids and nucleic acid bases; that formaldehyde in slightly alkaline water polymerises to form a range of different sugars; that amino acids, sugars, nucleic acid bases and fatty acids have all been found in carbonaceous meteorites; and that by 3 billion years ago, prokaryotes (organisms made of cells without a nucleus) were widespread.

There was, however, a major impediment to life that had to be overcome: the entire surface of the Earth was exposed during the day to the Sun's UV radiation, and UV rays destroy biological structures and DNA. The cleverest of tricks would have been to find a way to absorb these energetic photons and use their energy for something.

Nature is very clever: by 3.5 billion years ago, chlorophylls—believed to have developed in order to protect the proteins and DNA of early cells—had appeared, and chlorophyll-containing cyanobacteria (the oldest living organisms, and the only prokaryotes that can do this) had developed the ability to absorb light, use that energy to split water molecules, and use the freed electron from the hydrogen atom to sustain their metabolism, spewing out the oxygen in the process. Oxygen accumulated in the crust for a billion years before the crust became saturated with it and unable to absorb any more. Evidence for increasing oxygen levels in the atmosphere is first seen at around 2.5 billion years ago. By 2.2 billion years ago, oxygen concentrations had risen to 1% of what they are today.

Increasing concentrations of reactive and corrosive oxygen were devastating for all forms of life, which at this stage were all anaerobic: the oxygen combined with everything it came into contact with, creating all sorts of reactive oxygen species (free radicals) that went around causing damage—exactly as they do in our bodies and those of all animals today—and which, in the absence of antioxidants to neutralise them, accelerated ageing and death. This was the only card these simple anaerobic organisms were dealt.

Nevertheless, for another reason entirely, atmospheric oxygen was a blessing, because it turned out to be an excellent UV shield. Not only that, but the splitting of oxygen molecules (O2) into oxygen atoms promoted the recombination of these free-floating atoms into ozone (O3), which turns out to be an even better UV-absorbing shield. So, the more photosynthesis took place on the surface, the greater the concentration of atmospheric oxygen grew. The more molecular oxygen there was in the atmosphere, the more ozone could be formed. And the more ozone there was to shield the surface from the Sun's harsh UV radiation, the more complex and delicate structures could develop and grow. Pretty cool for a coincidence, wouldn't you say?

By 2 billion years ago—within 200 million years—the first eukaryotes (organisms made of cells with a nucleus) had appeared. This makes good sense considering that these simple organisms and independently living organelles had a great survival advantage in getting together in groups to benefit from and protect one another behind a membrane, while making sure the precious DNA needed for replication and proliferation was well sheltered inside a resilient nucleus. Note that they would have been trying to protect themselves both from the damaging UV radiation streaming down from the Sun (it's estimated that DNA damage from UV exposure would have been about 40 times greater than it is today), and from the corrosive oxygen floating in the air (imagine how much more oxidising it is today, with concentrations 100 times greater than they were then). And in there, within each of these cells, were chloroplasts—direct descendants of the first UV absorbers and converters, the cyanobacteria—whose job was to convert photons from the Sun into useful energy for the cell.

In all likelihood unrelated to this biological and chemical evolution of the Earth's biosphere and atmosphere, a long period of glaciation between 750 and 600 million years ago transformed the planet into an icy snow and slush ball. With basically all the water on the surface of the globe frozen over and all organisms trapped under a thick layer of ice and snow, photosynthetic activity must have practically or completely ceased. Fortunately, without liquid water in which to dissolve atmospheric carbon dioxide into the carbonic acid that in turn dissolves the silicates in the rocks over which it streams, carrying them down to the ocean floor for recycling by the active tectonic plates, all the carbon dioxide sent into the atmosphere by the volcanos simply accumulated. It is believed to have reached a level 350 times higher than it is now. This is what saved the planet from runaway glaciation.

Thanks to this powerful greenhouse of CO2, the ice and snow eventually melted back into running streams and rivers, and flowing wave-crested seas and oceans. With water everywhere and incredibly high concentrations of CO2, plant life exploded. And soon after that, some 540 million years ago, complex animals of all kinds—molluscs, arthropods and chordates—also burst into existence in an incredible variety of different body plans (morphological architectures), and specialised appendages and functions. This bursting into life of so many different kinds of complex animals, all of them in the now already salty primordial oceans, is called the Cambrian Explosion. Complex plant life colonised the land by about 500 million years ago, and vertebrate animals crawled out of the sea to set foot on solid ground around 380 million years ago.

Clearly, all plant life descends from cyanobacteria, the first to develop the ability to absorb UV radiation; and without complex plant life, it is hard to conceive of a scenario for the evolution of animal life. The key point in this fascinating story of the evolution of the Solar system, of our Earth and of life on this planet, as it pertains to what we are coming to, is that the light and energy coming from the Sun are essential for life while being at the same time dangerous for the countless living organisms that so vitally depend on them. In humans and higher animals this duality is most plainly exemplified by the relationship between two essential micronutrients without which no animal can develop, survive and procreate. These vital micronutrients are folate and vitamin D.

What makes folate (folic acid or vitamin B9) and vitamin D (cholecalciferol) so important is that they are necessary for proper embryonic development: of the skeleton in the case of vitamin D, and of the spine and neural tube—as well as for the production of spermatozoa in males—in the case of folate. Vitamin D transports calcium from the intestinal tract into the blood, making it available for building bones and teeth; folate plays a key role in forming and transcribing DNA in the nucleus of cells, making it crucially important in the development of all embryonic cells and of quickly replicating or multiplying cells (like spermatozoa).

Here's the catch: vitamin D is produced on the surface of the skin (or fur) through the photochemical interaction of the Sun's UV-B rays with the cholesterol in the skin; folate is found in foods, mostly leafy greens (the word comes from the Latin folium, meaning leaf), but it is broken down by sunlight.

What this translates to is this: too little Sun exposure of the skin leads to vitamin D deficiency, which leads to a deficiency in the available and useable calcium needed to build bones, which in turn leads to a weak, fragile and sometimes malformed skeletal structure—rickets. Too much Sun exposure leads to excessive breakdown of folate, which leads to folate deficiency, which in turn leads to improper development of the quickly replicating embryonic cells of the nervous system and consequent malformation of the neural tube—spina bifida.

The most important thing of all for the survival of a species is the making and growing of healthy babies and children so that they can in turn make and grow further generations of healthy babies and children. This is true for all living beings, and it is of the highest importance—taking evolutionary precedence over everything else—and has been since the dawn of life on Earth. Here is how the biochemistry of the delicate balance between these two essential micronutrients evolved.

Six to seven million years ago, our oldest ape-like ancestors walked out of the forest and into the grassy savannah, most probably to look for food. (Isn't this what also gets you off the couch and into the kitchen?) It was most probably the shift in climate towards hotter and drier weather and, in response to that, the shrinking of their woodlands, that pushed them to expand their foraging perimeter out into the plains that were growing as the forests were shrinking.

Our first australopith ancestors—descendants of the common ancestor we share with modern chimpanzees—would in all likelihood have been covered in hair with pale skin underneath (just as chimps are today), their exposed skin growing darker over time with exposure to sunlight. Having left the forest cover, they were now exposed to the scorching Sun most of the day while walking around looking for food, before going back to the forest's edges to sleep in the trees.

Natural selection would now favour the development of ways to stay cool and not overheat. This meant more sweat glands to increase cooling by evaporation of water on the surface of the skin. It also meant less hair for the cooling contact of the air with the wet skin to be as effective and efficient as possible. But less hair implied that the skin was now directly exposed to sunlight. To protect itself from burns and DNA damage, but also to protect folate, natural selection pushed towards darker skin: more melanocytes producing more melanin to absorb more photons and avoid burning and DNA damage.

In these circumstances, the problem was never too little sun exposure; it was too much exposure, and thus sunburns and folate deficiency. So these early hominids gradually—and by gradually is meant over tens of thousands of years—became less hairy and darker-skinned. They also became taller and leaner, with narrow hips and long thin limbs: this gave less surface area exposed to the overhead sun but more skin surface area for sweating and cooling down, together with better mechanical efficiency in walking and running across what would appear to us very long distances—tens of kilometres every day, day after day—in foraging and hunting, always under a blazingly hot sun. This process, described here in a few sentences, took place over millions of years: at least 3 or 4, and most probably 5 or 6 million. The Turkana boy, a 1.6-million-year-old fossilised skeleton, is clear evidence that by that time hominids were already narrow-hipped and relatively tall.

From an evolutionary standpoint it couldn’t be any other way. While keeping in mind that we are still talking about ancient human ancestors, and not modern homo sapiens, nonetheless, did you, as you were reading these sentences, start to wonder who today would fit such a physical description of being hairless, dark-skinned, tall, lean and narrow hipped? Naturally: savannah dwelling modern hunter-gatherers, and, of course, the world’s best marathon runners. It makes perfect sense, doesn’t it?

Taking all currently available archaeological, paleontological, anthropological, molecular and other scientific evidence as a coherent whole brings us to the most plausible scenario: all humans on the planet today descend from a single mother who was part of a community living somewhere on the western coast of Africa; it is this group of modern Homo sapiens that first developed and used symbolic language to communicate and transmit information and knowledge acquired through their personal and collective experiences; and it was descendants of these moderns who migrated in small groups, in a number of waves, first into Asia and later into Europe, starting 70 to 100 thousand years ago.

It is very interesting that we also have evidence that moderns had settled areas of the Middle East, in today's Israel and Palestine region, as early as 200 thousand years ago, and that these moderns shared the land and cohabited with Neanderthals for at least 100 thousand years, using the same rudimentary tools and technologies, without apparently attempting to improve upon the tools they had. Meanwhile, the group of western African coast moderns had far more sophisticated tools that combined different materials (stone, wood, bone), as well as decorative ornaments and figurines.

Thus, although equal or close to equal in physical structure, appearance, dexterity and skills—a deduction based on fossils and evidence that newer and better tools were immediately adopted and replicated in manufacture by moderns to whom they were introduced by other moderns—it is clear that different and geographically isolated communities of moderns ate differently, lived differently, developed differently and at different rates.

This is not surprising, really. Some children start to speak before they turn one, while others do not until they are two, two and a half or even three. Some children start to walk at 10 or 11 months, while others just crawl on the ground, or even drag their bum in a kind of seated crawl, until they are three or more. And this is for children who watch everyone around them walking all day long, and listen to everyone around them speaking in complex language all day long. Now, what do you think would happen if a child grew up without being exposed to speech? Why would they ever, how could they ever, start to speak on their own, and to whom would they speak if nobody spoke to them?

Fossil evidence shows that the structures in the ear and throat required for us to make the sounds needed for refined speech and verbal communication were in place (at the very least 200 thousand years ago) tens and even hundreds of thousands of years before the first evidence of symbolic thought (70-50 thousand years ago) and, together with it, it is assumed, advanced language.

Symbolic thinking in abstract notions and concepts is the most unique feature of our species. It is the hallmark of humans. And it is the most useful and powerful asset we have in the evolutionary race for survival. Sophistication in symbolic thought can only come with sophistication in language and in the aptitude for language: it is only by developing and acquiring more complex language skills that more complex symbolic thinking can come about, and more sophisticated symbolic thinking naturally leads to developing a more sophisticated and refined language in order to have the means to express it.

It’s surely essential to recognise that this is as true for our ancestors, those that developed that first symbolic language, as it is for you and me today. The difference is that then, the distinction was between those few moderns that used symbolic language and those that didn’t, whereas today, the distinction is more subtle because everyone speaks at least one language to a greater or lesser extent. Nonetheless, anyone can immediately grasp what is described here by listening to Noam Chomsky lecture or even just answer simple questions in the course of an interview.

As they moved northward, settling in different places along the way, staying for thousands or tens of thousands of years, then leaving their settlements behind, either collectively or in smaller groups, and moving on to higher latitudes before settling again somewhere else, these people encountered a wide range of climates and geographical conditions: usually colder, sometimes dry and sometimes wet, sometimes forested and sometimes open-skyed, sometimes mountainous and sometimes flat. In all cases, they were forced to adapt their living conditions immediately, building suitable dwellings and making adequate clothing. This we know for sure, because they would simply not have survived otherwise, and it is only those that did survive who are our direct ancestors.

Evolutionary adaptation through natural selection of traits and characteristics arising from small—and, on their own, insignificant and typically unnoticeable—random genetic mutations also took place, as it does in every microsecond and in every species of animal and plant. But this, we know, is a slow process, measured on the timescale of tens of thousands of years (10, 50, even 100 thousand). Now, consider the evolutionary pressure—the ultimate evolutionary pressure—of giving birth to healthy and resilient offspring that will grow up to learn from, take care of, and help their parents. The most pressing evolutionary need at these higher latitudes was for the body to make and store vitamin D more efficiently from the incoming UV-B rays, which (and this is an important detail, often overlooked or underappreciated) make it to the surface only when the Sun is high in the sky and has less atmosphere to shine through. This stringent restriction to the few hours near midday when UV-B can make it to the surface is both constraining and life-saving: constraining because only during those hours can the essential vitamin D be made, and life-saving because continual exposure to this energetic, DNA-damaging UV radiation would in time sterilise the surface of the entire planet.

The higher the latitude, the lower the Sun's path in the sky throughout the year, and especially during the winter months. Therefore, the shorter the season during which UV-B rays reach the surface and during which it is possible for vitamin D to be produced on the skin or fur of animals. The only solution to this severe evolutionary pressure is as little body hair and as little pigmentation as possible (think of the completely white polar bears, arctic wolves, foxes and rabbits). As an aside, what else do you think is advantageous in the cold? The opposite of what is advantageous in the hot sun: more volume for less surface area—a smaller, stockier build that keeps heat in better, exactly as we see in the cold-adapted Neanderthal.
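To make the latitude argument concrete, here is a minimal sketch (my own illustration, not from the original text) of how high the noon Sun gets at different latitudes, using the standard approximation that noon solar elevation is 90° minus the distance between the latitude and the solar declination (±23.44° at the solstices):

```python
# Illustrative sketch: the Sun's maximum height at solar noon is roughly
# 90 degrees minus |latitude - solar declination|. The declination is
# +23.44 degrees at the June solstice and -23.44 in December.
# UV-B only gets through the atmosphere efficiently when the Sun is high,
# so higher latitudes have a much shorter "vitamin D season".

SOLSTICE_DECLINATION = 23.44  # degrees, Earth's axial tilt

def noon_elevation(latitude_deg: float, declination_deg: float) -> float:
    """Solar elevation at local noon, in degrees (clipped at the horizon)."""
    elevation = 90.0 - abs(latitude_deg - declination_deg)
    return max(elevation, 0.0)

for lat in (0, 30, 45, 60):
    summer = noon_elevation(lat, SOLSTICE_DECLINATION)
    winter = noon_elevation(lat, -SOLSTICE_DECLINATION)
    print(f"lat {lat:2d}N: noon Sun at {summer:5.1f} deg in June, {winter:5.1f} deg in December")
```

At 60°N the December noon Sun sits only a few degrees above the horizon, which is exactly the regime in which essentially no UV-B reaches the ground.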

Settled in a place that provides what we need to live relatively comfortably, we tend to stay there. This has always been true, and even if it has changed in the last few generations in industrialised western countries, we have witnessed this phenomenon up until very recently on islands like Sardinia, Crete or Okinawa, in remote valleys of the Swiss Alps, the Karakoram, the Himalayas or the Andes, and in other geographically isolated pockets of people with genetic characteristics homogeneous amongst themselves but distinct with respect to other human populations. And thus across the world we find a whole spectrum—a rainbow—of different colours and shades of skin, colours of hair and eyes, amounts and textures of body hair, physical builds and morphologies, and metabolic and biochemical sensitivities, all on a continuum, all dependent upon the evolutionary history of the subpopulation in which particular characteristics are present or absent to a greater or lesser extent. And all of this was driven by the evolutionary pressure to adapt and maximise the survival probability of our offspring, our family, our clan, our species, by optimising the amounts of folate and vitamin D through a delicate balance: not too little of the latter from under-exposure to the UV-B rays that produce it, and not too little of the former from excessive exposure to the same UV-B rays that destroy it.

What this tells us, for one thing, is that we have absolutely nothing to do with the colour of our skin, eyes and hair, and nothing to do with any of the physical and biochemical characteristics we have inherited. It tells us that this has nothing to do with our parents or grandparents either, really, because these are particularities that have evolved over tens of thousands of years in a very long line of ancestors who settled in a place, stayed put, and lived at a particular latitude in a particular geographical setting with a particular climate. And it tells us, in the most obvious manner, that because this is so, discrimination based on colour or physical features is not just unfounded, but simply absurd.

If you're black, you're black. If you're white, you're white. If you're chocolate or olive-skinned, then you're chocolate or olive-skinned. If you are "yellow" or "red", then that's just how it is. And who cares how you phrase it, or whether you try to be "politically correct" and avoid speaking of it. That's just silly. All of it is simply the way it is. In the same way, if you're short or tall, hairy or not, thin or stocky, it is just the way it is. However you are, and whatever features you consider, there is never anything more or less about any of them: each is an expression of our genetic ancestry going back not just a few but hundreds of thousands of generations.

What this also tells us is that we have to take this information into account in everything we do, especially in regards to what we eat, where we live, and how much or how little we expose ourselves to the Sun's vitally important UV-B rays. Disregard for these fundamentally important details leads to what we see in the world in this modern era, where we all live more or less wherever we want: we find ourselves with olive or dark brown skin living at high northern latitudes, or with fair or milk-white skin living near the equator under a strong overhead sun all year round, with consequent high rates of vitamin D deficiency and rickets in our dark-skinned northern dwellers, together with similarly high rates of folate deficiency and spina bifida in our fair-skinned southern dwellers.

In general, if you are dark-skinned you need to expose your skin to the sun a lot more than if you are fair-skinned, because you will both produce less vitamin D and store less. If you are fair-skinned you need less exposure and will tend to store the vitamin D more efficiently for longer periods of time. As for folate, we all need to eat (or drink) leafy greens (i.e., foliage) and green veggies.

However, there is an additional complication that makes matters worse (far worse). That complication is that in this day and age, we all live inside, typically sitting all day facing a computer screen, and sitting all evening eating supper and then watching TV. Not everyone, of course… but most people. Not only that, but most of us all over the world now eat more or less the same things: highly processed packaged foods, usually high in processed carbs and low in good, unprocessed fats, high in chemicals of all kinds and low in nutrients, and hardly any leafy and green veggies or nuts and seeds. And boy do we love our Coke, our daily bread, our fries and potatoes, our pizzas and big plates of pasta, and our sweets and desserts! Not everyone, of course… but most people. Consequently, we are as deficient in folate as we are in vitamin D. We are as deficient in unprocessed fats and fat-soluble vitamins as we are in all other essential micronutrients. How depressing.

But once we know this, once we have been made aware of this situation, we can correct the problem by switching to a diet of whole foods—of real foods—rich in folate and in fat-soluble vitamins like A, D, E and K2 (the Inuit, for example, get all their vitamin D and the other fat-soluble vitamins from the fat of the whales and seals they eat), and by supplementing adequately to maintain optimal levels of both vitamin D (80-100 ng/ml or 200-250 nmol/L) and folate (>5 ng/ml or >11 nmol/L), especially during conception, pregnancy and early childhood, but also throughout life and into old age.
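As a sanity check on the two unit systems quoted above: the ng/ml and nmol/L figures are related by fixed conversion factors, about 2.5 for 25-hydroxyvitamin D (the form measured in blood) and about 2.27 for folate (derived from the molar mass of folic acid, ~441.4 g/mol). A small sketch, with the factors as stated assumptions:

```python
# Hedged sketch: convert the ng/ml targets quoted above into nmol/L.
# 1 ng/ml = 1 ug/L, so the factor is 1000 / molar mass (g/mol).

VITD_NG_TO_NMOL = 2.5      # widely used factor for 25(OH)D
FOLATE_NG_TO_NMOL = 2.266  # approx. 1000 / 441.4 g/mol for folic acid

def ng_per_ml_to_nmol_per_l(value_ng_ml: float, factor: float) -> float:
    """Convert a blood concentration from ng/ml to nmol/L."""
    return value_ng_ml * factor

# Vitamin D target of 80-100 ng/ml corresponds to 200-250 nmol/L
print(ng_per_ml_to_nmol_per_l(80, VITD_NG_TO_NMOL))   # 200.0
print(ng_per_ml_to_nmol_per_l(100, VITD_NG_TO_NMOL))  # 250.0

# Folate minimum of 5 ng/ml corresponds to about 11 nmol/L
print(round(ng_per_ml_to_nmol_per_l(5, FOLATE_NG_TO_NMOL), 1))  # 11.3
```

The two pairs of numbers in the text are therefore internally consistent: 80-100 ng/ml maps onto 200-250 nmol/L, and 5 ng/ml onto roughly 11 nmol/L.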

There's one last thing I wanted to mention before closing, in which you might also be interested: can we ask whether one is more important than the other, folate or vitamin D, and do we have a way to answer this question from an evolutionary standpoint? Well, here is something that suggests an answer: in all races of humans on Earth, women are on average about 3% lighter in skin colour than men of the same group. For decades, researchers (mostly old men, of course) were satisfied with the conclusion that this was the result of sexual selection, in the sense that men preferred lighter-skinned women and so this is how things evolved over time. Of course, most of you will agree with me that this just sounds like a cop-out, or at best a shot in the dark from a possibly sexist male perspective.

Most of you will surely also agree that considering the question from the perspective of the importance of vitamin D versus folate is clearly more scientific in spirit than invoking sexual selection to explain the difference. And if women are lighter than men no matter where we look on Earth, this strongly suggests that vitamin D is either more difficult to build up and maintain at good levels to ensure healthy offspring, or simply more important. In today's world, it is certainly far easier to have good levels of folate, because even if you stay inside all day, as long as you eat leafy greens or drink green juice your folate levels will easily be higher than the optimal minimum of 5 ng/ml, and probably much higher—like mine, which are five times higher than that, at 25 ng/ml.

So, for us today, especially if we eat greens, there is no question that we have to pay much closer attention to our vitamin D levels that tend to be way too low across the board all over the world. We can hypothesise that if we continue evolving over millennia following this indoors lifestyle that we have, humans everywhere will continue to lose both body hair and pigmentation, even those who live in sunny countries, because they don’t expose themselves to the Sun. I would like to encourage you to instead expose your skin to the amount of sunlight that is in accord with your complexion, drink green juice, monitor your vitamin D levels at least once per year, and take supplements to ensure both stay in the optimal range (I recommend taking A-D-K2 together to ensure balance between them, better absorption and physiological action). That alone, even if you don’t do anything else, will be of great benefit to you, and, if you are a soon-to-be or would-like-to-be mother, of even greater benefit to your child or children.

And next time you go out, and each time after that, pay attention, look and appreciate the amazing richness and beauty of all the different skin colours and unique physical features of all the people you see all around. What you will be seeing is the inestimable richness and incalculable pricelessness of our collective human ancestry expressing itself vividly and openly, nothing held back and nothing hidden, for everyone to see and appreciate.

If you are interested in reading more about the topics touched upon in this article, its contents draw from the books Life in the Universe, Rare Earth, Masters of the Planet, The Story of the Human Body and the Scientific American special issue Evolution that features the article, entitled Skin Deep, that prompted me to write this post. And please share this post: we all need to do what we can to help overcome discrimination based on race and appearance.


Updated recommendations for magnesium supplementation

Daily magnesium supplementation is more than a no-brainer: it is really very important, and this for everyone. I hope that I managed to convey just how important it is in Why you should start taking magnesium today, Treating arthritis I, as well as in At the heart of heart disease. In terms of supplementation, however, I would like to refine my recommendations.

Nigari, or magnesium chloride, is excellent because it is inexpensive and easily absorbed. I continue to stand by this, and continue to use it very regularly. However, I now only use it transdermally (on the skin), and recommend you do the same. The reason is very simple. Taking it internally is fine, but because absorption goes through the digestive system, at most an estimated 25% will be absorbed, and the rest will be eliminated.

And how will it be eliminated? Naturally, through the stools. After using a 2% nigari-water solution orally for several months (even with some breaks, as recommended by proponents of this manner of magnesium supplementation), I found that my colon gradually became more and more irritated (which could be felt when passing stools and wiping). When I stopped supplementation for a few days, the irritation would subside; when I started again, it would come back. After checking this a couple of times, it became clear that it was indeed oral supplementation with magnesium chloride that was causing the irritation in the colon.

But why even bother taking magnesium chloride orally when it is far better absorbed through the skin? Magnesium oil (a 20-30% nigari-water solution), left on the skin for 30 minutes, works great, but the most pleasant option is definitely a 30-minute bath spiked with a cup of nigari flakes. This is without a doubt the most effective and most agreeable way to supplement, while ensuring maximum absorption of the magnesium ions so critically needed by cells in tissues throughout the body.
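For the numerically inclined, the concentrations mentioned above are easy to work out. Here is a little back-of-the-envelope sketch in Python; the figures are assumptions for illustration only: nigari is treated as if it were pure magnesium chloride hexahydrate (MgCl2·6H2O, roughly 12% elemental magnesium by mass), and the 25% figure is the oral absorption estimate quoted earlier.

```python
# Rough magnesium arithmetic, a sketch under stated assumptions:
# - nigari treated as pure MgCl2*6H2O (~12% elemental Mg by mass)
# - oral absorption capped at ~25%, as estimated above

MG_FRACTION = 24.3 / 203.3   # grams of elemental Mg per gram of MgCl2*6H2O

def elemental_mg(grams_nigari):
    """Milligrams of elemental magnesium in a given mass of nigari."""
    return grams_nigari * MG_FRACTION * 1000

# A 2% drinking solution: 2 g nigari per 100 ml of water.
per_100ml = elemental_mg(2)          # ~239 mg Mg per 100 ml
absorbed_orally = per_100ml * 0.25   # ~60 mg actually absorbed

# A 25% 'magnesium oil': 25 g nigari per 100 ml of water.
oil_per_100ml = elemental_mg(25)     # ~2990 mg Mg per 100 ml applied to the skin

print(f"2% solution: {per_100ml:.0f} mg Mg/100 ml, ~{absorbed_orally:.0f} mg absorbed")
print(f"25% oil:     {oil_per_100ml:.0f} mg Mg/100 ml")
```

The point of the sketch is simply that a small volume of the concentrated oil carries an order of magnitude more magnesium than the drinking solution, before even accounting for the digestive losses.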

Having said that, I recognise that having a bath every day is time consuming, only really tempting when the weather is cool, and also wasteful in terms of water usage. We therefore skip baths when it is hot, and should restrict them to a maximum of three per week in the cold season, using the least amount of water possible, with really short showers on the days in between to keep water consumption as reasonable as we can. In the end, magnesium oil is far more environmentally friendly: it works all year round and does not increase water consumption.

As an aside relating to hot water usage and energy efficiency: because heat loss is always directly proportional to the difference in temperature between ‘inside’ and ‘outside’, we should set the temperature on our hot water heater to the minimum useable temperature. This minimises heat loss and, consequently, energy consumption for water heating. I have determined that temperature to be 41-42 C. These temperatures are also perfect for washing the dishes, hands or face, for showering, and for running a bath that is hot (but not too hot) when you get in, and after 25-30 minutes is still warm enough to feel comfortable without any hint of cold, yet not so hot that you can’t stand it any longer, or find yourself sweating for half an hour after getting out. (Actually, 40 C is perfect for a shower, dishes, hands and face; we need 1 or 2 degrees more for a bath because of heat losses into the tub and the air.)
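The proportionality argument can be made concrete with a quick comparison. The sketch below assumes a hypothetical tank loss coefficient (only the ratio between the two settings matters) and compares standby heat loss at a typical 60 C factory setting versus the 42 C minimum useable temperature, with the room at 20 C.

```python
# Standby heat loss is proportional to (tank temp - ambient temp).
# The loss coefficient k is hypothetical; only the ratio is meaningful.

def standby_loss(tank_c, ambient_c, k=1.0):
    """Relative standby heat loss of a hot water tank."""
    return k * (tank_c - ambient_c)

ambient = 20.0
typical = standby_loss(60, ambient)   # common factory setting: delta of 40 C
reduced = standby_loss(42, ambient)   # minimum useable temperature: delta of 22 C

savings = 1 - reduced / typical
print(f"Standby loss reduced by {savings:.0%}")   # (40 - 22) / 40 = 45% less loss
```

In other words, simply turning the thermostat down from 60 to 42 C cuts standby losses by nearly half, which is exactly why the minimum useable temperature is the right setting.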

Naturally, the exact ideal hot water temperature is a personal thing that depends on many factors, surely most importantly on body composition and especially basal body temperature, which in turn depends on metabolism. In my case, basal body temperature is as low as can be, since my metabolism runs almost exclusively on fat, and you’ll remember that fat burns cool while carbs and protein burn hot. Anyway, you need to experiment a little, but I’m pretty sure that you will find your ideal hot water temperature between 40 and 43 C.

Because magnesium is water soluble and used up as needed throughout the day, the body must be supplied with it daily. Eating foods rich in magnesium is essential (almonds and greens are the best), but this is typically not enough, and oral supplementation is quick and easy. Fortunately, the perfect magnesium supplement is now available: ReMag, designed and marketed by Dr. Carolyn Dean (the doctor who wrote The Magnesium Miracle), who guarantees that it is 100% absorbed by the cells because it comes in a form small enough to pass through the 400-500 picometre ion channels that regulate mineral absorption and excretion through cell membranes, and that therefore none of it is eliminated through the digestive system as most forms of magnesium supplements are. (You can read her e-book about it here, and watch this recent video on Mercola’s site.)

So, these are my updated recommendations for magnesium supplementation:

  1. Magnesium oil on the skin for a couple of months to quickly replenish cellular magnesium levels,
  2. A bath with 1-2 cups of nigari flakes, once or twice a week, and
  3. L-Threonate (liposomal) or ReMag (pico sized) taken orally.

This is really important for everyone, but crucial for any person suffering from any kind of illness or disease condition whatsoever.

If you enjoyed reading this article, please click “Like” and share it on your social networks. This is the only way I can know you appreciated it.

B12: your life depends on it

There are very few nutrients as crucial to our well-being as vitamin B12. The reason is that vitamin B12 is essential for cellular energy metabolism, gene transcription, and nervous system function. This role at the cellular level is not restricted to some tissues and organs: it is vital for every single cell of every tissue and every organ.

For the nervous system—both the central nervous system, our brain and spinal cord, and the peripheral nervous system, the entire network of nerves connected to them and coursing through the whole body—vitamin B12 is essential in building, maintaining and repairing the myelin sheath that covers every nerve to ensure protection and proper nerve signalling. It is, in fact, the consequences of B12 deficiency on the nervous system that most often betray this very serious problem.

Everyone should supplement and maintain blood levels of B12 in the range from 600 to 2000 pg/ml in order to avoid, and if necessary help recover from, the wide range of problems that result from B12 deficiency or insufficiency. Health care practitioners: this is the first thing you should check for every patient who comes in, independently of their age or condition.
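A practical note on units: many labs outside North America report serum B12 in pmol/L rather than pg/ml, so it is worth knowing how to convert the range above. The sketch below uses the molar mass of cobalamin (about 1355 g/mol for cyanocobalamin), which gives roughly 0.74 pmol/L per pg/ml; the function name is mine, for illustration.

```python
# Convert serum B12 readings between the two common lab units.
# Based on the molar mass of cobalamin (~1355 g/mol, cyanocobalamin),
# so 1 pg/ml is roughly 0.74 pmol/L.

B12_MOLAR_MASS = 1355.4  # g/mol

def pg_ml_to_pmol_l(pg_ml):
    """Convert a serum B12 reading from pg/ml to pmol/L."""
    return pg_ml / B12_MOLAR_MASS * 1000

low, high = 600, 2000  # the target range from the text, in pg/ml
print(f"{low} pg/ml ~= {pg_ml_to_pmol_l(low):.0f} pmol/L")    # ~443 pmol/L
print(f"{high} pg/ml ~= {pg_ml_to_pmol_l(high):.0f} pmol/L")  # ~1476 pmol/L
```

So if your lab report reads, say, 300 pmol/L, that is about 405 pg/ml, well below the 600 pg/ml floor recommended here.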

What is vitamin B12 and how is it absorbed?

B12, or cobalamin, is a large molecule whose central atom is cobalt, around which various chemical groups are arranged. To be active in the body, the cobalamin molecule must be in one of its two coenzyme forms—methylcobalamin or adenosylcobalamin—both of which require the cobalt to be in a charge state of +1. Cobalamin can also exist in two other charge states, +2 and +3, but neither of these is bio-active. Its most powerful antagonist is nitrous oxide (N2O, laughing gas), which is still commonly used as an anaesthetic agent during surgical operations: it inactivates the molecule by pushing the cobalt ion from a charge state of +1 to +2 or even +3.

Cobalamin is produced in the gut of animals by specific bacteria that form part of the intestinal flora. Although this can also be true for humans, we have mostly relied on animals for our supply, both by eating them and by eating products derived from them, like eggs and dairy. In animal foods, cobalamin is always bound to protein, from which it must be separated in order to be used. This separation can only begin in the highly acidic environment of a well functioning stomach that secretes enough hydrochloric acid, but also enough Intrinsic Factor and pepsin.

Cobalamin is carried into the duodenum—the first part of the small intestine—bound to salivary B12-binding proteins, which are then broken down by pancreatic proteases. This allows the free B12 to attach to Intrinsic Factor and make its way to the ileum—the very last part of the small intestine—where it penetrates the mucosal wall for absorption. Finally, the free cobalamin latches onto the plasma transport protein transcobalamin II, whose function is to carry it to cells throughout the body. Any excess, unneeded at any given time, is carried to the liver where it is stored.

Where do we get B12?

That herbivores like sheep, goats and cows, which thrive eating only grass, do not suffer from B12 deficiency, while most of us humans tend to (estimates from various large scale studies range between 40 and 80%), points to two key issues at the heart of this problem:

One, we have evolved and survived as a species over several million years by eating animals. Some believe that it was the very eating of animal foods—perhaps specifically bone marrow, the only leftover after carnivore predators like lions, and then all the other scavengers that were also predators for us, like wolves and jackals, had eaten all they could, and the one food that only humans could reach by breaking apart the bones—that allowed the brain to grow in size over a relatively short evolutionary period, setting us apart from our primate ancestors and cousins. Whatever the case may be, the human organism as a whole grew accustomed to, and became reliant upon, an external supply of vitamin B12 from animal sources.

Two, even with the healthiest—let’s say ideal or perfect—intestinal flora, we humans will most certainly have a very different flora from those of the herbivores we domesticated, and it will arguably always be much less capable and much less efficient at producing cobalamin from the plant foods we eat. Moreover, even if B12 is manufactured by some of the bacteria in a perfectly healthy colon—the large intestine—it will still not easily make it into circulation because, as we saw, absorption of cobalamin takes place in the ileum, the last part of the small intestine, which is upstream from the large intestine. The manufactured B12 would somehow have to migrate backwards from the colon to the ileum, most likely a very difficult thing to do.

The first point is supported by ample archaeological, anthropological and evolutionary-biological evidence. In fact, it turns out that our hominid ancestors most certainly lived through the bulk of our evolutionary history during periods of glaciation, when the land over most of the Earth’s surface was covered in ice. This implies a marked absence of plant life in most places on Earth, and therefore an absolute reliance on animals for survival: eating virtually only animals, which in turn ate virtually only other animals and fish, which ate smaller fish, and so on down the food chain to those feeding on sea-borne plant foods. The Inuit, who basically live on whale blubber, are the perfect example of such a scenario. But this could well have been the scenario for a lot of the humans that populated the Earth, and for a good portion of our history spanning the last 2.5 million years.

The second point is hypothetical, but on firm footing: it is indisputable that the gut flora of a herbivore is substantially different from ours, and that we simply cannot survive for very long on greens alone as sheep, goats, cows and all other herbivores do. Furthermore, most humans in fact have a dysfunctional digestive system, with heavily compromised and impaired intestinal flora. As a consequence, even those who eat adequate or even large amounts of B12-rich animal foods usually cannot benefit from them, because the cobalamin simply doesn’t make it into the bloodstream, stopped by any one of several possible impediments along the ingestion-breakdown-absorption chain.

This is not to say that our digestive flora cannot produce some B12 from plant-based foods, but the evidence shows us that it definitely cannot produce enough, whatever the reason: studies have shown that although B12 deficiency is of the order of 40% in the general omnivore population, it is 50% in vegetarians, and up to a staggering 80% in long-term vegans (see Chapter 6 of Could it be B12? and references therein).

Why is B12 deficiency such a big deal?

Well, let’s ask another question instead: What would happen if the myelin sheath that covers the nerves in our body—peripheral, spinal and brain—were to deteriorate?

Neurological symptoms would include: numbness, tingling and burning sensations in the hands, fingers, wrists, legs, feet or truncal areas; Parkinson-like tremors and trembling; muscle weakness, paraesthesia and paralysis; pain, fatigue and debility labelled as chronic fatigue syndrome; shaky legs and unsteadiness; dizziness and loss of balance; weakness of the extremities, clumsiness, twitching, muscle cramps, lateral and multiple sclerosis-like symptoms; visual disturbances, partial loss of vision or blindness. And the list goes on.

Psychiatric symptoms? Confusion and disorientation, memory loss, depression, suicidal tendencies, dementia, Alzheimer’s, delirium, mania, anxiety, paranoia, irritability, restlessness, manic depression, personality changes, emotional instability, apathy, indifference, inappropriate sexual behaviour, delusions, hallucinations, violent or aggressive behaviour, hysteria, schizophrenia-like symptoms, sleep disturbances, insomnia. And here again, the list goes on.

At the cellular level, every cell would be unable to adequately produce energy, be it from glucose or from fat. We can easily extrapolate and imagine what it would mean for the organism as a whole to have a lack of, or severe debility in the energy available to it at the cellular level, and this, for the trillions of cells throughout. This would have a most profound effect on everything that we do, and everything that the body does throughout the day and night.

Now consider a yet deeper level: in the nucleus of every cell, where the genes are protected and cared for, the transcription and replication of genes—those delicate operations necessary and vital for the continual renewal, repair and reproduction of cells—must and do take place throughout our life: a long succession of infinitesimal instants, almost universally absent from our conscious perception, yet a very long timescale at the cellular level, where movements and interactions take place at phenomenal speeds. Vitamin B12 is absolutely essential for this too. And if it’s missing? Unintended, unplanned and unwanted genetic mutations from errors in transcription. This means problems; very serious problems.

Who should be concerned about all this?

The short answer is: everyone. This means you, but also your kids as well as your parents. It means infants, toddlers, children, teenagers, young adults, mature adults, the middle aged, the elderly, and the oldest among us: absolutely everyone.

For the longer answer, it would appear to be the case that we are, or at least should be, born with a good B12 reserve, and that, as it is used over time, the amount in the body and blood slowly decreases as the reserves get used up and eventually depleted. Some consider this to be the normal state of affairs. This inevitably implies that those at greatest risk of suffering from B12 deficiency are the oldest, and also that the older we get, the greater our chances of becoming victims of the effects of this deficiency. And this is indeed what we find: practically everyone above the age of 60 is B12 deficient, and more often than not, severely deficient (serum B12 < 200 pg/ml).

It is therefore not really surprising that every single behavioural characteristic—intellectual, psychological, emotional, physiological and physical—associated with ageing and its multiple manifestations in the elderly, senior moments in all their different forms—memory problems, disorientation, inability to concentrate or even pay attention, frailty, weakness, unsteadiness, loss of balance, and so on—are all typical symptoms of B12 deficiency.

Could it be that all these characteristics of old age are actually the characteristics of B12 deficiency? Could it be that if we didn’t let B12 levels drop below 600 pg/ml and actively maintained them around 1000 pg/ml throughout life, that seniors would simply not manifest any of these signs of old age? Maybe. Maybe even most probably. What an entirely different world it would be: strong and healthy, energetic and vibrant, sharp and alert old people. Sounds great, doesn’t it? And hard to imagine, isn’t it? But wouldn’t that be wonderful, for everyone, and especially for the elderly themselves?

As alluded to a moment ago, we should be born with a large B12 reserve. It is particularly important to have a plentiful supply of B12 throughout our development in the womb, during infancy, and up to about 7 years of age. Why is it so important? Because our nervous system develops fastest while we are in our mother’s womb, then during infancy and toddlerhood, until it reaches maturity by the time we are about 7, and because cobalamin is essential for this development.

The complication, however, a point of crucial importance, is that only B12 consumed by the pregnant mother at first, the breast-feeding mother afterwards, and finally by the toddler can be used to ensure an optimal development and building of a healthy brain and nervous system. Even if the mother had good B12 levels before, during and following pregnancy, only fresh B12 can be used in the developing child. So, if she doesn’t consume much or any during this critical period, the unborn child and infant will have only a meagre or non-existent supply of cobalamin, and consequently, impaired—often severely—brain and nervous system development.

This is a very serious matter. In fact, for many infants, it is a matter of life or death. Or just barely less dramatic but maybe even worse in some respects, it can make the difference between a normally healthy brain and nervous system, and permanent developmental disability, both physical and intellectual, right down to a full or partial vegetative state for a whole lifetime.

All of this shows why B12 deficiency tends to be not only transmitted, but to worsen in severity from one generation to the next, with all the negative consequences that come with it, but most notably those that affect the brain and all cognitive functions. Terribly sad and unfortunate as it is, numerous studies and reports on the babies of vegetarian but especially vegan and macrobiotic mothers have shown very serious neurological problems, developmental delays as severe as stunted brain growth and death, but also that even mild deficiencies in infancy are associated with seriously impaired cognitive performance in adolescence and adulthood. I cannot stress this enough: B12 deficiency is really very serious.

Now, between the oldest and the youngest there is everyone else. If we are born with an excellent B12 status, then we are lucky and likely to be able to make it to old age without any apparent problems in this regard. If we are born B12-deficient, then we are most certainly likely to suffer from it greatly, and this, much sooner than later. And if we are born with anything in between, an intermediately good or bad B12 status, then problems will appear later in life, or sooner, depending on many other factors, but most importantly on how much cobalamin we consume, and how well it is absorbed. Consequently, manifestations of cobalamin deficiency can appear at the age of a few months or a few years; as a child or teenager; as a young adult or person in their prime; near retirement or in old age; or it may also never become apparent. Unfortunately, this condition is continuously growing in importance, the people it affects growing in number, and the reported cases growing in severity.

Unfortunately, and extremely sadly for way too many people whose bodies, minds and lives are destroyed by an undiagnosed deficiency, B12 is not something that doctors routinely check or know much about. Most of them believe that it will show up in the complete blood count (CBC) panel, either as enlarged red blood cells (megaloblastic anaemia) or as fewer red blood cells (pernicious anaemia). But by the time you get there, you have been suffering the ravages of B12 deficiency for a while already, and have thus almost certainly also already suffered permanent neurological damage. So, for your own sake, don’t wait for your doctor to notice this. Instead, teach them about it. You will be doing them and their patients an immense favour.

Closing with the good news

It is really easy to prevent and avoid becoming cobalamin deficient, but also to correct a deficiency that exists or even one that has persisted for several years or decades, no matter if you eat animal products or not, want to or not, think that you should or not. We must, very simply, check our B12 status regularly by measuring three markers—serum B12, plasma homocysteine (Hcy), and urinary methyl-malonic acid (MMA)—and make sure to supplement in order to raise and maintain B12 levels in the range between 600 and 2000 pg/ml, with concentrations of Hcy and MMA as low as possible. Pregnant and nursing mothers should maintain levels above 1000 pg/ml to ensure healthy nervous system development in their children.

(Both Hcy and MMA are toxic byproducts of protein metabolism that must be converted to benign and/or useable forms—the amino acid methionine, for example—by the action of B6, folic acid (B9) and especially B12. Here is a good information-dense compilation of B12/Hcy/MMA publications, and a transcript of an interview with John Dommisse, a psychiatrist and B12 expert, who published the above quoted serum B12 range as optimal in this authoritative paper cited in Could it be B12?, where I read about it.)

Supplementation should be with methylcobalamin—not cyanocobalamin—and should be as aggressive as needed depending on the result of the assessment. In cases where B12 levels are below 200 pg/ml, we should request methylcobalamin injections to be administered daily for 5-6 days, and then weekly until B12 reaches 2000 pg/ml. It should be maintained there at least until Hcy and urinary MMA have dropped significantly, and then monitored and maintained around 1000 pg/ml.

For anything else between 200 and 600 pg/ml and/or elevated Hcy or MMA, methylcobalamin patches are an effective way to get B12 levels up. In addition, oral supplementation, although the least effective of the three, still works surprisingly well compared to other supplements, and obviously cannot possibly hurt; it can only help. I recommend doing both patches and oral supplements until levels are around 1000 pg/ml, and then maintaining them with either one.
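The protocol described in the last two paragraphs amounts to a simple decision rule on the serum B12 result. Here is a sketch of it in Python—purely illustrative, with thresholds taken straight from the text and a function name of my own invention; it is obviously not medical software.

```python
# Decision sketch of the supplementation protocol described above.
# Thresholds in pg/ml come straight from the text; illustrative only.

def b12_protocol(serum_b12_pg_ml):
    """Map a serum B12 reading to the recommended course of action."""
    if serum_b12_pg_ml < 200:
        return ("deficient: methylcobalamin injections daily for 5-6 days, "
                "then weekly until serum B12 reaches 2000 pg/ml")
    elif serum_b12_pg_ml < 600:
        return ("insufficient: methylcobalamin patches plus oral supplements "
                "until levels are around 1000 pg/ml")
    else:
        return "in range: maintain around 1000 pg/ml with patches or oral"

print(b12_protocol(150))   # severely deficient case
print(b12_protocol(450))   # insufficient case
print(b12_protocol(900))   # maintenance case
```

Note that elevated Hcy or MMA alongside a reading in the 200-600 band should be treated the same way as the insufficient case, as explained above.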

Finally, it is very important to know that you cannot overdose on methylcobalamin B12: not one negative physiological side effect has been reported or is known from methylcobalamin supplementation. You cannot do yourself or anyone any harm by taking B12 as methylcobalamin in large quantities for a long time; you can only do yourself and others harm by allowing a deficiency, as mild as it may be, to develop or linger. This applies to everyone.
