Food fortification

Food fortification or enrichment is the process of adding micronutrients (essential trace elements and vitamins) to food. Sometimes it's a purely commercial choice to provide extra nutrients in a food, while other times it is a public health policy which aims to reduce the number of people with dietary deficiencies within a population. Staple foods of a region can lack particular nutrients due to the soil of the region or from inherent inadequacy of a normal diet. Addition of micronutrients to staples and condiments can prevent large-scale deficiency diseases in these cases.[1]

As defined by the World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO), fortification refers to "the practice of deliberately increasing the content of an essential micronutrient, i.e. vitamins and minerals (including trace elements) in a food irrespective of whether the nutrients were originally in the food before processing or not, so as to improve the nutritional quality of the food supply and to provide a public health benefit with minimal risk to health", whereas enrichment is defined as "synonymous with fortification and refers to the addition of micronutrients to a food which are lost during processing".[2]

Food fortification was identified as the second of four strategies by the WHO and FAO to begin decreasing the incidence of nutrient deficiencies at the global level.[2] As outlined by the FAO, the most commonly fortified foods are cereals (and cereal-based products), milk (and milk products), fats and oils, accessory food items, tea and other beverages, and infant formulas.[3] Undernutrition and nutrient deficiency are estimated globally to cause between 3 and 5 million deaths per year.[1]

The main methods of food fortification are:

1. Commercial and industrial fortification (e.g., of wheat flour, corn meal, and cooking oils)
2. Biofortification (breeding crops to increase their nutritional value, which can include both conventional selective breeding and genetic engineering)
3. Home fortification (e.g., vitamin D drops)

The WHO and FAO, among many other nationally recognized organizations, have recognized that there are over 2 billion people worldwide who suffer from a variety of micronutrient deficiencies. In 1992, 159 countries pledged at the FAO/WHO International Conference on Nutrition to make efforts to help combat these issues of micronutrient deficiencies, highlighting the importance of decreasing the number of those with iodine, vitamin A, and iron deficiencies.[2] A significant statistic that led to these efforts was the discovery that approximately 1 in 3 people worldwide were at risk for either an iodine, vitamin A, or iron deficiency.[5] Although it is recognized that food fortification alone will not combat this deficiency, it is a step towards reducing the prevalence of these deficiencies and their associated health conditions.[6]

In Canada, the Food and Drug Regulations outline specific criteria that justify food fortification, including the restoration of nutrients lost during processing.

There are also several advantages to approaching nutrient deficiencies among populations via food fortification as opposed to other methods. These may include, but are not limited to: treating a population without requiring specific dietary interventions or a change in dietary patterns, continuous delivery of the nutrient, no requirement for individual compliance, and the potential to maintain nutrient stores more efficiently if fortified foods are consumed on a regular basis.[4]

Several organizations such as the WHO, FAO, Health Canada, and Nestlé Research acknowledge that there are limitations to food fortification. Any discussion of nutrient deficiencies also raises the question of nutrient toxicities. Fortification of nutrients in foods may deliver toxic amounts of a nutrient to an individual and cause its associated side effects. As seen with the case of excessive fluoride intake below, the result can be irreversible staining of the teeth. Although this may be a minor toxic effect on health, there are several that are more severe.[8]

The WHO states that limitations to food fortification may include: human rights issues, in that consumers have the right to choose whether they want fortified products; the potential for insufficient demand for the fortified product; increased production costs leading to increased retail costs; the possibility that fortified products will still not be a solution to nutrient deficiencies among low-income populations who may not be able to afford the new product; and children who may not be able to consume adequate amounts of it.[2]

Food safety concerns led to legislation in Denmark in 2004 restricting foods fortified with extra vitamins or minerals. Products banned include: Rice Krispies, Shreddies, Horlicks, Ovaltine and Marmite.[9]

One factor that limits the benefits of food fortification is that isolated nutrients added back into a processed food that has had many of its nutrients removed are not always as bioavailable as they would be in the original, whole food. An example is skim milk that has had the fat removed and then had vitamin A and vitamin D added back. Vitamins A and D are fat-soluble rather than water-soluble, so a person consuming skim milk in the absence of fats may not absorb as much of these vitamins as one would from drinking whole milk. On the other hand, a nutrient added as a fortificant may have a higher bioavailability than it does from foods, which is the case with folic acid used to increase folate intakes.[10]

Phytochemicals such as phytic acid in cereal grains can also impact nutrient absorption, limiting the bioavailability of intrinsic and additional nutrients, and reducing the effectiveness of fortification programs.

Ecological studies have shown that increased B vitamin fortification is correlated with the prevalence of obesity and diabetes.[11] Daily consumption of iron per capita in the United States has dramatically surged since World War II and nearly doubled over the past century due to increases in iron fortification and increased consumption of meat.[12] Existing evidence suggests that excess iron intake may play a role in the development of obesity, cardiovascular disease, diabetes and cancer.[13]

Fortification of foods with folic acid has been mandated in many countries solely to improve the folate status of pregnant women to prevent neural tube defects, a birth defect which affected 0.5% (1 out of 200) of US births before fortification began.[14][15] However, when fortification is introduced, several hundred thousand people are exposed to an increased intake of folic acid for each neural tube defect pregnancy that is prevented.[16] In humans, increased folic acid intake leads to elevated blood concentrations of naturally occurring folates and of unmetabolized folic acid. High blood concentrations of folic acid may decrease natural killer cell cytotoxicity, and high folate status may reduce the response to drugs used to treat malaria, rheumatoid arthritis, psoriasis, and cancer.[16] A combination of high folate levels and low vitamin B-12 status may be associated with an increased risk of cognitive impairment and anemia in the elderly and, in pregnant women, with an increased risk of insulin resistance and obesity in their children.[16] Folate has a dual effect on cancer, protecting against cancer initiation but facilitating progression and growth of preneoplastic cells and subclinical cancers.[16] Furthermore, intake of folic acid from fortification has turned out to be significantly greater than originally modeled in pre-mandate predictions.[17] Therefore, a high folic acid intake due to fortification may be harmful for more people than the policy is designed to help.[15][16][18][19]
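To illustrate the order of magnitude behind this trade-off, consider a rough calculation (the figures below are assumed round numbers for illustration, not values taken from the cited studies): if roughly 300 million people consume fortified grain products and fortification prevents on the order of 1,000 neural tube defect pregnancies per year, then

\[
\frac{3 \times 10^{8}\ \text{people exposed}}{1 \times 10^{3}\ \text{pregnancies prevented per year}} = 3 \times 10^{5}\ \text{people exposed per case prevented},
\]

which is consistent with the "several hundred thousand" figure cited above.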

There is a concern that micronutrients are legally defined in such a way that does not distinguish between different forms, and that fortified foods often have nutrients in a balance that would not occur naturally. For example, in the U.S., food is fortified with folic acid, which is one of the many naturally-occurring forms of folate, and which contributes only a minor amount to the folates occurring in natural foods.[20] In many cases, such as with folate, it is an open question whether there are any benefits or risks to consuming folic acid in this form.

In many cases, the micronutrients added to foods in fortification are synthetic.

In some cases, certain forms of micronutrients can be actively toxic in a sufficiently high dose, even if other forms are safe at the same or much higher doses. There are examples of such toxicity in both synthetic and naturally-occurring forms of vitamins. Retinol, the active form of Vitamin A, is toxic in a much lower dose than other forms, such as beta carotene. Menadione, a phased-out synthetic form of Vitamin K, is also known to be toxic.

Several main groups of compounds are used as fortificants, including vitamins, mineral salts, and trace elements.

Many foods and beverages worldwide have been fortified, whether as a voluntary action by the product developers or by law. Although some may view these additions as strategic marketing schemes, considerable work must go into a product before it is fortified. It must first be shown that the added vitamin or mineral is beneficial to health, safe, and delivered effectively. The addition must also comply with all food and labeling regulations and be supported by a nutritional rationale. Food developers also need to consider the costs associated with the new product and whether there will be a market to support the change.[21]

Examples of foods and beverages that have been fortified and shown to have positive health effects:

"Iodine deficiency disorder (IDD) is the single greatest cause of preventable mental retardation. Severe deficiencies cause cretinism, stillbirth and miscarriage. But even mild deficiency can significantly affect the learning ability of populations. [] Today over 1 billion people in the world suffer from iodine deficiency, and 38 million babies born every year are not protected from brain damage due to IDD."Kul Gautam, Deputy Executive Director, UNICEF, October 2007[22]

Iodised salt has been used in the United States since before World War II. It was discovered in 1821 that goiters could be treated by the use of iodized salts. However, it was not until 1916 that the use of iodized salts could be tested in a research trial as a preventative measure against goiters. By 1924, it became readily available in the US.[23] Currently in Canada and the US, the RDA for iodine is as low as 90 µg/day for children (4-8 years) and as high as 290 µg/day for breast-feeding mothers.

Diseases that are associated with an iodine deficiency include: mental retardation, hypothyroidism, and goiter. There is also a risk of various other growth and developmental abnormalities.

Folic acid (also known as folate) functions in reducing blood homocysteine levels, forming red blood cells, proper growth and division of cells, and preventing neural tube defects (NTDs).[24] In many industrialized countries, the addition of folic acid to flour has prevented a significant number of NTDs in infants. Two common types of NTDs, spina bifida and anencephaly, affect approximately 2500-3000 infants born in the US annually. Research trials have shown that supplementing pregnant mothers with folic acid can reduce the incidence of NTDs by 72%.[25]

The RDA for folic acid ranges from as low as 150 µg/day for children aged 1-3 years, to 400 µg/day for males and females over the age of 19, and 600 µg/day during pregnancy.[26] Diseases associated with folic acid deficiency include: megaloblastic or macrocytic anemia, cardiovascular disease, certain types of cancer, and NTDs in infants.

Niacin has been added to bread in the USA since 1938 (when voluntary addition started), a programme which substantially reduced the incidence of pellagra.[27] As early as 1755, pellagra was recognized by doctors as being a niacin deficiency disease, although it did not officially receive the name pellagra until 1771.[28] Pellagra was seen amongst poor families who used corn as their main dietary staple. Although corn itself does contain niacin, it is not in a bioavailable form unless it undergoes nixtamalization (treatment with alkali, traditional in Native American cultures) and therefore was not contributing to the overall intake of niacin.

The RDA for niacin is 2 mg NE (niacin equivalents)/day (AI) for infants aged 0-6 months, 16 mg NE/day for males, and 14 mg NE/day for females who are over the age of 19.
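For reference, niacin equivalents (NE) count dietary tryptophan, which the body can convert to niacin, alongside preformed niacin, using the standard conversion

\[
1\ \text{mg NE} = 1\ \text{mg niacin} = 60\ \text{mg dietary tryptophan}.
\]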

Diseases associated with niacin deficiency include pellagra, which consists of the signs and symptoms known as the three Ds: dermatitis, dementia, and diarrhea. Others may include vascular or gastrointestinal diseases.[28] Conditions commonly associated with a high frequency of niacin deficiency include alcoholism, anorexia nervosa, HIV infection, gastrectomy, malabsorptive disorders, and certain cancers and their associated treatments.[28]

Since vitamin D is a fat-soluble vitamin, it cannot be added to a wide variety of foods. Foods that it is commonly added to include margarine, vegetable oils and dairy products.[29] During the late 1800s, after dietary cures for scurvy and beriberi had been discovered, researchers investigated whether the disease later known as rickets could also be cured by food. Their results showed that sunlight exposure and cod liver oil were the cure. It was not until the 1930s that vitamin D was actually linked to curing rickets.[30] This discovery led to the fortification of common foods such as milk, margarine, and breakfast cereals. As a result, vitamin D deficiency went from causing bone deformations of varying degrees in approximately 80-90% of children to being a very rare condition.[31]

The current RDA for infants aged 0-6 months is 10 µg (400 International Units (IU))/day and for adults over 19 years of age it is 15 µg (600 IU)/day.
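The microgram and IU values quoted here are related by the standard conversion factor for vitamin D,

\[
1\ \mu\text{g} = 40\ \text{IU},
\]

so 10 µg corresponds to 400 IU and 15 µg to 600 IU.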

Diseases associated with a vitamin D deficiency include rickets, osteoporosis, and certain types of cancer (breast, prostate, colon and ovaries). It has also been associated with increased risks for fractures, heart disease, type 2 diabetes, autoimmune and infectious diseases, asthma and other wheezing disorders, myocardial infarction, hypertension, congestive heart failure, and peripheral vascular disease.[31]

Although fluoride is not considered an essential mineral, it is useful in the prevention of tooth decay and the maintenance of adequate dental health.[32][33] In the mid-1900s it was discovered that the teeth of residents of towns with a high level of fluoride in their water supply showed both brown spotting and an unusual resistance to dental caries. This led to the fortification of water supplies with fluoride in safe amounts (or the reduction of naturally-occurring levels) to retain the resistance to dental caries while avoiding the staining caused by fluorosis (a condition caused by excessive fluoride intake).[34] The tolerable upper intake level (UL) set for fluoride ranges from 0.7 mg/day for infants aged 0-6 months to 10 mg/day for adults over the age of 19.

Conditions commonly associated with fluoride deficiency are dental caries and osteoporosis.

Other examples of fortified foods include breakfast cereals, maize flour, rice, and, in some countries, sugar fortified with vitamin A.
