In the UK, vitamin B12 deficiency occurs in approximately 20% of adults aged >65 years, a prevalence significantly higher than that in the general population. The reported prevalence invariably depends on the diagnostic criteria used; indeed, estimates rise to 24% and 46% among free-living and institutionalised elderly respectively when methylmalonic acid is used as a marker of vitamin B12 status.

The prevalence of deficiency, and the criteria for its diagnosis, have drawn much attention recently in the wake of the implementation of folic acid fortification of flour in the USA. This fortification strategy has proved extremely successful in increasing preconceptional folic acid intakes and thereby reducing the incidence of neural-tube defects among babies born in the USA since 1998. However, in successfully delivering additional folic acid to pregnant women, fortification also increases the folic acid consumption of everyone who eats products containing flour, including the elderly. It is argued that consuming additional folic acid (as 'synthetic' pteroylglutamic acid) from fortified foods increases the risk of 'masking' the megaloblastic anaemia caused by vitamin B12 deficiency.

Thus, a number of issues arise for discussion. Are clinicians forced to rely on megaloblastic anaemia as the only sign of possible vitamin B12 deficiency? Is serum vitamin B12 alone adequate to confirm vitamin B12 deficiency, or should other diagnostic markers be used routinely in clinical practice? Is the post-fortification level of folic acid intake among the elderly likely to be high enough to correct, and thereby 'mask', the anaemia associated with vitamin B12 deficiency?