Intravenous (IV) iron is widely used to provide iron supplementation when oral iron is ineffective or not tolerated. All commercially available IV iron formulations consist of an iron oxyhydroxide core coated with carbohydrates of varying structure and branching characteristics. The diameter of these iron-carbohydrate complexes ranges from 5 to 100 nm, meeting the criteria for nanoparticles. IV iron formulations entered clinical practice in the late 1950s, well before the emergence of nanomedicine as a field of study. As a result, these agents were approved without full characterization of their labile iron release profiles and without comprehensive biodistribution studies.

The leading hypothesis for the pathogenesis of acute oxidative stress induced by IV iron formulations is release of iron from the iron-carbohydrate structure, producing transient concentrations of labile plasma iron that drive Fenton and Haber-Weiss chemistry and promote the formation of highly reactive free radicals such as the hydroxyl radical. Among available IV iron formulations, products with smaller carbohydrate shells are more labile and more likely to release iron directly into the plasma (i.e., before metabolism by the reticuloendothelial system). The proposed biologic targets of labile-iron-induced oxidative stress include nearly all systemic cellular components, including endothelial cells, the myocardium, and the liver, as well as low-density lipoprotein and other plasma proteins.

Most studies have relied on plasma pharmacokinetic analyses that require many model assumptions to estimate the contribution of the iron-carbohydrate complex to elevations in serum iron indices and hemoglobin. In addition, the commercially available formulations have not been well studied with regard to optimal dosing regimens, long-term safety, or comparative efficacy.
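For reference, the redox mechanism invoked above can be summarized by the classic Fenton reaction and the iron-catalyzed Haber-Weiss cycle (standard chemistry, not specific to any particular formulation):

```latex
% Fenton reaction: ferrous iron reduces hydrogen peroxide,
% generating the highly reactive hydroxyl radical
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + OH^{-} + {}^{\bullet}OH}

% Superoxide regenerates ferrous iron, sustaining the cycle
\mathrm{Fe^{3+} + O_2^{\bullet -} \longrightarrow Fe^{2+} + O_2}

% Net iron-catalyzed Haber-Weiss reaction
\mathrm{O_2^{\bullet -} + H_2O_2 \xrightarrow{\;Fe\;} O_2 + OH^{-} + {}^{\bullet}OH}
```

Because labile plasma iron supplies the redox-active Fe\textsuperscript{2+}/Fe\textsuperscript{3+} couple that catalyzes the net reaction, even transient elevations in labile iron can, in principle, amplify hydroxyl radical production.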
IV iron formulations fall into a class defined by the US Food and Drug Administration as “complex drugs” and thus present considerable challenges for bioequivalence evaluation.