Pore formation in cast iron castings is driven by both shrinkage and dissolved gases, the latter stemming from supersaturated gaseous species such as nitrogen. During solidification, nitrogen partitions between the austenite and the liquid according to the ratio of its solubility in each phase. This ratio, known as the partition coefficient, is essential to characterize, since accumulation of nitrogen in the liquid can lead to the critical supersaturation required for pore formation. However, CALPHAD databases and the literature provide conflicting information on its partitioning behavior. This work evaluates nitrogen partitioning between the primary austenite and the liquid in a hypoeutectic lamellar cast iron alloy. A cylindrical specimen was produced and remelted under an inert atmosphere, allowing the austenite and the liquid to reach solute equilibrium. After 6 days of holding at 1175 °C within the solid-liquid two-phase range, the specimen was quenched, and samples were extracted from the austenitic and liquid regions, which had transformed into martensite and ledeburite, respectively. The nitrogen concentration was measured by inert gas fusion (IGF), yielding a nitrogen partition coefficient $k_N^{\gamma/L} = 0.72 \pm 0.08$, which indicates partitioning in the direction opposite to that suggested by thermodynamic databases. The results point to opportunities to further explore nitrogen partitioning in other compositions and highlight the importance of selecting databases that accurately represent the phenomena of interest. Moreover, a better understanding of nitrogen partitioning can improve the control of porosity in cast iron processing.
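For clarity, a minimal definitional sketch of the reported coefficient, assuming the standard definition as the ratio of nitrogen concentration in austenite to that in the coexisting liquid; the concentration symbols $C_N^{\gamma}$ and $C_N^{L}$ are notation introduced here, not taken from the original text:
$$
k_N^{\gamma/L} = \frac{C_N^{\gamma}}{C_N^{L}} = 0.72 \pm 0.08 < 1,
$$
so under this reading the austenite dissolves less nitrogen than the liquid, and nitrogen is rejected into and enriches the remaining liquid as solidification proceeds.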