Metallicity
In astronomy, metallicity is the abundance of elements present in an object that are heavier than hydrogen and helium. Most of the normal currently detectable (i.e. non-dark) matter in the universe is either hydrogen or helium, and astronomers use the word metals as convenient shorthand for all elements except hydrogen and helium. This usage is distinct from the conventional chemical or physical definition of a metal as an electrically conducting element. Stars and nebulae with relatively high abundances of heavier elements are called metal-rich in discussions of metallicity, even though many of those elements are called nonmetals in chemistry.
The presence of heavier elements is the result of stellar nucleosynthesis. The majority of elements heavier than hydrogen and helium in the Universe are formed in the cores of stars as they evolve. Over time, stellar winds and supernovae deposit those heavier elements into the surrounding environment, enriching the interstellar medium and providing material for the birth of new stars. Older generations of stars formed in the metal-poor early Universe and therefore have lower metallicities than younger generations of stars, which formed in a more metal-rich Universe.
The metallicity of a star is most often expressed in terms of [Fe/H], which represents the logarithmic ratio of the star's iron-to-hydrogen abundance relative to the Sun's value. Several compounding reasons explain why this scale has been adopted as the standard: iron abundance in a galaxy increases roughly linearly with time through successive generations of stellar nucleosynthesis and supernova enrichment; iron has a rich spectrum that creates hundreds of absorption lines across the optical range, making iron lines extremely prominent when mapping the solar spectrum; and iron was recognized as the default reference element in the mid-20th century because of how reliably it could be measured, so today's standards are built on a history of iron-centric calibrations.
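To make the definition concrete, the conventional formula can be written out explicitly. With $N_{\mathrm{Fe}}$ and $N_{\mathrm{H}}$ denoting the number of iron and hydrogen atoms per unit volume, the standard bracket notation is defined as

$$[\mathrm{Fe}/\mathrm{H}] = \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\mathrm{star}} - \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\mathrm{Sun}}$$

Because the scale is logarithmic in base ten, the Sun has [Fe/H] = 0 by construction, a star with [Fe/H] = +1 has ten times the solar iron abundance, and a star with [Fe/H] = −1 has one tenth of it.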