Mathematics Throughout History
Development of Weights and Measures
How did the first standard metric measurements evolve over time?
The first standard metric units were developed by 1799: The meter was defined as one ten-millionth of the distance from the equator to the North Pole; the liter was defined as the volume of one cubic decimeter; and the kilogram as the mass of a liter of pure water.
The standards metamorphosed over the years. For example, the first physical standard meter was in the form of a bar defined as a meter in length. By 1889, the International Bureau of Weights and Measures (BIPM, or Bureau International des Poids et Mesures) had replaced the original meter bar. The new bar not only became a standard in France, but copies of it were distributed to the seventeen countries that signed the Convention of the Meter (Convention du Mètre) in Paris in 1875. The accepted distance became two lines marked on a bar measuring 2 centimeters by 2 centimeters in cross-section and slightly longer than one meter; the bar itself was composed of 90 percent platinum and 10 percent iridium. But it was only a “standard meter” when it was at the temperature of melting ice.
By 1960, the BIPM decided to create a more accurate standard, largely to satisfy the scientific community’s need for precision. The new standard meter was based on the wavelength of light emitted by the krypton-86 atom (specifically, 1,650,763.73 wavelengths of the atom’s orange-red line in a vacuum). An even more precise measurement of the meter came about in 1983, when it was defined as the distance light travels in a vacuum in 1/299,792,458 second. This is currently the accepted standard.
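The 1983 definition amounts to fixing the speed of light in a vacuum at exactly 299,792,458 meters per second. A short sketch shows the arithmetic (the constant and function names here are illustrative, not taken from any standards document):

```python
# Illustrative sketch: under the 1983 definition, the speed of light in
# a vacuum is fixed at exactly 299,792,458 m/s, so one meter is the
# distance light covers in 1/299,792,458 of a second.
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # exact, by definition

def light_distance(seconds: float) -> float:
    """Distance in meters that light travels in a vacuum in `seconds`."""
    return SPEED_OF_LIGHT_M_PER_S * seconds

# Light travels one meter in 1/299,792,458 second (up to floating-point
# rounding), and 299,792,458 meters in one full second.
one_meter = light_distance(1 / 299_792_458)
one_second_distance = light_distance(1)
```

Turning the definition around this way is exactly why the 1983 standard is so stable: the second is already defined with extreme precision, so fixing the speed of light pins down the meter without any physical artifact.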