Mathematics Throughout History

Development of Weights and Measures

What were some early units used for calculating length?

The earliest length measurements reach back into ancient times, and their history is a convoluted one. Among the earliest units of length are the cubit, digit, inch, yard, mile, furlong, and pace. One of the earliest recorded length units is the cubit, invented by the Egyptians around 3000 B.C.E. and defined as the length of a man’s arm from his elbow to his extended fingertips. Of course, not every person has the same proportions, so a cubit could be off by a few inches. The more precision-oriented Egyptians fixed this by developing a standard royal cubit, maintained on a black granite rod accessible to all, which enabled citizens to fit their own measuring rods to the royal standard.

The Egyptian cubit was not the only one. By 1700 B.C.E., the Babylonians were using their own, slightly longer cubit. In today’s measurement standards, the Egyptian cubit equals 524 millimeters (20.63 inches) and the Babylonian cubit equals 530 millimeters (20.87 inches); millimeters are used here because they make the small difference between the two cubits easier to see.
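
Since the modern inch is defined as exactly 25.4 millimeters, the two cubit figures are easy to check; the short Python sketch below simply reproduces the conversions quoted above.

```python
# Verifying the cubit figures above (1 inch = 25.4 millimeters exactly).
MM_PER_INCH = 25.4

for name, mm in [("Egyptian cubit", 524), ("Babylonian cubit", 530)]:
    print(f"{name}: {mm} mm = {mm / MM_PER_INCH:.2f} in")
# Egyptian cubit: 524 mm = 20.63 in
# Babylonian cubit: 530 mm = 20.87 in
```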

As the name implies (from the Latin digitus, “finger”), a digit was the width of a person’s middle finger, and it was considered the smallest basic unit of length. The Egyptians grouped digits into larger units: 28 digits equaled a cubit, four digits equaled a palm, and five digits equaled a hand. They further defined a small span as three palms (or 12 digits), a large span as 14 digits (or a half cubit), and a small cubit as 24 digits. For measurements smaller than a digit, the Egyptians used fractions.
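
Because every unit in this system is a whole number of digits, the relationships form a simple conversion table. The Python sketch below is a minimal illustration of the ratios just listed; the convert function and the English unit names are ours, not historical.

```python
# The Egyptian digit-based units above, expressed as a lookup table.
# A minimal sketch for illustration; unit names are modern English glosses.
DIGITS_PER_UNIT = {
    "digit": 1,
    "palm": 4,          # four digits
    "hand": 5,          # five digits
    "small span": 12,   # three palms
    "large span": 14,   # a half cubit
    "small cubit": 24,
    "cubit": 28,
}

def convert(quantity, from_unit, to_unit):
    """Convert a length from one Egyptian unit to another via digits."""
    return quantity * DIGITS_PER_UNIT[from_unit] / DIGITS_PER_UNIT[to_unit]

print(convert(1, "cubit", "palm"))        # 7.0 -- a cubit is seven palms
print(convert(1, "large span", "cubit"))  # 0.5 -- a large span is half a cubit
```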

Over time, the definition of the inch was all over the measurement map. For example, one inch was once defined as the distance from the tip to the first joint of a man’s finger. The ancient Harappan civilization in the Punjab used the “Indus inch”; based on ruler markings found at excavation sites, it measured, in modern terms, about 1.32 inches (3.35 centimeters; see below for more about the Harappan). In the 11th century, the inch was defined as 1/36th of King Henry I of England’s arm; and by the 14th century, King Edward II of England ruled that one inch equaled three grains of barleycorn placed end to end lengthwise. (See below for more about both kings.)

In Europe, longer distances were often measured in units such as yards, furlongs, and miles. At first, the yard was the length of a man’s belt (also called a girdle). The yard became more “standard” for a while when it was defined as the distance from King Henry I’s nose to the thumb of his outstretched arm. The term mile is derived from the Roman mille passus, or “1,000 double steps” (also called paces). Each double step by a Roman soldier measured five feet, so 1,000 double steps equaled a mile, or 5,000 feet (1,524 meters). The current number of feet in a mile dates to 1595, when, during the reign of England’s Queen Elizabeth I, it was agreed that 5,280 feet (1,609 meters) would equal one mile. This figure was chosen mainly because of the popularity of the furlong: eight furlongs of 660 feet each equaled 5,280 feet.
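
Both mile figures can be verified with a few lines of arithmetic; this Python sketch reproduces the numbers above, taking the furlong as 660 feet and the modern foot as exactly 0.3048 meters.

```python
# Checking the mile arithmetic above (1 foot = 0.3048 meters exactly).
roman_mile_ft = 1_000 * 5    # 1,000 double steps of five feet each
statute_mile_ft = 8 * 660    # eight furlongs of 660 feet each

print(roman_mile_ft, round(roman_mile_ft * 0.3048))      # 5000 1524
print(statute_mile_ft, round(statute_mile_ft * 0.3048))  # 5280 1609
```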

Finally, the pace was once tied to the Roman mile (see above). Today, a pace is a general measurement, defined as the length of one average step by an adult, or about 2.5 to 3 feet (0.76 to 0.91 meters).


