In the 1840s a Greenwich standard time for all of England, Scotland, and Wales was established, replacing several "local time" systems. The Royal Greenwich Observatory was the focal point for this development because it had played such a key role in marine navigation based upon accurate timekeeping. Greenwich Mean Time (GMT) subsequently evolved as the official time reference for the world and served that purpose until 1972.
The United States established the U.S. Naval Observatory (USNO) in 1830 to cooperate with the Royal Greenwich Observatory and other world observatories in determining time based on astronomical observations. The early timekeeping of these observatories was still driven by navigation. Timekeeping had to reflect changes in the earth's rotation rate; otherwise navigators would make errors. Thus, the USNO was charged with providing time linked to "earth" time, and other services, including almanacs, necessary for sea and air navigation.
With the advent of highly accurate atomic clocks, scientists and technologists recognized the inadequacy of timekeeping based on the motion of the earth, which fluctuates in rate by a few thousandths of a second a day. The redefinition of the second in 1967 had provided an excellent reference for more accurate measurement of time intervals, but attempts to couple GMT (based on the earth's motion) with this new definition proved highly unsatisfactory. A compromise time scale was eventually devised, and on January 1, 1972, the new Coordinated Universal Time (UTC) became effective internationally.
UTC runs at the rate of the atomic clocks, but when the difference between this atomic time and time based on the earth's rotation approaches one second, a one-second adjustment (a "leap second") is made in UTC. NIST's clock systems and other atomic clocks located in more than 25 countries now contribute data to the international UTC scale, coordinated in Paris by the International Bureau of Weights and Measures (BIPM). An evolution in timekeeping responsibility from the observatories of the world to the measurement standards laboratories has naturally accompanied this change from "earth" time to "atomic" time. But a coupling between the two, the leap second, is still needed.
Time zones did not become necessary in the United States until trains made it possible to travel hundreds of miles in a day. Until the 1860s most cities relied upon their own local "sun" time, which changed by approximately one minute for every 12 miles traveled east or west. The problem of keeping track of over 300 local times was overcome by establishing railroad time zones; until 1883, most railway companies relied on some 100 different, though internally consistent, railroad time zones.
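The "one minute per 12 miles" figure follows from the geometry of the earth's rotation: the earth turns 360° in 24 hours, so sun time shifts 4 minutes per degree of longitude, and at mid-latitudes a degree of longitude spans roughly 50 miles. A minimal sketch of this arithmetic, with illustrative names and constants not taken from the source:

```python
import math

MILES_PER_DEGREE_AT_EQUATOR = 69.17  # approximate miles spanned by 1 degree of longitude at the equator
MINUTES_PER_DEGREE = 4.0             # 360 degrees of rotation / 24 hours = 4 minutes of sun time per degree

def sun_time_shift_minutes(miles, latitude_deg):
    """Approximate change in local sun time, in minutes, after traveling
    the given east-west distance (in miles) at the given latitude."""
    miles_per_degree = MILES_PER_DEGREE_AT_EQUATOR * math.cos(math.radians(latitude_deg))
    return miles * MINUTES_PER_DEGREE / miles_per_degree

# Near 40 degrees N, roughly the latitude of many U.S. rail routes,
# 12 miles east or west shifts local sun time by about one minute:
print(round(sun_time_shift_minutes(12, 40), 2))
```

The cosine term reflects that meridians converge toward the poles, so the same east-west distance spans more longitude (and more sun time) at higher latitudes.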
That year, the United States was divided into four time zones roughly centered on the 75th, 90th, 105th, and 120th meridians. At noon on November 18, 1883, telegraph lines transmitted Greenwich Mean Time to major cities, where authorities adjusted their clocks to their zone's proper time.
On November 1, 1884, the International Meridian Conference in Washington, D.C., applied the same procedure to zones all around the world. The 24 standard meridians, every 15° east and west of 0° at Greenwich, England, were designated the centers of the zones. The international date line was drawn to follow, in general, the 180° meridian in the Pacific Ocean. Because some countries, islands, and states do not want to be divided into several zones, the zones' boundaries tend to wander considerably from straight north-south lines.
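The idealized 1884 scheme reduces to a one-line calculation: the nominal offset from Greenwich is simply the nearest standard meridian divided by 15°. The helper below is a hypothetical sketch of that idealized scheme, not a real time-zone lookup, and it deliberately ignores the politically wandering boundaries the text describes:

```python
def nominal_zone_offset_hours(longitude_deg):
    """Idealized offset from Greenwich time (GMT, later UTC) for the
    standard meridian nearest a longitude, per the 1884 scheme of
    24 zones centered every 15 degrees. Real zone boundaries wander
    from these lines for political and practical reasons."""
    return round(longitude_deg / 15.0)

# The four U.S. meridians of 1883 map to the familiar offsets:
for meridian in (-75, -90, -105, -120):
    print(meridian, nominal_zone_offset_hours(meridian))
# -75 -> -5 (Eastern), -90 -> -6 (Central), -105 -> -7 (Mountain), -120 -> -8 (Pacific)
```

Because each zone is centered on its meridian, a longitude up to 7.5° either side of a standard meridian rounds to that meridian's offset.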