Atomic Clock - History of Atomic Clock
The atomic clock was invented and announced in 1949 by Harold Lyons, chief of the Microwave Standards Section at NBS (the National Bureau of Standards), and his team. An atomic clock is the most accurate method of timekeeping available, using the frequency of an electronic transition in an atom as its reference. Despite its name, it is not a nuclear device and is not radioactive.
The atomic clock was invented out of the need for a timing standard that would allow radio stations to stay on their assigned frequencies. Atomic clocks keep better time than the movement of the stars or the Earth's rotation, and they serve as the primary standard for international time.
- In 1923, U.S. radio stations were assigned 81 fixed frequencies, but the lack of a timing standard made it difficult for them to stay on frequency (frequency is a measure of cycles per second).
- The first atomic clock was based on the properties of the ammonia molecule, using a microwave absorption line of ammonia as its constant frequency reference.
- Harold Lyons and his team improved upon their original 1949 design, creating a clock based on beams of cesium atoms. Cesium remains the standard today: the SI second is defined as 9,192,631,770 cycles of the radiation from the hyperfine transition of cesium-133.
- NBS became NIST (the National Institute of Standards and Technology) in 1988, and has since pioneered several improvements to atomic clock technology.
- Today, atomic clocks are used for more than radio and TV signals; they are the basis for Internet time through the Network Time Protocol (NTP).
- Atomic clocks provide the time broadcast by GPS (Global Positioning System) satellites.
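To illustrate how atomic-clock time reaches computers over NTP, here is a minimal sketch of parsing the transmit timestamp from an SNTP response packet. The 48-byte layout and the 1900 epoch follow RFC 4330; the function name and the simulated packet are illustrative, not part of any real client library.

```python
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01),
# per RFC 4330.
NTP_TO_UNIX = 2208988800

def parse_transmit_timestamp(packet: bytes) -> float:
    """Extract the transmit timestamp from a 48-byte SNTP response.

    The timestamp occupies bytes 40-47: a 32-bit big-endian seconds field
    followed by a 32-bit big-endian fractional-seconds field.
    """
    seconds, fraction = struct.unpack("!II", packet[40:48])
    return (seconds - NTP_TO_UNIX) + fraction / 2**32

# Simulated response: 40 zeroed header bytes, then a known transmit timestamp.
fake_packet = bytes(40) + struct.pack("!II", NTP_TO_UNIX + 1_000_000, 0)
print(parse_transmit_timestamp(fake_packet))  # 1000000.0 (Unix seconds)
```

A real client would first send a 48-byte request (mode 3) over UDP port 123 and apply the packet's other timestamps to correct for network delay; this sketch only shows how the server's atomic-clock-derived time is encoded.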