I get a steady stream of indignant sputtering about this post on the metric system and what it means for authentication. One common point that readers make is that Celsius is better than Fahrenheit because it is based on natural law, defined as 100 degrees between the freezing and boiling point of water.
Only it isn’t, and hasn’t been for some time (at least not since 1954). While the freezing and boiling points of water were precise enough for the 1700s, they are nowhere near precise enough to act as a modern standard. The reason is that no two samples of water will freeze and boil at exactly the same temperature, due to variations in water purity, air pressure, and humidity.
By international convention, the Celsius scale is defined by the range between absolute zero and the thermodynamic triple point of Vienna Standard Mean Ocean Water (VSMOW). That triple point, by the way, is 0.01 °C. And VSMOW is not ocean water (despite its name), but rather a carefully crafted lab concoction composed of specially defined proportions of oxygen and hydrogen isotopes.
So while we are taught Celsius is defined by the freezing and boiling points of water, it is actually defined by absolute zero (which doesn’t exist in the natural world), and the triple point of a form of water that only exists in the lab.
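To make the definition concrete, here is a minimal sketch of the arithmetic behind the 1954 convention (as it stood until the 2019 SI revision): the kelvin was defined as 1/273.16 of the thermodynamic temperature of the triple point of VSMOW, and Celsius is just the kelvin scale shifted so that absolute zero sits at −273.15 degrees. The names and structure below are illustrative, not from any particular library.

```python
# The two numbers that actually define the scale (pre-2019 SI):
TRIPLE_POINT_VSMOW_K = 273.16   # triple point of VSMOW, in kelvin, by definition
CELSIUS_OFFSET = 273.15         # 0 degrees Celsius, expressed in kelvin

def kelvin_to_celsius(kelvin: float) -> float:
    """Convert a thermodynamic temperature in kelvin to degrees Celsius."""
    return kelvin - CELSIUS_OFFSET

# The scale's two defining points, recovered from the definition:
print(round(kelvin_to_celsius(TRIPLE_POINT_VSMOW_K), 2))  # 0.01 (triple point)
print(kelvin_to_celsius(0.0))                             # -273.15 (absolute zero)
```

Notice that the freezing point of ordinary water (0 °C) and its boiling point (100 °C) appear nowhere in the definition; they are merely consequences of it, and only approximately so.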
Explain to me again why this is less arbitrary than Fahrenheit?
And why is it still taught incorrectly in schools (at least in the US)?