A frequent ‘petty annoyance’ comes from folks who just can’t stand the idea of anyone doing something that does not conform to their idea of approved behaviour. One particularly frequent example is folks who lament the use of any temperature scale other than Celsius, with particular bile reserved for folks who use Fahrenheit.
IMHO, this shows a rather brittle and stultified character, but that may be a reflection of my own bias as I’m fond of a great diversity of methods, explorations, and ideas. Does it reflect a bias on my part from having grown up in a nation that uses Fahrenheit? Maybe, but I doubt it. Pretty much everything is ‘dual scale’ here. I used C in every science class from high school on. I spent many years working in hospitals where the common scale used was C. I’ve been “bilingual” for temperature as long as I can remember. (Early on I was interested in history, and much of it has records in C from non-USA non-UK-Empire sources).
So WHY defend the use of F?
Well, why not?
The earth does not end because someone uses F instead of C. Or K or R or any of several others, either. Must we all speak English? Is every use of French, German, Russian, Chinese, Japanese, etc. etc. to be stamped out? Is there some overriding value to forcing conformity in how we measure our weather or cook our meals that is greater than the value of a common language? Frankly, I’d be much more inclined to support an “everyone learn Esperanto” so we had some common language to share, than to support the extinction of alternative temperature scales.
Personally, I find the finer graduations of F and the particular range it spans rather convenient. Substantially all usage for day to day weather and similar uses can do nicely using only whole digits. No ‘fractional part’ is needed, really. Even the 98.6 F “normal” body temperature is really a case of false precision. “Normal” is more correctly 98 F. ( There’s a bit of a long story behind that, but the short form is that it wasn’t originally fractional in F. Later it was ‘back figured’ from 37 C and that’s where the fractional fiction came from…) MOST of the weather stays at ‘above zero’ (and when it does go below zero, well, you know that’s just way too cold… while -3 C isn’t all that cold but gets a gratuitous minus you have to carry around). Any temperature worth living in tends to be bounded by 0 F and 100 F. Yeah, you can go outside those, but it really means something if you do. For degrees C, you have the ‘max comfortable’ at a somewhat fuzzy 37-40 ish and just where IS that lower bound of comfort / livable?
But that aside:
The assertion is often made that F is just not as scientific or as orderly as C.
This is simply wrong.
The Physical Basis of Thermometry
The early thermometers mostly depended on the fact that as things get warmer, they expand. Some of them use a fluid in a solid (mercury in glass or alcohol in glass – generically LIG, Liquid In Glass) while the Galileo Thermometer uses sealed glass balls floating in a liquid. As the liquid’s density changes with temperature, the different balls float or sink.
How they were calibrated is an interesting story too. Today we have various temperature standards and various other thermometers to compare and calibrate. But in the beginning? There was little that was known about actual temperatures in a numeric sense, and it was not entirely clear what was a temperature standard and what was not. This is an important part of the history of Fahrenheit in particular, but also impacts on C (and several others). For me, as a person who likes the idea of being able to make my own tools, having a convenient temperature standard is a useful thing.
One mathematical point. “Decimalization” is a core ideal in the metric system. Everything must go by 10s. But is it REALLY a benefit? Well, “that depends”. In a world of base 10 math and digital calculators, yes, it is. But in a world of hand calculation with pen and paper, it is a hindrance. Why?
Fractions are more convenient; and often both more precise and more accurate for hand done calculations of any size.
1/3 is an exactly accurate and infinitely precise value. 0.333… has you deciding just how many 3s to print and where to cut off your precision.
A “factor rich” number base lets you divide things comfortably by many common factors. No, not all of them. But enough for most things to be very easily handled. A base 10 system gives you factors of 1, 2, 5, and 10. In practical terms, it’s really just 2 and 5 (as the 1 is useless and the 10 is just the base, and all number bases have a base). Yes, in ‘base 60’ you still don’t have a factor of 7, so 1/7 of 60 gives 8.571428… (repeating); but you can simply and easily divide by 2, 3, 4, 5, 6 and several others (including 10 and 12) with only whole digits involved. So very rarely do you need to calculate a decimal fraction part. For that matter, it is often the case that you can just carry forward the fraction and ‘cancel terms’ at the end. So 60/7 stays accurate and completely precise, even if not reducible to an integer.
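A quick way to see the difference is exact fractions versus cut-off decimals. This is just an illustrative sketch (the `divisors` helper is mine, purely for demonstration):

```python
from fractions import Fraction

# 1/3 carried as a fraction stays exact; a truncated decimal does not.
third = Fraction(1, 3)
assert third * 3 == 1          # exactly 1, no rounding error
assert 0.333 * 3 != 1          # the cut-off decimal has lost precision

# 'Carry the fraction forward and cancel at the end': 60/7 stays exact.
assert Fraction(60, 7) * 7 == 60

def divisors(n):
    """All whole-number divisors of n."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))   # [1, 2, 5, 10] -- base 10: really just 2 and 5
print(divisors(60))   # twelve divisors, including 2, 3, 4, 5, 6, 10, 12
```

Base 60 gives three times as many whole-number divisors as base 10, which is the whole ‘factor rich’ point.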
This matters not just for the math of things, like temperatures, but also for how you make your thermometer. With just a pair of dividers (a compass) you can divide a line segment in half. It is much harder to divide it into fifths. So if you want to be able to do math simply, and to divide the scale on your liquid in glass thermometer with such tools as a compass (i.e. if you want to be able to make one on your own without a lot of high precision tooling), it helps to have a scale that is divided into some power of 2, or a factor rich base, like 60, that is also divisible by 2 a couple of times to get at least some major divisions with the compass.
Irrational Basis and Celsius History of Change
Yes, I’m deliberately skewering Celsius with the same arguments typically applied to Fahrenheit. Why? Because the C advocates regularly make the assertion, but do not realize that the same thing applies to C as to F and for substantially the same reasons.
To understand this, it is best to start with a bit of the history of F. Originally it was a scale that ran from a freezing salt mixture at zero to human body temperature at 96. The salt mix was one of the few things known to freeze at ONE temperature. It was a widely available FIXED and REPEATABLE zero point temperature. The top end was available to everyone. While we now know that there is some change of temperature during the day, if taken during the ‘work day’ it is a remarkably stable temperature (especially in a 1700s technology world). At that time, precision was not available into the 1/10 ths of a degree anyway, so it was a very available and very repeatable (even if not perfectly repeatable) high end standard.
Why 96 degrees? 96, 48, 24, 12, 6, 3. So with compass / dividers you can start at your end points and mark down to 3 degree divisions with very high precision. Then just inscribe 2 divisions inside that last bit. (Fairly easily done within the precision of the device ‘by eye’ or via careful setting of the compass and ‘fitting’ to the gap, but there is also a method to get an exact 1/3 via geometry, though it is more trouble than it is worth).
However, Fahrenheit was even more crafty than that. With freezing at 32, the difference between 96 and 32 is exactly 64 and can be done exactly, with compass only, all the way. Nice, very nice.
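That bisection bookkeeping can be sketched in a few lines of Python (the `halvings` function is mine, just to illustrate the arithmetic):

```python
def halvings(n):
    """Repeatedly halve n while it stays even: each step is one compass bisection."""
    steps = [n]
    while n % 2 == 0:
        n //= 2
        steps.append(n)
    return steps

print(halvings(96))       # [96, 48, 24, 12, 6, 3] -- stalls at 3; one trisection left
print(halvings(96 - 32))  # [64, 32, 16, 8, 4, 2, 1] -- bisects cleanly to single degrees
print(halvings(100))      # [100, 50, 25] -- a 0-100 span stalls early
```

Note how a 0 to 100 span runs out of even halvings after only two steps, while the 64 degree span between freezing and body temperature bisects all the way down to one degree.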
Now think about it for a minute. With nothing but your body, some salt and ice, and a compass you can calibrate a thermometer to fairly good precision and accuracy. THAT is a nice and rational design.
Later on, the Fahrenheit scale got changed some, and we ended up with 98.6 F for body temperature instead. The story is here:
The Fahrenheit scale in use today differs slightly from the original. The two fixed points are the ice point, assigned a value of 32°F, and the steam point, assigned a value of 212°F. On this scale the normal human body temperature is 98.6°F, slightly higher than the 96° originally chosen by Fahrenheit.
Gee… So it uses the same two calibration points now as does the Celsius scale…
Yes, in the process some things were lost (like that nice round body temperature number), but the rational nature of it is founded on the same stones as the Celsius scale.
From that same article comes an interesting bit of history of the choice of the salt mixture and a hint about why F is so convenient for ‘day to day’ things:
By 1724 Fahrenheit had adopted a new scale, similar to Roemer’s but with much finer divisions. For the zero point he chose the same reference as Roemer. However, since his thermometer was intended for meteorological observations, he wanted a second reference point that would be nearer the maximum observed temperature for weather. He chose the normal temperature of the human body as the upper reference point, which he called 96°. Fahrenheit gave no reason for his choice of 96, but it may have been due to his desire for a finer scale and because 96 is evenly divisible by 2, 3, 4, 8, and 12.
Why didn’t Fahrenheit choose the freezing point of water for his zero reference, as Newton had done before him and as Celsius did later on? Perhaps Fahrenheit was influenced by Roemer, or he may have wanted to avoid the inconvenience of repeatedly using negative temperatures during winter. Also, in the early 1700s it was widely believed that water did not always freeze at the same temperature. Soon, using his newly calibrated thermometers, Fahrenheit learned that water always froze at 32° on his scale. He immediately added this third reference point to his instruments.
Again we see the ‘factor rich’ base value. The convenience for weather continues as does the convenience for most day to day temperatures.
But of interest to me is the point about water freeze temperatures. Water only freezes at 32 F / 0 C if it is PURE water. While you can buy a jug of distilled or deionized water at the grocery store today, it was not always so convenient. (And it may not be so convenient after a global disaster, either. No, no paranoia; just recognizing that rocks fall from space, ice ages come, and civilization has a long history of being very unstable.) So using a salt / ice mixture solves that issue. It gives a reliable zero point in a world of not-so-pure water sources.
And the boiling point? Well, what is shifting your boiling point? How about altitude and barometric pressure. It varies quite a bit, really. It all comes down to a question of just how easily available is pure water and a ‘standard atmosphere’. I rather like the idea of not being dependent on a distilled water supply to define my temperature scale.
(Personally, I’d likely pick another eutectic mixture for the upper bound too. Getting really pure water is not that easy sometimes… and a ‘standard atmosphere’ can be problematic too.)
But what about that idea that C has changed?
Celsius (formerly centigrade) is a scale and unit of measurement for temperature. It is named after the Swedish astronomer Anders Celsius (1701–1744), who developed a similar temperature scale two years before his death. The degree Celsius (°C) can refer to a specific temperature on the Celsius scale as well as a unit to indicate a temperature interval, a difference between two temperatures or an uncertainty. The unit was known until 1948 as “centigrade” from the Latin “centum” translated as 100 and “gradus” translated as “steps”.
OK, so first off, they can’t even keep the name straight. It was centigrade for “a long time” then changed. (And even that 1948 date isn’t ‘hard’ as I was taught ‘centigrade’ in high school in the 1970s along with Celsius). Next notice that Celsius (the person) developed a ‘similar scale’… so we’ve had a couple of them already…
From 1744 until 1954, 0 °C was defined as the freezing point of water and 100 °C was defined as the boiling point of water, both at a pressure of one standard atmosphere with mercury being the working material. Although these defining correlations are commonly taught in schools today, by international agreement the unit “degree Celsius” and the Celsius scale are currently defined by two different temperatures: absolute zero, and the triple point of VSMOW (specially-purified water).
Oh Dear. So it ISN’T based on freezing and boiling points anymore… And that means that every temperature prior to 1954 is in a subtle way not the same as after 1954. Yeah, it will be lost down in the weeds of some irrelevant level of precision… unless, of course, you are working in those ranges. Notice also the reference to standard atmosphere.
Remember that “boiling point” standard? It varies with altitude and barometric pressure…
So which would you rather have? A barometric and dissolved material sensitive ‘bound’, or a bound based on a homeostatic mechanism that is with you wherever you go? I suspect it depends on when you want to create your thermometer and the technologies available to you at the time… For me, I like having my own standard (even if only accurate to a few 1/10ths of a degree) rather than one that requires technically pure water and a ‘standard atmosphere’.
This definition also precisely relates the Celsius scale to the Kelvin scale, which defines the SI base unit of thermodynamic temperature with symbol K. Absolute zero, the lowest temperature possible at which matter reaches minimum entropy, is defined as being precisely 0 K and −273.15 °C. The temperature of the triple point of water is defined as precisely 273.16 K and 0.01 °C.
This definition fixes the magnitude of both the degree Celsius and the kelvin as precisely 1 part in 273.16 (approximately 0.00366) of the difference between absolute zero and the triple point of water. Thus, it sets the magnitude of one degree Celsius and that of one kelvin as exactly the same. Additionally, it establishes the difference between the two scales’ null points as being precisely 273.15 degrees Celsius (−273.15 °C = 0 K and 0 °C = 273.15 K).
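Those offsets are easy to state in code. A minimal sketch of the relation described in the quote (float rounding aside; the function name is mine):

```python
def c_to_k(c):
    """Celsius to kelvin: same size degree, null points offset by exactly 273.15."""
    return c + 273.15

assert c_to_k(-273.15) == 0.0               # absolute zero
assert c_to_k(0.0) == 273.15                # 0 C = 273.15 K
assert abs(c_to_k(0.01) - 273.16) < 1e-9    # triple point of water
```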
So we’ve now got it tied to Kelvin, but with odd little 1/100th place offsets. Frankly, if we’re doing that kind of thing, why not just toss that sucker out and make Kelvin the standard, with a Kc scale that is 273 warmer (and drop those 1/100ths place finagled ‘variable constants’…)
In 1742 Swedish astronomer Anders Celsius (1701–1744) originally created a “reversed” version of the modern Celsius temperature scale wherein zero represented the boiling point of water and one hundred represented the freezing point of water. In his paper Observations of two persistent degrees on a thermometer, he recounted his experiments showing that the melting point of ice is essentially unaffected by pressure. He also determined with remarkable precision how the boiling point of water varied as a function of atmospheric pressure. He proposed that the zero point of his temperature scale, being the boiling point, would be calibrated at the mean barometric pressure at mean sea level. This pressure is known as one standard atmosphere. The BIPM’s 10th General Conference on Weights and Measures (CGPM) later defined one standard atmosphere to equal precisely 1,013,250 dynes per square centimeter (101.325 kPa).
Oh. It ran “backwards”… We also again see the dependence on atmospheric pressure of the scale and the need to find a way to make a ‘standard atmosphere’.
In 1744, coincident with the death of Anders Celsius, the Swedish botanist Carolus Linnaeus (1707–1778) reversed Celsius’s scale upon receipt of his first thermometer featuring a scale where zero represented the melting point of ice and 100 represented the boiling point. His custom-made “linnaeus-thermometer”, for use in his greenhouses, was made by Daniel Ekström, Sweden’s leading maker of scientific instruments at the time and whose workshop was located in the basement of the Stockholm observatory. As often happened in this age before modern communications, numerous physicists, scientists, and instrument makers are credited with having independently developed this same scale; among them were Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences (which had an instrument workshop) and with whom Linnaeus had been corresponding; Christin of Lyons; Daniel Ekström, the instrument maker; and Mårten Strömer (1707–1770) who had studied astronomy under Anders Celsius.
And we don’t even really know who all decided which way it ought to run.
Yes, I’m sure that today we have it right, finally, for sure this time, honest. No more fiddling with it. We’ll keep it running the same way, with the same name, and with the same funky offset and based on the Kelvin scale… unless we don’t.
So, IMHO, the difference between F and C is not one of stability, nor of being based on a simple fundamental standard like the boiling point of water. The only difference that really matters is how convenient it is to me to use. For that, I’d rather have the whole degrees of F than those big fat C degrees that often need a decimal point to be useful.
That I could create a workable F thermometer from ‘nearly nothing’ doesn’t hurt.
More Degrees of Degrees
But the world does not start with F and end with C. There are several other temperature scales in use too.
In many ways my favorite is Rankine. It starts at absolute zero, just like Kelvin (and after some math, Celsius) but has those fine precise degrees of Fahrenheit. Just after Rankine and Fahrenheit in my preference list is Kelvin.
Again a sane starting point of absolute zero, but with those fat degrees of C. But both Rankine and Kelvin came along after the early scales. In essence, once we figured out that there WAS an absolute zero, we ‘refitted’ both F and C to that starting point and gave them new names.
But there were other, earlier, thermometers. Turns out that those earlier guys had some decent ideas. One uses a base 60 division. We still use base 60 for time (and 360 for circles) for the simple reason that they are easy to divide into 1/2, 1/4, 1/3, etc. During the French Revolution they tried to make “decimal time” and decimal circles too, but they were just too inconvenient and died out (mostly…)
Here is a nice chart that compares the various scales.
First off, notice that the Rankine and Fahrenheit scales both are ‘steeper’. That’s the indication of the greater precision in the scales. More divisions per unit of temperature. Then notice that they are both the same slope (as are K and C) but offset. That reflects the ‘offset’ of the starting zero point to absolute zero ( where both K and R originate).
There are several other lines on this graph. The most striking is that Delisle scale which slants from upper left to lower right. The original Celsius scale ran that way, too, and would have a similar ‘backwards’ slope, though with the same steepness as the present scale.
The other lines are various other scales. Most of them are fairly flat (more so than the ones we use today). That to some extent is a reflection of the available precision of the materials at the time of their creation, or the range over which the inventor was interested.
The key takeaway for me? There are LOTS of ways to measure temperature and it really is a choice how steep you want your line, what kind of offset you want for your zero point, and which way you want the slope to run. (Frankly, I could make a decent case for a non-linear scale too. One with fine precision near room temperatures and with ever larger gradations as you move off to extreme ends.)
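For the record, the slope and offset of each of the modern scales can be written down directly, treating kelvins as the x-axis of such a chart. A small sketch (the function names are mine):

```python
# Each scale is a line: reading = slope * kelvins + offset.
def k_to_c(k): return k - 273.15          # slope 1, offset -273.15
def k_to_f(k): return k * 9 / 5 - 459.67  # slope 1.8 -- one of the 'steeper' lines
def k_to_r(k): return k * 9 / 5           # slope 1.8, zero at absolute zero

# F and R share a slope (as do C and K); only the zero offsets differ.
assert k_to_r(0) == 0.0
assert k_to_f(0) == -459.67
assert abs(k_to_f(273.15) - 32.0) < 1e-9  # the ice point lands at 32 F
```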
So let’s look at a couple of these.
Galileo: 1593 (no scale)
Santorio: 159x? (with some scale of unknown calibration)
Newton: “Around 1700”
Réaumur: 1730, redef 1772
Delisle: 1732, redef 1738
Celsius: 1742, reversed 1744, redef 1948
Galileo and Santorio
Technically these are ‘thermoscopes’ not thermometers. Galileo made a device that floated glass balls in water. Santorio made some kind of thermoscope, to which a scale was applied, but I have not been able to find the method of calibration (if any). He was a doctor, and used this device to take human temperatures, so I would speculate that he started at some approximation of human body temperature and marked graduations away from that point. (That is what I would do, were I looking for over / under normal temperatures).
There is some controversy over who actually invented what, as both Galileo and Santorio look to have been in communication to some extent.
His passion for describing phenomena in terms of numbers, led Santorio to invent several instruments, among which a wind gauge, a water current meter, the “pulsilogium,” and a thermoscope. The last two of these are also mentioned by Galileo, and, especially in the case of the thermoscope, there has been controversy about who the actual inventor was. We do know that Santorio was the first to apply a numerical scale to the thermoscope, which later evolved into the thermometer. Both the pulsilogium and the thermoscope are perhaps best seen as the product of a learned circle in Venice that included Galileo, Santorio, Giofrancesco Sagredo, and fra Paolo Sarpi.
Regardless, we can date interest in graduated temperature observation from about 1590. It was a good while before we get to a reasonably modern thermometer, and undoubtedly there were many folks “playing around” with various ideas and changes. Rather than dig into those musty bits (if they are really even there to find), we’re going to pick up the trail about 100 years later when a recognizable thermometer with a reasonable design shows up. But first, just a bit of detail on the thermoscope.
We can get some idea how it may have worked from this description:
Measuring heat became a puzzle in the circle of practical and learned men in Venice to which Galileo belonged. The first solution was a thermoscope. Building on Pneumatics by Hero of Alexandria (1st century BCE), first published in the West in 1575, several authors had begun playing with the idea of the expansion of air as its heat increased, and vice versa. The first versions, usually called thermoscopes, were little more than toys. Benedetto Castelli wrote in 1638 about a device he had seen in Galileo’s hands around 1603:
He took a small glass flask, about as large as a small hen’s egg, with a neck about two spans long [perhaps 16 inches] and as fine as a wheat straw, and warmed the flask well in his hands, then turned its mouth upside down into a vessel placed underneath, in which there was a little water. When he took away the heat of his hands from the flask, the water at once began to rise in the neck, and mounted to more than a span above the level of the water in the vessel. The same Sig. Galileo had then made use of this effect in order to construct an instrument for examining the degrees of heat and cold.
Over the next several years this thermoscope was developed by Santorio Santorio and Galileo’s friend Gianfrancesco Sagredo (both in Venice), Galileo, and others to include a numerical scale. It had thus become a full-fledged air thermometer. The first series of quantitative meteorological observations date from this period. In other parts of Europe the inventor Cornelis Drebbel and Robert Fludd developed similar instruments. The questions about who was the first, and whether one derived his knowledge from another, are sterile ones which shed little light on the historical context in which this and other instruments (e.g., the telescope and barometer) developed. The near simultaneous (and surely independent) invention of the air thermometer illustrates the seventeenth-century trend toward quantification of natural phenomena–an essential dimension of the “mathematization of nature.”
That same article goes on to describe the early invention of liquid in glass thermometers:
The liquid in glass thermometer was developed in the 1630s, but a universal standard of temperature remained elusive. Each scientist had his own scale divisions, often based on different reference points. It is impossible for us accurately to convert their measurements to our temperature scale, and at the time it was impossible to compare temperatures in different places. In the early eighteenth century, universal temperature scales based on several fiduciary points (e.g. a mixture of ice and brine, a mixture of ice and water, body temperature, the boiling point of water) were developed by Daniel Gabriel Fahrenheit (1686-1736), Anders Celsius (1701-1744), and René-Antoine Ferchault de Réaumur (1683-1757).
It is worth hitting the link as they have a well done site and it has nice artwork, including what looks to be period paintings.
So, at this point, we can see that Celsius and Fahrenheit were around at about the same time, and using about the same materials and methods. Not a lot of reason to choose one over the other, really. (Yes, I’m very aware of how C integrates with the rest of the French Units… oh, pardon, “SI” units … are those the too little ones or the too big ones? :-)
But what about that Réaumur guy? Well, we have a couple of other folks to get past first…
Yes, that Newton. He was into everything it seems. So yes, he also had a thermometer and a scale for it. From the wiki:
The Newton scale is a temperature scale devised by Isaac Newton around 1700. Applying his mind to the problem of heat, he elaborated a first qualitative temperature scale, comprising about twenty reference points ranging from “cold air in winter” to “glowing coals in the kitchen fire”. This approach was rather crude and problematic, so Newton quickly became dissatisfied with it. He knew that most substances expand when heated, so he took a container of linseed oil and measured its change of volume against his reference points. He found that the volume of linseed oil grew by 7.25% when heated from the temperature of melting snow to that of boiling water.
After a while, he defined the “zeroth degree of heat” as melting snow and “33 degrees of heat” as boiling water. His scale is thus a precursor of the Celsius scale, being defined by the same temperature references. Indeed it is likely that Celsius knew about the Newton scale when he invented his. Newton called his instrument a “thermometer”.
Thus the unit of this scale, the Newton degree, equals 100⁄33 (approximately 3.03) kelvins or degrees Celsius and has the same zero as the Celsius scale.
So from Newton we get the name ‘thermometer’ and we have the use of melting ice / boiling water as calibration points. The choice of 33 degrees seems a bit odd, then again, he was rumored to be a Freemason, so 33 has a certain charm…
Also, IMHO, that oddball ratio isn’t really evidence of drift in either scale: with melting ice to boiling water spanning 33 degrees for Newton and 100 for Celsius, 100/33 (about 3.03) is exactly what the shared calibration standards give you; it was never an even 3:1.
So, for my purposes, the use of a Newton Scale has just too fat a degree.
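The quoted ratio is easy to check from the calibration points (assuming, as the quote says, the same melting snow / boiling water endpoints for both scales):

```python
from fractions import Fraction

# 0 N and 0 C both sit at melting snow; 33 N and 100 C both at boiling water.
newton_degree_in_c = Fraction(100, 33)

def newton_to_c(n):
    return n * newton_degree_in_c

assert newton_to_c(33) == 100                          # boiling water
assert abs(float(newton_degree_in_c) - 3.03) < 0.005   # the quoted 'approximately 3.03'
```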
From the wiki we get:
Rømer is a temperature scale named after the Danish astronomer Ole Christensen Rømer, who proposed it in 1701.
In this scale, the zero was initially set using freezing brine. The boiling point of water was defined as 60 degrees. Rømer then saw that the freezing point of pure water was roughly one eighth of the way (about 7.5 degrees) between these two points, so he redefined the lower fixed point to be the freezing point of water at precisely 7.5 degrees. This did not greatly change the scale but made it easier to calibrate by defining it by reference to pure water. Thus the unit of this scale, a Rømer degree, is 100/52.5 = 40/21 of a kelvin (or of a Celsius degree). The symbol is sometimes given as °R, but since that is also sometimes used for the Rankine scale, the other symbol °Rø is to be preferred. The name should not be confused with Réaumur.
The inventor of the Fahrenheit scale Daniel Gabriel Fahrenheit learned of Rømer’s work and visited him in 1708; in one of his letters Fahrenheit narrates how he borrowed the idea for the scale from this visit, increasing the number of divisions by a factor of four and eventually establishing what is now known as the Fahrenheit scale, in 1724
Here we see the use of brine as an easy way to get a clearly repeatable reference point. Here we also see the use of boiling water for the other end of the scale. But then he shifts to using the freezing point of water, once it was recognized as repeatable. So, in many ways, Celsius is just a rip-off of Rømer, but with ‘base 10’ bias instead of factor rich ‘base 60’. Personally, I’d rather have the Rømer system (though I’d likely use 360 instead of 60).
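The 40/21 figure from the quote falls straight out of the endpoints; a quick fraction check (the function name is mine):

```python
from fractions import Fraction

# Freezing at 7.5 Ro, boiling at 60 Ro: 52.5 Ro degrees span 100 C degrees.
romer_degree_in_c = Fraction(100, 1) / Fraction(105, 2)   # 100 / 52.5
assert romer_degree_in_c == Fraction(40, 21)

def romer_to_c(ro):
    return (ro - Fraction(15, 2)) * romer_degree_in_c

assert romer_to_c(Fraction(15, 2)) == 0   # freezing point of water
assert romer_to_c(60) == 100              # boiling point of water
```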
So let’s think about this for a minute: We want to recognize the folks who use 100 divisions rather than the guy who used 60? That’s really enough to warrant all the hoopla? Oh, right, he also got the scale backwards and it took a few years to fix that…
Then again, it looks like Fahrenheit is a bit of a rip-off of his work too. Kept the brine starting point (that I rather like as you don’t need ‘pure’ water quite so much) but went to a larger number of divisions for his scale. Oddly, this is in conflict with the story about using 96 degrees as body temperature, but we will sort that out down below.
I note in passing that pretty much everyone was using the same basic standards of something freezing or boiling, with the occasional use of body temperature, as they figured out what was a stable temperature reference and what was not. In that context, and with most things not having precision to 1 F, using core waking body temperature is a pretty decent discovery.
At any rate, the Rømer thermometer looks like a pretty decent one, but the degrees are rather fat, so be prepared to use a lot of decimal points…
There is a rather good telling of the story here:
That includes a description of making a thermometer along with a description of the transition in how Fahrenheit did his scale:
Fahrenheit used this scale until 1717, with the only difference that he divided every °Rø in four °F, so the two fixpoints, the freezing point was 30°F and the human body temperature was 90°F. He then changed the scale (because it was difficult to divide in thirtieth) to FP = 32°F and HBT = 96°F. He discovered that the temperature of youngsters were higher than that of elderly people, so HBT was not so fixed as he had believed. He changed this fixpoint to the temperature of boiling water, BP = 205°F to 212°F, depending on air pressure, so the new fixpoint gave a scale similar to the old one. Later, he began making thermometers filled with Mercury and after experiments he preferred this liquid; furthermore, it was difficult to obtain alcohol with the same strength and consequently the same expansion, every time he wanted to make thermometers.
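The ‘four °F per °Rø’ step in that quote is simple to sketch. (The 22.5 °Rø body temperature below is my back-calculation from the quoted 90 °F, not a figure from the source):

```python
# Fahrenheit's first scale, per the quote: four of his degrees per Romer degree.
def romer_to_early_f(ro):
    return ro * 4

assert romer_to_early_f(7.5) == 30.0    # freezing point, matching the quote
assert romer_to_early_f(22.5) == 90.0   # 'blood heat', back-figured from 90 F
```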
Personally, I’d rather have 98 F +/- 1 F or so for normal healthy adult body temperature than a 205 – 212 F range for making a DIY thermometer. OTOH, I think a nice eutectic salt mix could work even better.
In any case, I think it’s pretty clear that there is a very large debt owed to Rømer by both Celsius and Fahrenheit (and via them by Kelvin and Rankine).
One small point: Many times folks assert that ONLY the USA still uses Fahrenheit. That is not true. Aside from all the individuals scattered around the world who use it (and I’ve seen it in several countries in addition to C) there are a few other countries with the good sense to keep it official. Per the wiki, the Cayman Islands and Belize. As there are lots of folks from the USA who tour the Caribbean Islands, I’ve seen F used in others of them as well. Also on a cruise ship in that area. This article:
asserts that there are some holdouts in the UK and Canada as well.
In some countries, both systems are used. In the United Kingdom and Canada, Celsius is mainly used in the news, weather forecasts, books, magazines and daily conversations, but many outdoor thermometers display temperatures in both Fahrenheit and Celsius. Likewise, indoor thermometers, including both digital and analogue, may be in Fahrenheit, Celsius or both.
I suspect one would find other users scattered about the Former British Empire, if one cared to look.
But on with the article…
We’ve already seen some information about the F scale, so I’ll just mention a few interesting bits here. From the wiki we get:
On the Fahrenheit scale, the freezing point of water is 32 degrees Fahrenheit (°F) and the boiling point 212 °F (at standard atmospheric pressure). This puts the boiling and freezing points of water exactly 180 degrees apart. Therefore, a degree on the Fahrenheit scale is 1⁄180 of the interval between the freezing point and the boiling point. On the Celsius scale, the freezing and boiling points of water are 100 degrees apart. A temperature interval of 1 degree Fahrenheit is equal to an interval of 5⁄9 degrees Celsius. The Fahrenheit and Celsius scales intersect at −40 °F (−40 °F and −40 °C represent the same temperature).
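Easy enough to check all of that for yourself. A quick Python sketch (the function names are mine):

```python
# The standard F/C relationships quoted above:
#   C = (F - 32) * 5/9  and  F = C * 9/5 + 32

def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

print(f_to_c(32))    # 0.0   -- freezing point of water
print(f_to_c(212))   # 100.0 -- boiling point, 180 F degrees higher
print(c_to_f(-40))   # -40.0 -- the scales really do cross at -40
```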
I note that 180 is exactly 1/2 of 360… a “magic number” (in the computer geek sense of one that keeps showing up with particular meanings / uses) of sorts that is factor rich and good for fractional math. Was this really an accident? Or is there a bit of history of the decisions about the Fahrenheit scale that are a bit shrouded in mystery? One can only guess.
My guess would be that in the process of adjusting from 240 degrees and using 96 as the body temperature, over to using boiling water, the folks who did the recalibrating noticed a nice connection to 180 and the boiling point of water and, during THAT shift of calibration, shifted to 180 divisions as well. Factors of 180? 1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 30, 36, 45, 60, 90, 180.
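For the curious, that factor list is a one-liner to verify (a throwaway sketch):

```python
# List every whole-number divisor of 180 -- the "factor rich" claim above.
n = 180
factors = [d for d in range(1, n + 1) if n % d == 0]
print(factors)
# [1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 30, 36, 45, 60, 90, 180]
print(len(factors))  # 18 divisors -- plenty of convenient fractions
```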
The wiki is a bit unclear on who did the deed that gave us 98.6 instead of 96 F for body temperature…
According to an article Fahrenheit wrote in 1724, he based his scale on three reference points of temperature. In his initial scale (which is not the final Fahrenheit scale), the zero point is determined by placing the thermometer in brine: he used a mixture of ice, water, and ammonium chloride, a salt, at a 1:1:1 ratio. This is a frigorific mixture which stabilizes its temperature automatically: that stable temperature was defined as 0 °F (−17.78 °C). The second point, at 32 degrees, was a mixture of ice and water without the ammonium chloride at a 1:1 ratio. The third point, 96 degrees, was approximately the human body temperature, then called “blood-heat”.
According to a letter Fahrenheit wrote to his friend Herman Boerhaave, his scale was built on the work of Ole Rømer, whom he had met earlier. In Rømer’s scale, brine freezes at zero, water freezes and melts at 7.5 degrees, body temperature is 22.5, and water boils at 60 degrees. Fahrenheit multiplied each value by four in order to eliminate fractions and increase the granularity of the scale. He then re-calibrated his scale using the melting point of ice and normal human body temperature (which were at 30 and 90 degrees); he adjusted the scale so that the melting point of ice would be 32 degrees and body temperature 96 degrees, so that 64 intervals would separate the two, allowing him to mark degree lines on his instruments by simply bisecting the interval six times (since 64 is 2 to the sixth power).
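The arithmetic in that quote is simple enough to sketch out. A quick bit of Python (the linear re-mapping is just my reading of the quoted numbers; function names are mine):

```python
# Fahrenheit's first step, per the quote: four degrees F per degree Romer.
def romer_x4(ro):
    """Multiply each Romer value by four to eliminate fractions."""
    return ro * 4

# His recalibration: stretch the 30..90 span onto 32..96, i.e. map
# ice point 7.5 Ro -> 32 and blood-heat 22.5 Ro -> 96 linearly.
def recalibrated(ro):
    """Map a Romer reading onto the adjusted 32..96 scale."""
    return 32 + (ro - 7.5) * (96 - 32) / (22.5 - 7.5)

print(romer_x4(7.5), romer_x4(22.5))          # 30.0 90.0 -- the first scale
print(recalibrated(7.5), recalibrated(22.5))  # 32.0 96.0 -- 64 intervals apart
```

Note that 96 − 32 = 64 = 2⁶, which is exactly what makes the six bisections work.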
Fahrenheit observed that water boils at about 212 degrees using this scale. Later, other scientists decided to redefine the degree slightly to make the freezing point exactly 32°F, and the boiling point exactly 212 °F or 180 degrees higher. It is for this reason that normal human body temperature is approximately 98° (oral temperature) on the revised scale (whereas it was 90° on Fahrenheit’s multiplication of Rømer, and 96° on his original scale).
At any rate, the ‘basis’ of the modern Fahrenheit scale is just as ‘scientific’ as the Celsius scale and the major difference is just what size gradations you have and how often you have to use ‘below zero’ for weather reports. If doing chemistry or physics problems in metric, it works a bit better to use C, but frankly I never found it much different using any particular system. (With the way the metric folks keep shifting names and units though, it’s a real PITA sometimes. I find the ‘too small’ vs ‘too large’ vs “what’s next?” a bit of an annoyance. CGS vs SI vs ?? Frankly, my ‘pound of butter, foot of cloth, and 72F room temperature’ haven’t changed my whole life, nor that of my parents, nor their parents, or their parents… and I’m rather happy with that.)
So what came after Fahrenheit? And was it significantly different or better?
From this wiki we find that it is based on the same standards (boiling and freezing water) but with 80 divisions instead of 100. OK, 80 isn’t as factor rich as 180 or 360, but it’s still pretty good. Why use only 80 divisions? Well, for one thing, he used alcohol in the thermometer. It is not as precise as mercury, so finer divisions would have really just been recording more false precision. Part of what let Fahrenheit do so many divisions was the use of mercury. We also get to see how someone’s bright idea can get changed in the implementation as folks changed how he defined the device when they went to others for manufacture…
The Réaumur scale (°Ré, °Re, °R), also known as the “octogesimal division”, is a temperature scale in which the freezing and boiling points of water are set to 0 and 80 degrees respectively. The scale is named after René Antoine Ferchault de Réaumur, who first proposed something similar in 1730.
Réaumur’s thermometer contained diluted alcohol and was constructed on the principle of taking the freezing point of water as 0°, and graduating the tube into degrees each of which was one-thousandth of the volume contained by the bulb and tube up to the zero mark. He suggested that the quality of alcohol employed be such that it began boiling at 80 °Ré — that is, when it had expanded in volume by 8 %. He chose alcohol instead of mercury on the grounds that it expanded more visibly, but this posed problems: his original thermometers were very bulky, and the low boiling point of alcohol made them unsuitable for many applications. Instrument-makers generally chose different liquids, and then used 80 °Ré to signify the boiling point of water, causing much confusion. In 1772 Jean-André Deluc studied the several substances then used in thermometers in the light of new theories of heat and came to the conclusion that mercurial thermometers were the best for practical use; for example, if two equal amounts of water at x and y degrees were mixed, the temperature of the result was then the average of x and y degrees, and this relationship only held reliably when mercury was used. From the late 18th century mercury was used almost without exception. These thermometers, the stems of which are graduated into eighty equal parts between the freezing and boiling points of water, are not Réaumur’s original thermometers in anything but name.
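Since both Réaumur and Celsius put zero at freezing water, converting between them is just a scale factor of 80/100. A quick sketch (function names mine):

```python
# Reaumur <-> Celsius: same zero point (freezing water), so the
# conversion is a pure ratio of the two spans: 80 Re = 100 C.
def c_to_re(c):
    """Convert degrees Celsius to degrees Reaumur."""
    return c * 4 / 5

def re_to_c(re):
    """Convert degrees Reaumur to degrees Celsius."""
    return re * 5 / 4

print(c_to_re(100))  # 80.0  -- boiling water
print(re_to_c(80))   # 100.0 -- and back again
```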
So, is this a ‘dead system’ now? Well, not quite. For what just must be fascinating historical reasons, it is still used in some kinds of cheese making operations:
The Réaumur scale saw widespread use in Europe, particularly in France and Germany as well as Russia, as referenced in works of Dostoyevsky, Tolstoy, and Nabokov. By the 1790s, France chose the Celsius scale for the metric system over the Réaumur measurements. Its only modern use is in the measuring of milk temperature in cheese production. It is used in some Italian dairies making Parmigiano-Reggiano and Grana Padano cheeses and in Swiss Alp cheeses.
Gotta love those cheese makers. Not going to change a thing and put the cheese at risk ;-)
So we still care about this scale if we wish to read old Russian novels or like our traditional cheeses… or for looking at old temperature records from Europe prior to 1800.
Some day I’ll just have to indulge a small ‘dig here’ about just why those cheese makers kept it. Perhaps some other ‘magic number’ where the cheese is aged at a nice round factor? Something that ends up being a strange fractional thing in C?
The Delisle scale is in some ways the more interesting as it runs ‘backwards’ like the original Celsius scale.
From the wiki
The Delisle scale (°D) is a temperature scale invented in 1732 by the French astronomer Joseph-Nicolas Delisle (1688–1768). Delisle was the author of Mémoires pour servir à l’histoire et aux progrès de l’Astronomie, de la Géographie et de la Physique (1738).
He had been invited to Russia by Peter the Great. In 1732 he built a thermometer that used mercury as a working fluid. Delisle chose his scale using the temperature of boiling water as the fixed zero point and measured the contraction of the mercury (with lower temperatures) in hundred-thousandths. The Celsius scale, likewise, originally ran from zero for boiling water down to 100 for freezing water. This was reversed to its modern order some time after his death, in part at the instigation of Daniel Ekström, the manufacturer of most of the thermometers used by Celsius.
The Delisle thermometers usually had 2400 graduations, appropriate to the winter in St. Petersburg. In 1738 Josias Weitbrecht (1702–47) recalibrated the Delisle thermometer with 0 degrees as the boiling point and 150 degrees as the freezing point of water. The Delisle thermometer remained in use for almost 100 years in Russia.
Key points: It used mercury. This matters for accuracy and precision. It used boiling water to ‘fix the zero’, just like the original Celsius. Instead of using frozen water to fix a second point, he simply contracted the mercury and marked it in steps. Calibration being based, in essence, on the physics of mercury and percentage contraction. We get a ‘recalibration’ to freezing water in 1738 (but don’t know how much that changed things).
Note, too, that early records from Russia (up to about 1832) will be in this scale. These old methods and their changes STILL matter to us if we wish to go back and look at ‘original records’ and source documents.
I can only speculate that the 2400 ‘graduations’ was in 1/10 degrees. That would give 240 total degrees and if 150 of them were between the boiling and freezing points of water, that would leave 90 more for ‘below zero’ on the Celsius scale. At 1.5 Delisle degrees per Celsius degree, that’s down to about -60 C or “appropriate to the winter in St. Petersburg” indeed…
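My speculation is easy to sanity check, assuming the 1738 Weitbrecht recalibration (0 °De at boiling, 150 °De at freezing) and my guess about tenths. A quick sketch (function names mine):

```python
# Delisle runs "backwards": 0 at boiling water, 150 at freezing water,
# so there are 1.5 Delisle degrees per Celsius degree, counted downward.
def c_to_de(c):
    """Convert degrees Celsius to degrees Delisle."""
    return (100 - c) * 3 / 2

def de_to_c(de):
    """Convert degrees Delisle to degrees Celsius."""
    return 100 - de * 2 / 3

print(c_to_de(100))  # 0.0   -- boiling water
print(c_to_de(0))    # 150.0 -- freezing water
print(de_to_c(240))  # -60.0 -- the bottom of a 2400-graduation tube read in tenths
```

So 240 whole degrees Delisle lands at about −60 C, St. Petersburg winter territory indeed.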
Only in Russia would you need a thermometer that was ‘open ended’ to the downside ;-)
I don’t know if there is all that much worth saying about Celsius that we’ve not already covered. It is convenient for some kinds of chemical reaction equations (heat balance too) but IMHO has too large a graduation for day to day things. I want to know if I’ve got a 1 F body temp variation from normal, not be trying to read the .xx range.
Mostly I’ll use this section to cover some of the points that people frequently flail on, but that really don’t deserve it.
One is Centigrade vs Celsius. For some entirely unknown reason, some folks come unglued when you say ‘Centigrade’ and insist that all sorts of things are Centigrade and it is NOT a synonym for Celsius and that us Philistines need to get with the program and use The Proper Term. Well, I’m rather fond of my history, and just because some YaHoo thinks he can tell me he is in charge of changing things, that does not mean I can never use a historical term.
For the last 204 years, the scientific and thermometry communities worldwide referred to this scale as the centigrade scale. Temperatures on the centigrade scale were often reported simply as degrees or, when greater specificity was desired, as degrees centigrade. The symbol for temperature values on this scale was °C.
So I’ll take that 204 years as approval to use Centigrade whenever I damn well please. Been doing so since I used it as such in high school chemistry class and see no reason to change now.
Because the term centigrade was also the Spanish and French language name for a unit of angular measurement (1/10,000 of a right angle) and had a similar connotation in other languages, the term centesimal degree was used when very precise, unambiguous language was required by international standards bodies such as the Bureau international des poids et mesures (BIPM). The 9th CGPM (Conférence générale des poids et mesures) and the CIPM (Comité international des poids et mesures) formally adopted “degree Celsius” (symbol: °C) in 1948.
For scientific use, “Celsius” is the term usually used with “centigrade” otherwise continuing to be in common use.
So no, some committee in France does not get to tell me how to speak English.
Frankly, the kinds of petty bickering indulged in by folks involved in trying to regulate other folks behaviour can be astounding (and not at all worth my time to try and follow…)
The “degree Celsius” has been the only SI unit whose full unit name contains an uppercase letter since the SI base unit for temperature, the kelvin, became the proper name in 1967 replacing the term degree Kelvin. The plural form is degrees Celsius.
Really? REALLY? Folks actually devoted time, money, and effort to deciding when to capitalize what? And ended up with two different proper names of real people where ONE is to be capitalized and the other not? Well I’m going to keep on calling it a degree Kelvin or even just a Kelvin; and I’m as likely to call it a centigrade as a celsius and / or Celsius.
Better yet, I’ll just use F and ignore the whole thing ;-)
The general rule is that the numerical value always precedes the unit, and a space is always used to separate the unit from the number, e.g., “23 °C” (not “23°C” or “23° C”). Thus the value of the quantity is the product of the number and the unit, the space being regarded as a multiplication sign (just as a space between units implies multiplication). The only exceptions to this rule are for the unit symbols for degree, minute, and second for plane angle, °, ′, and ″, respectively, for which no space is left between the numerical value and the unit symbol.
Riiiight… A space implies multiplication? How about we just use C 23, 23 C, 23C or anything else folks can figure out? Frankly, I’m glad I’m using F as folks don’t seem to get their panties in a bunch nearly so much about how you use it (their heads already exploding over the simple fact that you DID use it ;-)
There is another interesting bit in the wiki, but in addition to the continued obsessing over Form-Over-Function, there is also an interesting minor point on temperature vs interval. I do find it rather odd that they are claiming that the C is just a ‘special name’ for a Kelvin. Somehow I think someone has molested this pooch a bit too much…
The degree Celsius is a special name for the kelvin for use in expressing Celsius temperatures. The degree Celsius is also subject to the same rules as the kelvin with regard to the use of its unit name and symbol. Thus, besides expressing specific temperatures along its scale (e.g. “Gallium melts at 29.7646 °C” and “The temperature outside is 23 degrees Celsius”), the degree Celsius is also suitable for expressing temperature intervals: differences between temperatures or their uncertainties (e.g. “The output of the heat exchanger is hotter by 40 degrees Celsius”, and “Our standard uncertainty is ±3 °C”). Because of this dual usage, one must not rely upon the unit name or its symbol to denote that a quantity is a temperature interval; it must be unambiguous through context or explicit statement that the quantity is an interval.
This is sometimes solved by using the symbol °C (pronounced “degrees Celsius”) for a temperature, and C° (pronounced “Celsius degrees”) for a temperature interval, although this usage is non-standard.
Got that? An awful long winded way of saying “temperature is FOO” is not the same as “offset is FOO degrees”. It has to be something about other languages, since in all the English I’ve ever used, saying “It is FOO” and any degree units you like tells you it is a temperature and saying “it is hotter | colder | range of FOO” tells you it is an interval.
What is often confusing about the Celsius measurement is that it follows an interval system but not a ratio system; that it follows a relative scale not an absolute scale. This is put simply by illustrating that while 10 °C and 20 °C have the same interval difference as 20 °C and 30 °C the temperature 20 °C is not twice the air heat energy as 10 °C. As this example shows degrees Celsius is a useful interval measurement but does not possess the characteristics of ratio measures like weight or distance.
Are folks REALLY so confused about heat content that they need this spelled out to them? Is this really ‘special’ to the folks who use Celsius? So much that they need this kind of hand holding?
Frankly I really hope it’s just a wikimentia thing, because if the folks who use Celsius really need that kind of spoon feeding, well, lets just say that the Fahrenheit folks don’t seem to need it at all and do just fine with such complex things as “It’s 20 F hotter today than yesterday” not meaning it’s 20 F outside… or that air at 20 F is not holding twice as much heat as air at 10F (frankly, everyone pretty much knows that if it’s 10F or 20F there’s just no darned heat to speak of in it anyway ;-) But perhaps such clear insights come from a scale that puts both 10F and 20F in the freezer… and makes it fairly clear by context most of the time that it’s not going to be 90F warmer than yesterday in San Diego… (The more I think about it, the more I think F has a natural advantage here…)
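If someone really does need it spelled out, the whole interval-vs-ratio point fits in a couple of lines (a throwaway sketch):

```python
# 20 C is not "twice as hot" as 10 C: ratios only make sense on an
# absolute scale, so shift both readings to kelvin before dividing.
t1_k = 10 + 273.15   # 10 C in kelvin
t2_k = 20 + 273.15   # 20 C in kelvin
print(round(t2_k / t1_k, 3))  # 1.035 -- nowhere near 2.0
```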
Sidebar on ‘wikimentia’: Over on Verity Jones’ place we were looking at neologisms, and I’m kind of stuck in that mode now… so be advised that you may see some ‘new words’ from me for a while. ‘Wikimentia’ being a kind of dementia commonly seen in wiki articles written by folks with some mental ‘issues’ and reflecting in the various sorts of delving into too much detail, not enough detail, making up detail, deleting inconvenient facts, and generally being empixelated and / or causing cliflation in articles. Wikimentia can also lead to googlehuffing articles and general panixilation in keeping with the enviralaxed nature of their out of alignment world view… Definitions at the link ;-) And yes, some of that wiki on C gives me just that kind of feeling ;-)
And All The Rest
With the discovery / proof of an ‘absolute zero’, we got the two most common temperature scales re-indexed to that point. Useful for some kinds of science. A bit hard to make your own thermometer using a ‘zero degrees at absolute zero’ temperature standard for calibration ;-)
Still, they were the last development of this saga. Rankine being Fahrenheit reset to an absolute zero start while Kelvin (note that I’m happy with a capital K on Kelvin…) is a reset centigrade (and I’m happy with centigrade as a synonym for Celsius or ‘renamed kelvin’…).
The wiki on Rankine is rather sparse. The only interesting bit being who created it, and when:
Rankine is a thermodynamic (absolute) temperature scale named after the Glasgow University engineer and physicist William John Macquorn Rankine, who proposed it in 1859. (The Kelvin scale was first proposed in 1848.)
The symbol for degrees Rankine is R (or Ra if necessary to distinguish it from the Rømer and Réaumur scales). Zero on both the Kelvin and Rankine scales is absolute zero, but the Rankine degree is defined as equal to one degree Fahrenheit, rather than the one degree Celsius used by the Kelvin scale. A temperature of −459.67 °F is exactly equal to 0 R.
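The Rankine arithmetic, sketched out (function names mine):

```python
# Rankine: Fahrenheit-sized degrees counted up from absolute zero.
def f_to_r(f):
    """Convert degrees Fahrenheit to degrees Rankine."""
    return f + 459.67

def r_to_k(r):
    """One Rankine degree = one Fahrenheit degree = 5/9 of a kelvin."""
    return r * 5 / 9

print(f_to_r(-459.67))               # 0.0 -- absolute zero
print(round(f_to_r(32), 2))          # 491.67 -- freezing water
print(round(r_to_k(f_to_r(32)), 2))  # 273.15 -- the same point in kelvin
```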
Frankly, anything invented by a Glasgow University Engineer (note the capital E on Engineer… get it right, damn it!) has got my vote (even if he does have 4 names).
The Kelvin wiki is wordier, but doesn’t really say much more. It has a lot of the same SI prissiness about conventions and rules of use, but at the core it’s just a C shifted to start at absolute zero:
The kelvin is a unit of measurement for temperature. It is one of the seven base units in the International System of Units (SI) and is assigned the unit symbol K. The Kelvin scale is an absolute, thermodynamic temperature scale using as its null point absolute zero, the temperature at which all thermal motion ceases in the classical description of thermodynamics. The kelvin is defined as the fraction 1⁄273.16 of the thermodynamic temperature of the triple point of water (273.16 K (0.01 °C; 32.02 °F)).
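The kelvin arithmetic is about as simple as it gets: a plain offset from Celsius. A quick sketch (function names mine):

```python
# Kelvin is Celsius re-zeroed at absolute zero; the defining point
# quoted above is the triple point of water, 0.01 C = 273.16 K.
def c_to_k(c):
    """Convert degrees Celsius to kelvins."""
    return c + 273.15

def k_to_c(k):
    """Convert kelvins to degrees Celsius."""
    return k - 273.15

print(round(c_to_k(0.01), 2))  # 273.16 -- triple point of water
print(k_to_c(0))               # -273.15 -- absolute zero
```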
Oh, I ought to note that the 0.01 C triple point issue is yet another minor loose bit of ‘slightly-off’ in the land of C… and that defined at 1/273.16 is not so easy to have slide off the tongue either… Starting to smell more like a ‘legacy kludge’ and less like a clean scientific design…
But at least Kelvin is also the product of some decent Glasgow University Engineer talent:
The Kelvin scale is named after the Belfast-born, Glasgow University engineer and physicist William Thomson, 1st Baron Kelvin (1824–1907), who wrote of the need for an “absolute thermometric scale”. Unlike the degree Fahrenheit and degree Celsius, the kelvin is not referred to or typeset as a degree. The kelvin is the primary unit of measurement in the physical sciences, but is often used in conjunction with the degree Celsius, which has the same magnitude. Absolute zero at 0 K is −273.15 °C (−459.67 °F).
Why all the weirdness and making a Kelvin into a kelvin? Lord only knows. But frankly, if being free of all that kind of nonsense is a side effect of using F and Rankine, hey, I’m all for it!
There’s 3/4 of a screen full (4 paragraphs) of egotripe farigling of syntax and usage at the wiki. The smallest bit is enough to give a brain fade:
When reference is made to the unit kelvin (either a specific temperature or a temperature interval), kelvin is always spelled with a lowercase k unless it is the first word in a sentence. When reference is made to the “Kelvin scale”, the word “kelvin”—which is normally a noun—functions adjectivally to modify the noun “scale” and is capitalized.
Yeah, with a page full of that kind of advice just on how to use Kelvin, I’m dead certain that it’s easier to just tell the whole SI gang to stuff it and stick with F. Nobody really cares if you say F, degrees F, Fahrenheit, Fahrenheit degrees, or if you’ve nounified your verbs or vebialed your nouns or if you function adjectivally or not… it just functions however you like…
But just to show that things are not quite settled in the ‘settled science’ of temperatures, they are working hard to redefine the Kelvin, yet again (and with it the degree C).
In 2005 the CIPM embarked on a program to redefine, amongst others, the kelvin using a more rigorous basis than was in use. The current (2010) definition is unsatisfactory for temperatures below 20 K and above 1300 K. It is anticipated that the program will be completed in time for its adoption by the CGPM at its 2011 meeting. The committee proposes defining the kelvin as the temperature scale for which Boltzmann’s constant is 1.3806505 × 10⁻²³ J/K exactly.
From a scientific point of view, this will link temperature to the rest of SI and result in a stable definition that is independent of any particular substance. From a practical point of view, the redefinition will pass unnoticed; water will still freeze at 0 °C (32 °F)(273.15 K).
I’m sure they have a good reason for this. (Well, I’m mostly just hoping they do…) And I’m sure someone is paying them a nice fat government salary to ponder such weighty things. But really, do I care? Only to the extent that they are sucking on my taxes (which I can only hope is ‘not at all’…)
Having officious bureaucrats in charge of things like “what is a degree?” is, IMHO, a big step backwards from when guys with Engineer after their name or guys with MD as their calling were figuring out how to make useful instruments. I’m sure the Engineers and MDs will ‘keep on keeping on’ and not let such things stop them from getting some decent work done.
I’m also pretty sure that I’d rather know I can make my own thermometer that’s good “for all practical purposes” using a divider (compass), a bucket of ice water (or brine), and either a steam bucket or my own body temperature. That nobody is likely to harass me about how I capitalize it or when I use it adjectively is all just gravy. IF I can make some SI Snob’s head explode in the process, so much the better.
The reality of temperature measurement (outside of those places where ‘angels and pins’ is a valid topic…) is that such practical arts are far more valuable than how precisely the unit tracks Boltzmann’s constant.
Does that mean I’ll never use C or K? Not at all. I’ll use them whenever it’s easier to use them. For substantially everything I do day to day, that’s nearly never. (The major exception being ‘climate science’ code wrangling) Frankly, I’ve got a new found respect for the Rømer and the Réaumur and just might try finding ways to use them, from time to time, too. Heck, I’ve always wanted to try cheese making ;-)
I’ve also got this idea niggling at me to make the “degree S” (yes, as in Smith). Pick a nice eutectic salt for the top end standard. Use ammonium salt /ice / water for the zero point. Divide into 360 degrees between them. No dependency on air pressure / steam… Or perhaps use 256 degrees so it’s easy to mark using dividers… Now if I can just figure out a way to have it be 256 degrees between two calibration points and 360 degrees between ‘zero’ and ‘hot calibration’ I’ll be all set ;-)
For those not willing to hit the link, a rough idea of what the ‘neologisms’ mean. Not as detailed as at the link, but the general idea:
Cliflation: The tendency for anything climate related to be inflated in importance, size, warming tendency, etc.
Empixelated: To uncritically believe anything presented to you by the pixels on your screen. “Jones was sure Mann would be empixelated by the latest runs of HADcrut”.
Enviralax: That peculiar tendency to see things shifted through an environmental filter just out of kilter. Political parallax.
Farigle: To diddle data that is far enough away that nobody will notice. “Hansen was farigling the Arctic Data”
Googlehuffing: To manipulate search engines so as to rank an article (especially one about climate) extra high for political / monetary purposes. “AlGore asked the programmer to googlehuff his latest book”. A non-standard usage is to down rate articles by skeptics or with a skeptical point of view. “Algore demanded WUWT be googlehuffed into the 20th page”.
Panixilation: (Br. Sp. Panicselation) That peculiar tendency to turn anything into a Panic Attack, especially with some exhilaration about it. Often seen in the Warmers World. “Hansen was clearly panixilated about the coal trains”.
Wikimentia: A kind of dementia commonly seen in wiki articles reflecting bias in the various sorts of delving into too much detail, not enough detail, making up detail, deleting inconvenient facts, and generally being politically driven to excess.
Not used here, but being mentioned anyway:
Doomian: The world view that says anything we do can only lead to doom. Related to panixilation, but more operative in that an actual outcome of doom is predicted. Doomers is a related noun form. “The doomian result was clearly sea level rise of 1000 meters and the loss of all islands in the Pacific.” Or even “Jones, a clear doomer, looked at the printout, panixilated, and said ‘The end is near!’; yet Smith just thought him doomian.”
In closing, a Galileo Thermoscope: