Winter is the season of thermometers. Weather forecasts are full of reports of mercury rising and falling in glass tubes, above or below the seasonal average. In many homes, parents are thrusting thermometers into ears, mouths, and armpits to monitor the symptoms of seasonal flus and colds. As we prepare for holiday meals, debates can be heard about the appropriate temperature at which to cook a roast turkey. The meat thermometer is fished out of the back of the utility drawer or, failing that, a pop-up disposable thermometer is bought for the occasion. Dry flesh and food poisoning, the twin risks of turkey roasting, can both be avoided with a good thermometer.
The thermometer is a relatively recent instrument in a long history of measuring hotness and coldness. Thermoscopes were used from ancient times to detect changes resulting from heating and cooling. The thermoscope consisted of a glass tube submerged in a container of water, and the movement of the water in the tube was affected by applying heat or cold to the container. In some versions, the tube ran between an empty sphere and a container of water, and heat or cold was applied to the empty sphere. If you remember any of your school physics, you will immediately see a number of problems: air pressure, evaporation, and condensation will all affect readings.
Despite the thermoscope’s limitations, scientists began trying to standardise instruments so that they gave comparable readings. This involved choosing two measurement points, one cold and one hot, and developing a regular scale between them. Cold was often simply a reading taken in snow, while hot was a reading taken at a candle flame. These were marked as minimum and maximum on the tube and the distance between them was divided into equal parts.
Galileo’s colleague
The earliest use of a thermometer to monitor temperature in patients is attributed to Sanctorius Justipolitanus, who was a colleague of Galileo’s at the University of Padua during the early 17th century. Sanctorius used a device with two glass bulbs connected by a tube and containing a liquid (possibly wine and water, as Galileo used in his thermoscope). Having created a scale – from snow to candle flame – Sanctorius placed one bulb in his patients’ mouths and recorded the movement of the liquid in the tube. Although physicians had long observed fever, this may have been the earliest attempt to measure it as internal temperature.
By the mid-17th century the hermetically sealed thermometer, similar to the ones we still use today, had been developed. One of the earliest such thermometers is attributed to the Grand Duke of Tuscany (a pupil of Galileo's). The sealed tube, filled with alcohol, reduced the problems of evaporation and air pressure and opened the door to creating instruments that delivered comparable readings in different places and times. A similar thermometer was also invented by Robert Boyle, working at the Royal Society in London. Boyle, in keeping with his reputation for scientific experimentation and replication, became particularly concerned to standardise the measurement of temperature. He understood that the use of different liquids (which responded differently to hot and cold) and different sizes of bulbs and tubes would affect measurements, and he tried to account mathematically for these variables.
In the early 18th century Daniel Fahrenheit proposed a thermometer with a temperature scale that is still in use, almost unaltered, today. He suggested three reference points: the temperature of a mixture of ice and salt, the temperature of ice alone, and the temperature of the human body. That is how freezing came to be 32 degrees Fahrenheit rather than 0 (0 marked the ice-and-salt mixture) and how body temperature came to be 96 degrees. Cooked turkey, had anyone measured it, would have registered 165 degrees. He also used mercury, which has many advantages in a thermometer, rather than alcohol or water. The thermometers that he manufactured were perceived as reliable and became very popular across Europe.
As many readers will recall, Fahrenheit’s temperature scale was in use in Ireland until the move to metrication in the 1970s. The metric temperature scale that Ireland has since adopted was devised by a Swede, Anders Celsius, in 1742. Confusingly, Celsius originally proposed 100 degrees to mark the freezing point of water and 0 to mark its boiling point. So if you want to feel a bit warmer, just reverse your temperature scale and tell yourself it’s not five degrees outside, it’s actually 95 degrees.
Juliana Adelman lectures in history at Dublin City University