How do resistance thermometers work?

Resistance thermometers are widely used, particularly where precise temperature measurements are required. Their usual temperature range is between about -50 °C and 600 °C. However, there are also special applications where resistance thermometers are used from -200 °C to over 1000 °C [1]. The picture shows typical Pt100 resistance thermometers from the Klasmeier calibration laboratory:

The measuring principle of resistance thermometers is based on measuring the electrical resistance of so-called measuring resistors. The basis for this is Ohm’s law:

U = R · I

U = voltage (in volts, V)
R = resistance (in ohms, Ω)
I = current (in amperes, A)

In physics lessons at schools, Ohm’s law is often presented as a triangle:

This triangle beautifully illustrates the connection between current, voltage and resistance. If two of the three quantities are known, the third, missing quantity can be calculated. In a circuit diagram, Ohm’s law can be represented as follows:
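This calculation can be sketched in a few lines of code. The numbers used below (a 1 mA sense current dropping 0.1 V across a measuring resistor) are illustrative values, not taken from the text:

```python
# Ohm's law: U = R * I. If two quantities are known, the third follows.
def resistance(voltage_v: float, current_a: float) -> float:
    """R = U / I, resistance in ohms."""
    return voltage_v / current_a

# Illustrative values: a 1 mA sense current dropping 0.1 V
# across a measuring resistor corresponds to 100 ohms.
print(resistance(0.1, 0.001))
```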
The connections of the resistance measurement and thus the resistance thermometers can also be shown graphically in a diagram.

The “VOLT-man” (U – voltage) pushes the “AMP-man” (I – current) through a tube. The “OHM-man” (R – resistance) tries to prevent this by making the tube narrower and narrower. The success of the “OHM-man” also depends on temperature: the warmer it is, the harder it is for the “VOLT-man” to push the “AMP-man” along. Since the temperature-dependent success of the “OHM-man” is reproducible, the principle of electrical resistance measurement can be used to measure temperature. A measured resistance R in ohms is converted into a temperature T in °C or K via a known relationship.
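As a minimal sketch of that last step, the code below converts a resistance reading into a temperature using a simple linear Pt100 model. The nominal resistance R0 and the coefficient are typical literature values, and the linear relation is only an approximation of the real characteristic, not the exact standardized curve:

```python
# Illustrative linear Pt100 model (assumed literature values, not from the text):
R0 = 100.0        # nominal resistance at 0 °C, ohms (Pt100)
ALPHA = 3.85e-3   # mean temperature coefficient, 1/K (typical for platinum)

def temperature_from_resistance(r_ohm: float) -> float:
    """Linear approximation: R(T) = R0 * (1 + ALPHA * T)  =>  T = (R/R0 - 1) / ALPHA."""
    return (r_ohm / R0 - 1.0) / ALPHA

# A reading of about 138.5 ohms corresponds to roughly 100 °C in this model.
print(temperature_from_resistance(138.5))
```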

Any electrical conductor to which Ohm’s law can be applied can, in principle, be used as a thermometer. The physical constant that describes this property is the so-called specific resistance (resistivity). An overview from Wikipedia [2] shows the resistivities of different materials at 20 °C.

In principle, all of the materials mentioned can be used to draw conclusions about the temperature. In practice, however, there are various criteria by which materials for thermometers are selected. First of all, the material from which a thermometer is to be built should have the highest possible resistivity, and it must be fundamentally suitable for the purpose.

For example, human blood, at 1.6 × 10⁶ Ω·mm²/m, has an excellent resistivity, but is of course not suitable for building thermometers on an industrial scale. Metals are much better suited for this purpose.

In addition to the resistivity, there is another constant that matters for materials from which thermometers are to be built. The linear temperature coefficient of resistance describes the change in a material’s resistance per degree. It is given in 1/K and can also be regarded as the sensitivity. To keep the demands on the measurement technology as low as possible, this coefficient should also be as high as possible.
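As an illustration of sensitivity, the sketch below computes the resistance change per kelvin in a linear model. The coefficients used are typical literature values for platinum and nickel measuring resistors, not figures from the text:

```python
# Sensitivity in a linear model: dR/dT = R0 * alpha (ohms per kelvin).
def sensitivity(r0_ohm: float, alpha_per_k: float) -> float:
    """Resistance change per kelvin of a measuring resistor."""
    return r0_ohm * alpha_per_k

# Typical literature coefficients (assumed values):
pt100 = sensitivity(100.0, 3.85e-3)  # platinum, about 0.385 ohm/K
ni100 = sensitivity(100.0, 6.18e-3)  # nickel, about 0.618 ohm/K (more sensitive)
```

A higher sensitivity means a larger, easier-to-measure resistance change for the same temperature change, which is why this coefficient should be as high as possible.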

The task is therefore to find the best compromise between cost, the basic suitability of the material, and its resistivity and temperature coefficient of resistance.

Nickel and platinum are two metals that have proven to be suitable. At the beginning of the development of electrical resistance thermometers, nickel measuring resistors such as the Ni 100 were long regarded as favourites because they have a higher sensitivity than platinum measuring resistors. However, their larger limiting deviation and limited temperature range proved to be disadvantages. The standard [3] for nickel thermometers was withdrawn in the 1990s. Since then, nickel measuring resistors have been used primarily in special technical applications.

Over time, platinum measuring resistors such as the Pt 100 have become established. They are widely used in industrial measurement technology and today represent the standard for electrical temperature measurement with resistance thermometers.
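For temperatures at and above 0 °C, the standardized Pt100 characteristic (IEC 60751) is a quadratic, R(t) = R0 · (1 + A·t + B·t²), rather than a straight line. The sketch below inverts this quadratic to recover the temperature from a measured resistance; the constants A and B are the standard’s published values:

```python
import math

# Standardized Pt100 characteristic (IEC 60751), valid for 0 °C to 850 °C:
#   R(t) = R0 * (1 + A*t + B*t^2)
R0 = 100.0        # ohms at 0 °C
A = 3.9083e-3     # 1/°C
B = -5.775e-7     # 1/°C^2

def cvd_temperature(r_ohm: float) -> float:
    """Invert the quadratic (for t >= 0 °C) via the quadratic formula."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)
```

With these constants, a reading of about 138.51 Ω corresponds to roughly 100 °C, close to but not identical with the simple linear approximation.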


[3] DIN 43760