A resistance thermometer is a passive sensor; it requires a measuring current in order to produce a usable signal. This measuring current heats the element and raises its temperature, and errors will result unless the additional heat can be dissipated.
Self-heating is expressed in mW/°C: the power in milliwatts (1000·R·I²) required to raise the internal temperature of the sensor by 1°C. The higher the figure in mW/°C, the less significant the effect, since more power is needed for the same rise in temperature.
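Written out as a formula, the self-heating error is the dissipated power divided by this coefficient. The symbols R (sensor resistance in Ω), I (measuring current in A) and S (self-heating coefficient in mW/°C) are introduced here only for convenience; they are not notation taken from IEC 751:

\Delta T_{\mathrm{error}} \;=\; \frac{P}{S} \;=\; \frac{1000 \, R \, I^{2}}{S}
\qquad \left[\, P \text{ in mW},\; R \text{ in } \Omega,\; I \text{ in A},\; S \text{ in mW/}^{\circ}\mathrm{C} \,\right]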
As an example, let’s take a measuring current of 5 mA through a Pt100 sensor in an environment at 100°C, where its resistance is 138.5 Ω. The specifications (IEC 751) indicate a self-heating coefficient of 50 mW/°C in water flowing at 1 m/sec. The amount of heat produced is: 1000 × (0.005 A)² × 138.5 Ω = 3.5 mW; the self-heating error is: (3.5 mW)/(50 mW/°C) = 0.07°C, i.e. 0.07% of the temperature of the medium. However, 5 mA is a large current. Modern measurement instruments use much smaller measuring currents, of the order of 100 μA or even less. In the above case, this would give an error due to the increase in temperature of only (0.0014 mW)/(50 mW/°C) = 0.000028°C, which is negligible.
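The arithmetic above is easy to check with a few lines of code. The sketch below simply reproduces the example, taking the 138.5 Ω resistance of a Pt100 at 100°C and the 50 mW/°C coefficient for water flowing at 1 m/sec as given in the text; the function name and default values are only illustrative.

```python
# Self-heating error of a Pt100, reproducing the example above.
# Assumed values from the text: R = 138.5 ohm at 100 degC,
# self-heating coefficient S = 50 mW/degC (water flowing at 1 m/s).

def self_heating_error(current_a: float,
                       resistance_ohm: float = 138.5,
                       coeff_mw_per_degc: float = 50.0) -> float:
    """Return the self-heating error in degC for a given measuring current."""
    power_mw = 1000.0 * resistance_ohm * current_a ** 2   # dissipated power in mW
    return power_mw / coeff_mw_per_degc                   # temperature rise in degC

print(self_heating_error(0.005))    # 5 mA   -> about 0.069 degC
print(self_heating_error(0.0001))   # 100 uA -> about 0.000028 degC
```

As the two calls show, reducing the measuring current from 5 mA to 100 μA reduces the dissipated power, and hence the error, by a factor of 2500.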
NB: the resulting error is inversely proportional to the capacity of the thermometer to dispose of the additional heat; this depends on the thermometer's materials, construction and environment.
The worst case occurs where there is a high resistance in a small body. Film RTDs, with little surface area from which to dissipate the heat, are one example.
Self-heating also depends on the medium in which the thermometer is immersed. The error in still air can be 100 times greater than in moving water.