**By Michael Emanuel**, VP of R&D

We are frequently asked whether the Trident measurement results are corrected for IR radiation, or whether a correction is needed at all. Radiation is one of the three basic modes of heat transfer between media. From around room temperature (RT) up to about 1,000 °C, the radiation emitted from bodies is mainly in the near- to far-infrared (IR, 0.7-20µm) range.

In this article, the quantity of heat, or heat flux, that is transferred from the sensor to the sample by IR radiation is evaluated against the heat flux that is transferred by conduction for different sensor methods under various temperature conditions and different optical properties of the samples.

The total IR radiative heat flux from a “gray” body at temperature *T _{0}* in Kelvin is given by

*q = εσT _{0}^{4}* (1)

in units of Wm^{-2}, where ε is a number between 0 and 1 representing the emissivity of the radiating body, and σ is the Stefan-Boltzmann constant, 5.67E-8 Wm^{-2}K^{-4}. The net heat transfer between two bodies depends on the geometrical parameters, the emissivity of each body, and the temperature of their surfaces. If two bodies are at the same temperature, their net heat transfer will be zero regardless of their emissivity values.
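As a quick numerical check of the gray-body relation *q = εσT _{0}^{4}*, the short sketch below evaluates the radiated flux at two temperatures (the temperatures are chosen only for illustration):

```python
# Gray-body radiative heat flux, q = eps * sigma * T0^4.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def gray_body_flux(emissivity: float, t_kelvin: float) -> float:
    """Total hemispherical flux radiated by a gray body, in W m^-2."""
    return emissivity * SIGMA * t_kelvin ** 4

# A black body (eps = 1) at room temperature, ~300 K:
print(round(gray_body_flux(1.0, 300.0), 2))   # 459.27 W m^-2
# The same body at 1000 K radiates (1000/300)^4, roughly 123x, more:
print(round(gray_body_flux(1.0, 1000.0), 2))  # 56700.0 W m^-2
```

The strong fourth-power dependence on temperature is why radiation only becomes a concern at elevated temperatures.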

The absorption of radiation in materials happens in two different ways. One is a quantum phenomenon, or direct absorption; the other is by scattering the radiation multiple times which eventually causes its absorption given that the sample is adequately thick. The terms absorption and extinction are often used interchangeably. For simplicity, in this article we use the term extinction coefficient to describe both phenomena.

An ideal black body with ε=1 absorbs the IR radiation near its surface, while a semi-transparent material gradually absorbs the radiation over a distance from its surface. The absorption depth is determined by the extinction coefficient, whose inverse is the depth at which the radiation incident on the surface decays to 1/*e* of its value. The extinction coefficient is expressed in units of m^{-1}. In mathematical formulation,

*I(x) = I _{0}e^{-Ex}* (2)

where *I(x)* is the intensity at a distance *x* meters from the surface, *I _{0}* is the incident radiation intensity at the surface (less the reflected part, if any), and *E* is the extinction coefficient. At the distance *x = 1/E*, the intensity is *I(1/E) = I _{0}/e*. The larger the value of *E*, the shorter the absorption distance, and thus the lower the radiative heat transfer. Low-conductivity, porous materials may have an extinction coefficient of about 500 m^{-1} to 5,000 m^{-1}. Solid, dense materials may have an extinction coefficient in the range of 10,000 m^{-1} to 50,000 m^{-1} or larger.
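The exponential decay *I(x) = I _{0}e^{-Ex}* can be sketched numerically; the extinction coefficients below are illustrative values from the ranges quoted above:

```python
import math

def transmitted_fraction(extinction_m: float, depth_m: float) -> float:
    """Fraction of the (non-reflected) incident IR intensity surviving
    to a given depth: I(x)/I0 = exp(-E*x)."""
    return math.exp(-extinction_m * depth_m)

porous_E = 500.0    # m^-1, e.g. a porous insulator (illustrative)
dense_E = 50_000.0  # m^-1, e.g. a dense solid (illustrative)

# Penetration depth 1/E, where intensity falls to 1/e (~37%):
print(1 / porous_E)  # 0.002 m  = 2 mm
print(1 / dense_E)   # 2e-05 m = 20 µm

# At 1 mm depth the porous sample still passes ~61% of the radiation,
# while the dense sample has absorbed essentially all of it:
print(round(transmitted_fraction(porous_E, 1e-3), 3))  # 0.607
print(round(transmitted_fraction(dense_E, 1e-3), 3))   # 0.0
```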

**Case 1 – IR opaque and transparent samples**

An optically opaque material is a material that absorbs the IR radiation very near its surface. The emissivity is related to the capability of a body to absorb IR radiation, which in turn depends on the material’s extinction coefficient. A material with a large extinction coefficient, such as 10^{5} m^{-1}, will absorb almost all incident radiation near its surface and be considered optically opaque. Conversely, a material with a very low extinction coefficient, say 1 m^{-1}, will act as a transparent medium.

A sample with emissivity near 1 will absorb practically all the incident radiation from the sensor; a sample with emissivity near 0 will transmit the incident radiation to the environment beyond it, in which case the environment absorbs the radiation (assuming the boundary of the transparent sample does not participate in this exchange). From the sensor’s point of view, IR opaque and transparent samples are therefore equivalent.

The heat flux radiated from the sensor to a sample, or to the surrounding environment, at a slightly different temperature (say, 1-3K) is approximately

*q _{r} ≈ 4εσT _{0}^{3}ΔT* (3)

where Δ*T* is the average temperature difference between the sensor surface and the sample, or environment.

If the sensor and the sample are in perfect contact and there is no thermal contact resistance between them, the temperature of their contact surfaces will be the same; there will be no net heat transfer between them by radiation, and the transfer will be by conduction only. If there is a thermal contact resistance during operation of the sensor, a temperature difference will develop between the two surfaces, resulting in a net transfer of both radiative heat flux and conductive heat flux from the sensor to the sample.

In most practical cases, the temperature difference between the surfaces of the sensor and sample will be less than 0.5K. Figure 1 shows the radiative heat flux *q _{r}* vs *T _{0}* for a maximum Δ*T*=0.5K and sensor emissivity ε=0.5 (a mid-value between 0 and 1, chosen arbitrarily for this illustration). The right axis shows the percentage of the radiative heat flux relative to the total heat flux generated by the sensor. It can be seen that the radiative heat flux is a negligible percentage of the total heat flux for a typical measurement in single-sided mode with 2W applied power, a 15mm diameter sensor, and a 0.5K temperature difference between the sensor and sample. In this mode, most of the generated heat flux is assumed to be transferred to the sample. The percentage in Figure 1 would be doubled if a symmetrical double-sided mode were used, since the generated heat flux is divided equally between the two samples. Specifically, up to 800K (527 °C) the radiative heat flux is much less than 1% of the total heat flux.

Note that in a transient single-sided mode the temperature drop over the thermal resistance is not constant over time. Thus, the estimated radiative percentage will be even lower than that shown in Figure 1 during most of the measurement time.
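The Figure 1 estimate can be reproduced with a few lines of arithmetic, using the linearized flux *q _{r} ≈ 4εσT _{0}^{3}ΔT* and the single-sided parameters quoted above (2W, 15mm sensor, ε=0.5, ΔT=0.5K); this is a sketch of the order-of-magnitude argument, not the instrument’s actual correction:

```python
import math

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_fraction(t0_k: float, power_w: float = 2.0,
                       diameter_m: float = 0.015,
                       emissivity: float = 0.5, dt_k: float = 0.5) -> float:
    """Linearized radiative flux q_r ~ 4*eps*sigma*T0^3*dT as a fraction
    of the total flux driven through the sensor face (single-sided mode)."""
    q_r = 4 * emissivity * SIGMA * t0_k ** 3 * dt_k
    q_total = power_w / (math.pi * (diameter_m / 2) ** 2)  # W m^-2
    return q_r / q_total

print(f"{radiative_fraction(300):.4%}")  # ~0.01% at room temperature
print(f"{radiative_fraction(800):.4%}")  # still well below 1% at 800 K
```

Doubling the returned fraction gives the symmetrical double-sided case, since each sample then receives only half of the generated flux.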

**Case 2 – IR semi-transparent sample**

Unlike the case of ideal opaque or transparent samples, in a semi-transparent material there is an internal interaction between parts of the material themselves (due to the temperature gradient inside the material created by conduction) as well as with the sensor generating the IR radiation. The parts of the sample closer to the sensor are at a higher temperature than those deeper in the sample, and therefore some internal heat transfer by radiation will take place. Heat transfer by conduction and radiation are coupled, with an effective heat conductivity *k _{eff}* which can be expressed as

*k _{eff} = k _{c} + k _{r}* (4)

with *k _{c}* being the conductive component and *k _{r}* representing the radiative heat transfer component. In refs [1] and [2] (Eq. 14.19, p. 500), the radiative heat transfer for an optically thick sample is expressed as

*k _{r} = 16n^{2}σT _{r}^{3}/(3E)* (5)

where *n* is the sample’s index of refraction, *T _{r}* is the sample temperature at a distance *r* from the source, and *E* is the extinction coefficient. The radiative heat transfer is proportional to the 3^{rd} power of the temperature and inversely proportional to the extinction coefficient.

When the physical thickness of a sample is significantly higher than its extinction mean free path (*1/E*) it is considered optically thick.
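A minimal helper makes this criterion concrete; here it uses the *E* × thickness ≥ 50 rule of thumb suggested later in the conclusions (the threshold is the article’s, the extinction values are illustrative):

```python
def min_thickness_m(extinction_m: float,
                    optical_thickness: float = 50.0) -> float:
    """Smallest sample thickness L such that E*L reaches the target
    optical thickness (many mean free paths, 1/E)."""
    return optical_thickness / extinction_m

# A porous insulator with E ~ 500 m^-1 needs a very thick sample:
print(min_thickness_m(500.0))     # 0.1 m (100 mm)
# A dense solid with E ~ 50,000 m^-1 is optically thick at 1 mm:
print(min_thickness_m(50_000.0))  # 0.001 m (1 mm)
```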

The mathematical formulation of the energy equations including radiative heat transfer has been developed and is available in the literature. Whilst the conductive heat transfer equation is linear in temperature, it is seen in Eq. (5) that the radiative heat transfer is non-linear. This makes the coupled equation of conductive and radiative heat transfer very difficult to solve analytically for most transient cases; instead, researchers have resorted to numerical evaluations [3-7]. In all cases, measurement data analyzed with a conduction-only model show a higher effective conductivity than the actual value, due to the additional radiative heat transfer. Such numerical evaluations have been done for line sources, such as the transient hot wire (THW) [3,4], and for plane sources [5-7].

Normally, the radiative heat transfer component increases with measurement time and ambient temperature and decreases with the extinction coefficient of the sample. The samples most affected by radiative heat transfer are low-thermal-conductivity porous solids, such as metal and ceramic foams [7].

In the case of a line source, for samples with extreme porosity, low conductivity *k _{c}*<0.1 W/mK, and a very low extinction coefficient *E* ≈ 500 m^{-1}, measured at RT over 100s with a large wire temperature increase of about 15K, the measured conductivity may be overestimated by 30% due to the contribution of radiative heat flow ([3], Fig 2). However, with the same line source measurements at RT, a much shorter measurement time, and a temperature increase of only 1-2K, the radiative heat transfer contribution becomes much lower, on the order of approx. 10%.

Figure 2 shows a plane source’s radiative heat transfer contribution for optically thick samples, at temperatures up to 1,000 K and for 3 different extinction coefficients, calculated from Eq. (5) with *n* = 1. The right axis depicts the radiative-to-conductive ratio as a percentage. The effective (or measured) conductivity, the sum of the conductive and radiative components, may show a very large error for IR semi-transparent materials with a very low extinction coefficient.
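The scale of this error is easy to reproduce from Eq. (5). The sketch below assumes *n* = 1 and an illustrative porous insulator with *k _{c}* = 0.1 W/mK (an assumed value, consistent with the low-conductivity samples discussed above):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def rosseland_kr(t_k: float, extinction_m: float, n: float = 1.0) -> float:
    """Radiative conductivity of an optically thick sample,
    k_r = 16 * n^2 * sigma * T^3 / (3 * E)."""
    return 16 * n ** 2 * SIGMA * t_k ** 3 / (3 * extinction_m)

k_c = 0.1  # W/mK, assumed conductive component for illustration
for E in (500.0, 5_000.0, 50_000.0):
    k_r = rosseland_kr(1000.0, E)
    print(f"E={E:>7.0f} m^-1: k_r={k_r:.4f} W/mK, "
          f"{100 * k_r / k_c:.0f}% of k_c")
```

At 1,000 K with *E* = 500 m^{-1}, the radiative component alone is several times larger than the assumed *k _{c}*, while at *E* = 50,000 m^{-1} it shrinks to a few percent, matching the trend in Figure 2.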

**Conclusions**

- The radiative heat transfer into an optically thick sample increases with temperature and decreases with extinction coefficient according to Eq. (5).
- For IR opaque or transparent samples (extinction coefficients of ≥ 5E4 m^{-1} and ~0-50 m^{-1}, respectively), the radiative heat transfer is negligible relative to the conductive heat transfer, even in the extreme case of a 0.5K difference between the sensor and sample surface temperatures. Therefore, no correction for radiative heat transfer is needed even for ambient temperatures up to 1,000K.
- For plane sources, a radiative heat transfer correction is needed for low-conductivity, porous, IR semi-transparent solids such as ceramic foams. Knowledge of the extinction coefficient is required for a quantitative estimate of the radiative heat transfer, as shown in Figures 2 and 3.
- It is desirable to use optically thick samples with an optical thickness on the order of 50 (i.e., *E* × sample thickness ≥ 50) to avoid radiation interaction and reflection at the far boundary of the sample.
- Correction of radiative heat transfer in line source measurements may be needed in certain cases above 500°C, depending on the measurement time and the optical properties of the sample, which in most cases may be unknown. Therefore, as a rule of thumb, a reduction of ~5-10% of the measured conductivity is reasonable for porous solids with thermal conductivity below 0.2 W/mK.

**WORKS CITED**

- J.R. Howell et al., Thermal Radiation Heat Transfer, 7^{th} Edition, CRC Press, 2020.
- M.F. Modest, Radiative Heat Transfer, 4^{th} Edition, Academic Press, 2021.
- U. Gross et al., Radiation effects on transient hot-wire measurements in absorbing and emitting porous media, International Journal of Heat and Mass Transfer 47 (2004) 3279-3290.
- N. Daouas et al., Solution of a coupled inverse heat conduction-radiation problem for the study of radiation effects on the transient hot wire measurements, Experimental Thermal and Fluid Science 32 (2008) 1766-1778.
- H. Zhang et al., Effect of radiative heat transfer on determining thermal conductivity of semi-transparent materials using transient plane source method, Applied Thermal Engineering 114 (2017) 337-345.
- S. Wang et al., Analysis of radiation effect on thermal conductivity measurement of semi-transparent materials based on transient plane source method, Applied Thermal Engineering 177 (2020) 115457.
- R. Coquard et al., Experimental investigations of the coupled conductive and radiative heat transfer in metallic/ceramic foams, International Journal of Heat and Mass Transfer 52 (2009) 4907-4918.

#### About the Author

Michael Emanuel has over 20 years of experience in high technology companies, including Honeywell, ACS (Alcohol Countermeasure Systems) and SCD (Semiconductor Devices). He has extensive experience in complex systems design, product engineering, electronics hardware and firmware, application software, product testing, reliability, and quality processes. Michael is the author of many technical publications and holds several international patents, most notably in the fields of optical imaging and materials monitoring. Michael received a Master of Science in Applied Physics and a Bachelor of Science in Physics and Mathematics from The Hebrew University of Jerusalem, Israel. He also holds a diploma in Executive Development from McGill University in Montreal, Canada.