“Be kind, for everyone you meet is fighting a hard battle” - Often attributed to Plato but likely from Ian Maclaren (pseudonym of Reverend John Watson)

Saturday, May 28, 2011

A hectowatt or a kilowatt?

I'm not an expert on radiative physics, but I've had a bit of physics here and there. There's a long-running debate over whether so-called "downward longwave radiation" (infrared radiation emitted by the Earth's surface after it absorbs the Sun's shortwave radiation, centered in the visible light band, and subsequently absorbed and re-emitted by greenhouse gases) can cause the Earth to be at a higher temperature than it would otherwise be. In fact, there's even debate as to whether it exists at all, despite the fact that it's been accurately and repeatedly measured. That's the 324 W/m^2 (watts per square meter) in the graphic above. And Science of Doom has a thorough explanation extending over several posts.

I thought I'd see how it works for my body (as usual, making a bunch of assumptions and estimates). I've figured that my resting metabolic daily calorie requirement is 1900 kilocalories. 1900 kilocalories in 24 hours is 92.0 watts. This is very straightforward. Since I'm in thermal equilibrium (for practical purposes), this heat must be dissipated.
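The kilocalories-to-watts conversion above is easy to check with a couple of lines of Python (using 4184 joules per food calorie):

```python
# Converting a resting metabolic rate of 1900 kcal/day to watts.
KCAL_TO_J = 4184            # joules per kilocalorie (food calorie)
SECONDS_PER_DAY = 24 * 3600

watts = 1900 * KCAL_TO_J / SECONDS_PER_DAY
print(round(watts, 1))      # about 92.0 W
```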

On the other hand, I also used a thermometer to measure my skin temperature to be 93.3 degrees F or 307.2 K (kelvins). I looked here to get a figure of 0.97 for the emissivity (the ratio of the energy radiated by a material to that emitted by a "black body" at the same temperature) of human skin. Finally, I went here to get an estimate of 2.00 m^2 for my body's surface area. This gives me sufficient information to determine the power I'm radiating using the Stefan-Boltzmann law (erroneously ignoring evaporation, conduction and convection):

P = e*sigma*A*T^4

Here, P is power in watts, sigma is the Stefan-Boltzmann constant, 5.67*10^(-8) watts/(m^2*K^4), A is surface area in m^2, T is absolute temperature in kelvins, and e is emissivity. Plugging in the numbers, I get P=980 watts. This is certainly a considerably higher number than 92 watts, the rate at which my resting body is converting food energy to thermal energy. What gives?
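The forward calculation can be sketched in a few lines of Python, plugging in the measured skin temperature, surface area, and emissivity from above:

```python
# Power radiated by a gray body, per the Stefan-Boltzmann law
# (evaporation, conduction, and convection deliberately ignored).
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power(T, area, emissivity):
    """Power in watts radiated by a body at absolute temperature T (K)."""
    return emissivity * SIGMA * area * T**4

skin_T = 307.2    # 93.3 degrees F, in kelvins
area = 2.00       # body surface area, m^2
e = 0.97          # emissivity of human skin

print(round(radiated_power(skin_T, area, e)))  # about 980 W
```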

Let's look at this the other way: supposing that all heat dissipation in my body is by radiation, what temperature would my skin need to be at for me to radiate at 92 watts? Rearranging the Stefan-Boltzmann law:

T = (P/(e*sigma*A))^(1/4)

This gives me T=170 K or -154 degrees Fahrenheit. Wow, quite chilly. Have I discovered a flaw in the Stefan-Boltzmann law? Should the Nobel committee be called? Probably not. What is happening is that both my internal thermal energy and the radiative energy I'm absorbing from my surroundings are contributing to my outgoing radiation.
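The inverted calculation checks out in Python as well, solving for the temperature at which 92 watts would be radiated:

```python
# Inverting P = e*sigma*A*T^4 for T: the skin temperature needed to
# shed 92 W by radiation alone, with no incoming radiation.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temperature(power, area, emissivity):
    """Absolute temperature (K) at which a body radiates the given power (W)."""
    return (power / (emissivity * SIGMA * area)) ** 0.25

T = equilibrium_temperature(92.0, 2.00, 0.97)
print(round(T))                   # about 170 K
print(round(T * 9/5 - 459.67))    # about -154 degrees F
```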

But the walls, the monitor, etc. are BELOW the temperature of my skin, at about 70 degrees F. Do these incoming (980 - 92 = 888) watts warm me? Let's think about how we might get a grip on this question. First, a thought experiment: suppose that, somehow, the environment suddenly stopped supplying radiative energy to me (never mind how this could happen). What then?

I could continue to radiate at 980 watts, but that energy would need to come from somewhere. Assuming I didn't want to burn my flesh, food would supply it. My caloric intake would need to increase by a factor of greater than 10 (980/92) to keep the fires stoked. Failing this, I'd begin to chill toward the 170 K figure above (never mind that I'd soon perish), while my internal regulating processes attempted to maintain a normal temperature by burning whatever was available. I think it's safe to say that the radiation from my surroundings, which are at a measurably lower temperature than my skin, keeps me warmer than I'd otherwise be. Now, substitute the Sun for my metabolism, the greenhouse gases in the atmosphere for my surroundings, and the Earth's surface for my skin, and the analogy is complete.