**1. The problem statement, all variables and given/known data**

Radioactive decay of elements in the Earth’s interior results in a mean heat flux through the Earth’s surface of 5×10^-2 W/m^2. What is this flux expressed as a fraction of the energy flux due to thermal re-radiation of absorbed solar energy? If radioactive decay were the only heat source for the Earth, what would the Earth’s surface temperature be?

**2. Relevant equations**

$$ W_{P} = \frac{L_{\odot}}{4\pi r^{2}}(\pi R^{2})(1-A) $$

$$ L_{P} = 4\pi R^{2}\sigma_{SB}T_{P}^{4} $$
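I think these two formulas combine via an equilibrium argument (not stated in the problem, so this is my assumption): absorbed power equals re-radiated power, and solving for the temperature gives

$$ W_{P} = L_{P} \quad\Longrightarrow\quad T_{P} = \left[\frac{L_{\odot}(1-A)}{16\pi\sigma_{SB}r^{2}}\right]^{1/4} $$

By the same argument, if a surface flux $F$ were the only heat source, then $\sigma_{SB}T^{4} = F$.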

**3. The attempt at a solution**

To begin, I need to find the energy flux due to thermal re-radiation of absorbed solar energy. I figured I could just plug the appropriate numbers into the first of the two given formulas, but I'm not certain about the answer I get doing that, using 3.839×10^26 W for L_⊙, 1.5×10^11 m for r, and 0.4 for A. I was initially not sure what value to use for the cross-sectional radius R (where pi*R^2 is the cross-sectional area of the planet); I used the radius of the Earth, 6.37×10^6 m, which gives an absorbed power of about 10^17 W. I've heard figures for the Earth more around ~1300 W/m^2 (though I don't recall the source), so I'm rather suspicious of this answer.
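Here's a quick numeric check of the plug-in above (just a sketch using the values quoted in my attempt; the Stefan–Boltzmann constant is included so the fluxes can be compared):

```python
import math

L_sun = 3.839e26   # solar luminosity, W
r = 1.5e11         # Earth-Sun distance, m
R = 6.37e6         # Earth's radius, m
A = 0.4            # albedo used in my attempt
F_rad = 5e-2       # radiogenic heat flux from the problem, W/m^2

# Flux of sunlight at Earth's orbit (the "solar constant")
S = L_sun / (4 * math.pi * r**2)

# Total absorbed power: the cross-section pi*R^2 intercepts flux S,
# and the fraction (1 - A) is absorbed rather than reflected
W_abs = S * math.pi * R**2 * (1 - A)

# Re-radiated flux per unit surface area: the same power leaves
# through the full spherical surface 4*pi*R^2
F_rerad = W_abs / (4 * math.pi * R**2)

print(f"S        = {S:.0f} W/m^2")
print(f"W_abs    = {W_abs:.2e} W")
print(f"F_rerad  = {F_rerad:.0f} W/m^2")
print(f"fraction = {F_rad / F_rerad:.1e}")
```

This gives S ≈ 1358 W/m^2, W_abs ≈ 1.04×10^17 W, F_rerad ≈ 204 W/m^2, and a fraction of ~2.5×10^-4. Note the units: S and F_rerad are fluxes (W/m^2), while W_abs is a total power (W), so the ~1300 figure I half-remembered is the solar constant S, not a total power.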

Where am I going off track here?
