Hehe indeed. One must keep in mind Richard spent weeks researching this, not just a few days.
I think it’s possible to get to a good enough endpoint eventually though…
Note I haven’t read through fully what Andrew posted yet, just replying here to what I already looked into…
Firstly, it must be pointed out that this isn’t really relevant to the simple argument against the theory. The calculation of how much hotter the Earth should be doesn’t matter when evaluating the argument of whether the IR-absorbing gases heat the Earth by a significant amount.
That being said, the math here is a little funny, which comes with its own set of problems.
Although it appears to make sense to set energy in equal to energy out, counting energy in over just the cross-section the Earth presents to the Sun, and energy out across the Earth’s whole surface…
…
This is mathematically equivalent to treating the Earth as a flat disk with twice the Earth’s radius (and therefore the same area as the Earth’s entire surface), which disk is constantly, 100% of the time, receiving 1/4 of the solar power per square meter actually arriving at the Earth.
Think about that:
Reality: a rotating sphere, only half of which is illuminated at any given instant (i.e. 100% of the energy spread across 50% of the surface), while the other half receives zero energy
Mathematical model: a flat disk, 100% illuminated 100% of the time, with all the energy spread across the full surface
So it is not clear at all on the face of it that any average temperature resulting from such a calculation has any significance or sensibility at all!
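To make the divide-by-4 bookkeeping concrete, here’s a minimal sketch of that model (assuming the standard textbook values: 1,368 W/m^2 irradiance, 0.3 albedo, emissivity 1):

```python
# Divide-by-4 model: average the incoming power over the whole sphere,
# then solve the Stefan-Boltzmann law for the equilibrium temperature.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2/K^4
S = 1368.0               # solar irradiance at Earth's distance, W/m^2
ALBEDO = 0.3             # standard textbook value for Earth

flux_avg = S * (1 - ALBEDO) / 4          # 239.4 W/m^2 over the full surface
T_eff = (flux_avg / SIGMA) ** 0.25
print(round(T_eff, 1))                   # ~254.9 K, i.e. about -18C
```

That ~255K, against the observed ~288K average, is where the oft-quoted 33C "greenhouse effect" comes from.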
Yeah the blackbody calculation gives you a pretty good answer there for the part of the Moon receiving the full blast of the Sun.
Note that basically the same calculation holds for the Earth — 1,368 W/m^2 goes to 957.6 W/m^2 after accounting for albedo, which with an emissivity of 0.9 gives you (link):
i.e. the Earth should be at +87.4C in the direct sunlight.
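The arithmetic checks out directly. Note that if you take absorptivity equal to emissivity (Kirchhoff’s law), the 0.9 cancels out of the equilibrium calculation entirely (an assumption on my part about how the linked calculator handles it):

```python
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/m^2/K^4
EPS = 0.9            # assumed emissivity (= absorptivity)

flux = 1368.0 * 0.7                      # 957.6 W/m^2 after 0.3 albedo
absorbed = EPS * flux                    # what the surface takes in
T = (absorbed / (EPS * SIGMA)) ** 0.25   # EPS cancels top and bottom
print(round(T - 273.15, 1))              # ~87.4 C
```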
Since 50% of the Earth is receiving the 957.6 W/m^2 from the Sun and the other 50% is receiving 0 W/m^2, wouldn’t a better estimate be to take half the Earth as receiving 478.8 W/m^2 (since the hemisphere has 2x the surface area of the circular cross-section the Sun is irradiating) and the other half as receiving 0 W/m^2?
That gives a temperature of 303.1K for the side facing the Sun and 3K (basically as cold as it gets in space) for the side facing away, for an average of 153K, or -120C!
The reason this is so different from the number obtained by dividing the irradiance by 4 is the 4th-power relation in the Stefan-Boltzmann equation.
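The point about the 4th power is just that averaging fluxes and then converting to a temperature is not the same as converting to temperatures and then averaging. A quick sketch of the two-hemisphere estimate (blackbody, with a 3K night side assumed):

```python
SIGMA = 5.67e-8                     # Stefan-Boltzmann constant, W/m^2/K^4

# Day side: 957.6 W/m^2 spread over the hemisphere (2x the disk area)
t_day = (478.8 / SIGMA) ** 0.25     # ~303.1 K
t_night = 3.0                       # roughly the cold of space, K
t_mean = (t_day + t_night) / 2      # ~153 K, i.e. about -120C
print(round(t_day, 1), round(t_mean, 1))

# Contrast: average the flux first, then convert to temperature
t_div4 = (239.4 / SIGMA) ** 0.25    # ~254.9 K
```

Same total energy in both cases, yet the averaged temperatures differ by roughly 100K purely because of where the 4th root is taken.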
Specifically in the case of the Moon:
divide-by-4 blackbody calculation: 270.4K (see: NASA)
consider degree-by-degree exposure level and average each bit, also factoring in 35K of ‘night’ temperature from the light coming from the Earth: 178.45K (see Chapter 5 linked above)
reality: 204K (see Chapter 5)
Note particularly that the divide-by-4 calc is way off, and also that the better degree-by-degree estimate still undershoots reality because of the heat-absorptive effect of the Moon’s surface - i.e. by the same logic that predicts a 33C “greenhouse effect” for Earth (namely, comparing a blackbody calculation to observation), there’s a 26C “greenhouse effect” on the Moon.
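For what it’s worth, the degree-by-degree idea can be sketched numerically. This is my own rough version, assuming a lunar albedo of 0.11 and a flat 35K night side, so it won’t reproduce the Chapter 5 figure exactly, but it lands far below the divide-by-4 number in the same way:

```python
import math

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
S = 1368.0        # solar irradiance at 1 AU, W/m^2
ALBEDO = 0.11     # assumed lunar albedo (illustrative)
T_NIGHT = 35.0    # assumed night-side temperature, K

# Divide-by-4 model for comparison
t_div4 = (S * (1 - ALBEDO) / 4 / SIGMA) ** 0.25      # ~270.7 K

# Degree-by-degree: local equilibrium T at each solar zenith angle theta,
# area-weighted over the sunlit hemisphere
n = 9000
total_t, total_w = 0.0, 0.0
for i in range(n):
    theta = (i + 0.5) * (math.pi / 2) / n
    w = math.sin(theta)                              # area weight on a sphere
    flux = S * (1 - ALBEDO) * math.cos(theta)
    total_t += w * (flux / SIGMA) ** 0.25
    total_w += w
t_day = total_t / total_w                            # ~306 K day-side mean
t_mean = (t_day + T_NIGHT) / 2                       # ~171 K whole-sphere mean
print(round(t_div4, 1), round(t_day, 1), round(t_mean, 1))
```

The whole-sphere mean comes out roughly 100K below the divide-by-4 figure, which is the same qualitative gap the Chapter 5 calculation shows.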
Right, what I was getting at is that I don’t think the reason the radiative heat loss is “much less efficient” is because of the specifics of the materials used, but rather because of the nature of radiative heat loss. The quote here was not qualifying it in terms of the material.
That is: “At lower altitudes, convection takes over from radiation as the most important heat transport process […]”.
It’s a good point and a cogent discussion. By ‘efficiency’ I am thinking in terms of, which of the mechanisms at the Earth’s surface is responsible for moving heat at a faster rate - radiation or convection? As per the atmosphere science textbook, it’s convection.
Also note that since the atmosphere is mostly transparent to IR, basically the entirety of the heating of it is due to convection!
Of course to leave the planet, period, only radiation will work.
But the picture in my mind is coming together something like this:
The sun heats the Earth basically by directly heating the surface
The surface then rapidly heats the entire atmosphere via convection
The entire atmosphere radiates IR back down towards the surface, much more than the IR-absorbing gases in the atmosphere emit IR back down towards the surface
This re-heats the surface which then re-heats the atmosphere etc… Meanwhile the surface + atmosphere as a net is gradually losing heat via radiation towards space
The question is: how much does the IR-absorbing gas contribute in terms of the net loss to space? I propose that it’s begging the question to say “a lot”, because the “a lot” is based on the presumption that the entire heating of the surface is due solely to the IR-absorbing gases. But if it’s only responsible for a small amount of the heating of the surface, then the answer is “a little” (or at least: “not necessarily a lot”).
I’m not fully resolved on the matter though. I will have to read the materials Andrew linked to more carefully, and understand a few more things. But so far the argument is holding for me.