You've just explained why average temperature is useful: it balances out these changes to give a general picture. If the temperature in the Arctic, the Mojave, and Germany all rises by 1K in a year, the difference in temperature between the locations doesn't matter; what matters is the overall change. Yes, weather fluctuates within hours, but I can guarantee you, if it was cold in Wyoming, it was cold a few miles east of there. You don't need a weather station every square mile to record data accurately, and you don't need 200 million weather stations. If all 17,500 weather stations record a ~1K increase in average temperature in their area, that shows quite a clear trend. You also seem not to have grasped that if the temperature fluctuates from 250-300K in one day, a weather station can simply take several readings across the day and average them. It's not that complex. The climate is not static; that's exactly why we use an average.
No, I did not neglect this happening in the opposite direction, because it doesn't happen. Photons coming from the sun have a higher frequency than those leaving the Earth (mostly visible and UV light, compared to outgoing infrared), so they interact minimally with greenhouse gases on the way in.
The OSHA standard determines breathability of air, not heat. CO2 is toxic in large concentrations, so OSHA defines a concentration above which it is dangerous. OSHA does not cover the greenhouse effect because a workplace is not a planet. CO2 warms the planet up. That's the issue. Not how toxic it is. I've already shown you the maths on why an increase in greenhouse gases is dangerous.
The image I included seems to describe how the NWS uses satellites to monitor temperature.
Yes, ice ages exist. In fact, the global average temperature was as much as 8K cooler 20,000 years ago. Climate change is currently warming us about 10 times faster than the warming that ended the last ice age, which means we don't have time to evolve or adapt. Our figure for pre-industrial temperature is the 1850-1900 average, so it doesn't account for the ice ages.
If you're going to dig at this by calling it Malthusian, I suggest you learn the actual background behind the Malthusian model.
You're the one trying to disprove climate change. If different layers in the atmosphere disprove it, you show me.
PS: I'd still love to see the map of 3000ppm downwind of rainforests as well as the paper you wrote
Saying there is an average temperature is quite reasonable. The question is how useful that metric is, and it is useful for one thing: trends in weather and temperature. We do not need trillions of data points for this. Earth has about 200 million square miles of surface area; if we put a weather station in every square mile and take one measurement every day, we get 73bn data points a year. In reality we do not need anywhere near that many, because temperature does not change wildly from one day or one square mile to the next. The pre-industrial average temperature (the 1850-1900 average) is not 0, be that °C, °F, or kelvin (I'll stick with kelvin from now on because it's easiest). The average was roughly 288K (15°C or 59°F). Now it has moved up to around 289-290K. This should be all I need to prove it's real, but let's continue.
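That back-of-envelope count is easy to check in a few lines (Python purely for illustration, using the round figures from above):

```python
# Crude check of the data-point arithmetic: one station per square mile,
# one reading per day, over Earth's ~200 million square miles.
SQUARE_MILES = 200_000_000
READINGS_PER_DAY = 1
DAYS_PER_YEAR = 365

points_per_year = SQUARE_MILES * READINGS_PER_DAY * DAYS_PER_YEAR
print(f"{points_per_year:,}")  # 73,000,000,000 -> 73bn readings a year
```

And that is the absurd upper limit; the real network of ~17,500 stations is a tiny fraction of it and still captures the trend.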
78% of the atmosphere is nitrogen, 21% is oxygen and 0.9% is argon. None of these is a greenhouse gas: argon is a single atom, and N2 and O2 are symmetric two-atom molecules, so their vibrations can't absorb and re-emit infrared. That leaves roughly 0.1% of the Earth's atmosphere (by volume) for greenhouse gases, or 1,000ppm. Space's temperature is around 3K, meaning those 1,000ppm of greenhouse gases warm the Earth from 3K to 288K, i.e. by 285K. Basic maths then says 1ppm of greenhouse gas accounts for roughly 0.285K of warming. Now, the real figure is lower than this for a variety of reasons (the sun warms the space around Earth slightly, the gases only insulate at night, etc). So yes, 50ppm does mean a significant increase.
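The estimate is simple enough to write down explicitly; this is only the deliberately crude upper bound from the paragraph above, not a climate model:

```python
# Naive upper-bound estimate: attribute all warming above the ~3K
# background of space to the ~1,000ppm of greenhouse gases.
SPACE_TEMP_K = 3
EARTH_AVG_K = 288
GREENHOUSE_PPM = 1_000

warming_per_ppm = (EARTH_AVG_K - SPACE_TEMP_K) / GREENHOUSE_PPM
print(warming_per_ppm)                 # 0.285 K per ppm, an upper bound
print(round(50 * warming_per_ppm, 2))  # 14.25 K for a 50ppm rise, before corrections
```

The corrections listed above drag the true number well below this, but the point stands: per-ppm contributions are nowhere near negligible.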
The law of equilibrium is correct. An infrared photon leaving the Earth strikes a CO2 molecule and excites one of its vibrational modes (infrared photons don't carry enough energy to excite electrons). The molecule then de-excites, re-emitting an infrared photon in a random direction, so the photon has a roughly 50/50 chance of heading back towards Earth or out into space. Without the CO2, it would have had a 100% chance of leaving Earth. Hence heating.
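That 50/50 re-emission picture can be turned into a toy Monte Carlo simulation. This is a sketch only: real radiative transfer involves many atmospheric layers and wavelengths, and `absorb_chance` here is a made-up parameter, not a measured value.

```python
import random

def escape_probability(absorb_chance, trials=100_000, seed=1):
    """Toy model: an upward infrared photon is absorbed by a greenhouse
    gas molecule with probability absorb_chance; if absorbed, it is
    re-emitted up or down with 50/50 odds, and a downward photon counts
    as returned to Earth. Returns the fraction that escape to space."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(trials):
        while True:
            if rng.random() >= absorb_chance:
                escaped += 1      # slipped past without being absorbed
                break
            if rng.random() < 0.5:
                break             # re-emitted downward: back to Earth
            # re-emitted upward: another attempt to escape

    return escaped / trials

print(escape_probability(0.0))  # 1.0: with no greenhouse gas, every photon escapes
print(escape_probability(0.5))  # roughly 2/3 escape; the rest come back as heat
```

The analytic answer for this toy model is (1 - p) / (1 - p/2), so raising the absorption probability p (i.e. adding greenhouse gas) always sends a larger fraction of photons back down.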
I'm not sure what greenhouse CO2 concentration has to do with anything, given that greenhouses are deliberately enriched with CO2 because plants use it for photosynthesis.
Now I would really love to see NASA's imagery of 3000ppm; if you could send that, please do. Same goes for your paper, I would love to read it. I would also love to hear about the statisticians who say satellite data is unreliable, especially since statisticians don't usually concern themselves with the workings of satellites.
As for your claim regarding geological periods, it is true. However, humans did not evolve in the Carboniferous period (300mya). We are not suited for different temperatures. Life will go on. Human life won't.
Just to add to my previous statement: the greenhouse effect behind climate change was first proposed by Joseph Fourier (one of the most important mathematicians in history) in 1824. The earliest known prediction of a link between human carbon emissions and climate change was in 1912, in the New Zealand newspaper I've included above. If it's a hoax, who stood to benefit from it in 1824, or in 1912?
Climate change is an observed phenomenon and is supported by basic chemistry: gases whose molecules have more than two constituent atoms absorb infrared radiation escaping the Earth (heat) and re-emit part of it back towards Earth. This applies to every such gas in the atmosphere.