Isn't it amazing that two years ago, when we had devastating hurricanes in the Gulf, we heard a lot about "Global Warming"? The media was on the hype bandwagon about how the earth was warming and the waters had gotten so warm that they caused such devastating hurricanes. Well, this winter it was so cold all over the US. How could the media explain that? Was the earth still warming? All of a sudden we don't hear much about "Global Warming," as the term "Global Warming" has turned into "Climate Change."