Friday, January 30, 2015

2014-- Hottest Year Ever Recorded? Look!

Figure 1

     You can see immediately that 2014 is not the hottest year even among the last 18 years. Not by a long shot. Why is this chart so different from the widely reported announcement in January by NOAA and NASA that 2014 was the hottest ever (“ever” being just since 1880, when records began)? The difference results from different temperature data being used. The National Oceanic and Atmospheric Administration (NOAA) and the National Aeronautics and Space Administration's Goddard Institute for Space Studies (GISS) base their analyses on surface temperature measurements. The graph shown above is based on satellite measurements, which are far more accurate and show no warming. Satellite measurements show 2014 was only the sixth warmest year of the last 18.
     Since 72 percent of the earth's surface is covered by oceans, surface temperature measurements are unavailable for a large part of the earth. By contrast, satellite measurements cover the entire earth, not just at the surface but at various elevations all the way to the top of the atmosphere. There are over 160 meteorological satellites orbiting the earth and transmitting 80 million measurements every day to an accuracy of one one-hundredth of a degree. Land-based thermometers can do no better than one-tenth of a degree. Clearly the satellite data is far more comprehensive and accurate than that of the surface stations. In fact, the satellite measurement systems were developed because of the weaknesses and inaccuracies of the land-based network. Yet neither NASA nor NOAA uses the satellite data.
     Those two agencies reported that the margin by which 2014 became the hottest year was two hundredths of a degree. No mention was made of the accuracy of the measurement or the range of probable error. It is contrary to normal scientific practice to report a result smaller than the margin of error of the measurement. Yet two hundredths of a degree of warming was reported based on temperature measurements accurate to only one-tenth of a degree—that is, the error bar was five times larger than the reported result. A tenth of a degree is roughly the 95 percent statistical uncertainty range for such measurements.
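The mismatch described above reduces to a single division; here is a minimal sketch using only the figures in the text (0.02 degrees of reported warming against 0.1 degrees of instrument accuracy):

```python
# Figures from the text: the reported 2014 warming margin and the stated
# accuracy of land-based thermometers (both in degrees Celsius).
reported_warming_c = 0.02
measurement_accuracy_c = 0.1

ratio = measurement_accuracy_c / reported_warming_c
print(ratio)   # 5.0 -- the error bar is five times the reported result
```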
     Even greater uncertainty has been exposed by examination of the data itself. It has been subjected to bias, both deliberate and unintentional, in several ways which do not occur with satellite data.
     In 2007 it was revealed that GISS had been artificially inflating U.S. temperatures by 0.15 degrees Celsius since the year 2000. NASA had claimed that six of the ten hottest years in U.S. history had occurred since 1995. When the erroneous data was corrected, 1998 (an unusually warm year due to El Niño—not carbon dioxide) was no longer the warmest year of the past century in the U.S. It fell to second place, with 1934 now being the warmest. And third place now belonged to 1921, not 2006. The formerly high-ranking years 2000, 2002, 2003 and 2004 fell well down the leader board—behind even 1900. Four of the top ten were now from the 1930s: 1934, 1931, 1938 and 1939. Since more than 80 percent of the century's increase in atmospheric carbon dioxide occurred after 1940, the warmer temperatures of earlier years can't be explained by higher carbon dioxide levels. So why should we believe all the hype about increased CO2 emissions causing catastrophic warming in the future? Remember, too, that while CO2 increased steadily after 1940, the earth's temperature was decreasing from 1940 until 1975—leading to widespread media reports about fears of a new ice age.
     In April 2008 Canadian statistician Steve McIntyre documented that NASA had been “rewriting history time and time again.” Still, NASA continued the process. It falsely reported that October 2008 was the warmest October on record. Statisticians jumped on this claim, and even NASA was forced to admit it was wrong.
     Then meteorologist Anthony Watts caught GISS and James Hansen doctoring data records from Santa Rosa, California, and potentially other temperature stations. The charts below show how Hansen and his underlings turned a long-term decline into a long-term temperature increase.

Raw Data:
                                                                           Figure 2

GISS “Adjustment”:
                                                                           Figure 3

Figure 2 shows actual readings reported by the U.S. Historical Climatology Network (USHCN). GISS arrives at its numbers, illustrated in Figure 3, by taking the USHCN data and applying secret adjustments. USHCN reports a temperature decline of nearly one-half degree Celsius during the twentieth century, while GISS reports an increase of one-half a degree. Hansen has refused to explain how and why he makes these adjustments. His secrecy raises an ethical and perhaps legal question of whether the head of an agency federally funded by U.S. taxpayers can refuse to disclose how those funds are spent. It also raises the question of whether the adjustments are legitimate or merely deliberate manipulations contrived to produce a desired result.
     James Hansen is the NASA scientist who started the whole global warming hysteria in 1988 when he told a Senate committee he was 99 percent sure global warming was already underway. The news media seized upon Hansen's unsupported testimony and parlayed it into an impending planetary crisis. A new industry was born. Billions of dollars were spent, and tens of thousands of jobs were created, giving rise to growing numbers of people with vested interests in promoting the specter of global warming. James Hansen gave them ammunition. For years, as head of NASA's Goddard Institute for Space Studies (GISS), he has “repeatedly been caught providing erroneous temperature reports that always err on the side of claiming more warming than has occurred,” wrote James Taylor in the February 2009 issue of Environment & Climate News. Perhaps this explains why Hansen has been adamantly opposed to having NASA utilize satellite temperature data.
     There are five official temperature data records. Three of these are surface records. The other two are satellite records furnished by Remote Sensing Systems (RSS) and the University of Alabama in Huntsville (UAH).
     The three surface records are NASA's (GISS), NOAA's, and that of the University of East Anglia/Hadley Centre, part of the UK Met Office. All three are run by passionate believers in man-made global warming, and all three depend on data supplied by ground stations via the Global Historical Climatology Network (GHCN), managed by the U.S. National Climatic Data Center under NOAA. A shocking report by two veteran meteorologists, Anthony Watts and Joseph D'Aleo, states, “All the data centers, most notably NOAA and NASA, conspired in the manipulation of global temperature records.” Thus the three surface records are not independent lines of research confirming one another; they share the same source data and, the report charges, the same corruption.
     Here's another example of tampering with climate data, this one reported very recently, in January 2015. It involves massive falsification of 65 years of records covering a vast area stretching across Brazil and Paraguay. Paul Homewood noticed that this area, according to GISS records, showed a temperature rise between 1950 and 2014 of more than twice the accepted global increase for the entire century. He was able to compare the original data with what was reported by GISS. Far from showing the increase reported by GISS, the original data showed temperatures declining by a full degree over those 65 years. The graphs below demonstrate this difference for the Puerto Casado station.

The adjusted graph from the Goddard Institute for Space Studies:
                                                                     Figure 4
Below, the raw data in graph form:
                                                                      Figure 5
Only two other rural stations exist in this vast area, and Homewood found the same thing happened with data there. You can see these graphs here.
     There is a far larger and more serious distortion in the global temperature data than falsifying the reports from the individual measuring stations. Temperature records throughout the world have been falsified by manipulating the locations of the reporting stations. Beginning about 1990, higher-altitude, higher-latitude, and rural stations were removed from the network in order to create a false warming trend. The global temperature record that used to be based on 6,000 reporting stations now is based on fewer than 1,500. The thoroughly-researched 106-page report by Joseph D'Aleo and Anthony Watts documents the effect with this graph: 

Figure 6

The rise in global temperature correlates with the elimination of data from weather stations likely to show cooling.

     In many cases the stations are still reporting, but their data are no longer utilized. Often the stations have been replaced by others more likely to show warming from lower elevations, lower latitudes, or urban development, which reflects the well-known “heat island” effect of cities. Data gaps are filled in by extrapolating from nearby stations.  Here are some examples from the D'Aleo/Watts report:
     "In the cold countries of Russia and Canada, the rural stations in the Polar Regions were thinned out leaving behind the lower latitude more urban stations. The data from the remaining cities were used to estimate the temperatures to the north. As a result the computed new averages were higher than the averages when the cold stations were part of the monthly/yearly assessment.
     "In Canada, the number of stations dropped from 600 to less than 50. The percentage of stations in the lower elevations (below 300 feet) tripled and those at the higher elevations above 3,000 feet were reduced by half. [The] depicted warmth comes from interpolating from more southerly locations to fill northerly vacant grid boxes, even as a simple average of the available stations shows an apparent cooling."
     Environment Canada reports that there are 1,400 weather stations in Canada, many reporting even hourly readings that are readily available on the internet but not included in the global database. Canada has 100 stations north of the Arctic Circle, but NOAA uses just one.
     The Moscow-based Institute of Economic Analysis (IEA) claims the Hadley Center has tampered with the Russian data: "The IEA believes the Russian meteorological station data did not substantiate the anthropogenic global-warming theory. Analysts say Russian meteorological stations cover most of the country's territory and that the Hadley Center had used data submitted by only 25 percent of such stations in its reports. The Russian station count dropped from 476 to 121 so over 40 percent of Russian territory was not included in global temperature calculations for some other reasons than the lack of meteorological stations and observations."
     The Russians found that the 121 sites used gave mostly warmer readings than the 355 unused sites. In some cases station records going back into the 19th century were ignored in favor of stations with less data but which pointed to warming. The IEA team stated, “Only one tenth of meteorological sites with complete temperature series are used.”
     In Europe higher mountain stations were dropped and thermometers were marched toward the Mediterranean, lower elevations, and more cities. The station dropout was almost 65 percent for Europe as a whole and 50 percent for the Nordic countries.
     Africa is not showing warming despite efforts to make it appear so by eliminating thermometers from cool areas like the Moroccan coast and moving them toward the Sahara.
     Analyst E. Michael Smith found that most of the stations remaining in the United States are at airports. Most mountain stations of the west are gone. In California the only remaining stations are in San Francisco, Santa Maria, Los Angeles and San Diego.
     As recently as 1988, temperature records for China came from over 400 stations. In 1990, only 35.
     The raw temperature data show no trend in temperatures in Northern Australia in 125 years. The IPCC, however, uses “adjusted” data. NOAA makes data “adjustments” to remove “inhomogeneities” and for other reasons. The D'Aleo/Watts report says, “We have five different records covering Darwin from 1941 on. They all agree almost exactly. Why adjust them at all? NOAA added a huge, artificial, imaginary trend to the most recent half of the raw data.” The raw temperatures in Darwin were falling at 0.7°C per century. After the NOAA adjustment, the temperatures were rising at 1.2°C per century.
     NASA applies an “urbanization adjustment,” but Steve McIntyre reveals that NASA made the adjustment in the wrong direction, exaggerating the warming effect instead of showing what the temperatures would be without urban development. NASA is always tampering with its data. John Goetz has shown it “adjusted” 20 percent of its data sixteen times in two and a half years. 
     Lastly, we take note of the absurdity of recent studies and observations purporting to show that the effects of global warming are already occurring. In a cause-and-effect relationship, the effect cannot occur before the cause. You can't have effects from global warming when there is no global warming and has been none for over 18 years—despite a massive increase in carbon dioxide emissions.  Clearly carbon dioxide emissions have not caused global warming, because the actual temperature records show no warming. Those records have been falsified to justify the global warming doctrine for political purposes. 

Tuesday, December 30, 2014

Government + Phony Science + $$$$ = Waste

     The obvious successes of past technologies have made politicians and environmentalists eager to be in the forefront of promoting futuristic schemes for their goals. Everyone wants to be on the side of the next Great Idea. All too often these futuristic fantasies are sold to a gullible public, as well as to fellow politicians and the news media, with impressive but scientifically flawed arguments that bump up against harsh, immutable physical realities. These cannot be changed by any amount of laws, government spending or propagandizing. Solar power and wind power are examples. So is global warming.

     Sunlight, despite being plentiful all over the earth, is inherently dilute. It arrives at a rate of 1 kilowatt per square meter (about 11 square feet) when the sun shines unobstructed directly overhead. That rate can never be increased. It imposes an inefficiency that can never be overcome. As Dr. Petr Beckmann, a professor of electrical engineering, explained:
     “No amount of technology, no amount of money, no genius of human inventiveness can ever change it....The effort of concentrating it, either by accumulation in time or by funneling it in space, is so vast that nothing as puny as man has been able to achieve it; only Nature herself has the gigantic resources in space, time and energy to do the job.”
     “To get an idea of how concentrated the energy is in coal, and how dilute sunshine is, consider a lump of coal needed to make 1 kilowatt-hour of electricity. It weighs under a pound...its shadow would measure perhaps 15 square inches. How long would the sun have to shine on those 15 square inches to bring in 1 kilowatt hour of energy? For 1,000 hours of pure sunshine. In the Arizona desert, where the sun is out about 12 hours a day, almost three months. For the average location in the U.S., our little lump of coal would have to be out for almost half a year to be struck by a total energy of 1 kWh. But only struck by it; if we wanted to get 1kWh out from that sunbeam, we would have to divide by the conversion factor....That is how concentrated the energy is in coal, and how dilute it is in sunshine.” And the energy in coal is available almost immediately.

     Power from wind energy also bumps up against immutable physical realities. One is the fact that the wind doesn't blow all the time. That can never be changed. Another is that 25 to 60 percent of the time the wind is blowing, it blows at a rate below the turbine's most efficient speed. As a result, windmills operate at only around 33 to 40 percent of maximum production level, compared to 90 percent for coal and 95 percent for nuclear power. Turbines start producing power in winds of 8 mph, operate most efficiently in winds of 29-31 mph, and must be shut down in winds of 56 mph (though the highest winds have the most energy to be collected) to avoid damage such as rotor blades flying off or vibration that can “shake the turbine into pieces.”
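The production percentages above are what the power industry calls a capacity factor. A minimal sketch of that arithmetic, where the percentages are the round figures quoted in the text and the plant sizes (a 2 MW turbine, a 1,000 MW coal plant) are illustrative assumptions of mine:

```python
# Capacity factor: energy actually delivered over a period, divided by the
# energy the plant would deliver running continuously at rated (nameplate) power.
HOURS_PER_YEAR = 8760

def capacity_factor(actual_mwh: float, rated_mw: float, hours: float) -> float:
    """Fraction of nameplate output actually delivered over the period."""
    return actual_mwh / (rated_mw * hours)

# A hypothetical 2 MW wind turbine delivering 33 percent of nameplate in a year:
wind_mwh = 0.33 * 2 * HOURS_PER_YEAR
print(round(capacity_factor(wind_mwh, 2, HOURS_PER_YEAR), 2))     # 0.33

# The text's coal figure, for a hypothetical 1,000 MW plant at 90 percent:
coal_mwh = 0.90 * 1000 * HOURS_PER_YEAR
print(round(capacity_factor(coal_mwh, 1000, HOURS_PER_YEAR), 2))  # 0.9
```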
     Wind turbines are—and will always be—unable to achieve high efficiency even under the most favorable conditions. Extracting 100 percent of the wind's energy would mean bringing the air to a complete stop, blocking any flow through the rotor. The best efficiency achieved is about 47 percent, which is about as good as it can get because of a physical law known as the Betz Limit. This has been known for a hundred years. It was independently discovered by three scientists in three different countries: Albert Betz (1919) in Germany, Frederick Lanchester (1915) in Great Britain, and Nikolay Zhukovsky (1920) in Russia. Their discovery applies to all Newtonian fluids and identifies the maximum fraction of the wind's kinetic energy that a windmill can capture as 59.3 percent. But the advocates of wind power are either ignorant of this or willfully ignore it to make wind power seem feasible for achieving their goals. Wind turbines may be useful in remote locations with adequate winds where more efficient energy sources are unavailable, but they will never achieve widespread displacement of more economical energy sources without the waste of government subsidies and/or artificially high electricity rates for consumers.
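The Betz Limit is the fraction 16/27, which is where the 59.3 percent figure comes from; the power a rotor extracts is this coefficient times the standard kinetic-power formula. A sketch of that textbook relation, where the 80-meter rotor, sea-level air density and 12 m/s wind speed are illustrative assumptions of mine:

```python
import math

BETZ_LIMIT = 16 / 27   # maximum power coefficient, ~0.593 (i.e., 59.3 percent)

def wind_power_watts(rho: float, rotor_diameter_m: float, v_mps: float,
                     cp: float = BETZ_LIMIT) -> float:
    """Extracted power: P = Cp * 1/2 * rho * A * v^3 (A = swept rotor area)."""
    area = math.pi * (rotor_diameter_m / 2) ** 2
    return cp * 0.5 * rho * area * v_mps ** 3

print(round(BETZ_LIMIT, 3))   # 0.593

# Hypothetical 80 m rotor in sea-level air (1.225 kg/m^3) at 12 m/s, comparing
# the theoretical Betz maximum with the ~47 percent best efficiency cited above:
print(round(wind_power_watts(1.225, 80, 12) / 1e6, 2))            # 3.15 (MW)
print(round(wind_power_watts(1.225, 80, 12, cp=0.47) / 1e6, 2))   # 2.5  (MW)
```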

     The government's taxpayer support for industry to produce ethanol—for which both Republicans and Democrats received huge campaign donations—and the requirement that consumers buy it at gas pumps are further examples of forcing us to pay wastefully higher prices for an inferior fuel. Ethanol has only two-thirds the energy of gasoline, meaning a gallon of it will take a car only two-thirds as far as a gallon of gasoline. And the amount of energy in ethanol cannot be changed; you cannot get more energy out of ethanol than is in it. No amount of laws, government spending or research is going to change that. Ethanol corrodes rubber, aluminum and steel, imposing costs on the design and construction of surfaces with which it comes in contact. The U.S. Department of Energy's Oak Ridge National Laboratory warns against the use of zinc with ethanol. (Carburetors are made from alloys of zinc and aluminum.) Because ethanol attracts water, including moisture from the air, it cannot be shipped by pipeline—the cheapest method of transport—and must be distributed by truck.
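The two-thirds energy figure translates directly into fuel economy, since mileage tracks the energy content of the fuel. A minimal sketch; the BTU-per-gallon values and the 30 mpg baseline are common reference numbers I am assuming, not figures from the text:

```python
# Assumed reference energy contents (BTU per gallon); ethanol's is roughly
# two-thirds of gasoline's, matching the text's ratio.
GASOLINE_BTU_PER_GAL = 114_000
ETHANOL_BTU_PER_GAL = 76_000

def blend_mpg(base_mpg: float, ethanol_fraction: float) -> float:
    """Estimated mpg on an ethanol blend, assuming mpg scales with energy content."""
    blend_btu = ((1 - ethanol_fraction) * GASOLINE_BTU_PER_GAL
                 + ethanol_fraction * ETHANOL_BTU_PER_GAL)
    return base_mpg * blend_btu / GASOLINE_BTU_PER_GAL

print(round(blend_mpg(30, 0.0), 1))    # 30.0  pure gasoline
print(round(blend_mpg(30, 0.10), 1))   # 29.0  E10 blend
print(round(blend_mpg(30, 1.0), 1))    # 20.0  pure ethanol: two-thirds the range
```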
     There have been several studies of the economics of ethanol. The most thorough was done by Cornell University Professor David Pimentel, who also chaired a U.S. Department of Energy panel to investigate the energetics of ethanol production. Pimentel and his associates found that it takes more energy to produce ethanol than you can get out of it. They found that “131,000 BTUs are needed to make one gallon of ethanol. One gallon of ethanol has an energy value of only 77,000 BTU....a net energy loss of 54,000 BTUs [per gallon].” Pimentel adds, “That helps to explain why fossil fuels—not ethanol—are used to produce ethanol.”
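The energy balance in that study reduces to one subtraction; a minimal sketch using the BTU figures quoted above:

```python
# Figures quoted above (BTU per gallon of ethanol).
ENERGY_INPUT_BTU = 131_000    # energy needed to produce one gallon
ENERGY_OUTPUT_BTU = 77_000    # energy value of one gallon

net_btu = ENERGY_OUTPUT_BTU - ENERGY_INPUT_BTU
print(net_btu)                                          # -54000, a net loss
print(round(ENERGY_OUTPUT_BTU / ENERGY_INPUT_BTU, 2))   # 0.59 BTU out per BTU in
```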
     A study by Hosein Shapouri is championed by pro-ethanol advocates because it disputes Pimentel's finding and instead claims a modestly positive net energy balance. But Howard Hayden, a Professor Emeritus of Physics from the University of Connecticut and Adjunct Professor at the University of Southern Colorado, notes that Shapouri et al. “use the most optimistic figures: the best corn yield, the least energy used for fertilizer, the least energy required for farming, the most efficient distillation techniques, the most residual energy (in the form of mash); and in general the most favorable (but still credible) values for any and all aspects of [ethanol] production.” Even so, Hayden says that, using Shapouri's numbers, the net average power available from the ethanol of one acre of corn would be enough only to keep one 60-watt light bulb burning continuously for one month. To keep that bulb burning for a year would require 12 acres of corn, an area larger than nine football fields. (An acre is 43,560 square feet.)
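Hayden's light-bulb comparison is a unit-conversion exercise; a sketch using the figures above, where the football-field size (360 by 160 feet, end zones included) is an assumption of mine:

```python
# Per Hayden, using Shapouri's optimistic numbers: one acre of corn ethanol
# nets enough energy to keep a 60 W bulb lit for about one month.
BULB_WATTS = 60
HOURS_PER_MONTH = 730                       # 8,760 hours per year / 12

acre_net_wh = BULB_WATTS * HOURS_PER_MONTH            # one acre's net output
year_of_bulb_wh = BULB_WATTS * HOURS_PER_MONTH * 12   # a year of continuous burning

acres_needed = year_of_bulb_wh / acre_net_wh
print(acres_needed)                                   # 12.0 acres

ACRE_SQFT = 43_560
FIELD_SQFT = 360 * 160                                # assumed field, 57,600 sq ft
print(round(acres_needed * ACRE_SQFT / FIELD_SQFT, 1))   # 9.1 football fields
```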
     Pimentel vigorously defends his study and mounts a formidable criticism of Shapouri's report. He gives this explanation for their differences:
     “Pro-ethanol people make [ethanol] out to be positive by omitting many of the inputs that go into corn production. For example, they omit the farm labor — I’m not talking about the farm family, I’m talking about the farm labor. They omit the farm machinery. They omit the energy to produce the hybrid corn. They omit the irrigation. I could go on and on. Anyway, if I did all of those manipulations, I could achieve also a positive return.
     “However, that’s not the way these assessments are made. You can go check the noted agricultural economists who have looked at corn as well as other crops, and they do include the labor, they include the farm machinery, they include repair of the farm machinery, and so forth and so on. And so, those are all inputs that the ag economists include. Why are the pro-ethanol people leaving them out?”
     One aspect of the dispute between Pimentel and Shapouri involves credit for distillers grains, a byproduct of ethanol fermentation that is used as animal feed. Pimentel says Shapouri uses an extravagant credit to make ethanol look good, while Shapouri says Pimentel doesn't account for the credit. Pimentel's response:
     “We do account for it. Distillers grains, incidentally, are being used as a substitute for soybean meal. So we went back to the soybean meal, and examined how it’s produced, and the energy that is required to produce it. Instead of giving [distillers grains] a 40 to 60 percent credit as the pro-ethanol people do, we found that the credit should be more like 9 percent. They [pro-ethanol researchers] are manipulating the data again.”
     How can the correct numbers for all the inputs be determined? How can each be given an appropriate weighting in the total picture? How can we be sure that no inputs are left out? Or manipulated? The free market automatically does this through the mechanism of prices. It is a further waste of resources to employ hordes of government regulators and researchers to determine what the market can do automatically, more thoroughly and more accurately. If the inputs for ethanol, or anything else, add up to a price at which the commodity can be sold profitably, it will be produced voluntarily in the market. If it is not profitable, laws and regulations on producers and consumers will not make it so. They will merely pass the losses (waste) along to consumers or taxpayers as higher prices, or to future generations in the form of depreciating dollars and a growing national debt.
     Other biofuels—which futuristic fantasists acclaim as the next Great Idea in energy—are worse than corn-based ethanol. They all bump up against the limits of photosynthesis. Chlorophyll converts sunlight into energy, but solar energy is dilute to begin with, and plants, on average, collect only one-tenth of one percent of the solar energy available. If corn-based ethanol were to replace all the oil used in the U.S. for transportation, 700 million acres of corn would be required, compared to the 408 million acres currently used for all types of crop production. If soy biodiesel were the substitute, it would require 3.2 billion acres of soybeans—one billion more acres than the entire U.S. including Alaska. Nevertheless, in 2013 the Obama administration announced contracts worth $16 million for three biofuel plants in Illinois, Nebraska and California.

     It has long been argued that solar and wind must be the energy sources of the future, regardless of cost, because the world will run out of petroleum in a few centuries. But that argument has been destroyed in recent years by new drilling techniques, the invention of fracking to extract oil and gas, and the discovery of hundreds of new oil and gas fields all over the world—even at extreme depths under the oceans—all of which mean the world will not exhaust petroleum resources for many thousands of years, if ever.
     The environmentalists preach that the extractive industries which produce fossil fuels rape the landscape, pollute the air and water, and consume resources that should be left in their natural state. They worship the primitive. Their ideal is a pristine state of nature uncontaminated by civilization. So they favor renewable energies as being less damaging to the environment. But they ignore the environmental consequence of the consumption of energy and other resources that solar and wind utilization requires, which are greater than the traditional energy industries they deplore.
     Like the pro-ethanol advocates, the advocates of solar and wind energy fail to consider all the inputs when claiming these sources are economical. The construction of a 1,000 MW solar plant requires 35,000 tons of aluminum, 2 million tons of concrete, 7,500 tons of copper, 600,000 tons of steel, 75,000 tons of glass, and 1,500 tons of chromium and titanium and other materials. These amounts are about 1,000 times greater than what's needed to construct a coal-fired or nuclear plant producing the same power. Nuclear plants require enormous amounts of concrete for the massive containment structure, but a solar plant of equal capacity requires 500 times as much.
     Those massive material requirements also consume massive amounts of energy in their manufacture. These include: 75 million BTU per ton of aluminum, 56 million BTU per ton of steel, 18 million BTU per ton of glass, and 12 million BTU per ton of concrete. And the manufacturing processes emit pollution of various sorts, some toxic, along with other wastes, for which disposal sites must be provided at further cost.
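The tonnages in the previous paragraph can be combined with these per-ton figures into a single embodied-energy estimate. A sketch using only the numbers quoted in the text; copper and chromium/titanium are omitted because no per-ton energy figure is given for them:

```python
# Tonnages for the 1,000 MW solar plant (from the text), paired with the
# quoted manufacturing energy per ton (BTU per ton).
MATERIALS = {
    "aluminum": (35_000, 75e6),
    "concrete": (2_000_000, 12e6),
    "steel": (600_000, 56e6),
    "glass": (75_000, 18e6),
}

total_btu = sum(tons * btu_per_ton for tons, btu_per_ton in MATERIALS.values())
print(f"{total_btu:.2e} BTU")   # 6.16e+13 BTU embodied in the listed materials
```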
     Solar and wind generating plants require vast land areas, with the most favorable areas being distant from the urban populations that require electrical power. This means further cost and energy requirements for establishing a power distribution network.

     Though it is no longer believable that the world is going to run out of petroleum, there is another scare tactic demanding that we use renewable fuels even if they are uneconomic. That is the threat of global warming ruining the planet. Isn't saving the world more important than saving money? But once again, just as in the foregoing examples, inputs have been left out, resulting in a false conclusion.
     What has been left out? Temperature records were discontinued or no longer included in the database in certain cold regions of the world, thus producing an elevated global temperature record. New measuring stations were added in warm areas, with the same result. Data was manipulated, falsifying records. There is no recognition of, or explanation for, the earth being warmer 1,000 years ago, 3,000 years ago and 10,000 years ago, when there were no factories or automobiles. Also omitted is that 10,000 years ago, when the carbon dioxide level was about the same as today, temperatures rose as much as 6 degrees Celsius in a decade—a hundred times faster than the rate we are supposed to regard as troubling—yet without the catastrophic consequences now predicted for global warming. Also omitted is incontrovertible evidence that rising temperatures produce rising carbon dioxide levels—not the other way around. Also omitted are 90,000 measurements of atmospheric carbon dioxide made between the years 1812 and 1961 and published in 175 technical papers that give the lie to the claim that industrialization has increased atmospheric carbon dioxide. Those measurements were made by top scientists, including two Nobel Prize winners, using techniques that are standard textbook procedures; they show an average of 440 ppm carbon dioxide in both 1820 and 1940. Also, ice cores show over 400 ppm carbon dioxide in 1700 A.D. and 200 A.D., as well as 10,000 years ago. Samples from Camp Century (Greenland) and Byrd Camp (Antarctica) range from 250 to nearly 500 ppm over the last 10,000 years. These belie the 2013 claim that the recent atmospheric carbon dioxide level of 400 ppm is the highest in 3 million years. I explained these issues in previous postings on this blog, so I will not repeat them here. What else has been left out?
Almost 5,000 peer-reviewed scientific papers in professional journals, identified by the Nongovernmental International Panel on Climate Change, showing the claim that carbon dioxide causes dangerous global warming is baseless.
     One doesn't need to be aware of all those shortcomings to realize the falsity of the global warming propaganda. It was sold to the public on grounds that the computer models on which it was based represented the real world. Reality has shown they do not. That is all you need to know. Eighteen years of no global warming—in the face of enormous increases in carbon dioxide emission—invalidate the global warming hypothesis. The U.S. government alone has wasted more than $165 billion since 1993 to combat global warming from carbon dioxide and caused many billions of dollars more to be wasted by the private sector.
     One other item missing from the anti-global warming campaign is the role of the sun. It determines not only the earth's climate but the carbon dioxide content of its atmosphere!
     The sun's radiation is varied by “sunspot cycles,” disturbances on the surface of the sun. Magnetic fields rip through the sun's surface, producing holes in the sun's corona, solar flares, coronal mass ejections, and changes in the “solar wind,” the stream of charged particles emanating from the sun. The solar wind, by modulating the galactic cosmic rays which reach the earth, determines both the formation of clouds and the carbon dioxide level in the earth's atmosphere. Sunspot cycles cause only slight changes in the sun's radiation, but these changes are amplified manyfold by interaction 1) with ozone in the upper stratosphere, and 2) with clouds in the lower troposphere. Clouds have a hundred times greater impact on climate and temperature than CO2.
     “Cosmic radiation comes to the earth from the depths of the universe, ionizing atoms and molecules in the troposphere, and thus enabling cloud formation. When the sun's activity is stronger, the solar magnetic field drives a part of cosmic radiation away from the earth, fewer clouds are formed in the troposphere, and the earth becomes warmer,” wrote N.D. Marsh and H. Svensmark, pioneers in this issue. The process was explained with eloquent simplicity by Theodore Landsheidt of Canada's Schroeder Institute: “When the solar wind is strong and cosmic rays are weak, the global cloud cover shrinks. It expands when cosmic rays are strong because the solar wind is weak. This effect [is] attributed to cloud seeding by ionized secondary particles.” Or, as Zbigniew Jaworowski put it more poetically, “The sun opens and closes a climate-controlling umbrella of clouds over our heads.”
     The sun also sets the carbon dioxide level in the earth's atmosphere by the same process. Nigel Calder explains: “The sun sets the level of carbon dioxide in the earth's atmosphere by the cumulative effect of variations in the galactic cosmic rays reaching the earth, as modulated by the solar wind. My results leave no room for CO2 levels due to man-made CO2....nothing to do with emissions from factories or cars (emphasis added.)"
     After about 210 years, sunspot cycles “crash” or almost entirely die out, and the earth can cool dramatically. These unusually cold periods last several decades. Of greatest concern to us is the Maunder Minimum, which ran from 1645 to 1715. (See It's the Sun, Stupid for graphs and explanation.) Some years had no sunspots at all. The astronomer Spörer reported only 50 sunspots during a 30-year period, compared to the 40,000 to 50,000 typical for that length of time.
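Taking those numbers at face value, the scale of the deficit can be checked with simple arithmetic. This is only a sketch: the 45,000 midpoint is my own interpolation of the quoted 40,000-50,000 range, not a figure from the article.

```python
# Rough arithmetic on the sunspot counts quoted above: ~50 sunspots
# reported over a 30-year stretch of the Maunder Minimum, versus
# 40,000-50,000 typical for a span of that length.

observed = 50
typical = (40_000 + 50_000) / 2      # midpoint of the quoted range (assumed)

rate_per_year = typical / 30         # implied normal-era average
deficit_factor = typical / observed  # how far below typical the Maunder era fell

print(f"typical rate: ~{rate_per_year:.0f} sunspots/year")
print(f"Maunder-era count was ~{deficit_factor:.0f}x below typical")
```

In other words, on the article's own figures, sunspot activity during that 30-year span ran roughly nine hundred times below normal.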
     In 2008 there were no sunspots at all on 266 days, an ominous indication of extreme cold weather for several decades despite all the BS about carbon dioxide. While believers in anthropogenic global warming usually try (and fail) to make their case on a century or less of data covering the rise of industrialization, a prominent Russian solar physicist looked much further back. Examining 7,500 years of Maunder-type deep temperature drops, Habibullo Abdussamatov predicted a slow decline in temperatures would begin as early as 2012-2015 and lead to a deep freeze in 2050-2060 that will last about fifty years. In October 2013 he updated his earlier warning: “We are now on an unavoidable advance towards a deep temperature drop.”
     Everyone knows the sun heats the earth and atmosphere unevenly. We have all noticed how the sun's heat changes through the day: it is warmest at midday, when the sun is directly overhead, and as the sun moves across the sky, new volumes of air are exposed to its heating while others are left behind. This uneven heating is the basis for wind currents. A similar process in the oceans creates ocean currents. According to NASA, “uneven heating from the sun drives the air and ocean currents that produce the Earth's climate” (italics added).
     While others were studying and propagandizing about carbon dioxide, Don Easterbrook, a geology professor and climate scientist, noticed a recurring pattern going back to 1480 in an ocean cycle known as the Pacific Decadal Oscillation (PDO): every 25-30 years, warm and cool ocean cycles alternate. In 2000 he concluded “the PDO said we're due for a change,” and that change came: there has been no global warming for 18 years.
     Nitrogen, oxygen and argon comprise more than 99 percent of our atmosphere. Water vapor and carbon dioxide are the next most abundant gases. Carbon dioxide comprises 0.04 percent and is a weak greenhouse gas; water vapor is a strong one. Joseph D'Aleo, the first Director of Meteorology at The Weather Channel, offers this perspective: “If the atmosphere was a 100-story building, our annual anthropogenic (man-made) contribution today would be equivalent to the linoleum on the first floor.”
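The fractions that appear in this section, 0.04 percent for total CO2 and the 1/50,000 man-made share quoted below, can be reproduced with simple arithmetic. A minimal sketch, assuming CO2 is about 400 ppm of the atmosphere (the 0.04 percent figure) and, as an illustrative assumption of mine rather than a number from the article, that roughly 20 ppm of it is human-attributed:

```python
# Cross-check of the atmospheric fractions quoted in this section.
# Assumptions: total CO2 is ~400 ppm (0.04 percent); the human-attributed
# portion is taken as ~20 ppm for illustration (not a sourced figure).

ppm_co2 = 400             # total atmospheric CO2, parts per million
human_ppm = 20            # assumed human-attributed portion, ppm

co2_share = 1_000_000 // ppm_co2      # denominator of the CO2 fraction
human_share = 1_000_000 // human_ppm  # denominator of the man-made fraction

print(f"CO2 is 1/{co2_share} of the atmosphere")           # 1/2500
print(f"man-made CO2 is 1/{human_share} of the atmosphere")  # 1/50000
```

These are the same 1/2,500 and 1/50,000 figures that appear in the Bastardi quotes that follow.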
      Meteorologist Joe Bastardi says man-made CO2 is only 1/50,000 of the air, “too trivial of a factor” to be of concern regarding global warming, and that the whole argument is “tiresome and absurd... Warmists are living in a fantasy world.”
     “To those who say there is...a gas (which is only 1/2500th of an atmosphere in which the most prominent GHG [Greenhouse gas] is water vapor, and with oceans that have 1,000 times the heat capacity of the atmosphere) is somehow controlling all this (recall that we once had an ice age at 7,000 ppm CO2), you live in a fantasy world.
     “Then again, men who have the fantasy of saving the planet by controlling others are indeed in their own world. It’s up to those grappling with the real facts to make sure that the world we live in is one that promotes freedom and the betterment of mankind, and not one controlled by those who believe they are superior to everyone else. This is where the real battle is, and not with a trace gas that has little if anything to do with the climate of a planet created and designed the way it was.”
     We close with a quote from Reid Bryson, founding chairman of the Department of Meteorology at the University of Wisconsin: “You can go outside and spit and have the same effect as a doubling of carbon dioxide.”