Some scientists skillfully utilize the news media to promote a view of carbon dioxide and global warming that ignores contrary evidence or pretends it doesn't even exist. The media suck this up, puke it all over the airwaves and the newspapers, and the public, not knowing any better and awed by these scientists with Ph.D.s, believes it must be true. Here are a couple of examples. On June 25, 2007, Professor Raymond Bradley, in a discussion on KSTP-TV about carbon dioxide levels stated, “We cannot find any other period in the last 800,000 years when greenhouse gas levels were as high as they are today.” And on August 19, 2007, CBS featured a story about global warming with reporter Scott Pelley that focused on the increase in atmospheric carbon dioxide. In an extensive interview with Pelley, Professor Paul Mayewski stated: “The level and speed of rise is significantly [he repeated 'SIGNIFICANTLY' with great emphasis] greater than anything in the last 850,000 years.” He indicated this portends serious problems from continued global warming. But Professor Tim Patterson, a paleoclimatologist who is director of the Ottawa-Carleton Geoscience Centre of Canada's Carleton University, says that 10,000 years ago the carbon dioxide level was about the same as now and temperatures rose as much as 6 degrees Celsius in a decade—100 times faster than the past century!
The current level of carbon dioxide in the atmosphere, which Bradley and Mayewski find so alarming, is about 380 parts per million. The 2007 IPCC Summary report states: “The global atmospheric concentration of carbon dioxide has increased from a pre-industrial value of about 280 ppm to 379 ppm in 2005. The atmospheric concentration of carbon dioxide in 2005 exceeds by far the natural range over the last 650,000 years (180 to 300 ppm) as determined from ice cores [emphasis added].” Oh, yeah? The ice cores show measurements of over 400 ppm in 1700 A.D. and 200 A.D., as well as 10,000 years ago. Samples from Camp Century (Greenland) and Byrd Camp (Antarctica) range from 250 to nearly 500 ppm over the last 10,000 years.
Zbigniew Jaworowski, M.D., Ph.D., D.Sc., has studied climate for over 40 years and organized 11 glacier expeditions researching 17 glaciers in the Arctic, Antarctic, Alps, Norway, Himalayas, Peruvian Andes and other mountainous regions. He has also published about 20 papers on climate issues, most of them about ice cores. He writes that the ice core information in the 2007 IPCC Summary Report was “plagued with improper manipulation of data, an arbitrary rejection of high readings from old ice, and an arbitrary rejection of low readings from young ice, simply because they did not fit the preconceived idea of man-made global warming.”
Furthermore, how could scientists who believe carbon dioxide is causing global warming be unaware of the abundant direct (not from ice cores) measurements that give lie to their claim? More than 90,000 (!) measurements of atmospheric carbon dioxide were made between the years 1812 and 1961 and published in 175 technical papers. Jaworowski says these measurements were ignored for three decades “not because they were wrong. Indeed, these measurements were made by top scientists, including two Nobel Prize winners, using techniques that are standard textbook procedures....The only reason for rejection was that these measurements did not fit the hypothesis of anthropogenic global warming. I regard this as perhaps the greatest scientific scandal of our time.”
Ernst Georg Beck made a monumental compilation of these carbon dioxide measurements and graphed five-year averages, which smooth irregularities and show trends rather than an individual year that might be an anomaly. These further discredit the claims of the IPCC and others that today's carbon dioxide levels are the highest in thousands of years. Beck's work shows an average of 440 ppm carbon dioxide for the years 1820 and 1940, and 390 ppm for 1855. Can there be any doubt that the promoters of global warming are distorting science for political purposes?
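The five-year averaging attributed to Beck is an ordinary moving average. A minimal sketch of the technique, using made-up illustrative numbers rather than Beck's actual measurements:

```python
# Five-year moving average: each output value is the mean of five consecutive
# yearly readings, so a single anomalous year is damped rather than dominating.
def moving_average(values, window=5):
    """Average each run of `window` consecutive values."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# Hypothetical yearly ppm readings (illustrative only, not Beck's data);
# note the 500 ppm spike in year three.
yearly_ppm = [310, 340, 500, 330, 320, 350, 345]
print(moving_average(yearly_ppm))  # [360.0, 368.0, 369.0] -- spike smoothed out
```

The single 500 ppm outlier moves the smoothed series by only a few ppm, which is exactly why averaged series reveal trends that individual anomalous years would obscure.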
Figure 1, above, is based on sediments from the Sargasso Sea. It shows the earth was much warmer 500 and 900 years ago, and that there were even warmer periods around 500 BC and 1000 BC. None of those eras had factories or automobiles. They also had far smaller human populations, who devoted much less land to agriculture and cut far fewer trees. Note, too, that now we have barely reached the average temperature for the last 3,000 years. The chart also shows the current warming trend began more than 250 years ago, before the Industrial Revolution. It was a natural recovery from the Little Ice Age.
The widely publicized melting of ice sheets in the Arctic and Greenland in recent years was due to decadal oscillations in ocean and air currents that are unrelated to carbon dioxide or greenhouse warming. On July 17, 2008, the Science & Public Policy Institute stated: “Towards the end of the 20th century, the two most powerful of these oscillations, the Pacific Decadal Oscillation and the Atlantic Multidecadal Oscillation, were both in their warming phases. In 1998, their effect, combined with an exceptional but not unprecedented El Nino Southern Oscillation, caused a very strong upward spike in temperature.... [R]esearchers at NASA last year concluded that the reason for the record shrinkage of the Arctic ice-cap was an acceleration of pole-ward sea and air currents caused by the warming phase of the Atlantic Multidecadal Oscillation.” Those oscillations have now reversed. In October 2008 the extent of Arctic sea ice grew at the fastest pace ever recorded. It grew 3,481,575 square kilometers for the month, breaking the old record by 150,638 sq. km (58,161 sq. miles.)
The disappearance of Arctic ice is by no means something new. The areas where ice melted in recent years were, in fact, open water a century ago. We know this because Roald Amundsen and other explorers sailed their ships in these waters. This area has been freezing and thawing for millions of years, and at times the Arctic icecap completely disappeared. It always recovered, and Environment & Climate News recently reported “the 2005-2007 data confirm Greenland's ice melt has returned to normal.” Data from weather stations on the southern coast of Greenland show almost all decades between 1915 and 1965 were as warm or warmer than the 1995 to 2005 decade. In the 1920s, when mankind's emissions of carbon dioxide were nine times lower than now, Greenland's temperature increased 2 to 4 degrees Celsius in less than ten years, which is contrary to all the computer models that are the basis for predictions of global warming. (So why should anyone believe those predictions?) Summer temperatures are the most relevant to Greenland's ice sheet melting rates, and summer temperatures at the summit of the ice sheet have declined 2.2 degrees C. per decade since 1987.
Figure 2, below, shows 4,000 years of temperatures from ice cores drilled into the Greenland ice sheet. Note that for most of this time temperatures were well above more recent times, and some periods were markedly warmer and show far larger swings than the latest uptrend.
The National Oceanic and Atmospheric Administration reported that Antarctic ice in 2007 reached its largest extent in recorded history. In April 2008 climatologist Patrick Michaels wrote “the coverage of ice surrounding Antarctica is almost exactly 2 million square miles above where it is historically supposed to be at this time of year. It's farther above normal than it has ever been for any month in climatological records.” James Taylor of the Heartland Institute noted that “photographs distributed by the National Oceanic and Atmospheric Administration showed penguins and other cold-weather creatures able to stand further north on Southern Hemisphere sea ice than has ever been recorded.”
There is nothing new or frightening about reports of ice chunks breaking off the Antarctic ice sheet. Nautical records show it's been happening for at least 2 centuries, even when the continent's ice mass has been increasing, as it has been over the past half century. Most of the Antarctic ice is above 4,000 feet. As the ice increases there from precipitation, it pushes the glaciers toward lower elevations at the edge of the continent, where they break off.
Most of the scare stories about warming in the Antarctic have focused on the peninsula, which has been warming for decades. What those stories don't tell you is that the peninsula comprises only 2 percent of Antarctica while the other 98 percent has been cooling, that there are both surface and subsurface volcanoes in the area, and that there is no basis for inferring that what is happening on the peninsula will determine the climate for the rest of the continent. That would be like the tail wagging the dog.
Alarms have also been raised about glaciers disappearing worldwide, but the public is usually not told about the glaciers that are advancing. An article in 21st Century Science and Technology states: “Since 1980, there has been an advance of more than 55% of the 625 mountain glaciers under observation by the World Glacier Monitoring group in Zurich. (From 1926 to 1960, some 70-95% of these glaciers were in retreat.)” The CBS program mentioned above showed the extensive retreat of the O'Higgins glacier in Patagonia, the fastest-melting glacier in South America. The program did not tell you that the Perito Moreno Glacier—the largest glacier in Patagonia—is advancing at the rate of 7 feet per day. Nor did it mention that Chile's Pio XI Glacier, the largest glacier in the southern hemisphere, is also growing.
In Europe many glaciers have not retreated back to their positions in the Medieval Warm Period, when there was no industrial civilization producing greenhouse gases. The Aletsch and Grindelwald glaciers (Switzerland) were much smaller between 800 and 1000 AD than now. The latter glacier is still larger than it was in 1588. In Iceland today, the Vatnajokull glacier—the largest in Europe—and also the Drangajokull glacier are far more extensive than in the Middle Ages, and farms remain buried beneath their ice.
Figure 3, below, is an intriguing chart by J. Oerlemans of 169 glacier records. It shows glaciers have been receding since 1750, with the trend accelerating after about 1820. The electric light bulb and the telephone hadn't been invented yet. (Thomas Edison and Alexander Graham Bell weren't even born.) The first commercial electric power plant was not built until 1881-82. Henry Ford began assembly line production in 1913, but by then half of the glacier loss from 1800 to 2000 had already occurred. And 70 percent of the glacier shortening occurred before 1940.
Siberia's Lake Baikal is the world's deepest lake. It contains more water than all five of North America's Great Lakes combined. Fed by over 300 rivers and far from the moderating effects of any ocean, it offers a pristine, uninterrupted sedimentary record that permits a highly accurate reconstruction of temperatures over a broad area. Environmental researcher Anson Mackay has found increased biogenic silica in sediments correlates with warmer temperatures, as shown in Figure 4 below:
Four things stand out about this graph: (1) As Mackay states, “Warming in the Lake Baikal region commenced before rapid increases in greenhouse gases;” he dates the warming from around 1750 A.D., long before industrial development led to the increase of greenhouse gases. (2) The warming trend began from one of the coldest periods in the last 800,000 years. (3) These coldest periods in the past were always followed by sharp, large temperature increases that couldn't possibly have been caused by human activity. (4) The latest warming is puny compared to the many much warmer periods in the past.
Mackay's findings are supported by similar research papers on other parts of the globe. These include Brauning's research in Turkey, Hallert's in Canada, and Vollweiler et al.'s in Austria/Germany.
Why is it that the global warming advocates are unfazed by any contrary evidence, no matter how strong? All their claims of disasters from global warming have been debunked. All their computer models have been shown to be false, to be based on flawed assumptions, incapable of being reconciled with the observable facts. Vaclav Klaus, President of the Czech Republic and a university professor before he became president, is the author of a book on global warming and has spoken often on the subject. He says, “What frustrates me is the feeling that everything has already been said and published, that all rational argument has been used, yet it does not help.” It does not help because global warming alarmism is not based on rational argument. It is not based on science. It is not based on reality. It is based on political ideology. If rational argument doesn't fit, then phony arguments must be invented: the spread of malaria, the loss of biological diversity, polar bears disappearing, etc. If computer models can predict disaster scenarios only by programming unrealistic assumptions, then that will be done. If global warming does not fit the observable temperature measurements, then a new “reality” must be invented to fit the ideology: the actual temperature records must be altered or dismissed. Ditto for carbon dioxide measurements. The global warming advocates are not disturbed by all this because, in their view, ideology trumps reality!
Patrick Moore, a co-founder and director of Greenpeace, resigned because of its “trend toward abandoning scientific objectivity in favor of political agendas.” After the failure of communism, he says, there was little public support for collectivist ideology. In his view, a “reason environmental extremism emerged was because world communism failed, the [Berlin] wall came down, and a lot of peaceniks and political activists moved into the environmental movement bringing their neo-Marxism with them and learned to use green language in a very clever way to cloak agendas that actually have more to do with anti-capitalism and anti-globalism than they do anything with ecology or science.”
“I think if we don't overthrow capitalism, we don't have a chance of saving the world ecologically,” said Judi Bari, principal organizer of Earth First!
James Hansen revealed his hatred of capitalism in an impassioned e-mail denouncing the attention paid to errors in NASA temperature data: “The deceit behind the attempts to discredit evidence of climate change reveals matters of importance. This deceit has a clear purpose: to confuse the public about the status of knowledge of global climate change, thus delaying effective action to mitigate climate change. The danger is that delay will cause tipping points to be passed, such that large climate impacts become inevitable...[T]he ones who will live in infamy if we pass the tipping points, are the captains of industry, CEOs in fossil fuel companies such as EXXON/Mobil, automobile manufacturers, utilities, all of the leaders who have placed short-term profit above the fate of the planet and the well-being of our children.” On June 23, 2008, exactly twenty years to the day from his Senate testimony that he was 99% sure greenhouse warming was already underway, Hansen appeared before the House Select Committee on Energy Independence and Global Warming. There he conjured up images of the Nuremberg trials of Nazi war criminals by claiming the CEOs of fossil fuel energy companies “should be tried for high crimes against humanity and nature.” He said earth’s atmosphere can only stay this loaded with man-made carbon dioxide for a couple more decades without changes such as mass extinction, ecosystem collapse and dramatic sea level rises.
Klaus states: "We succeeded in getting rid of communism, but along with many others, we erroneously assumed that attempts to suppress freedom, and to centrally organize, mastermind, and control society and the economy, were matters of the past, an almost-forgotten relic. Unfortunately, those centralizing urges are still with us.... Environmentalism only pretends to deal with environmental protection. Behind their people and nature friendly terminology, the adherents of environmentalism make ambitious attempts to radically reorganize and change the world, human society, our behavior and our values....
“The followers of the environmentalist ideology, however, keep presenting us with various catastrophic scenarios with the intention of persuading us to implement their ideas. That is not only unfair but also extremely dangerous. Even more dangerous, in my view, is the quasi-scientific guise that their oft-refuted forecasts have taken on....Their recommendations would take us back to an era of statism and restricted freedom....The ideology will be different. Its essence will, nevertheless, be identical—the attractive, pathetic, at first sight noble idea that transcends the individual in the name of the common good, and the enormous self-confidence on the side of the proponents about their right to sacrifice the man and his freedom in order to make this idea reality.... We have to restart the discussion about the very nature of government and about the relationship between the individual and society....It is not about climatology. It is about freedom.”
Do you ever wonder how communism could last for seventy years in Russia? Surely there was plenty of evidence for decades that the system was failing: food shortages, declining life expectancy, increased infant mortality, low standards of living, primitive hospitals and sanitation facilities lagging far behind those in Western Europe and America—not to mention pollution far worse than in the West. But to die-hard communists, the observable facts did not matter. All the observable negatives of collectivism were trumped by ideology. The same is true of the ideology behind global warming.
Additional information on global warming and the deception of the IPCC can be found on pages 224-234 of my book MAKERS AND TAKERS: How Wealth and Progress are Made and How They are Taken Away or Prevented. The book also covers many other environmental and economic issues. See http://www.amlibpub.com/.
Friday, October 24, 2008
Global Warming, Global Myth, Part 2
The global warming advocates make all sorts of false claims about dire consequences of global warming. They claim it will result in the spread of malaria, food shortages, more human deaths, more violent weather, and a loss of biological diversity through the extinction of species. All untrue. The largest number of species — the greatest biological diversity — is in the tropics. As you move away from the equator, you find fewer and fewer species, until you reach the earth’s poles, where there is zero diversity because nothing can live there.
Agricultural productivity is also reduced by cold climate, not a warmer one. That’s why Siberia and Alaska are not noted for agricultural abundance. A warmer climate would mean longer growing seasons and would make agriculture possible in areas where it isn’t today. And there are at least 300 studies showing plants and forests grow faster and more luxuriantly under conditions of increased carbon dioxide.
Our bodies require heat. We are warm-blooded and have no fur. We wear clothes, build homes, and heat them with fires, all as protection against the cold. Far more people move to Florida, California or Arizona because of warm climate than move to Alaska, North Dakota, or Montana. Canada is the world’s second largest country, lots of space for people to live, but 90% of the population lives within 100 miles of its southern border. Worldwide, far more people die every year from cold than from heat. So why should global warming be bad for us?
Global warming will not result in the spread of malaria. Paul Reiter, of the Pasteur Institute in Paris, is one of the world's foremost experts on insect borne diseases. He says, “The global warming alarm is dressed up as science, but it is not science. It is propaganda. I was horrified to read the [Intergovernmental Panel on Climate Change] 2nd and 3rd Assessment Reports because there was so much misinformation.” For example, the IPCC states “mosquito species that transmit malaria do not usually survive where the mean winter temperature drops below 16–18 degrees C.” This is “clearly untrue,” says Reiter. “In fact, mosquitoes are extremely abundant in the Arctic. The most devastating epidemic of malaria was in the Soviet Union in the 1920s. There were something like 13 million cases a year and something like 600,000 deaths, a tremendous catastrophe that reached up to the Arctic Circle. Arkhangel [a city 300 miles further north than Helsinki, Finland] had 30,000 cases and about 10,000 deaths. So it’s not a tropical disease. Yet these people in the global warming fraternity invent the idea that malaria will move northward.”
New York City and Boston had long histories of malaria. In 1933, when President Roosevelt authorized the Tennessee Valley Authority, a third of the population in the area had malaria. Malaria was not eliminated in the U.S. until 1951. It was done through the use of DDT — which the environmentalists prevailed upon the United States to ban, resulting in 40–50 million unnecessary deaths from malaria since 1972.
The environmentalists have also invented the idea that the polar bear is threatened by global warming. Today there are 22–25 thousand polar bears, compared to 8–10 thousand 40 years ago and only 5 thousand in 1940, before the big rise in carbon dioxide. Eleven of the 13 polar bear groups in Canada today are stable or increasing. The two that are decreasing are in an area where the climate has gotten colder! Furthermore, the polar bears survived many periods of much warmer temperatures, some lasting thousands of years. They survived the Medieval Warm Period a thousand years ago, when the Vikings settled both Iceland and Greenland. Greenland actually was green then and could support agriculture; but when the cold returned a few centuries later, the people there all starved to death. Today Greenland is covered by a sheet of ice. Six thousand years ago the earth’s climate was much warmer than now, and the polar bears survived. Ten thousand years ago the earth’s climate was a whopping 6 degrees C (11 degrees F) warmer than now, and the bears survived. Polar bears have been a distinct species for 125 thousand years (they descended from grizzly bears) and they’ve survived far warmer climates than anything they face today or in the foreseeable future. A Canadian polar bear expert, Mitch Taylor, says, “They are not going extinct, or even appear to be affected.”
The argument that a warmer climate will bring more violent weather can only be made by people who have no knowledge of climate history or simply dismiss it because it contradicts their propaganda. And they rely on the public — and the media — being uninformed enough and gullible enough to believe them. There is abundant historical evidence that the earth had far more violent weather in times of colder climate, such as the Little Ice Age, than in warmer times. It is well known, too, that what determines violent weather is the temperature differential between the equator and the poles. All the computer models predict the greatest warming from the greenhouse effect will be at the poles, which will reduce that differential and violent weather.
There are four sources of global temperature measurements: NASA, The UK Meteorological Office’s Hadley Center for Climate Studies, the University of Alabama at Huntsville, and RSS (Remote Sensing Systems). NASA is out of step with the other three. The others show global temperatures declining since 1998 while NASA shows them increasing at a record pace. How can that be? Statistician Steve McIntyre tracks climate data closely at http://www.climateaudit.org/. Recently he ran an article titled “NASA is Rewriting History, Time and Time Again.” It explains that NASA has “adjusted” recent temperatures upward and older temperatures downward, which creates the appearance of warming. The man behind these changes is James Hansen, the scientist who started the whole global warming hysteria by testifying before a Senate committee in June 1988 that he was 99% sure greenhouse warming was already under way. The same media which scarcely a decade earlier were touting a coming ice age now seized upon Hansen’s unsupported testimony and began touting global warming. Hansen has been trying ever since to come up with evidence to support his claims, now even tampering with the actual temperature record. Steven Goddard asks, “How could it be determined that so many thermometers were wrong by an average of 0.5 degrees in one particular year several decades ago, and an accurate retrofit be made? Why is the adjustment 0.5 degrees one year, and 0.1 degrees the next?” Statistically, the odds are 50/50 of an error being either up or down. But Hansen adds an upward correction to the average of thousands of temperature measurements annually across the globe in more than 55 years out of 70. That's like flipping a coin 70 times and having it turn up heads 55 times. The odds of that happening are about one in a million.
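The coin-flip arithmetic above can be checked directly from the binomial distribution. A minimal sketch (the 55-of-70 figures are taken from the text; the function name is mine):

```python
import math

# Probability of at least `heads` heads in `flips` fair coin flips,
# computed exactly: each of the 2**flips sequences is equally likely,
# and math.comb counts the sequences with exactly k heads.
def prob_at_least(heads, flips):
    return sum(math.comb(flips, k) for k in range(heads, flips + 1)) / 2 ** flips

p = prob_at_least(55, 70)
print(f"{p:.1e}")  # on the order of one in a million, consistent with the text
```

With unbiased adjustments the expected split would be near 35/35; the exact tail probability of 55 or more in one direction is indeed around one in a million.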
Nor is that the only example of phony manipulation of data for the good of the cause. The centerpiece of the IPCC Third Assessment Report was the “hockey stick” graph by Michael Mann, et al. It showed a thousand years of “reconstructed” global temperatures as a long horizontal trend looking like the long handle of a hockey stick — with a sharp rise since 1900 looking like the blade of the hockey stick, due to global warming. This work has now been thoroughly discredited. It was the product of multiple inaccuracies from errors, omissions, obsolete data, and manipulations in “reconstructing” data, all of which was then processed through an invalid statistical procedure. That procedure was found to produce a “hockey stick” even from random inputs, and Mann himself later admitted it would find a “hockey stick” where there wasn't one. The National Academy of Sciences found a “validation skill not significantly different from zero.” The issue was presented to the National Academy of Sciences by the Wegman Panel, consisting of three independent statisticians chaired by an eminent statistics professor, Edward Wegman, who also testified about it at a congressional investigation. After explaining the incorrect mathematics in Mann’s procedure, Wegman stated: “I am baffled by the [Mann] claim that incorrect mathematics doesn’t matter because the answer is correct anyway[!]” Ideology trumps mathematics! (Incidentally, this graph is still being used on TV programs on global warming. I was on one such program less than a year ago that displayed this graph four or five times in an hour and allowed Mann plenty of airtime to tout it, and the program provided no rebuttal. And I have been told by students and parents that the “hockey stick” graph is still being used in schools.)
To be continued.
Agricultural productivity is also reduced by cold climate, not a warmer one. That’s why Siberia and Alaska are not noted for agricultural abundance. A warmer climate would mean longer growing seasons and would make agriculture possible in areas where it isn’t today. And there are at least 300 studies showing plants and forests grow faster and more luxuriantly under conditions of increased carbon dioxide.
Our bodies require heat. We are warm-blooded and have no fur. We wear clothes, build homes, and heat them with fires, all as protection against the cold. Far more people move to Florida, California or Arizona because of warm climate than move to Alaska, North Dakota, or Montana. Canada is the world’s second largest country, lots of space for people to live, but 90% of the population lives within 100 miles of its southern border. Worldwide, far more people die every year from cold than from heat. So why should global warming be bad for us?
Global warming will not result in the spread of malaria. Paul Reiter, of the Pasteur Institute in Paris, is one of the world's foremost experts on insect borne diseases. He says, “The global warming alarm is dressed up as science, but it is not science. It is propaganda. I was horrified to read the [Intergovernmental Panel on Climate Change] 2nd and 3rd Assessment Reports because there was so much misinformation.” For example, the IPCC states “mosquito species that transmit malaria do not usually survive where the mean winter temperature drops below 16–18 degrees C.” This is “clearly untrue,” says Reiter. “In fact, mosquitoes are extremely abundant in the Arctic. The most devastating epidemic of malaria was in the Soviet Union in the 1920s. There were something like 13 million cases a year and something like 600,000 deaths, a tremendous catastrophe that reached up to the Arctic Circle. Arkhangel [a city 300 miles further north than Helsinki, Finland] had 30,000 cases and about 10,000 deaths. So it’s not a tropical disease. Yet these people in the global warming fraternity invent the idea that malaria will move northward.”
New York City and Boston had long histories of malaria. In 1933, when President Roosevelt authorized the Tennessee Valley Authority, a third of the population in the area had malaria. Malaria was not eliminated in the U.S. until 1951. It was done through the use of DDT — which the environmentalists prevailed upon the United States to ban, resulting in 40–50 million unnecessary deaths from malaria since 1972.
The environmentalists have also invented the idea that the polar bear is threatened by global warming. Today there are 22–25 thousand polar bears, compared to 8–10 thousand 40 years ago and only 5 thousand in 1940, before the big rise in carbon dioxide. Eleven of the 13 polar bear groups in Canada today are stable or increasing. The two that are decreasing are in an area where the climate has gotten colder! Furthermore, the polar bears survived many periods of much warmer temperatures, some lasting thousands of years. They survived the Medieval Warm Period a thousand years ago, when the Vikings settled both Iceland and Greenland. Greenland actually was green then and could support agriculture; but when the cold returned a few centuries later, the people there all starved to death. Today Greenland is covered by a sheet of ice. Six thousand years ago the earth’s climate was much warmer than now, and the polar bears survived. Ten thousand years ago the earth’s climate was a whopping six degrees C.(11 degrees F) warmer than now, and the bears survived. Polar bears have been a distinct species for 125 thousand years (they descended from grizzly bears) and they’ve survived far warmer climates than anything they face today or in the foreseeable future. A Canadian polar bear expert, Mitch Taylor, says, “They are not going extinct, or even appear to be affected.”
The argument that a warmer climate will bring more violent weather can only be made by people who have no knowledge of climate history or simply dismiss it because it contradicts their propaganda. And they rely on the public — and the media — being uninformed enough and gullible enough to believe them. There is abundant historical evidence that the earth had far more violent weather in times of colder climate, such as the Little Ice Age, than in warmer times. It is well known, too, that what determines violent weather is the temperature differential between the equator and the poles. All the computer models predict the greatest warming from the greenhouse effect will be at the poles, which will reduce that differential and violent weather.
There are four sources of global temperature measurements: NASA, the UK Meteorological Office’s Hadley Centre for Climate Studies, the University of Alabama at Huntsville, and RSS (Remote Sensing Systems). NASA is out of step with the other three. The others show global temperatures declining since 1998 while NASA shows them increasing at a record pace. How can that be? Statistician Steve McIntyre tracks climate data closely at http://www.climateaudit.org/. Recently he ran an article titled “NASA is Rewriting History, Time and Time Again.” It explains that NASA has “adjusted” recent temperatures upward and older temperatures downward, which creates the appearance of warming. The man behind these changes is James Hansen, the scientist who started the whole global warming hysteria by testifying before a Senate committee in June 1988 that he was 99% sure greenhouse warming was already under way. The same media that scarcely a decade earlier were touting a coming ice age seized upon Hansen’s unsupported testimony and began touting global warming. Hansen has been trying ever since to come up with evidence to support his claims, now even tampering with the actual temperature record. Steven Goddard asks, “How could it be determined that so many thermometers were wrong by an average of 0.5 degrees in one particular year several decades ago, and an accurate retrofit be made? Why is the adjustment 0.5 degrees one year, and 0.1 degrees the next?” Statistically, the odds are 50/50 of an error being either up or down. But Hansen adds an upward correction to the average of thousands of temperature measurements annually across the globe in 55 or more years out of 70. That's like flipping a coin 70 times and having it turn up heads at least 55 times. The odds of that happening are about one in a million.
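The coin-flip analogy is easy to check. A minimal sketch in Python (standard library only) computes the exact binomial probability of getting at least 55 heads in 70 fair flips:

```python
from math import comb

def prob_at_least(heads: int, flips: int) -> float:
    """Exact probability of getting at least `heads` heads in `flips` fair coin flips."""
    return sum(comb(flips, k) for k in range(heads, flips + 1)) / 2 ** flips

p = prob_at_least(55, 70)
print(f"P(at least 55 heads in 70 flips) = {p:.1e}")  # on the order of one in a million
```

The exact tail probability does indeed come out on the order of one in a million, consistent with the figure quoted above.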
Nor is that the only example of phony manipulation of data for the good of the cause. The centerpiece of the IPCC Third Assessment Report was the “hockey stick” graph by Michael Mann et al. It showed a thousand years of “reconstructed” global temperatures as a long, flat trend (the handle of the hockey stick) followed by a sharp rise since 1900 (the blade), attributed to global warming. This work has now been thoroughly discredited. It was the product of errors, omissions, obsolete data, and manipulations in “reconstructing” data, all of which was then run through an invalid statistical procedure. That procedure was found to produce a “hockey stick” even from random inputs, and Mann himself later admitted it would find a “hockey stick” where there wasn't one. The National Academy of Sciences found a “validation skill not significantly different from zero.” The issue was presented to the National Academy of Sciences by the Wegman Panel, consisting of three independent statisticians chaired by an eminent statistics professor, Edward Wegman, who also testified about it at a congressional investigation. After explaining the incorrect mathematics in Mann’s procedure, Wegman stated: “I am baffled by the [Mann] claim that incorrect mathematics doesn’t matter because the answer is correct anyway[!]” Ideology trumps mathematics! (Incidentally, this graph is still being used on TV programs about global warming. I was on one such program less than a year ago that displayed this graph four or five times in an hour and gave Mann plenty of airtime to tout it, with no rebuttal. And I have been told by students and parents that the “hockey stick” graph is still being used in schools.)
To be continued.
Friday, October 17, 2008
Global Warming, Global Myth, Part 1
“Unless we announce disasters, no one will listen.”—Sir John Houghton, first chairman of the Intergovernmental Panel on Climate Change and lead editor of its first three reports.
During the 20th century, the earth warmed 0.6 degree Celsius (1 degree Fahrenheit), but that warming was wiped out in a single year when global temperature in January 2008 came in 0.63 degree C (1.13 F) below that of a year earlier. A single year does not constitute a trend reversal, but the magnitude of that temperature drop—equal to 100 years of warming—is noteworthy. Of course, it can also be argued that a mere 0.6 degree warming in a century is so tiny it should never have been considered a cause for alarm in the first place. But then how could the idea of global warming be sold to the public? In any case, global cooling has been evident for more than a single year. Global temperature has declined since 1998. Meanwhile, atmospheric carbon dioxide has gone in the other direction, increasing 15–20%. This divergence casts doubt on the validity of the greenhouse hypothesis, but that hasn't discouraged the global warming advocates. They have long been ignoring far greater evidence that the basic assumption of greenhouse warming from increases in carbon dioxide is false.
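A note on the unit conversions quoted above: a temperature difference converts from Celsius to Fahrenheit by a factor of 9/5 only; the 32-degree offset applies to absolute readings, not to changes. A one-line check:

```python
def c_diff_to_f(delta_c: float) -> float:
    """Convert a temperature *difference* from degrees C to degrees F.
    Differences scale by 9/5; the +32 offset is only for absolute temperatures."""
    return delta_c * 9 / 5

print(c_diff_to_f(0.6))   # about 1.1 F; the text rounds to 1 degree
print(c_diff_to_f(0.63))  # about 1.13 F, matching the figure quoted
```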
Man-made emissions of carbon dioxide were not significant before worldwide industrialization began in the 1940s. They have increased steadily since. Over 80% of the 20th century's carbon dioxide increase occurred after 1940—but most of the century's temperature increase occurred before 1940! From 1940 until the mid-1970s, the climate also failed to behave according to the greenhouse hypothesis, as carbon dioxide was strongly increasing while global temperatures cooled. This cooling led to countless scare stories in the media about a new ice age commencing.
In the last 1.6 million years there have been 63 alternations between warm and cold climates and no indication that any of them were caused by changes in carbon dioxide levels. A recent study of a much longer period (600 million years) shows—without exception—that temperature changes precede changes in carbon dioxide levels, not the other way around. As the earth warms, the oceans yield more carbon dioxide to the atmosphere, because warmer water cannot hold as much carbon dioxide as colder water.
The public has been led to believe that increased carbon dioxide from human activities is causing a greenhouse effect that is heating the planet. But carbon dioxide comprises only about 0.038% of our atmosphere (380 parts per million) and is a very weak greenhouse gas. Although it is widely blamed for greenhouse warming, it is not the only greenhouse gas or even the most important. Water vapor is a strong greenhouse gas and accounts for at least 95 percent of any greenhouse effect. Carbon dioxide accounts for only about 3 percent, with the remainder due to methane and several other gases.
Not only is carbon dioxide's total greenhouse effect puny, mankind's contribution to it is minuscule. The overwhelming majority (97%) of carbon dioxide in the earth's atmosphere comes from nature, not man. Volcanoes, swamps, rice paddies, fallen leaves, and even insects and bacteria produce carbon dioxide, as well as methane. According to the journal Science (Nov. 5, 1982), termites alone emit ten times more carbon dioxide than all the factories and automobiles in the world. Natural wetlands emit more greenhouse gases than all human activities combined. (If greenhouse warming is such a problem, why are we trying to save all the wetlands?) Geothermal activity in Yellowstone National Park emits 10 times the carbon dioxide of a mid-sized coal-burning power plant, and volcanoes emit hundreds of times more. In fact, our atmosphere is primarily the result of volcanic activity. There are about 100 active volcanoes today, mostly in remote locations, and we're living in a period of relatively low volcanic activity. There have been times when volcanic activity was ten times greater than in modern times. But by far the largest source of carbon dioxide emissions is the equatorial Pacific Ocean. It produces 72 percent of the earth's emissions of carbon dioxide, and the rest of the Pacific, the Atlantic and Indian Oceans, and other waters contribute as well. The human contribution is puny, far overshadowed by these natural sources. Combining the factors of water vapor and nature's production of carbon dioxide, we see that 99.8 percent of any greenhouse effect has nothing to do with carbon dioxide emissions from human activity. So how much effect could regulating the tiny remainder have upon world climate? That is, if climate were determined by changes in carbon dioxide, which is not the case.
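The 99.8 percent figure is essentially the product of the percentages above. Since the article does not show its exact inputs, the sketch below assumes the round numbers quoted (CO2 about 3 percent of the greenhouse effect, about 3 percent of CO2 from man); they land in the same ballpark as the article's figure:

```python
# Assumed inputs taken from the round percentages quoted in the text;
# the article's precise source figures are not given, so this is
# illustrative arithmetic only.
co2_share_of_greenhouse = 0.03   # CO2: about 3% of any greenhouse effect
human_share_of_co2 = 0.03        # man: about 3% of atmospheric CO2

human_share = co2_share_of_greenhouse * human_share_of_co2
print(f"human share of greenhouse effect: {human_share:.2%}")
print(f"everything else:                  {1 - human_share:.2%}")
# With these round inputs, "everything else" comes to about 99.9%,
# the same ballpark as the 99.8% cited in the text.
```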
Since carbon dioxide is a very weak greenhouse gas, computer models predicting environmental catastrophe depend on the small amount of warming from carbon dioxide being amplified by increased evaporation of water. But in the many documented periods of higher carbon dioxide, even during much warmer climate periods, that never happened. During the time of the dinosaurs, the carbon dioxide levels were 300 – 500% greater than today. Five hundred million years ago, the level of carbon dioxide in the atmosphere was 15 - 20 times what it is today. Yet the catastrophic water-vapor amplification of carbon dioxide warming never occurred. Today we're told catastrophic warming will result if carbon dioxide doubles. But during the Ordovician Period, the carbon dioxide level was 12 times what it is today, and the earth was in an Ice Age. That's exactly opposite to the “runaway” warming that computer models predict should occur. Clearly the models are wrong; they depend upon an assumption of amplification that is contrary to the climate record of millions of years. So there is no reason to fear the computer predictions—or base public policies on them. Reid Bryson, founding chairman of the Department of Meteorology at the University of Wisconsin, has stated, "You can go outside and spit and have the same effect as doubling carbon dioxide."
There are other examples where the computer models fail to agree with reality. According to the greenhouse hypothesis, the warming should occur equally during day and night. But most of the warming that has been observed has occurred at night, thus falsifying the models.
All of the models agree—for sound theoretical reasons—that warming from a greenhouse effect must be 2-3 times greater in the lower atmosphere than at the earth's surface. This is not happening. Both satellites and weather balloons show slightly greater warming at the surface. These atmospheric temperature measurements furnish direct, unequivocal evidence that whatever warming has occurred is not due to the greenhouse effect.
Everyone knows the sun heats the earth, but the public is generally unaware that the sun’s heat is not uniform. Solar radiation is affected by disturbances on the surface of the sun, called “sunspots,” which correspond to the sun's 11-year magnetic cycle. There are also several solar cycles of longer duration. Superimposed, these cycles might augment or cancel each other. There are also periods when sunspots “crash” or almost disappear, which can lead to dramatic cooling of the earth for several decades. This is what happened 400 years ago during the Maunder Minimum, which was the coldest part of the Little Ice Age. During one 30-year period within the Maunder Minimum, only about 50 sunspots were observed, compared to a typical 40 – 50 thousand.
Sunspots have now virtually vanished. You can check out pictures of the sun day after day after day for the last few years at http://tinyurl.com/6zck4x. Very few show more than one sunspot and many show none. We are currently at a solar minimum, awaiting the start of the next solar cycle. If sunspot activity does not pick up soon, we could be in for some seriously cold climate. The jury is still out on sunspot numbers.
In any case, some climate scientists believe the length of past solar cycles points to a cool phase early in this century. Professor Habibullo Abdussamatov, head of the Pulkovo Observatory in Russia, believes a slow decline in temperatures will begin as early as 2012-2015 and will lead to a deep freeze in 2050-2060 that will last about fifty years. Climatologist Tim Patterson thinks by 2020 the sun will be starting into its weakest 11-year sunspot cycle of the past two centuries, likely leading to unusually cool conditions on earth. He says, “If we're to have even a medium-sized solar minimum, we could be looking at a lot more bad effects than 'global warming' would have had.”
To be continued.
Wednesday, August 27, 2008
Mortgage Crisis, the Dollar and its Future, Part 4
Various other countries, even those not hostile to the U.S., also seek to reduce their risks by reducing their dollars. Over a year ago Italy, Russia, Sweden and the United Arab Emirates announced they would reduce their dollar holdings slightly. Sweden cut the dollar share of its foreign reserves from 90% to 85%. The UAE said it would convert 8% of its dollar holdings to euros. Friendly Japan, the largest holder of U.S. treasuries, was a net seller of $46 billion in U.S. treasuries in the last 12 months, and has reduced its holdings by 3% over three years. South Korea sold $19 billion this past year.
U.S. citizens, like foreign governments, have been diversifying out of depreciating dollars. In the last two years, more than 200 internationally focused mutual funds have been launched in the U.S. When the dollar is weak, overseas gains are worth more when translated back into dollars. This is part of the reason these mutual funds gained 16.3% last year compared with 5.2% for their U.S. counterparts. In 2007 more than 95% of net inflows of money into mutual funds went into internationally focused funds, compared to less than 10% five years ago. This net outflow of capital adds to the underlying weakness of the dollar: mutual funds and other institutions sell increasing amounts of dollars to buy increasing amounts of the currencies in which the foreign stocks are denominated.
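The currency-translation effect behind those fund returns is simple compounding: a foreign asset's dollar return combines its local-currency return with the currency's move against the dollar. A minimal sketch with illustrative numbers (not the funds' actual figures):

```python
def dollar_return(local_return: float, fx_change: float) -> float:
    """Total dollar return on a foreign asset.
    local_return: the asset's return in its own currency (0.05 = 5%)
    fx_change: the foreign currency's change vs. the dollar (0.10 = +10%)
    The two compound: (1 + local) * (1 + fx) - 1."""
    return (1 + local_return) * (1 + fx_change) - 1

# A 5% local gain plus a 10% rise in the foreign currency vs. the dollar:
print(f"{dollar_return(0.05, 0.10):.1%}")  # 15.5% in dollar terms
```

The same arithmetic runs in reverse when the dollar strengthens, which is why the gap between international and domestic fund returns tracks the dollar's slide.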
An increasingly popular way for governments to diversify out of dollars is through so-called sovereign wealth funds. These are state-owned entities that invest central bank reserves in stocks, real estate, bonds and other financial instruments. They are typically created when central banks have reserves massively in excess of needs for liquidity and foreign exchange.
Back in 1971 the U.S. said foreign governments couldn't exchange their dollars for gold. Now they have been told they can't exchange them for certain other things either. The U.S. prevented the sovereign fund of oil-rich Dubai from buying a company that operates six U.S. ports. It also prevented a Chinese government-controlled company from buying Unocal, an oil company. The Chinese company withdrew its offer of $18.5 billion, “saying it could not overcome resistance from politicians in Washington,” according to the Washington Post.
Countries have to question the value of accumulating a currency they can't spend in the country that issued it. Fortunately for them, the credit crisis gave them another opportunity. Massive losses by U.S. banks and other financial institutions created a need for cash. In January 2008, Citigroup (formerly Citibank), the nation's second largest bank, reported a fourth-quarter loss of $18.1 billion, in addition to a $6 billion loss in the third quarter. Merrill Lynch, a 94-year-old firm that is the world's largest brokerage, reported a fourth-quarter loss of almost $10 billion after $16.7 billion in write-downs from subprime mortgages and CDOs. Morgan Stanley reported a $9.4 billion loss for the fourth quarter. UBS reported a fourth-quarter loss of $14.45 billion, primarily from U.S. subprime mortgages. Sovereign wealth funds from China, Singapore, South Korea and other nations jumped at the opportunity to buy into these companies. Merrill Lynch obtained $6.6 billion in January from South Korea, Kuwait and Japan and $6.2 billion the previous month from Singapore. UBS received $9.75 billion from Singapore. China now owns 9.9% of Morgan Stanley.
Mindful of the earlier problems of Dubai and the Chinese attempt to buy Unocal, foreign governments are well aware they may encounter further political obstacles to investing in the U.S., particularly if the current banking crisis abates. In any case, they are looking for other ways of diversifying out of dollars. Think gold. It is something they can buy without any government's permission. It has an unrivaled record of 5,000 years as a store of value. And it surely is more valuable than unredeemable paper currency. Last year Qatar tripled its reserves of gold in one month. That still was a small amount, but what if other countries with far larger dollar reserves start trading them for gold?
For more than a century, South Africa has been the world's leading producer of gold. For most of those years, it produced three-fourths of the world's output. But last year it was in second place. The number one gold producing country in 2007 was.....(are you ready for this?)......China!
Through centuries of political and economic turmoil, people in China and India have hoarded gold and silver jewelry and bars as a means of storing wealth. There were no large, organized markets; prices were determined by the haggling of local buyers and sellers. All that has changed. New opportunities have greatly expanded commerce in gold and made it much more convenient for the public, which has responded with enthusiasm. Investors can now trade in physical gold on the Shanghai Gold Exchange. Some new Chinese gold issues are traded over the internet. On January 9, 2008, China's first gold futures contract was launched, attracting thousands of investors. A spokesman for China International Futures said “about 90 % are individual investors, most of whom were moving assets after turning bearish on the stock markets.” As China's economy grows, people have higher incomes that raise the appetite for gold.
The State Bank of India plans to launch an exchange traded fund (ETF) this year that will enable investors to trade gold like a regular stock. Dubai also hopes to launch an ETF in gold this year. In August, 2007, the Osaka Securities Exchange in Japan began offering a gold-linked bond aimed at smaller investors. In January the Hong Kong Exchanges & Clearing Ltd. announced plans for ETFs and other gold-related investments.
Gold buying in the U.S. used to be a clumsy process. For decades, thanks to President Franklin Roosevelt's prohibition on owning gold, U.S. citizens could not even own the metal except as jewelry and coins classified as collector's items. After gold became legal again, the investment process was small-scale and cumbersome. Investors buying gold coins or bars faced varying—and sometimes hefty—commissions plus storage and transportation costs. Investing in gold became more convenient and efficient with the advent of ETFs. These allow investors to buy or sell shares of stock tied to the value of the underlying commodity. When people buy shares of stock in such a fund, the fund buys an equivalent amount of gold, which is stored in vaults. When shares are sold, the fund sells an equivalent amount of gold. The ease and simplicity of the process, with the same commissions as for regular stocks, has resulted in investors pouring billions of dollars into these funds on stock exchanges in the U.S., Paris, London, Australia, South Africa, Mexico, Singapore, and other countries. The most active gold ETF is SPDR Gold Trust, which trades on the New York Stock Exchange. It holds more gold than the European Central Bank or China's central bank. Meanwhile, gold futures are trading robustly on the world's most important gold market, the Comex division of the New York Mercantile Exchange, and on the London Metal Exchange. Other gold plays include a five-year gold bullion CD from Everbank Financial, common stocks in gold mining companies, and mutual funds specializing in those stocks.
Both foreign governments and individuals are turning away from the dollar as it loses value. The dollar is in danger of losing its status as a reserve currency. The international monetary system that has existed since the post-World War II Bretton Woods agreement is coming apart at the seams. Recently Bill Gross, CEO of PIMCO, the world's largest bond fund manager, wrote, “What we are witnessing is essentially the breakdown of our modern day banking system.” Vladimir Putin has called for a “new international financial architecture,” a call that resonates with emerging economies such as China, India, and Brazil. At an international economic forum in St. Petersburg in June 2007, Putin pointed out, “Fifty years ago, 60% of world gross domestic product came from the Group of Seven industrial nations. Today 60% of world GDP comes from outside the G7.” In view of this growing global economic trend, it is hard to see the current monetary system enduring when the linchpin of that system, the U.S. dollar, is continually undermined by the inflationary policies of its own government.
Regarding a recent hint from China about dumping a portion of its dollars, Judy Shelton, Ph.D., a professor of international finance and author of books and articles on monetary issues, said: “The prospect of such a shock to the U.S. economy in the midst of a housing slump threatens to bring the whole edifice crashing down. Throw in statements of support from oil-producers Venezuela and Iran, and you have the makings of a devastating dollar rout.” Incidentally, last July Japan agreed to pay for its oil imports from Iran in yen rather than dollars.
Even if China does not dump its U.S. treasuries on the market any time soon, the longer term outlook for the dollar is still bleak. It seems unlikely any of the trends that brought us to this point will reverse; more likely, they will worsen. Earlier I mentioned that the U.S. national debt is now $9 trillion. But that does not include unfunded Social Security, Medicaid and similar obligations. If those are included, the debt is $59 trillion. There is no way those future bills can be paid in terms of current dollars. As a “solution,” future administrations will simply employ larger doses of the same inflationary measures that created the problem. Because of structural changes in our government many years ago, which I discuss at some length in my book Makers and Takers, the system has evolved to favor the politicians who spend more and promise more. “A lot of people, including me,” said Paul Volcker, former chairman of the Federal Reserve, “have been saying that the country has been spending more than it's been producing, and that will have to come to an end. The question is: Does it come to an end with a bang or a whimper?”
The attempt to kick the problem down the road to the next generation and let them pay off future obligations with trillions of dollars worth a fraction of today's dollars will no longer work. As the dollar loses its stature as a reserve currency, as countries one by one defect from their formal links to the dollar, as U.S. dollars pile up abroad from financing our deficits, and as our borrowing becomes increasingly expensive, the monetary framework becomes ever more fragile. The danger of creating a run from the dollar could easily snowball like the run to gold prior to Nixon's action in 1971. The dollar may survive this crisis, or the next one, but there will always be one more, to which the U.S. will be more vulnerable than to the previous ones. Long before that $59 trillion debt comes due, the dollar will have lost too much value to be trusted any more. The end will not come with a whimper.
Edmund Contoski is the author of MAKERS AND TAKERS: How Wealth and Progress Are Made and How They Are Taken Away or Prevented (American Liberty Publishers, 1997). www.amlibpub.com
U.S. citizens, like foreign governments, have been diversifying out of depreciating dollars. In the last two years, more than 200 internationally focused mutual funds have been launched in the U.S. When the dollar is weak, overseas gains are worth more when translated back into dollars. This is part of the reason these mutual funds gained 16.3 % last year compared with 5.2 % for their U.S. counterparts. In 2007 more than 95 % of net inflows of money into mutual funds went into internationally focused funds, compared to less than 10 % five years ago. This net outflow of capital adds to the underlying weakness of the dollar: mutual funds and other institutions sell increasing amounts of dollars to buy increasing amounts of the currencies in which the foreign stocks are denominated.
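The translation effect described above compounds two returns: what the asset earns in its own currency, and what that currency gains against the dollar. A minimal sketch, using hypothetical round numbers (not figures from the article):

```python
def dollar_return(local_return: float, currency_return: float) -> float:
    """Total U.S.-dollar return on a foreign holding.

    local_return:    the asset's return in its local currency (0.08 = 8 %)
    currency_return: the foreign currency's gain against the dollar (0.07 = 7 %)
    The two effects compound: (1 + r_local) * (1 + r_fx) - 1.
    """
    return (1 + local_return) * (1 + currency_return) - 1

# Hypothetical illustration: a foreign stock gains 8 % in local-currency
# terms while its currency gains 7 % against a weakening dollar.
print(f"{dollar_return(0.08, 0.07):.1%}")  # 15.6%
```

The dollar investor earns more than the sum of the two returns, because the currency gain applies to the appreciated asset as well as the principal.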
An increasingly popular way for governments to diversify out of dollars is through so-called sovereign wealth funds. These are state-owned entities that invest central bank reserves in stocks, real estate, bonds and other financial instruments. They are typically created when central banks have reserves massively in excess of needs for liquidity and foreign exchange.
Back in 1971 the U.S. told foreign governments they couldn't exchange their dollars for gold. Now they have been told they can't exchange them for certain other things either. The U.S. prevented the sovereign fund of oil-rich Dubai from buying a company that operates six U.S. ports. It also prevented a Chinese government-controlled company from buying Unocal, an oil company. The Chinese company withdrew its $18.5 billion offer, “saying it could not overcome resistance from politicians in Washington,” according to the Washington Post.
Countries have to question the value of accumulating a currency they can't spend in the country that issued it. Fortunately for them, the credit crisis gave them another opportunity. Massive losses by U.S. banks and other financial institutions created a need for cash. In January 2008, Citigroup (formerly Citibank), the nation's second largest bank, reported a 4th quarter loss of $18.1 billion, in addition to a $6 billion loss in the third quarter. Merrill Lynch, a 94-year-old firm that is the world's largest brokerage, reported a 4th quarter loss of almost $10 billion after $16.7 billion in write-downs from subprime mortgages and CDOs. Morgan Stanley reported a $9.4 billion loss for the 4th quarter. UBS reported a 4th quarter loss of $14.45 billion, primarily from U.S. subprime mortgages. Sovereign wealth funds from China, Singapore, South Korea and other nations jumped at the opportunity to buy into these companies. Merrill Lynch obtained $6.6 billion in January from South Korea, Kuwait and Japan and $6.2 billion the previous month from Singapore. UBS received $9.75 billion from Singapore. China now owns 9.9 % of Morgan Stanley.
Mindful of the earlier problems of Dubai and the Chinese attempt to buy Unocal, foreign governments are well aware they may encounter further political obstacles to investing in the U.S., particularly if the current banking crisis abates. In any case, they are looking for other ways of diversifying out of dollars. Think gold. It is something they can buy without any government's permission. It has an unrivaled record of 5,000 years as a store of value. And it surely is more valuable than unredeemable paper currency. Last year Qatar tripled its reserves of gold in one month. That still was a small amount, but what if other countries with far larger dollar reserves start trading them for gold?
For more than a century, South Africa has been the world's leading producer of gold. For most of those years, it produced three-fourths of the world's output. But last year it was in second place. The number one gold producing country in 2007 was.....(are you ready for this?)......China!
Through centuries of political and economic turmoil, people in China and India have hoarded gold and silver jewelry and bars as a means of storing wealth. There were no large, organized markets; prices were determined by the haggling of local buyers and sellers. All that has changed. New opportunities have greatly expanded commerce in gold and made it much more convenient for the public, which has responded with enthusiasm. Investors can now trade in physical gold on the Shanghai Gold Exchange. Some new Chinese gold issues are traded over the internet. On January 9, 2008, China's first gold futures contract was launched, attracting thousands of investors. A spokesman for China International Futures said “about 90 % are individual investors, most of whom were moving assets after turning bearish on the stock markets.” As China's economy grows, people have higher incomes that raise the appetite for gold.
The State Bank of India plans to launch an exchange-traded fund (ETF) this year that will enable investors to trade gold like a regular stock. Dubai also hopes to launch a gold ETF this year. In August 2007, the Osaka Securities Exchange in Japan began offering a gold-linked bond aimed at smaller investors. In January the Hong Kong Exchanges & Clearing Ltd. announced plans for ETFs and other gold-related investments.
Gold buying in the U.S. used to be a clumsy process. For decades, thanks to President Franklin Roosevelt's prohibition on owning gold, U.S. citizens could not even own the metal except for jewelry and coins classified as collector's items. After gold became legal again, the investment process was small-scale and cumbersome. Investors buying gold coins or bars were faced with varying—and sometimes hefty—commissions plus storage and transportation costs. Investing in gold became more convenient and efficient with the advent of ETFs. These allow investors to buy or sell shares of stock tied to the value of the underlying commodity. When people buy shares of stock in such a fund, the fund buys an equivalent amount of gold, which is stored in vaults. When shares are sold, the fund sells an equivalent amount of gold. The ease and simplicity of the process, with the same commissions as for regular stocks, has resulted in investors pouring billions of dollars into these funds on stock exchanges in the U.S., Paris, London, Australia, South Africa, Mexico, Singapore, and some other countries. The most active gold ETF is SPDR Gold Trust, which trades on the New York Stock Exchange. It holds more gold than the European Central Bank or China's central bank. Meanwhile, gold futures are trading robustly on the world's most important gold market, the Comex division of the New York Mercantile Exchange, and on the London Metal Exchange. Other gold plays include a five-year gold bullion CD from Everbank Financial, common stocks in gold mining companies, and mutual funds specializing in those stocks.
Both foreign governments and individuals are turning away from the dollar as it loses value. The dollar is in danger of losing its status as a reserve currency. The international monetary system that has existed since the post-World War II Bretton Woods agreement is coming apart at the seams. Recently Bill Gross of PIMCO, the world's largest bond fund manager, wrote, “What we are witnessing is essentially the breakdown of our modern day banking system.” Vladimir Putin has called for a “new international financial architecture,” a call that resonates with emerging economies such as China, India, and Brazil. At an international economic forum in St. Petersburg in June 2007, Putin pointed out, “Fifty years ago, 60 % of world gross domestic product came from the Group of Seven industrial nations. Today 60 % of world GDP comes from outside the G7.” In view of this global economic trend, it is hard to see the current monetary system enduring when the linchpin of that system, the U.S. dollar, is continually undermined by the inflationary policies of its own government.
Regarding a recent hint from China about dumping a portion of its dollars, Judy Shelton, Ph.D., a professor of international finance and author of books and articles on monetary issues, said: “The prospect of such a shock to the U.S. economy in the midst of a housing slump threatens to bring the whole edifice crashing down. Throw in statements of support from oil-producers Venezuela and Iran, and you have the makings of a devastating dollar rout.” Incidentally, last July Japan agreed to pay for its oil imports from Iran in yen rather than dollars.
Even if China does not dump its U.S. treasuries on the market any time soon, the longer term outlook for the dollar is still bleak. It seems unlikely any of the trends that brought us to this point will reverse; more likely, they will worsen. Earlier I mentioned that the U.S. national debt is now $9 trillion. But that does not include unfunded Social Security, Medicaid and similar obligations. If those are included, the debt is $59 trillion. There is no way those future bills can be paid in terms of current dollars. As a “solution,” future administrations will simply employ larger doses of the same inflationary measures that created the problem. Because of structural changes in our government many years ago, which I discuss at some length in my book Makers and Takers, the system has evolved to favor the politicians who spend more and promise more. “A lot of people, including me,” said Paul Volcker, former chairman of the Federal Reserve, “have been saying that the country has been spending more than it's been producing, and that will have to come to an end. The question is: Does it come to an end with a bang or a whimper?”
The attempt to kick the problem down the road to the next generation and let them pay off future obligations with trillions of dollars worth a fraction of today's dollars will no longer work. As the dollar loses its stature as a reserve currency, as countries one by one defect from their formal links to the dollar, as U.S. dollars pile up abroad from financing our deficits, and as our borrowing becomes increasingly expensive, the monetary framework becomes ever more fragile. The danger of creating a run from the dollar could easily snowball like the run to gold prior to Nixon's action in 1971. The dollar may survive this crisis, or the next one, but there will always be one more, to which the U.S. will be more vulnerable than to the previous ones. Long before that $59 trillion debt comes due, the dollar will have lost too much value to be trusted any more. The end will not come with a whimper.
Edmund Contoski is the author of MAKERS AND TAKERS: How Wealth and Progress Are Made and How They Are Taken Away or Prevented (American Liberty Publishers, 1997). www.amlibpub.com
Saturday, August 23, 2008
Mortgage Crisis, the Dollar and its Future, Part 3
The Fed, of course, hopes to pick the optimum rate for increasing the money supply, but it will always prefer to err on the side of being a little too loose rather than a little too tight. Everyone on the Fed board is aware of the role of tight money during the Great Depression and wants to avoid a similar outcome at all costs. The danger of just a little more inflation will always seem preferable to the risk that an economic correction will slide dangerously further than expected. The latter can pose a far more difficult problem for the Fed. The leverage that was so attractive to investments on the upside now works in reverse, accelerating the decline. Sharp losses in the stock market, real estate and elsewhere can wipe out assets rapidly on a colossal scale that monetary policy cannot quickly replace. Moreover, the Fed can make credit available, but it can't make people use it. Its actions have been compared to “pushing on a string.” People have to want to “pull” on the available credit for Fed policy to have any effect. When an economy gets so bad that pessimism pervades society, businesses are afraid to hire new workers or invest in plant or equipment, and consumers are afraid to spend. Then central bank policies are ineffective. A case in point is Japan after the collapse of its stock market in 1989. During the 1990s, even an interest rate as low as zero couldn't revive the economy. The country has still not fully recovered. Its stock market (the Nikkei Index) is barely one-third of its 1989 peak. Trillions of dollars (or yen) of assets were wiped out and have never been recouped.
Another factor favoring continued inflation in the U.S. is the growing national debt, which has accelerated wildly under President George W. Bush. Congress has increased the debt limit five times since he took office in January 2001. At that time, the national debt stood at $5.6 trillion. Now it is over $9 trillion. The $3.865 trillion increase is the largest under any administration ever, despite the fact that he ran for office as an economic conservative.
A growing portion of our national debt is held by foreigners, particularly foreign governments. Foreign investors own only about 13 percent of U.S. equities but 44 percent of U.S. Treasury debt. And it is likely to get worse. As Peter Orszag, director of the nonpartisan Congressional Budget Office, recently testified on Capitol Hill, “Under any plausible scenario, the federal budget is on an unsustainable path—that is, federal debt will grow much faster than the economy over the long run.”
Our government finances its deficits by auctioning U.S. Treasury securities. It deposits the proceeds from the sale of the securities into its own checking account, against which it writes checks for employee salaries, federal contracts, government grants, and goods and services. The recipients of these checks deposit them in their own accounts at commercial banks. Those banks are then required to set aside a percentage (currently 10 %) as a reserve—but can loan out the remaining 90 %. For example, if someone receives a government check for $1,000 and deposits it in his bank account, his bank sets aside $100 as the required reserve and can then loan out the other $900. When someone else borrows that $900 and deposits it in his account, his bank sets aside $90 and loans out $810. The next time around, the amount loaned out will be $729. The cycle repeats until there is nothing left to loan. At that point the total amount of deposits is $10,000: the original $1,000 plus $9,000 in credit created by the banks. That's how the increase in our money supply can be ten times the amount borrowed to finance our debt. Is it any wonder we have price inflation?
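The deposit-expansion cycle above is a geometric series: with a 10 % reserve ratio, total deposits converge to the initial deposit divided by the reserve ratio. A minimal sketch of the rounds described in the example:

```python
def deposit_expansion(initial_deposit: float, reserve_ratio: float):
    """Trace fractional-reserve deposit expansion round by round.

    Each round the bank keeps reserve_ratio of the new deposit as the
    required reserve and lends out the rest, which returns to the banking
    system as the next deposit. Returns (total_deposits, credit_created).
    """
    total = 0.0
    deposit = initial_deposit
    while deposit > 0.01:          # stop once the next loan is negligible
        total += deposit           # $1,000 + $900 + $810 + $729 + ...
        deposit *= (1 - reserve_ratio)
    return total, total - initial_deposit

# The article's example: a $1,000 deposit, 10 % reserve requirement.
# The closed-form limit is 1000 / 0.10 = $10,000 in total deposits.
total, credit = deposit_expansion(1_000, 0.10)
print(round(total), round(credit))
```

The loop reproduces the $1,000 → $900 → $810 → $729 chain and converges on $10,000 of deposits, $9,000 of which is bank-created credit.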
It should be noted that the ten-fold increase in the above example increases the money supply only when treasury securities are purchased by foreigners. If those securities are purchased by Americans, no new credit is created, because the money that the government receives has already been part of the U.S. money supply; it is simply transferred from the private sector to the government, with the multiplier being the same for both. But when foreigners buy treasury securities, the proceeds are added to the previous money supply.
In 1971 President Nixon severed the last link between the dollar and gold by declaring that the U.S. government would no longer allow foreign central banks to redeem their dollars in gold. Prior to this, the dollar had been a “reserve currency” because of its gold convertibility, and foreign banks held their reserves in both dollars and gold. But as they accumulated more and more dollars, they knew the U.S. had nowhere near enough gold to back the outstanding dollars. So they wanted to get gold while they could. The U.S. paid out several billion dollars in gold but could not stem the tide of demands. It was obvious that further redemptions would soon exhaust our reserves. So Nixon simply declared that no more gold would be paid out.
The system of fixed exchange rates broke down, and currencies were set free to float against each other. But the dollar was still a reserve currency. Central banks held large quantities of dollars, and have been accumulating more. More than 60 % of foreign exchange reserves are still kept in U.S. dollars, which have been losing value rapidly. Priced against a basket of major currencies, the U.S. dollar has lost 38 % of its value in the last six and a half years. It lost 7.5 % in 2007 alone. It lost 17 % last year against the Canadian dollar, falling to its lowest level in well over a century, and nearly 17 % against the Brazilian currency. It lost more than 10 % last year against the Turkish, Thai, and Indian currencies. And it lost 9.5 % last year against the euro, the common currency of 15 countries in the European Union. At one point in 2000, a dollar would buy 1.176 euros; now it buys only 0.625, a decline of nearly 47 %.
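The closing figure can be checked directly from the two exchange rates quoted: the fall from 1.176 euros per dollar to 0.625 works out to just under 47 %.

```python
def percent_decline(old: float, new: float) -> float:
    """Percentage decline from an old value to a new one."""
    return (old - new) / old * 100

# Euros one dollar bought at the 2000 peak vs. at the time of writing.
print(f"{percent_decline(1.176, 0.625):.1f} %")  # 46.9 %
```

Note the decline is measured in the dollar's euro value; seen the other way around, a euro's dollar price rose from about $0.85 to $1.60 over the same span.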
Foreign governments are alarmed that the large and growing percentage of their monetary assets held in dollars is rapidly losing value. They have been buying U.S. treasury securities, for which they collect interest, but the interest is clearly no longer keeping pace with the loss of value in the currency. So they are looking to reduce their risk by diversifying into assets that will better retain their value.
It is the oil-producing countries and China, with its rapidly growing economy, that find themselves with mountains of dollar reserves. The Persian Gulf nations originally pegged their currencies to the dollar to stabilize oil revenues, because oil was priced in dollars. But this forces them to accept U.S. inflation and monetary stimulus. It also makes their imports from countries with stronger currencies more expensive. Now the Fed is cutting interest rates to fight the slowdown in the U.S. economy. This policy is exactly opposite to the interests of the oil producers, who are fighting inflation and overheated economies. An official spokesman for Qatar has stated that pegging its currency to the dollar has “many disadvantages, especially if that country adopts monetary policies that clash with ours.” In November, Nasser al-Suwaidi, governor of the United Arab Emirates central bank, while acknowledging that the dollar peg has “served the economy...very well in the past,” ended by saying: “However, we have reached a crossroads.” Kuwait severed its peg with the dollar in May 2007, linking it instead to a basket of currencies. Since then, its currency has strengthened about 5 % compared to the dollar.
Since oil is priced in dollars, the depreciating value of the dollar gives oil-exporting countries an incentive to try to keep oil prices high, by restricting production, to avoid eroding their own purchasing power. It also gives them another incentive to move away from the dollar, with the euro being the most attractive currency for pricing oil. Beginning in 2003, the oil price tripled in U.S. dollars but only a little more than doubled in euros.
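The "tripled in dollars, roughly doubled in euros" claim follows from simple conversion at the prevailing exchange rates. A sketch using approximate, illustrative rates (roughly 0.88 euros per dollar in 2003 and 0.64 in early 2008; these round figures are my assumption, not the article's):

```python
def price_in_euros(price_usd: float, euros_per_dollar: float) -> float:
    """Convert a dollar price into euros at the given exchange rate."""
    return price_usd * euros_per_dollar

# Illustrative figures: oil near $30 in 2003 with the dollar worth
# about 0.88 euros, later $90 with the dollar worth about 0.64 euros.
start = price_in_euros(30, 0.88)   # about 26.4 euros
later = price_in_euros(90, 0.64)   # about 57.6 euros
print(later / start)               # a bit more than 2x, vs 3x in dollars
```

Because the dollar fell against the euro over the same period, much of the dollar-price rise in oil is, from a European buyer's perspective, currency depreciation rather than a dearer commodity.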
Thus far Saudi Arabia, though it is struggling with inflation, has said it will retain its link to the dollar. Its foreign minister, Saud al-Faisal, said such a move would damage the U.S. economy. Other countries, however, are not so considerate of U.S. interests, because they do not have the long relationship that Saudi Arabia has with the U.S., which includes military protection. Russian President Putin, who has visions of restoring his country to its former stature as a world power, would be only too happy to advance his ambitions at the expense of the U.S. He would like to see the ruble become a global currency and has expressed interest in a Russian stock exchange pricing oil and gas in rubles. That is not realistic now—though Russia is the world's second largest exporter of oil—but in 2005 he severed the ruble's link to the dollar and aligned it with the euro.
Russia has $475 billion in reserves, but China has $1.7 trillion, with which it could do a lot of damage to the U.S. dollar if it so chose. Deng Xiaoping, who turned China away from Mao's communism, urged the country to “bide time” and “seek cooperation and avoid confrontation.” His successor Jiang Zemin had big ambitions for his country, but he continued Deng's approach because he saw the benefits of cooperation with the U.S. and its allies. Current President Hu Jintao has changed direction. In the words of China scholar G.G. Chang, he “appears to see his country working against the U.S.” (italics Chang's). Last year China refused to provide shelter for two U.S. minesweepers seeking refuge from a storm. In November, a long-arranged port call for the carrier Kitty Hawk was denied at the last minute. A routine flight to resupply the American consulate in Hong Kong was denied. Veteran China analyst Willy Lam has noted that Mr. Hu and the party leadership have decided “to make a clean break with Deng's cautious axioms and, instead, embark on a path of high-profile force projection.”
In October 2006 a Chinese submarine surfaced for the first time in the middle of an American carrier group. In January 2007, China “in an unmistakable display of military power,” said Chang, “destroyed one of China's old weather satellites with a ground-based missile.” Its military exercises last August “were remarkable in scope and sophistication” and were “apparently rehearsals to take Taiwan and disputed islands in the South China Sea.” Hong Yuan, a military strategist at the Chinese Academy of Social Sciences, says China's new posture shows it intends to project force in areas “way beyond the Taiwan Strait.”
So don't expect China to do us any favors in regard to the U.S. dollar. It will use its dollars against the U.S. when it considers it most advantageous to do so. It continues to fund our deficits by buying U.S. treasury securities because actions destructive to the dollar would also be destructive to its own hoard of dollars. That hoard, however, continues to lose value anyway because of U.S. inflation. As the value of the dollar continues to slide, not only China but other countries, too, will demand higher interest rates on U.S. treasuries to compensate for inflation. Those rates, in turn, will slow the growth of the U.S. economy, making us more dependent on foreign financing of our debt and at odds with our own monetary objectives for economic growth. As China's mountain of dollars grows, it becomes an ever more potent weapon to use against the U.S. at some future date. Meanwhile, China also seeks to mitigate its own growing risk by diversifying out of dollars.
To be concluded.
Saturday, August 16, 2008
Mortgage Crisis, the Dollar and its Future, Part 2
John Berlau, a scholar at the Competitive Enterprise Institute, says, “The collapse of whole segments of the housing market can be traced to FHA-subsidized mortgage products. Despite its decreasing market share, the FHA appears to have played a significant role in the current mortgage 'meltdown' attributed to subprime loans. For the past three years, [FHA] delinquency rates have consistently been higher than even those of the dreaded subprime mortgages...[and]...nearly twice as high as the rate for all mortgages.”
Berlau also notes: “FHA-insured loans have also been at the center of some of the worst excesses of the housing boom, including mortgage fraud, loans made without income verification, and property 'flipping' with inflated appraisals.” These allegations have been documented by Congressional probes and investigative newspaper reporting. Senator Susan Collins of Maine, who headed a 2001 Senate investigation of mortgage fraud, said, “The federal government has essentially subsidized much of this fraud.” Over the years, FHA's down payment requirement of 20% was gradually whittled down to 3%. That was a result of the agency trying to compete for market share by making its own standards even more “subprime” than those of the private sector.
Fannie Mae and Freddie Mac own or guarantee 45 percent of all U.S. home-loan mortgages. These giant agencies don't make loans. They are forbidden from doing so. Instead they buy mortgages from banks, bundle them into securities, and then resell these to investors. This “securitizing” of mortgages doesn't require Fannie or Freddie to hold a mortgage on its books any longer than it takes to package and resell it. Once a mortgage is off the books, the agency's capital is freed up to do the same thing all over again. Hence the potential for a credit bubble in the housing market. Fannie and Freddie have grown explosively since 1990. In 1990 their combined holdings of mortgages and related securities were $136 billion. By 2004 the total was $1.6 trillion, and three years later it was $4.8 trillion.
In May 2006 the Office of Federal Housing Enterprise Oversight (OFHEO) announced a $400 million civil penalty against Fannie Mae for accounting manipulations. The agency discovered “a wide variety of unsafe and unsound practices.” Its report shows “Fannie Mae's faults were not limited to violating accounting and corporate governance standards, but included excessive risk-taking and poor risk management as well.” Fannie was ordered to restate its earnings from prior years by an estimated $11 billion.
In 2003, Freddie Mac was fined $125 million by OFHEO for accounting irregularities and ordered to restate its 2000-2002 earnings by $5 billion. In September 2007 Freddie was fined $50 million, this time by the SEC, for accounting fraud that deceived investors, and four former key officials, including a CFO, a COO, and two senior vice presidents who profited from the scheme, were required to repay ill-gotten gains.
In the housing boom following the tech-stock bubble in 2000, Wall Street investment firms started dominating the lucrative business pioneered by Fannie and Freddie of bundling mortgages into securities. But Wall Street was selling them world-wide, spreading the credit bubble far beyond our shores. By the end of 2006 the total U.S. residential mortgage debt was $10.3 trillion, almost double the level of just six years earlier.
Fannie Mae and Freddie Mac are not government agencies. They are private corporations established by federal charters, implicitly backed by the U.S. government. Government-backed financial institutions have been known to fail. In the savings-and-loan debacle in the 1980s, more than 1,000 S & Ls collapsed. The Federal Savings and Loan Insurance Corporation, which had been in existence since 1934 to guarantee depositors' funds, became insolvent. Though recapitalized several times by Congress with multi-billion dollar infusions of taxpayer money, the FSLIC by 1989 was deemed too insolvent to save and was abolished. Overall, the S & L bailout cost taxpayers an estimated $124 billion. For a brief period in the 1980s, Fannie Mae was losing about $1 million a day and was technically insolvent.
The feeling that mortgages are backed by the federal government undoubtedly led investors to be less circumspect than they otherwise would have been. “Unsafe and unsound practices” and “excessive risk taking and poor risk management” escaped scrutiny behind the government guarantee. Inadequate recognition of risk coupled with the explosive growth of Fannie and Freddie produced the potential for a gigantic financial disaster. Even though the mortgages were sold to other investors, Fannie and Freddie for a fee still guaranteed that payments would be made on the loans. So when the banks divided the securitized mortgages and repackaged them in SIVs (structured investment vehicles) and CDOs (collateralized debt obligations), the mortgage payments were still federally guaranteed. And the banks would collect a fee for imaginatively repackaging and selling the SIVs and CDOs.
The situation was made worse by home-equity loans, which exploded during the housing boom. Home owners took advantage of rising home values and tapped their equity to fund spending or leverage other investments. Some took out “piggyback loans,” which allowed them to borrow as much as 100 percent of a home's value by combining a mortgage with a home-equity loan. The value of home-equity loans stood at $1.1 trillion in the third quarter of 2007.
It was a boom time for the home building industry. With easy credit terms available, many people bought homes who couldn't afford them and would eventually lose them to foreclosure. Others, who were better off, were buying second and third homes as investments, with little or no money down. Home buying was stimulated by people's experience of seeing homes appreciate over the years while the dollar lost purchasing power through inflation. Money is a medium of exchange, but it is also a store of value; in fact, it must be a store of value before it can be a medium of exchange. It is often lamented that the U.S. has a low rate of saving, but why should people save dollars that will be worth less in future years? A significant threat of inflation is always an incentive for people to try to get out of the currency, to spend now before the money loses value, or to find an alternative asset which will serve as a store of value. Home ownership was regarded in this manner. Home buying was viewed as a “safe” investment and one likely to appreciate with inflation rather than be eroded by it. Certainly the real estate market, within the memory of most Americans, was much more stable than the stock market. For all these reasons, money poured into the home building industry until the supply of housing outstripped the demand. Then prices started coming down. And the mountain of debt started to crumble.
Banks have been blamed for creating the subprime mortgage crisis by making risky loans to borrowers who did not meet standards of creditworthiness, but the federal government forced them to do so. Boston Globe columnist Jeff Jacoby explains: "The crisis has its roots in the Community Reinvestment Act of 1977, a Carter-era law that purported to prevent 'redlining'—denying mortgages to black borrowers—by pressuring banks to make home loans in 'low- and moderate-income neighborhoods.' ...The CRA [was] made even more stringent during the Clinton administration....Banks nationwide thus ended up making more and more ‘sub-prime’ loans and agreeing to dangerously lax underwriting standards―no down payment, no verification of income, interest-only payment plans, weak credit history....Trapped in a no-win situation entirely of the government’s making, lenders could only hope that home prices would continue to rise, staving off the inevitable collapse. But once the housing bubble burst, there was no escape. Mortgage lenders have been bankrupted, thousands of sub-prime homeowners have been foreclosed on, and countless would-be borrowers can no longer get credit. The financial fallout has hurt investors around the world. And all of it thanks to the government, which was sure it understood the credit industry better than the free market did, and confidently created the conditions that made disaster unavoidable."
Homeowners with little equity found themselves “upside down” with their mortgages: they owed more than the homes were worth. So, many simply walked away, leaving the banks to swallow the losses. Those defaults reduced bank reserves, which further reduced capital to support credit of all types. The same thing was happening with Fannie and Freddie, which were called upon to make good on their mortgage guarantees. When borrowers fall behind on their loan payments, Fannie and Freddie must buy those loans and recognize a loss on any drop in market value below the amount they paid for them. At the end of the third quarter of 2007, Freddie had marked down its assets by $3.6 billion to match current market levels. In addition, it took $1.2 billion in credit losses. These losses left the company with core capital of $34.6 billion, a mere $600 million above the minimum requirement of OFHEO. Freddie estimates its losses for 2008 and 2009 will be $1.5 billion and $2.1 billion respectively.
Banks have been trying to keep as much cash as possible as a cushion against further write-downs and credit losses. Banks are also wary of lending to each other because, knowing how bad their own assets are, they don't trust each other's balance sheets. Consequently, they have been charging each other higher interest rates. Those rates, in turn, affect monthly interest payments on millions of credit cards and mortgages in Europe and the U.S.
Research suggests consumer spending drops 9 cents for every dollar decline in home equity. A decline of $2.1 trillion in U.S. residential values has already occurred, implying a decline of $200 billion in consumer spending. Consumer spending accounts for two-thirds of U.S. economic activity.
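As a rough sanity check, the spending decline implied by those figures can be computed directly. This is just the multiplication spelled out; the 9-cent sensitivity and the $2.1 trillion decline are the figures cited above, and the variable names are mine:

```python
# Back-of-the-envelope check of the wealth-effect figures: consumer
# spending falls about 9 cents per dollar of lost home equity, applied
# to a $2.1 trillion decline in residential values.
mpc_housing = 0.09           # spending change per $1 change in home equity
equity_decline = 2.1e12      # decline in U.S. residential values
spending_drop = mpc_housing * equity_decline
print(f"${spending_drop / 1e9:.0f} billion")  # $189 billion
```

The product comes to about $189 billion, in the neighborhood of the $200 billion cited once rounded.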
Alan Greenspan recently stated, “After more than a half-century observing numerous price bubbles evolve and deflate, I have reluctantly concluded that bubbles cannot be safely defused by monetary policy before the speculative fever breaks on its own.”
James Grant, long-time editor of Grant's Interest Rate Observer, neatly summarized Greenspan's current view: “The enlightened central banker will let speculation take its course. Following the inevitable blow-up, he will clean up the mess with low interest rates and lots of freshly printed dollar bills—thereby gassing up a new bubble.”
That was the lesson from the savings and loan crisis. Under Greenspan, the Fed became a kind of first responder to financial distress following the 1987 stock market crash, the Mexican peso crisis in 1994-95, and the Long-Term Capital Management crisis of 1998. Following the tech-stock bubble in 2000, Greenspan steadily brought interest rates down to 1 percent in June 2003 and kept them there until mid-2004. Many economists now blame those low interest rates for contributing to the housing bubble that burst in 2007.
To be continued.
Saturday, August 09, 2008
Mortgage Crisis, the Dollar and its Future, Part 1
The popular definition of “inflation” is a general increase in the level of prices. But what causes the price level to rise? It is an increase in the money supply without a corresponding increase in goods and services; there is more money with which to bid up the prices of available goods and services. Inflation used to mean an increase in the money supply without an increase in physical assets, namely gold or silver. Higher prices are the result. Replacing the traditional meaning of inflation with the popular one, which refers to the effect rather than the cause, has obscured the fact that government is the cause since it controls the money supply. “Inflation,” writes economist Kelley L. Ross, Ph.D., “does not occur because of a 'wage-price spiral,' an 'overheated' economy, excessive economic growth, or through any other natural mechanism of the market. A government debasing the currency would not have fooled anyone a century ago. Now, through deception, a government can try to blame inflation on anything but its own irresponsible actions.”
The money supply can be increased by simply printing more paper currency—unbacked by gold or silver—or by increasing bank credit, which is the method used in the U.S. and other developed countries today.
Every period of “easy money”—loose credit—is inevitably followed by a correction that wrings the excess credit out of the system. The result is the familiar “boom-and-bust” cycle in the economy. It is commonly called the “business cycle,” but it is basically a monetary cycle. The initial economic stimulus of excess monetary credit is followed by an offsetting loss of value in the currency and an economic slowdown as markets readjust from the credit distortions. While some people may gain from inflated prices, everyone else—particularly the common people—lose because of the depreciating value of the currency. It is reminiscent of an old Russian proverb: “The shortage will be divided among the peasants.”
The central bank, in our case the Federal Reserve, attempts to fine-tune the economy by tightening or loosening credit in order to control inflation and prevent the economy from sliding into recession or depression. This is a tricky task because of external factors over which the Fed has no control and an unpredictable time lag between Fed actions and their consequences. Nobel Prize-winning economist Milton Friedman says this time lag may range from 3 to 18 months, a range so large that Fed timing is difficult. As a result, the Fed is always subject to criticism that it acted too soon or not soon enough, or that its measures were too strong or not strong enough at a particular time. The difficulty of timing Fed actions led Friedman to declare that the Fed shouldn't try to fine-tune the economy at all. He said this was more disruptive of economic growth than a fixed policy. He proposed that a steady but moderate growth of the money supply would be a major contribution to the avoidance of either inflation or deflation. “I’ve always been in favor,” Friedman said, “of replacing the Fed with a laptop computer, to calculate the monetary base and expand it annually, through war, peace, feast and famine, by a predictable 2%.”
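Friedman's proposal really is simple enough to fit on a laptop. The sketch below assumes a hypothetical starting base of $800 billion purely for illustration; the fixed 2% rate is the figure from his quote:

```python
# A minimal sketch of Friedman's fixed-rule proposal: expand the
# monetary base by a predictable 2% every year, with no discretion.
# The $800 billion starting base is hypothetical, for illustration only.
def k_percent_rule(base, rate=0.02, years=10):
    """Monetary base after compounding at a fixed annual growth rate."""
    return base * (1 + rate) ** years

# After a decade of 2% growth, $800 billion becomes about $975 billion.
print(round(k_percent_rule(800e9) / 1e9, 1))  # 975.2
```

The point of the rule is precisely that nothing in the calculation depends on current conditions: war, peace, feast, or famine, the answer is the same.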
Now, interestingly, the world's gold supply has typically increased 1.5 to 3 percent annually, which is right in line with Friedman's recommended figure. So, why do we need a gold standard? Why not just increase the money supply steadily by the same fixed amount without tying it to gold? Because gold never becomes worthless; paper currencies can and do. The supply of gold never decreases. And only on very rare occasions, such as the major discoveries of gold in California in the 1850s and in South Africa and Australia in the 1890s, has it increased annually by over 4 percent. Those increases were very modest compared to the price increases caused by governments inflating the money supply. Moreover, while increased gold production did push up prices, this was because of the increase in material value, not arbitrary paper value. The world really was richer. On the other hand, there has never been a paper money unredeemable in a material asset that did not eventually become worthless. Obviously, therefore, no government can be trusted to increase the supply of an unredeemable money at a fixed rate. Sooner or later, political expediency combined with monetary ignorance and shortsightedness—not to mention “good intentions”—will result in the first small steps down the inflationary road. The first few steps will seem harmless enough, and so the process will be repeated. And broadened. More and more “good intentions” will be found. And they will be more and more expensive.
Of course governments do not want a fixed monetary policy. The Fed board of governors does not want to be replaced by a laptop computer. Nor do politicians want to give up the power to be expedient and irresponsible with other people's money—all in the name of good intentions, of course. They have a vested interest in inflation. They do not want a system that would restrain the lavish spending that buys voter support for their reelections. They do not want to give up playing god with the economy and the populace. Their good intentions for both can be financed in only two ways: 1) by taking money away from the people (taxation), or 2) by taking value away from the money (inflation). Taxation is not sufficient; there is no way the voters would accept taxes high enough to equal what they lose through inflation that finances the politicians' schemes.
The gold standard produced remarkable price stability. The Bank of England, founded in 1694 as a private company (nationalized in 1946), acted responsibly in issuing paper money. Its banknotes were “as good as gold” and led to Great Britain adopting the gold standard in 1816. Historical research by David Ranson and Penney Russell shows the stabilizing effect of this monetary policy. Ranson holds four degrees, including an M.B.A. in finance and a Ph.D. in business economics, and taught at the University of Chicago Graduate School of Business; Russell, a mathematician, is executive vice-president of H.C. Wainwright & Co. Economics, of which Ranson is president and director of research. Their research shows prices were lower in Great Britain at the beginning of World War II than in 1800. In the U.S., cumulative consumer-price inflation from 1820 to 1913, when the Federal Reserve Act was passed, was zero. According to the inflation calculator on the U.S. Bureau of Labor Statistics website, the dollar has lost more than 95% of its purchasing power since 1913.
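The BLS figure can be translated into an average annual rate with a little compounding arithmetic. The sketch assumes a 95% loss of purchasing power over the 95 years from 1913 to 2008; only the formula is added:

```python
# Convert a cumulative 95% loss of purchasing power (1913-2008) into an
# implied average annual inflation rate. The 95% loss is the BLS figure
# cited above; the year span and the arithmetic are the only additions.
years = 2008 - 1913            # 95 years under the Federal Reserve
remaining_value = 0.05         # the dollar keeps ~5% of its 1913 value
avg_inflation = (1 / remaining_value) ** (1 / years) - 1
print(f"{avg_inflation:.1%}")  # 3.2%
```

An average of about 3.2% a year sounds mild, but compounded over nine decades it is what erased 95% of the dollar's value.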
In the 1920s (and even prior to 1920), the Fed rapidly expanded credit. This produced an enormous boom in the economy and a growing wave of optimism about continued prosperity. The result was a bubble in prices, most notoriously in the stock market. In the 1920s, stocks could be bought on margin for only 10 percent, the remainder being on credit from the brokerage houses. By 1926, they could be bought on 5% margin. In September 1922, brokers' loans totaled $1.7 billion; by December 1926, they were $4.4 billion. And by September 1929, they were $8.5 billion.
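The fragility of those margin terms is easy to see with a hypothetical position (the numbers below are illustrative, not drawn from the historical figures above):

```python
# Why 10% margin is so fragile: the buyer puts up a tenth of the
# position and borrows the rest, so a mere 10% price decline wipes
# out the buyer's entire equity. Hypothetical numbers.
position = 10_000              # market value of stock purchased
margin = 0.10                  # fraction the buyer pays in cash
equity = position * margin     # $1,000 of the buyer's own money
loan = position - equity       # $9,000 borrowed from the broker

new_value = position * (1 - 0.10)   # the stock falls 10%
remaining_equity = new_value - loan
print(remaining_equity)        # 0.0 -- the buyer is wiped out
```

At 5% margin the same arithmetic means a 5% dip does the job, which helps explain why brokers' loans grew so explosively before the crash.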
The credit bubble of the 1920s was also evident in real estate. Henry Hoagland, a Federal Home Loan Bank board member, later wrote: “After a prolonged period of insufficient home construction during the World War, a tremendous surge of residential building in the decade of the twenties turned villages into cities and added tremendous acreage to our urban centers...[There was] a demand for modern homes greater than had ever been experienced before. This demand was matched by an ever-increasing supply of homes on easy terms.
“The easy-terms plan has a catch in it. It usually accompanies high prices and small ownership equities, giving superficial covering to a mountain of debt. When the crash came in 1929, a large proportion of home owners had but a thin equity in their homes. Only a small decline in prices was necessary to wipe out this equity. Unfortunately, deflationary processes are never satisfied with small declines in values. They feed upon themselves and produce results all out of proportion to their causes.” Sound familiar?
After the crash of 1929, stock market margins were never that low again. Since 1974, the margin rate has been 50%. But we should not be surprised by the recent price bubble in residential real estate. For several years it was possible to buy a home for as little as 5 percent down, then 3 percent, and finally in many instances with no money down at all. According to a survey of first-time buyers by the National Association of Realtors in late 2004 and early 2005, a stunning 43% had put no money down.
The Great Depression led to greater involvement by the government in the economy as it tried to alleviate problems its monetary policies had caused. As a remedy for the disaster in the housing market, the government created the Home Loan Bank system in 1932, patterned after the Federal Reserve system, with 12 regional banks. That proved insufficient. Banks were still failing, and people were still losing their homes through foreclosures. So the government decided another agency was needed to make more credit available on easier terms. The result, in 1934, was the Federal Housing Administration (FHA), which originally required a 20% down payment. Then more agencies were added to make even more mortgage credit available. In 1938 the Federal National Mortgage Association (Fannie Mae) was created. Freddie Mac (Federal Home Loan Mortgage Corp.) was created in 1970 to supplement Fannie Mae's role.
FHA, Fannie Mae, and Freddie Mac met with public approval but planted the seeds of future problems. Vernon L. Smith, a Nobel Prize-winning economist and professor of law and economics at George Mason University, says the government “set the stage for housing bubbles by creating those implicitly taxpayer-backed agencies, Fannie Mae and Freddie Mac, as lenders of last resort.”
The money supply can be increased by simply printing more paper currency—unbacked by gold or silver—or by increasing bank credit, which is the method used in the U.S. and other developed countries today.
Every period of “easy money”—loose credit—is inevitably followed by a correction that wrings the excess credit out of the system. The result is the familiar “boom-and-bust” cycle in the economy. It is commonly called the “business cycle,” but it is basically a monetary cycle. The initial economic stimulus of excess monetary credit is followed by an offsetting loss of value in the currency and an economic slowdown as markets readjust from the credit distortions. While some people may gain from inflated prices, everyone else—particularly the common people—loses because of the depreciating value of the currency. It is reminiscent of an old Russian proverb: “The shortage will be divided among the peasants.”
The central bank, in our case the Federal Reserve, attempts to fine-tune the economy by tightening or loosening credit in order to control inflation and prevent the economy from sliding into recession or depression. This is a tricky task because of external factors over which the Fed has no control and an unpredictable time lag between Fed actions and their consequences. Nobel Prize-winning economist Milton Friedman said this time lag may range from 3 to 18 months, a range so large that Fed timing is difficult. As a result, the Fed is always subject to criticism that it acted too soon or not soon enough, or that its measures were too strong or not strong enough at a particular time. The difficulty of timing Fed actions led Friedman to declare that the Fed shouldn't try to fine-tune the economy at all. He said this was more disruptive of economic growth than a fixed policy. He proposed that a steady but moderate growth of the money supply would be a major contribution to the avoidance of either inflation or deflation. “I’ve always been in favor,” Friedman said, “of replacing the Fed with a laptop computer, to calculate the monetary base and expand it annually, through war, peace, feast and famine, by a predictable 2%.”
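Friedman's proposed rule is simple enough to express in a few lines of code. The sketch below assumes a purely illustrative starting monetary base; only the 2 percent growth figure comes from his quotation above:

```python
# A minimal sketch of Friedman's fixed-growth rule: expand the
# monetary base by a predictable 2% every year, in war, peace,
# feast, and famine alike. The starting figure is illustrative only.
def monetary_base(initial, years, growth_rate=0.02):
    """Monetary base after `years` of fixed annual growth."""
    return initial * (1 + growth_rate) ** years

base = 600.0  # hypothetical starting base, in billions of dollars
for horizon in (10, 25, 50):
    print(f"after {horizon} years: ${monetary_base(base, horizon):,.1f}B")
```

The point of the rule is precisely that there is nothing else to compute: no discretion, no timing problem, no lag to misjudge.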
Now, interestingly, the world's gold supply has typically increased 1.5 to 3 percent annually, which is right in line with Friedman's recommended figure. So, why do we need a gold standard? Why not just increase the money supply steadily by the same fixed amount without tying it to gold? Because gold never becomes worthless; paper currencies can and do. The supply of gold never decreases. And only on very rare occasions, such as the major discoveries of gold in California in the 1850s and in South Africa and Australia in the 1890s, has it increased annually by over 4 percent. Those increases were very modest compared to the price increases caused by governments inflating the money supply. Moreover, while increased gold production did push up prices, this was because of the increase in material value, not arbitrary paper value. The world really was richer. On the other hand, there has never been a paper money unredeemable in a material asset that did not eventually become worthless. Obviously, therefore, no government can be trusted to increase the supply of an unredeemable money at a fixed rate. Sooner or later, political expediency combined with monetary ignorance and shortsightedness—not to mention “good intentions”—will result in the first small steps down the inflationary road. The first few steps will seem harmless enough, and so the process will be repeated. And broadened. More and more “good intentions” will be found. And they will be more and more expensive.
Of course governments do not want a fixed monetary policy. The Fed board of governors does not want to be replaced by a laptop computer. Nor do politicians want to give up the power to be expedient and irresponsible with other people's money—all in the name of good intentions, of course. They have a vested interest in inflation. They do not want a system that would restrain the lavish spending that buys voter support for their reelections. They do not want to give up playing god with the economy and the populace. Their good intentions for both can be financed in only two ways: 1) by taking money away from the people (taxation), or 2) by taking value away from the money (inflation). Taxation is not sufficient; there is no way the voters would accept taxes high enough to equal what they lose through inflation that finances the politicians' schemes.
The gold standard produced remarkable price stability. The Bank of England, founded in 1694 as a private company (nationalized in 1946), acted responsibly in issuing paper money. Its banknotes were “as good as gold” and led to Great Britain adopting the gold standard in 1816. Historical research by David Ranson and Penney Russell shows the stabilizing effect of this monetary policy. Ranson holds four degrees, including an M.B.A. in finance and a Ph.D. in business economics, and taught at the University of Chicago Graduate School of Business; Russell, a mathematician, is executive vice-president of H.C. Wainwright & Co. Economics, of which Ranson is president and director of research. Their research shows prices were lower in Great Britain at the beginning of World War II than in 1800. In the U.S., cumulative consumer-price inflation from 1820 to 1913, when the Federal Reserve Act was passed, was zero. According to the inflation calculator on the U.S. Bureau of Labor Statistics website, the dollar has lost more than 95% of its purchasing power since 1913.
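The BLS claim can be checked with one line of arithmetic. The index values below are approximate annual-average CPI-U figures assumed here for illustration; the official calculator uses the exact series:

```python
# Purchasing power varies inversely with the price level, so a dollar
# today buys what (CPI then / CPI now) of a dollar bought in 1913.
cpi_1913 = 9.9    # approximate annual-average CPI-U for 1913 (assumed)
cpi_2008 = 215.3  # approximate annual-average CPI-U for 2008 (assumed)

remaining = cpi_1913 / cpi_2008   # fraction of 1913 purchasing power left
loss = 1 - remaining
print(f"purchasing power lost since 1913: {loss:.1%}")
```

With these figures the loss comes out a little over 95 percent, consistent with the calculator's result.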
In the 1920s (and even prior to 1920), the Fed rapidly expanded credit. This produced an enormous boom in the economy and a growing wave of optimism about continued prosperity. The result was a bubble in prices, most notoriously in the stock market. In the 1920s, stocks could be bought on margin for only 10 percent, the remainder being on credit from the brokerage houses. By 1926, they could be bought on 5% margin. In September 1922, brokers' loans totaled $1.7 billion; by December 1926, they were $4.4 billion. And by September 1929, they were $8.5 billion.
The credit bubble of the 1920s was also evident in real estate. Henry Hoagland, a Federal Home Loan Bank board member, later wrote: “After a prolonged period of insufficient home construction during the World War, a tremendous surge of residential building in the decade of the twenties turned villages into cities and added tremendous acreage to our urban centers...[There was] a demand for modern homes greater than had ever been experienced before. This demand was matched by an ever-increasing supply of homes on easy terms.
“The easy-terms plan has a catch in it. It usually accompanies high prices and small ownership equities, giving superficial covering to a mountain of debt. When the crash came in 1929, a large proportion of home owners had but a thin equity in their homes. Only a small decline in prices was necessary to wipe out this equity. Unfortunately, deflationary processes are never satisfied with small declines in values. They feed upon themselves and produce results all out of proportion to their causes.” Sound familiar?
After the crash of 1929, stock market margins were never that low again. Since 1974, the margin rate has been 50%. But we should not be surprised by the recent price bubble in residential real estate. For several years it was possible to buy a home for as little as 5 percent down, then 3 percent, and finally in many instances with no money down at all. According to a survey of first-time buyers by the National Association of Realtors in late 2004 and early 2005, a stunning 43% had put no money down.
The Great Depression led to greater involvement by the government in the economy as it tried to alleviate problems its monetary policies had caused. As a remedy for the disaster in the housing market, the government created the Home Loan Bank system in 1932 patterned after the Federal Reserve system, with 12 regional banks. That proved insufficient. Banks were still failing, and people were still losing their homes through foreclosures. So the government decided another agency was needed to make more credit available on easier terms. The result, in 1934, was the Federal Housing Administration (FHA), which originally required 20% down payment. Then more agencies were added to make even more mortgage credit available. In 1938 the Federal National Mortgage Association (Fannie Mae) was created. Freddie Mac (Federal Home Loan Mortgage Corp.) was created in 1970 to supplement Fannie Mae's role.
FHA, Fannie Mae, and Freddie Mac met with public approval but planted the seeds of future problems. Vernon L. Smith, a Nobel Prize-winning economist and professor of law and economics at George Mason University, says the government “set the stage for housing bubbles by creating those implicitly taxpayer-backed agencies, Fannie Mae and Freddie Mac, as lenders of last resort.”
Friday, May 30, 2008
An American Lesson in Money—Inflation
America's first experience with inflation occurred during our Revolutionary period. Lawrence W. Reed has written an excellent article on this in the March 2008 issue of The Freeman. An abridged version follows, reprinted with permission.
For six years—from 1775 until 1781—representatives from the 13 colonies (states after July 4, 1776) met and legislated as the Second Continental Congress. They were America's de facto central government during most of the Revolutionary War and included some of the greatest minds and admirable patriots of the day. They included Thomas Jefferson, Benjamin Franklin, John and Sam Adams, Alexander Hamilton, Patrick Henry, John Jay, James Madison, and Benjamin Rush. The Second Continental Congress produced and ratified the Declaration of Independence and the country's first written constitution, the Articles of Confederation. It also ruined the currency and very nearly the fledgling nation in the process, proving that even the best of men with the noblest of intentions sometimes must learn economics the hard way.
Governments derive their revenues primarily from one, two, or all three of these sources: taxation, borrowing, and inflation. Americans were deemed to be in no mood to replace London's taxes with local ones, so the Second Continental Congress opted not to. It borrowed considerable sums by issuing bills of credit, but with few moneyed interests willing to risk their capital to take on the British Empire, the expenses of war could hardly be covered that way. What the Congress chose as its principal fundraising method is revealed by this statement of a delegate during the financing debate: “Do you think, gentlemen, that I will consent to load my constituents with taxes when we can send to our printer and get a wagon-load of money, one quire of which will pay for the whole?”
Reports of the deliberations that led to the printing of paper money are sketchy, but indications are that support for it was probably not universal. John Adams, for instance, was a known opponent. He once referred to the idea as “theft” and “ruinous.” Nonetheless, he and Ben Franklin were among five committee members appointed to engrave the plates, procure the paper, and arrange for the first printing of Continental dollars in July 1775. Many delegates were convinced that issuing unbacked paper would somehow bind the colonies together in the common cause against Britain.
In any event, not even the skeptics foresaw the bottom of the slippery slope that began with the first $2 million printed on July 21. Just four days later, $1 million more was authorized. Franklin actually wanted to stop the process with the initial issue and opposed the second batch, but the temptation proved too alluring. By the end of 1775 another $3 million in notes were printed. Then $4 million more in February 1776, followed by $5 million more just five months later and another $10 million before the year was out.
In the marketplace the paper notes fell in value even before independence was declared. The consequences of paper inflation at the hands of American patriots were no different from what they ever were (or still are) when rampant expansion of the money supply is conducted by rogues or dictators: prices rise, savings evaporate, and governments resort to draconian measures to stymie the effects of their own folly.
Americans increasingly refused to accept payment in the Continental dollar. To keep the depreciating notes in circulation, Congress and the states enacted legal-tender laws, measures that are hardly necessary if people have confidence in the money. Though he used the power sparingly, George Washington was vested by Congress with the authority to seize whatever provisions the army needed and imprison merchants and farmers who wouldn't sell goods for Continentals. At harvest time in 1777, with winter approaching and the army in desperate need of supplies, even farmers who supported independence preferred to sell food to the redcoats because they paid in real money—gold and silver.
Congress cranked out another $13 million paper dollars in 1777. With prices soaring, the Pennsylvania legislature compounded the effects of bad policy: it imposed price controls on precisely those commodities required by the army. Washington's 11,000 men at Valley Forge froze and starved while not far away the British army spent the winter in relative comfort, subsisting on the year's ample local crops.
Congress recognized the mistake on June 4, 1778, when it adopted a resolution urging the states to repeal all price controls. But the printing presses rolled on, belching out 63 million more paper Continentals in 1778 and 90 million more in 1779. By 1780 the stuff was virtually worthless, giving rise to a phrase familiar to Americans for generations: “not worth a Continental.”
A currency reform in 1780 asked everyone to turn in the old money for a new one at the ratio of 20 to 1. Congress offered to redeem the paper in gold in 1786, but this didn't wash with a citizenry already burned by paper promises. The new currency plummeted in value until Congress was forced to get honest. By 1781 it abandoned its legal-tender laws and started paying for supplies in whatever gold and silver it could muster from the states or convince a friend (like France) to lend it. Not by coincidence, supplies and morale improved, which helped to bring the war to a successful end just two years later.
The early years of our War for Independence were truly, as Tom Paine wrote, “times that tr[ied] men's souls” and not just because of Mother Nature [i.e., harsh weather] and the British troops. Pelatiah Webster, America's first economist, summed up our own errors rather well when he wrote: “The people of the states had been...put out of humor by so many tender acts, limitations of prices, and other compulsory methods to force value into paper money...and by so many vain funding schemes, declarations and promises, all of which issued from Congress but died under the most zealous efforts to put them into operation and effect.”
History texts often bestow great credit on the men of the Second Continental Congress for winning American independence. A case can also be made, however, that we won it in spite of them.
Sunday, April 20, 2008
Death by Smoking Ban
In order to get smoking bans passed, it was necessary to create an atmosphere of hatred toward the “enemy,” to work people into a frenzy over a threat to their health, whether the threat was real or not. What mattered was not truth or science but whether the desired result—smoking bans—could be achieved. So truth and science were quickly sacrificed to the end-justifies-the-means policy of anti-smoking organizations. Michael Siegel, MD, is a medical doctor and public health official with 21 years of experience in tobacco policy research; he currently teaches at the Boston University School of Public Health. Though adamantly opposed to smoking, he says: “The anti-smoking movement is driven by an agenda—an agenda that will not allow science, sound policy analysis, the law, or ethics to get in its way.”
Dr. Siegel has cited over a hundred anti-smoking groups—including the American Cancer Society, the American Lung Association, and the American Heart Association—for misleading the public with fallacious scientific claims. His website, www.tobaccoanalysis.blogspot.com, details an astonishing array of scientific misrepresentations, outright lies, and hypocrisy by anti-smoking groups. These tactics have proven effective, even as they have become ever more shrill and absurd.
Recently Dr. Siegel ran a Most Ridiculous Secondhand Smoke Claim Tournament. The national championship was won by the St. Louis University Tobacco Prevention Center. Its winning entry introduced the scare of radioactivity from secondhand smoke by claiming that it contains plutonium 210, which does not exist anywhere in the known universe. The St. Louis group previously had claimed secondhand smoke contained asbestos. When that was debunked, it issued a correction substituting plutonium 210 for asbestos. The American Cancer Society managed to make the Final Four in this liars' tournament with this entry: “Immediate effects of secondhand smoke include cardiovascular problems such as damage to cell walls in the circulatory system, thickening of the blood and arteries, and arteriosclerosis (hardening of the arteries) or heart disease, increasing the chance of heart attack or stroke.” Ridiculous though that statement is, it failed to top the entry of the St. Louis University Tobacco Prevention Center, and ACS was eliminated from the competition.
The U.S. Surgeon General's Office also figured in the contest with: “Even brief exposure to secondhand smoke has immediate adverse effects on the cardiovascular system and increases risk for heart disease and lung cancer.” But it went down to defeat against Action on Smoking and Health, which came up with this whopper: “Even for people without such respiratory conditions, breathing drifting tobacco smoke for even brief periods can be deadly. For example, the Centers for Disease Controls [CDC] has warned that breathing drifting tobacco smoke for as little as 30 minutes (less than the time one might be exposed outdoors on a beach, sitting on a park bench, listening to a concert in a park, etc.) can raise a nonsmoker’s risk of suffering a fatal heart attack to that of a smoker.”(!)
That such monumental lies have been instrumental in the passage of smoking bans is a measure of the gullibility and scientific illiteracy of the general public and elected officials. Of course, it is also a demonstration of the dishonesty of the smoking ban activists and the absence of genuine evidence for their cause. As the independent health consultants Littlewood & Fennel testified in their report to the National Toxicology Program's Board of Scientific Counselors, the anti-smoking movement is driven by “avowed anti-smoking advocates determined to somehow prove that ETS [environmental tobacco smoke] is a human carcinogen in the face of irrefutable evidence to the contrary.”
The constant repetition of phony claims about health hazards of secondhand smoke, carried out by a well-financed campaign, has obscured the many studies debunking these claims. For example, the Congressional Research Service concluded: “It is possible that very few or even no deaths can be attributed to ETS [environmental tobacco smoke].” Further, it stated that nonsmokers exposed to pack-a-day ETS every day for 40 years have “little or no risk of developing lung cancer”—much less dying from it. The CRS is part of the Library of Congress and has all the resources of that esteemed institution at its disposal. It is highly respected, nonpartisan, accepted by both Republicans and Democrats as fair and impartial, has no ties to tobacco companies, no regulatory or other agenda, and accepts no outside funding.
Then there was the Congressional Investigation by the U.S. House of Representatives of EPA's report on secondhand smoke. It found EPA guilty of “conscious misuse of science and the scientific process to achieve a political agenda that could not otherwise be justified.”
The American Cancer Society has sponsored at least four studies over the years, all of which failed to find any statistically significant health risk from secondhand smoke, according to the standard cited by its own director of analytic epidemiology. But that hasn't kept the ACS from claiming secondhand smoke is dangerous. The most powerful statistical study ever done on the subject was the Enstrom-Kabat study. It covered 100,000 people for 38 years. The ACS financed it, helped set it up, and provided data for it until preliminary results indicated the opposite of what the ACS wanted. It then withdrew its financial support and denounced the study, which was eventually published in the British Medical Journal, one of the world's foremost medical journals. The study concluded: “The results do not support a causal relation between environmental tobacco smoke and tobacco related mortality.”
Statistically, the risk of secondhand smoke is far smaller than the risk of getting lung cancer from drinking pasteurized milk. Epidemiologists use “relative risk” (RR or Risk Ratio) as a means for gauging the severity of risk. The U.S. Surgeon General has stated the RR for secondhand smoke is between 1.20 and 1.30. The risk for lung cancer from drinking pasteurized milk is 2.14. And the relative risk for getting cancer from drinking the municipal tap water that tens of millions of Americans drink every day in thousands of cities across the U.S. is 2.0 to 4.0. But where are all the dead bodies from the millions of people exposed to this far higher risk? Do you know of any? So how can secondhand smoke, which has a far lower relative risk, be killing thousands of people as claimed? In 2001 the International Agency for Research on Cancer, in Lyon, France, reported: “ETS exposure during childhood is not associated with an increased risk of lung cancer. No clear dose-response relationship could be demonstrated for cumulative spousal ETS exposure.... Even exposure to ETS from other sources was not associated with lung cancer risk.”
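For readers unfamiliar with the term, relative risk is just the ratio of disease incidence in an exposed group to incidence in an unexposed group. A minimal sketch, using cohort sizes and case counts invented purely for illustration:

```python
# Relative risk (RR): incidence among the exposed divided by
# incidence among the unexposed. All counts here are hypothetical.
def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    exposed_rate = exposed_cases / exposed_total
    unexposed_rate = unexposed_cases / unexposed_total
    return exposed_rate / unexposed_rate

# Hypothetical cohort: 12 cases per 10,000 exposed people
# versus 10 cases per 10,000 unexposed people.
rr = relative_risk(12, 10_000, 10, 10_000)
print(f"RR = {rr:.2f}")  # prints "RR = 1.20"
```

An RR of 1.20 means a 20 percent higher rate in the exposed group, which is why epidemiologists commonly treat ratios below about 2.0 as too weak to distinguish from confounding and bias.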
While secondhand smoke has not been shown to represent a statistically significant health risk, deaths continue to mount from smoking bans. In a recent article in the Journal of Public Economics, researchers set forth evidence that smokers are driving further to where they can smoke, resulting in more fatal accidents involving alcohol. This could be due to driving longer distances to where smoking is permitted outdoors or where enforcement is unlikely, as well as driving across borders to where smoking in bars is legal. The study covered 120 counties, including 20 that banned smoking. It found that alcohol-related fatal car accidents increased 13%. For a typical county of 680,000 people, this is equivalent to about six deaths. And this pattern did not diminish over time. Where smoking bans had been in place longer than 18 months, the fatal accident rate increased 19%. The trend is especially apparent where border-hopping to smoky bars is possible—indicating very strongly the effect of smoking bans on the accident rate. Fatal accidents in Delaware County, Pennsylvania, increased 26% after the adjacent state of Delaware went smoke free. And when Boulder County, Colorado, went smoke free, fatal accidents in adjacent Jefferson County went up by 40%.
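The reported figures imply a baseline that can be backed out with simple arithmetic. The 13 and 19 percent rates and the six-death figure are from the study as summarized above; the implied baseline count is my own back-of-the-envelope inference:

```python
# If a 13% rise corresponds to about six extra deaths in a typical
# county, the implied baseline of alcohol-related fatalities is:
extra_deaths = 6
short_run_increase = 0.13
baseline = extra_deaths / short_run_increase
print(f"implied baseline: ~{baseline:.0f} deaths per county per year")

# At the 19% rate seen where bans had lasted over 18 months,
# that same baseline implies a larger annual excess:
long_run_increase = 0.19
print(f"longer-run excess: ~{baseline * long_run_increase:.0f} extra deaths per year")
```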
There is also another category of deaths from smoking bans. The well-financed campaign of ever more virulent and fraudulent claims of ETS health dangers has spawned a level of hatred that has produced violence and death. We hear reports of deaths of a kind we never heard before the smoking ban campaigns. In Minneapolis, where I live, the Star Tribune carried an article headlined: “Man Charged with Severing Wife's Tongue and Windpipe.” It states the man slashed her throat because she smoked a cigarette to celebrate her birthday. She was in critical condition, and he was charged with attempted murder. We never used to see stories like that, but here are some more:
Utah: A teenager was murdered for smoking in downtown Salt Lake City.
Ohio: Man Beaten To Death For Not Giving Up Cigarette. Ricardo Leon, 23, died.
UK: Nurse stabbed to death at hospital in an outside smoking area.
UK: Man killed wife and two sons over her smoking. John Jarvis, 42, stabbed his wife Patricia in the heart and then murdered their sons, John, 11, and Stuart, eight.
Louisiana: Pregnant woman shot over cigarette. 18-year-old refused to stop smoking.
California: A 21-year-old woman was stabbed several times early Saturday outside a Carlsbad home when she went outside to smoke a cigarette, police said.
California: Smoker Gunned Down. A gunman fatally shot a man outside a sports bar in unincorporated Hayward as the man took a cigarette break, authorities said Friday.
Illinois: Smoker Falls To Death. Ian Honeycutt, 28, of Glenview, tumbled from a ninth-floor apartment, blown off a window sill by a gust of wind while smoking. His aunt asked him not to smoke inside, police sources said.
Ireland: Eamonn Mulvenna, a 20-year-old, died when he fell from a fire escape being used as a smoking area because of the ban.
Canada: A 65-year-old smoker dies out in the cold.
Alabama: Smoker Attacked. He was standing in a parking lot, smoking a cigarette when he was attacked.
New York: 60 year old man beaten unconscious for smoking.
Florida: Father Stabs Son Over Cigarette.
New Zealand: Abduction And Rape Of Smoking Woman. The incident proved people would be more vulnerable if they had to go outside and smoke, something Prime Minister Helen Clark had not thought of, he said.
Ireland: Three men had jaws broken as they smoked outside pubs in Sligo, Kilkenny and Dublin.
Colorado: Bar Owner Blames Smoking Ban For Rape. A Pueblo bar owner says the smoking ban that forced his female employee outside is directly responsible for her rape.
Texas: Date Rape Pill Put in Drink, While Going Outside for Cigarette. Maria says she and two other friends stepped outside to smoke a cigarette. She says it was during that time that someone spiked her drink.
UK: A female backpacker fell 100 feet to her death from the roof of a hostel early yesterday. The 20-year-old Canadian plunged six storeys into a lane at around 3am. One theory is that she climbed onto the flat roof of the no-smoking Edinburgh Backpackers hostel for a cigarette.
Colorado: Courtney Chinn, 25, of Colorado Springs was shot and killed in an area near the Anchor Lounge where smokers congregate on September 20, 2003. [It is said] the problem of crime outside of bars where smokers gather will persist.
Africa: Baby sister killed in brothers' anti smoking crossfire. 3 Year Old Girl Dies In Smoking Ban War.
UK: Boy smoker hanged himself. A 12-year-old boy hanged himself with his school tie rather than admit to his parents that he had been caught smoking.
Wisconsin: Girl kills herself after being caught with a cigarette.
Massachusetts: Melissa Pierce and Angela Aiello, after leaving the bar to smoke, were struck in the heads with a metal pipe. Richard Jervah of Lynn was pushed through a plate glass window. Arthur Brestovitsky was stabbed in the chest, face, and arm.
The above examples are from http://encyclopedia.smokersclub.com:80/4.html, which contains over a hundred examples of such violence. We didn't hear stories of these kinds of violence before the smoking ban activists started fomenting hatred with their jihad (holy war) against smoking. It's time for them to admit their lies result in killing far more people than secondhand smoke does (if it kills any at all).
Once again, it is clear that, regardless of the good intentions of the jihadist do-gooders, lying and a policy of the end justifying the means simply do not work. Those tactics cannot make a safer world than truth, science and respect for individual rights--including property rights. Liberty is still the best answer--in fact, the only answer--to a better, safer, healthier society. But some people never learn; they keep trying to prove that force is better than freedom and individual rights. And their mistakes continue to be paid for with the blood and lives of innocent people.
I am a retired environmental consultant. I have never smoked, never owned or worked in a bar or restaurant, never owned stock in a tobacco company or ever received one penny from the tobacco industry or anyone else for my views on secondhand smoke. For more of my writings on this subject, see http://www.amlibpub.com/liberty_blog/2006/07/surgeon-general-trades-integrity-for.html.
http://www.amlibpub.com/liberty_blog/2007/03/unfounded-scares-about-secondhand-smoke.html
http://www.amlibpub.com/liberty_blog/2005/07/secondhand-smoke-and-heart-disease.html,
http://www.amlibpub.com/liberty_blog/2005/07/smoking-ban-of-meeker-county-minnesota.html, (includes a link to my complete testimony at Meeker County.)
Dr. Siegel has cited over a hundred anti-smoking groups—including the American Cancer Society, the American Lung Association and the American Heart Association—for misleading the public with fallacious scientific claims. His website, www.tobaccoanalysis.blogspot.com, details an astonishing array of scientific misrepresentations, outright lies and hypocrisy by anti-smoking groups. These tactics have proven effective, even as they have become ever more shrill and absurd.
Recently Dr. Siegel ran a Most Ridiculous Secondhand Smoke Claim Tournament. The national championship was won by the St. Louis University Tobacco Prevention Center. Its winning entry introduced the scare of radioactivity from secondhand smoke by claiming it contains plutonium 210, which does not exist anywhere in the known universe. The St. Louis group had previously claimed secondhand smoke contained asbestos. When that was debunked, it issued a correction substituting plutonium 210 for asbestos. The American Cancer Society managed to make the Final Four in this liars' tournament with this entry: “Immediate effects of secondhand smoke include cardiovascular problems such as damage to cell walls in the circulatory system, thickening of the blood and arteries, and arteriosclerosis (hardening of the arteries) or heart disease, increasing the chance of heart attack or stroke.” Ridiculous though that statement is, it failed to top the entry of the St. Louis University Tobacco Prevention Center, and the ACS was eliminated from the competition.
The U.S. Surgeon General's Office also figured in the contest with: "Even brief exposure to secondhand smoke has immediate adverse effects on the cardiovascular system and increases risk for heart disease and lung cancer." But it went down to defeat against Action on Smoking and Health, which came up with this whopper: “Even for people without such respiratory conditions, breathing drifting tobacco smoke for even brief periods can be deadly. For example, the Centers for Disease Controls [CDC] has warned that breathing drifting tobacco smoke for as little as 30 minutes (less than the time one might be exposed outdoors on a beach, sitting on a park bench, listening to a concert in a park, etc.) can raise a nonsmoker’s risk of suffering a fatal heart attack to that of a smoker."(!)
That such monumental lies have been instrumental in the passage of smoking bans is a measure of the gullibility and scientific illiteracy of the general public and elected officials. Of course, it is also a demonstration of the dishonesty of the smoking ban activists and the absence of genuine evidence for their cause. As the independent health consultants Littlewood & Fennel testified in their report to the National Toxicology Program's Board of Scientific Counselors, the anti-smoking movement is driven by “avowed anti-smoking advocates determined to somehow prove that ETS [environmental tobacco smoke] is a human carcinogen in the face of irrefutable evidence to the contrary.”
The constant repetition of phony claims about health hazards of secondhand smoke, carried out by a well-financed campaign, has obscured the many studies debunking these claims. For example, the Congressional Research Service concluded: “It is possible that very few or even no deaths can be attributed to ETS [environmental tobacco smoke].” Further, it stated that nonsmokers exposed to pack-a-day ETS every day for 40 years have “little or no risk of developing lung cancer”—much less dying from it. The CRS is part of the Library of Congress and has all the resources of that esteemed institution at its disposal. It is highly respected, nonpartisan, accepted by both Republicans and Democrats as fair and impartial, has no ties to tobacco companies, no regulatory or other agenda, and accepts no outside funding.
Then there was the U.S. House of Representatives' congressional investigation of EPA's report on secondhand smoke. It found EPA guilty of “conscious misuse of science and the scientific process to achieve a political agenda that could not otherwise be justified.”
The American Cancer Society has sponsored at least four studies over the years, all of which failed to find any statistically significant health risk from secondhand smoke, according to the standard cited by its own director of analytic epidemiology. But that hasn't kept the ACS from claiming secondhand smoke is dangerous. The most powerful statistical study ever done on the subject was the Enstrom-Kabat study. It covered 100,000 people for 38 years. The ACS financed it, helped set it up, and provided data for it until preliminary results indicated the opposite of what the ACS wanted. It then withdrew its financial support and denounced the study, which was eventually published in the British Medical Journal, one of the world's foremost medical journals. The study concluded: “The results do not support a causal relation between environmental tobacco smoke and tobacco related mortality.”
Statistically, the risk of secondhand smoke is far smaller than the risk of getting lung cancer from drinking pasteurized milk. Epidemiologists use “relative risk” (RR or Risk Ratio) as a means for gauging the severity of risk. The U.S. Surgeon General has stated the RR for secondhand smoke is between 1.20 and 1.30. The risk for lung cancer from drinking pasteurized milk is 2.14. And the relative risk for getting cancer from drinking the municipal tap water that tens of millions of Americans drink every day in thousands of cities across the U.S. is 2.0 to 4.0. But where are all the dead bodies from the millions of people exposed to this far higher risk? Do you know of any? So how can secondhand smoke, which has a far lower relative risk, be killing thousands of people as claimed? In 2001 the International Agency for Research on Cancer, in Lyon, France, reported: “ETS exposure during childhood is not associated with an increased risk of lung cancer. No clear dose-response relationship could be demonstrated for cumulative spousal ETS exposure.... Even exposure to ETS from other sources was not associated with lung cancer risk.”
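For readers unfamiliar with the term, relative risk is just the disease rate in the exposed group divided by the rate in the unexposed group. A minimal sketch of that calculation, using made-up counts for illustration only (not data from any study cited here):

```python
# Relative risk (RR): the rate of an outcome among the exposed
# divided by the rate among the unexposed.
# All counts below are hypothetical, chosen only to illustrate
# what an RR of about 1.2 looks like.

def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    """Risk ratio for a simple 2x2 exposure/outcome table."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical example: 12 cases per 10,000 exposed vs 10 per 10,000 unexposed
rr = relative_risk(12, 10_000, 10, 10_000)
print(round(rr, 2))  # 1.2 -- the low end of the Surgeon General's ETS range
```

An RR of 1.0 means no difference between the groups, which is why epidemiologists generally treat values as low as 1.2–1.3 with caution.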
While secondhand smoke has not been shown to represent a statistically significant health risk, deaths continue to mount from smoking bans. In a recent article in the Journal of Public Economics, researchers set forth evidence that smokers are driving further to where they can smoke, resulting in more fatal accidents involving alcohol. This could be due to driving longer distances to where smoking is permitted outdoors or where enforcement is unlikely, as well as driving across borders to where smoking in bars is legal. The study covered 120 counties, including 20 that banned smoking. It found that alcohol-related fatal car accidents increased 13%. For a typical county of 680,000 people, this is equivalent to about six deaths. And this pattern did not diminish over time. Where smoking bans had been in place longer than 18 months, the fatal accident rate increased 19%. The trend is especially apparent where border-hopping to smoky bars is possible—indicating very strongly the effect of smoking bans on the accident rate. Fatal accidents in Delaware County in Pennsylvania increased 26% after the adjacent state of Delaware went smoke free. And when Boulder County, Colorado, went smoke free, fatal accidents in adjacent Jefferson County went up by 40%.
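The "about six deaths" figure can be sanity-checked against the 13% increase: if six additional deaths correspond to a 13% rise, the implied baseline is roughly 46 alcohol-related fatalities per county per year. That baseline is an inference from the two numbers the article reports, not a figure stated in the study itself:

```python
# Back out the baseline implied by the study's two summary figures.
# The baseline is inferred here for illustration; the article itself
# reports only the percentage increase and the "about six deaths" figure.

increase_fraction = 0.13   # 13% rise in alcohol-related fatal accidents
extra_deaths = 6           # "about six deaths" for a typical county

implied_baseline = extra_deaths / increase_fraction
print(round(implied_baseline, 1))  # 46.2 fatalities per county per year
```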
There is also another category of deaths from smoking bans. The well-financed campaign of ever more virulent and fraudulent claims of ETS health dangers has spawned a level of hatred that has produced violence and death. We hear reports of deaths of a kind we never heard before the smoking ban campaigns. In Minneapolis, where I live, the Star Tribune carried an article headlined: “Man Charged with Severing Wife's Tongue and Windpipe.” It states the man slashed her throat because she smoked a cigarette to celebrate her birthday. She was in critical condition, and he was charged with attempted murder. We never used to see stories like that, but here are some more:
Utah: A teenager was murdered for smoking in downtown Salt Lake City.
Ohio: Man Beaten To Death For Not Giving Up Cigarette. Ricardo Leon, 23, died.
UK: Nurse stabbed to death at hospital in an outside smoking area.
UK: Man killed his wife and two sons over her smoking. John Jarvis, 42, stabbed his wife Patricia in the heart and then murdered their sons, John, 11, and Stuart, eight.
Louisiana: Pregnant woman shot over cigarette. 18-year-old refused to stop smoking.
California: A 21-year-old woman was stabbed several times early Saturday outside a Carlsbad home when she went outside to smoke a cigarette, police said.
California: Smoker Gunned Down. A gunman fatally shot a man outside a sports bar in unincorporated Hayward as the man took a cigarette break, authorities said Friday.
Illinois: Smoker Falls To Death. Ian Honeycutt, 28, of Glenview, tumbled from a ninth-floor apartment, blown off a window sill by a gust of wind while smoking. His aunt asked him not to smoke inside, police sources said.
Ireland: Eamonn Mulvenna, a 20-year-old, died when he fell from a fire escape that was being used as a smoking area because of the ban.
Canada: A 65-year-old smoker died out in the cold.
Alabama: Smoker Attacked. He was standing in a parking lot smoking a cigarette when he was attacked.
New York: A 60-year-old man was beaten unconscious for smoking.
Florida: Father Stabs Son Over Cigarette.
New Zealand: Abduction And Rape Of Smoking Woman. The incident proved people would be more vulnerable if they had to go outside and smoke, something Prime Minister Helen Clark had not thought of, he said.
Ireland: Three men had jaws broken as they smoked outside pubs in Sligo, Kilkenny and Dublin.
Colorado: Bar Owner Blames Smoking Ban For Rape. A Pueblo bar owner says the smoking ban that forced his female employee outside is directly responsible for her rape.
Texas: Date Rape Pill Put in Drink, While Going Outside for Cigarette. Maria says she and two other friends stepped outside to smoke a cigarette. She says it was during that time that someone spiked her drink.
UK: A female backpacker fell 100 feet to her death from the roof of a hostel early yesterday. The 20-year-old Canadian plunged six storeys into a lane at around 3am. One theory is that she climbed onto the flat roof of the no-smoking Edinburgh Backpackers hostel for a cigarette.
Colorado: Courtney Chinn, 25, of Colorado Springs was shot and killed in an area near the Anchor Lounge where smokers congregate on September 20, 2003. [It is said] the problem of crime outside of bars where smokers gather will persist.
Africa: Baby sister killed in brothers' anti-smoking crossfire. 3-Year-Old Girl Dies In Smoking Ban War.
UK: Boy smoker hanged himself. A 12-year-old boy hanged himself with his school tie rather than admit to his parents that he had been caught smoking.
Wisconsin: Girl kills herself after being caught with a cigarette.
Massachusetts: Melissa Pierce and Angela Aiello, after leaving the bar to smoke, were struck in the heads with a metal pipe. Richard Jervah of Lynn was pushed through a plate glass window. Arthur Brestovitsky was stabbed in the chest, face, and arm.
The above examples are from http://encyclopedia.smokersclub.com:80/4.html, which contains over a hundred examples of such violence. We didn't hear stories of these kinds of violence before the smoking ban activists started fomenting hatred with their jihad (holy war) against smoking. It's time for them to admit their lies result in killing far more people than secondhand smoke does (if it kills any at all).
Once again, it is clear that, regardless of the good intentions of the jihadist do-gooders, lying and a policy of the end justifying the means simply do not work. Those tactics cannot make a safer world than truth, science and respect for individual rights--including property rights. Liberty is still the best answer--in fact, the only answer--to a better, safer, healthier society. But some people never learn; they keep trying to prove that force is better than freedom and individual rights. And their mistakes continue to be paid for with the blood and lives of innocent people.
I am a retired environmental consultant. I have never smoked, never owned or worked in a bar or restaurant, never owned stock in a tobacco company or ever received one penny from the tobacco industry or anyone else for my views on secondhand smoke. For more of my writings on this subject, see http://www.amlibpub.com/liberty_blog/2006/07/surgeon-general-trades-integrity-for.html.
http://www.amlibpub.com/liberty_blog/2007/03/unfounded-scares-about-secondhand-smoke.html
http://www.amlibpub.com/liberty_blog/2005/07/secondhand-smoke-and-heart-disease.html
http://www.amlibpub.com/liberty_blog/2005/07/smoking-ban-of-meeker-county-minnesota.html (includes a link to my complete testimony at Meeker County)