| Quarterback     | Completions | Passing Yards |
|-----------------|-------------|---------------|
| Peyton Manning  | 34/49       | 280           |
| Russell Wilson  | 18/25       | 206           |
One of the things that struck me about yesterday’s Super Bowl was how powerfully misleading some statistics can be. Present the statistics above to someone who had not watched the game and did not know the outcome, and he might think that the game was close or even that the Broncos had won. If he noticed that Peyton Manning’s completion total set a Super Bowl record, he might even conclude that Manning had performed extremely well. What else could one conclude from the fact that he broke a record that only the best quarterbacks in the game ever get the chance to set?
If you watched the game on Sunday, however, then you know that these statistics are extremely deceptive. The Broncos were absolutely dominated on both sides of the ball from the first snap to the last, and Peyton Manning was completely ineffective. The only statistic that demonstrates the lopsided nature of the contest is the only one that ultimately matters: Seattle destroyed Denver 43-8, winning by five touchdowns.
Over the years, I have noticed that scientific and statistical studies are often used to mislead or deceive unwary members of the public. A study is published, or a statistic is cited, in a way that exaggerates its significance. I have long intended to compile a list of the ways in which this is done, in order to acquaint others with the techniques involved. Perhaps such a catalog can encourage skepticism and honesty in public discourse. Consider some examples of the misuse of science and statistics below.
The War in Iraq Encouraging an Economic Boom
One of the funniest stories that I read in the aftermath of the U.S. invasion of Iraq after 9/11 was a story intended to depict post-invasion Iraq as a hub of thriving economic activity. The reporter noted that economic activity since the invasion had increased by some extremely large figure (as I recall, 300%) and concluded that the war had been worth it because it had increased the prosperity and well-being of Iraqi citizens enormously. With such an economic boom in progress, the efforts at rebuilding the nation would undoubtedly be extremely successful.
This is an example of a method that is frequently used to distort the truth: take an extremely low number as your base value and then present any increase as a percentage of it. If one bombs a country back to the Stone Age and, once the bombing ceases, economic activity rises from $1,000.00 a day during the bombing to $4,000.00 a day afterward, that represents a 300% increase in economic activity, but it is hardly a healthy level of activity for a nation of 40 million people. I am not arguing here one way or the other about the U.S. invasion of Iraq, but whether or not you supported it, do not be misled by deceptive statistics.
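The arithmetic behind this trick can be sketched in a few lines of Python. The dollar figures and the population are the hypothetical ones from the example above, not real economic data:

```python
# Hypothetical figures from the example above, not real economic data.
before = 1_000          # daily economic activity during the bombing ($)
after = 4_000           # daily economic activity after the bombing ($)
population = 40_000_000

# The headline number: any increase from a tiny base looks enormous.
percent_increase = (after - before) / before * 100

# The number the headline hides: activity per person per day.
per_capita = after / population

print(f"Increase in economic activity: {percent_increase:.0f}%")   # 300%
print(f"Activity per person per day:   ${per_capita:.4f}")         # $0.0001
```

The same absolute change of $3,000 a day would be a rounding error for a healthy economy; it only becomes a "300% boom" because the base was nearly zero.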
The Effectiveness of Abstinence-Based Sexual Education
A few years back, a number of studies were published purporting to show that abstinence-based sexual education programs were completely ineffective. These studies tended to focus on a single number: the age at which a person lost his or her virginity. As I recall, the studies demonstrated that abstinence-based sexual education succeeded only in delaying the onset of extra-marital sexual activity by six months.
Now, I have already written about how I feel about Christians using the political process to enforce their beliefs on others in my essays on abortion and homosexuality, but the studies demonstrating that teaching abstinence is ineffective were deeply flawed. Whatever you think about abstinence-based programs of sexual education, it should be clear that representing such a complex facet of human behavior with a single scalar number is extremely deceptive. According to this statistic, a prostitute who lost her virginity at age 17.25 and turned tricks a dozen times a day behaved in exactly the same way as a woman who began a sexually active relationship with a single boyfriend at age 17.25 and later married him. The limitations of such an approach should be obvious.
The Average Age of Marriage
Consider the “average age at which people first get married”. I have seen this statistic used to minimize the degree to which marriage has changed since the advent of the Sexual Revolution in the 1950s. “People used to get married at an average age of 22, now they get married at an average age of 26”, some people say, “so the Sexual Revolution has promoted the positive change of making people more careful about whom they marry and this has enormous benefits.”
Unfortunately for those of us who find the idea of extra-marital sex appealing, such a simplistic case for the advantages of the Sexual Revolution falls apart upon closer examination. The problem is this: when calculating the statistic, how do you handle people who never get married? Someone who never marries is simply excluded from the calculation of the “average age at first marriage”, because there is no numerical value to include in the average. The increase in the average age of marriage from 22 to 26 thus conceals the fact that a much larger percentage of the population is never getting married at all, and it bypasses the question of whether or not that is healthy for society. There are studies showing that people in unmarried cohabiting relationships do not enjoy the same health and monetary benefits as those who are married, and this is an issue of serious consequence for the long-term health of our society.
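The selection effect can be made concrete with a small Python sketch. The two cohorts below are invented to illustrate the arithmetic, not real survey data; `None` marks a person who never marries:

```python
# Invented cohorts for illustration, not real survey data.
# None marks a person who never marries.
cohort_then = [22, 21, 23, 22, 24, 22, 21, 23, 22, 20]        # nearly everyone marries
cohort_now = [26, 25, 27, 26, None, None, None, 28, None, 24] # many never marry

def average_age_at_marriage(cohort):
    """Compute the 'average age at first marriage' as usually reported:
    people who never marry contribute nothing to the average.
    Returns (average age among the married, fraction who ever marry)."""
    married = [age for age in cohort if age is not None]
    return sum(married) / len(married), len(married) / len(cohort)

then_avg, then_rate = average_age_at_marriage(cohort_then)
now_avg, now_rate = average_age_at_marriage(cohort_now)
print(f"Then: average age {then_avg:.1f}, marriage rate {then_rate:.0%}")  # 22.0, 100%
print(f"Now:  average age {now_avg:.1f}, marriage rate {now_rate:.0%}")    # 26.0, 60%
```

The reported average moved only from 22 to 26, while the marriage rate in the second cohort collapsed from 100% to 60%; the headline statistic is silent about the second change.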
The Beer Belly Myth
Now I was originally going to restrict this essay to the misuse of various statistics, but I have decided to expand it to include misleading scientific studies. In that vein, I saw a headline on a website a while back to the effect that a scientific study had demonstrated that the cultural idea of the “beer belly” was an unfounded myth. Curious to see how such a thing was possible, I read the article with some care.
As it turns out, our idea that drinking beer can produce an excessive gut is all a bunch of bunk. It isn’t the beer itself that causes the beer belly, the scientists funded by multinational breweries boldly claimed; it is the calories in the beer that are the true culprit. Drinking beer is, therefore, perfectly conducive to physical fitness, and all those people who talk about a man having a “beer belly” are unfairly slandering tasty suds.
Epidemiological Studies and Coincident Factors
In the debates over “Agent Orange”, “Gulf War Syndrome” and “Autism and Vaccination”, people often use epidemiological studies to “demonstrate” that concerns about the safety of certain substances are invalid. While epidemiological studies can definitely tell us that a problem exists, they cannot conclusively demonstrate that no problem exists. An interesting article illustrating why this is the case was recently published concerning the death of a young woman. Let us consider it briefly:
The news of stylist Annabel Tollman’s death was met with shock and sadness on Friday morning. How could a young, healthy 36-year-old woman with a bright future die suddenly in her sleep, apparently of a blood clot?
. . .
More specifically, it was a lethal combination of Rachel’s birth control, Ortho Tri-Cyclen, her polycystic ovarian syndrome (PCOS) and a genetic blood disorder Rachel never knew she had called Factor II, which causes a lack of an important blood-clotting protein. (Why Are Young, Healthy Women Getting Blood Clots?)
As we can see in this tragic story, sometimes health problems can be caused by a set of coincident factors. The presence of other chemicals and a genetic predisposition might be necessary in order for a toxin to manifest its deadly symptoms.
These facts cast doubt on the kind of reasoning that is done on many important issues. Consider the following excerpt from the Wikipedia article on Agent Orange:
Military personnel who loaded airplanes and helicopters used in Ranch Hand probably sustained some of the heaviest exposures. Members of the Army Chemical Corps, who stored and mixed herbicides and defoliated the perimeters of military bases, and mechanics who worked on the helicopters and planes, are also thought to have had some of the heaviest exposures. However, this same group of individuals has not shown remarkably higher incidences of the associated diseases. (Wikipedia Article on Agent Orange)
In the above example, doubt is cast on the toxic effects of Agent Orange by stating that higher levels of exposure did not lead to significantly increased symptoms. Notice the assumption that, if Agent Orange is a toxin, higher levels of exposure should necessarily result in higher levels of expressed toxicity. But what if, in addition to exposure, other factors needed to be present in order for the exposed person to exhibit symptoms? In that case, the simplistic epidemiological studies indicating that Agent Orange is not toxic might be flawed: some people will never exhibit the symptoms of exposure no matter how much of the chemical they are exposed to, because they lack the required genetic vulnerability, while people with the vulnerability might become extremely ill or even die.
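A toy simulation can show how this works. Assume, purely for illustration, that illness requires both a rare genetic vulnerability and exposure above some threshold, and that extra exposure beyond the threshold changes nothing; every parameter below is invented:

```python
import random

random.seed(0)

# Invented parameters for a toy model, not real epidemiology.
N = 100_000
VULNERABLE_RATE = 0.01   # 1% carry the hypothetical genetic vulnerability
THRESHOLD = 1.0          # exposure above this level is assumed to suffice

def illness_rate(dose, n=N):
    """Fraction of a simulated group of n people who fall ill at a given dose.
    Illness requires exposure above THRESHOLD AND the rare vulnerability;
    beyond the threshold, additional dose changes nothing."""
    ill = sum(1 for _ in range(n)
              if dose > THRESHOLD and random.random() < VULNERABLE_RATE)
    return ill / n

print(f"Unexposed (dose 0):          {illness_rate(0):.2%}")
print(f"Moderate exposure (dose 2):  {illness_rate(2):.2%}")
print(f"Heaviest exposure (dose 10): {illness_rate(10):.2%}")
```

In this model the moderately and heavily exposed groups both fall ill at roughly the vulnerability rate, so a study comparing only those two groups, as in the Agent Orange excerpt above, would find no dose-response relationship at all, even though the substance is necessary for every single case.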
Over the years, I have come across many other ways in which scientific or statistical data has been misused to make one case or another: sample sizes inadequate to support a particular claim, inappropriate assumptions, techniques and methods applied to situations they did not fit. Unfortunately, I have lost most of these examples over the years because I did not write them down. As I come across more examples, I intend to record them here, and I would ask for your help in expanding this little catalog.
Additional examples will be added here as I write about them:
The exaggeration of the significance of a study on camel bones was discussed in “An Error in the Bible”.
An example of straight-up spin was discussed in “Straight Up Spin”.