Numbers are powerful. When I say that one out of ten people is gay or bisexual, that’s a pithy way of saying gay issues are so common they demand constant attention. When I say that one out of four women is raped before the age of 18, that’s grounds for spending millions of federal dollars to try to prevent more while quietly sentencing every man who’s accused of it. When I say that marijuana users are 85 times more likely to try cocaine, that’s reason to keep it illegal. These statistics say powerful things. They change minds, create uproar and turn a peripheral issue into a group’s rallying cry. They are also all wrong.
While mathematical in expression, statistics arrive by a decidedly human vehicle: the study. And like all human apparatuses, this one is only as good as the person who runs it. Take the “one in ten” figure. It’s the result of a study done by Alfred Kinsey in 1948. It’s also fundamentally flawed: Kinsey asked volunteer (not random) male respondents whether they had engaged in homosexual activity in the previous three years. The study drew unduly from metropolitan areas, college students and ex-cons (all groups more likely to have engaged in homosexual behavior without necessarily being gay). Studies done in the 1990s by the Alan Guttmacher Institute and the University of Chicago put the actual (but less catchy) number between one and three percent. Data gathered in Canada (2004), England (1992), France (1992), Norway (1988) and Denmark (1992) put the figure between one and six percent.
Rape statistics are similarly skewed. The one-in-four statistic comes from a 1985 study by Mary Koss, who used ambiguous questions to tally the numbers. For example, the study used the following question to help determine the rape statistic: “Have you had sexual intercourse when you didn't want to because a man gave you alcohol or drugs?” Thus if you ever regretted sex you had while drunk, and your being drunk could be attributed to a man in some way, you’ve been raped. The word “because” doesn’t imply malicious intent or a threat of force, either. Most surprisingly, 73% of the women Koss labeled as raped didn’t think what happened to them was rape. A better study by Louis Harris and Associates (1993) put the number closer to one in fifty.
The 85-times statistic is used to support the gateway theory (that marijuana is a gateway to hard drugs), though the theory is older than the study. Quoted in 1997 by Joseph Califano of the National Center on Addiction and Substance Abuse (CASA), the analysis is not only poorly done; it contradicts another study by the Department of Health and Human Services from the same year. The CASA conclusion is the result of some shady math relying on the fact that most people who use cocaine try cannabis first: only 0.2% of all cocaine users have never tried weed. The DHHS study wasn’t the first to question the gateway theory (that was the LaGuardia Report in 1944), but its conclusion is light years away from CASA’s: “For every 104 people who have used marijuana, there is only one regular user of cocaine and less than one heroin addict.”
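The sleight of hand here is a base-rate confusion: a large relative risk can coexist with a tiny absolute progression rate. A minimal sketch of the arithmetic, using made-up rates (the 9% and 0.106% figures below are illustrative assumptions, not CASA's or DHHS's actual data):

```python
# Hypothetical rates chosen only to show how an "85x" multiplier can arise.
coke_given_weed = 0.09       # assumed: 9% of marijuana users have tried cocaine
coke_given_no_weed = 0.00106  # assumed: ~0.1% of non-users have tried cocaine

# The headline number divides one conditional probability by the other...
relative_risk = coke_given_weed / coke_given_no_weed
print(f"{relative_risk:.0f} times more likely")

# ...but the absolute risk tells the opposite story: under these same
# numbers, 91% of marijuana users never touch cocaine at all.
print(f"{1 - coke_given_weed:.0%} of users never progress")
```

Both printed lines come from the exact same data; which one makes the headline is a choice, not a finding.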
There are two reasons why you’ve never heard of these alternative studies. First, they aren’t that interesting. When only six percent of the world is gay, two percent of women are raped and less than one percent of marijuana users move on to harder drugs, it’s a lot harder to get mad at stuff (this is truer of the last two numbers). Second, the popular figures are what Joel Best calls “mutant statistics”: ones that everyone knows, so no one questions them. They are repeated over and over until they stop being the conclusion of one random study and become Truth, a Truth that gay rights groups, women’s rights groups and anti-drug groups cling to. Numbers are powerful things, capable of turning the course of an argument completely around. Shouldn’t we be sure we get them right?