The Smell Test for Bad Data

There are many ways numbers can be misleading. Here are a few.

A former editor of ours at Forbes magazine, the late Sheldon Zalaznick, used to talk about putting the copy he read through a smell test. By that he meant he was looking for facts and figures that gave off a scent suggesting they were inaccurate or misleading. It was an ability he developed through years of reading articles about corporate finance. When he detected a fact he thought was dubious, he’d circle it with a red pen and send the copy back to the writer. He was almost always right.

Shelley was a model for us. After more than a quarter-century of reading and thinking about city and state governments, we frequently come across numbers that jump out at us as questionable. The alarming part is that these figures often go unquestioned by the government officials who use them for policy and management and by the members of the press who repeat them to the general public.

One bit of malodor can sometimes be found in big, round numbers that don’t seem to change as the years pass. Numbers with lots of zeros simply beg for double-checking, as very few real-world events occur so neatly.

Try this for an example: Search online for the words “one million teenage girls get pregnant each year.” This big, eye-catching number is set forth as absolute truth by many, yet the figure -- which was probably never entirely accurate -- was already out of date more than 20 years ago. In fact, the number has been dropping for much of that time. The U.S. Department of Health and Human Services indicates that in 2013 there were 455,175 pregnancies among women ages 15 to 19.

False figures often have a long life span when they’re making negative points. For example, the idea that there are more black men in prison than in college is widespread and stems from a report written about 15 years ago. Its shock value tends to immunize it from questioning. Yet, according to the American Council on Education, the only way to arrive at that number is to count just “African-American males attending degree-granting institutions who enrolled for the fall semester.” There’s no question that incarceration rates for black men in America are outrageously high. But if you include non-degree-granting vocational institutions and men who enrolled after the fall semester, the number of black males in some kind of post-secondary education program nearly doubles. In 2010, there were 1,341,354 black men enrolled in post-secondary education, far exceeding the 844,600 black men who were incarcerated.


Another lesson: Whenever anyone is comparing annual data from one state to another or one city to another, the information should be based on precisely the same time period.

Consider financial reports for pension plans. Fifteen states base them on the calendar year, while the rest use a fiscal year to calculate annual investment results. When policymakers compare data from two states, the results can be very misleading. For example, Oregon’s pension investment results for 2008 showed a 27 percent loss, whereas California’s showed only a 4.9 percent drop. But this says little about actual pension performance: Oregon’s 2008 results absorbed the full blow of the devastating stock market drop that fall, whereas California’s results only took the state through June 30, and the fall losses showed up in its 2009 report.
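To make the mismatch concrete, here is a minimal sketch in Python with entirely hypothetical monthly returns (they are not Oregon’s or California’s actual figures). The same invented market history -- modest gains through mid-2008 followed by a crash that fall -- is compounded over two different 12-month windows: the calendar year ending in December 2008 and the fiscal year ending June 30, 2008.

```python
# A hypothetical illustration: the same series of monthly investment returns,
# sliced two ways. One "2008" result covers January-December 2008 and absorbs
# the fall crash; the other covers the fiscal year ending June 30, 2008 and
# misses it entirely. All numbers are invented for the sketch.

monthly_returns = {(2007, m): 0.005 for m in range(7, 13)}        # modest gains, Jul-Dec 2007
monthly_returns.update({(2008, m): 0.002 for m in range(1, 7)})   # roughly flat, Jan-Jun 2008
monthly_returns.update({(2008, m): -0.06 for m in range(7, 10)})  # summer slide, Jul-Sep 2008
monthly_returns.update({(2008, m): -0.12 for m in range(10, 13)}) # fall 2008 crash, Oct-Dec

def annual_return(months):
    """Compound the monthly returns over the given list of (year, month) keys."""
    total = 1.0
    for key in months:
        total *= 1.0 + monthly_returns[key]
    return total - 1.0

calendar_2008 = [(2008, m) for m in range(1, 13)]                                   # Jan-Dec 2008
fiscal_2008 = [(2007, m) for m in range(7, 13)] + [(2008, m) for m in range(1, 7)]  # Jul 2007-Jun 2008

print(f"Calendar-year 2008 return: {annual_return(calendar_2008):+.1%}")  # roughly -43%
print(f"Fiscal-year 2008 return:   {annual_return(fiscal_2008):+.1%}")    # roughly +4%
```

The portfolio is identical in both calculations; only the reporting window differs, which is why a calendar-year loss and a fiscal-year gain can describe the very same stretch of market history.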

A few pieces of advice from our past experiences:

  • Beware of job-loss or job-gain counts for governors that begin the day an official took office. He or she can’t possibly have had any impact on employment on inauguration day or soon thereafter.
  • Think twice when a piece of data is attached to a huge generality. For example, officials may say that “crime is down by X percent,” but unless observers know what classifications of crimes are taken into account, it’s hard to know how meaningful the decline has been.
  • Consider the concept of absolute versus relative risk. If an article indicates that a government program reduces the likelihood of a particular social ill by 90 percent, that seems very powerful. But if the reduction is from 0.1 percent to 0.01 percent, the change is virtually meaningless, as the sketch after this list illustrates.
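For readers who want to see the arithmetic behind that last point, here is a minimal sketch in Python using the illustrative 0.1 percent and 0.01 percent rates from the bullet above; the rates are hypothetical, not drawn from any real program.

```python
# Relative vs. absolute risk, using the illustrative rates above.
baseline_rate = 0.001   # 0.1 percent of the population affected without the program
program_rate = 0.0001   # 0.01 percent affected with the program

relative_reduction = (baseline_rate - program_rate) / baseline_rate  # share of the risk eliminated
absolute_reduction = baseline_rate - program_rate                    # change in the underlying rate

print(f"Relative risk reduction: {relative_reduction:.0%}")                          # 90%
print(f"Absolute risk reduction: {absolute_reduction * 100:.2f} percentage points")  # 0.09
print(f"Cases per 100,000 people: {baseline_rate * 100_000:.0f} -> {program_rate * 100_000:.0f}")  # 100 -> 10
```

Both statements are true at once: the program cuts the risk by 90 percent, yet it changes the underlying rate by less than a tenth of a percentage point.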
There’s an underlying principle here. If a piece of data is really surprising -- because it’s far bigger or smaller than expected, or because it flies in the face of trusted information -- it isn’t wise to simply enjoy knowing something new that seems remarkable. When a fact appears unbelievable, it may well be unworthy of belief.

The authors are government management experts. Their website is greenebarrett.com.