Government’s Troublesome Data-Hoarding Habit

Keeping everything forever is a recipe for inefficiency and waste.

The TV show host opens the front door and stops dead in her tracks: She sees one small aisle through the middle of the room, and on both sides are stacks of newspapers, magazines, clothing, and pizza boxes towering over her head. The homeowner leads her through the cluttered maze, claiming everything is under control and he knows exactly where all of his belongings are -- but there's just no possible way that's true.

Those of us who have seen the reality show "Hoarders" have watched in amazement as people try to justify the large collections of junk they've amassed over the years. What many of us don't realize, however, is that governments at all levels are hoarders as well.

Long before corporations embraced big data and business intelligence, the public sector was on the case, collecting mountains of data with hopes of finding efficiencies, making service improvements and bettering the lives of constituents. This type of data hoarding follows the old logic: It's better to have something and not need it than the reverse. But if government agencies aren't careful, this hoarding habit could result in an uninhabitable, unproductive operation -- just as it does for the hoarders we see on television.

Government databases are filled with everything from traffic data to pet-ownership statistics, and many agencies lack the staff and infrastructure needed to maintain and analyze all of it. Public-sector data analysts report spending 47 percent of their time collecting and organizing data but less than a third of it actually gleaning actionable insights.

A primary cause of government data hoarding is the public sector's fragmentation: Data is siloed in department-specific systems and in most cases cannot be compared or analyzed across an entire organization. As a result, analysts are forced to run separate reports from each system and manually combine all of the data in spreadsheets, a time-consuming and error-prone process that many refer to as "Excel Hell."
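To make the problem concrete, here is a minimal sketch of what automating that manual merge could look like. The filenames and column names are invented for illustration; nothing here reflects any particular agency's systems:

```python
import pandas as pd

# Hypothetical CSV exports from two siloed systems; the filenames and
# columns are assumptions for illustration, not any agency's real schema.
payroll = pd.read_csv("payroll_export.csv")          # employee_id, pay_period, gross_pay
timekeeping = pd.read_csv("timekeeping_export.csv")  # employee_id, pay_period, overtime_hours

# One programmatic join replaces the manual copy-and-paste step that
# analysts would otherwise repeat for every reporting cycle.
combined = payroll.merge(timekeeping, on=["employee_id", "pay_period"], how="inner")

# A single consolidated view: total overtime hours per pay period.
print(combined.groupby("pay_period")["overtime_hours"].sum())
```

The point is not the specific tool but the design choice: one repeatable, scripted join in place of dozens of hand-built spreadsheets.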

Rather than accept this cluttered, arduous approach to data as inevitable, organizational leaders need to adopt strategic plans that streamline the collection, storage and analysis process. In other words, governments must learn how to quickly identify and separate good data from bad data. A number of effective approaches are available, and adopting them paves the way for greater accessibility, better analysis and major financial savings. The federal government, for example, saved more than $1 billion by shutting down 1,200 of its data centers in a bid to eliminate not only data duplication but also fragmentation and waste.

A good strategic plan should have the ultimate goal of aligning objectives with key results or outcomes in the most efficient way possible. Identify exactly which data will shed light on the organization's particular objectives, and then establish an ideal time frame for extracting that data. When reporting historically, going too far back in time can skew analysis due to changing economic factors, among other things. It is important to be able to compare apples to apples, so organizational leaders must maintain a strategic plan that determines which data is still relevant and which should be archived. Setting this timeline also builds an enterprise-wide common understanding of priorities and goals: You're no longer measuring for measurement's sake.
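As a rough sketch of what enforcing such a time frame might look like in practice (the three-year window, filename and column name below are assumptions, not a recommendation):

```python
import pandas as pd

# Assumed input: the consolidated dataset from the sketch above, with a
# parseable pay_period date column.
combined = pd.read_csv("combined_export.csv", parse_dates=["pay_period"])

# A hypothetical three-year lookback -- the actual window should come
# from the organization's strategic plan and retention schedule.
cutoff = combined["pay_period"].max() - pd.DateOffset(years=3)

current = combined[combined["pay_period"] >= cutoff]  # comparable, analysis-ready
archive = combined[combined["pay_period"] < cutoff]   # candidates for archiving

print(f"{len(current)} rows kept for analysis, {len(archive)} rows flagged for archiving")
```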

One county government was seeking to better understand its workforce spending, particularly its overtime trends. While the organization's payroll information was housed in one system, its time-keeping information was in another, making it virtually impossible to extract relevant information from the noise. The hoard of data elements, combined with the "Excel Hell" of sorting through it all, made the effort inefficient and ineffective.

By identifying and extracting just the historical and operational data that mattered most, the county was able to systematically filter out its unneeded data. As a result, not only could the county identify a payroll pattern but it could also gain actionable intelligence on how to plan for the future and achieve its strategic objectives. Today, instead of keeping every component of a payroll and time transaction, the county keeps only the relevant data that's aligned to retention regulations.
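A hedged sketch of the general idea follows; the record types and retention periods are invented for illustration, since real values come from the applicable retention regulations:

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical retention schedule, in years, keyed by record type; real
# schedules are dictated by each jurisdiction's retention regulations.
RETENTION_YEARS = {"payroll": 7, "timekeeping": 3}

def is_retained(record_type: str, record_date: date, today: Optional[date] = None) -> bool:
    """Return True if a record still falls inside its retention window."""
    today = today or date.today()
    years = RETENTION_YEARS.get(record_type, 0)
    return record_date >= today - timedelta(days=365 * years)

# A five-year-old timekeeping record falls outside its three-year window.
print(is_retained("timekeeping", date(2019, 6, 30), today=date(2024, 6, 30)))  # False
```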

Data is crucially important, but only if it's current, relevant, of high quality and easily accessible. Keeping everything that's collected "just in case" is seldom a good idea, as going too far back in time will likely only result in outdated, inaccurate analysis. That alone is the strongest argument for governments to get out of the data-hoarding habit.

President and founder of Mo'mix Solutions