The daily struggle with traditional software for government reporting

November 3, 2015

Part One: The burden of spreadsheets on data transparency

At the core of nearly every budget sits the common spreadsheet, capturing millions of data points each day.

Spreadsheets are useful for housing data, building formulas, working out equations, and recording basic budget activity. Many CAFR (Comprehensive Annual Financial Report) and budget reporting teams rely on the traditional spreadsheet to collect and store the data behind crucial management decisions.

The same tool has been used for years by accounting and finance teams trying to keep up in a fast-paced reporting environment with ever-shrinking budgets. As files grow larger, become more complex, and are shared across multiple agencies, it's critical to stop and ask two questions:

  1. How efficient is the spreadsheet today?
  2. What does the data mean?

In a 2013 study investigating the spreadsheet usage practices of business professionals, Ventana Research found that on average, survey participants spend approximately 1.5 work days per month—18 days every year—updating, revising, consolidating, modifying, and correcting the spreadsheets that they collaborate on with others. Manual efforts to manage shared spreadsheets can be frustrating for accountants and financial professionals, especially when trying to meet strict deadlines.

The same study also found that 72 percent of participants said their most important spreadsheets are the ones they share with others. Traditional spreadsheets are simply not built to handle mass collaboration among users or across multiple government agencies.

Sharing spreadsheets is risky. And managing multiple spreadsheets is difficult, with many pitfalls such as:

  • Multiple manual entries
  • Repeated copy-and-paste work
  • Restrictions on structure and formula changes
  • Numerous versions in circulation
  • Outdated numbers
  • No audit trail
  • Poor security for confidential files

Aside from these process risks, data accessibility is also a concern for management and stakeholders.

The widespread use of spreadsheets and homegrown systems to collect and quantify data limits visibility into the origin and accuracy of the data being reported.

A continuous stream of rows and columns provides little value to stakeholders and citizens trying to derive meaningful analysis from periodic data. Pulling numbers from multiple large spreadsheets into slideshows and stakeholder documents full of graphs and charts is a manual, time-consuming task with plenty of room for error. The time spent collecting, presenting, and interpreting data can quickly add up to hours of overtime.

Data drives budget decisions and operations, and it is essential to earning citizen confidence. It's imperative that accounting teams step back from their manual spreadsheet and reporting processes to identify the risks and inefficiencies in the way data is routinely handled.

By applying advances in technology to their data processes to improve core systems and drive analysis, teams create more than just an accurate report: they provide long-term value that influences some of the most important business decisions.

Ask yourself these two questions, and think about how you can make the spreadsheet work for you, not against you.

This is part one in a series of blog posts analyzing the difficulties associated with common processing programs. Check back regularly, or subscribe here to get the rest of the series.

Mike Sellberg

About the author

Mike Sellberg is Executive Vice President and Chief Product Officer at Workiva. He is the former EVP and CTO at iMed Studios and the former Divisional General Manager at Engineering Animation, Inc.