“We are surrounded by data but starved for insights.”
~ Jay Baer
‘More data’ is a familiar mantra for most organizations, and the public sector is not immune to the idea of collecting now and making sense of it later. Big data is a critical component of delivering decision advantage to public sector agencies, allowing them to make better, faster decisions that deliver the desired outcomes at the lowest cost.
However, the desire to collect more data frequently leads to two negative outcomes: paralysis by analysis, and a retreat to gut-instinct decisions when the volume of data becomes overwhelming. To counteract these pitfalls, data scientists, commercial BI software like Tableau, and automated data collection have become ever-present in the private sector, allowing organizations to turn data into useful insight.
Unfortunately, hiring data scientists, adding commercial tools, and educating staff across the federal government has been too slow. As a result, the volume of manual reporting still in practice limits the time analysts have to actually analyze. Thus, a pressing need to modernize technology, processes, and skillsets persists. Special emphasis must be placed on modernizing skillsets, which is so important that the Department of the Navy recently made the online learning service Udemy broadly available to employees.
To be successful, modernization efforts must account for how data is collected, what form it is in, and how it will be used to maximize its benefit. Today, 80–90% of internet data is unstructured, which makes it difficult to both collect and analyze.
This article will explore how the public sector must “un-unstructure” data and make it available to decision makers in real time so that leaders can make better, faster decisions.
The Problem: Lack of a Data Framework
Universally, the first obstacle to modernization is organizing and structuring a data framework to make sense of what has been collected and how it will be analyzed. This is the right starting point because most organizations are awash in raw data. Challenges include:
- Too much data
- Poor data literacy at mid-management levels
- Data accuracy, cleanliness, and age
- Lack of trust in the data at the core of decision-making
- Poor decisions, driven by inaccurate or unclean data, that compromise the organization’s functions and missions
- Static briefing media for executive leadership, such as slides and spreadsheets that age the second they come off the printer
Despite these challenges, decisions must still be made, and organizations hit roadblocks because information is incomplete, dynamic, and constantly moving.
Getting to better, faster, data-driven decisions requires an organizational desire to modernize, something often discussed across the public sector but infrequently given the time or resources to accomplish. It starts by asking several key questions:
- Are there ways to make collecting and analyzing information more efficient?
- What gaps exist between data I have and data I need?
- What data can I ignore with minimal consequence?
A lot of work goes into making that information useful, consumable, and operational. An example:
Prior to 2020, a Decision Lens customer was managing a $200M annual portfolio containing hundreds of requirements, each with over 100 columns of data and involving approximately 125 stakeholders. The stakeholders had various roles in managing various elements of the data as the requirements moved through their workflow.
It was clear they could not keep up: many data fields were incomplete, outdated, or invalid, eroding the confidence and trust of the leadership that relied on the data to make decisions.
In 2020, this organization migrated to Decision Lens, cloud-based software that offered a decision framework to structure and organize data. The large data set was then folded into a framework of “buckets” built around key drivers such as Value, Cost, Risk, and other categories.
User permissions were established so that stakeholders were granted access to specific subsets of data related to their role. Triggers, gates, and timelines were put in place that required data elements to be submitted, tracked, edited, and validated as the requirements moved through an approval process.
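The permission and gating pattern described here can be sketched in a few lines. This is illustrative only, not Decision Lens’s actual implementation; the roles, field names, and approval rule below are hypothetical assumptions.

```python
# Sketch of role-based field access plus a validation "gate": a requirement
# cannot advance to approval until every required field is filled in.
# Roles and field names are hypothetical, not any vendor's real schema.

ROLE_FIELDS = {
    "program_manager": {"cost_estimate", "schedule"},
    "risk_officer": {"risk_rating"},
}

REQUIRED_FOR_APPROVAL = {"cost_estimate", "schedule", "risk_rating"}

def can_edit(role: str, field: str) -> bool:
    """A stakeholder may edit only the fields tied to their role."""
    return field in ROLE_FIELDS.get(role, set())

def ready_for_approval(requirement: dict) -> bool:
    """Gate: every required field must be present and non-empty."""
    return all(requirement.get(f) for f in REQUIRED_FOR_APPROVAL)

req = {"cost_estimate": 1_200_000, "schedule": "FY25-Q2"}
assert can_edit("risk_officer", "risk_rating")
assert not can_edit("program_manager", "risk_rating")
assert not ready_for_approval(req)   # risk_rating still missing
req["risk_rating"] = "moderate"
assert ready_for_approval(req)       # gate now passes
```

The point of the gate is that incomplete records are structurally unable to reach leadership, which is what restores trust in the data downstream.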
Furthermore, the technology solution could run algorithms against the “buckets” of data, giving leadership dashboard-like outputs that include real-time prioritized requirements, budgets, shortfalls, overages, and schedules. Trusted data allows leadership to make informed decisions, defend and justify those decisions, and ensure the organization allocates funds for the biggest bang for the buck.
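The kind of criteria-based scoring described above can be sketched as a simple weighted sum. The weights, field names, and example requirements below are hypothetical assumptions for illustration, not Decision Lens’s proprietary algorithms.

```python
# Illustrative weighted-sum prioritization over criteria "buckets" such as
# Value, Cost, and Risk. All weights and data are hypothetical.

WEIGHTS = {"value": 0.5, "cost": 0.3, "risk": 0.2}  # assumed leadership weights

def score(req: dict) -> float:
    """Higher value is better; cost and risk count against a requirement."""
    return (WEIGHTS["value"] * req["value"]
            - WEIGHTS["cost"] * req["cost"]
            - WEIGHTS["risk"] * req["risk"])

def prioritize(requirements: list[dict]) -> list[dict]:
    """Return requirements sorted from highest to lowest score."""
    return sorted(requirements, key=score, reverse=True)

reqs = [
    {"name": "Radar upgrade",     "value": 0.9, "cost": 0.6, "risk": 0.3},
    {"name": "Fleet maintenance", "value": 0.7, "cost": 0.2, "risk": 0.1},
    {"name": "New HQ building",   "value": 0.4, "cost": 0.9, "risk": 0.5},
]
ranked = prioritize(reqs)
# Fleet maintenance outranks the radar upgrade: lower value, but far
# lower cost and risk under these particular weights.
```

Real tools layer far more sophistication (normalization, pairwise comparison, sensitivity analysis) on top, but the core idea is the same: once criteria are structured, priority becomes computable and defensible rather than anecdotal.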
The goal, as highlighted in the outcomes above, is for decision-makers to have the real-time data critical to effective decision-making. This is where analytics tools come in: they bring data and subjective information together to inform senior leaders’ intuition, greatly increasing the value of their justifications.
How & Why to Modernize
Manual environments are still very much alive across the public sector, which results in critical delays in decision-making. Organizations must eliminate redundancies caused by iterating the same ad hoc, last-minute data calls.
Most organizations have not invested in structuring their environments yet face the same challenges and needs. With a renewed and seemingly genuine push for transparency in decision-making, however, leaders are embracing big data to inform their decision process.
Specifically, these organizations need:
- A structured approach to data collection
- A consistent, enterprise-wide framework of criteria to evaluate against
- Automatic consolidation of data into a single collaborative, real-time environment
- Streamlined prioritization
- Permission- and role-based access
- Integration to and from data sources
- Continuous planning and what-if planning
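The consolidation need in particular has a simple core: submissions arriving from many subordinate units must fold into one authoritative, current view. The sketch below assumes hypothetical unit feeds, record fields, and a newest-wins merge rule purely for illustration.

```python
# Illustrative consolidation: merge requirement submissions from several
# subordinate units into one view, keeping the newest record per
# requirement ID. Unit names, IDs, and fields are hypothetical.

def consolidate(*unit_feeds: list[dict]) -> dict[str, dict]:
    """Fold every unit's submissions into one store; newest version wins."""
    merged: dict[str, dict] = {}
    for feed in unit_feeds:
        for rec in feed:
            current = merged.get(rec["id"])
            # ISO-8601 date strings compare correctly as plain strings.
            if current is None or rec["updated"] > current["updated"]:
                merged[rec["id"]] = rec
    return merged

unit_a = [{"id": "R-001", "updated": "2024-01-10", "cost": 5.0}]
unit_b = [{"id": "R-001", "updated": "2024-02-02", "cost": 5.5},
          {"id": "R-002", "updated": "2024-01-20", "cost": 2.0}]

view = consolidate(unit_a, unit_b)
# One record per requirement, each reflecting the latest submission.
```

In practice this merge happens continuously inside the platform rather than in a script, but the rule it enforces, one current record per requirement regardless of source, is what makes an enterprise-wide view possible.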
As the list above makes evident, “un-unstructuring” data is a complicated problem, not one solved solely with better visuals or more powerful pivot tables. While delivering last-mile insights in a consumable manner for decision-makers is essential, a proper foundation must be established, and an ecosystem must be devised that can support the collection, transformation, analysis, and synthesis of the data. The foundation must include:
- Prioritization Framework. A standardized but flexible, criteria-based way to drive investment in mission-critical priorities
- Process Automation. Reduce reliance on spreadsheets with custom forms that collect, organize, and prioritize requirements
- Purpose-Built Visualizations. Rapidly visualize courses of action live to reach data-informed prioritization decisions across all POM planning horizons
- What-If Planning. Present tradeoffs iteratively to leadership for mission-aligned decision-making
- Scenario Planning. Leverage powerful algorithms and visualizations to provide a quick view into potential courses of action
- Real-Time Data. Ensure executive decision-makers are always acting on up-to-date findings from program managers and portfolio owners
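What-if planning at its simplest means re-running the same funding logic under different topline assumptions and comparing the results. The sketch below assumes requirements already ranked by priority and uses hypothetical names, costs, and budget levels; it is a toy greedy rule, not any product’s planning engine.

```python
# Hedged sketch of "what-if" budget planning: fund already-prioritized
# requirements in rank order until a hypothetical budget cap is exhausted,
# then compare scenarios side by side. All figures are made up ($M).

def fund_within_budget(ranked: list[dict], budget: float) -> tuple[list[str], float]:
    """Greedily fund ranked requirements; return funded names and leftover budget."""
    funded, remaining = [], budget
    for req in ranked:
        if req["cost"] <= remaining:
            funded.append(req["name"])
            remaining -= req["cost"]
    return funded, remaining

ranked = [
    {"name": "Sensor refresh",   "cost": 80.0},   # highest priority first
    {"name": "Network upgrade",  "cost": 120.0},
    {"name": "Depot expansion",  "cost": 60.0},
]

# What-if: compare a $150M topline against a $200M topline.
lean, _ = fund_within_budget(ranked, 150.0)
plus, _ = fund_within_budget(ranked, 200.0)
# The two scenarios fund different mixes: the smaller topline skips the
# expensive second-priority item and reaches the cheaper third one instead.
```

Presenting several such scenarios iteratively, rather than a single fixed plan, is what lets leadership see the tradeoffs behind each funding line before committing.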
A modern decision workflow must use a combination of tools and processes that take existing and new data and flow it in a trusted cadence, while layering in expert judgments with an appropriate dose of “trust”. Data can then be compared regardless of which organization or subordinate unit it came from. Leaders gain visibility into the enterprise portfolio, knowing where to take on more calculated risk to become better and more efficient, and where it makes sense to remain risk averse.
Across the public sector, agencies are making their data strategies a priority. According to the DoD report Unleashing Data to Advance the National Defense Strategy, there is a vision for a data-centric organization that uses data at speed and scale for operational advantage and increased efficiency.
Organizations must tackle the big data challenge head on to better use data as a strategic asset, making it available to the enterprise and creating real-time insights for decision-makers. While the term modernization continues to buzz, it is time to commit to the transformation that will allow agencies to move away from manual, antiquated, and data-deficient decision-making.
The operational agility achieved by reducing the need for spreadsheets and other manually intensive processes, while incorporating a standardized, collaborative submission and data management process, will result in higher-quality insight and thus better decisions. Leaders will benefit from having real-time data delivered continuously to make decisions quickly with minimal delays. The elimination of redundancies caused by iterating the same ad hoc, last-minute data calls will free more time for data analysis.
Ultimately, organizations will achieve optimized decision-making and be able to defend and justify decisions with reliable, up-to-date source data that is trusted, accessible, and aligned with their mission.