Government IT Leaders Discuss the Value of Being Data Ready
Recently, I attended a summit of government IT leaders. These public sector leaders, just like their peers in the private sector, increasingly see data as a mechanism for delivering more effective and efficient “business capabilities”. In this post, I will share key takeaways from the presenters about the importance of creating what I like to call the “data ready enterprise”.
The Deputy CIO for the FCC said that they are moving from transforming their organization’s technology processes to transforming its business processes. Taking this step can drive better alignment between IT and the rest of the organization. “The problem for large public and private organizations is that they cannot stay rooted in the past”. For public sector organizations, “we need to provide greater transparency into what we do. We need to create at the same time a freer flow of data with our customers”. At the same time, “we need to drive business outcomes”. This means enabling better, faster, and yes, cheaper outcomes even as they adapt, buy, and create. And it means showing that, on the continuum of change, they are truly “leaning forward”.
The emergence of the governmental CDO
The Deputy CIO of the USDA said that 11 Federal Agencies now have a CDO function, and that overall 25% of enterprises have a CDO. Like others in the government, USDA is participating in the “Open Data Initiative”. To make this a reality, she said that they have created data stewards within USDA. She said that they are also trying to limit the number of copies of the same dataset, a common problem across all enterprises.
Next up was one of these Federal CDOs. The CDO for the Department of Energy opened his presentation by saying IT is all about the data, dummy. “People need to remember that this is why the tech is here anyway”. He shared his prior experience at Capital One, a predictive analytics leader, and said that today a data scientist spends 80% of their time just finding the data. He recommended that organizations manage the data first and then the technology, the opposite of how things are done at many organizations. He went on to say that we need to stop managing data in silos, effectively asserting the need for an “enterprise analytics capability” to do this. He said, face it, “data is a business asset and today, data management is, therefore, part of business accountability”.
Data is the fuel for decision making
The Deputy Director, OUSD, DoD, said that he needs to generate real value for his stakeholders. For this reason, those in the government need to see “data as the fuel that drives the decision making”. In today’s enterprises, data is the currency, and many organizations need to be prepared to work with the data they already have. His department leaders believe that IT is there to effectively guide them through decisions, so he starts these discussions by talking about the data first.
Organizations need to care about data quality
Fannie Mae’s Data Quality Service Manager shared how his organization has come to really care about data quality. He said that it needed to be an enterprise-wide effort: at their organization, data affects both the quality of work and the quality of life. In their experience, data quality, like enterprise architecture, starts with business architecture. Organizations can be either proactive or reactive about data quality. Good data management uses event management to detect when the rules do not work. And good data quality increasingly needs to become self-service: it should be run by the business so they trust the data that comes out.
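The proactive, rule-based approach described above can be sketched in a few lines. This is a minimal illustration, not Fannie Mae’s actual implementation; the rule names, record fields, and the idea of treating failed rules as events to route to monitoring are all my own hypothetical assumptions.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    """A named data-quality rule; check returns True when a record passes."""
    name: str
    check: Callable[[dict], bool]

# Hypothetical rules for an illustrative loan record.
RULES = [
    QualityRule("loan_amount_positive", lambda r: r.get("loan_amount", 0) > 0),
    QualityRule("state_code_present", lambda r: bool(r.get("state_code"))),
]

def validate(record: dict, rules=RULES) -> list[str]:
    """Return the names of the rules a record violates (empty list = clean)."""
    return [rule.name for rule in rules if not rule.check(record)]

# Each non-empty result is an "event" a real system could route to
# monitoring, so the business can see when and where rules break down.
violations = validate({"loan_amount": -5, "state_code": "VA"})
```

The point of the sketch is the split of responsibilities: the business owns the rules, and the system turns rule failures into visible events rather than silently passing bad data downstream.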
USPS needed a world class data system to have a sustainable right to win
The USPS Senior Technology Architect said that the USPS needed to change in order to survive. Strategically, USPS needed to move from a letter-centric business to a package-centric business. To be a world class package delivery business, they needed a world class package tracking system. This meant aggressively moving to bar codes and creating a world class supply chain that captures package scans at every step of the journey. Their legacy system architecture was too slow to deliver this, had long lag times for data, and proved costly because it demanded regular downtime. Their new architecture, developed to respond to these deficiencies, includes what they call a scan event architecture, which is capable of detecting when things have gone wrong. They now have 99.5% availability while managing 150 million tracking events per day. With this performance, they are starting to win business away from internet-based businesses. Their next step is to use package scan data to predict not only the day but the time window for delivery.
Public sector IT organizations, just like for-profit businesses, need their data to be decision ready. And being decision ready means having data that is trustworthy and timely, regardless of the nature of the enterprise.

Additional Materials
Solution Page: Corporate Governance
Solution Page: Data Scientist Data Discovery
Blogs and Articles: IT Leadership Group Discusses Governance and Analytics