Tag Archives: government
Recently, I got to attend a summit of governmental IT leaders. These public sector leaders, just like their peers in the private sector, increasingly see data as a mechanism for delivering more effective and efficient “business capabilities”. In this post, I will share some of the key takeaways from the presenters about the importance of creating what I like to call the “data ready enterprise”.
The Deputy CIO for the FCC said that they are moving from transforming their organization’s technology processes to transforming its business processes. Taking this step can drive better alignment between IT and the rest of the organization. “The problem for large public and private organizations is that they cannot stay rooted in the past”. For public sector organizations, “we need to provide greater transparency into what we do. We need to create at the same time a freer flow of data with our customers”. At the same time, “we need to drive business outcomes”. This means enabling better, faster, and yes, cheaper at the same time as they adapt, buy, and create. And we need to show, on the continuum of change, that we are truly “leaning forward”.
The emergence of the governmental CDO
The Deputy CIO of the USDA said that 11 Federal agencies now have a CDO function. She said that overall 25% of enterprises have a CDO. Like others in the government, USDA is participating in the “Open Data Initiative”. To make this a reality, she said that they have created data stewards within USDA. She said that they are trying, as well, to limit the number of copies of the same dataset—a common problem across all enterprises.
Next up was one of these Federal CDOs. The CDO for the Department of Energy started his presentation by saying that IT is all about the data, dummy. “People need to remember that this is why the tech is here anyway”. He shared his prior experience at Capital One, a predictive analytics leader. He said that today 80% of a data scientist’s time is spent just finding the data. He recommended that organizations manage the data first and then the technology. This is the opposite of the way things are done at many organizations. He went on to say that we need to stop managing the data in silos. He effectively asserted the need for an “enterprise analytics capability” to do this. He said face it, “data is a business asset and today, data management is, therefore, part of business accountability”.
Data is the fuel for decision making
The Deputy Director, OUSD, DoD, said that he needs to generate real value for his stakeholders. For this reason, those in the government need to see “data as the fuel that drives the decision making”. In today’s enterprises, data is the currency, and many need to be prepared to work with the data they have. His department leaders believe that IT is there to guide them effectively through decisions. For this reason, he starts these discussions with the data.
Organizations need to care about data quality
Fannie Mae’s Data Quality Service Manager shared how his organization has come to really care about data quality. He said that they have needed this to be an enterprise-wide effort. At their organization, data affects the quality of work and the quality of life. From their experience, data quality, like enterprise architecture, starts with business architecture. Organizations can be either proactive or reactive about data quality. Good data management uses event management to determine when the rules do not work. And good data quality increasingly needs to become self-service—it should be run by the business so they trust the data that is output.
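The proactive, event-driven approach described above can be sketched as a tiny rule engine. This is only an illustration; the field names (`ssn`, `zip`, `loan_amount`) and the rules themselves are made-up examples, not Fannie Mae’s actual checks.

```python
# Minimal sketch of proactive data quality checks (all field names hypothetical).
# Each rule returns True when a record passes; failures are collected as
# "events" so stewards can see when and where a rule stops working.

RULES = {
    "ssn_present": lambda r: bool(r.get("ssn")),
    "zip_is_5_digits": lambda r: str(r.get("zip", "")).isdigit()
                                 and len(str(r.get("zip", ""))) == 5,
    "amount_positive": lambda r: isinstance(r.get("loan_amount"), (int, float))
                                 and r["loan_amount"] > 0,
}

def check(records):
    """Run every rule against every record; return a list of failure events."""
    events = []
    for i, record in enumerate(records):
        for name, rule in RULES.items():
            if not rule(record):
                events.append({"record": i, "rule": name})
    return events

records = [
    {"ssn": "123-45-6789", "zip": "20001", "loan_amount": 250000},
    {"ssn": "", "zip": "2000", "loan_amount": -1},
]
for event in check(records):
    print(event)  # the second record fails all three rules
```

Routing the failure events to the business (rather than burying them in IT logs) is what makes the quality process self-service in the sense described above.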
USPS needed a world class data system to have a sustainable right to win
The USPS Senior Technology Architect said that the USPS needed to change in order to survive. Strategically, USPS needed to move from a letter-centric business to a package-centric business. To be a world class package delivery business, they needed a world class package tracking system. This meant aggressively moving to bar codes and creating a world class supply chain that captures package scans at every step along their journey. Their legacy system architecture was too slow to deliver this and had long lag times for data—furthermore, it proved costly because it demanded regular downtime. Their new architecture, developed to respond to the above deficiencies, includes what they call a scan event architecture. This is capable enough to know when things have gone south. They now have 99.5% availability while managing 150,000,000 tracking events per day. With this performance, they are starting to earn business away from internet-based businesses. Their next step is to use package scan data to predict not only the day but the time window for delivery.
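The core idea of a scan event architecture—folding a stream of scans into a current view and noticing “when things have gone south”—can be sketched in a few lines. The event shape and the 24-hour threshold below are illustrative assumptions, not USPS’s actual schema or rules.

```python
# Sketch of a scan-event fold: each scan records where a package was and when.
# Folding the stream gives the latest known scan per package, and packages
# with no scan inside the allowed gap are flagged as potentially stalled.

from datetime import datetime, timedelta

scans = [
    {"pkg": "9400A", "facility": "DC Hub", "time": datetime(2014, 9, 1, 8, 0)},
    {"pkg": "9400A", "facility": "Baltimore", "time": datetime(2014, 9, 1, 14, 0)},
    {"pkg": "9400B", "facility": "DC Hub", "time": datetime(2014, 8, 30, 9, 0)},
]

def latest_scans(events):
    """Keep only the most recent scan per package."""
    latest = {}
    for e in events:
        if e["pkg"] not in latest or e["time"] > latest[e["pkg"]]["time"]:
            latest[e["pkg"]] = e
    return latest

def stalled(events, now, max_gap=timedelta(hours=24)):
    """Packages with no scan within max_gap -- 'things have gone south'."""
    return [p for p, e in latest_scans(events).items() if now - e["time"] > max_gap]

now = datetime(2014, 9, 1, 18, 0)
print(stalled(scans, now))  # 9400B has not been scanned for over two days
```

The same per-package history is what would feed a delivery-window prediction: once every leg of the journey is captured as an event, estimating the next leg becomes a data problem rather than a guess.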
Public sector IT organizations, just like for-profit businesses, need their data to be decision ready. And being decision ready means having data that is trustworthy and timely, regardless of the nature of the enterprise.
Additional Materials
Solution Page: Corporate Governance
Solution Page: Data Scientist Data Discovery
Blogs and Articles: IT Leadership Group Discusses Governance and Analytics
That’s right, Valentine’s Day is upon us, the day that symbolizes the power of love and has the ability to strengthen relationships between people. I’ve personally experienced 53 Valentine’s Days so I believe I speak with no small measure of authority on the topic of how to make the best of it. Here are my top five suggestions for having a great day:
- Know everything you can about the people you have relationships with
- Quality matters
- ALL your relationships matter
- Uncover your hidden or anonymous relationships
- Treat your relationships with respect all year long
OK, I admit, this is not the most romantic list ever and might get you in more trouble with your significant other than actually forgetting Valentine’s Day altogether! But, what did you expect? I work for a software company, not eHarmony!
Right. Software. Let’s put this list into the context of government agencies.
- Know everything – If your agency’s mission involves delivering services to citizens, likely, you have multiple “systems of record”, each with a supposed accurate record of all the people being tracked by each system. In reality though, it’s rare that the data about individuals is consistently accurate and complete from system to system. The ability to centralize all the data about individuals into a single, authoritative “record” is key to improving service delivery. Such a record will enable you to ensure the citizens you serve are able to take full advantage of all the services available to them. Further, having a single record for each citizen has the added benefit of reducing fraud, waste and abuse.
- Quality matters – Few things hinder the delivery of services more than bad data, data with errors, inconsistencies and gaps in completeness. It is difficult, at best, to make sound business decisions with bad data. At the individual level and at the macro level, agency decision makers need complete and accurate data to ensure each citizen is fully served.
- All relationships matter – In this context, going beyond having single records to represent people, it’s also important to have single, authoritative views of other entities – programs, services, providers, deliverables, places, etc.
- Uncover hidden relationships – Too often, in the complex ecosystem of government programs and services, the inability to easily recognize relationships between people and the additional entities mentioned above creates inefficiencies in the “system”. For example, it can go unnoticed that a single parent is not enrolled in a special program designed for their unique life circumstances. Flipping the coin, not having a full view of hidden relationships also opens the door for the less scrupulous in society, giving them the ability to hide their fraudulent activities in plain sight.
- Treat relationships respectfully all year – Data hygiene is not a one-time endeavor. Having the right mindset, processes and tools to implement and automate the process of “mastering” data as an on-going process will better ensure the relationship between your agency and those it serves will remain positive and productive.
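The first point in the list—consolidating multiple “systems of record” into a single, authoritative record—can be illustrated with a toy merge. Everything here is hypothetical (the records, the use of SSN as a match key, and the “prefer the longer non-empty value” rule); real MDM tools use far richer matching and survivorship logic.

```python
# Illustrative sketch of building a "golden record" from two systems of
# record. Records are grouped by a shared key (here, SSN) and merged field
# by field, preferring the most complete non-empty value.

system_a = [{"ssn": "123-45-6789", "name": "J. Smith", "phone": ""}]
system_b = [{"ssn": "123-45-6789", "name": "John Smith", "phone": "202-555-0100"}]

def golden_records(*systems):
    merged = {}
    for system in systems:
        for rec in system:
            g = merged.setdefault(rec["ssn"], {})
            for field, value in rec.items():
                # Survivorship rule (toy): keep the longer non-empty value.
                if value and len(str(value)) > len(str(g.get(field, ""))):
                    g[field] = value
    return merged

print(golden_records(system_a, system_b))
```

The payoff described above follows directly: a citizen who appears as “J. Smith” in one system and “John Smith” in another is served as one person, and a fraudster can no longer hide between the two records.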
I may not win the “Cupid of the Year” award, but, I hope my light-hearted Valentine’s Day message has given you a thing or two to think about. Maybe Lennon and McCartney are right, between people, “Love is All You Need”. But, we at Informatica believe for Government-Citizen relationships, a little of the right software can go a long way.
If you work for or with the government and you care about the cloud, you’ve probably already read the recent MeriTalk report, “Cloud Without the Commitment”. As well, you’ve probably also read numerous opinions about the report. In fact, one of Informatica’s guest bloggers, David Linthicum, just posted his thoughts. As I read the report and the various opinions, I was struck by the seemingly unintentional suggestion that (sticking with MeriTalk’s dating metaphor) the “commitment issues” are a government problem. Mr. Linthicum’s perspective is “there is really no excuse for the government to delay migration to cloud-based platforms” and “It’s time to see some more progress”, suggesting that the onus is on government to move forward.
I do agree that, leveraged properly, there’s much more value to be extracted from the cloud by government. Further, I agree that cloud technologies have sufficiently matured to the point that it is feasible to consider migrating mission critical applications. Yet, is it possible that the government’s “fear of commitment” is, in some ways, justified?
Consider this stat from the MeriTalk report – only about half (53%) of the respondents rate their experience with the cloud as very successful. That suggests the experience of the other half, as MeriTalk words it, “leave(s) something to be desired.” If I’m a government decision maker and I’m tasked with keeping mission critical systems up and sensitive data safe, am I going to jump at the opportunity to leverage an approach that only half of my peers are satisfied with? Maybe, maybe not.
Now factor this in:
- 53% are concerned about being locked into a contract where the average term is 3.6 years
- 58% believe cloud providers do not provide standardized services, thus creating lock in
Back to playing government decision maker, if I do opt to move applications to the cloud, once I get there, I’m bound to that particular provider – contractually and, at least to some extent, technologically. How comfortable am I with the notion of rewriting/rehosting my mission-critical, custom application to run in XYZ cloud? Good question, right?
Inevitably, government agencies will end up with mission-critical systems and sensitive data in the cloud. However, successful “marriages” are hard, making them a bit of a rare commodity.
Do I believe government has a “fear of commitment”? Nah, I just see their behavior as prudent caution on their way to the altar.
As covered in Loraine Lawson’s blog, MeriTalk surveyed federal government IT professionals about their use of cloud computing. As it turns out, “89 percent out of 153 surveyed expressed ‘some apprehension about losing control of their IT services,’ according to MeriTalk.”
Loraine and I agree that what the survey says about the government’s data integration, management, and governance is that they don’t seem to be very good at cloud data management…yet. Some of the other gruesome details include:
- 61 percent do not have quality, documented metadata.
- 52 percent do not have well understood data integration processes.
- 50 percent have not identified data owners.
- 49 percent do not have known systems of record.
“Overall, respondents did not express confidence about the success of their data governance and management efforts, with 41 percent saying their data integration management efforts were some degree of ‘not successful.’ This led MeriTalk to conclude, ‘Data integration and remediation need work.’”
The problem with the government is that data integration, data governance, data management, and even data security have not been priorities. The government has a huge amount of data to manage, and they have not taken the necessary steps to adopt the best practices and technology that would allow them to manage it properly.
Now that everyone is moving to the cloud, the government included, questions are popping up about the proper way to manage data within the government, from the traditional government enterprises to the public cloud. Clearly, there is much work to be done to get the government ready for the cloud, or even ready for emerging best practices around data management and data integration.
If the government is to move in the right direction, they must first come to terms with the data. This means understanding where the data is, what it does, who owns it, how it is accessed, secured, and governed, and then applying this understanding holistically to most of the data under management.
The problem within the government is that the data is so complex, distributed, and, in many cases, unique, that it’s difficult for the government to keep good track of it. Moreover, the way the government does procurement, typically in silos, leads to a much larger data integration problem. I have worked with government agencies that had over 5,000 siloed systems, each with its own database or databases, and most did not leverage data integration technology to exchange data.
There are ad-hoc data integration approaches and some technology in place, but nowhere close to what’s needed to support the amount and complexity of the data. Now that government agencies are looking to move to the cloud, the issues around data management are beginning to be better understood.
So, what’s the government to do? This is a huge issue that can’t be fixed overnight. It will take incremental changes over the next several years. It also means allocating more resources to data management and data integration than have been allocated in the past, and moving both much higher up the priority list.
These are not insurmountable problems. However, they require a great deal of focus before things will get better. The movement to the cloud seems to be providing that focus.
Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fund raising for at least one charity. Similarly, IoT may potentially be transformational to the business of the public sector, should government step up to the challenge.
Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming, televisions, watches, Google glasses, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations, which are connecting people to places, data and other people at a record pace.
It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connected to the Internet interact and share data, regardless of location, manufacturer, or format, and to make or find connections that were previously undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.
One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.
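The last idea above—linking payment and procurement data in real time to stop waste, fraud and abuse—can be sketched as a simple cross-check between two previously siloed datasets. The purchase-order numbers, amounts, and flag reasons below are all invented for illustration.

```python
# Hypothetical sketch of flagging payment anomalies by linking two siloed
# datasets: a payment is flagged when no matching procurement record exists,
# or when the amount paid exceeds the amount approved.

procurements = {"PO-1001": 5000.00, "PO-1002": 12000.00}  # PO -> approved amount

payments = [
    {"po": "PO-1001", "amount": 5000.00},
    {"po": "PO-1002", "amount": 15000.00},  # paid more than approved
    {"po": "PO-9999", "amount": 800.00},    # no matching procurement record
]

def flag_payments(payments, procurements):
    flags = []
    for p in payments:
        approved = procurements.get(p["po"])
        if approved is None:
            flags.append((p["po"], "no matching procurement"))
        elif p["amount"] > approved:
            flags.append((p["po"], "exceeds approved amount"))
    return flags

for flag in flag_payments(payments, procurements):
    print(flag)
```

The point is not the two-line rule but the join itself: neither dataset reveals the anomaly on its own, which is exactly why linking siloed systems is the suggested first connection.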
This is the Internet of Things for government. This is the challenge. This is transformation.
Recently, the UK’s Parliament and the Internet conference brought together leading figures from Government, Parliament, academia and the industry to discuss and debate the most pressing policy issues facing the Internet.
As expected, data privacy and security was top of the agenda for much of the day, with a number of discussions highlighting the extent to which consumer data is being exposed to security risks and the need for the right legislation and protection to keep it safe.
“The public deserves competent, efficient, and responsive service from the Federal Government. Executive departments and agencies must continuously evaluate their performance in meeting this standard and work to improve it.” whitehouse.gov (Executive Order 13571, 2011)
For government organizations striving to improve customer service, the path to success has not always been easy. Incremental improvement initiatives have provided only a marginal return. As previously discussed, these initiatives have fallen drastically short of the win-win scenario threshold. Further, demand for new, better, and faster services is outpacing the ability of these already strapped organizations to deliver additional capabilities. Budget cuts, new regulations, high staff retirement rates, and a plethora of competing priorities seem to derail the best intentions.
I. The Problem
At a time when governments at all levels are being called upon to once again, “do more with less,” Data Center Consolidation (DCC) has become a hot topic. While the Federal government has called for a reduction from 2,400 Data Centers down to about 1,200 by 2015, what does that mean, and most importantly, how does government get there?
There has been much discussion, particularly in the UK, about banks separating their investment and retail arms. The thinking behind this is that investment banking is much riskier, and so by drawing a clear line between the two, consumers will be better protected if another financial crisis should hit.
I recently returned from China and Hong Kong after having met with several CIOs, media and analysts, as well as delivering keynotes focused on customer centricity. When I return to the US after traveling, I’m often asked about the state of IT in the geography I was just in. I’ve been to both China and Hong Kong several times over the past few years, and from my perspective, IT is maturing at a very rapid pace in that region.
During prior trips to Asia, it felt like the old days of data processing. I would speak with senior IT leaders and they were more concerned with the “blocking and tackling” of IT, and not looking at how IT can provide a strategic competitive advantage. Specifically in China, IT leadership was comfortable scaling by applying people to the problem rather than using commercial software.