Tag Archives: public sector
January 21, 2009. Why in the world would that be a date to recall? Well, for one, it was the day after Barack Obama was inaugurated as the 44th President of the United States. And secondly, it was the day President Obama released an arguably game-changing document, his Memorandum on Transparency and Open Government. This one document set the stage for a new era in how government would look at the data it collects and creates. Since that time, the world of data has changed dramatically! Consider this – new analytics tools, new data types, new devices creating data, new storage ideas, new visualization applications, new concepts, new laws – the list of innovations goes on.
But, all these great innovations are not really why I’m writing today. Today, I’d like to call your attention to a news article I read in NextGov, “Amid Open Data Push, Agencies Feel Urge for Analytics”. I have to admit, as I read this article, I found myself getting just a little bit giddy. Why? Great question, thanks for asking. Before going on with my thoughts, please take a moment to read the article. Go ahead, I have time. I’ll wait.
Picking up where I left off…
Since 2009, the notion of “open data” has been discussed primarily from one of two main perspectives:
- Transparency of government to citizens – Accountability
- What the private sector can do – Innovation
No doubt, there have been significant advances on both of these topics. Yet, as important as these concepts are, budget and resource constraints can cause open data efforts to be prioritized lower than, say, a mission-critical program.
Of course, I get this – mission first – but, a couple years ago it hit me: maybe government agencies are not seeing a potential opportunity that’s sitting right in front of them. Along with the mandate to publish open data comes the opportunity to consume open data and get it into their analytics engines, thus supporting the agency’s mission! Just this slight mind shift has the potential to turn open data initiatives into a means to create value. Now do you see why I am excited by the article? (If not, I’ll assume you’ve yet to read it.) I’m thrilled to see agencies adding a third perspective to the open data conversation:
- Consumption of open data – Improving an agency’s ability to deliver on its mission(s)
I am looking forward to following the success of any agency effort to take advantage of open data as a strategic resource. If you have other examples beyond the cases noted in the NextGov article, please share!
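To make the “consume open data” idea concrete, here is a minimal sketch of pulling a published dataset into an analytics step. The dataset, field names and figures below are invented for illustration; a real pipeline would fetch the CSV from an agency’s open data portal (e.g. data.gov) rather than embed it inline.

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample resembling an agency's published open-data CSV.
# A real pipeline would download this file from an open data portal.
OPEN_DATA_CSV = """\
state,program,enrollments
VA,job_training,1200
VA,health_outreach,450
MD,job_training,900
MD,health_outreach,700
"""

def enrollments_by_state(csv_text):
    """Aggregate a published open dataset for internal mission analytics."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["state"]] += int(row["enrollments"])
    return dict(totals)

print(enrollments_by_state(OPEN_DATA_CSV))  # {'VA': 1650, 'MD': 1600}
```

The point is not the five lines of aggregation; it is that data an agency already publishes for others can feed its own analytics engines with no extra collection cost.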
That’s right, Valentine’s Day is upon us, the day that symbolizes the power of love and has the ability to strengthen relationships between people. I’ve personally experienced 53 Valentine’s Days, so I believe I speak with no small measure of authority on the topic of how to make the best of it. Here are my top five suggestions for having a great day:
- Know everything you can about the people you have relationships with
- Quality matters
- ALL your relationships matter
- Uncover your hidden or anonymous relationships
- Treat your relationships with respect all year long
OK, I admit, this is not the most romantic list ever and might get you in more trouble with your significant other than actually forgetting Valentine’s Day altogether! But, what did you expect? I work for a software company, not eHarmony!
Right. Software. Let’s put this list into the context of government agencies.
- Know everything – If your agency’s mission involves delivering services to citizens, likely, you have multiple “systems of record”, each with a supposed accurate record of all the people being tracked by each system. In reality though, it’s rare that the data about individuals is consistently accurate and complete from system to system. The ability to centralize all the data about individuals into a single, authoritative “record” is key to improving service delivery. Such a record will enable you to ensure the citizens you serve are able to take full advantage of all the services available to them. Further, having a single record for each citizen has the added benefit of reducing fraud, waste and abuse.
- Quality matters – Few things hinder the delivery of services more than bad data, data with errors, inconsistencies and gaps in completeness. It is difficult, at best, to make sound business decisions with bad data. At the individual level and at the macro level, agency decision makers need complete and accurate data to ensure each citizen is fully served.
- All relationships matter – In this context, going beyond having single records to represent people, it’s also important to have single, authoritative views of other entities – programs, services, providers, deliverables, places, etc.
- Uncover hidden relationships – Too often, in the complex ecosystem of government programs and services, the inability to easily recognize relationships between people and the additional entities mentioned above creates inefficiencies in the “system”. For example, it can go unnoticed that a single parent is not enrolled in a special program designed for their unique life circumstances. Flipping the coin, not having a full view of hidden relationships also opens the door for the less scrupulous in society, giving them the ability to hide their fraudulent activities in plain sight.
- Treat relationships respectfully all year – Data hygiene is not a one-time endeavor. Having the right mindset, processes and tools to implement and automate the process of “mastering” data as an on-going process will better ensure the relationship between your agency and those it serves will remain positive and productive.
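To illustrate the “single, authoritative record” idea from the first point, here is a deliberately naive sketch of consolidating citizen records from two systems of record. The field names, the match key (an exact SSN) and the survivorship rule are all illustrative assumptions; real MDM platforms use probabilistic/fuzzy matching and configurable trust rules rather than anything this simple.

```python
from collections import defaultdict

# Hypothetical records for the same citizen held in two "systems of
# record"; field names and values are invented for illustration.
system_a = [{"ssn": "123-45-6789", "name": "Jane Doe", "phone": None}]
system_b = [{"ssn": "123-45-6789", "name": "Jane  Doe", "phone": "555-0100"}]

def golden_records(*systems):
    """Naive consolidation: group on a shared identifier, then keep the
    first non-empty value seen per attribute (a simple survivorship rule)."""
    merged = defaultdict(dict)
    for system in systems:
        for rec in system:
            gold = merged[rec["ssn"]]
            for field, value in rec.items():
                if value and not gold.get(field):
                    gold[field] = value
    return dict(merged)

print(golden_records(system_a, system_b))
```

Even this toy version shows the payoff: the consolidated record is more complete than either source system alone, which is exactly what improved service delivery depends on.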
I may not win the “Cupid of the Year” award, but, I hope my light-hearted Valentine’s Day message has given you a thing or two to think about. Maybe Lennon and McCartney are right, between people, “Love is All You Need”. But, we at Informatica believe for Government-Citizen relationships, a little of the right software can go a long way.
If you work for or with the government and you care about the cloud, you’ve probably already read the recent MeriTalk report, “Cloud Without the Commitment”. As well, you’ve probably also read numerous opinions about the report. In fact, one of Informatica’s guest bloggers, David Linthicum, just posted his thoughts. As I read the report and the various opinions, I was struck by the seemingly unintentional suggestion that (sticking with MeriTalk’s dating metaphor) the “commitment issues” are a government problem. Mr. Linthicum’s perspective is “there is really no excuse for the government to delay migration to cloud-based platforms” and “It’s time to see some more progress”, suggesting that the onus is on government to move forward.
I do agree that, leveraged properly, there’s much more value to be extracted from the cloud by government. Further, I agree that cloud technologies have sufficiently matured to the point that it is feasible to consider migrating mission critical applications. Yet, is it possible that the government’s “fear of commitment” is, in some ways, justified?
Consider this stat from the MeriTalk report – only half (53%) of the respondents rate their experience with the cloud as very successful. That suggests the experience of the other half, as MeriTalk words it, “leave(s) something to be desired.” If I’m a government decision maker and I’m tasked with keeping mission critical systems up and sensitive data safe, am I going to jump at the opportunity to leverage an approach that only half of my peers are satisfied with? Maybe, maybe not.
Now factor this in:
- 53% are concerned about being locked into a contract where the average term is 3.6 years
- 58% believe cloud providers do not provide standardized services, thus creating lock in
Back to playing government decision maker, if I do opt to move applications to the cloud, once I get there, I’m bound to that particular provider – contractually and, at least to some extent, technologically. How comfortable am I with the notion of rewriting/rehosting my mission-critical, custom application to run in XYZ cloud? Good question, right?
Inevitably, government agencies will end up with mission-critical systems and sensitive data in the cloud. However, successful “marriages” are hard, making them a bit of a rare commodity.
Do I believe government has a “fear of commitment”? Nah, I just see their behavior as prudent caution on their way to the altar.
Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fund raising for at least one charity. Similarly, IoT could prove transformational to the business of the public sector, should government step up to the challenge.
Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming, televisions, watches, Google Glass, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations, which are connecting people to places, data and other people at a record pace.
It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connecting to the Internet interact and share data, regardless of their location, manufacturer or format and make or find connections that may have been previously undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.
One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.
This is the Internet of Things for government. This is the challenge. This is transformation.
What springs to mind when you think about old applications? What happens to them when they have outlived their usefulness? Do they finally get to retire and have their day in the sun, or do they tenaciously hang on to life?
Think for a moment about your situation and of those around you. From the time work started you have been encouraged, and sometimes forced, to think about, plan for and fund your own retirement. Now consider the portfolio your organization has built up over the years: hundreds or maybe thousands of apps, spread across numerous platforms and locations – a mix of home-grown applications built with best-in-breed tools and packages acquired from the leading application vendors.
Evaluating Your Current Situation
- Do you know how many of those “legacy” systems are still running?
- Do you know how much these apps are costing?
- Is there a plan to retire them?
- How is the execution tracking to plan?
Truth is, even if you have a plan, it probably isn’t going well.
Providing better citizen service at a lower cost
This is something every state and local organization aspires to do. Many organizations are spending 75% or more of their budgets just keeping the lights on – maintaining existing applications and infrastructure. Being able to fully retire some, or many, of these applications saves significant money. Do you know how much these applications are costing your organization? Don’t forget to include the whole range of costs that applications incur – the physical infrastructure such as mainframes, networks and storage, as well as the required software licenses and, of course, the time of the people who actually keep them running. What happens when those with COBOL and CICS experience retire? Usually the answer is not good news. There is a lot to consider and many benefits to be gained through an effective application retirement strategy.
An August 2011 report by ESG Global shows that some 68% of organizations had six or more legacy applications running and that 50% planned to retire at least one of those over the following 12-18 months. It would be interesting to see today’s situation and to evaluate how successful these application retirement plans have been.
A common problem is knowing where to start. You know there are applications that you should be able to retire, but planning, building and executing an effective and successful plan can be tough. To help this process we have developed a strategy, framework and solution for effective and efficient application retirement. This is a good starting point on your application retirement journey.
To get a speedy overview, take six minutes to watch this video on application retirement.
We have created a community specifically for application managers in our ‘Potential At Work’ site. If you haven’t already signed up, take a moment and join this group of like-minded individuals from across the globe.
In the other, they hear administrative talk of smaller budgets and scarcer resources.
As stringent requirements for both transparency and accountability grow, this paradox of pressure increases.
Sometimes, the best way to cope is to TALK to somebody.
What if you could ask other data technologists candid questions like:
- Do you think government regulation helps or hurts the sharing of data?
- Do you think government regulators balance the privacy needs of the public with commercial needs?
- What are the implications of big data government regulation, especially for users?
- How can businesses expedite the government adoption of the cloud?
- How can businesses aid in the government overcoming the security risks associated with the cloud?
- How should the policy frameworks for handling big data differ between the government and the private sector?
What if you could tell someone who understood? What if they had sweet suggestions, terrific tips, stellar strategies for success? We think you can. We think they will.
That’s why Twitter needs a #DataChat.
What on earth is a #DataChat?
Good question. It’s a Twitter chat – a public dialog, at a set time, on a set topic. It’s something like a crowd-sourced discussion. Any Twitter user can participate simply by including the applicable hashtag in each tweet. Our hashtag is #DataChat. We’ll connect on Twitter on the third Thursday of each month to share struggles, victories and advice about data governance. We’re going to begin this week, Thursday, April 17, at 3:00 PM Eastern Time. For our first chat, we are going to discuss topics that relate to data technologies in government organizations.
Why don’t you join us? Tell us about it. Mark your calendar. Bring a friend.
Because, sometimes, you just need someone to talk to.
Loraine Lawson does an outstanding job of covering the issues around government use of “data heavy” projects. This includes a report by the government IT site, MeriTalk.
“The report identifies five factors, which it calls the Big Five of IT, that will significantly affect the flow of data into and out of organizations: Big data, data center consolidation, mobility, security and cloud computing.”
MeriTalk surveyed 201 state and local government IT professionals, and found that, while the majority of organizations plan to deploy the Big Five, 94 percent of IT pros say their agency is not fully prepared. “In fact, if Big Data, mobile, cloud, security and data center consolidation all took place today, 89 percent say they’d need additional network capacity to maintain service levels. Sixty-three percent said they’d face network bottleneck risks, according to the report.”
This report states what most who work with the government already know: the government is not ready for the influx of data. Nor is the government ready for the different uses of data, and thus there is a large amount of risk as the amount of data under management within the government explodes.
Add issues with the approaches and technologies leveraged for data integration to the list. As cloud computing and mobile computing continue to rise in popularity, there is not a clear strategy and technology for syncing data in the cloud, or on mobile devices, with data that exists within government agencies. Consolidation won’t be possible without a sound data integration strategy, nor will the proper use of big data technology.
The government sees a huge wave of data heading for it, as well as opportunities with new technology such as big data, cloud, and mobile. However, there doesn’t seem to be an overall plan to surf this wave. According to the report, if they do wade into the big data wave, they are likely to face much larger risks.
The answer to this problem is really rather simple. As the government moves to take advantage of the rising tide of data, as well as new technologies, they need to be funded to get the infrastructure and the technology they need to be successful. The use of data integration approaches and technologies, for example, will return the investment ten-fold, if properly introduced into the government problem domains. This includes integration with big data systems, mobile devices, and, of course, the rising use of cloud-based platforms.
While data integration is not a magic bullet for the government, nor any other organization, the proper and planned use of this technology goes a long way toward reducing the inherent risks that the report identified. Lacking that plan, I don’t think the government will get very far, very fast.
In the first two issues I spent time looking at the need for states to pay attention to the digital health and safety of their citizens, followed by the oft-forgotten need to understand and protect non-production data – data that has often proliferated and is frequently ignored or forgotten.
In many ways, non-production data is simpler to protect. Development and test systems can usually work effectively with realistic but not real PII and realistic but not real volumes of data. On the other hand, production systems need the real production data, complete with the wealth of information that enables individuals to be identified – and therefore present a huge risk. If and when that data is compromised, either deliberately or accidentally, the consequences can be enormous, both in the impact on individual citizens and in the cost of remediation to the state. Many will remember the massive South Carolina data breach of late 2012, when, over the course of 2 days, a 74 GB database was downloaded and stolen: around 3.8 million taxpayers and 1.9 million dependents had their Social Security information stolen, and 3.3 million bank account details were “lost”. The citizens’ pain didn’t end there, as the company South Carolina picked to help its citizens seems to have tried to exploit the situation.
The biggest problem with securing production data is that there are numerous legitimate users and uses of that data, and most often just a small number of potentially malicious or accidental attempts at inappropriate or dangerous access. So the question is… how does a state agency protect its citizens’ sensitive data while at the same time ensuring that legitimate uses and users continue – without performance impacts or any disruption of access? Obviously each state needs to make its own determination as to what approach works best for it.
This video does a good job of explaining the scope of the overall data privacy/security problem and also reviews a number of successful approaches to protecting sensitive data in both production and non-production environments. What you’ll find is that database encryption is just the start: it is fine if the database itself is “stolen” (unless, of course, the key is stolen along with the data). Encryption locks the data away in the same way that a safe protects physical assets – but the same problem exists: if the key is stolen with the safe, then all bets are off. And legitimate users are usually able, deliberately or accidentally, to breach and steal the sensitive contents; it’s these latter occasions we need to understand and protect against. Given that the majority of data breaches are “inside jobs”, we need to ensure that authorized users (end users, DBAs, system administrators and so on) with legitimate access only have access to the data they absolutely need – no more and no less.
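As a toy illustration of that least-privilege idea, the sketch below masks sensitive fields at read time based on the caller’s role. The field names and role names are invented for the example; commercial dynamic data masking products achieve this by rewriting queries in flight, not in application code.

```python
# Illustrative role-based dynamic masking: sensitive values are masked
# before being returned unless the caller's role has a legitimate need.
SENSITIVE_FIELDS = {"ssn", "bank_account"}   # hypothetical field names
PRIVILEGED_ROLES = {"caseworker"}            # hypothetical role

def mask(value):
    """Show only the last four characters, SSN-style."""
    return "XXX-XX-" + value[-4:] if value else value

def fetch_citizen(record, role):
    """Return the record, masking sensitive fields for non-privileged roles."""
    if role in PRIVILEGED_ROLES:
        return record
    return {k: (mask(v) if k in SENSITIVE_FIELDS else v)
            for k, v in record.items()}

row = {"name": "Jane Doe", "ssn": "123-45-6789"}
print(fetch_citizen(row, "dba"))         # sensitive value masked for a DBA
print(fetch_citizen(row, "caseworker"))  # full value for legitimate use
```

The design point is that the same query path serves everyone; only the visibility of sensitive attributes changes, so legitimate access continues undisturbed.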
So we have reached the end of the first series. In the first blog we looked at the need for states to place the same emphasis on the digital health and welfare of their citizens as they do on their physical and mental health. In the second we looked at the oft-forgotten area of non-production (development, testing, QA, etc.) data. In this third and final piece we looked at the need for, and some options for, providing complete protection of production data.
In my first article on the topic of citizens’ digital health and safety we looked at the states’ desire to keep their citizens healthy and safe and also at the various laws and regulations they have in place around data breaches and losses. The size and scale of the problem together with some ideas for effective risk mitigation are in this whitepaper.
Let’s now start delving a little deeper into the situation states are faced with. It’s pretty obvious that citizen data that enables an individual to be identified (PII) needs to be protected. We immediately think of the production data: data that is used in integrated eligibility systems; in health insurance exchanges; in data warehouses and so on. In some ways the production data is the least of our problems; our research shows that the average state has around 10 to 12 full copies of data for non-production (development, test, user acceptance and so on) purposes. This data tends to be much more vulnerable because it is widespread and used by a wide variety of people – often subcontractors or outsourcers, and often the content of the data is not well understood.
Obviously production systems need access to real production data (I’ll cover how best to protect that in the next issue); non-production systems of every sort do not. Non-production systems most often need realistic, but not real, data and realistic, but not real, data volumes (except maybe for the performance/stress/throughput testing system). What needs to be done? A three-point risk remediation plan is a good place to start.
1. Understand the non-production data. Sophisticated data and schema profiling, combined with NLP (Natural Language Processing) techniques, helps to identify previously unrecognized PII that needs protecting.
2. Permanently mask the PII so that it is no longer the real data but is realistic enough for non-production uses and make sure that the same masking is applied to the attribute values wherever they appear in multiple tables/files.
3. Subset the data to reduce data volumes; this limits the size of the risk and also has positive effects on performance, run times, backups, etc.
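Step 2’s requirement that the same masking be applied wherever a value appears can be met with deterministic masking. The sketch below is a simplified illustration only (in practice a keyed HMAC or format-preserving encryption would be stronger); the salt value and the sampling rule are invented for the example.

```python
import hashlib

def mask_ssn(ssn, salt="demo-salt"):
    """Deterministically derive a fake-but-well-formed SSN from a real one.
    The same input always yields the same output, so the masked value
    stays consistent across every table and file it appears in."""
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    # Pull nine digits out of the hex digest to rebuild an SSN-shaped value.
    digits = "".join(c for c in digest if c.isdigit())[:9].ljust(9, "0")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

# Consistency check: joins between masked tables still line up.
assert mask_ssn("123-45-6789") == mask_ssn("123-45-6789")

def subset(rows, keep_every=10):
    """Step 3 in miniature: a deterministic sample shrinks volumes
    (real subsetting must also preserve referential integrity)."""
    return rows[::keep_every]

print(mask_ssn("123-45-6789"))
print(len(subset(list(range(1000)))))  # 100 rows kept
```

Because the masking is a pure function of the input, a citizen’s masked identifier is identical in every development and test copy, which is exactly what keeps cross-table relationships testable without exposing real PII.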
Gartner has just published its 2013 Magic Quadrant for Data Masking, which covers both what Gartner calls static (i.e., permanent or persistent) masking and dynamic masking (more on this in the next issue). As usual, the MQ gives a good overview of the issues behind the technology as well as a review of the position, strengths and weaknesses of the leading vendors.
It is (or at least should be) an imperative that, from the top down, state governments realize the importance and vulnerability of their citizens’ data and put in place a non-partisan plan to prevent future breaches. As the reader might imagine, for any such plan to succeed it needs a combination of cultural and organizational change (getting people to care) and the right technology in place – together these will greatly reduce the risk. In the next and final issue on this topic we will look at the vulnerabilities of production data, and what can be done to dramatically increase its privacy and security.