Category Archives: Public Sector

Where Were You on January 21, 2009?


January 21, 2009. Why in the world would that be a date to recall? Well, for one, it was the day after Barack Obama was inaugurated as the 44th President of the United States. And second, it was the day President Obama released an arguably game-changing document, his Memorandum on Transparency and Open Government. This one document set the stage for a new era in how government would look at the data it collects and creates. Since that time, the world of data has changed dramatically! Consider this – new analytics tools, new data types, new devices creating data, new storage ideas, new visualization applications, new concepts, new laws – the list of innovations goes on.

But, all these great innovations are not really why I’m writing today. Today, I’d like to call your attention to a news article I read in NextGov, “Amid Open Data Push, Agencies Feel Urge for Analytics”. I have to admit, as I read this article, I found myself getting just a little bit giddy. Why? Great question, thanks for asking. :-) Before going on with my thoughts, please take a moment to read the article. Go ahead, I have time. I’ll wait.

Picking up where I left off…

Since 2009, the notion of “open data” has been discussed primarily from one of two main perspectives:

  • Transparency of government to citizens – Accountability
  • What the private sector can do – Innovation

No doubt, there have been significant advances on both of these topics. Yet, as important as these concepts are, budget and resource constraints can cause open data efforts to be prioritized lower than, say, a mission-critical program.

Of course, I get this – mission first – but a couple years ago it hit me: maybe government agencies are not seeing a potential opportunity that’s sitting right in front of them. Along with the mandate to publish open data comes the opportunity to consume open data and get it into their analytics engines, thus supporting the agency’s mission! Just this slight mind shift has the potential to turn open data initiatives into a means to create value. Now do you see why I am excited by the article? (If not, I’ll assume you’ve yet to read it.) I’m thrilled to see agencies adding a third perspective to the open data conversation:

  • Consumption of open data – Improving an agency’s ability to deliver on its mission(s)
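To make “consumption” a little more concrete, here is a minimal sketch of what pulling open data into an analytics pipeline can look like. It assumes a CKAN-style catalog API (catalog.data.gov exposes one); the search query is a placeholder, and all of this is illustrative rather than any agency’s actual integration.

```python
# Minimal sketch: discover open data resources for downstream analytics.
# Assumes a CKAN-style catalog such as the one behind catalog.data.gov.
import requests

CKAN_SEARCH = "https://catalog.data.gov/api/3/action/package_search"

def find_csv_resources(query, rows=5):
    """Search the catalog and return the URLs of CSV resources."""
    resp = requests.get(CKAN_SEARCH, params={"q": query, "rows": rows}, timeout=30)
    resp.raise_for_status()
    urls = []
    for package in resp.json()["result"]["results"]:
        for resource in package.get("resources", []):
            if resource.get("format", "").upper() == "CSV":
                urls.append(resource["url"])
    return urls

if __name__ == "__main__":
    # "procurement spending" is an illustrative query, not a real dataset name
    for url in find_csv_resources("procurement spending"):
        print(url)  # feed these into the agency's analytics engine
```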

I am looking forward to following the success of any agency effort to take advantage of open data as a strategic resource. If you have other examples beyond the cases noted in the NextGov article, please share!


All You Need is Love! (Well, That and Some Software)

That’s right, Valentine’s Day is upon us, the day that symbolizes the power of love and has the ability to strengthen relationships between people. I’ve personally experienced 53 Valentine’s Days, so I believe I speak with no small measure of authority on the topic of how to make the best of it. Here are my top five suggestions for having a great day:

  1. Know everything you can about the people you have relationships with
  2. Quality matters
  3. ALL your relationships matter
  4. Uncover your hidden or anonymous relationships
  5. Treat your relationships with respect all year long

OK, I admit, this is not the most romantic list ever and might get you in more trouble with your significant other than actually forgetting Valentine’s Day altogether! But, what did you expect? I work for a software company, not eHarmony! :-)

Right. Software. Let’s put this list into the context of government agencies.

  1. Know everything – If your agency’s mission involves delivering services to citizens, you likely have multiple “systems of record”, each with a supposedly accurate record of all the people being tracked by that system. In reality, though, it’s rare that the data about individuals is consistently accurate and complete from system to system. The ability to centralize all the data about individuals into a single, authoritative “record” is key to improving service delivery (a toy sketch of this idea follows this list). Such a record will enable you to ensure the citizens you serve are able to take full advantage of all the services available to them. Further, having a single record for each citizen has the added benefit of reducing fraud, waste and abuse.
  2. Quality matters – Few things hinder the delivery of services more than bad data, data with errors, inconsistencies and gaps in completeness. It is difficult, at best, to make sound business decisions with bad data. At the individual level and at the macro level, agency decision makers need complete and accurate data to ensure each citizen is fully served.
  3. All relationships matter – In this context, going beyond having single records to represent people, it’s also important to have single, authoritative views of other entities – programs, services, providers, deliverables, places, etc.
  4. Uncover hidden relationships – Too often, in the complex ecosystem of government programs and services, the inability to easily recognize relationships between people and the other entities mentioned above creates inefficiencies in the “system”. For example, it can go unnoticed that a single parent is not enrolled in a special program designed for their unique life circumstances. On the flip side, not having a full view of hidden relationships also opens the door for the less scrupulous in society, giving them the ability to hide their fraudulent activities in plain sight.
  5. Treat relationships respectfully all year – Data hygiene is not a one-time endeavor. Having the right mindset, processes and tools to implement and automate the process of “mastering” data as an on-going process will better ensure the relationship between your agency and those it serves will remain positive and productive.
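As promised above, here is a toy illustration of the “single, authoritative record” idea. Real MDM products use probabilistic matching, survivorship rules and stewardship workflows; this sketch only shows the core merge intuition, and every field name is hypothetical.

```python
# Toy sketch of building a "golden record" from two systems of record.
# Real MDM platforms use probabilistic matching and survivorship rules;
# this only shows the core idea. All field names are hypothetical.
def normalize(rec):
    """Crude match key: lowercased, trimmed name plus date of birth."""
    return (rec["name"].strip().lower(), rec["dob"])

def merge(records):
    """Collapse duplicate citizens into one record, preferring filled fields."""
    golden = {}
    for rec in records:
        merged = golden.setdefault(normalize(rec), {})
        for field, value in rec.items():
            if value and not merged.get(field):  # first non-empty value wins
                merged[field] = value
    return list(golden.values())

benefits_system = [{"name": "Ann Lee ", "dob": "1980-01-02", "phone": ""}]
tax_system      = [{"name": "ann lee",  "dob": "1980-01-02", "phone": "555-0100"}]
print(merge(benefits_system + tax_system))
# -> one citizen record with both the name and the phone number populated
```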

I may not win the “Cupid of the Year” award, but, I hope my light-hearted Valentine’s Day message has given you a thing or two to think about. Maybe Lennon and McCartney are right, between people, “Love is All You Need”. But, we at Informatica believe for Government-Citizen relationships, a little of the right software can go a long way.

Care to learn more?


Government and Cloud – Their Journey to the Altar

If you work for or with the government and you care about the cloud, you’ve probably already read the recent MeriTalk report, “Cloud Without the Commitment”. As well, you’ve probably read numerous opinions about the report. In fact, one of Informatica’s guest bloggers, David Linthicum, just posted his thoughts. As I read the report and the various opinions, I was struck by the perhaps unintentional suggestion that (sticking with MeriTalk’s dating metaphor) the “commitment issues” are a government problem. Mr. Linthicum’s perspective is that “there is really no excuse for the government to delay migration to cloud-based platforms” and “It’s time to see some more progress”, suggesting that the onus is on government to move forward.

Hm…

I do agree that, leveraged properly, there’s much more value to be extracted from the cloud by government. Further, I agree that cloud technologies have sufficiently matured to the point that it is feasible to consider migrating mission critical applications. Yet, is it possible that the government’s “fear of commitment” is, in some ways, justified?

Consider this stat from the MeriTalk report – only half (53%) of the respondents rate their experience with the cloud as very successful. That suggests the experience of the other half, as MeriTalk words it, “leave(s) something to be desired.” If I’m a government decision maker and I’m tasked with keeping mission critical systems up and sensitive data safe, am I going to jump at the opportunity to leverage an approach that only half of my peers are satisfied with? Maybe, maybe not.

Now factor this in:

  • 53% are concerned about being locked into a contract where the average term is 3.6 years
  • 58% believe cloud providers do not provide standardized services, thus creating lock in

Back to playing government decision maker, if I do opt to move applications to the cloud, once I get there, I’m bound to that particular provider – contractually and, at least to some extent, technologically. How comfortable am I with the notion of rewriting/rehosting my mission-critical, custom application to run in XYZ cloud? Good question, right?

Inevitably, government agencies will end up with mission-critical systems and sensitive data in the cloud. However, successful “marriages” are hard, making them a bit of a rare commodity.

Do I believe government has a “fear of commitment”? Nah, I just see their behavior as prudent caution on their way to the altar.


Dark Data in Government: Sounds Sinister

Anytime I read about something characterized as “dark”, my mind immediately jumps to a vision of something sneaky or sinister, something better left unsaid or undiscovered. Maybe I watched too many Alfred Hitchcock movies in my youth, who knows. However, when coupled with the word “data”, “dark” is anything BUT sinister. Sure, as you might agree, the word “undiscovered” may still apply, but, only with a more positive connotation.

To level set, let’s make sure you understand my definition of dark data. I prefer using visualizations when I can so, picture this: the end of the first Indiana Jones movie, Raiders of the Lost Ark. In this scene, we see the Ark of the Covenant, stored in a generic container, being moved down the aisle in a massive warehouse full of other generic containers. What’s in all those containers? It’s pretty much anyone’s guess. There may be a record somewhere, but, for all intents and purposes, the materials stored in those boxes are useless.

Applying this to data, once a piece of data gets shoved into some generic container and is stored away, just like the Ark, the data becomes essentially worthless. This is dark data.

Opening up a government agency to all its dark data can have significant impacts, both positive and negative. Here are a couple of initial tips to get you thinking in the right direction:

  1. Begin with the end in mind – identify quantitative business benefits of exposing certain dark data.
  2. Determine what’s truly available – perform a discovery project and seek out data hidden in the corners of your agency – databases, documents, operational systems, live streams, logs, etc. (a small sketch of this step follows the list).
  3. Create an extraction plan – determine how you will get access to the data, how often the data updates, and how you will handle varied formats.
  4. Ingest the data – transform the data if needed, integrate if needed, capture as much metadata as possible (never assume you won’t need a metadata field, that’s just about the time you will be proven wrong).
  5. Govern the data – establish standards for quality, access controls, security protections, semantic consistency, etc. – don’t skimp here, the impact of bad data can never really be quantified.
  6. Store it – it’s interesting how often agencies think this is the first step.
  7. Get the data ready to be useful to people, tools and applications – think about how to minimize the need for users to manipulate data – reformatting, parsing, filtering, etc. – to better enable self-service.
  8. Make it available – at this point, the data should be easily accessible, easily discoverable, easily used by people, tools and applications.
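As a taste of step 2 (and the metadata capture urged in step 4), here is a minimal sketch that inventories files hiding on a shared drive, keeping whatever metadata the filesystem offers. The mount point is a made-up example, and real discovery tools go much further, peering inside databases, logs and streams.

```python
# Minimal sketch of the "discovery" step: inventory files on a share,
# capturing as much metadata as the filesystem gives us for free.
import os, json, time

def discover(root):
    """Walk a file share and record metadata for every file found."""
    catalog = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            catalog.append({
                "path": path,
                "bytes": stat.st_size,
                "modified": time.strftime("%Y-%m-%d", time.localtime(stat.st_mtime)),
                "extension": os.path.splitext(name)[1].lower(),
            })
    return catalog

if __name__ == "__main__":
    inventory = discover("/data/agency_share")  # hypothetical mount point
    print(json.dumps(inventory[:3], indent=2))  # peek at the first few entries
```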

Clearly, there’s more to shining the light on dark data than I can offer in this post. If you’d like to take the next step to learning what is possible, I suggest you download the eBook, The Dark Data Imperative.


Making the Hybrid Cloud Work for Public Sector


If you’ve been working in the government sector for any amount of time, you had to see the advent of the “hybrid cloud” coming. Like all new technologies, when first introduced, “the cloud” was the answer to all your IT woes: cheaper, more reliable, infinitely scalable, instantly adaptable, and so on. But, as time has gone by and many of you have dipped your toes in the water, the reality is beginning to surface, and challenges are beginning to appear. Sure, moving email to the cloud was a great first step, and it certainly gave most agencies the ability to show progress in leveraging the cloud. Yes, archiving data to the cloud is also a good use case and is showing progress. But, what’s next? There are plenty of new SaaS offerings popping up, purpose-built to solve various public sector challenges, and yes, they are generally decent applications. Yet, would it be fair to suggest new challenges are arising as your agency begins to adopt new cloud solutions? In particular, has the advent of specialized applications for government made your overall IT portfolio simpler or more complex? Government has always struggled with a vast array of siloed systems, and isn’t the cloud creating yet more challenges in this regard? Well, maybe. Let’s take a look.

What I love about the cloud is that it has something of value to offer practically any government organization, regardless of size, maturity, point of view or approach. Even for the most conservative IT shops, there are use cases that just plain make sense. And with the growing availability of FedRAMP-certified offerings, it’s becoming easier to procure. But, thinking realistically, for reasons of law, budget, time and architecture, we know the cloud will not be the solution for every public sector problem. Some applications and some data will never leave your agency’s premises. And herein lies the new complexity. You have applications and data on-prem. You have applications and data in the cloud. And you have business requirements that require these apps to work together, to share data.

So, now that you have a hybrid environment, what can you do about it? Let’s face it, we can talk about technology, architecture and approaches all day long, but it always comes down to this: what should be done with the data? You need answers to questions such as: Is it safe? Is it accessible? Is it reliable? How do I know if its integrity has been compromised? What about the quality? How error-prone is the data? How complete is the data? How do we manage it across this new hybrid landscape? How can I get data from a public cloud application to my on-prem data warehouse? How can I leverage the flexibility of public IaaS to build a new application that needs access to data that is also required by an on-prem legacy application?

I know many government IT professionals are wrestling with these questions and seeking solutions. So, here’s an interesting thought. Most of these questions are not exactly new; they are just taking on the added context of the cloud. Prior to the cloud, many agencies discovered answers in the form of a data integration platform. The platform is used to ensure every application and every user has access to the data they need to perform their mission or job. I think of it this way: the platform is a “standardized” abstraction layer that ensures all your data gets to where it needs to be, when it needs to be there, in the form it needs to be in. There are hundreds of government IT shops using such an approach.

Here’s the good news. This approach to integrating data can be extended to include the cloud. Imagine placing “agents” in all the places where your data needs to live, the agents capable of communicating with each other to integrate, alter or move data. Now add to this the idea of a cloud-based remote control that allows you to control all the functions of the agents (a toy sketch of this pattern follows). Using such a platform enables your agency to tie on-prem systems to cloud systems, minimizing the effect of having multiple silos of information. Government workers and warfighters will have the ability to more quickly get complete, accurate data, regardless of where it originates, and citizens will benefit from more effectively delivered services.
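Here is a deliberately tiny sketch of that agent-plus-remote-control pattern. The controller URL, task format and endpoints are all invented for illustration; a real integration platform’s protocol, security and scheduling will differ substantially.

```python
# Toy sketch of the agent pattern: each agent, wherever data lives, polls a
# cloud-hosted controller for data-movement tasks. Everything here
# (URL, task shape, endpoints) is hypothetical.
import time
import requests

CONTROLLER = "https://controller.example.gov/api/tasks"  # hypothetical service

def run_agent(agent_id):
    """Poll the remote control for work; execute and acknowledge each task."""
    while True:
        task = requests.get(f"{CONTROLLER}?agent={agent_id}", timeout=30).json()
        if task:
            # e.g. {"id": 17, "action": "copy",
            #       "source": "onprem.db.cases", "target": "cloud.dw.cases"}
            print(f"{agent_id}: {task['action']} {task['source']} -> {task['target']}")
            # ...perform the actual extract/transform/load work here...
            requests.post(f"{CONTROLLER}/{task['id']}/done", timeout=30)
        time.sleep(10)  # poll interval
```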

How would such an approach change your ideas on how to leverage the cloud for your agency? If you live near the Washington, DC area, you may wish to drop in on the Government Cloud Computing and Data Center Conference & Expo. One of my colleagues, Ronen Schwartz, will be discussing this topic. For those not in the vicinity, you can learn more here.


Is the Internet of Things relevant for the government?

Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fundraising for at least one charity. Similarly, IoT may prove transformational to the business of the public sector, should government step up to the challenge.


Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming, televisions, watches, Google Glass, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations, which are connecting people to places, data and other people at a record pace.

It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connected to the Internet interact and share data, regardless of their location, manufacturer or format, and make or find connections that may have been previously undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.

One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.

This is the Internet of Things for government. This is the challenge. This is transformation.

This article was originally published on www.federaltimes.com. Please view the original listing here.

 


One Search Procurement – For the Purchasing of Indirect Goods and Services


Informatica Procurement is the internal Amazon for purchasing MRO items, C-goods, indirect materials and services. It supports enterprise companies with an industry-independent catalog procurement solution that enables fast, cost-efficient procurement of products and services, plus supplier integration, in an easy-to-use self-service concept.

Informatica Procurement at a glance


Informatica recently announced the availability of Informatica Procurement 7.3, the catalog procurement solution. I met with Melanie Kunz, our product manager, to learn from her what’s new.

Melanie, for our readers and followers, who is using Informatica Procurement, and for which purposes?


Melanie Kunz: Informatica Procurement is industry-independent. Our customers are based in different industries – from engineering and automotive to companies in the public sector (e.g. cities). The responsibilities of the people who work with Informatica Procurement differ from company to company. For some customers, only employees from the purchasing department order items in Informatica Procurement. For other customers, all employees are allowed to order what they need themselves. Examples are employees who need screws to complete their product, or office staff who order business cards for a manager.

What is the most important thing to know about Informatica Procurement 7.3?

Melanie Kunz: In companies where a lot of IT equipment is ordered, it is important to always see the current prices. With each price change, the catalog would otherwise have to be re-imported into Informatica Procurement. With a punch-out to the IT equipment manufacturer’s online shop, this is much easier and more efficient. The data from these catalogs is all available in Informatica Procurement, but the price can be retrieved from the online shop on a daily basis.

Users no longer need to leave Informatica Procurement to order items from external online shops. Informatica Procurement now enables the user to locate internal and indexed external items in just one search. That means you do not have to use different eShops when you order new office stationery, IT equipment or services.

Great, what is the value for enterprise users and purchasing departments?

Melanie Kunz: All items in Informatica Procurement carry the negotiated prices. Informatica Procurement is so simple and intuitive that each employee can use the system without training. The view concept allows restricting which products are visible: for each employee (or each department), the administrator can define a view that contains only the products that may be seen and ordered.

When you open the detail view for an indexed external item, the current price is retrieved from the external online shop. This price is saved in the item detail view for a defined period, so the user always gets a current price for the item. (A tiny sketch of this caching idea follows.)
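For the technically curious, the price behavior Melanie describes resembles a time-to-live cache. This is only my illustrative sketch of the concept; the function names, the one-day TTL and the mechanics are assumptions, not the product’s actual implementation.

```python
# Minimal sketch of the punch-out price idea: fetch the current price from
# the external shop, then reuse it for a defined period before refreshing.
import time

CACHE_TTL_SECONDS = 24 * 3600  # "defined period": one day, as an example
_price_cache = {}  # item_id -> (price, fetched_at)

def current_price(item_id, fetch_price):
    """Return the cached price, refreshing it from the shop when stale."""
    cached = _price_cache.get(item_id)
    if cached and time.time() - cached[1] < CACHE_TTL_SECONDS:
        return cached[0]
    price = fetch_price(item_id)  # the punch-out call to the external shop
    _price_cache[item_id] = (price, time.time())
    return price

# Dummy fetcher standing in for the external shop's punch-out call
print(current_price("laptop-15", lambda item: 899.00))
```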

The newly designed detail view has an elegant and clear layout, ensuring a high level of user experience. This also applies to the possibility of image enlargement in the search result list.

What if I order same products frequently, like my business cards?

Melanie Kunz: The overview of recent shopping carts helps users reorder the same items in an easy and fast way. A shopping cart from a previous order can be used as the basis for a new order.

Large organizations with thousands of employees are likely to have totally different needs for their daily business, perhaps depending on role or career level. How do you address this?

Melanie Kunz: The standard assortment feature has been enhanced in Informatica Procurement 7.3. Administrators can define the assortment per user. Furthermore, it is possible to specify whether users have to search the standard assortment first and only search in the entire assortment if they do not find the relevant item in the standard assortment.

All of these features and many more minor features not only enhance the user experience, but also reduce the processing time of an order drastically.

Informatica Procurement 7.3 “One Search” at a glance


Learn more about Informatica Procurement 7.3 in the latest webinar.


Is Big Data Good or Evil? Maybe Neither?

I just finished reading a great article from one of my former colleagues, Bill Franks. He makes a strong argument that Big Data is not inherently good or evil any more than money is. What makes Big Data (or any data, as I see it) take on a characteristic of good or evil is how it is used. Same as money, right? Here’s the rest of Bill’s article.

Bill framed his thoughts within the context of a discussion with a group of government legislators who I would characterize based on his commentary as a bit skittish of government collecting Big Data. Given many recent headlines, I sincerely do not blame them for being concerned. In fact, I applaud them for being cautious.

At the same time, while Big Data seems to be the “type” of data everyone wants to speak about, the scope of the potential problem extends to ALL data. Just because a particular dataset is highly structured in a 20-year-old schema does not exclude it from misuse. I believe structured data has been around for so long that people are comfortable with (or have forgotten about) the associated risks.

Any data can be used for good or ill. Clearly, it does not make sense to take the position that “we” should not collect, store and leverage data based on the notion someone could do something bad.

I suggest the real conversation should revolve around access to data. Bill touches on this as well. Far too often, data, whether Big Data or “traditional”, is openly accessible to people who truly have no need for it based on their job function.

Consider this example – a contracted application developer in a government IT shop is working on the latest version of an existing application for agency case managers. To test the application and get it successfully through a rigorous quality assurance process, the developer needs a representative dataset. And where does this data come from? It is usually copied from live systems, with personally identifiable information still intact. Not good.

Another example – creating a 360-degree view of the citizens in a jurisdiction, to be shared cross-agency, can certainly be an advantageous situation for citizens and government alike. For instance, citizens can be better served, getting more of what they need, while agencies can better protect against fraud, waste and abuse. Practically any agency serving the public could leverage the data to better serve and protect. However, this is a recognized sticky situation. How much data does a case worker from the Department of Human Services need versus a law enforcement officer or an emergency services worker? The way this has been addressed for years is to create silos of data, carrying with it its own host of challenges. However, as technology evolves, so too should process and approach.

Stepping back and looking at the problem from a different perspective, both examples above, different as they are, can be addressed by incorporating a layer of data security directly into the architecture of the enterprise. Rather than rely on a hodgepodge of data security mechanisms built into point applications and siloed systems, create a layer through which all data, Big or otherwise, is accessed.


Through such a layer, data can be persistently and/or dynamically masked based on the needs and role of the user. In the first example of the developer, this person would not want access to a live system to do their work. However, the ability to replicate the working environment of the live system is crucial. So, in this case, live data could be masked or altered in a permanent fashion as it is moved from production to development. Personally identifiable information could be scrambled or replaced with XXXXs. Now developers can do their work and the enterprise can rest assured that no harm can come from anyone seeing this data.

Further, through this data security layer, data can be dynamically masked based on a user’s role, leaving the original data unaltered for those who do require it. There are plenty of examples of how this looks in practice, think credit card numbers being displayed as xxxx-xxxx-xxxx-3153. However, this is usually implemented at the application layer and considered to be a “best practice” rather than governed from a consistent layer in the enterprise.
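To make the role-based masking idea tangible, here is a toy sketch. The roles, the rule table and the mask-by-default choice are my own illustrative assumptions; an enterprise masking layer enforces this centrally, at the data access layer, rather than in application code like this.

```python
# Toy sketch of role-based dynamic masking, as in the credit card example:
# the stored value is untouched; what each role sees is decided at read time.
# Roles and rules here are invented for illustration.
def mask_card(number):
    """Show only the last four digits, xxxx-xxxx-xxxx-3153 style."""
    last4 = number.replace("-", "")[-4:]
    return f"xxxx-xxxx-xxxx-{last4}"

RULES = {
    "developer":          mask_card,        # never sees real PII
    "case_worker":        mask_card,
    "fraud_investigator": lambda v: v,      # job function requires real data
}

def read_field(role, value):
    """Apply the role's masking rule; unknown roles are masked by default."""
    return RULES.get(role, mask_card)(value)

card = "4539-1488-0343-3153"
print(read_field("developer", card))           # xxxx-xxxx-xxxx-3153
print(read_field("fraud_investigator", card))  # 4539-1488-0343-3153
```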

The time to re-think the enterprise approach to data security is here. With such a layer properly implemented and deployed, many of the arguments against collecting, integrating and analyzing data from anywhere are addressed. No doubt, having an active discussion on the merits and risks of data is prudent and useful. Yet, perhaps it should not be a conversation about whether to save data; it should be a conversation about access.


Top 5 Data Themes in Emerging Markets


Recently, my US-based job led me to a South African hotel room, where I watched Germany play Brazil in the World Cup. The global nature of the event was familiar to me. My work covers countries like Malaysia, Thailand, Singapore, South Africa and Costa Rica. And as I pondered the stunning score (Germany won, 7 to 1), my mind was drawn to emerging markets. What defines an emerging market? In particular, what are the data-related themes common to emerging markets? Because I work with global clients in the banking, oil and gas, telecommunications, and retail industries, I have learned a great deal about this. As a result, I wanted to share my top 5 observations about data in Emerging Markets.

1) Communication Infrastructure Matters

Many of the emerging markets, particularly in Africa, jumped from one or two generations of telco infrastructure directly into 3G and fiber within a decade. However, this truth only applies to large, cosmopolitan areas. International diversification of fiber connectivity is only starting to take shape. (For example, in Southern Africa, BRICS terrestrial fiber is coming online soon.) What does this mean for data management? First, global connectivity influences domestic last mile fiber deployment to households and businesses. This, in turn, will create additional adoption of new devices. This adoption will create critical mass for higher productivity services, such as eCommerce. As web based transactions take off, better data management practices will follow. Secondly, European and South American data centers become viable legal and performance options for African organizations. This could be a game changer for software vendors dealing in cloud services for BI, CRM, HCM, BPM and ETL.

2) Competition in Telecommunication Matters

If you compare basic wireless and broadband bundle prices between the US, the UK and South Africa, for example, the lack of true competition makes further coverage upgrades, like 4G and higher broadband bandwidths, easy to digest for operators. These upgrades make telecommuting and constant social media engagement possible. Keeping prices low, as in the UK, is the flip side, achieving the same result. The worst case is high prices and low bandwidth from the last mile to global nodes. This also creates low infrastructure investment and thus fewer consumers online for fewer hours. This is often the case in geographically vast countries (Africa, Latin America) with vast rural areas. Here, data management is an afterthought for the most part. Data is intentionally kept in application silos as these are the value creators. Hand coding is pervasive to string data together to make small moves to enhance the view of a product, location, consumer or supplier.

3) A Nation’s Judicial System Matters

If you do business in nations with a long, often British judicial tradition, chances are investment will happen. If you have such a history but it is undermined by a parallel history of graft from the highest to the lowest levels because of the importance of tribal traditions, only natural resources will save your economy. Why does it matter if one of my regional markets is “linked up” but shipping logistics are burdened by this excess cost and delay? The impact on data management is a lack of use cases supporting an enterprise-wide strategy across all territories. Why invest if profits are unpredictable or too meager? This is why small Zambia or Botswana are ahead of the largest African economy, Nigeria.

4) Expertise Location Matters

Anybody can have the most advanced vision of a data-driven, event-based architecture supporting the fanciest data movement and persistence standards. Without the skill to make the case to the business, it is a lost cause – unless your local culture still has IT in charge of specifying requirements, running the evaluation, and selecting and implementing a new technology. It is also done for if there are no leaders who have experienced how other leading firms in the same or a different sector went about it (un)successfully. Lastly, if you don’t pay for skill, your project failure risk just tripled. Duh!

5) Denial is Universal

No matter if you are an Asian oil company, a regional North American bank, a Central American national bank or an African retail conglomerate: if finance or IT invested in technologies before and saw a lack of adoption, for whatever reason, they will deny data management challenges despite other departments complaining. Moreover, if system integrators or internal client staff (mis)understand data management as fixing processes (which it is not) instead of supporting transactional integrity (which it is), clients are on the wrong track. Here, data management undeservedly becomes a philosophical battleground.

This is definitely not a complete list or a super-thorough analysis, but I think it covers the most crucial observations from my engagements. I would love to hear about your findings in emerging markets.

Stay tuned for part 2 of this series where I will talk about the denial and embrace of corporate data challenges as it pertains to an organization’s location.


Application Retirement: Old Applications, and Their Place In The Sun

What springs to mind when you think about old applications? What happens to them when they have outlived their usefulness? Do they finally get to retire and have their day in the sun, or do they tenaciously hang on to life?

Think for a moment about your situation and of those around you. From the time work started, you have been encouraged and sometimes forced to think about, plan for and fund your own retirement. Now consider the portfolio your organization has built up over the years: hundreds or maybe thousands of apps, spread across numerous platforms and locations – a mix of home-grown applications built with best-of-breed tools and applications acquired from the leading vendors.

Evaluating Your Current Situation

  • Do you know how many of those “legacy” systems are still running?
  • Do you know how much these apps are costing?
  • Is there a plan to retire them?
  • How is the execution tracking to plan?

Truth is, even if you have a plan, it probably isn’t going well.

Providing better citizen service at a lower cost

This is something every state and local organization aspires to do by reducing costs. Many organizations are spending 75% or more of their budgets just keeping the lights on – maintaining existing applications and infrastructure. Being able to fully retire some, or many, of these applications saves significant money. Do you know how much these applications are costing your organization? Don’t forget to include the whole range of costs that applications incur – the physical infrastructure costs such as mainframes, networks and storage, the required software licenses and, of course, the time of the people who actually keep them running (a back-of-the-envelope sketch of this kind of accounting follows). What happens when those with COBOL and CICS experience retire? Usually the answer is not good news. There is a lot to consider and many benefits to be gained through an effective application retirement strategy.
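As a crude illustration of that cost accounting, the sketch below totals the carrying cost of one hypothetical legacy application. Every figure is invented; the point is only that the cost categories named above add up quickly when you tally them.

```python
# Back-of-the-envelope sketch of "what is this application costing us?"
# Every figure below is a made-up placeholder; plug in your own numbers.
legacy_app_costs = {
    "mainframe_mips_share": 180_000,  # infrastructure: share of the mainframe
    "storage_and_network":   40_000,
    "software_licenses":     75_000,
    "support_staff":        150_000,  # the people who keep it running
}

annual_cost = sum(legacy_app_costs.values())
print(f"Annual carrying cost: ${annual_cost:,}")                 # $445,000
print(f"Ten-year cost if never retired: ${annual_cost * 10:,}")  # $4,450,000
```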

An August 2011 report by ESG Global showed that some 68% of organizations had six or more legacy applications running, and that 50% planned to retire at least one of those over the following 12-18 months. It would be interesting to see today’s situation and to evaluate how successful these application retirement plans have been.

A common problem is knowing where to start. You know there are applications that you should be able to retire, but planning, building and executing an effective and successful plan can be tough. To help this process, we have developed a strategy, framework and solution for effective and efficient application retirement. This is a good starting point on your application retirement journey.

To get a speedy overview, take six minutes to watch this video on application retirement.

We have created a community specifically for application managers in our ‘Potential At Work’ site. If you haven’t already signed up, take a moment and join this group of like-minded individuals from across the globe.
