Category Archives: Application Retirement
ROI: every executive’s favorite acronym, and one that is often challenging to demonstrate.
In our interactions with provider clients and prospects we are hearing that they’ve migrated to new EMRs but aren’t receiving the ROI they had budgeted or anticipated. In many cases, they are using the new EMR for documentation but still paying to maintain the legacy EMR for access to historical data for billing and care delivery. If health systems can retire these applications and still maintain operational access to the data, they will be able to realize the expected ROI and serve patients proactively.
My colleague Julie Lockner wrote a blog post about how Informatica Application Retirement for Healthcare is helping healthcare organizations retire legacy applications and realize ROI.
Healthcare organizations are currently engaged in major transformative initiatives. The American Recovery and Reinvestment Act of 2009 (ARRA) gave the healthcare industry incentives to adopt and modernize point-of-care computing solutions, including electronic medical and health records (EMRs/EHRs). Funds have been allocated, and these projects are well on their way. In fact, the majority of hospitals in the US are engaged in implementing Epic, a software platform that is essentially the ERP for healthcare.
These Cadillac systems are being deployed from scratch, with very little data being ported from the old systems into the new. The result is a glut of legacy applications running in aging hospital data centers, consuming every last penny of HIS budgets. Because the data still resides on those systems, hospital staff continue to use them, making them difficult to shut down or retire.
Most of these legacy systems do not run on modern technology platforms – they run on the likes of HP TurboIMAGE, MUMPS-based databases such as InterSystems Caché, and embedded proprietary databases. Finding people who know how to manage and maintain these systems is costly and risky: risky because data residing in those applications may be subject to data retention requirements (patient records, etc.), and if the skills disappear, that data becomes inaccessible.
A different challenge for CFOs of these hospitals is the ROI on these Epic implementations. Because these are multi-phase, multi-year projects, boards of directors are asking about the value realized from these investments. Many are coming up short because they are maintaining both applications in parallel. Relief will come when the legacy systems can be retired – but getting hospital staff and regulators to approve a retirement project requires evidence that staff can still access the data while adhering to compliance requirements.
Many providers have overcome these hurdles by successfully implementing an application retirement strategy based on the Informatica Data Archive platform. Several of the largest pediatric hospitals in the US are either already saving or expect to save $2 million or more annually by retiring legacy applications. The savings come from:
- Eliminating software maintenance and license costs
- Eliminating hardware dependencies and costs
- Reducing storage requirements by 95% (archived data is stored in a highly compressed, accessible format)
- Improving IT efficiency by eliminating specialized processes and skills associated with legacy systems
- Freeing IT resources – teams can spend more of their time on innovations and new projects
Informatica Application Retirement Solutions for Healthcare give hospitals the ability to completely retire legacy applications while maintaining access to the archived data for hospital staff. And with built-in security and retention management, records managers and legal teams can satisfy compliance requirements. Contact your Informatica healthcare team for more information on how you can get the Epic ROI the board of directors is asking for.
The term “big data” has been bandied around so much in recent months that, arguably, it has lost a lot of meaning in the IT industry. Typically, IT teams have heard the phrase and know they need to be doing something, but that something isn’t being done. As IDC pointed out last year, there is a concerning shortage of trained big data technology experts, and failing to recognise the implications that unmanaged big data can have on the business is dangerous.
In today’s information economy, as increasingly digital consumers, customers, employees and social networkers, we’re handing over more and more personal information for businesses and third parties to collate, manage and analyse. On top of the growth in digital data, emerging trends such as cloud computing are having a huge impact on the amount of information businesses are required to handle and store on behalf of their customers.
Furthermore, it’s not just the amount of information that’s spiralling out of control: it’s also the way in which it is structured and used. There has been a dramatic rise in the amount of unstructured data, such as photos, videos and social media, which presents businesses with new challenges in how to collate, handle and analyse it. As a result, information is growing exponentially. Experts now predict a staggering 4,300% increase in annual data generation by 2020. Unless businesses put policies in place to manage this wealth of information, it will become worthless, and due to the often extortionate costs of storing it, will instead end up having a huge impact on the business’ bottom line.
Maxed out data centres
Many businesses have limited resource to invest in physical servers and storage, and so are increasingly looking to data centres to store their information. As a result, data centres across Europe are quickly filling up.
Due to European data retention regulations, which dictate that information is generally stored for longer periods than in other regions such as the US, businesses across Europe have to wait a very long time before they can archive their data. For instance, under EU law, telecommunications service and network providers are obliged to retain certain categories of data for a specific period of time (typically between six months and two years) and to make that information available to law enforcement where needed. With this in mind, it’s no surprise that investment in high-performance storage capacity has become a key priority for many.
Time for a clear out
So how can organisations deal with these storage issues? They can upgrade or replace their servers, parting with a lot of capital expenditure to bring in more power or more memory for central processing units (CPUs). An alternative is to “spring clean” their information. Smart partitioning allows businesses to spend just one tenth of the amount required to purchase new servers and storage capacity, and to refocus how they organise their information. With smart partitioning capabilities, businesses can get all the benefits of archiving for information that is not yet eligible for archiving (due to EU retention regulations). Furthermore, application retirement frees up floor space, drives modernisation initiatives, allows mainframe systems and older platforms to be replaced, and lets legacy data be migrated to virtual archives.
Before IT professionals go out and buy big data systems, they need to spring clean their information and make room for big data. Poor economic conditions across Europe have stifled innovation for many organisations, as they have been forced to focus on staying alive rather than investing in R&D to improve operational efficiencies. They are, therefore, looking for ways to squeeze more out of their already shrinking budgets.
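Smart partitioning is a commercial capability, but the underlying idea of segmenting records by a retention cutoff so that still-regulated data stays accessible while older data moves to cheap, compressed archive storage can be sketched in a few lines. This is a minimal illustration with an invented record layout and a hypothetical two-year retention window, not the product's actual mechanism:

```python
from datetime import date, timedelta

# Hypothetical retention window, e.g. a two-year EU telecom obligation.
RETENTION_DAYS = 730

# Invented record layout for illustration only.
records = [
    {"id": 1, "created": date(2011, 3, 1), "payload": "call detail"},
    {"id": 2, "created": date(2013, 2, 1), "payload": "call detail"},
]

def partition(records, today):
    """Split records into an active set (inside the retention window,
    kept on fast production storage) and an archivable set (past the
    window, eligible for compressed archive storage)."""
    cutoff = today - timedelta(days=RETENTION_DAYS)
    active = [r for r in records if r["created"] >= cutoff]
    archivable = [r for r in records if r["created"] < cutoff]
    return active, archivable

active, archivable = partition(records, date(2013, 3, 1))
```

Both sets remain queryable; the point is that only the active partition needs production-grade infrastructure.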
The likes of smart partitioning and application retirement offer businesses a real solution to the growing big data conundrum. So maybe it’s time you got your feather duster out, and gave your information a good clean out this spring?
IT application managers are constantly going through a process of integrating, modernizing and consolidating enterprise applications to keep them efficient and providing the maximum business value to the corporation for their cost.
But it is important to remember that these projects carry significant risk. An article in the Harvard Business Review states that 17% of enterprise application projects go seriously wrong, running over budget by 200% and over schedule by 70%. The HBR article refers to these projects as “black swans.”
How can you reduce this risk of project failure? Typically, 30% to 40% of an enterprise application project is data migration. A recent study by Bloor Research shows that while success rates for data migration projects are improving, 38% of them still miss their schedule and budget targets.
How can you improve the odds of success in data migration projects?
- Use data profiling tools to understand your data before you move it.
- Use data quality tools to correct data quality problems. There is absolutely no point in moving bad data around the organization – but it happens.
- Use a proven external methodology. In plain English: work with people who have “done it before.”
- Develop your own internal competence. Nobody knows your data, and more importantly the business context of your data, better than your own staff. Develop the skills and engage your business subject matter experts.
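The first two bullets above (profile first, then fix quality) can be illustrated with a small, tool-agnostic sketch: scan the source rows and report null counts and out-of-domain values per field, so problems surface before migration rather than after go-live. The field names and validation rules here are invented for the example:

```python
import re
from datetime import datetime

def valid_date(v):
    """True if v parses as an ISO date; rejects impossible dates."""
    try:
        datetime.strptime(v, "%Y-%m-%d")
        return True
    except ValueError:
        return False

def profile(rows, rules):
    """Count nulls and rule violations for each field before moving data."""
    stats = {}
    for field, is_valid in rules.items():
        values = [r.get(field) for r in rows]
        nulls = sum(1 for v in values if v in (None, ""))
        invalid = sum(1 for v in values
                      if v not in (None, "") and not is_valid(v))
        stats[field] = {"nulls": nulls, "invalid": invalid,
                        "total": len(values)}
    return stats

# Toy source rows with deliberately bad data.
rows = [
    {"mrn": "000123", "dob": "1980-02-30"},  # impossible calendar date
    {"mrn": "", "dob": "1975-06-01"},        # missing record number
]
rules = {
    "mrn": lambda v: re.fullmatch(r"\d{6}", v) is not None,
    "dob": valid_date,
}
stats = profile(rows, rules)
```

Commercial profiling tools do far more, but even a report this simple tells you which fields need cleansing before the move.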
Informatica has industry-leading tools, a proven methodology, and a service delivery team with hundreds of successful data migration implementations.
To find out more about successful data migration:
- Informatica World: Visit us at the Hands On Lab – Data Migration.
- Informatica World: Informatica Presentation on Application Data Migration.
Application Data Migrations with Informatica Velocity Migration Methodology
Wednesday, June 5, 2013, 9:00 to 10:00
- Informatica World: Data Migration Factory Presentation by Accenture
Accelerating the Power of Data Migration
Tuesday, June 4, 2013, 2:00 to 3:00
- Bloor White Paper: Lower Your Risk with Application Data Migration: Next Steps With Informatica
- Informatica White Paper: De-Risk Your Application Go Lives
Join us this year at Informatica World!
We have a great lineup of speakers and events to help you become a data-driven healthcare organization… I’ve provided a few highlights below:
Participate in the Informatica World keynote sessions with Sohaib Abbasi and Rick Smolan, who wrote “The Human Face of Big Data” — learn more via this quick YouTube video: http://www.youtube.com/watch?v=7K5d9ArRLJE&feature=player_embedded
With more than 100 interactive and in-depth breakout sessions spanning six tracks (Platform & Products, Architecture, Best Practices, Big Data, Hybrid IT and Tech Talk), Informatica World is an excellent way to ensure you are getting the most from your Informatica investment. Learn best practices from organizations that are realizing the potential of their data, such as Ochsner Health, Sutter Health, UMass Memorial, Qualcomm and PayPal.
Finally, we want you to balance work with a little play… we invite you to network with industry peers at our Healthcare Cocktail Reception on the evening of Wednesday, June 5th and again during our Data Driven Healthcare Breakfast Roundtable on Thursday, June 6th.
See you there!
HIStalk published a recent interview with Ochsner Health System CIO Chris Belmont. Chris and his team are great Informatica clients, and I really like how he conveyed the benefits of making Informatica the data backbone of their Epic implementation. I can’t say it any better than Chris already has, so I’ve extracted a few takeaways below; you can read the entire interview here.
On the importance of migrating legacy data into the new EMR: “Informatica was critical in getting us there. We learned on the first site. We thought it was a good idea to go in there with an empty slate and say, let’s just build it all from scratch and start with a clean slate. Let’s make sure the record’s in good shape. We quickly realized that was a bad idea. Not just in the clinical areas, but in the registration area.”
On the value of Application Retirement: “That’s going to be a big win for us. In fact, we’re targeting about $13 million in operational benefit when we turn off those legacy platforms. Informatica is going to allow us to get there.”
On not ever being 100% Epic: “We’re watching it, but frankly it will be a while – and I would argue never – that we’ll be 100 percent Epic. A lot of the data that we have that Informatica allows us to get our hands on and load into our warehouse is non-Epic data.”
On the nuggets Informatica is helping them to uncover: “We’re correlating a lot of data, not just from Epic, but I think right now we have like 25 different systems that we’re running through Informatica and into our warehouse. The gold nuggets that are coming out of that data are just tremendous.”
On challenges and opportunities: “It’s going to be, how do we do more with the data we have…having that data in a format that’s easily, quickly, and very accessible is going to be key. Gone are the days where you can throw an army of analysts in a room and say, “Give me this report” and you wait three weeks and they give you something that’s less than optimal. I think the days of, “Tell me what I need to know before I even know that I need to know it” — I think those are the days that we’re looking forward to. With the tools we have with partners like Informatica with their tools, I think we can achieve it.”
Meet Chris and his team in Informatica Booth 5005 during HIMSS 2013.
HIMSS 2013 — right time, right place, it’s on!
According to a 2011 Ovum survey, 85% of respondents cited ballooning data sets as the cause of application performance problems. Many IT organizations fell short in 2012, letting unmanaged data growth impact the business. This year, Informatica is seeing a surge of interest in enterprise data archive solutions, driven by executives who want to invest in innovative technologies for real-time and operational analytics. Yet, with little to no increase in IT budgets, IT leaders are getting creative.
Businesses are moving from on-premises applications to Software as a Service (SaaS), freeing up time and resources – yet the legacy application being replaced all too often stays in the data center, consuming costly resources. IT leaders are recognizing the quick win of retiring legacy applications. An application retirement strategy supports data center consolidation and application modernization initiatives – while ensuring data is retained to meet regulatory compliance and business needs. Significant cost savings are realized because mainframe systems can be turned off and maintenance costs go away. With these savings, executives can fund their analytics projects and drive competitive operations. (more…)
In my previous blog, I looked at the need among enterprises for application retirement. But, what kind of software solution is best for supporting effective application retirement?
It’s important to realise that retirement projects might start small, with one or two applications, and then quickly blossom into full-fledged rationalisation initiatives in which hundreds of dissimilar applications are retired. So relying on individual application or database vendors for tools and support can easily lead to a fragmented and uneven retirement strategy and archiving environment. In any event, some major application vendors offer little or no archiving capability. (more…)
According to the IDC Financial Insights 2013 Predictions report, financial institutions across most regions are getting serious about updating their legacy systems to reduce operating costs, automate labor-intensive processes, improve customer experiences, and avoid costly disruptions. Transforming a bank’s core systems or an insurance provider’s main business systems is a strategic decision with far-reaching implications for the firm’s future business strategies and success. When done right, the capabilities offered in today’s modern banking and insurance platforms can propel a company in front of its competition – or be the nail in the coffin if the data is not migrated correctly, if safeguards are not in place to protect against unwanted data breaches, and if the old systems cannot be decommissioned as planned.
One of the most important and critical phases of any legacy modernization project is the process of migrating data from old to new. Migrating data involves:
- Accessing the existing data in the legacy systems
- Understanding the data structures that need to be migrated
- Transforming data and executing one-to-one mappings to the relevant fields in the new system
- Identifying data quality errors and other gaps in the data
- Validating what is entered into the new system by catching transformation or mapping errors
- Connecting seamlessly to the target tables and fields in the new system
Sounds easy enough right? Not so fast! (more…)
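Stripped of the hard parts, the steps above amount to an extract, map, validate, load loop. The sketch below is a deliberately naive illustration with invented field names; a real migration adds type conversion, reconciliation counts, and audit trails on top of this skeleton:

```python
# Assumed one-to-one mapping from legacy column names to target fields.
FIELD_MAP = {"CUST_NO": "customer_id", "OPEN_DT": "opened_on"}

def migrate(legacy_rows):
    """Map each legacy row to the target schema; rows that fail the
    validation gate are set aside for remediation instead of loading."""
    loaded, rejects = [], []
    for row in legacy_rows:
        # Transform: rename legacy fields to their target equivalents.
        target = {new: row.get(old) for old, new in FIELD_MAP.items()}
        # Validate: here, simply require every mapped field to be present.
        if all(target.values()):
            loaded.append(target)
        else:
            rejects.append({"row": row, "reason": "missing required field"})
    return loaded, rejects

loaded, rejects = migrate([
    {"CUST_NO": "A100", "OPEN_DT": "2001-05-04"},
    {"CUST_NO": "A101"},  # missing OPEN_DT: goes to the reject pile
])
```

The reject pile is the important part: it turns silent data loss into a work queue that analysts can clear before go-live.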
Whether as a result of growth or acquisition, enterprises that have been around a decade or longer typically have large and complex information environments with lots of redundant and obsolete applications. But there’s always the worry that one day there will be a need to access the old applications’ data. So these applications are still managed and maintained even though the data is rarely needed, resulting in sizable costs in license fees, maintenance, power, data center space, backups, and precious IT time. In many companies there are hundreds, even thousands, of obsolete or redundant applications, and the business continues to support them with expensive production-level infrastructure and SLAs.
But as we face the longest double-dip recession in 50 years, businesses are being forced to think about reclaiming this extraneous spend for more strategic purposes by retiring outdated applications…but without losing access to the data. Keeping data from dormant applications “live” as a safeguard is more than just good common sense. In many cases, keeping the data readily accessible is compulsory due to corporate, industry, and governmental compliance demands. But you needn’t pay full production costs to do so. (more…)