Category Archives: Application Retirement

Under the hood: decommissioning an SAP system with Informatica Data Archive for Application Retirement

If you have reached this blog, you are probably already familiar with the reasons for doing some housecleaning on your old applications. If not, the subject has been explored in other discussions, like this one from Claudia Chandra.

All the explanations below are based on Informatica Data Archive for application retirement.

Customers are often surprised to learn that Informatica’s solution for application retirement can also decommission SAP systems. The market has the feeling that SAP is different, or “another beast”. And it really is!

A typical SAP landscape requires software licenses, maintenance contracts, and hardware for the transactional application itself, the corresponding data warehouse and databases, operating systems, servers, storage, and any additional software and hardware licenses you may have on top of the application. Your company may want to retire older versions of the application or consolidate multiple instances in order to save costs. Our engineering group has some very experienced SAP resources, including myself, with more than 16 years of hands-on work with SAP technology, and we were able to simplify the SAP retirement process so that Informatica Data Archive can decommission SAP like any other type of application.

Next are the steps to decommission an SAP system using Informatica Data Archive.

Let’s start with some facts: SAP has some “special” tables which can only be read by the SAP kernel itself. In a typical SAP ECC 6.0 system, around 9% of the tables fall into these categories, representing around 6,000 tables.

More specifically, these tables are known as “cluster” and “pool” tables; I add a third category for transparent tables that have a binary column (the RAW data type) which only the SAP application can unravel.

1)    Mining

In this step, we gather all the metadata of the SAP system being retired, including all transparent, cluster and pool tables, and all columns with their data types. This metadata will be kept with the data in the optimized archive.
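To make the mining step concrete: the SAP data dictionary is itself stored in transparent tables (DD02L lists every table and its class, DD03L lists the fields), so the metadata can be read straight from the source database. Below is a minimal sketch of that idea in Java over JDBC; the schema name and connection URL are assumptions, and the actual product does considerably more than this.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class SapDictionaryMiner {
    public static void main(String[] args) throws SQLException {
        // JDBC URL of the source SAP database, passed on the command line.
        // The ABAP schema name varies by installation (SAPSR3, SAPR3, ...),
        // so treat both as placeholders.
        String url = args[0];
        String schema = "SAPSR3";

        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             // DD02L holds one row per SAP table, with its class:
             // TRANSP = transparent, CLUSTER = cluster, POOL = pool.
             // Column-level metadata would come from DD03L in the same way.
             ResultSet rs = st.executeQuery(
                     "SELECT TABNAME, TABCLASS FROM " + schema + ".DD02L" +
                     " WHERE TABCLASS IN ('TRANSP', 'CLUSTER', 'POOL')")) {
            while (rs.next()) {
                System.out.println(rs.getString("TABNAME")
                        + " -> " + rs.getString("TABCLASS"));
            }
        }
    }
}
```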

2)    Extraction from source

Informatica Data Archive 6.1.x is able to connect to all database servers certified by SAP to retrieve rows from the transparent tables.

On the SAP system, it is necessary to install an ABAP agent, which contains the programs developed by Informatica to read all the rows from the special tables and archive files and to pull all the attachments in their original format. These programs are delivered as an SAP transport, which is imported into the SAP system before the decommissioning process begins.

Leveraging the Java connector publicly available through the SAP portal (SAP JCo), Informatica Data Archive connects to an application server on the SAP system being decommissioned and makes calls to the programs imported through the transport. The tasks are performed using background threads, and the process is monitored from the Informatica Data Archive environment, including all the logging, status and monitoring of the whole retirement process happening in the SAP system.
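For illustration, a JCo call to such a transport-delivered program might look like the sketch below. The destination name, the function module Z_READ_LOGICAL_TABLE and its parameters are hypothetical stand-ins, not Informatica’s actual objects.

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoTable;

public class LogicalTableReader {
    public static void main(String[] args) throws JCoException {
        // Destination "RETIRED_SAP" must be configured beforehand via a
        // DestinationDataProvider (host, system number, client, credentials).
        JCoDestination dest = JCoDestinationManager.getDestination("RETIRED_SAP");

        // Z_READ_LOGICAL_TABLE is a hypothetical RFC-enabled function module
        // of the kind delivered in the ABAP transport.
        JCoFunction fn = dest.getRepository().getFunction("Z_READ_LOGICAL_TABLE");
        if (fn == null) {
            throw new IllegalStateException("Function module not found");
        }

        // Ask the SAP kernel to unravel the cluster table BSEG for us.
        fn.getImportParameterList().setValue("IV_TABNAME", "BSEG");
        fn.execute(dest); // synchronous RFC call into the application server

        // Rows come back through a table parameter, already in transparent form.
        JCoTable rows = fn.getTableParameterList().getTable("ET_ROWS");
        for (int i = 0; i < rows.getNumRows(); i++) {
            rows.setRow(i);
            System.out.println(rows.getString("LINE"));
        }
    }
}
```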

Extraction of table rows from the database

Below are the SAP table types and how our solution handles each:

• Cluster tables (logical table BSEG, physical table RFBLG): the engine reads all rows from the logical table by connecting at the SAP application level and stores them in the archive store as if the table existed in the database as a physical table. It also reads all rows of the physical table and stores them as they are, as an insurance policy only, since that data cannot be read without an SAP system up and running.

• Pool tables (logical table A016, physical table KAPOL): handled in the same way as cluster tables.

• Transparent tables with a RAW field (PCL2 and STXL, where the logical and physical names are the same): the engine creates a new table in the archive store and reads all rows from the original table, with the RAW field unraveled. It also reads all rows of the original tables PCL2 and STXL and stores them as they are, as an insurance policy only, since that data cannot be read without an SAP system up and running.

Informatica Data Archive extracts the data from all tables, regardless of their type.

Table rows in archive files

Another source of table rows is archived data. SAP has its own archiving framework, based on the creation of archive files, also known as ADK files. These files store table rows in an SAP-proprietary compacted form, which can only be read by ABAP code running in an SAP system.

Once created, these files reside in the file system and can be moved to external storage using an ArchiveLink implementation.

The Informatica Data Archive engine also reads the table rows from all ADK files, regardless of their location, as long as the files are accessible to the SAP application being retired. These table rows are stored in the archive store as well, along with the original tables.

Very important: after the SAP system is retired, any ArchiveLink implementation can be retired as well, along with the storage that was holding the ADK files.

3)    Attachments

Business transactions in SAP systems can have attachments linked to them. SAP Generic Object Services (GOS) is a way to upload documents, add notes to a transaction, and add relevant URLs, all while referencing a business document, like a purchase order or a financial document. Some other SAP applications, like CRM, have their own mechanisms for attaching documents, complementing the GOS features.

All these methods can store the attachments in the SAP database, in the SAP Knowledge Provider (KPro), or externally in storage systems, leveraging an ArchiveLink implementation.

Informatica’s engine is able to download all the attachment files, notes and URLs as discrete files, regardless of where they are stored, keeping the relationship to the original business document. The relationship is stored in a table created by Informatica in the archive store, which contains the key of the business document and the links to the attachments, notes and URLs that were assigned to it in the original SAP system.

All these files are stored in the archive store, along with the structured data – or tables.
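To make that relationship concrete, the sketch below looks up everything attached to one business document through such a link table. The table and column names (ATTACHMENT_LINK, OBJECT_KEY and so on) are hypothetical stand-ins; the real layout is internal to the product.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class AttachmentLookup {
    /**
     * Lists everything attached to one business document in the archive.
     * ATTACHMENT_LINK and its columns are hypothetical names for the
     * relationship table described above.
     */
    public static List<String> attachmentsFor(Connection con, String objectKey)
            throws SQLException {
        String sql = "SELECT KIND, FILE_PATH FROM ATTACHMENT_LINK"
                + " WHERE OBJECT_KEY = ?"; // e.g. a purchase order number
        List<String> links = new ArrayList<>();
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, objectKey);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // KIND distinguishes attachments, notes and URLs.
                    links.add(rs.getString("KIND") + ": "
                            + rs.getString("FILE_PATH"));
                }
            }
        }
        return links;
    }
}
```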

4)    Load into optimized archive

All data and attachments are then loaded into Informatica’s optimized archive. The archive store compresses the archived data by up to 98%; at that rate, for example, a 5 TB source database shrinks to roughly 100 GB in the archive.

5)    Search and data visualization

All structured data is accessible through JDBC/ODBC, as in any other relational database. The user has the option to use the search capability that comes with the product, which allows users to run simple queries and view data as business entities.

Another option is to use the integrated reporting capability within the product, which allows users to create pixel-perfect reports using drag-and-drop technology, querying the data with SQL and displaying it as business entities, which are defined in prebuilt SAP application accelerators.

Informatica also has a collection of reports for SAP to display data for customers, vendors, general ledger accounts, assets and financial documents.

Some customers prefer to use their own corporate-standard third-party reporting tool. That is also possible, as long as the tool can connect to JDBC/ODBC sources, which is a market standard for connecting to databases.
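As an illustration of that JDBC path, the sketch below queries a retired SAP table by its original name from a plain Java client. The connection URL and credentials are placeholders – the actual driver and endpoint come with the Data Archive installation – while BKPF is the standard SAP accounting document header table.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ArchiveQuery {
    public static void main(String[] args) throws SQLException {
        // Placeholder URL; the real driver class and endpoint are
        // supplied with the Data Archive installation.
        String url = "jdbc:archive://archive-host:8500/sap_retired";

        try (Connection con = DriverManager.getConnection(url, "reporter", args[0]);
             Statement st = con.createStatement();
             // BKPF, the SAP accounting document header table, keeps its
             // original name and columns in the archive store.
             ResultSet rs = st.executeQuery(
                     "SELECT BELNR, BUKRS, GJAHR FROM BKPF WHERE GJAHR = '2012'")) {
            while (rs.next()) {
                System.out.printf("doc=%s company=%s year=%s%n",
                        rs.getString("BELNR"),
                        rs.getString("BUKRS"),
                        rs.getString("GJAHR"));
            }
        }
    }
}
```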

Hopefully this blog has helped you understand what Informatica Data Archive for Application Retirement does to decommission an SAP system. If you need any further information, please comment below. Thank you.

Posted in Application Retirement, Data Archiving

ROI via Application Retirement

ROI = every executive’s favorite acronym and one that is often challenging to demonstrate.

In our interactions with provider clients and prospects we are hearing that they’ve migrated to new EMRs but aren’t receiving the ROI they had budgeted or anticipated. In many cases, they are using the new EMR for documentation but still paying to maintain the legacy EMR for access to historical data for billing and care delivery. If health systems can retire these applications and still maintain operational access to the data, they will be able to realize the expected ROI and serve patients proactively.

My colleague Julie Lockner wrote a blog post about how Informatica Application Retirement for Healthcare is helping healthcare organizations retire legacy applications and realize ROI.

Read her blog post here or listen to a quick overview here.

Posted in Application ILM, Application Retirement, Healthcare

Are You Getting an EPIC ROI? Retire Legacy Healthcare Applications!

Healthcare organizations are currently engaged in major transformative initiatives. The American Recovery and Reinvestment Act of 2009 (ARRA) provided the healthcare industry incentives for the adoption and modernization of point-of-care computing solutions including electronic medical and health records (EMRs/EHRs).   Funds have been allocated, and these projects are well on their way.  In fact, the majority of hospitals in the US are engaged in implementing EPIC, a software platform that is essentially the ERP for healthcare.

These Cadillac systems are being deployed from scratch, with very little data being ported from the old systems into the new. The result is a glut of legacy applications running in aging hospital data centers, consuming every last penny of HIS budgets. Because the data still resides on those systems, hospital staff continue to use them, making them difficult to shut down or retire.

Most of these legacy systems do not run on modern technology platforms – they run on the likes of HP TurboIMAGE, MUMPS-based databases such as InterSystems Caché, and embedded proprietary databases. Finding people who know how to manage and maintain these systems is costly and risky – risky in that data residing in those applications may be subject to data retention requirements (patient records, etc.) and could become inaccessible.

A different challenge for the CFOs of these hospitals is the ROI on these EPIC implementations. Because these projects are multi-phased and multi-year, boards of directors are asking about the value realized from these investments. Many are coming up short because they are maintaining both applications in parallel. Relief will come when systems can be retired – but getting hospital staff and regulators to approve a retirement project requires evidence that they can still access data while adhering to compliance needs.

Many providers have overcome these hurdles by successfully implementing an application retirement strategy based on the Informatica Data Archive platform. Several of the largest children’s hospitals in the US are either already saving or expecting to save $2 million or more annually from retiring legacy applications. The savings come from:

  • Eliminating software maintenance and license costs
  • Eliminating hardware dependencies and costs
  • Reducing storage requirements by 95% (archived data is stored in a highly compressed, accessible format)
  • Improving IT efficiency by eliminating the specialized processes and skills associated with legacy systems
  • Freeing IT resources – teams can spend more of their time on innovation and new projects

Informatica Application Retirement Solutions for Healthcare give hospitals the ability to completely retire legacy applications while maintaining access to the archived data for hospital staff. And with built-in security and retention management, records managers and legal teams can satisfy compliance requirements. Contact your Informatica Healthcare team for more information on how you can get the EPIC ROI the board of directors is asking for.

Posted in Application Retirement, Data Archiving, Healthcare

Data archiving – time for a spring clean?

The term “big data” has been bandied around so much in recent months that, arguably, it has lost a lot of its meaning in the IT industry. Typically, IT teams have heard the phrase and know they need to be doing something, but that something isn’t being done. As IDC pointed out last year, there is a concerning shortage of trained big data technology experts, and failure to recognise the implications that not managing big data can have on the business is dangerous.

In today’s information economy, as increasingly digital consumers, customers, employees and social networkers, we’re handing over more and more personal information for businesses and third parties to collate, manage and analyse. On top of the growth in digital data, emerging trends such as cloud computing are having a huge impact on the amount of information businesses are required to handle and store on behalf of their customers. Furthermore, it’s not just the amount of information that’s spiralling out of control: it’s also the way in which it is structured and used. There has been a dramatic rise in the amount of unstructured data, such as photos, videos and social media, which presents businesses with new challenges as to how to collate, handle and analyse it. As a result, information is growing exponentially; experts now predict a staggering 4300% increase in annual data generation by 2020. Unless businesses put policies in place to manage this wealth of information, it will become worthless, and due to the often extortionate costs of storing it, it will instead end up having a huge impact on the business’ bottom line.

Maxed out data centres

Many businesses have limited resources to invest in physical servers and storage, and so are increasingly looking to data centres to store their information. As a result, data centres across Europe are quickly filling up. Due to European data retention regulations, which dictate that information is generally stored for longer periods than in other regions such as the US, businesses across Europe must wait a very long time before they can archive their data. For instance, under EU law, telecommunications service and network providers are obliged to retain certain categories of data for a specific period of time (typically between six months and two years) and to make that information available to law enforcement where needed. With this in mind, it’s no surprise that investment in high-performance storage capacity has become a key priority for many.

Time for a clear out

So how can organisations deal with these storage issues? They can upgrade or replace their servers, parting with a lot of capital expenditure to bring in more power or more memory for their central processing units (CPUs). An alternative is to “spring clean” their information. Smart partitioning allows businesses to spend just one tenth of what new servers and storage capacity would cost, and to refocus how they organise their information; with smart partitioning capabilities, businesses can get the benefits of archiving even for information that is not yet eligible for archiving (due to EU retention regulations). Furthermore, application retirement frees up floor space, drives modernisation initiatives, and allows mainframe systems and older platforms to be replaced and legacy data to be migrated to virtual archives. Before IT professionals go out and buy big data systems, they need to spring clean their information and make room for big data.

Poor economic conditions across Europe have stifled innovation at a lot of organisations, which have been forced to focus on staying alive rather than investing in R&D to improve operational efficiencies. They are therefore looking for ways to squeeze more out of already shrinking budgets. The likes of smart partitioning and application retirement offer businesses a real solution to the growing big data conundrum. So maybe it’s time you got your feather duster out and gave your information a good clean out this spring?

Posted in Application Retirement, B2B Data Exchange, Data Aggregation, Data Archiving

Enterprise Application Projects Are Much Riskier Than You Think

IT application managers are constantly integrating, modernizing and consolidating enterprise applications to keep them efficient and delivering maximum business value to the corporation for their cost.

But it is important to remember that there is significant risk in these projects. An article in the Harvard Business Review states that 17% of enterprise application projects go seriously wrong, running 200% over budget and 70% over schedule. The HBR article refers to these projects as “black swans.”

How can you reduce this risk of project failure?  Typically, 30% to 40% of an enterprise application project is data migration.  A recent study by Bloor Research shows that while success rates for data migration projects are improving, 38% of them still miss their schedule and budget targets.

How can you improve the odds of success in data migration projects?

  1. Use data profiling tools to understand your data before you move it (a minimal sketch follows this list).
  2. Use data quality tools to correct data quality problems. There is absolutely no point in moving bad data around the organization – but it happens.
  3. Use a proven external methodology. In plain English, work with people who have “done it before.”
  4. Develop your own internal competence. Nobody knows your data – and, more importantly, the business context of your data – better than your own staff. Develop the skills and engage your business subject matter experts.
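As a minimal illustration of point 1, profiling can start with something as simple as counting rows, nulls and distinct values per column before any data moves. The table and column names below are placeholders, and a real profiling tool covers far more (patterns, ranges, cross-column rules).

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class QuickColumnProfile {
    public static void main(String[] args) throws SQLException {
        String url = args[0];       // JDBC URL of the source system
        String table = "CUSTOMERS"; // placeholder table name
        String column = "EMAIL";    // placeholder column name

        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT COUNT(*) AS total_rows,"
                     + " COUNT(" + column + ") AS non_null,"
                     + " COUNT(DISTINCT " + column + ") AS distinct_vals"
                     + " FROM " + table)) {
            if (rs.next()) {
                long total = rs.getLong("total_rows");
                // Null count is total rows minus non-null values.
                System.out.printf("rows=%d nulls=%d distinct=%d%n",
                        total,
                        total - rs.getLong("non_null"),
                        rs.getLong("distinct_vals"));
            }
        }
    }
}
```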

Informatica has industry-leading tools, a proven methodology, and a service delivery team with hundreds of successful data migration implementations.

To find out more about successful data migration:

  • Informatica World:  Visit us at the Hands On Lab – Data Migration.
  • Informatica World: Informatica Presentation on Application Data Migration.

Application Data Migrations with Informatica Velocity Migration Methodology

Wednesday, June 5, 2013          9:00 to 10:00

  • Informatica World: Data Migration Factory Presentation by  Accenture

Accelerating the Power of Data Migration

Tuesday June 4, 2013     2:00 to 3:00

 

Posted in Application Retirement, Data Governance, Data Migration, Data Quality, Informatica Events

Informatica World Healthcare Path

Join us this year at Informatica World!

We have a great lineup of speakers and events to help you become a data-driven healthcare organization… I’ve provided a few highlights below:

Participate in the Informatica World Keynote sessions with Sohaib Abbasi and Rick Smolan who wrote “The Human Face of Big Data”  — learn more via this quick YouTube video: http://www.youtube.com/watch?v=7K5d9ArRLJE&feature=player_embedded

With more than 100 interactive and in-depth breakout sessions spanning 6 different tracks (Platform & Products, Architecture, Best Practices, Big Data, Hybrid IT and Tech Talk), Informatica World is an excellent way to ensure you are getting the most from your Informatica investment. Learn best practices from organizations that are realizing the potential of their data, like Ochsner Health, Sutter Health, UMass Memorial, Qualcomm and PayPal.

Finally, we want you to balance work with a little play… we invite you to network with industry peers at our Healthcare Cocktail Reception on the evening of Wednesday, June 5th and again during our Data Driven Healthcare Breakfast Roundtable on Thursday, June 6th.

See you there!

Posted in Application Retirement, B2B, Complex Event Processing, Data Integration, Data Integration Platform, Data masking, Data Migration, Data Warehousing, Healthcare, Informatica Events, Master Data Management, Uncategorized

Great Interview with Ochsner Health System

HIStalk published a recent interview with Ochsner Health System CIO Chris Belmont. Chris and his team are great Informatica clients, and I really like how he conveyed the benefits of making Informatica the data backbone of their Epic implementation. I can’t say it any better than Chris already has, so I’ve extracted a few takeaways below; you can read the entire interview here.

On the importance of migrating legacy data into the new EMR: “Informatica was critical in getting us there. We learned on the first site. We thought it was a good idea to go in there with an empty slate and say, let’s just build it all from scratch and start with a clean slate. Let’s make sure the record’s in good shape. We quickly realized that was a bad idea. Not just in the clinical areas, but in the registration area.”

On the value of Application Retirement: “That’s going to be a big win for us. In fact, we’re targeting about $13 million in operational benefit when we turn off those legacy platforms. Informatica is going to allow us to get there.”

On not ever being 100% Epic:  “We’re watching it, but frankly it will be a while – and I would argue never – that we’ll be 100 percent Epic. A lot of the data that we have that Informatica allows us to get our hands on and load into our warehouse is non-Epic data.”

On the nuggets Informatica is helping them to uncover: “We’re correlating a lot of data, not just from Epic, but I think right now we have like 25 different systems that we’re running through Informatica and into our warehouse. The gold nuggets that are coming out of that data are just tremendous.”

On challenges and opportunities: “It’s going to be, how do we do more with the data we have…having that data in a format that’s easily, quickly, and very accessible is going to be key. Gone are the days where you can throw an army of analysts in a room and say, “Give me this report” and you wait three weeks and they give you something that’s less than optimal. I think the days of, “Tell me what I need to know before I even know that I need to know it” — I think those are the days that we’re looking forward to. With the tools we have with partners like Informatica with their tools, I think we can achieve it.”

Meet Chris and his team in Informatica Booth 5005 during HIMSS 2013.

HIMSS 2013 — right time, right place, it’s on!

 

Posted in Application Retirement, Data Integration Platform, Data Migration, Healthcare, Master Data Management

Ballooning Data Sets Cause Application Performance Problems

According to a 2011 Ovum survey, 85% of respondents cited ballooning data sets as the cause of application performance problems. Many IT organizations fell short in 2012, letting unmanaged data growth impact the business. This year, Informatica is seeing a surge of interest in enterprise data archive solutions, driven by executives who want to invest in innovative technologies for real-time and operational analytics. Yet with little to no increase in IT budgets, IT leaders are getting creative.

Businesses are moving from on-premises applications to Software as a Service (SaaS), freeing up time and resources – yet the legacy application being replaced all too often stays in the data center, consuming costly resources. IT leaders are recognizing the quick win of retiring legacy applications. An application retirement strategy supports data center consolidation and application modernization initiatives while ensuring data is retained to meet regulatory compliance and business needs. Significant cost savings are realized because mainframe systems can be turned off and their maintenance costs go away. With these savings, executives can fund their analytics projects and drive competitive operations. (more…)

Posted in Application ILM, Application Retirement, Data Archiving, SaaS

Application Retirement – Preserving the Value of Data (Part Two)

In my previous blog, I looked at the need among enterprises for application retirement. But what kind of software solution is best for supporting effective application retirement?

It’s important to realise that retirement projects might start small, with one or two applications, and then quickly blossom into full-fledged rationalisation initiatives where hundreds of dissimilar applications are retired. Relying on individual application or database vendors for tools and support can therefore easily lead to a fragmented and uneven retirement strategy and archiving environment. In any event, some major application vendors offer little or even no archiving capability. (more…)

Posted in Application ILM, Application Retirement, Data Archiving