Category Archives: Application ILM

ROI via Application Retirement

ROI is every executive’s favorite acronym, and one that is often challenging to demonstrate.

In our interactions with provider clients and prospects we are hearing that they’ve migrated to new EMRs but aren’t receiving the ROI they had budgeted or anticipated. In many cases, they are using the new EMR for documentation but still paying to maintain the legacy EMR for access to historical data for billing and care delivery. If health systems can retire these applications and still maintain operational access to the data, they will be able to realize the expected ROI and serve patients proactively.

My colleague Julie Lockner wrote a blog post about how Informatica Application Retirement for Healthcare is helping healthcare organizations retire legacy applications and realize ROI.

Read her blog post here or listen to a quick overview here.

Posted in Application ILM, Application Retirement, Healthcare

What is In-Database Archiving in Oracle 12c and Why You Still Need a Database Archiving Solution to Complement It (Part 2)

In my last blog on this topic, I discussed several areas where a database archiving solution can complement or help you better leverage the Oracle In-Database Archiving feature. For an introduction to the new In-Database Archiving feature in Oracle 12c, refer to Part 1 of my blog on this topic.

Here, I will discuss additional areas where a database archiving solution can complement the new Oracle In-Database Archiving feature:

  • Graphical UI for ease of administration – In-Database Archiving is currently a technical feature of the Oracle database, not easily visible or manageable outside of the DBA persona. This is where a database archiving solution helps, providing a comprehensive set of graphical user interfaces (GUIs) that make the feature easier to monitor and manage.
  • Enabling In-Database Archiving for packaged applications and complex data models – A database archiving solution understands business entities: transactional records composed of related tables that are archived, moved, purged, and retained together, so that data and referential integrity are maintained. Combined with business rules that determine when data has become inactive and can therefore be safely archived, this lets DBAs apply the new Oracle feature to more complex data models. In addition, application accelerators (prebuilt metadata of business entities and business rules for packaged applications) extend In-Database Archiving to packaged applications such as Oracle E-Business Suite, PeopleSoft, Siebel, and JD Edwards.
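The business-entity idea in the second bullet can be sketched in a few lines. This is a hypothetical illustration only: the table names, the `*_archive` targets, and the `archive_entity` helper are all invented for the example (not Informatica metadata), using SQLite to show a parent row and its child rows moving together in one transaction.

```python
import sqlite3

# Hypothetical sketch: archiving a "business entity" (a parent row plus its
# child rows) as one unit, so referential integrity survives the move.
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders      (order_id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE order_lines (line_id  INTEGER PRIMARY KEY,
                              order_id INTEGER REFERENCES orders(order_id),
                              item TEXT);
    CREATE TABLE orders_archive      (order_id INTEGER, status TEXT);
    CREATE TABLE order_lines_archive (line_id INTEGER, order_id INTEGER, item TEXT);

    INSERT INTO orders      VALUES (1, 'CLOSED'), (2, 'OPEN');
    INSERT INTO order_lines VALUES (10, 1, 'widget'), (11, 1, 'gadget'), (12, 2, 'bolt');
""")

def archive_entity(con, order_id):
    """Move one order and all of its lines in a single transaction."""
    with con:  # commits on success, rolls back on any error
        con.execute("INSERT INTO orders_archive SELECT * FROM orders WHERE order_id=?", (order_id,))
        con.execute("INSERT INTO order_lines_archive SELECT * FROM order_lines WHERE order_id=?", (order_id,))
        con.execute("DELETE FROM order_lines WHERE order_id=?", (order_id,))
        con.execute("DELETE FROM orders WHERE order_id=?", (order_id,))

# Business rule: only CLOSED orders are inactive and safe to archive.
for (oid,) in src.execute("SELECT order_id FROM orders WHERE status='CLOSED'").fetchall():
    archive_entity(src, oid)
```

The point of the transaction is that an order can never end up half-archived: either the parent and every child move, or nothing does.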

(more…)

Posted in Application ILM, Data Archiving, Data Governance, Governance, Risk and Compliance

What is In-Database Archiving in Oracle 12c and Why You Still Need a Database Archiving Solution to Complement It (Part 1)

What is the new In-Database Archiving in the latest Oracle 12c release?

On June 25, 2013, Oracle introduced a new feature called In-Database Archiving with the release of Oracle Database 12c. “In-Database Archiving enables you to archive rows within a table by marking them as inactive. These inactive rows are in the database and can be optimized using compression, but are not visible to an application. The data in these rows is available for compliance purposes if needed by setting a session parameter. With In-Database Archiving you can store more data for a longer period of time within a single database, without compromising application performance. Archived data can be compressed to help improve backup performance, and updates to archived data can be deferred during application upgrades to improve the performance of upgrades.”

This is an Oracle-specific feature and does not apply to other databases.
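As I understand the feature, it is SQL-level: a table created with the ROW ARCHIVAL clause gains a hidden ORA_ARCHIVE_STATE column, and a session setting controls whether archived rows are visible. The concept itself is easy to simulate outside Oracle. Below is a minimal sketch in SQLite, where an ordinary `archive_state` column and a filtering view stand in for Oracle’s hidden machinery; all names are invented for the illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- archive_state plays the role of Oracle's hidden ORA_ARCHIVE_STATE:
    -- '0' = active, anything else = archived (invisible to the application).
    CREATE TABLE invoices (id INTEGER PRIMARY KEY, amount REAL,
                           archive_state TEXT NOT NULL DEFAULT '0');
    INSERT INTO invoices (id, amount) VALUES (1, 100.0), (2, 250.0), (3, 75.0);

    -- The application queries this view, which hides archived rows.
    CREATE VIEW active_invoices AS
        SELECT id, amount FROM invoices WHERE archive_state = '0';
""")

# "Archive" invoice 2 by marking it inactive -- the row stays in the database.
con.execute("UPDATE invoices SET archive_state = '1' WHERE id = 2")

visible = con.execute("SELECT COUNT(*) FROM active_invoices").fetchone()[0]
total   = con.execute("SELECT COUNT(*) FROM invoices").fetchone()[0]
print(visible, total)  # the app sees 2 rows; all 3 remain for compliance
```

Querying the base table (the analogue of setting the Oracle session parameter) still returns every row, which is what makes the data available for compliance purposes.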

(more…)

Posted in Application ILM, Data Archiving, Data Governance, Governance, Risk and Compliance

Get Your Data Butt Off The Couch and Move It

Data is everywhere.  It’s in databases and applications spread across your enterprise.  It’s in the hands of your customers and partners.  It’s in cloud applications and cloud servers.  It’s on spreadsheets and documents on your employees’ laptops and tablets.  It’s in smartphones, sensors and GPS devices.  It’s in the blogosphere, the twittersphere and your friends’ Facebook timelines. (more…)

Posted in Application ILM, B2B, Big Data, Cloud Computing, Complex Event Processing, Data Governance, Data Integration, Data Migration, Data Quality, Data Services, Data Transformation, Data Warehousing, Enterprise Data Management, Integration Competency Centers

How to Improve Your Application Performance with the Hardware You Already Have

Most application owners know that as data volumes accumulate, application performance can take a major hit if the underlying infrastructure is not aligned to keep up with demand. The problem is that constantly adding hardware to manage data growth can get costly – stealing budgets away from needed innovation and modernization initiatives.

Join Julie Lockner as she reviews the Cox Communications case study: how they solved an application performance problem caused by too much data, using Informatica Data Archive with Smart Partitioning on the hardware they already had. Source: TechValidate. TVID: 3A9-97F-577

Posted in Application ILM, Customers, Telecommunications

Keeping Your SAP HANA Costs Under Control

Adopting SAP HANA can offer significant new business value, but it can also be an expensive proposition. If you are contemplating or in the process of moving to HANA, it’s worth your time to understand your options for Nearlining your SAP data.  The latest version of Informatica ILM Nearline, released in February, has been certified by SAP and can run with SAP BW systems running on HANA or any relational database supported by SAP.

Nearlining your company’s production SAP BW before migrating to a HANA-based BW can yield huge savings. Even if your HANA project has already started, Nearlining the production data will help keep database growth flat. We have customers who have actually been able to shrink InfoProviders by enforcing strict data retention rules on the data stored in the live database.

Informatica World is around the corner, and I will be there with my peers to demo and talk about the latest version of Informatica ILM Nearline. Click here to learn more about Informatica World 2013 and make sure you sign up for one of my Hands-On Lab sessions on this topic. See you at the Aria in Las Vegas in June.

Posted in Application ILM, Informatica Events

Column-oriented Part 2: Flexible Data Modeling for a Simplified End User Experience

In my previous blog, I explained how Column-oriented Database Management Systems (CDBMS), also known as columnar databases or CBAT, offer a distinct advantage over the traditional row-oriented RDBMS in terms of I/O workload, deriving primarily from basing the granularity of I/O operations on the column rather than the entire row. This technological advantage has a direct impact on the complexity of data modeling tasks and on the end-user’s experience of the data warehouse, and this is what I will discuss in today’s post. (more…)

Posted in Application ILM, Data Integration

Column-oriented Part 1: The I/O Advantage

Column-oriented Database Management Systems (CDBMS), also referred to as columnar databases and CBAT, have been getting a lot of attention recently in the data warehouse marketplace and trade press. Interestingly, some of the newer companies offering CDBMS-based products give the impression that this is an entirely new development in the RDBMS arena. This technology has actually been around for quite a while. But the market has only recently started to recognize the many benefits of CDBMS. So, why is CDBMS now coming to be recognized as the technology that offers the best support for very large, complex data warehouses intended to support ad hoc analytics? In my opinion, one of the fundamental reasons is the reduction in I/O workload that it enables. (more…)
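The I/O reduction is easy to quantify with a back-of-the-envelope sketch. The figures below (row width, column count, query shape) are invented for illustration, but they capture why basing I/O granularity on the column rather than the row pays off for ad hoc analytics.

```python
# Back-of-the-envelope comparison of bytes scanned for one analytic query.

rows        = 1_000_000
n_columns   = 20
col_width   = 10                     # bytes per column value, uniform for simplicity
row_width   = n_columns * col_width  # 200 bytes per row

cols_in_query = 2                    # an ad hoc query touching 2 of 20 columns

# A row store must read whole rows even when only 2 columns are needed.
row_store_io = rows * row_width
# A column store reads only the columns the query references.
col_store_io = rows * cols_in_query * col_width

print(row_store_io // col_store_io)  # 10x less I/O for this query
```

The ratio is simply (columns in the table) / (columns in the query), which is why the advantage grows with wide tables and narrow, unpredictable queries, exactly the workload of a large ad hoc data warehouse.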

Posted in Application ILM, Data Integration

Columnar Deduplication and Column Tokenization: Improving Database Performance, Security and Interoperability

For some time now, a special technique called columnar deduplication has been implemented by a number of commercially available relational database management systems. In today’s blog post, I discuss the nature and benefits of this technique, which I will refer to as column tokenization for reasons that will become evident.

Column tokenization is a process in which a unique identifier (called a token ID) is assigned to each unique value in a column and then employed to represent that value anywhere it appears in the column. Using this approach, data size reductions of up to 50% can be achieved, depending on the number of unique values in the column (that is, on the column’s cardinality).

Some RDBMSs use this technique simply as a way of compressing data: the column tokenization process is integrated into the buffer and I/O subsystems, and when a query is executed, each row must be materialized and the token IDs replaced by their corresponding values. At Informatica, in the File Archive Service (FAS), part of the Information Lifecycle Management product family, column tokenization is the core of our technology: the tokenized structure is actually used during query execution, with row materialization occurring only when the final result set is returned. We also use special compression algorithms to achieve further size reduction, typically on the order of 95%.
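A minimal sketch of the tokenization idea, essentially dictionary encoding, with invented helper names. Real implementations push the token IDs down into the buffer and I/O subsystems, but the mechanism that produces the size reduction is the same: a low-cardinality column stores many small repeated token IDs instead of many repeated values.

```python
def tokenize_column(values):
    """Assign an integer token ID to each unique value; store IDs, not values."""
    dictionary = {}   # value -> token ID
    tokens = []
    for v in values:
        if v not in dictionary:
            dictionary[v] = len(dictionary)
        tokens.append(dictionary[v])
    return dictionary, tokens

def materialize(dictionary, tokens):
    """Replace token IDs with values, as done when returning a result set."""
    reverse = {tid: v for v, tid in dictionary.items()}
    return [reverse[t] for t in tokens]

column = ["OPEN", "CLOSED", "OPEN", "OPEN", "CLOSED", "OPEN"]
d, toks = tokenize_column(column)
assert materialize(d, toks) == column   # the encoding is lossless
print(len(d), toks)
```

Note that a query such as “count rows where status = OPEN” can run directly on `toks` by comparing against a single token ID, which is why executing on the tokenized structure (rather than materializing rows first) saves work.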

(more…)

Posted in Application ILM, Data Integration

What the Physical Infrastructure Can Teach Us About Testing IT Applications

I was at an IT conference a few years ago. The speaker was talking about application testing. At the beginning of his talk, he asked the audience:

“Please raise your hand if you flew here from out of town.”

Most of the audience raised their hands. The speaker then said:

“OK, now if you knew that the airplane you flew on had been tested the same way your company tests its applications, would you have still flown on that plane?”

After some uneasy chuckling, every hand went down.  Not a great affirmation of the state of application testing in most IT shops. (more…)

Posted in Application ILM