Category Archives: Application ILM
Just like on-premises database applications such as E-Business Suite, PeopleSoft, Siebel, and custom applications, SaaS applications such as Salesforce, Oracle CRM On Demand, Microsoft Dynamics, NetSuite, Eloqua, and others will experience large data growth, causing performance issues and increasing costs.
As data grows in your SaaS applications, transaction and reporting performance will degrade. Your SaaS vendors will also need more time, effort, and money to maintain and manage this data. Backups, upgrades, and replication of these environments will take longer, and application availability will suffer from longer maintenance windows. Your SaaS application vendors will require more storage to house the additional data, and that cost will be passed on to you. (more…)
Verizon recently blogged about one of its clients who caught an employee outsourcing his software development day job to China. He paid someone else to log into his computer using his physical RSA token, which he FedExed to a contractor in Shenyang. He would then spend the workday at his desk surfing the internet while, ironically, being recognized as the top programmer in the building.
Several media outlets have picked up on this story, some even going so far as to call him the ‘Tom Sawyer’ of the software developer community. A common initial reaction to this story might be a chuckle. Not me. Think of how that single act of irresponsibility could bring an enterprise down or expose someone to identity theft. (more…)
According to a 2011 Ovum survey, 85% of respondents cited ballooning data sets as the cause of application performance problems. Many IT organizations fell short in 2012, letting unmanaged data growth impact the business. This year, Informatica is witnessing a surge of interest in Enterprise Data Archive solutions, driven by executives who want to invest in innovative technologies for real-time and operational analytics. Yet with little to no IT budget increase, IT leaders are getting creative.
Businesses are moving from on-premises applications to Software as a Service (SaaS), freeing up time and resources, yet the legacy application being replaced all too often stays in the data center consuming costly resources. IT leaders are recognizing the quick win of retiring legacy applications. An application retirement strategy supports data center consolidation and application modernization initiatives, while ensuring data is retained to meet regulatory compliance and business needs. Significant cost savings are realized because mainframe systems can be turned off and maintenance costs go away. With these savings, executives can fund their analytics projects and drive competitive operations. (more…)
In my previous blog I briefly mentioned the term “data temperature.” But what exactly does this term mean? Picture yourself logging in to your bank website to look for a transaction in your checking account. Very frequently you look for pending transactions and for debits and credits from the last 10 days. Frequently you need to look further back, perhaps one month’s statement, to find a check you no longer remember writing. Maybe once a quarter you need information about a debit from three months ago, say a subscription to a new magazine that never arrived in your mailbox. And of course, once a year you check your yearly statements for your tax return. Give or take a few other scenarios, I am pretty sure I covered most of your use cases, right? (more…)
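The banking scenarios above boil down to access frequency falling off with age. As a minimal sketch, here is how those tiers might be expressed in code; the thresholds and tier names are illustrative assumptions, not a vendor's definition:

```python
from datetime import date, timedelta

def data_temperature(txn_date, today):
    """Classify a transaction's "temperature" by its age in days.
    Hot data is touched constantly, warm occasionally, cold rarely.
    Thresholds here are illustrative only."""
    age = (today - txn_date).days
    if age <= 10:
        return "hot"    # pending transactions, last 10 days of activity
    if age <= 90:
        return "warm"   # monthly statements, quarterly lookups
    return "cold"       # yearly statements, tax season

today = date(2013, 3, 1)
print(data_temperature(today - timedelta(days=3), today))    # hot
print(data_temperature(today - timedelta(days=45), today))   # warm
print(data_temperature(today - timedelta(days=300), today))  # cold
```

An archiving policy can then keep hot data on fast production storage and move warm and cold data to progressively cheaper tiers.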
In my previous blog, I looked at the need among enterprises for application retirement. But, what kind of software solution is best for supporting effective application retirement?
It’s important to realise that retirement projects might start small, with one or two applications, and then quickly blossom into full-fledged rationalisation initiatives in which hundreds of dissimilar applications are retired. So relying on individual application or database vendors for tools and support can easily lead to a fragmented and uneven retirement strategy and archiving environment. In any case, some major application vendors offer little or no archiving capability. (more…)
Informatica Recognized By Gartner as a Leader in Data Masking and by Infosecurity for Best Security Software
Informatica was named as a leader in the 2012 Gartner Magic Quadrant for Data Masking. A couple of weeks ago, Infosecurity named Informatica as a finalist for Best Security Software for 2013.
Both the Gartner Magic Quadrant for Data Masking and Infosecurity Products Guide recognized Informatica for continued innovation:
- Gartner states, “The data masking portfolio has been broadening. In addition to SDM technology… the market is beginning to offer dynamic data masking (DDM)… ” (more…)
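To make the distinction concrete: static data masking (SDM) irreversibly transforms sensitive values before data reaches non-production environments. The sketch below is a hypothetical illustration of two common masking techniques, not Informatica's implementation; function names and the salt are assumptions:

```python
import hashlib

def mask_ssn(ssn):
    """Format-preserving mask: keep the layout and last four digits,
    hide the rest. Typical for test data that must still look real."""
    digits = [c for c in ssn if c.isdigit()]
    return "***-**-" + "".join(digits[-4:])

def pseudonymize(value, salt="demo-salt"):
    """Deterministic pseudonym: the same input always maps to the same
    token, so masked data stays referentially consistent across tables."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:8]

print(mask_ssn("123-45-6789"))      # ***-**-6789
print(pseudonymize("Jane Doe"))     # same token on every run
```

Dynamic data masking (DDM), by contrast, applies this kind of transformation on the fly to query results, leaving the stored data untouched.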
According to the IDC Financial Insights 2013 Predictions report, financial institutions across most regions are getting serious about updating their legacy systems to reduce operating costs, automate labor-intensive processes, improve customer experiences, and avoid costly disruptions. Transforming a bank’s core systems or an insurance provider’s main business systems is a strategic decision with far-reaching implications for the firm’s future business strategies and success. When done right, the capabilities offered in today’s modern banking and insurance platforms can propel a company ahead of its competition. Done wrong, they can be the nail in the coffin: data migrated incorrectly, no safeguards in place against unwanted data breaches, and old systems that cannot be decommissioned as planned.
One of the most critical phases of any legacy modernization project is migrating data from the old system to the new. Migrating data involves:
- Accessing the existing data in the legacy systems
- Understanding the data structures that need to be migrated
- Transforming the data and executing one-to-one mappings to the relevant fields in the new system
- Identifying data quality errors and other gaps in the data
- Validating what is entered into the new system by catching transformation and mapping errors
- Seamlessly connecting to the target tables and fields in the new system
Sounds easy enough, right? Not so fast! (more…)
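The mapping and validation steps above can be sketched in a few lines. This is a simplified illustration under assumed schemas; the legacy field names, target fields, and required-field rules are all hypothetical:

```python
# Hypothetical one-to-one field mapping from a legacy schema to the
# new system's schema. Real migrations involve many-to-one transforms,
# type conversions, and lookups; this shows only the skeleton.
FIELD_MAP = {
    "CUST_NM": "customer_name",
    "ACCT_NO": "account_number",
    "OPEN_DT": "open_date",
}
REQUIRED = {"customer_name", "account_number"}

def migrate_record(legacy_row):
    """Map one legacy row to the new schema, collecting data quality
    errors (e.g. missing required fields) instead of silently loading bad data."""
    new_row, errors = {}, []
    for old_field, new_field in FIELD_MAP.items():
        value = legacy_row.get(old_field)
        if value in (None, "") and new_field in REQUIRED:
            errors.append(f"missing required field: {new_field}")
        new_row[new_field] = value
    return new_row, errors

row, errs = migrate_record(
    {"CUST_NM": "Acme Corp", "ACCT_NO": "12345", "OPEN_DT": "2001-04-09"}
)
print(row)   # mapped record in the new schema
print(errs)  # empty list: no quality errors for this row
```

Rows that come back with a non-empty error list would be routed to an exception queue for remediation rather than loaded into the new system.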
Whether the result of growth or acquisition, enterprises that have been around a decade or longer typically have large and complex information environments with many redundant and obsolete applications. But there’s always the worry that one day the old applications’ data will be needed. So these applications are still managed and maintained even though the data is rarely accessed, resulting in sizable costs in license fees, maintenance, power, data center space, backups, and precious IT time. In many companies there are hundreds, even thousands, of obsolete or redundant applications, and the business continues to support them with expensive production-level infrastructure and SLAs.
But as we face the longest double-dip recession in 50 years, businesses are being forced to think about reclaiming this extraneous spend for more strategic purposes by retiring outdated applications, yet without losing access to the data. Keeping data from dormant applications accessible as a safeguard is more than just good common sense. In many cases it is compulsory, due to corporate, industry, and governmental compliance demands. But you needn’t pay full production costs to do so. (more…)
Data volumes are exploding. We see it all around us. The problem is that too much data can have a very negative impact on user productivity. Think about how long it takes to sift through email after returning from vacation, or to complete a purchase on an e-commerce site on Black Friday. The more data, the longer these processes take and the more time is spent combing through it. Through our partnership with Symantec, Informatica has been working successfully with our joint customers to help them control the impact of ‘too much data’. We are helping them define projects that improve application performance, meet SLAs, reduce costs, and mitigate compliance risks, all while IT budgets remain relatively flat. (more…)
In my last blog we explored the difference between database tools and the solutions needed to make those tools useful in complex OLTP systems. In this post we will look at how the same concepts apply to the table-level compression techniques offered by the database vendors. As database functionality continues to evolve, it is important to know how to best leverage these powerful enabling technologies to deliver value to the business. (more…)