Category Archives: Application Retirement

Informatica Connect-a-Thon Event Winners

Building connectivity with Informatica

The advent of cloud-based applications has raised user expectations: people expect more from analytics and big data products. ISVs are challenged with managing data integration complexity, and point-to-point application integrations can keep them from focusing on core competencies. Customer application support requirements are critical to success, but so are innovation and new IP.

The Informatica Vibe Platform, a virtual data machine, can help ISVs eliminate integration complexity. By working with Informatica, ISVs can focus on a single integration point, the Informatica platform, and leverage existing connectivity to hundreds of SaaS and on-premises applications, including Salesforce, Netsuite, Workday, Amazon Redshift, Microsoft Azure, SAP and many more.

Today at the Informatica World Cloud Innovation Summit, Connect-a-thon winners will demonstrate advanced data integrations – combining disparate data sets, normalizing data and leveraging the power of the Informatica platform to extend application capabilities. Winners honored include Domo, Couchbase, Snowflake, BigML, Databricks, and Thoughtspot.

BigML, Thoughtspot and Domo are powering a new age of advanced analytics by putting machine learning and self-service BI within reach of all users. For their next-generation database and data warehouse solutions, Couchbase and Snowflake each worked closely with Informatica to build a solution for integrating and querying data that takes full advantage of cloud and hybrid environments. Connectivity with Informatica lets their customers manage and transform data between their solutions and other relational, big data, or application sources. Customers can load and extract data with Informatica’s design-driven products and achieve a higher level of data quality.

Informatica Connect-a-thon winners demonstrate how easily data can be taken out of its silo and transformed for use in new applications and databases.

Learn more about building a connector on the Informatica Technology Partner Network. Test-drive these connectors on the Informatica Marketplace.


What is an Enterprise Architecture Maturity Model?

Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly difficult to change and more costly to keep up to date. The solution to this predicament is an Enterprise Architecture (EA) process that provides a framework for an optimized IT portfolio. An IT Optimization strategy should be based on a comprehensive set of architectural principles that ensure consistency and make IT more responsive, efficient, and economical.

The rationalization, standardization, and consolidation process helps organizations understand their current EA maturity level and move forward on the appropriate roadmap. As an organization undertakes the IT Optimization journey, its IT architecture matures through several stages, leveraging IT Optimization Architecture Principles to attain each level of maturity.


The Levels of the Enterprise Architecture Maturity Model

Level 1: The first step involves helping a company develop its architecture vision and operating model, with attention to cost, globalization, investiture, or whatever is driving the company strategically. Once that vision is in place, enterprise architects can guide the organization through an iterative process of rationalization, consolidation, and eventually shared services and cloud computing.

Level 2: The rationalization exercise helps an organization identify what standards to move towards as it eliminates the complexities and silos built up over the years, along with the specific technologies that will help it get there.

Depending on the company, rationalization could start with a technical discussion and be IT-driven, or it could start at a business level. For example, a company with operations distributed across the globe might want to consolidate and standardize its business processes, which could drive change in the IT portfolio. Or a company that has gone through mergers and acquisitions might have redundant business processes to rationalize.

Rationalizing involves understanding the current state of an organization’s IT portfolio and business processes, and then mapping business capabilities to IT capabilities. This is done by developing scoring criteria to analyze the current portfolio, and ultimately by deciding on the standards that will propel the organization forward. Standards are the outcome of a rationalization exercise.
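
To make the scoring idea concrete, here is a minimal Python sketch of a portfolio scoring exercise. The criteria, weights, thresholds and decision buckets are illustrative assumptions, not a prescribed model; a real rationalization effort would tailor all of them to the organization.

```python
# Illustrative portfolio scoring: each application is rated 1-5 against
# weighted criteria, then mapped to a coarse rationalization decision.
# Criteria, weights, and thresholds are example assumptions only.

CRITERIA_WEIGHTS = {
    "business_value": 0.40,   # how well the app supports business capabilities
    "technical_fit": 0.35,    # alignment with target standards and platforms
    "cost_efficiency": 0.25,  # run cost relative to the value delivered
}

def score_application(ratings: dict) -> float:
    """Weighted average of 1-5 ratings across the criteria."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

def rationalization_decision(score: float) -> str:
    """Map a score to a coarse keep/replace/retire bucket."""
    if score >= 4.0:
        return "invest"     # strategic: a candidate standard to converge on
    if score >= 3.0:
        return "tolerate"   # adequate: revisit at the next review cycle
    if score >= 2.0:
        return "migrate"    # valuable function on the wrong platform
    return "eliminate"      # candidate for retirement

portfolio = {
    "order_management": {"business_value": 5, "technical_fit": 4, "cost_efficiency": 3},
    "legacy_hr_system": {"business_value": 2, "technical_fit": 1, "cost_efficiency": 2},
}

for app, ratings in portfolio.items():
    s = score_application(ratings)
    print(f"{app}: score={s:.2f} -> {rationalization_decision(s)}")
```

Applications that land in the "invest" bucket tend to become the standards the rest of the portfolio converges on.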

Standardized technology represents the second level of EA maturity. Organizations at this level have evolved beyond isolated, independent silos. They have well-defined corporate governance and procurement policies, which yield measurable cost savings through reduced software licensing and the elimination of redundant systems and skill sets.

Level 3: Consolidation entails reducing the footprint of your IT portfolio. That could involve reducing the number of database servers, application servers and storage devices, consolidating redundant security platforms, or adopting virtualization, grid computing, and related consolidation initiatives.

Consolidation may be a by-product of another technology transformation, or it may be the driver of these transformations. But whatever motivates the change, the key is alignment with the overall business strategy. Enterprise architects must understand where the business is going so they can pick the appropriate consolidation strategy.

Level 4: One of the key outcomes of a rationalization and consolidation exercise is the creation of a strategic roadmap that continually keeps IT in line with where the business is going.

Having a roadmap is especially important when you move down the path to shared services and cloud computing. For a company with a very complex IT infrastructure and application portfolio, a strategic roadmap helps the organization move forward incrementally, minimizing risk and giving the IT department every opportunity to deliver value to the business.

Twitter @bigdatabeat


What’s Driving Core Banking Modernization?


When’s the last time you visited your local bank branch and spoke to a human being? How about talking to your banker over the phone? Can’t remember? Well, you’re not alone, and don’t worry, it’s not a bad thing. The days of operating physical branches staffed with expensive workers to greet and service customers are giving way to more modern, customer-friendly mobile banking applications that let consumers deposit checks from their phones, apply for a mortgage and sign closing documents electronically, and skip the trip to the ATM altogether by paying with mobile payment solutions like Apple Pay. In fact, a new report titled ‘Bricks + Clicks: Building the Digital Branch,’ from Jeanne Capachin and Jim Marous, takes an in-depth look at how banks and credit unions are changing their branch and customer channel strategies to meet the demands of today’s digital banking customer.

Why am I talking about this? These market trends are dominating the CEO and CIO agenda in today’s banking industry. I just returned from the 2015 IDC Asian Financial Congress in Singapore, where the digital journey for the next-generation bank was a major agenda item. According to IDC Financial Insights, global banks will invest $31.5B USD in core banking modernization to enable these services, improve operational efficiency, and position themselves to better compete on technology and convenience across markets. Core banking modernization initiatives are complex, costly, and fraught with risks. Let’s take a closer look.


Payers – What They Are Good At, And What They Need Help With


In our house, when we paint a room, my husband does the big rolling of the walls or ceiling and I do the cut-in work. I am good at prepping the room, taping all the trim and deliberately painting the corners. However, I am thrifty and constantly worried that we won’t have enough paint to finish a room. My husband isn’t afraid to use enough paint and is extremely efficient at painting a wall in a single even coat. As a result, I don’t do the big rolling and he doesn’t do the cutting in. It took us a while to figure this out, and a few rooms had to be repainted while we did. Now we know what we are good at, and what we need help with.

Payers’ roles are changing. Payers were previously focused on risk assessment, setting and collecting premiums, analyzing claims and making payments, all while optimizing revenues. Payers are pretty good at selling to employers: figuring out the cost/benefit ratio from an employer’s perspective and ensuring a good, profitable product. With the advent of the Affordable Care Act and a much more transient insured population, payers now must focus more on the individual insured and communicate with individuals more nimbly than in the past.

Individual members will shop for insurance based on consumer feedback and price. They are interested in ease of enrollment and the ability to submit and substantiate claims quickly and intuitively. Payers are discovering that they need to help manage population health at an individual member level. And population health management requires less of a business-data analytics approach and more social media and gaming-style logic to understand patients. In this way, payers can help develop interventions that sustain behavioral changes for better health.

When designing such analytics, payers should keep a few key design considerations in mind.

Thanks to their mature predictive analytics competencies, payers will have a much easier time with this next generation of population behavior analytics than their provider counterparts. But because clinical content is often unstructured, unlike claims data, payers need to pay extra attention to context and semantics when deciphering the clinical content submitted by providers. Payers can get help from vendors that can help them understand unstructured data about individual members. They can then use that data to create powerful predictive analytics solutions.


Software Modernization Strategies


Every year, I get a replacement desk calendar to help keep all of our activities straight, and for a family of four, that is no easy task. I start by collecting all of the little appointment cards the dentist, orthodontist, pediatrician and GP give us for appointments that fall beyond the current calendar’s dates, and I transcribe them all. Then I go through last year’s calendar to transfer any information that is still relevant to this year’s. Finally, I put the old calendar down in the basement next to previous years’ calendars so I can refer back to them if I need to. Last year’s calendar contains a lot of useful information, but it can no longer solve my need to organize schedules for this year.

In a very loose way, this is similar to application retirement. Many larger health plans have systems that were created several years (sometimes even several decades) ago. These legacy systems have been customized to reflect the health plan’s very specific business processes. They may be hosted on costly hardware, developed in antiquated software languages, and dependent on a few developers who are very close to retirement. The cost of supporting these aging systems can divert valuable dollars away from innovation.

The process that I use to move appointment and contact data from one calendar to the next works for me, but it is relatively small in scale. Imagine if I were trying to do this for an entire organization without losing context, detail or accuracy!

There are several methodologies for determining the best strategy for your organization to approach software modernization, including:

  • Architecture-Driven Modernization (ADM) is an initiative to standardize views of existing systems in order to enable common modernization activities such as code analysis and comprehension, and software transformation.
  • SABA (Bennett et al., 1999) is a high-level framework for planning the evolution and migration of legacy systems, taking into account both organizational and technical issues.
  • SRRT (Economic Model to Software Rewriting and Replacement Times; Chan et al., 1996) is a formal model for determining optimal software rewrite and replacement timing based on versatile metrics data (a toy version of this rewrite-or-maintain question is sketched after this list).
  • And if all else fails: Model-Driven Engineering (MDE) is being investigated as an approach for reverse engineering and then forward engineering software code.
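
As a toy illustration of the question a model like SRRT formalizes (when does replacing a system become cheaper than continuing to maintain it?), here is a short Python sketch. The cost model, with a one-time rewrite cost, flat new-system maintenance, and legacy maintenance that grows each year, is a simplifying assumption of ours, not Chan et al.'s actual formulation, and all figures are made up.

```python
# Toy rewrite-vs-maintain comparison: find the first year in which the
# one-time rewrite cost plus the new system's flat maintenance undercuts
# the cumulative cost of maintaining the legacy system, whose yearly
# maintenance grows. All figures are illustrative assumptions only.

def breakeven_year(legacy_maintain: float, growth_rate: float,
                   rewrite_cost: float, new_maintain: float,
                   horizon: int = 20):
    """Return the first year the rewrite path is cheaper, or None."""
    legacy_total, rewrite_total = 0.0, rewrite_cost
    for year in range(1, horizon + 1):
        legacy_total += legacy_maintain * (1 + growth_rate) ** (year - 1)
        rewrite_total += new_maintain
        if rewrite_total < legacy_total:
            return year
    return None

# Legacy costs $500k/yr and grows 10%/yr; a rewrite costs $2M up front
# and then $200k/yr to maintain.
print(breakeven_year(500_000, 0.10, 2_000_000, 200_000))  # -> 5
```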

My calendar migration process evolved over time; your method for software modernization, by contrast, should be well planned before the go-live date of the new software system.


Achieving Great Data in the Oil and Gas Industry

Have you noticed something different this winter season that most people are cheery about? I’ll give you a hint. It’s not the great sales going on at your local shopping mall, but something that makes getting to the mall a lot more affordable than last year: the extremely low gas prices across the globe, fueled by an over-supply of oil relative to demand, driven by geopolitics and the boom in shale oil production in North America and abroad. Like any other commodity, it’s impossible to predict where oil prices are headed. One thing is sure, however: Oil and Gas companies will need timely, quality data as firms invest in new technologies to become more agile, innovative, efficient, and competitive, as reported in a recent IDC Energy Insights Predictions report for 2015.

The report predicts:

  1. 80% of the top O&G companies will reengineer processes and systems to optimize logistics, hedge risk and efficiently and safely deliver crude, LNG, and refined products by the end of 2017.
  2. Over the next 3 years, 40% of O&G majors and all software divisions of oilfield services (OFS) will co-innovate on domain specific technical projects with IT professional service firms.
  3. The CEO will expect immediate and accurate information about top Shale Plays to be available by the end of 2015 to improve asset value by 30%.
  4. By 2016, 70% of O&G companies will have invested in programs to evolve the IT environment to a third-platform-driven architecture to support agility and readily adapt to change.
  5. With continued labor shortages and over 1/3 of the O&G workforce under 45 in three years, O&G companies will turn to IT to meet productivity goals.
  6. By the end of 2017, 100% of the top 25 O&G companies will apply modeling and simulation tools and services to optimize oil field development programs and 25% will require these tools.
  7. Spending on connectivity related technologies will increase by 30% between 2014 and 2016, as O&G companies demand vendors provide the right balance of connectivity for a more complex set of data sources.
  8. In 2015, mergers, acquisitions and divestitures, plus new integrated capabilities, will drive 40% of O&G companies to re-evaluate their current deployments of ERP and hydrocarbon accounting.
  9. With a business case built on predictive analytics and optimization in drilling, production and asset integrity, 50% of O&G companies will have advanced analytics capabilities in place by 2016.
  10. With pressures on capital efficiency, by 2015, 25% of the Top 25 O&G companies will apply integrated planning and information to large capital projects, speeding up delivery and reducing over-budget risks by 30%.

Realizing value from these investments will also require Oil and Gas firms to modernize and improve their data management infrastructure and technologies to deliver great data, whether fueling actionable insights from Big Data technology or facilitating post-merger application consolidation and integration activities. Great data is only achievable through great design, supported by capable solutions built to help access and deliver timely, trusted, and secure data to those who need it most.

A lack of proper data management investments and competencies has long plagued the oil and gas sector with less-than-acceptable data and higher operating costs. According to the “Upstream Data and Information Management Survey” conducted by Wipro Technologies, 56% of those surveyed felt that business users spent a quarter or more of their time on low-value activities caused by existing data issues (e.g. accessing, cleansing, preparing data) instead of on high-value activities (e.g. analysis, planning, decision making). The same survey showed that the biggest data management issues were timely access to required data and data quality issues in source systems.

So what can Oil and Gas CIOs and Enterprise Architects do to prepare for the future? Here are some tips for consideration:

  • Migrate and automate legacy hand-coded data transformation processes by adopting tools that streamline the development, testing, deployment, and maintenance of these complex tasks, letting developers build, maintain, and monitor data transformation rules once and deploy them across the enterprise.
  • Simplify how data is distributed across systems with more modern architectures and solutions, and avoid the cost and complexity of point-to-point integrations.
  • Deal with and manage data quality upstream at the source and throughout the data life cycle, rather than leaving end users to fix unforeseen data quality errors manually (a small validation sketch follows this list).
  • Create a centralized source of shared business reference and master data that can maintain a consistent record across heterogeneous systems, covering well asset/material information (wellhead, field, pump, valve, etc.), employee data (drill/reservoir engineer, technician), location data (often geo-spatial), and accounting data (for financial roll-ups of cost and production data).
  • Establish standards and repeatable best practices by adopting an Integration Competency Center framework to support the integration and sharing of data between operational and analytical systems.
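
As a small illustration of what managing quality upstream can look like, here is a hedged Python sketch that validates incoming well master records before they enter a central reference store, so bad records are rejected at the source rather than fixed downstream. The field names and rules are assumptions made for illustration; they are not an Informatica API.

```python
# Illustrative upstream validation for well master data: records that
# fail the checks are rejected at the source instead of polluting
# downstream systems. Field names and rules are example assumptions.

REQUIRED_FIELDS = ("well_id", "field_name", "latitude", "longitude", "operator")

def validate_well_record(record: dict) -> list:
    """Return a list of data quality errors; an empty list means the record passes."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            errors.append(f"missing required field: {field}")
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is not None and not -90 <= lat <= 90:
        errors.append(f"latitude out of range: {lat}")
    if lon is not None and not -180 <= lon <= 180:
        errors.append(f"longitude out of range: {lon}")
    return errors

incoming = [
    {"well_id": "W-1001", "field_name": "Eagle Ford", "latitude": 28.9,
     "longitude": -97.9, "operator": "Acme Energy"},
    {"well_id": "W-1002", "field_name": "", "latitude": 128.0,
     "longitude": -97.5, "operator": "Acme Energy"},
]

for rec in incoming:
    problems = validate_well_record(rec)
    status = "accepted" if not problems else "rejected: " + "; ".join(problems)
    print(f"{rec['well_id']}: {status}")
```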

In summary, low oil prices have a direct and positive impact on consumers, especially during the winter season and holidays, and I personally hope they continue for the foreseeable future, given that prices were double just a year ago. Unfortunately, no one can predict future energy prices. One thing is for sure, though: the demand for great data at Oil and Gas companies will continue to grow. As such, CIOs and Enterprise Architects will need to recognize the importance of improving their data management capabilities and technologies to ensure success in 2015. How ready are you?

Click to learn more about Informatica in today’s Energy Sector.


To Determine The Business Value of Data, Don’t Talk About Data

The title of this article may seem counterintuitive, but the reality is that the business doesn’t care about data. They care about their business processes and the outcomes that generate real value for the organization. All IT professionals know there is huge value in quality data and in having it integrated and consistent across the enterprise. The challenge is how to prove the business value of data if the business doesn’t care about it.


Informatica Named a Leader in Gartner MQ for Data Archiving


Gartner MQ for Structured Data Archiving and Application Retirement

For the first time, Gartner has released a Magic Quadrant for Structured Data Archiving and Application Retirement. This MQ describes an approach for how customers can proactively manage data growth. We are happy to report that Gartner has positioned Informatica as a leader in the space, based on our ‘completeness of vision’ and our ‘ability to execute.’

This Magic Quadrant focuses on what Gartner calls structured data archiving. Data archiving is used to index, migrate, preserve and protect application data in secondary databases or flat files, typically located on lower-cost storage, for policy-based retention. It makes data available in the context of the originating business process or application, which is especially useful in the event of litigation or an audit.

The Magic Quadrant calls out two use cases: “live archiving of production applications” and “application retirement of legacy systems.” Informatica refers to both use cases, together, as “Enterprise Data Archiving.” We consider this a foundational component of a comprehensive Information Lifecycle Management strategy.

The application landscape is constantly evolving. For this reason, data archiving is a strategic component of a data growth management strategy. Application owners need a plan to manage data as applications are upgraded, replaced, consolidated, moved to the cloud and/or retired. 

When you don’t have a plan for production data, data accumulates in the business application: performance suffers, frustrating the business, and data bloat burdens IT operations. When you don’t have a plan for legacy systems, applications accumulate in the data center, and the rising budgets trouble the CFO.

A data growth management plan must include the following:

  • How to cycle through applications and retire them
  • How to smartly store the application data
  • How to ultimately dispose of data while staying compliant

Structured data archiving and application retirement technologies help automate and streamline these tasks.
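
To make the disposal step concrete, here is a minimal Python sketch of a policy-based retention check. The record classes, retention periods and fields are illustrative assumptions, not Informatica Data Archive's actual policy model.

```python
# Illustrative policy-based retention check: an archived record becomes
# eligible for disposal only after its retention period expires and no
# legal hold applies. Classes, periods and fields are example assumptions.

from datetime import date

RETENTION_YEARS = {"invoice": 7, "hr_record": 10, "support_ticket": 3}

def disposal_eligible(record_class, archived_on, legal_hold, today=None):
    """True when retention has expired and the record is not on legal hold."""
    today = today or date.today()
    years = RETENTION_YEARS[record_class]
    # Note: ignores the Feb 29 edge case for brevity.
    expiry = archived_on.replace(year=archived_on.year + years)
    return not legal_hold and today >= expiry

print(disposal_eligible("invoice", date(2007, 3, 15), legal_hold=False))  # True: 7-year retention expired
print(disposal_eligible("invoice", date(2007, 3, 15), legal_hold=True))   # False: on legal hold
```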

Informatica Data Archive delivers unparalleled connectivity, scalability, a broad range of innovative options (e.g. Smart Partitioning, Live Archiving, and retiring aging and legacy data to the Informatica Data Vault), and comprehensive retention management, data reporting and visualization. We believe our strengths in this space are the key ingredients for deploying a successful enterprise data archive.

For more information, read the Gartner Magic Quadrant for Structured Data Archiving and Application Retirement.


Improve Oracle Performance While Streamlining IT Operations

Oracle DBAs are challenged with keeping mission-critical databases up and running with predictable performance as data volumes grow. Our customers are changing their approach, proactively managing Oracle performance while simplifying IT by leveraging our innovative Data Archive Smart Partitioning features. Smart Partitioning leverages Oracle Database Partitioning, simplifying the deployment and management of partitioning strategies. DBAs have been able to respond to requests to improve business process performance without having to write any custom code or SQL scripts.

With Smart Partitioning, DBAs have a new dialogue with business analysts. Rather than wading into the technology weeds, they ask how many months, quarters or years of data are required to get the job done, and then show, within a few clicks, how users can self-select how much data gets processed when they run queries, reports or programs. In effect, users control their own performance by controlling the volume of data they pull from the database.

Smart Partitioning is configured using easily understood business dimensions such as time, company, or business unit. These dimensions make it easy to ‘slice’ data to fit the job at hand. Performance becomes manageable and under business control. Another benefit shows up in non-production environments: creating smaller, fully functional subset databases now fits easily into your cloning operations.
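
The underlying mechanic, pruning by a business dimension so a query only touches the slice it needs, can be sketched generically. The following Python snippet builds a time-bounded query for the last N months; it illustrates the idea of partition pruning and is not Informatica's or Oracle's actual interface, and the table and column names are assumptions.

```python
# Illustrative partition pruning: bounding a query to the last N months
# means the database scans only the matching time partitions instead of
# the whole table. Table and column names are example assumptions.

from datetime import date

def last_n_months_query(table, date_column, months, today=None):
    """Build a SQL query restricted to the most recent `months` of data."""
    today = today or date.today()
    # Walk back `months` whole months from the first of the current month.
    year, month = today.year, today.month - months
    while month <= 0:
        year, month = year - 1, month + 12
    cutoff = date(year, month, 1)
    return (f"SELECT * FROM {table} "
            f"WHERE {date_column} >= DATE '{cutoff.isoformat()}'")

# A report that only needs two quarters of history reads roughly six
# monthly partitions; the rest of the table is never touched.
print(last_n_months_query("orders", "order_date", months=6))
```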

Finally, Informatica has been working closely with the Oracle Enterprise Solutions Group to align Informatica Data Archive Smart Partitioning with the Oracle ZS3 Appliance to maximize performance and savings while minimizing the complexity of implementing an Information Lifecycle Management strategy.

To learn more about our joint solutions, visit our website, read the recent press release here, download a data sheet, or watch an Informatica and Oracle joint webinar.


Application Retirement: Old Applications, and Their Place In The Sun

What springs to mind when you think about old applications? What happens to them when they have outlived their usefulness? Do they finally get to retire and have their day in the sun, or do they tenaciously hang on to life?

Think for a moment about your situation and that of those around you. From the time you started working, you have been encouraged, and sometimes forced, to think about, plan for and fund your own retirement. Now consider the portfolio your organization has built up over the years: hundreds or maybe thousands of apps, spread across numerous platforms and locations, a mix of home-grown systems built with best-in-breed tools and applications acquired from the leading vendors.

Evaluating Your Current Situation

  • Do you know how many of those “legacy” systems are still running?
  • Do you know how much these apps are costing?
  • Is there a plan to retire them?
  • How is the execution tracking to plan?

Truth is, even if you have a plan, it probably isn’t going well.

Providing better citizen service at a lower cost

Every state and local organization aspires to provide better citizen service at a lower cost. Yet many organizations spend 75% or more of their budgets just keeping the lights on, maintaining existing applications and infrastructure. Fully retiring some, or many, of these applications saves significant money. Do you know how much these applications are costing your organization? Don’t forget to include the whole range of costs that applications incur, including physical infrastructure such as mainframes, networks and storage, the required software licenses and, of course, the time of the people who actually keep them running. What happens when those with COBOL and CICS experience retire? Usually the answer is not good news. There is a lot to consider and many benefits to be gained through an effective application retirement strategy.
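
A back-of-the-envelope cost roll-up is often the first step in answering that question. The Python sketch below totals annual cost per legacy application across the categories mentioned above; every application name and figure is a made-up example, not a benchmark.

```python
# Illustrative annual cost roll-up per legacy application: physical
# infrastructure, software licenses, and the people who keep it running.
# All names and numbers are made-up examples; substitute your own figures.

legacy_apps = {
    "mainframe_claims": {
        "infrastructure": 250_000,  # mainframe, network and storage share
        "licenses": 120_000,        # database and tooling licenses
        "staff": 300_000,           # two COBOL/CICS specialists, loaded cost
    },
    "old_case_tracker": {
        "infrastructure": 40_000,
        "licenses": 25_000,
        "staff": 75_000,
    },
}

portfolio_total = 0
for app, costs in legacy_apps.items():
    annual = sum(costs.values())
    portfolio_total += annual
    print(f"{app}: ${annual:,}/year")
print(f"Portfolio total: ${portfolio_total:,}/year to keep the lights on")
```

Comparing that total against the cost of retiring the applications into an archive is usually where the business case for retirement starts.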

An August 2011 report by ESG Global showed that some 68% of organizations had six or more legacy applications running, and that 50% planned to retire at least one of them over the following 12-18 months. It would be interesting to see today’s situation and evaluate how successful those application retirement plans have been.

A common problem is knowing where to start. You know there are applications that you should be able to retire, but planning, building and executing an effective and successful plan can be tough. To help with this process, we have developed a strategy, framework and solution for effective and efficient application retirement. This is a good starting point on your application retirement journey.

To get a speedy overview, take six minutes to watch this video on application retirement.

We have created a community specifically for application managers in our ‘Potential At Work’ site. If you haven’t already signed up, take a moment and join this group of like-minded individuals from across the globe.
