Category Archives: Mergers and Acquisitions

Sensational Find – $200 Million Hidden in a Teenager’s Bedroom!

That tag line got your attention, did it not?  Last week I talked about how companies are trying to squeeze more value out of their asset data (e.g. equipment of any kind) and the systems that house it.  I also highlighted the fact that IT departments in many companies with physical-asset-heavy business models have tried (and often failed) to create a consistent view of asset data in a new ERP or data warehouse application.  These environments are neither equipped to deal with all life cycle aspects of asset information, nor do they fix the root of the data problem in the sources, i.e. where the stuff is and what it looks like.  It is like a teenager whose parents have spent thousands of dollars buying him the latest garments, but he always wears the same three outfits because he cannot find the others in the pile he hoards under his bed.  And now they have bought him a smartphone to fix it.  So before you buy him the next black designer shirt, it might be good to find out how many of the same designer shirts he already has, what state they are in and where they are.

Finding the asset in your teenager’s mess

Recently, I had the chance to work on a similar problem with a large overseas oil & gas company and a North American utility.  Both are by definition asset-heavy, very conservative in their business practices, highly regulated, heavily dependent on outside market forces such as the oil price, and geographically very dispersed; and thus, almost by default, a classic system-integration spaghetti dish.

My challenge was to find out where the biggest opportunities were in terms of harnessing data for financial benefit.

The initial sense in oil & gas was that most of the financial opportunity hidden in asset data was in G&G (geophysical & geological) and the least was on the retail side (lubricants and gas for sale at operated gas stations).  On the utility side, the go-to area for opportunity appeared to be maintenance operations.  Let’s say that I was about right with these assertions, but there were a lot more skeletons in the closet with diamond rings on their fingers than I anticipated.

After talking extensively with a number of department heads at the oil company, starting with the IT folks running half of the 400 G&G applications, the ERP instances (it turns out there were 5, not 1) and the data warehouses (3), I queried the people in charge of lubricant and crude plant operations, hydrocarbon trading, finance (tax, insurance, treasury), supply chain, production management, land management and HSE (health, safety, environmental).

The net-net was that the production management people said there was no issue, as they had already cleaned up the ERP instance around customer and asset (well) information.  The supply chain folks also indicated that they had used another vendor’s MDM application to clean up their vendor data, which funnily enough was never put back into the procurement system responsible for ordering parts.  The data warehouse/BI team was comfortable that they had cleaned up any information for supply chain, production and finance reports before dimension and fact tables were populated for any data marts.

All of this was pretty much a series of denial sessions on the 12-step road to recovery, as the IT folks had very little interaction with the business to get any sense of how relevant, correct, timely and useful these actions were for the end consumers of the information.  They also had to run and adjust fixes every month or quarter as source systems changed, new legislation dictated adjustments and new executive guidelines were announced.

While every department tried to run semi-automated, monthly clean-up jobs with scripts and some off-the-shelf software to fix their particular situation, the corporate (holding) company and any downstream consumers had no consistent basis for making sensible decisions on where and how to invest, other than throwing another legion of bodies (by now over 100 FTEs in total) at the same problem.

So at every stage of the data flow, from the sources to the ERP to the operational BI and finally the finance BI environment, people repeated the same tasks: profile, understand, move, aggregate, enrich, format and load.
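
To make that repetition concrete, here is a minimal sketch, with invented field names and data, of the kind of profile-enrich-load chain each team kept rebuilding against its own copy of the data:

```python
# Minimal sketch (hypothetical data and step names) of the pipeline each team
# rebuilt on its own: profile, move, aggregate, enrich, format and load.
from collections import Counter

def profile(records):
    """Count how many records are missing each field."""
    missing = Counter()
    for rec in records:
        for field, value in rec.items():
            if value in (None, ""):
                missing[field] += 1
    return dict(missing)

def enrich(records, reference):
    """Attach a reference attribute (e.g. region) keyed on asset id."""
    return [{**rec, "region": reference.get(rec["asset_id"], "UNKNOWN")} for rec in records]

def load(records, target):
    """Stand-in for a load step: append to an in-memory 'target' table."""
    target.extend(records)

# Each department ran some variant of this same chain against its own copy.
source = [{"asset_id": "W-001", "well_name": ""}, {"asset_id": "W-002", "well_name": "Eagle-2"}]
reference = {"W-001": "GoM", "W-002": "North Sea"}
warehouse = []
print(profile(source))          # {'well_name': 1}
load(enrich(source, reference), warehouse)
```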

Despite these departmental clean-up efforts, areas like production operations still did not know with certainty how many wellheads and bores they had, where they were downhole, or who last changed a characteristic as mundane as the well name, and why (governance, location matching).

Marketing (Trading) was surprisingly open about their issues.  They could not process incoming, anchored crude shipments into inventory, or assess who owned the counterparty they sold to and what payment terms were appropriate given the associated credit or concentration risk (reference data, hierarchy management).  As a consequence, operating cash accuracy was low despite ongoing process improvements, and that in turn incurred opportunity cost.
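
To illustrate why the hierarchy piece matters, here is a hedged sketch with made-up entities and exposure figures: per-entity exposure can look fine until ownership links roll it up to the ultimate parent.

```python
# Hedged sketch (invented names and figures) of why counterparty hierarchy data
# matters: exposure looks small per trading entity but concentrates at the
# ultimate parent once ownership links are resolved.
parent_of = {                      # legal entity -> ultimate parent (reference data)
    "Acme Trading DMCC": "Acme Holdings",
    "Acme Crude UK Ltd": "Acme Holdings",
    "Borealis Energy SA": "Borealis Group",
}
open_exposure = {                  # counterparty -> open receivables, USD millions
    "Acme Trading DMCC": 40.0,
    "Acme Crude UK Ltd": 35.0,
    "Borealis Energy SA": 20.0,
}

exposure_by_parent = {}
for entity, amount in open_exposure.items():
    parent = parent_of.get(entity, entity)
    exposure_by_parent[parent] = exposure_by_parent.get(parent, 0.0) + amount

# Without the hierarchy, no single entity breaches a $50M limit;
# with it, Acme Holdings sits at $75M and should trigger tighter payment terms.
for parent, total in exposure_by_parent.items():
    print(parent, total, "BREACH" if total > 50 else "ok")
```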

Operational assets like rig equipment carried excess insurance coverage (location, operational data linkage), and fines paid to local governments for incorrectly filing or not renewing work visas were not returned for up to two years, incurring opportunity cost (employee reference data).

A big chunk of savings was locked up in unplanned NPT (non-production time) because inconsistent, incorrect well data triggered incorrect maintenance intervals. Similarly, OEM-specific DCS (drill control system) component software lacked a central reference data store, so no alerts were triggered before components failed. Add on top of that the missing linkage between the data served by thousands of sensors via well logs and PI historians and their ever-changing roll-ups for operations and finance, and the chaos is complete.
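
As a rough illustration of what a central reference data store buys you, here is a hypothetical sketch: with OEM component limits held centrally, a simple check can flag a component before it fails, and a missing reference entry becomes visible rather than silent. The component names and thresholds below are invented.

```python
# Hedged sketch (hypothetical thresholds and readings): with a central reference
# store of OEM component limits, sensor readings can raise an alert before the
# component fails instead of after.
REFERENCE_LIMITS = {                     # component model -> max vibration (mm/s), assumed values
    "PUMP-OEM-A-200": 7.1,
    "TOPDRIVE-OEM-B-90": 4.5,
}

def check_reading(component_model: str, vibration_mm_s: float, warn_fraction: float = 0.85):
    """Return 'ALERT' when a reading crosses the warning band below the OEM limit."""
    limit = REFERENCE_LIMITS.get(component_model)
    if limit is None:
        return "NO REFERENCE DATA"       # exactly the gap described above
    return "ALERT" if vibration_mm_s >= warn_fraction * limit else "ok"

print(check_reading("PUMP-OEM-A-200", 6.4))      # ALERT (reading within 15% of the limit)
print(check_reading("UNKNOWN-MODEL", 6.4))       # NO REFERENCE DATA
```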

One approach we employed around NPT improvements was to take the revenue-from-production figure from their 10-K and combine it with the industry benchmark for the number of NPT days per 100 days of production (typically about 30% across average depths, onshore and offshore).  Then you overlay a benchmark (if they do not know the figure themselves) for how many of these NPT days were due to bad data rather than equipment failure or the like, assume you fix only a portion of that, and you quickly get to big numbers.
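
A hedged, back-of-envelope version of that calculation is sketched below; the roughly 30% NPT benchmark comes from the paragraph above, while the revenue figure and the bad-data and fixable shares are purely illustrative assumptions.

```python
# Back-of-envelope version of the NPT estimate described above, with made-up
# inputs; the ~30% NPT benchmark is from the text, the rest are assumptions.
production_revenue = 8_000_000_000   # revenue from production, per the 10-K (USD, assumed)
npt_fraction = 0.30                  # ~30 NPT days per 100 production days (benchmark)
bad_data_share = 0.20                # share of NPT attributable to bad data (assumed benchmark)
fixable_share = 0.25                 # portion of that you realistically recover (assumed)

revenue_lost_to_npt = production_revenue * npt_fraction
annual_opportunity = revenue_lost_to_npt * bad_data_share * fixable_share
print(f"Annual opportunity: ${annual_opportunity:,.0f}")   # $120,000,000 with these inputs
```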

When I sat back and looked at all the potential, it came to more than $200 million in savings over five years, and this before any sensor data from rig equipment, like the myriad of siloed applications running within a drill control system, is integrated and leveraged via a Hadoop cluster to influence operational decisions like drill string configuration or azimuth.

Next time I’ll share some insights into the results of my most recent utility engagement, but I would love to hear about your experience in these two or other similar industries.

Disclaimer:
Recommendations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer depends on a variety of factors, many of which are not under Informatica’s control.  Nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized, and no warranty or representation of success, either express or implied, is made.

Squeezing the Value out of the Old Annoying Orange

I believe that most people in the software business would agree that it is tough enough to calculate, and hence financially justify, the purchase or build of an application (especially middleware) to a business leader or even a CIO.  Most business-centric IT initiatives involve improving processes (order, billing, service) and visualization (scorecarding, trending) so that end users can be more efficient in engaging accounts.  Some of these have actually migrated to targeting improvements towards customers rather than their logical placeholders like accounts.  Similar strides have been made in the realm of other party types (vendor, employee) as well as product data.  These initiatives also tackle analyzing larger or smaller data sets and providing a visual set of clues on how to interpret historical or predictive trends in orders, bills, usage, clicks, conversions, etc.

Squeeze that Orange

If you think this is a tough enough proposition in itself, imagine the challenge of quantifying the financial benefit derived from understanding where your “hardware” is physically located, how it is configured, who maintained it, when and how.  Depending on the business model, you may even have to figure out who built it or owns it.  All of this has bottom-line effects on how, by whom and when expenses are paid and revenues get realized and recognized.  And then there is the added complication that these dimensions of hardware are often fairly dynamic, as assets can also change ownership and/or physical location and hence tax treatment, insurance risk, etc.

Such hardware could be a pump, a valve, a compressor, a substation, a cell tower, a truck or components within these assets.  Over time, with new technologies and acquisitions coming about, the systems that plan for, install and maintain these assets become very departmentalized in terms of scope and specialized in terms of function.  The same application that designs an asset for department A or region B is not the same as the one accounting for its value, which is not the same as the one reading its operational status, which is not the same as the one scheduling maintenance, which is not the same as the one billing for any repairs or replacement.  The same folks who said the Data Warehouse was the “Golden Copy” now say the “new ERP system” is the new central source for everything.  Practitioners know that this is either naiveté or maliciousness. And then there are manual adjustments….
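
As a small illustration of that fragmentation, here is a hypothetical sketch in which the same pump lives in design, finance and maintenance systems under different keys, and only a hard-won cross-reference lets you see it as one asset. All system names, keys and values are invented.

```python
# Minimal sketch (invented systems and fields) of the fragmentation described
# above: the same physical pump exists in design, finance and maintenance
# systems under different keys, and a cross-reference is needed to see it whole.
design_system = {"P-1017": {"type": "centrifugal pump", "region": "B"}}
finance_system = {"FA-88321": {"book_value": 145_000, "in_service": "1998-04-01"}}
maintenance_system = {"EQ-55-PMP-17": {"last_service": "2013-06-12", "status": "running"}}

# The cross-reference itself is the hard-won piece of master data.
asset_xref = {"PUMP-0017": {"design": "P-1017", "finance": "FA-88321", "maintenance": "EQ-55-PMP-17"}}

def golden_view(asset_id: str) -> dict:
    """Assemble one consolidated view of the asset from the three silos."""
    keys = asset_xref[asset_id]
    return {
        **design_system[keys["design"]],
        **finance_system[keys["finance"]],
        **maintenance_system[keys["maintenance"]],
    }

print(golden_view("PUMP-0017"))
```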

Moreover, to truly squeeze value out of these assets as they are installed and upgraded, the massive amounts of data they generate in a myriad of formats and intervals need to be understood, moved, formatted, fixed, interpreted at the right time, and stored for future use in a cost-sensitive, easy-to-access and contextually meaningful way.

I wish I could tell you that one application does it all, but the unsurprising reality is that it takes a concoction of several.  Few, if any, of the legacy applications supporting the asset life cycle will be retired, as they often house data in formats commensurate with the age of the assets they were built for.  It makes little financial sense to shut down these systems in a big-bang approach; it is better to migrate region after region and process after process to the new system.  After all, some of the assets have been in service for 50 or more years, and the institutional knowledge tied to them is becoming nearly as old.  Also, it is probably easier to perform the often-required manual data fixes (hopefully only outliers) bit by bit, especially to accommodate imminent audits.

So what do you do in the meantime, until all the relevant data is in a single system, to get an enterprise-level way to fix your asset tower of Babel and leverage the data volume rather than treat it like an unwanted stepchild?  Most companies that operate in asset-heavy, fixed-cost business models do not want to create a disruption but a steady tuning effect (squeezing the data orange), something rather unsexy in this internet day and age.  This is especially true in “older” industries where data is still considered a necessary evil, not an opportunity ready to be exploited.  The fact is, though, that in order to improve the bottom line, we had better get going, even if it is with baby steps.

If you are aware of business models that struggle to leverage their data, write to me.  If you know of an annoying, peculiar or esoteric data “domain” that does not lend itself to being easily leveraged, share your thoughts.  Next time, I will share some examples of how certain industries try to work in this environment, what they envision and how they go about getting there.


Announcing Informatica Cloud Spring 2013

On Wednesday we announced our latest cloud integration release – Informatica Cloud Spring 2013. It’s a major step forward in terms of breadth and depth for our software as a service (SaaS) solution. Why, you ask?

Well, there are a few aspects to today’s announcement that I think are particularly noteworthy. Here’s a summary.

(more…)


Much Ado About Nothing

All the talk about whether or not healthcare organizations will adopt cloud solutions is much ado about nothing – the simple fact is that they already have adopted cloud solutions and the trend will only accelerate.

The typical hospital IT department is buried under the burden of supporting hundreds of legacy and departmental systems and the multi-year implementation of at least one, if not more, enterprise electronic health record applications to meet the requirements of meaningful use, all while contending with a conversion to ICD-10 and a litany of other never-ending regulatory and compliance mandates. And this is happening in an economic climate of decreasing reimbursements and flat or declining IT budgets. (more…)


How to Go From Multiple Salesforce Orgs to a Single Customer View

 


Tracking key information across global, regional and departmental levels is often hard enough without considering multiple Salesforce orgs in your business.

If you’re here, then you may already know what a Salesforce org is, but if not, we have a definition available straight from the horse’s mouth:

“A deployment of Salesforce with a defined set of licensed users. An organization/org is the virtual space provided to an individual customer of salesforce.com. Your organization includes all of your data and applications, and is separate from all other organizations.” (more…)
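
As a very rough illustration of what building a single customer view across orgs involves, here is a hedged sketch with synthetic records that consolidates contacts from two hypothetical orgs by normalized email; real projects rely on far more sophisticated matching and survivorship rules.

```python
# Hedged sketch (synthetic records, simplistic matching) of consolidating
# contacts pulled from two Salesforce orgs into one customer view by
# normalized email; real projects use fuzzier matching and survivorship rules.
org_emea = [{"Id": "003A1", "Name": "Jane Doe", "Email": "Jane.Doe@Example.com", "Phone": ""}]
org_amer = [{"Id": "003B7", "Name": "Jane Doe", "Email": "jane.doe@example.com", "Phone": "+1 555 0100"}]

def consolidate(*orgs):
    golden = {}
    for org in orgs:
        for rec in org:
            key = rec["Email"].strip().lower()              # naive match key
            merged = golden.setdefault(key, {"source_ids": []})
            merged["source_ids"].append(rec["Id"])          # keep lineage to both orgs
            for field in ("Name", "Email", "Phone"):
                if rec.get(field) and not merged.get(field):   # first non-empty value wins
                    merged[field] = rec[field]
    return list(golden.values())

print(consolidate(org_emea, org_amer))
# one customer record with both org Ids attached, not two disconnected rows
```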


ANNOUNCING! The 2012 Data Virtualization Architect-to-Architect & Business Value Program

Today, agility and timely visibility are critical to the business. No wonder CIO.com states that business intelligence (BI) will be the top technology priority for CIOs in 2012. However, is your data architecture agile enough to handle these exacting demands?

In his blog Top 10 Business Intelligence Predictions For 2012, Boris Evelson of Forrester Research, Inc., states that traditional BI approaches often fall short for the following two reasons (among many others):

  • BI hasn’t fully empowered information workers, who still largely depend on IT
  • BI platforms, tools and applications aren’t agile enough (more…)

What it Takes to Be a Leader in Data Virtualization!

If you haven’t already, I think you should read The Forrester Wave™: Data Virtualization, Q1 2012, for two reasons: one, to truly understand the space, and two, to understand the critical capabilities required of a solution that solves real data integration problems.

At the very outset, let’s clearly define Data Virtualization. Simply put, Data Virtualization is foundational to Data Integration. It enables fast and direct access to the critical data and reports that the business needs and trusts. It is not to be confused with simple, traditional Data Federation. Instead, think of it as a superset that must complement existing data architectures to support BI agility, MDM and SOA. (more…)
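
For intuition only, here is a minimal, hypothetical sketch of the core idea: a virtual view that joins two stand-in sources at query time without copying either into a new store. It is not any vendor's implementation, just the concept.

```python
# Hedged sketch (in-memory stand-ins for real sources): a "virtual view" that
# joins a warehouse table and an operational table at query time, without
# copying either into a new store - the basic idea behind data virtualization.
warehouse_orders = [{"customer_id": 1, "total": 1200.0}, {"customer_id": 2, "total": 300.0}]
crm_customers = {1: {"name": "Acme Corp", "segment": "Enterprise"},
                 2: {"name": "Globex", "segment": "SMB"}}

def virtual_order_view():
    """Yield joined rows on demand; nothing is materialized or replicated."""
    for order in warehouse_orders:
        customer = crm_customers.get(order["customer_id"], {})
        yield {**order, **customer}

for row in virtual_order_view():
    print(row)
```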


Dodd-Frank Legislation and Structured Data Retention

The “Dodd-Frank Wall Street Reform and Consumer Protection Act” has recently been passed by the US federal government to regulate financial institutions. Per this legislation, there will be more “watchdog” agencies auditing banks, lending and investment institutions to ensure compliance. As an example, an Office of Financial Research within the US Treasury will be responsible for collecting and analyzing data. This legislation brings with it a higher risk of fines for non-compliance. (more…)


Don’t Forget The Legacy Applications After Data Center Consolidation

As part of their cost-cutting programs, organizations are consolidating data centers and the applications within them.  Federal and state agencies in the public sector are among those where IT consolidation and moving applications to the cloud are top priorities, as part of an overall goal to increase efficiency and eliminate costs.  In other industries, many consolidations are also under way due to mergers and acquisitions and other cost-cutting initiatives.  As you plan or undergo a consolidation project, you also need to plan for the retirement of the legacy, redundant applications that are left behind.

(more…)
