Category Archives: CIO
Most people in the software business would agree that it is hard enough to calculate, and hence financially justify, the purchase or build of an application (especially middleware) to a business leader or even a CIO. Most business-centric IT initiatives involve improving processes (order, billing, service) and visualization (scorecarding, trending) so that end users can engage accounts more efficiently. Some of these initiatives have migrated to targeting improvements toward customers rather than their logical placeholders, such as accounts. Similar strides have been made with other party types (vendor, employee) as well as product data. These initiatives also tackle analyzing larger or smaller data sets and providing visual clues for interpreting historical or predictive trends in orders, bills, usage, clicks, conversions and so on.
If you think that is a tough proposition in itself, imagine the challenge of quantifying the financial benefit of understanding where your “hardware” is physically located, how it is configured, and who maintained it, when and how. Depending on the business model, you may even have to figure out who built it or owns it. All of this has bottom-line effects on how, by whom and when expenses are paid and revenues are realized and recognized. And then there is the added complication that these dimensions of hardware are fairly dynamic: assets can change ownership and/or physical location, and hence tax treatment, insurance risk and so on.
Such hardware could be a pump, a valve, a compressor, a substation, a cell tower, a truck or components within these assets. Over time, as new technologies and acquisitions arrive, the systems that plan for, install and maintain these assets become departmentalized in scope and specialized in function. The application that designs an asset for department A or region B is not the one accounting for its value, which is not the one reading its operational status, which is not the one scheduling maintenance, which is not the one billing for repairs or replacement. The same folks who said the Data Warehouse was the “Golden Copy” now say the “new ERP system” is the central source for everything. Practitioners know that this is either naiveté or maliciousness. And then there are manual adjustments…
Moreover, to truly squeeze value out of these assets as they are installed and upgraded, the massive amounts of data they generate, in a myriad of formats and intervals, need to be understood, moved, formatted, fixed, interpreted at the right time and stored for future use in a cost-sensitive, easy-to-access and contextually meaningful way.
I wish I could tell you one application does it all, but the unsurprising reality is that it takes a concoction of several. Few, if any, of the legacy applications supporting the asset life cycle will be retired, as they often house data in formats commensurate with the age of the assets they were built for. It makes little financial sense to shut these systems down in a big-bang approach; rather, migrate region after region and process after process to the new system. After all, some of the assets have been in service for 50 years or more, and the institutional knowledge tied to them is becoming nearly as old. It is also probably easier to make the often-required manual data fixes (hopefully only outliers) bit by bit, especially to accommodate imminent audits.
So what do you do in the meantime, until all the relevant data is in a single system, to get an enterprise-level way to fix your asset Tower of Babel and leverage the data volume rather than treat it like an unwanted stepchild? Most companies that operate asset-heavy, fixed-cost business models do not want a disruption but a steady tuning effect (squeezing the data orange), something rather unsexy in this internet day and age. This is especially true in “older” industries, where data is still considered a necessary evil rather than an opportunity ready to be exploited. The fact is, though, that to improve the bottom line we had better get going, even if it is with baby steps.
If you are aware of business models and their difficulties in leveraging data, write to me. If you know of an annoying, peculiar or esoteric data “domain” that does not lend itself to being easily leveraged, share your thoughts. Next time, I will share some examples of how certain industries try to work in this environment, what they envision and how they go about getting there.
CIOs, CDOs and other IT executives are wrestling with a technology landscape that is not merely shifting and evolving; it is transforming faster than mere mortals can keep up with. It is impossible to predict which technologies will be the right ones for your organization three years from now; change is simply happening too fast. This Potential at Work article talks about the fundamental shift in architectural approach necessary for managing change, and how Informatica Vibe is the architectural secret sauce. Check out how Tony Young and Mark Smith think about the problem and the way out of the morass, and chime in with your own ideas.
Everyone knows that Informatica is the data integration company that helps organizations connect their disparate software into a cohesive and synchronized enterprise information system. The value to the business is enormous and well documented in the form of use cases, ROI studies and industry-leading loyalty and renewal rates.
Event processing, on the other hand, is a technology that has been around for only a few years and has yet to reach Main Street in Systems City, IT. But if you look at how event processing is being used, it is amazing that more people have not heard about it. The idea at its core (pun intended) is very simple: monitor your data and events, the things that happen on a daily, hourly, even minute-by-minute basis; look for important patterns that are positive or negative indicators; and set up your systems to take action automatically when those patterns appear. For example, notify a sales rep when a pattern indicates a customer is ready to buy, or stop a transaction because your company is about to be defrauded.
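The core idea above (watch a stream of events, match a pattern, fire an action) can be sketched in a few lines. This is a minimal illustration only, not any particular product's API; the class name, the "ready to buy" rule (three product views within a minute) and the callback are all hypothetical.

```python
from collections import defaultdict, deque

class EventMonitor:
    """Toy event-processing rule: fire an action when one customer
    generates `pattern_size` matching events within `window_seconds`."""

    def __init__(self, pattern_size, window_seconds, action):
        self.pattern_size = pattern_size      # events needed to trigger the action
        self.window_seconds = window_seconds  # events must fall inside this window
        self.action = action                  # callback invoked when the pattern fires
        self.history = defaultdict(deque)     # recent event timestamps per customer

    def observe(self, customer, event_type, timestamp):
        """Record one event; trigger the action if the pattern completes."""
        if event_type != "product_view":      # this rule only watches product views
            return
        times = self.history[customer]
        times.append(timestamp)
        # Drop events that have aged out of the time window.
        while times and timestamp - times[0] > self.window_seconds:
            times.popleft()
        if len(times) >= self.pattern_size:
            self.action(customer)
            times.clear()                     # reset so we do not re-fire immediately

alerts = []
monitor = EventMonitor(
    pattern_size=3, window_seconds=60,
    action=lambda cust: alerts.append(f"notify sales rep: {cust} is ready to buy"))

for t in (0, 20, 45):                         # three views inside one minute
    monitor.observe("acme-corp", "product_view", t)

print(alerts)
```

A real event-processing engine adds far more (declarative pattern languages, distributed state, guaranteed delivery), but the shape is the same: continuous observation, windowed pattern matching, automatic action.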
Since this is an Informatica blog, you probably have a decent set of “muscles” in place already, so why, you ask, would you need six-pack abs? Because six-pack abs are a good indication of a strong muscular core and are the basis of a stable and highly athletic body. The parallel holds for companies: in today’s competitive business environment, you need strength, stability and agility to compete. And since IT systems increasingly ARE the business, if your company is not performing as lean and mean as possible, you can be sure your competitors will be looking to implement every advantage they can.
You may also be wondering why you would need something like event processing when you already have good business intelligence systems in place. The reality is that it is not easy to monitor and measure useful but sometimes hidden data, event, sensor and social media sources, or to discern which patterns have meaning and which turn out to be false positives. But the real difference is that BI usually reports to you after the fact, when the value of acting on the situation has diminished significantly.
So while muscles are important for standing up and running, and good-quality, strong muscles are necessary for heavy lifting, it is those six-pack abs on top of it all that give you the lean, mean fighting machine to identify significant threats and opportunities in your data and, in essence, to better compete and win.
“Business-IT alignment.” The words have been touted so much by vendors (including Informatica) that they have become a platitude. But that doesn’t mean the concept itself isn’t still critical. This recent article in the IT Leader Potential at Work community lists three actions IT must take to make a real impact on the business, starting by “inciting a revolution” among IT staff and turning them into business thinkers. Check out the article and share your thoughts here.
Whether you are establishing a new outsourced delivery model for your integration services or getting ready for the next round of contract negotiations with your existing supplier, you need a way to hold the supplier accountable – especially when it is an exclusive arrangement. Here are four key metrics that should be included in the multi-year agreement. (more…)
We are excited to announce the new Potential at Work Community for Application Leaders.
As an application leader, you have a very demanding job. You have to successfully manage issues such as:
- Driving the maximum business value from your company’s enterprise application investments
- Keeping all of your enterprise applications current and meeting user requirements
- Delivering on your service agreements and managing all of the “ilities”
- Defining an enterprise application strategy that includes on-premise and cloud
- Delivering timely, authoritative and trustworthy data for your enterprise applications
This community is here to help you to do exactly that and to help you to excel in both your current job and your career ahead. Our goal is to provide tips, insights, best practices and information from experts to help you become more successful.
Our first edition is focused on the theme of managing an enterprise cloud application strategy. For those who are in the process of selecting cloud application vendors, I’ve included a very handy Vendor Selection Checklist that is used by Informatica’s Vice President of Applications.
Are we interested in your input to the community? Absolutely! If you have an idea or content to share with the community, please contact us and we will get you published.
Join the community and start unleashing your potential by clicking on this link:
Roger Nolan firstname.lastname@example.org
Julie Lockner email@example.com
Click here for more information on the Potential at Work communities.
Informatica Corporation CIO Tony Young talks about the benefits of the Virtual Data Machine for companies.
Last month, in The Biggest Dirty Little Secret in IT, I highlighted a disturbing phenomenon: in highly data-driven organizations with large IT departments, the larger they get, the less efficient they become. In short, diseconomies of scale begin to creep in, slowing down processes and driving up costs. The article identified the root cause as a high degree of manual IT processes that do not scale well. The question I will address in this article is what we can do to tackle the problem, and what it is worth. (more…)