Category Archives: Operational Efficiency

Consider PIM When (Re)Defining Your Direct-To-Consumer Model


Over the past few years, we have witnessed a marked shift in customer behavior. Pervasive internet connectivity – along with the exponential adoption of mobile devices – has enabled shoppers to research and purchase products of all kinds, anytime and anywhere, using whatever combination of touch points they find most convenient. This is not a passing fad.

Consumers expect rich data and images to make purchase choices; business users require access to analytical data in order to make mission-critical decisions. These demands for information are driving a need for improved product data availability and accuracy. And this is changing the way businesses go to market.

A staggering number of retailers and manufacturers are reshaping their models in response to this challenge. The direct-to-consumer (DTC) model, while not new, is rapidly taking center stage as the way to address it. The optimal DTC model will vary depending on specific and contextual business objectives, and there are many strategic benefits to going direct, but the main ones include growing sales, gaining control over pricing, strengthening the brand, getting closer to consumers, and testing new products and markets.

Supermarket Tube Informatica

It is my contention that while the DTC model is gaining the attention it deserves, much remains to be done. Among the many challenges that DTC poses, the processes and activities associated with sourcing product information, enriching product data to drive sales and lower returns, and managing product assortments across all channels loom large. More precisely, the challenges that need to be overcome are exemplified by the points below (a rough data-model sketch follows the list):

  • Products have several variations to support different segments, markets, and campaigns.
  • Product components, ingredients, care information, environmental impact data and other facets of importance to the customer all need to be captured and kept current.
  • People are visual. As a result, easy website navigation is essential. Eye-catching images that highlight your products or services (perhaps shown being used or displayed as intended) are an effective way to communicate information visually and make it easier for customers to evaluate options. If information and pictures are readily accessible, customers are more likely to engage.
  • Ratings, reviews and social data need to be stored within the product’s record rather than in separate systems.
  • Purchasing and sales measurements – for example, in-store sales, return rates, sales velocity, online product views, and viewing-and-purchasing correlations – are often held across several systems, yet this information is increasingly needed for search and recommendation.
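To make the scope of that list concrete, here is a minimal sketch – plain Python with invented field names, not any particular PIM vendor’s schema – of what a consolidated product record ends up carrying once variants, enrichment facets, social data and channel metrics live together:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ProductVariant:
    sku: str
    market: str                 # e.g. "US", "DE"
    segment: str                # e.g. "premium", "outlet"
    attributes: Dict[str, str] = field(default_factory=dict)

@dataclass
class ProductRecord:
    product_id: str
    name: str
    variants: List[ProductVariant] = field(default_factory=list)
    # Enrichment facets the shopper cares about
    ingredients: List[str] = field(default_factory=list)
    care_instructions: str = ""
    environmental_data: Dict[str, str] = field(default_factory=dict)
    image_urls: List[str] = field(default_factory=list)
    # Social data kept on the record, not in a separate silo
    avg_rating: float = 0.0
    review_count: int = 0
    # Channel metrics that feed search and recommendations
    channel_metrics: Dict[str, Dict[str, float]] = field(default_factory=dict)

record = ProductRecord(
    product_id="P-1001",
    name="Trail Running Shoe",
    variants=[ProductVariant("P-1001-42-US", "US", "premium",
                             {"size": "42", "color": "red"})],
    ingredients=["recycled polyester", "natural rubber"],
    avg_rating=4.4,
    review_count=212,
    channel_metrics={"online": {"views": 15200, "return_rate": 0.08},
                     "in_store": {"units_sold": 430}},
)
print(len(record.variants), record.channel_metrics["online"]["return_rate"])
```

Even this toy shows why spreadsheets and channel-specific databases stop scaling: every facet above has its own source, owner and update cadence, and they all have to converge on one record.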

The importance of product data, combined with the increased demands placed on the business by inefficient, non-scaling approaches to data management, makes a compelling case for considering a PIM system to ‘power’ cross-channel retail. Once established, PIM users repeatedly report higher ROI. It is likely that we’ll see PIM systems rank alongside CRM, ERP, CMS, order management and merchandising systems as the pillars of cross-channel retailing at scale.

For all these reasons, choosing the right PIM strategy (and partner) is now a key decision. Get this decision wrong and it could become an expensive mistake.


Death of the Data Scientist: Silver Screen Fiction?

Maybe the word “death” is a bit strong, so let’s say “demise” instead. Recently I read an article in the Harvard Business Review about how Big Data and Data Scientists will rule the 21st-century corporation and how they will have to operate for maximum value. The thing I found rather disturbing was that it takes a PhD – probably a few of them – in a variety of math areas to give executives the insight they need to make better decisions, ranging from which product to develop next to whom to sell it and where.

Who will walk the next long walk…. (source: Wikipedia)

Don’t get me wrong – this is mixed news for any enterprise software firm helping businesses locate, acquire, contextually link, understand and distribute high-quality data. The existence of such a high-value role validates product development, but it also limits adoption. It is also great news that data has finally gathered the attention it deserves. But I am starting to ask myself why it always takes individuals with a “one-in-a-million” skill set to add value. What happened to the democratization of software? Why is the design starting point for enterprise software not more like B2C applications – an iPhone app, say – where simpler is better? Why is it always a gradual “Cold War” evolution instead of a near-instant French Revolution?

Why do development environments for Big Data not accommodate limited or existing skills, but instead always cater to the most complex scenarios? Well, the answer could be that the first customers are very large, very complex organizations with super-complex problems they have been unable to solve so far. But if analytical apps have become a self-service proposition for business users, data integration should be as well. So why does access to a lot of fast-moving and diverse data require scarce Pig or Cassandra developers to get the data into an analyzable shape, and a PhD to query and interpret the patterns?
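To make “getting the data into an analyzable shape” concrete, here is a small pandas sketch – source names, columns and values are invented for illustration – of the routine join-and-aggregate work that today tends to land on a specialist rather than the business user who needs the answer:

```python
import pandas as pd

# Hypothetical extracts: one of transactions, one of web interactions
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "amount": [120.0, 80.0, 45.0, 300.0],
})
interactions = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "page_views": [12, 3, 7, 5],
})

# Shape the data: aggregate each source, then join on the shared key
shaped = (
    transactions.groupby("customer_id", as_index=False)["amount"].sum()
    .merge(
        interactions.groupby("customer_id", as_index=False)["page_views"].sum(),
        on="customer_id",
        how="outer",
    )
    .fillna(0)
)
print(shaped)
```

Nothing here is conceptually hard – which is exactly the point: the barrier is tooling and plumbing, not the analytical question itself.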

I realize new technologies start from a small foundation and, as they spread, supply will attempt to catch up and create an equilibrium. However, this is a problem that has existed for decades in many industries, such as oil & gas, telecommunications, the public sector and retail. Whenever I talk to architects and business leaders in these industries, they chuckle at “Big Data” and tell me “yes, we’ve got that – and by the way, we have been dealing with this reality for a long time.” By now I would have expected the skill (cost) side of turning data into meaningful insight to have been driven down far more significantly.

Informatica has made a tremendous push in this regard with its “Map Once, Deploy Anywhere” paradigm. I cannot wait to see what’s next – and I just saw something recently that got me very excited. Why, you ask? Because at some point I would like to see at least a business super-user pummel terabytes of transaction and interaction data into an environment (a Hadoop cluster, an in-memory database…) and massage it so that a self-created dashboard gets him or her where they need to go. This should cover questions like “Where is the data I need for this insight?”, “What is missing, and how do I get that piece in the best way?” and “How do I want it to look so I can share it?” All that should be required is a semi-experienced knowledge of Excel and PowerPoint to get your hands on advanced Big Data analytics. Don’t you think? Or do you believe this role will disappear as quickly as it surfaced?


History Repeats Itself Through Business Intelligence (Part 2)


In a previous blog post, I wrote about how, when business “history” is reported via Business Intelligence (BI) systems, it is usually too late to make a real difference. In this post, I’m going to talk about how business history becomes much more useful when it is combined operationally and in real time.

The historian E. P. Thompson pointed out that all history is the history of unintended consequences. His theory was that history is not only recorded in documents; it is ultimately derived from examining cultural meanings and the structures of society through hermeneutics (the interpretation of texts), semiotics and the many signs of the times. His conclusion was that history is created by people’s subjectivity and therefore ultimately represents how they REALLY live.

The same can be extrapolated to businesses. However, the BI systems of today capture only a minuscule piece of the larger pie of knowledge that could be gained from things like meetings, videos, sales calls, anecdotal win/loss reports, shadow IT projects, 10-Ks and 10-Qs, even company blog posts ;-) . The point is: how can you better capture the meaning, and perhaps the importance, of the everyday non-database events taking place in your company and its activities – in other words, how it REALLY operates?

One of the keys to figuring out how businesses really operate is identifying and utilizing the undocumented RULES that underlie nearly every business. Select employees, often veterans, know these rules intuitively – and every company has them. Watch them and you will see they just have a knack for getting projects pushed through the system, making customers happy, or diagnosing a problem quickly and with little fanfare. They simply know how things work and what needs to be done.

These rules have been, and still are, difficult to quantify and apply – to “data-ify,” if you will. Certain companies (and hopefully Informatica) will end up being major players in the race to data-ify these non-traditional rules and events, in addition to helping companies make sense of big data in a whole new way. Daydreaming about it, it’s not hard to imagine business systems that will eventually understand the optimization rules of a business, account for possible unintended scenarios or consequences, and then apply them at the moment they are most needed. That, in any case, is the goal of a new generation of Operational Intelligence systems.
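As a thought experiment, “data-ifying” one of those undocumented rules might look something like the toy sketch below: a veteran’s intuition (“if a premium customer has two late shipments within a week, call them before they call us”) expressed as a condition over operational events. The event shapes and the rule are invented for illustration and are not a description of any Informatica product.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Toy event stream: (timestamp, customer_id, tier, event_type)
events = [
    (datetime(2014, 1, 6), "C-17", "premium", "late_shipment"),
    (datetime(2014, 1, 8), "C-17", "premium", "late_shipment"),
    (datetime(2014, 1, 9), "C-42", "standard", "late_shipment"),
]

def veteran_rule(events, window=timedelta(days=7)):
    """The undocumented rule made explicit: two late shipments for a
    premium customer within a week should trigger a proactive call."""
    recent = defaultdict(list)
    alerts = []
    for ts, customer, tier, kind in sorted(events):
        if kind != "late_shipment" or tier != "premium":
            continue
        # Keep only occurrences inside the rolling window
        recent[customer] = [t for t in recent[customer] if ts - t <= window]
        recent[customer].append(ts)
        if len(recent[customer]) >= 2:
            alerts.append(f"Call {customer} proactively "
                          f"({len(recent[customer])} late shipments this week)")
    return alerts

print(veteran_rule(events))
```

The hard part, of course, is not the code – it is extracting the rule from the veteran’s head and trusting the data that feeds it.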

In my final post on the subject, I’ll explain (in a nutshell) how it works and the business problems it solves. And if I’ve managed to pique your curiosity and you want to hear about Operational Intelligence sooner, tune in to a webinar we’re holding TODAY at 10 AM PST. Here’s the link:

http://www.informatica.com/us/company/informatica-talks/?commid=97187


True Facts About Informatica RulePoint Real-Time Integration


Shhhh… RulePoint Programmer Hard at Work

End of year. Out with the old, in with the new. A time when everyone gets their ducks in a row, clears the pipeline and gets ready for the New Year. For R&D, one of the gating events of the New Year is the annual sales kickoff, where we present the new features to Sales so they can better communicate a product’s road map and value to potential buyers. All well and good. But part of the process is filling out a Q&A that explains the product “value prop” – and they only gave us four lines. I think the answer also helps determine speaking slots and priority.

So here’s the question I had to answer:

FOR SALES TO UNDERSTAND THE PRODUCT BETTER, WE ASK THAT YOU ANSWER THE FOLLOWING QUESTION:

WHAT IS THE PRODUCT VALUE PROPOSITION AND ARE THERE ANY SIGNIFICANT DEPLOYMENTS OR OTHER CUSTOMER EXPERIENCES YOU HAVE HAD THAT HAVE HELPED TO DEFINE THE PRODUCT OFFERING?

Here’s what I wrote:

Informatica RULEPOINT is a real-time integration and event processing software product that is deployed very innovatively by many businesses and vertical industries.  Its value proposition is that it helps large enterprises discover important situations from their droves of data and events and then enables users to take timely action on discovered business opportunities as well as stop problems while or before they happen.
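For readers who have never seen event processing up close, the generic shape of what products in this space do is an event-condition-action loop: watch a stream of events, evaluate conditions against each one, and fire an action while the situation is still actionable. The sketch below is plain Python with invented rule names and event fields – it is not RulePoint’s actual rule language, just the underlying pattern.

```python
from typing import Callable, Dict, List

Event = Dict[str, float]

class Rule:
    """Generic event-condition-action rule (illustrative, not RulePoint syntax)."""
    def __init__(self, name: str,
                 condition: Callable[[Event], bool],
                 action: Callable[[Event], None]):
        self.name, self.condition, self.action = name, condition, action

    def evaluate(self, event: Event) -> None:
        if self.condition(event):
            self.action(event)

rules: List[Rule] = [
    Rule("possible_fraud",
         condition=lambda e: e["amount"] > 10_000 and e["country_changes_24h"] >= 2,
         action=lambda e: print(f"HOLD transaction {e['tx_id']:.0f} for review")),
    Rule("ready_to_buy",
         condition=lambda e: e["product_page_views_1h"] >= 5,
         action=lambda e: print(f"Notify sales rep about account {e['account_id']:.0f}")),
]

stream = [
    {"tx_id": 9001, "amount": 15_000, "country_changes_24h": 3,
     "product_page_views_1h": 0, "account_id": 0},
    {"tx_id": 9002, "amount": 120, "country_changes_24h": 0,
     "product_page_views_1h": 7, "account_id": 314},
]

for event in stream:            # in a real deployment this is a live feed
    for rule in rules:
        rule.evaluate(event)
```

The commercial value comes from doing this at scale, with low latency, on live data – which is the serious half of the answer above.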

Here’s what I wanted to write:

RulePoint is scalable, low latency, flexible and extensible and was born in the pure and exotic wilds of the Amazon from the minds of natives that have never once spoken out loud – only programmed.  RulePoint captures the essence of true wisdom of the greatest sages of yesteryear. It is the programming equivalent and captures what Esperanto linguistically tried to do but failed to accomplish.

As to high availability (HA), there has never been anything in the history of software as available as RulePoint. Madonna’s availability pales in comparison to RulePoint’s availability. We are talking 8 nines, cubed and then squared ( ;-) ). Oracle = Unavailable. IBM = Unavailable. Informatica RulePoint = Available.

RulePoint works hard, but plays hard too.  When not solving those mission critical business problems, RulePoint creates Arias worthy of Grammy nominations. In the wee hours of the AM, RulePoint single-handedly prevented the outbreak and heartbreak of psoriasis in East Angola.

One of the little known benefits of RulePoint is its ability to train the trainer, coach the coach and play the player. Via chalk talks? No, RulePoint uses mind melds instead.  Much more effective. RulePoint knows Chuck Norris.  How do you think Chuck Norris became so famous in the first place? Yes, RulePoint. Greenpeace used RulePoint to save dozens of whales, 2 narwhal, a polar bear and a few collateral penguins (the bear was about to eat the penguins).  RulePoint has been banned in 16 countries because it was TOO effective.  “Veni, Vidi, RulePoint Vici” was Julius Caesar’s actual quote.

The inspiration for Gandalf in the Lord of the Rings? RulePoint. IT heads worldwide shudder with pride when they hear the name RulePoint mentioned and know that they acquired it. RulePoint is stirred but never shaken. RulePoint is used to train the Sherpas that help climbers reach the highest of heights. RulePoint cooks Minute rice in 20 seconds.

The running of the bulls in Pamplona every year -  What do you think they are running from? Yes,  RulePoint. RulePoint put the Vinyasa back into Yoga. In fact, RulePoint will eventually create a new derivative called Full Contact Vinyasa Yoga and it will eventually supplant gymnastics in the 2028 Summer Olympic games.

The laws of physics were disproved last year by RulePoint.  RulePoint was drafted in the 9th round by the LA Lakers in the 90s, but opted instead to teach math to inner city youngsters. 5 years ago, RulePoint came up with an antivenin to the Black Mamba and has yet to ask for any form of recompense. RulePoint’s rules bend but never break. The stand-in for the “Mind” in the movie “A Beautiful Mind” was RulePoint.

RulePoint will define a new category for the Turing award and will name it the 2Turing Award.  As a bonus, the 2Turing Award will then be modestly won by RulePoint and the whole category will be retired shortly thereafter.  RulePoint is… tada… the most interesting software in the world.

But I didn’t get to write any of these true facts and product differentiators on the form. No room.

Hopefully I can still get a primo slot to talk about RulePoint.

 

And so from all of the RulePoint and Emerging Technologies team, including sales and marketing, here’s hoping you have a great holiday season and a Happy New Year!

 


Sensational Find – $200 Million Hidden in a Teenager’s Bedroom!

That tag line got your attention – did it not? Last week I talked about how companies are trying to squeeze more value out of their asset data (e.g. equipment of any kind) and the systems that house it. I also highlighted the fact that IT departments in many companies with physical-asset-heavy business models have tried (and often failed) to create a consistent view of asset data in a new ERP or data warehouse application. These environments are neither equipped to deal with all life-cycle aspects of asset information, nor do they fix the root of the data problem in the sources, i.e. where the stuff is and what it looks like. It is like a teenager whose parents have spent thousands of dollars buying him the latest garments, but who always wears the same three outfits because he cannot find the others in the pile he hoards under his bed. And now they have bought him a smartphone to fix it. So before you buy him the next black designer shirt, maybe it would be good to find out how many of those designer shirts he already has, what state they are in and where they are.

Finding the asset in your teenager’s mess

Recently, I had the chance to work on a similar problem with a large overseas oil & gas company and a North American utility. Both are by definition asset-heavy, very conservative in their business practices, highly regulated, very dependent on outside market forces such as the oil price, and geographically very dispersed – and thus, by default, a classic system-integration spaghetti dish.

My challenge was to find out where the biggest opportunities were in terms of harnessing data for financial benefit.

The initial sense in oil & gas was that most of the financial opportunity hidden in asset data was in G&G (geophysical & geological) and the least was on the retail side (lubricants and gas for sale at operated gas stations). On the utility side, the go-to area for opportunity appeared to be maintenance operations. Let’s say that I was about right with these assertions, but that there were a lot more skeletons in the closet with diamond rings on their fingers than I anticipated.

After talking extensively with a number of department heads at the oil company – starting with the IT folks running half of the 400 G&G applications, the ERP instances (it turns out there were 5, not 1) and the data warehouses (3) – I queried the people in charge of lubricant and crude plant operations, hydrocarbon trading, finance (tax, insurance, treasury) as well as supply chain, production management, land management and HSE (health, safety, environmental).

The net-net was that the production management people said there was no issue, as they had already cleaned up the ERP instance around customer and asset (well) information. The supply chain folks indicated that they had used another vendor’s MDM application to clean up their vendor data – which, funnily enough, was not put back into the procurement system responsible for ordering parts. The data warehouse/BI team was confident that they cleaned up any information for supply chain, production and finance reports before dimension and fact tables were populated for any data marts.

All of this was pretty much a series of denial sessions on the 12-step road to recovery, as the IT folks had very little interaction with the business and therefore little sense of how relevant, correct, timely and useful these actions were for the end consumer of the information. They also had to run and adjust fixes every month or quarter as source systems changed, new legislation dictated adjustments and new executive guidelines were announced.

While every department tried to run semi-automated, monthly clean-up jobs with scripts and some off-the-shelf software to fix their particular situation, the corporate (holding) company and any downstream consumers had no consistent basis for sensible decisions on where and how to invest without throwing another legion of bodies (by now over 100 FTEs in total) at the same problem.

So at every stage of the data flow from sources to the ERP to the operational BI and lastly the finance BI environment, people repeated the same tasks: profile, understand, move, aggregate, enrich, format and load.
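As a small illustration of the first of those repeated tasks, here is roughly what a profiling pass looks like when each team scripts it for, say, a well master extract (column names and values invented):

```python
import pandas as pd

# Hypothetical well master extract
wells = pd.DataFrame({
    "well_id":   ["W-001", "W-002", "W-002", "W-003"],
    "well_name": ["Eagle 1", "Eagle 2", "Eagle-2", None],
    "latitude":  [29.7, None, 30.1, 28.9],
})

# Basic profile: completeness and cardinality per column
profile = pd.DataFrame({
    "null_rate": wells.isna().mean(),
    "distinct_values": wells.nunique(),
})
duplicate_ids = wells["well_id"].duplicated().sum()

print(profile)
print("duplicate well_id rows:", duplicate_ids)
```

Multiply that by every department, every source and every month, and the 100-plus FTEs mentioned above stop being surprising.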

Despite the departmental clean-up efforts, areas like production operations did not know with certainty how many wellheads and bores they had, where they were downhole, and who last changed a characteristic as mundane as the well name, and why (governance, location match).

Marketing (Trading) was surprisingly open about their issues. They could not process incoming, anchored crude shipments into inventory, nor assess who owned the counterparty they sold to and what payment terms were appropriate given the associated credit or concentration risk (reference data, hierarchy management). As a consequence, operating cash accuracy was low despite ongoing process improvements, which incurred opportunity cost.

Operational assets like rig equipment carried excess insurance coverage (location, operational data linkage), and fines paid to local governments for incorrectly filing or not renewing work visas were not refunded for up to two years, incurring opportunity cost (employee reference data).

A big chunk of savings was locked up in unplanned NPT (non-production time) because inconsistent, incorrect well data triggered incorrect maintenance intervals. Similarly, OEM-specific DCS (drill control system) component software lacked a central reference data store, so alerts were not triggered before components failed. Add on top of that the missing linkage between data served by thousands of sensors via well logs and PI historians, and its ever-changing roll-up for operations and finance, and the chaos is complete.

One approach we employed around NPT improvements was to take the revenue-from-production figure from their 10-K and combine it with the industry benchmark for NPT days per 100 days of production (typically about 30% across average depths, onshore and offshore). Then you overlay a benchmark (if they do not know their own number) for how many of those NPT days were due to bad data rather than equipment failure or the like. Even if you assume you can fix only a portion of that, you quickly get to big numbers.
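With purely illustrative figures – none of them the client’s actual numbers – that back-of-the-envelope calculation looks like this:

```python
# All figures below are invented for illustration only
production_revenue = 8_000_000_000   # annual revenue from production, per the 10-K
npt_rate = 0.30                      # ~30 NPT days per 100 production days (benchmark)
npt_due_to_bad_data = 0.20           # assumed share of NPT caused by bad data, not equipment
portion_we_can_fix = 0.25            # assume only a quarter of that is actually fixable

recoverable = (production_revenue * npt_rate
               * npt_due_to_bad_data * portion_we_can_fix)
print(f"Indicative annual upside: ${recoverable:,.0f}")   # -> $120,000,000
```

Change any one of the assumed percentages and the result moves a lot, which is exactly why it pays to benchmark them rather than guess.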

When I sat back and looked at all the potential, it came to more than $200 million in savings over five years – and this before any sensor data from rig equipment, such as the myriad of siloed applications running within a drill control system, is integrated and leveraged via a Hadoop cluster to influence operational decisions like drill string configuration or azimuth.

Next time I’ll share some insight into the results of my most recent utility engagement, but I would love to hear about your experience in these two or other similar industries.

Disclaimer:
Recommendations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer depends upon a variety of factors, many of which are not under Informatica’s control. Nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized, and no warranty or representation of success, either express or implied, is made.

Inside Dreamforce ’13: Why sales may be building 360-degree customer views outside Salesforce

I had a disturbing conversation at Dreamforce. Long story short, thousands of highly skilled and highly paid financial advisors (read sales reps) at a large financial services company are spending most of their day pulling together information about their clients in a spreadsheet, leaving only a few hours to engage with clients and generate revenue.

Not all valuable customer information is in Salesforce


Are you in sales? Your time is too valuable to squander on jobs technology can do. Stop building 360-degree customer views in spreadsheets. Get it delivered in Salesforce.

Why? They don’t have a 360-degree customer view within Salesforce.

Why not? Not all client information that’s valuable to the financial advisors is in Salesforce.  Important client information is in other applications too, such as:

  • Marketing automation application
  • Customer support application
  • Account management applications
  • Finance applications
  • Business intelligence applications

Are you in sales? Do you work for a company that has multiple products or lines of business? Then you can probably relate. In my 15 years of experience working with sales, I’ve found this to be a harsh reality. You have to manually pull together customer information, which is a time-consuming process that doesn’t boost job satisfaction.

Stop building 360-degree customer views in spreadsheets 

So what can you do about it? Stop building 360-degree customer views in spreadsheets. There is a better way and your sales operations leader can help.

One of my favorite customer success stories is about one of the world’s leading wealth management companies, with 16,000 financial advisors globally.  Like most companies, their goal is to increase revenue by understanding their customers’ needs and making relevant cross-sell and up-sell offers.

But, the financial advisors needed an up-to-date view of the “total customer relationship” with the bank before they talked to their high net-worth clients. They wanted to appear knowledgeable and offer a product the client might actually want.

Can you guess what was holding them back? The bank operated in an account-centric world. Each line of business had its own account management application. To get a 360-degree customer view, the financial advisors spent 70% of their time pulling important client information from different applications into spreadsheets. Sound familiar?

Once the head of sales realized this, he decided to invest in information management technology that provides clean, consistent and connected customer information and delivers a 360-degree customer view within Salesforce.
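Conceptually, what that technology does on the advisors’ behalf is the consolidation they had been doing by hand in spreadsheets: match records for the same client across line-of-business systems and merge them into a single view. The sketch below is deliberately naive – invented systems, invented fields, and an exact-name match that real MDM matching goes far beyond – but it shows the shape of the problem:

```python
# Records for the same client scattered across line-of-business systems
crm       = {"client": "Jane Doe", "email": "jane@example.com", "advisor": "R. Smith"}
banking   = {"client": "Jane Doe", "checking_balance": 25_000}
brokerage = {"client": "Jane Doe", "portfolio_value": 480_000}
mortgage  = {"client": "Jane Doe", "outstanding_loan": 310_000}

def build_360_view(*sources):
    """Merge source records into one client view (toy exact-match version)."""
    view = {}
    for source in sources:
        view.update(source)
    return view

print(build_360_view(crm, banking, brokerage, mortgage))
```

The real work – cleansing, survivorship, fuzzy matching, keeping the view current – is what the information management layer automates so the advisor never has to open a spreadsheet.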

The result? They’ve seen a $50 million impact annually and a 30% increase in productivity. In fact, word spread to other banks, and the 360-degree customer view in Salesforce became an incentive to attract top talent in the industry.

Ask sales operations to give you 360-degree customer views within Salesforce

I urge you to take action. In particular, talk to your sales operations leader if he or she is at all interested in improving performance and productivity, acquiring and retaining top sales talent, and cutting costs.

Want to see how you can get 360-degree customer views in Salesforce? Check out this demo: Enrich Customer Data in Your CRM Application with MDM. Then schedule a meeting with your sales operations leader.

Have a similar experience to share? Please share it in the comments below.


Squeezing the Value out of the Old Annoying Orange

I believe most people in the software business would agree that it is tough enough to calculate, and hence financially justify, the purchase or build of an application – especially middleware – to a business leader or even a CIO. Most business-centric IT initiatives involve improving processes (order, billing, service) and visualization (scorecarding, trending) so that end users can engage accounts more efficiently. Some of these have migrated to targeting improvements toward customers rather than their logical placeholders, like accounts. Similar strides have been made in the realm of other party types (vendor, employee) as well as product data. These initiatives also tackle analyzing larger or smaller data sets and providing a visual set of clues on how to interpret historical or predictive trends in orders, bills, usage, clicks, conversions, etc.

Squeeze that Orange

If you think this is a tough enough proposition in itself, imagine the challenge of quantifying the financial benefit derived from understanding where your “hardware” is physically located, how it is configured, and who maintained it, when and how. Depending on the business model, you may even have to figure out who built it or owns it. All of this has bottom-line effects on how, by whom and when expenses are paid and revenues are realized and recognized. And then there is the added complication that these dimensions of hardware are often fairly dynamic: assets can change ownership and/or physical location and hence tax treatment, insurance risk, etc.

Such hardware could be a pump, a valve, a compressor, a substation, a cell tower, a truck, or components within these assets. Over time, with new technologies and acquisitions, the systems that plan for, install and maintain these assets become very departmentalized in scope and specialized in function. The application that designs an asset for department A or region B is not the one accounting for its value, which is not the one reading its operational status, which is not the one scheduling maintenance, which is not the one billing for any repairs or replacement. The same folks who said the Data Warehouse was the “Golden Copy” now say the “new ERP system” is the central source for everything. Practitioners know that this is either naiveté or maliciousness. And then there are manual adjustments…
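One way to picture why this is hard: even the basic dimensions of a single asset are effective-dated, and every system above holds its own partial copy of them. Here is a minimal sketch of an asset master record with ownership and location history – the field names and values are invented, not any product’s data model:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class EffectiveDatedValue:
    value: str
    valid_from: date
    valid_to: Optional[date] = None    # None = still current

@dataclass
class AssetMaster:
    asset_id: str
    asset_type: str
    locations: List[EffectiveDatedValue] = field(default_factory=list)
    owners: List[EffectiveDatedValue] = field(default_factory=list)

    @staticmethod
    def current(history: List[EffectiveDatedValue]) -> str:
        return next(v.value for v in history if v.valid_to is None)

pump = AssetMaster(
    asset_id="PUMP-0042",
    asset_type="centrifugal pump",
    locations=[EffectiveDatedValue("Plant A / Unit 3", date(2008, 5, 1), date(2012, 9, 30)),
               EffectiveDatedValue("Plant B / Unit 1", date(2012, 10, 1))],
    owners=[EffectiveDatedValue("Region North LLC", date(2008, 5, 1), date(2011, 1, 1)),
            EffectiveDatedValue("Holding Co.", date(2011, 1, 2))],
)

# Tax treatment and insurance risk hang off whichever values are current
print(pump.current(pump.locations), "|", pump.current(pump.owners))
```

Now imagine the design, accounting, operations, maintenance and billing systems each holding a slightly different version of those histories, and the reconciliation problem becomes obvious.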

Moreover, to truly squeeze value out of these assets as they are installed and upgraded, the massive amounts of data they generate in a myriad of formats and intervals need to be understood, moved, formatted, fixed, interpreted at the right time and stored for future use in a cost-sensitive, easy-to-access and contextually meaningful way.

I wish I could tell you that one application does it all, but the unsurprising reality is that it takes a concoction of several. Few, if any, of the legacy applications supporting the asset life cycle will be retired, as they often house data in formats commensurate with the age of the assets they were built for. It makes little financial sense to shut these systems down in a big-bang approach; it is better to migrate to the new system region after region and process after process. After all, some of the assets have been in service for 50 or more years, and the institutional knowledge tied to them is nearly as old. It is also probably easier to make the often-required manual data fixes (hopefully only outliers) bit by bit, especially to accommodate imminent audits.

So what do you do in the meantime, before all the relevant data lives in a single system, to get an enterprise-level fix for your asset tower of Babel and to leverage the data volume rather than treat it like an unwanted stepchild? Most companies that operate asset-heavy, fixed-cost business models do not want a disruption but a steady tuning effect (squeezing the data orange) – something rather unsexy in this internet day and age. This is especially true in “older” industries where data is still considered a necessary evil, not an opportunity ready to be exploited. The fact is, though, that in order to improve the bottom line we had better get going, even if it is with baby steps.

If you are aware of business models that struggle to leverage data, write to me. If you know of an annoying, peculiar or esoteric data “domain” that does not lend itself to being easily leveraged, share your thoughts. Next time, I will share some examples of how certain industries try to work in this environment, what they envision and how they go about getting there.


Integration Gives You Muscle. Data Integration with Event Processing Gives You 6-Pack Abs.


Everyone knows that Informatica is the Data Integration company that helps organizations connect their disparate software into a cohesive and synchronous enterprise information system.  The value to business is enormous and well documented in the form of use cases, ROI studies and loyalty / renewal rates that are industry-leading.

Event Processing, on the other hand, is a technology that has been around for only a few years and has yet to reach Main Street in Systems City, IT. But if you look at how event processing is being used, it’s amazing that more people haven’t heard about it. The idea at its core (pun intended) is very simple: monitor your data and events – the things that happen on a daily, hourly, even minute-by-minute basis – look for important patterns that are positive or negative indicators, and set up your systems to automatically take action when those patterns appear. For example, notify a sales rep when a pattern indicates a customer is ready to buy, or stop a transaction because your company is about to be defrauded.
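In code terms, the core idea is nothing more exotic than watching a stream and reacting to a pattern as it forms, rather than querying a warehouse after the fact. A toy illustration with invented events and a made-up rule (not any particular engine’s syntax):

```python
from datetime import datetime, timedelta

# A toy, time-ordered event stream
stream = [
    {"ts": datetime(2014, 2, 3, 9, 0),  "account": "A-7", "type": "login_new_country"},
    {"ts": datetime(2014, 2, 3, 9, 2),  "account": "A-7", "type": "transfer", "amount": 50_000},
    {"ts": datetime(2014, 2, 3, 9, 15), "account": "B-2", "type": "transfer", "amount": 200},
]

def monitor(events, window=timedelta(minutes=5)):
    """Flag a large transfer that follows a login from a new country within the window."""
    last_risky_login = {}
    for e in events:
        if e["type"] == "login_new_country":
            last_risky_login[e["account"]] = e["ts"]
        elif e["type"] == "transfer" and e.get("amount", 0) > 10_000:
            seen = last_risky_login.get(e["account"])
            if seen and e["ts"] - seen <= window:
                yield f"STOP transfer on {e['account']} - possible fraud"

for alert in monitor(stream):
    print(alert)
```

The point is the timing: the alert fires while the transfer can still be stopped, which is precisely where after-the-fact reporting falls short.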

Since this is an Informatica blog, you probably have a decent set of “muscles” in place already – so why, you ask, would you need 6-pack abs? Because 6-pack abs are a good indication of a strong muscular core and are the basis of a stable and highly athletic body. The same parallel holds for companies: in today’s competitive business environment, you need strength, stability and agility to compete. And since IT systems increasingly ARE the business, if your company isn’t performing as strong, lean and mean as possible, you can be sure your competitors will be looking to implement every advantage they can.

You may also be thinking: why would you need something like Event Processing when you already have good Business Intelligence systems in place? The reality is that it’s not easy to monitor and measure useful but sometimes hidden data, event, sensor and social media sources, or to discern which patterns have meaning and which turn out to be false alarms. But the real difference is that BI usually reports to you after the fact, when the value of acting on the situation has diminished significantly.

So while muscles are important for standing up and running, and good-quality, strong muscles are necessary for heavy lifting, it’s those 6-pack abs on top of it all that turn you into a lean, mean fighting machine – able to identify significant threats and opportunities in your data and, in essence, to better compete and win.


Application Leaders: Get Advice on How to Put Your Potential to Work

We are excited to announce the new Potential at Work Community for Application Leaders.

As an application leader, you have a very demanding job.  You have to successfully manage issues such as:

  • Driving the maximum business value from your company’s enterprise application investments
  • Keeping all of your enterprise applications current and meeting user requirements
  • Delivering on your service agreements and managing all of the “ilities.”
  • Defining an enterprise application strategy that includes on-premise and cloud
  • Delivering timely, authoritative and trustworthy data for your enterprise applications

This community is here to help you do exactly that and to help you excel both in your current job and in your career ahead. Our goal is to provide tips, insights, best practices and information from experts to help you become more successful.

Our first edition is focused on the theme of managing an enterprise cloud application strategy. For those who are in the process of selecting cloud application vendors, I’ve included a very handy Vendor Selection Checklist used by Informatica’s Vice President of Applications.

Are we interested in your input to the community?  Absolutely!  If you have an idea or content to share with the community, please contact us and we will get you published.

Join the community and start unleashing your potential by clicking on this link:

Potential at Work Community for Application Leaders

Best regards,

Roger Nolan – rnolan@informatica.com

Julie Lockner – jlockner@informatica.com

Click here for more information on the Potential at Work communities.


Architects: Your Greatest Impact Is On Integration Systems

Do you have a buy-versus-build architectural principle? If so, why do you have so many developers (or armies of consultant programmers) in your IT organization? While some of them may be developing web apps or functional extensions to purchased applications, for the most part they are doing integration development. They are probably building fine-grained web services to access data resources, or coarse-grained services that provide some useful business function, or process orchestration solutions to automate long-running business processes, or data migration in support of application modernization, or data synchronization in support of a data warehouse or master data management solution. The list goes on. In short, you are an integration shop.
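To put a face on “fine-grained web services to access data resources,” here is the sort of thing those developer hours routinely go into, reduced to a toy built on Python’s standard library – the endpoint, data and fields are invented for illustration:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a real data resource
CUSTOMERS = {"1001": {"name": "Acme Corp", "status": "active"}}

class CustomerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect paths like /customers/1001
        parts = self.path.strip("/").split("/")
        record = (CUSTOMERS.get(parts[1])
                  if len(parts) == 2 and parts[0] == "customers" else None)
        body = json.dumps(record or {"error": "not found"}).encode()
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CustomerHandler).serve_forever()
```

Multiply this by every data resource, every business function and every orchestration flow, and the case for treating integration as a first-class discipline – rather than an endless stream of one-off builds – makes itself.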
