Stephan Zoder

Stephan Zoder has been the creative engine behind new go-to-market applications of old and new technologies for over a decade. He has held a variety of regional and global leadership positions in technical sales, professional services, business development and product strategy at a number of industry-leading software vendors in the MDM, CRM, SCM, BI and MRO space. He has worked with Fortune 500 and mid-sized companies alike to help IT and business executives deliver measurable value on a technology solution's promise. Stephan also brings a wealth of industry knowledge to his role, including energy, telecommunications, healthcare, industrial manufacturing, retail, national and state government, aerospace and automotive. Prior to Informatica, Stephan was responsible for managing IBM's MDM and industry data warehouse model portfolio strategy for a variety of sectors. In that capacity his expertise contributed to Sunil Soares' book "Selling Information Governance to the Business", IBM's masteringdatamanagement.com blog and an AMCIS 2011 paper on "NextGen Analytical MDM". He now leads Informatica's effort to assess, quantify, develop and deliver data management-based solutions to clients by serving as a trusted counsel and advocate to executives. Stephan holds a master's degree in economic policy from George Washington University. Aside from his wife and four children, he is an avid Kendoka and skier.

The Ones Not Screwing Up with Compliance Will Win

A few weeks ago, a regional US bank asked me to perform compliance and use case analysis around fixing its data management situation.  This bank prides itself on customer service and an SMB focus while offering large-bank products.  However, it was about a decade behind most of its peers in modernizing its IT infrastructure to stay on top of things operationally.

Bank Efficiency Ratio per AUM (Assets under Management), bankregdata.com

This included technologies like ESB, BPM, CRM, etc.  It was also a sub-optimal user of EDW and analytics capabilities. Having said all this, there was a commitment to change things up, which is always the needed first step of any recovery program.

THE STAKEHOLDERS

As I conducted my interviews across various departments (list below), it became very apparent that they were not suffering from data poverty (see prior post) but from a lack of accessibility and use of the data they had.

  • Compliance
  • Vendor Management & Risk
  • Commercial and Consumer Depository products
  • Credit Risk
  • HR & Compensation
  • Retail
  • Private Banking
  • Finance
  • Customer Solutions

FRESH BREEZE

This lack of use occurred across the board.  The natural reaction was to throw more bodies and more Band-Aid marts at the problem.  Users also started to operate under the assumption that it would never get better; they just resigned themselves to mediocrity.  Then some new players came into the organization from various systemically critical banks and shook things up.

Here is a list of use cases they want to tackle:

  • The proposition of real-time offers based on customer events, e.g. suggesting investment banking products when an unusually high inflow of cash hits a deposit account.
  • The use of all mortgage application information to understand debt/equity ratio to make relevant offers.
  • The capture of true product and customer profitability across all lines of commercial and consumer products including trust, treasury management, deposits, private banking, loans, etc.
  • The agile evaluation, creation, testing and deployment of new terms on existing products and products under development, shortening the product development life cycle.
  • The reduction of wealth management advisors’ time to research clients and prospects.
  • The reduction of unclaimed use tax, insurance premiums and leases paid on consumables, real estate and requisitions due to incorrect equipment status and location, originating from assets that were no longer owned, had been scrapped or had been moved to a different department, etc.
  • The more efficient reconciliation between transactional systems and finance, where finance often uses multiple party IDs per contract change in accounts receivable while the operating division uses one ID based on a contract and its addendums.  An example would be vendor payment consolidation to create a true supplier spend and thus take advantage of volume discounts (see the sketch after this list).
  • The proactive creation of a central compliance footprint (AML, 314, Suspicious Activity Reports, CTRs, etc.), allowing for quicker turnaround and fewer audit instances from MRAs (matters requiring attention).
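
To make the reconciliation use case more concrete, here is a minimal sketch – all party IDs and amounts are hypothetical – of consolidating vendor payments under master party IDs to arrive at a true supplier spend, the prerequisite for negotiating volume discounts:

```python
from collections import defaultdict

# Hypothetical mapping of the many finance-side party IDs to one master party.
MASTER_PARTY = {"AR-1001": "ACME", "AR-1002": "ACME", "AR-2001": "GLOBEX"}

# Hypothetical accounts receivable transactions keyed by finance party ID.
payments = [
    ("AR-1001", 120_000.00),
    ("AR-1002", 80_000.00),
    ("AR-2001", 45_000.00),
]

def supplier_spend(payments):
    """Roll individual payments up to one spend figure per master party."""
    spend = defaultdict(float)
    for party_id, amount in payments:
        spend[MASTER_PARTY.get(party_id, party_id)] += amount
    return dict(spend)

print(supplier_spend(payments))  # {'ACME': 200000.0, 'GLOBEX': 45000.0}
# One $200k relationship with ACME may qualify for a volume discount that
# two seemingly separate $120k and $80k relationships would miss.
```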

MONEY TO BE MADE – PEOPLE TO SEE

Adding these up came to roughly $31 to $49 million annually in cost savings, new revenue or increased productivity for this bank, which holds $24 billion in total assets.

So now that we know there is money to be made by fixing the data of this organization, how can we realistically roll this out in an organization with many competing IT needs?

The best way to go about this is to attach any kind of data management project to a larger, business-oriented project, like CRM or EDW.  Rather than wait for these to go live without good seed data, why not feed them with better data as a key work stream within their respective project plans?

To summarize my findings, I want to quote three people I interviewed.  A lady who recently struggled through an OCC audit told me she believes that the banks that can remain compliant at the lowest cost will ultimately win the end game; she meant tier 2 and tier 3 organizations in particular.  A gentleman from commercial banking left this statement with me: "Knowing what I know now, I would not bank with us."  The lady from earlier also said, "We engage in spreadsheet Kung Fu" to bring data together.

Given all this, what would you suggest?  Have you worked with an organization like this? Did you encounter any similar or different use cases in financial services institutions?


Telecommunications and Data: What If Your Fiancée Flunked Finance?

About 15 or so years ago, some friends of mine called me to share great news.  Their dating relationship had become serious and they were headed toward marriage.  After a romantic proposal and a beautiful ring, it was time to plan the wedding and invite the guests.

Lack of a Steady Income Stream is Not Romantic

This exciting time was complicated by a significant challenge. Though they were very much in love, one of them had an incredibly tough time making wise financial choices. During the wedding planning process, the financially astute fiancée grew concerned about the problems the challenged partner could bring. Even though the financially illiterate fiancée had every other admirable quality, the finance issue nearly created enough doubt to end the engagement.  Fortunately, my friends moved forward with the ceremony, were married and immediately went to work on learning healthy financial habits as a couple.

Is financial folly a relationship red flag?

Let’s segue into how this relates to telecommunications and data, specifically to your average communications operator. Just like a concerned fiancée, you’d think twice about making a commitment to an organization that didn’t have a strong foundation.

Like the financially challenged fiancée, the average operator has a number of excellent qualities: a functioning business model, great branding, international roaming, creative ads, long-term prospects, smart people at the helm and all the data and IT assets you can imagine.  Unfortunately, despite the externally visible bells and whistles, over time they tend to lose operational soundness around the basics. Specifically, their lack of data quality causes them to forfeit an ever-increasing amount of billing revenue. Their poor data costs them millions each year.

A recent set of engagements highlighted this phenomenon. A small carrier (3-6 million subscribers) that implements a more consistent, unique way to manage core subscriber profile and product data could recover underbilling of $6.9 million annually. A larger carrier (10-20 million subscribers) could recover $28.1 million every year by fixing billing errors. (This doesn't even cover the large Indian and Chinese carriers with over 100 million customers!)

Typically, a billing error starts with the incorrect setup of a service line item's base price and its 30+ related discount line variances.  Next, the wrong service discount item is applied at contract start.  If that does not happen (or on top of it), an error will creep in when the customer calls during, or right before the end of, the first contract period (12-24 months) to complain about service quality, bill shock, etc.  Here, the call center rep will break an existing triple play bundle by deleting an item and setting up a separate non-bundle service line item at a lower price (higher discount).  The head of billing actually told us, "Our reps just give a residential subscriber a discount of $2 for calling us."  It's even higher for commercial clients.

To make matters worse, this change will trigger misaligned (incorrect) activation dates or even bill duplication, all of which will have to be fixed later by multiple staff on the BSS and OSS side, or may even trigger an investigation by the revenue assurance department.  Worst case, the deletion of the item from the bundle (especially for B2B clients) will not terminate the wholesale cost the carrier still owes a national carrier for the broadband line, which is often one third of the retail price for a business customer.
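
To illustrate how such leakage could be caught, here is a minimal sketch – the contract and billing records are entirely made up – that flags subscribers whose invoiced total falls short of their contracted bundle price:

```python
# Hypothetical records: what the contract says vs. what billing produced.
contracted = {"S1": 99.00, "S2": 99.00}   # subscriber -> agreed bundle price
billed     = {"S1": 99.00, "S2": 79.00}   # S2's bundle was broken by a rep

def underbilling(contracted, billed, tolerance=0.01):
    """Return subscribers whose invoiced total falls short of the contract."""
    return {
        sub: round(price - billed.get(sub, 0.0), 2)
        for sub, price in contracted.items()
        if price - billed.get(sub, 0.0) > tolerance
    }

print(underbilling(contracted, billed))  # {'S2': 20.0}
# $20/month of leakage, multiplied by 12 months and thousands of affected
# subscribers, is how recovery figures like the ones above add up.
```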

To come full circle on my initial "accounting-challenged" example: would you marry (invest in) this organization?  Do you think this can or should be solved in a big-bang approach or incrementally?  Where would you start: product management, the service center, residential or commercial customers?

Observations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


Who Has the Heart to Adopt this Orphan Oil Well?

As I browsed my BBC app a few weeks ago, I ran into an article about environmental contamination from oil wells in the UK that were left to their own devices. The article explains that a lack of data and proper data management is causing major issues for gas and oil companies. In fact, researchers found no data for more than 2,000 inactive wells, many of which have been abandoned or "orphaned" (sealed and covered up). I started to scratch my head imagining what this problem looks like in places like Brazil, Nigeria, Malaysia, Angola and the Middle East, where regulatory oversight is, on average, a bit less rigorous.

Like Oliver, this well needs a home!

On top of that, please excuse my cynicism here, but an "orphan" well is just as ridiculous a concept as a "dry" well.  A hole without liquid inside is not a well but – you guessed it – a hole.  Also, every well has a "parent", meaning:

  • The person or company who drilled it
  • A landowner who gets paid from its production and who allowed the operation (otherwise it would be illegal)
  • A financier who fronted the equipment and research cost
  • A regulator, who is charged with overseeing the reservoir’s exploration

Let the "hydrocarbon family court judge" decide whose problem this orphan is, based on well-founded information – no pun intended.  After all, this "domestic disturbance" is typically just as well documented as any police "house call" when you hear screams from next door. Similarly, one would expect that when (exploratory) wells are abandoned and improperly capped or completed, there is a long track record of financial or operational troubles at the involved parties.  Apparently I was wrong.  Nobody seems to have a record of where a well actually was on the surface, let alone subsurface, to determine perforation risks in itself or from an actively managed bore nearby.

This reminds me of a meeting with an Asian NOC's PMU IT staff, who vigorously disagreed with every other department about the reality on the ground versus at group level. The PMU folks insisted they had all wells' key attributes nailed down:

  1. Knowing how many wells and bores they had across the globe and all types of commercial models including joint ventures
  2. Where they were and are today
  3. What their technical characteristics were and currently are

The other departments, from finance to strategy, clearly indicated that the 10,000 wells across the globe currently being "mastered" with (at least initially) cheap internal band-aid fixes carry a margin of error of up to 10%.   So much for long-term TCO.  After reading the BBC article, this internal disagreement made even more sense.

If this chasm does not make a case for proper mastering of key operational entities, like wells, I don't know what does. It also raises the question of how any operation with potentially very negative long-term effects can have no legally culpable party captured in some sort of, dare I say, master register.  Isn't that the sign of the "rule of law" governing an advanced nation, e.g. having a land register, building permits, wills, etc.?

I rest my case, your honor.  May the garden fairies forgive us for spoiling their perfectly manicured lawn.  With more fracking and public scrutiny on the horizon, maybe regulators need to establish their own "trusted" well master file rather than rely on oil firms' data dumps.  After all, the next downhole location may be just a foot away from perforating one of these "orphans", setting your kitchen sink faucet on fire.

Do you think another push for local governments to establish "well registries", like they did ten years ago for national IDs, is in order?

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


Business Beware! Corporate IT Is “Fixing” YOUR Data

It is troublesome to me to repeatedly get into conversations with IT managers who want to fix data "for the sake of fixing it".  While this is presumably increasingly rare, my department's role means we probably see a higher occurrence of it than the average software vendor employee does.  Given that, please excuse the inflammatory title of this post.

Nevertheless, once a deal is done, we find fewer of these instances, yet still enough, as the average implementation consultant or developer cares about this aspect even less.  A few months ago a petrochemical firm's G&G IT team lead told me that he does not believe data quality improvements can or should be measured.  He also said, "If we need another application, we buy it.  End of story."  Good for software vendors, I thought, but in most organizations $1M here or there does not lie around leisurely, and decision makers want to see the – dare I say it – ROI.

This is not what a business – IT relationship should feel like

However, IT and business leaders should take note that misalignment due to a lack, or disregard, of communication puts everything at risk.  If the business does not get what it needs and wants, AND that differs from what corporate IT is envisioning and working on – and this is what I am talking about here – any IT investment becomes a risky proposition.

Let me illustrate this with four recent examples I ran into:

1. Potential for flawed prioritization

A retail customer's IT department apparently knew that fixing and enriching customer loyalty records across the enterprise was a good and financially rewarding idea.  They only wanted to understand what the less risky functional implementation choices were. They indicated that if they wanted to learn the factual financial impact of "fixing" certain records or attributes, they would just look into their enterprise data warehouse.  This is where the logic falls apart, as the warehouse is just as unreliable as the "compromised" applications (POS, marketing, ERP) feeding it.

Even if they massaged the data before the next EDW load, there is nothing inherently real-time about this: all OLTP applications keep running processes on incorrect (no bidirectional linkage) and stale (since the last load) data.

I would question whether the business is really aligned with what IT is continuously correcting. After all, IT may go for the "easy or obvious" fixes via a weekly or monthly recurring data scrub exercise without truly knowing which fix is the "biggest bang for the buck", or which other affected business use cases they may not even be aware of yet.  Imagine the productivity impact of all the round-tripping and reporting delay this creates.  This example also reminds me of a telco client I encountered during my tenure at another tech firm, which fed its customer master from its EDW and had just found out that this pattern is doomed to fail due to data staleness and performance.

2. Fix IT issues and business benefits will trickle down

Client number two is a large North American construction company.  An architect there built a business case for fixing a variety of data buckets in the organization (CRM, brand management, partner onboarding, mobility services, quotation & requisitions, BI & EPM).

Grand vision documents existed and were linked to the case, stating how the data would get better (like a sick patient), but there was no mention of hard facts on how each of the use cases would deliver on this.  After I gave him some major counseling on what to look out for and how to flesh it out – radio silence. Someone got scared of the math, I guess.

3. Now that we bought it, where do we start

The third culprit was a large petrochemical firm, which apparently sat on some excess funds and thought (rightfully so) it was a good idea to fix its well attributes. More power to them.  However, the IT team is now in the dreadful position of having to justify to their boss, and ultimately the E&P division head, why they prioritized this effort so highly and spent the money.  They had their heart in the right place but are a tad late.   Still, I consider this better late than never.

4. A senior moment

The last example comes from a South American communications provider. They seemingly did everything right, given the results they have achieved to date.  This goes to show that misalignment of IT and business does not necessarily wreak havoc – at least initially.

However, they are now in phase 3 of their rollout and reality has caught up with them.  A senior moment or lapse in judgment, maybe? Whatever it was, once they fixed their CRM, network and billing application data, they had to start talking to the business and financial analysts, as complaints and questions started to trickle in. Once again, better late than never.

So what is the takeaway from these stories? Why wait until phase 3? Why be forced to cram in some justification after the purchase?  You pick whichever lesson works best for you to fix this age-old issue.  But please heed Sohaib's words of wisdom recently broadcast on CNN Money: "IT is a mature sector post bubble… now it needs to deliver the goods."  And here is an action item for you – check out the new way for business users to prepare their own data (30 minutes into the video!).  Agreed?


3 Reasons Why Martial Arts Should Be A Job Requirement For A Chief Data Officer

Forget degrees from Harvard or MIT; forget NoSQL, Hadoop or OBIEE.  These are all powerful tools, but they will not win you the face-off.  It starts with who you are (or are not), who or what you are going up against and what has happened in the past. Why should martial arts be a job requirement for Chief Data Officers? I boiled it down to three simple reasons to help you understand.

Does this look like your CDO?

I started practicing Kendo three years ago, and every single practice it surprises me how inadequate I still am, how much I can glean from my opponent to predict future behavior and how unimportant "background noise" really is.  Even if I have a good day, a strike from a teenager, or from a retiree who has been practicing for a decade or more, will remind me that I got only 1% better compared to last month and have a long way to go.  At our last practice, one of my Senseis told me that even the higher ranks get their Ki-Ken-Tai-Ichi (the alignment of spirit, sword and body) right maybe half the time.  It's a life lesson every time.

These three facts are probably also true for many one-on-one sports in which adversaries study each other for more than just a couple of seconds before their next swing or shot.  If you ask me, most western sports are about endurance, strength, mindset and strategy, with a heavy focus on the physical aspects.  Kendo is 90% strategy and mindset; that is why six- and sixty-year-olds alike can excel at it. It is more akin to chess with baseball bats.

You study your opponent from the second he walks up to the chair in the middle of the podium for a gentlemanly exchange of cerebral willpower, but in the end you will smack him relentlessly with a bat.  Everything you do is directly driven by how you feel, what you think, how your opponent moves and what your opponent feels and thinks. The goal is not to react to a hand being raised but to anticipate your opponent's next move based on his most recent actions. By the time your eyes (and, a tenth of a second later, your brain) register the right hand going up to strike at you – you've already lost, as it is too late to react.

You are effectively analyzing core data domains and key attributes, like posture.  Business data requires the same rigor and focus on the essential. There is also a tremendous amount of process (formalities like repeated bowing) and deeper meaning in everything you do; call it “Governance”.

A Chief Data Officer (CDO) needs to mind the same aspects in his or her existence.

  • You are not the professional you think you are (humility)
  • Someone else always knows something you don’t, so every additional bit helps to predict future actions (willingness to learn)
  • How to eliminate all the noise detracting from the ultimate goal (focus)

In reality, the data problem was not solved long ago; something new can still be learned to combat this age-old issue.  The learning piece comes into play when we are willing to listen to people who have fixed, or seen fixed, similar problems in another environment, not necessarily the same industry or department. The third point is that political and technical detractors – procurement processes, M&A, new leadership or transactional volume spikes from additional applications – will continue to pop up.  However, it is on you, the CDO, to uncover, isolate and preach that a process may not always be the root cause of a business issue and that fixing it, as such, needs to be put in perspective.  As I always say, "throwing bad data at a better process" just saves you a step but still renders errors, rework and bad decisions.

So what does this mean in “real” terms:

  • Seek and accept opinions frequently, even if they don't match your issue perfectly. Often a customer is a customer is a customer… admit it. Your business model may not be that special after all.
  • Watch what the others do on a fundamental level, i.e. becoming data-driven organizations. These could be competitors, partners, organizations you (should) admire.
  • Internalize and socialize what the organization's core asset and goal is – the thing that will move the needle the most. Often it will be your intelligence (read: your information or data).

I will leave you with these thoughts and invite you to sit down, cross your legs, close your eyes and get all esoteric on me, young grasshopper – but please envision what your organization should look like and how it should make its money five years from now.   Throwing more resources at new problems, ignoring core data issues and reacting only when things bubble up in greater numbers will likely not cut it.

And here is where I will bow out. Take a moment and think about it; how does your take on life influence your assessment of what you encounter in your workplace?


Becoming a Revenue Driven Business Model through Data is Painful for Government Agencies

Recently, I presented a Business Value Assessment to a client.  The findings were based on a revenue-generating state government agency. Everyone at the presentation was stunned to find out how much money was left on the table by not basing the agency's activities on transactions that could be cleanly tied to the participating citizenry and a variety of channel partners. Over $38 million in annual benefits were left on the table, including partially recovered lost revenue, cost avoidance and cost reduction. A stronger data foundation for this revenue-driven business model could have prevented this.

Should government leaders go to the "School of Data" to understand where more revenue can be created without the need for tax hikes? (Source: creativecommons.org)

Given the agency's total revenue volume, this may seem small. However, after factoring in how little technology effort is required to "collect and connect" data from existing transactions, the return is actually extremely high.

The real challenge for this organization will be the policy transformation required to turn it from "data-starved" into "data-intensive". This would stop strategic decisions around new products, locations and customers from relying on surveys, which suffer from sampling errors, biases, etc. Additionally, surveys are often delayed, making them practically ineffective in the real-time world we live in today.

Despite no applicable legal restrictions, the leadership’s main concern was that gathering more data would erode the public’s trust and positive image of the organization.

To be clear: by "more" data being collected by this type of government agency, I mean literally 10% of what any commercial retail entity has gathered on all of us for decades.  This is not the next NSA revelation, as any conspiracy theorist may fear.

While I respect their culturally driven self-censorship despite the absence of legal barricades, it feeds their stakeholders' (the state's citizenry's) concern over the agency's performance.  To be clear, there would be no additional revenue for the state's programs without more citizen data.  You may believe the state already knows everything about you, including your income, property value, tax information, etc. However, inter-departmental sharing of criminally non-relevant information is legally constrained.

Another interesting finding from this evaluation was that they had no sense of the conversion rate from email and social media campaigns. Impressions and click-throughs, as well as hard/soft bounces, were deemed more important than tracking who actually generated revenue.

This is a very market-driven organization compared to other agencies. It actually does try to measure itself like a commercial enterprise and attempts to change in order to generate additional revenue for state programs benefiting the citizenry. I can only imagine what non-revenue-generating agencies (local, state or federal) do in this respect.  Is revenue-oriented thinking something the DoD, DoJ or Social Security should subscribe to?

Think tanks and political pundits are now looking at the trade-off between bringing democracy to every backyard on our globe and its long-term budget ramifications. The DoD is looking to reduce the active component to its lowest level in decades, given the U.S. federal debt.

Putting the data bits and pieces together for revenue

A recent article in HBR explains that cost cutting has never sustained an organization's growth over a longer period of time, but new revenue sources have. Is your company or government agency only looking at cost and personnel productivity?

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


Data: The Unsung Hero (or Villain) of every Communications Service Provider

The faceless hero of CSPs: Data

Analyzing current business trends helps illustrate how difficult and complex the Communications Service Provider (CSP) business environment has become. CSPs face many challenges: clients expect high-quality, affordable content that can move between devices with minimal advertising or privacy concerns. To illustrate this phenomenon, here are a few recent examples:

  • Apple is working with Comcast/NBC Universal on a new converged offering
  • Vodafone purchased the Spanish cable operator Ono and now has to quickly separate the wireless customers from the cable ones and cross-sell existing products
  • Net neutrality has been scuttled in the US and upheld in the EU, so a US CSP can now give preferential bandwidth to content providers, generating higher margins
  • Microsoft’s Xbox community collects terabytes of data every day making effective use, storage and disposal based on local data retention regulation a challenge
  • Expensive 4G LTE infrastructure investment by operators such as Reliance is bringing streaming content to tens of millions of new consumers

To quickly capitalize on “new” (often old, but unknown) data sources, there has to be a common understanding of:

  • Where the data is
  • What state it is in
  • What it means
  • What volume and attributes are required to accommodate a one-off project vs. a recurring one

When a multitude of departments request data for analytical projects in their one-off, IT-unsanctioned on-premise or cloud applications, how will you go about it? The average European operator has between 400 and 1,500 known applications. Imagine what the unknown count is.

A European operator with 20-30 million subscribers incurs an average of $3 million per month in unpaid invoices. This often results from incorrect or incomplete contact information. Imagine how much you would have to add for lost productivity, including gathering, re-formatting, enriching, checking and re-sending invoices. And this does not even account for late invoice payments or extended incorrect credit terms.

Think about all the wrong long-term conclusions being drawn from this bad data. This single data problem creates indirect costs in excess of three times the initial, direct impact of the unpaid invoices.
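
A first step toward quantifying this is basic profiling of the contact fields that drive the invoice failures. Here is a minimal sketch with entirely hypothetical subscriber records; a real carrier would run this across all of its (known) applications:

```python
# Hypothetical subscriber extracts from two of a carrier's many applications.
subscribers = [
    {"id": 1, "email": "a@example.com", "street": "1 Main St", "zip": "10001"},
    {"id": 2, "email": "",              "street": "5 High St", "zip": ""},
    {"id": 3, "email": None,            "street": None,        "zip": "10002"},
]

def completeness(records, fields):
    """Share of records carrying a usable (non-empty) value per field."""
    total = len(records)
    return {f: sum(1 for r in records if r.get(f)) / total for f in fields}

print(completeness(subscribers, ["email", "street", "zip"]))
# {'email': 0.33..., 'street': 0.66..., 'zip': 0.66...}
# A one-in-three usable email rate makes a $3M/month unpaid-invoice
# figure entirely plausible.
```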

Want to fix your data and overcome the accelerating cost of change? Involve your marketing, CEM, strategy, finance and sales leaders to help them understand data’s impact on the bottom line.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


Is your social media investment hampered by your “data poverty”?

Why is nobody measuring the "bag of money" portion? Source: emediate.com

Recently, I talked with a company that had allocated millions of dollars for paid social media promotion. Their hope was that a massive investment in Twitter and Facebook campaigns would lead to “more eyeballs” for their online gambling sites. Although they had internal social media expertise, they lacked a comprehensive partnership with IT. In addition, they lacked a properly funded policy vision. As a result, when asked how much of their socially-driven traffic resulted in actual sales, their answer was a resounding “No Idea.” I attribute this to “data poverty.”

There is a key reason they were unable to quantify the ROI of their promotion: their business model is, by design, "data poor".  Although a great deal of customer data was available to them, they elected not to use it. They could have used available data to identify "known players" as well as individuals with "playing potential". There was no law prohibiting them from acquiring this data. However, they were uncomfortable obtaining a higher degree of attribution beyond name, address, e-mail and age.  They feared that customers would view them as a commercial counterpart to the NSA. As a result, key data elements like net worth, lifetime value, credit risk, location, marital status, employment status, number of friends/followers and property value were not considered when targeting potential users on social media. So, though the social media team considered this granular targeting a "dream come true", others within the organization considered it too "1984".

In addition to a hesitation to leverage available data, they were also limited by their dependence on a 3rd party IT provider. This lack of self-sufficiency created data quality issues, which limited their productivity. Ultimately, this dependency prevented them from capitalizing on new market opportunities in a timely way.

It should have been possible for them to craft a multi-channel approach. They ought to have been able to serve up promoted Tweets, banner ads and mobile application ads. They should have been able to track the click-through, IP and timestamp information from each one. They should have been able to make a BCR for redeeming a promotional offer at a retail location.

Strategic channel allocation would certainly have triggered additional sales. In fact, when we applied click-through, CAC and conversion benchmarks to their available transactional information, we modeled over $8 million in additional sales and $3 million in customer acquisition cost savings. In addition to the financial benefits, strategic channel allocation would have generated more data (and resulting insights) about their prospects and customers than they had when they began.
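
The shape of such a model is simple. The sketch below uses entirely made-up benchmark inputs, not the client's actual figures, to show how click-through, conversion and CAC assumptions roll up into sales and savings estimates:

```python
# All inputs are hypothetical benchmarks, not actual client figures.
impressions      = 50_000_000  # paid social impressions per year
click_through    = 0.008       # 0.8% CTR benchmark
conversion       = 0.02        # 2% of clicks become paying players
avg_annual_value = 400.00      # average first-year revenue per player
current_cac      = 90.00       # blended customer acquisition cost today
targeted_cac     = 55.00       # CAC with data-driven targeting

customers   = impressions * click_through * conversion
added_sales = customers * avg_annual_value
cac_savings = customers * (current_cac - targeted_cac)

print(f"new customers:    {customers:,.0f}")     # 8,000
print(f"additional sales: ${added_sales:,.0f}")  # $3,200,000 at these inputs
print(f"CAC savings:      ${cac_savings:,.0f}")  # $280,000 at these inputs
```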

But, because they were hesitant to use all the data available to them, they failed to capitalize on their opportunities. Don’t let this happen to you. Make a strategic policy change to become a data-driven company.

Beyond the revenue gains of targeted social marketing, there are other reasons to become a data-driven company. Clean data can help you correctly identify ideal channel partners. This company failed to use sufficient data to properly select and retain its partners.  Hundreds of channel partners were removed without proper, data-driven confirmation; reasons for removal included things like "death of owner", "fire" and "unknown".  To ensure more thorough vetting, the company could have used data points like the owner's age, past business endeavors and legal proceedings. They could also have studied location-fenced attributes like footfall, click-throughs and sales cannibalization risk. In fact, when we modeled the potential overall annual savings, across all business scenarios, of becoming a data-driven company, the amount approached $40 million.

Would a $40 million savings inspire you to invest in your data? Would that amount be enough to motivate you to acquire, standardize, deduplicate, link, hierarchically structure and enrich YOUR data? It's a no-brainer, but it requires a policy shift to make your data work for you. Without one, it's all just "potential".
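
As a flavor of what the deduplication step involves, here is a minimal sketch using only Python's standard library and hypothetical partner records; a production implementation would use a proper matching engine with tuned rules:

```python
from difflib import SequenceMatcher

# Hypothetical channel-partner records from two source systems.
partners = [
    {"id": "P1", "name": "Smith Hardware Inc."},
    {"id": "P2", "name": "SMITH HARDWARE, INC"},
    {"id": "P3", "name": "Jones Bakery"},
]

def standardize(name):
    """Cheap standardization: lowercase and strip punctuation."""
    return "".join(c for c in name.lower() if c.isalnum() or c.isspace()).strip()

def duplicate_pairs(records, threshold=0.9):
    """Flag likely duplicates by fuzzy similarity of standardized names."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            score = SequenceMatcher(
                None, standardize(a["name"]), standardize(b["name"])
            ).ratio()
            if score >= threshold:
                pairs.append((a["id"], b["id"], round(score, 2)))
    return pairs

print(duplicate_pairs(partners))  # [('P1', 'P2', 1.0)]
```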

Do you have stories about companies that recently switched from traditional operations to smarter, data-driven operations? If so, I’d love to hear from you.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer  and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


MDM for Utilities: Data Revives a 100 Year-Old Business Model

In both Europe and North America, there's something profoundly different about today's utility bill. It goes well beyond the amount of money you owe for service that month: today your utility bill is full of DATA. Perhaps it contains baseline analytics on usage versus temperature over a twelve-month span. Perhaps it contains additional warranties for the power lines that run from the street to your property (typically not covered for repairs by the utility).  Recently, even third-party companies have been sending me mail, offering fixed-rate payment plans to smooth out my cash flow between the customary projected (from last year) and actual (read) usage readings. Since smart meters are cropping up everywhere, I was wondering why this "actual vs. plan" practice persists, even in the most urban of environments.

Keep flipping-the-switch profitable. Source: thetyee.ca

In fact, a modern utility company has far more intricate data than a consumer sees on a bill. Behind the scenes, utilities leverage a plethora of data pools. Utility companies now have robust asset management, job order and scheduling systems. In addition, they use advanced analytics that monitor sensor data to predict maintenance needs. Most importantly, utilities run monthly analytics to prepare rate case requests for local regulators. These are then used to lock in new cost-plus structures for local and business billing in the years ahead.

Unfortunately, most of these applications sit in geographical or departmental silos and only connect with each other in batch mode, if at all.  These data silos make utilities susceptible to frequent, costly data clean-up projects. Such projects always surface the utility's shortcomings with regard to data standardization, duplication, linkage and hierarchical structuring. However, until recently, few utilities were willing to invest in ensuring that the newly cleaned data pools remained clean.

Enter MDM for Utilities

Master Data Management is the linchpin for resolving utility data issues. Without a clean, enriched, truthful picture of substation, breaker, valve, pump and line information, how can an operator adequately document the need for a rate hike? MDM can help answer questions like the following (a minimal default-date check is sketched after the lists below):

  • Was that breaker really installed in back in 1900?
  • Or is the year 1900 simply the default date for this data field?
  • Does the substation design mirror what was actually installed?
  • Is the breaker physically located where it is supposed to be?
  • Am I paying maintenance for a breaker that is actually owned by another operator?
  • Why am I sending a crew to inspect equipment that was deemed in-working-order one month earlier?
  • Is the housing development meter really located where the installing contractor claims it was installed?

Without MDM, utilities face all sorts of potential problems:

  1. Maintenance budgets can be either underfunded or overfunded
  2. Job vs bill requests can fail to align with local county delineations
  3. New housing construction can be significantly underbid
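
Here is the promised sketch of a default-date check, run against hypothetical breaker records. It flags any installation date so over-represented that it is more likely a system default than reality:

```python
from collections import Counter
from datetime import date

# Hypothetical breaker records pulled from an asset management system.
breakers = [
    {"id": "BRK-01", "installed": date(1900, 1, 1)},
    {"id": "BRK-02", "installed": date(1900, 1, 1)},
    {"id": "BRK-03", "installed": date(1900, 1, 1)},
    {"id": "BRK-04", "installed": date(1987, 6, 15)},
    {"id": "BRK-05", "installed": date(1992, 3, 2)},
]

def suspect_defaults(records, field="installed", share=0.3):
    """Flag values so over-represented they are likely system defaults."""
    counts = Counter(r[field] for r in records)
    total = len(records)
    return {v: c for v, c in counts.items() if c / total > share}

print(suspect_defaults(breakers))  # {datetime.date(1900, 1, 1): 3}
# Three of five breakers "installed" on 1900-01-01 is a data default,
# not a maintenance plan.
```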

When a utility operator uses the wealth of data they possess to optimize their operations, they inevitably reap financial benefits. The utility company of the future invests in the maintained integrity of its data pool, rather than continually wasting cycles and bodies on quarterly data cleansing. To learn how your company can do the same, please register for our Utility Industry MDM webinar on April 1 at 10 AM PST. In the webinar, Informatica and Noah Consulting will address the use cases and financial value MDM can bring to the utility industry.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


Would YOU Buy a Ford Pinto Just To Get the Fuzzy Dice?

Today, I am going to take a stab at rationalizing why one would even consider solving a problem with a solution that is well known to be sub-par. Consider the Ford Pinto: would you choose this car for your personal, land-based transportation simply because of the new plush dice in the window? For my European readers, replace the Pinto with the infamous Trabant and you get my meaning.  The fact is, both of these vehicles made lists of the "worst cars ever built" due to their mediocre design, environmental hazards or plainly bad personal safety records.

What is a Pinto-like buying decision in information technology procurement? (source: msn autos)

Rational people would never choose a vehicle this way. So I always ask myself, "How can IT organizations rationalize buying product X just because product Y is thrown in for free?" Consider the case in which an organization chooses its CRM or BPM system simply because the vendor throws in an MDM or data quality solution for free: can this be done with a straight face?  You often hear vendors claim that "everything in our house is pre-integrated", "plug & play" or "we have accelerators for this". I would hope that IT procurement officers have come to understand that these phrases don't close a deal even in a cloud-based environment, and even less so in an on-premise construct, which can never achieve this Nirvana unless it is customized to client requirements.

Anyone can see the logic in getting “2 for the price of 1.” However, as IT procurement organizations seek to save a percentage of money every deal, they can’t lose sight of this key fact:

Standing up software (configuring, customizing, maintaining) and operating it over several years requires CLOSE inspection and scrutiny.

Like a Ford Pinto, software cannot just be driven off the lot without a care, leaving you only to worry about changing the oil and filters at recommended intervals. Customization, operational risk and maintenance are a significant cost, as all my seasoned padawans will know. If Pinto buyers had understood the Total Cost of Ownership before they made their purchase, they would have opted for Toyotas instead. Here is the bottom line:

If less than 10% of the overall requirements are solved by the free component
AND (and this is a big AND)
if less than 12% of the overall financial value is provided by the free component,
then it makes ZERO sense to select a solution based on freebie add-ons.
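
Expressed as code, the rule-of-thumb is trivial; the thresholds below are simply the ones stated above:

```python
def zero_sense(requirements_share, value_share):
    """The post's rule-of-thumb: if the free component solves less than 10%
    of the overall requirements AND provides less than 12% of the overall
    financial value, selecting a solution because of it makes zero sense."""
    return requirements_share < 0.10 and value_share < 0.12

# A freebie covering 8% of requirements and 5% of value should not sway the deal.
print(zero_sense(0.08, 0.05))  # True
```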

When an add-on component is of significantly lower-quality than industry leading solutions, it becomes even more illogical to rely on it simply because it’s “free.” If analysts have affirmed that the leading solutions have stronger capabilities, flexibility and scalability, what does an IT department truly “save” by choosing an inferior “free” add-on?

So just why DO procurement officers gravitate toward “free” add-ons, rather than high quality solutions? As a former procurement manager, I remember the motivations perfectly. Procurement teams are often measured by, and rewarded for, the savings they achieve. Because their motivation is near-term savings, long term quality issues are not the primary decision driver. And, if IT fails to successfully communicate the risks, cost drivers and potential failure rates to Procurement, the motivation to save up-front money will win every time.

Both sellers and buyers need to avoid these dances of self-deception, the “Pre-Integration Tango” and the “Freebie Cha-Cha”.  No matter how much you loved driving that Pinto or Trabant off the dealer lot, your opinion changed after you drove it for 50,000 miles.

I’ve been in procurement. I’ve built, sold and implemented “accelerators” and “blueprints.” In my opinion, 2-for-1 is usually a bad idea in software procurement. The best software is designed to make 1+1=3. I would love to hear from you if you agree with my above “10% requirements/12% value” rule-of-thumb.  If not, let me know what your decision logic would be.
