Stephan Zoder

Stephan Zoder has been the creative engine behind new go-to-market applications of old and new technologies for over a decade. He has held a variety of regional and global leadership positions in technical sales, professional services, business development and product strategy at a number of industry-leading software vendors in the MDM, CRM, SCM, BI and MRO space. He has worked with Fortune 500 and mid-sized companies alike to help IT and business executives deliver measurable value on a technology solution’s promise. Stephan also brings a wealth of industry knowledge to his role, including energy, telecommunications, healthcare, industrial manufacturing, retail, national and state government, aerospace and automotive. Prior to Informatica, Stephan was responsible for managing IBM’s MDM and industry data warehouse model portfolio strategy for a variety of sectors. In this capacity his expertise contributed to Sunil Soares’ book “Selling Information Governance to the Business”, IBM’s masteringdatamanagement.com blog and an AMCIS 2011 paper on “NextGen Analytical MDM”. He now leads Informatica’s effort to assess, quantify, develop and deliver data management-based solutions to clients by being a trusted counsel and advocate to executives. Stephan holds a master’s degree in economic policy from George Washington University. Aside from his wife and four children, he is an avid Kendoka and skier.

The King of Benchmarks Rules the Realm of Averages

A mid-sized insurer recently approached our team for help. They wanted to understand why they had fallen short in making their case to their executives. Specifically, they had proposed that fixing their customer data was key to supporting the executive team’s highly aggressive three-year growth plan (3x today’s revenue).  Given this core organizational mission – aside from being a warm and fuzzy place to work supporting its local community – the slam-dunk solution seems simple.  Just reducing the data migration effort around the next acquisition, or avoiding the ritual annual one-off data clean-up project, already pays for any tool set enhancing data acquisition, integration and hygiene.  Will it get you to 3x today’s revenue?  Probably not.  What will help are the following:

Making the Math Work (courtesy of Scott Adams)

Hard cost avoidance via software maintenance or consulting elimination is the easy part of the exercise. That is why CFOs love it and focus so much on it.  It is easy to grasp and immediate (aka next quarter).

Soft cost reductions, like staff redundancies, are a bit harder.  Despite these being viable, in my experience very few decision makers want to work on a business case to lay off staff.  My team has seen only one so far. Most look at these savings as freed-up capacity, which can be re-deployed more productively.   Productivity is also a bit harder to quantify, as you typically have to understand how data travels and gets worked on between departments.

Revenue effects are harder still, and esoteric to many people, because they involve projections.  They are often considered “soft” benefits, although they outweigh the other areas by two to three times in terms of impact.  Ultimately, every organization runs its strategy on projections (see the insurer in my first paragraph).

The hardest area to quantify is risk. Not only is it based on projections – often from a third party (Moody’s, TransUnion, etc.) – but few people understand it. Often, clients won’t even let you investigate this area unless you have an advanced degree in insurance math. Nevertheless, risk can generate extra “soft” cost avoidance (a beefed-up reserve account balance creates opportunity cost) but also revenue (realizing a risk premium previously ignored).  Risk profiles often change due to relationships, which can be links to new “horizontal” information (transactional attributes) or “vertical” (hierarchical) information from an entity’s parent-child relationships and the parent’s or children’s transactions.

Given the above, my initial advice to the insurer would be to look at the heartache of their last acquisition and apply a benchmark for IT productivity gains from improved data management capabilities (typically 20-26%, per Yankee Group).  This covers just the IT side, so consider increasing the upper range by 1.4x (Harvard Business School), as every attribute change (e.g., last mobile view date) requires additional meetings at the manager, director and VP level, and these people’s time gets increasingly expensive.  Alternatively, use Aberdeen’s benchmark of 13 hours per average master data attribute fix.
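As a rough sketch of this back-of-envelope math (the baseline migration cost is an illustrative placeholder, not a client figure; the benchmark percentages are the ones cited above):

```python
# Rough sizing of data-management savings using the benchmarks cited above.
# The baseline cost is a hypothetical input, not real client data.

def migration_savings(baseline_cost, it_gain=0.23, mgmt_multiplier=1.4):
    """Estimate savings on the next migration/clean-up effort.

    baseline_cost   -- cost of the last acquisition's migration effort
    it_gain         -- IT productivity gain from better data management
                       (Yankee Group benchmark: 20-26%; midpoint used here)
    mgmt_multiplier -- uplift for manager/director/VP coordination time
                       (the Harvard Business School figure cited above)
    """
    it_savings = baseline_cost * it_gain
    return it_savings * mgmt_multiplier

# Example: a hypothetical $2M migration effort
print(round(migration_savings(2_000_000)))  # -> 644000
```

Swap in your own baseline and the low/high ends of the benchmark range to get a defensible savings bracket rather than a single point estimate.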

You can also look at productivity areas, which call centers typically measure to excess.  Let’s assume a call center rep spends 20% of the average call time of 12 minutes (depending on the call type – account or bill inquiry, dispute, etc.) understanding:

  • Who the customer is
  • What he bought online and in-store
  • Whether he tried to resolve his issue on the website or in a store
  • How he uses his equipment
  • What he cares about
  • Whether he prefers call-backs, SMS or email confirmations
  • His response rate to offers
  • His value to the company

If he spends that 20% of every call stringing together insights from five applications and twelve screens – the same information in every application he touches – instead of seeing it in one frame within seconds, you have just freed up 20% of his hourly compensation.
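The call-center arithmetic above can be sketched as follows; the hourly rate, call volume and workdays are hypothetical assumptions, while the 20% lookup share and 12-minute average call come from the text.

```python
# Back-of-envelope version of the call-center productivity math.
# hourly_rate, calls_per_day and workdays are invented placeholders.

def rep_savings_per_year(calls_per_day, lookup_share=0.20,
                         avg_call_min=12, hourly_rate=18.0, workdays=230):
    """Annual compensation freed up per rep if the lookup time disappears."""
    wasted_hours_per_day = calls_per_day * avg_call_min * lookup_share / 60
    return wasted_hours_per_day * hourly_rate * workdays

# A rep handling 40 calls a day:
print(round(rep_savings_per_year(40)))  # -> 6624
```

Multiply by headcount to size the opportunity for the whole call center.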

Then look at the software, hardware, maintenance and ongoing management costs of the likely customer record sources (pick the worst- and best-quality ones based on your current understanding), which will end up in a centrally governed instance.  Per DAMA, every duplicate record will cost you between $0.45 (party) and $0.85 (product) per transaction (edit touch).  At the very least, each record will be touched once a year (more likely 3-5 times), so multiply your duplicate record count by that and you have your savings from de-duplication alone.  You can also use Aberdeen’s benchmark of 71 serious errors per 1,000 records, meaning the chance of transactional failure, and the effort required to fix it (a percentage of one or more FTEs’ daily workload), is high.  If this does not work for you, run a data profile with one of the many tools out there.
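A minimal sketch of that de-duplication math, assuming a hypothetical duplicate count; the cost-per-touch and touches-per-year ranges are the DAMA benchmarks quoted above.

```python
# DAMA-based de-duplication savings, expressed as a low/high range.
# duplicate_records is a placeholder; the benchmark ranges come from the text.

def dedup_savings(duplicate_records, touches_per_year=(1, 5),
                  cost_per_touch=(0.45, 0.85)):
    """Return (low, high) annual savings from eliminating duplicates."""
    low = duplicate_records * touches_per_year[0] * cost_per_touch[0]
    high = duplicate_records * touches_per_year[1] * cost_per_touch[1]
    return low, high

# Example: 200,000 duplicate party records
lo, hi = dedup_savings(200_000)
print(round(lo), round(hi))  # -> 90000 850000
```

Reporting the range rather than a single number keeps the business case honest about benchmark uncertainty.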

If the sign says it – do it!

If standardization of records (zip codes, billing codes, currency, etc.) is the problem, ask your business partner how many customer contacts (calls, mailings, emails, orders, invoices or account statements) fail outright and/or require validation because of these attributes.  Once again, apply the productivity gains mentioned earlier and there are your savings.  If you look at the number of orders whose payment or revenue recognition gets delayed by a week or a month, and at the average order amount, you can quantify how much profit (multiply by operating margin) you could pull into the current financial year from the next one.
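The pull-forward calculation can be sketched like this; all three inputs are placeholders to be replaced with your own order data.

```python
# Profit pulled into the current financial year by removing order delays.
# Every input below is a hypothetical placeholder.

def profit_pulled_forward(delayed_orders, avg_order_amount, operating_margin):
    """Profit moved forward by fixing the record issues that delay orders."""
    return delayed_orders * avg_order_amount * operating_margin

# Example: 1,200 delayed orders at a $5,000 average and a 12% operating margin
print(round(profit_pulled_forward(1_200, 5_000, 0.12)))  # -> 720000
```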

The same is true for speeding up the introduction of a new product, or a change to one, generating profits earlier.  Note that the time value of funds realized earlier is too small to matter in most instances, especially in the current interest environment.

If emails bounce back or snail mail gets returned (no such address, no such name at this address, no such domain, no such user at this domain), (e)mail verification tools can help reduce the bounces. If every mail piece costs $1.25 (forget email due to its minuscule cost) – and this will vary by type of mailing (catalog, promotional post card, statement letter) – incorrect or incomplete records are wasted cost.  If you can, use the fully loaded print cost, including third-party data prep and returns handling.  You will never capture all cost inputs, but take a conservative stab.

If it was an offer, reduced bounces should also improve your response rate (also true for email now). Prospect mail response rates are typically around 1.2% (Direct Marketing Association), whereas phone response rates are around 8.2%.  Say your current response rate is half that (for argument’s sake), and you send out 100,000 emails, of which 1.3% (Silverpop) have customer data issues. Fixing 81-93% of them (our experience) will drop the bounce rate to under 0.3%, meaning more emails arrive and are relevant. Multiply the recovered emails by a standard conversion rate of 3% (MarketingSherpa; industry- and channel-specific) and the average order (your data), then by operating margin, and you get a benefit value for revenue.
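Chaining those benchmarks together step by step (a hedged sketch: the average order value and operating margin are invented placeholders; the issue rate, fix rate and conversion rate are the benchmarks quoted above, with the fix rate at the low end of 81-93%):

```python
# The e-mail funnel above, one step per line.
# avg_order and margin are placeholders; the rates are the quoted benchmarks.

def recovered_revenue(emails_sent=100_000, issue_rate=0.013, fix_rate=0.81,
                      conversion_rate=0.03, avg_order=150.0, margin=0.12):
    """Extra operating profit from repairing bad e-mail records."""
    fixed = emails_sent * issue_rate * fix_rate      # records repaired
    extra_orders = fixed * conversion_rate           # now deliverable and converting
    return extra_orders * avg_order * margin

print(round(recovered_revenue(), 2))  # -> 568.62
```

The absolute number is small per campaign; the point is that it scales linearly with send volume and compounds across every campaign in the year.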

If product data and inventory carrying cost or supplier spend are your issue, find out how many supplier shipments you receive every month and the average cost of a part (or a cost range), then apply the Aberdeen master data failure rate (71 in 1,000) to use cases around missing or incorrect supersession or alternate part data to assess the overspend in a single shipment.  You can also just take the ending inventory amount from the 10-K report and apply a 3-10% improvement (Aberdeen) in a top-down approach. Alternatively, apply 3.2-4.9% to your annual supplier spend (KPMG).
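The top-down variant can be sketched as follows, with a hypothetical ending-inventory figure and the Aberdeen 3-10% range from the text.

```python
# Top-down inventory sizing: apply the Aberdeen improvement range to the
# ending inventory from the 10-K. The $40M figure is a placeholder.

def inventory_improvement(ending_inventory, low=0.03, high=0.10):
    """Return (low, high) carrying-cost reduction from better product data."""
    return ending_inventory * low, ending_inventory * high

# Example: a $40M ending inventory
lo, hi = inventory_improvement(40_000_000)
print(round(lo), round(hi))  # -> 1200000 4000000
```

The same shape works for the KPMG supplier-spend variant; just swap the percentage range and the base figure.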

You could also investigate the expediting or return costs of shipments in a period due to incorrectly aggregated customer forecasts, wrong or incomplete product information, or wrong shipment instructions in a product or location profile. Apply Aberdeen’s 5% improvement rate and there you go.

Consider that a North American utility told us that just fixing the product information of their 200 Tier 1 suppliers increased discounts from $14 million to $120 million. They also found that fixing one basic attribute out of sixty in one part category saves them over $200,000 annually.

So what ROI percentages would you find tolerable or justifiable for, say an EDW project, a CRM project, a new claims system, etc.? What would the annual savings or new revenue be that you were comfortable with?  What was the craziest improvement you have seen coming to fruition, which nobody expected?

Next time, I will add some more “use cases” to the list and look at some philosophical implications of averages.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Migration, Data Quality, Enterprise Data Management, Master Data Management

Emerging Markets: Does Location Matter?

I recently wrapped up two overseas trips; one to Central America and another to South Africa. As such, I had the opportunity to meet with a national bank and a regional retailer. It prompted me to ask the question: Does location matter in emerging markets?

I wish I could tell you that there was a common theme in how firms in the same sector or country (even city) treat data on a philosophical or operational level, but I cannot.   It is a unique experience every time, as factors like ownership history, regulatory scrutiny, available and affordable skill sets, and past as well as current financial success create a unique grey pattern rather than a comfortable black-and-white separation. This is even more obvious when I mix in recent meetings I had with North American organizations in the same sectors.

Banking in Latin vs North America

While a national bank in Latin America may seem lethargic, unimaginative and unpolished at first, you can feel the excitement when they conceive, touch and play with the potential of new paradigms, like becoming data-driven.  Decades of public ownership do not seem to have stifled their willingness to learn and improve. On the other side, there is a stock-market-listed, regional US bank where half the organization appears to believe in muddling along without expert IT knowledge, which reduced adoption and financial success in past projects.  Back-office leadership also firmly believes in “relationship management” over data-driven “value management”.

To quote a leader in their finance department, “We don’t believe that knowing a few more characteristics about a client creates more profit… the account rep already knows everything about them and what they have and need.”  Then he said, “Not sure why the other departments told you there are issues.  We have all this information, but it may not be rolled out to them yet, or they have no license to view it to date.”  This reminded me of an “All Quiet on the Western Front” mentality: if it is all good over here, why are most people saying it is not?  Granted, one more attribute may not tip the scale toward higher profits, but a few more, plus their historical interrelationships, typically do.

“All Quiet on the Western Front” mentality?

As an example; think about the correlation of average account balance fluctuations, property sale, bill pay account payee set ups, credit card late charges and call center interactions over the course of a year.

The Latin American bankers just said, “We have no idea what we know and don’t know… but we know that even long-standing relationships with corporate clients are lacking upsell execution.”  In this case, upsell potential centered on transforming wire transfer SWIFT messages into the local standard they report in, and back.  Understanding the SWIFT message parameters in full creates an opportunity to approach originating entities and cut out the middleman bank.

Retailing in Africa vs Europe

The African retailer’s IT architects indicated that customer information is centralized and complete, and that integration is not an issue, as they have done it forever.   Also, consumer householding information is not a viable concept due to differing regional interpretations, vendor information is brand-specific and therefore not centrally managed, and event-based actions are easily handled in BizTalk.  Home delivery and pickup are in their infancy.

The only apparent improvement area is product information enrichment for an omnichannel strategy. This would involve enhancing attribution for merchandise demand planning, inventory and logistics management and marketing.  Attributes could include not only full and standardized capture of style, packaging, shipping instructions, logical groupings, WIP vs finished goods identifiers, units of measure, images and lead times but also regional cultural and climate implications.

However, data-driven retailers are increasingly becoming service and logistics companies to improve wallet share, even in emerging markets.  Look at the successful Russian eTailer Ozon, which handles third-party merchandise for shipping and cash management via a combination of agency-style mom-and-pop shops and online capabilities.  Having good products at the lowest price alone is not cutting it anymore, and has not for a while.  Only luxury chains may be able to avoid this realization for now. Store size and location come at a premium these days. Hypermarkets are ill-equipped to deal with high-profit specialty items.  Commercial real estate vacancies on British high streets are at a high (Economist, July 13, 2014) and footfall is at a seven-year low.   The Centre for Retail Research predicts that 20% of store locations will close over the next five years.

If specialized, high-end products are the most profitable, I can (test) sell most of them online or at least through fewer, smaller stores saving on carrying cost.   If my customers can then pick them up and return them however they want (store, home) and I can reduce returns from normally 30% (per the Economist) to fewer than 10% by educating and servicing them as unbureaucratically as possible, I just won the semifinals.  If I can then personalize recommendations based on my customers’ preferences, life style events, relationships, real-time location and reward them in a meaningful way, I just won the cup.

AT Kearney “Seizing Africa’s Retail Opportunities” (2014)

Emerging markets may seem a few years behind but companies like Amazon or Ozon have shown that first movers enjoy tremendous long-term advantages.

So what does this mean for IT?  Putting your apps into the cloud (maybe even outside your country) may seem like an easy fix.  However, it may not only create performance and legal issues but also unexpected costs to support decent SLA terms.  Does your data support transactions profitable enough today to absorb this additional cost of going into the cloud?  A focus on transactional applications and their management obscures the need for a strong data management backbone, just like the one you built for your messaging and workflows ten years ago.  Then you can tether all the fancy apps to it you want.

Have any emerging markets’ war stories or trends to share?  I would love to hear them.  Stay tuned for future editions of this series.

Posted in Financial Services, Retail, Vertical

Top 5 Data Themes in Emerging Markets

Recently, my US-based job led me to a South African hotel room, where I watched Germany play Brazil in the World Cup. The global nature of the event was familiar to me. My work covers countries like Malaysia, Thailand, Singapore, South Africa and Costa Rica. And as I pondered the stunning score (Germany won, 7 to 1), my mind was drawn to emerging markets. What defines an emerging market? In particular, what are the data-related themes common to emerging markets? Because I work with global clients in the banking, oil and gas, telecommunications, and retail industries, I have learned a great deal about this. As a result, I wanted to share my top 5 observations about data in Emerging Markets.

1) Communication Infrastructure Matters

Many of the emerging markets, particularly in Africa, jumped from one or two generations of telco infrastructure directly into 3G and fiber within a decade. However, this truth only applies to large, cosmopolitan areas. International diversification of fiber connectivity is only starting to take shape. (For example, in Southern Africa, BRICS terrestrial fiber is coming online soon.) What does this mean for data management? First, global connectivity influences domestic last mile fiber deployment to households and businesses. This, in turn, will create additional adoption of new devices. This adoption will create critical mass for higher productivity services, such as eCommerce. As web based transactions take off, better data management practices will follow. Secondly, European and South American data centers become viable legal and performance options for African organizations. This could be a game changer for software vendors dealing in cloud services for BI, CRM, HCM, BPM and ETL.

2) Competition in Telecommunication Matters

If you compare basic wireless and broadband bundle prices between the US, the UK and South Africa, for example, a lack of true competition makes further coverage upgrades, like 4G and higher broadband bandwidths, easy for operators to digest. These upgrades make telecommuting and constant social media engagement possible. Keeping prices low, as in the UK, is the flip side that achieves the same result. The worst case is high prices and low bandwidth, from the last mile to the global nodes. This also means low infrastructure investment and thus fewer consumers online for fewer hours. This is often the case in geographically vast countries (in Africa and Latin America) with large rural areas. Here, data management is an afterthought for the most part. Data is intentionally kept in application silos, as these are the value creators. Hand coding is pervasive, stringing data together to make small moves that enhance the view of a product, location, consumer or supplier.

3) A Nation’s Judicial System Matters

If you do business in nations with a long, often British, judicial tradition, chances are investment will happen. If you have such a history, but it is undermined by a parallel history of graft from the highest to the lowest levels, rooted in the importance of tribal traditions, only natural resources will save your economy. Why does it matter that one of my regional markets is “linked up” if shipping logistics are burdened by this excess cost and delay? The impact on data management is a lack of use cases supporting an enterprise-wide strategy across all territories. Why invest if profits are unpredictable or too meager? This is why small Zambia and Botswana are ahead of the largest African economy, Nigeria.

4) Expertise Location Matters

Anybody can have the most advanced vision of a data-driven, event-based architecture supporting the fanciest data movement and persistence standards. Without the skill to make the case to the business, it is a lost cause, unless your local culture still has IT in charge of specifying requirements, running the evaluation, and selecting and implementing new technology. The case is also lost if there are no leaders who have seen how other leading firms, in the same or a different sector, went about it (un)successfully. Lastly, if you don’t pay for skill, your project failure risk just tripled. Duh!

5) Denial is Universal

It doesn’t matter whether you are an Asian oil company, a regional North American bank, a Central American national bank or an African retail conglomerate: if finance or IT invested in technologies before and saw a lack of adoption, for whatever reason, they will deny data management challenges even as other departments complain. Moreover, if system integrators or internal client staff (mis)understand data management as fixing processes (which it is not) instead of supporting transactional integrity (which it is), clients are on the wrong track. Here, data management undeservedly becomes a philosophical battleground.

This is definitely not a complete list or super-thorough analysis but I think it covers the most crucial observations from my engagements. I would love to hear about your findings in emerging markets.

Stay tuned for part 2 of this series where I will talk about the denial and embrace of corporate data challenges as it pertains to an organization’s location.

Posted in Governance, Risk and Compliance, Public Sector, Retail, Telecommunications, Utilities & Energy

The Ones Not Screwing Up with Compliance Will Win

A few weeks ago, a regional US bank asked me to perform some compliance and use case analysis around fixing their data management situation.  This bank prides itself on customer service and SMB focus, while using large-bank product offerings.  However, they were about a decade behind most banks in modernizing their IT infrastructure to stay operationally on top of things.

Bank Efficiency Ratio per AUM (Assets under Management), bankregdata.com

This included technologies like ESB, BPM, CRM, etc.  They were also a sub-optimal user of EDW and analytics capabilities. Having said all this, there was a commitment to change things up, which is always a needed first step in any recovery program.

THE STAKEHOLDERS

As I conducted my interviews across various departments (list below) it became very apparent that they were not suffering from data poverty (see prior post) but from lack of accessibility and use of data.

  • Compliance
  • Vendor Management & Risk
  • Commercial and Consumer Depository products
  • Credit Risk
  • HR & Compensation
  • Retail
  • Private Banking
  • Finance
  • Customer Solutions

FRESH BREEZE

This lack of use occurred across the board.  The natural reaction was to throw more bodies and more Band-Aid marts at the problem.  Users also started to operate under the assumption that it will never get better.  They just resigned themselves to mediocrity.  When some new players came into the organization from various systemically critical banks, they shook things up.

Here is a list of use cases they want to tackle:

  • Proposing real-time offers based on customer events, as simple as offering investment banking products after an unusually high inflow of cash into a deposit account.
  • Using all mortgage application information to understand debt/equity ratios and make relevant offers.
  • Capturing true product and customer profitability across all lines of commercial and consumer products, including trust, treasury management, deposits, private banking, loans, etc.
  • Agile evaluation, creation, testing and deployment of new terms on existing products and products under development, shortening the product development life cycle.
  • Reducing the time wealth management advisors spend researching clients and prospects.
  • Reducing unclaimed use tax, insurance premiums and leases paid on consumables, real estate and requisitions due to incorrect equipment status and location – for example, assets no longer owned, scrapped, or moved to a different department.
  • More efficient reconciliation between transactional systems and finance, where finance often uses multiple party IDs per contract change in accounts receivable, while the operating division uses one based on a contract and its addendums.  An example is vendor payment consolidation to create a true supplier spend and thus take advantage of volume discounts.
  • Proactively creating a central compliance footprint (AML, 314, Suspicious Activity, CTR, etc.), allowing for quicker turnaround and fewer audit findings from MRAs (matters requiring attention).

MONEY TO BE MADE – PEOPLE TO SEE

Adding these up came to about $31 to $49 million annually in cost savings, new revenue or increased productivity for this bank, which has $24 billion in total assets.

So now that we know there is money to be made by fixing the data of this organization, how can we realistically roll this out in an organization with many competing IT needs?

The best way to go about this is to attach any kind of data management project to a larger, business-oriented project, like CRM or EDW.  Rather than wait for these to go live without good seed data, why not feed them with better data as a key work stream within their respective project plans?

To summarize my findings, let me share three quotes from my interviews.  A lady who had recently struggled through an OCC audit told me she believes that the banks that can remain compliant at the lowest cost will ultimately win the end game; she meant particularly tier 2 and tier 3 organizations.  A gentleman from commercial banking left me with this statement: “Knowing what I know now, I would not bank with us.”  The same lady also said, “We engage in spreadsheet kung fu” to bring data together.

Given all this, what would you suggest?  Have you worked with an organization like this? Did you encounter any similar or different use cases in financial services institutions?

Posted in Banking & Capital Markets, Data Quality, Financial Services, Governance, Risk and Compliance

Telecommunications and Data: What If Your Fiancée Flunked Finance?

About 15 or so years ago, some friends of mine called me to share great news.  Their dating relationship had become serious and they were headed toward marriage.  After a romantic proposal and a beautiful ring, it was time to plan the wedding and invite the guests.

Lack of a Steady Income Stream is Not Romantic

This exciting time was complicated by a significant challenge. Though they were very much in love, one of them had an incredibly tough time making wise financial choices. During the wedding planning process, the financially astute fiancée grew concerned about the problems the challenged partner could bring. Even though the financially illiterate fiancée had every other admirable quality, the finance issue nearly created enough doubt to end the engagement.  Fortunately, my friends moved forward with the ceremony, were married and immediately went to work on learning healthy financial habits as a couple.

Is financial folly a relationship red flag?

Let’s segue into how this relates to telecommunications and data, specifically to your average communications operator. Just like a concerned fiancée, you’d think twice about making a commitment to an organization that didn’t have a strong foundation.

Like the financially challenged fiancée, the average operator has a number of excellent qualities: a functioning business model, great branding, international roaming, creative ads, long-term prospects, smart people at the helm and all the data and IT assets you can imagine.  Unfortunately, despite the externally visible bells and whistles, over time they tend to lose operational soundness around the basics. Specifically, poor data quality causes them to forfeit an ever-increasing amount of billing revenue. Their poor data costs them millions each year.

A recent set of engagements highlighted this phenomenon. A small carrier (3-6 million subscribers) that implements a more consistent, unique way to manage core subscriber profile and product data could recover $6.9 million in underbilling annually. A larger carrier (10-20 million subscribers) could recover $28.1 million every year by fixing billing errors. (This doesn’t even cover the large Indian and Chinese carriers with over 100 million customers!)
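For a sense of scale, those recovery figures can be normalized to a per-subscriber view. Note that using the midpoints of the quoted subscriber ranges is my assumption, not part of the engagement data:

```python
# Per-subscriber-per-year view of the underbilling recovery figures above.
# The midpoint subscriber counts are assumptions for illustration only.

def recovery_per_subscriber(annual_recovery, subscribers):
    """Annual recovered billing revenue per subscriber."""
    return annual_recovery / subscribers

small = recovery_per_subscriber(6_900_000, 4_500_000)    # 3-6M midpoint
large = recovery_per_subscriber(28_100_000, 15_000_000)  # 10-20M midpoint
print(round(small, 2), round(large, 2))  # -> 1.53 1.87
```

Roughly $1.50 to $2.00 per subscriber per year, which compounds quickly across a multi-million-subscriber base.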

Typically, a billing error starts with the incorrect setup of a service line item’s base price and its 30+ related discount line variances.  Next, the wrong service discount item is applied at contract start.  If that does not happen (or on top of it), an error will occur when the customer calls in during, or right before the end of, the first contract period (12-24 months) to complain about service quality, bill shock, etc.  Here, the call center rep will break an existing triple-play bundle by deleting an item and setting up a separate non-bundle service line item at a lower price (higher discount).  The head of billing actually told us, “Our reps just give a residential subscriber a discount of $2 for calling us.”  It’s even higher for commercial clients.

To make matters worse, this change will trigger misaligned (incorrect) activation dates or even bill duplication, all of which will have to be fixed later by multiple staff on the BSS and OSS side, or may even trigger an investigation by the revenue assurance department.  Worst case, the deletion of the item from the bundle (especially for B2B clients) will not terminate the wholesale cost the carrier still owes a national carrier for a broadband line, which is often a third of the retail price for a business customer.

To come full circle to my initial “accounting challenged” example; would you marry (invest in) this organization?  Do you think this can or should be solved in a big bang approach or incrementally?  Where would you start: product management, the service center, residential or commercial customers?

Observations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

Posted in Data Integration, Data Privacy

Who Has the Heart to Adopt this Orphan Oil Well?

As I browsed my BBC app a few weeks ago, I ran into an article about the environmental contamination caused by UK oil wells that were left to their own devices. The article explains that a lack of data and proper data management is causing major issues for gas and oil companies. In fact, researchers found no data for more than 2,000 inactive wells, many of which have been abandoned or “orphaned” (sealed and covered up). I started to scratch my head, imagining what this problem looks like in places like Brazil, Nigeria, Malaysia, Angola and the Middle East, countries and regions where regulatory oversight is, on average, a bit looser.

Like Oliver, this well needs a home!

On top of that, please excuse my cynicism here, but an “Orphan” well is just as ridiculous a concept as a “Dry” well.  A hole without liquid inside is not a well but – you guessed it – a hole.  Also, every well has a “Parent”, meaning

  • The person or company who drilled it
  • A land owner who gets paid from its production and permitted the operation (otherwise it would be illegal)
  • A financier who fronted the equipment and research cost
  • A regulator, who is charged with overseeing the reservoir’s exploration

Let the “hydrocarbon family court judge” decide whose problem this orphan is, with well-founded information – no pun intended.  After all, this “domestic disturbance” is typically just as well documented as any police “house call” when you hear screams from next door. Similarly, one would expect that when (exploratory) wells are abandoned and improperly capped or completed, there is a long track record of financial or operational troubles at the involved parties.  Apparently, I was wrong.  Nobody seems to have a record of where the well actually was on the surface, let alone subsurface, to determine perforation risks in itself or from an actively managed bore nearby.

This reminds me of a meeting with an Asian NOC’s PMU IT staff, who vigorously disagreed with every other department about the reality on the ground versus at group level. The PMU folks insisted they had all wells’ key attributes nailed down:

  1. Knowing how many wells and bores they had across the globe and all types of commercial models including joint ventures
  2. Where they were and are today
  3. What their technical characteristics were and currently are

The other departments, from finance to strategy, clearly indicated that the 10,000 wells across the globe currently being “mastered” with (at least initially) cheap internal band-aid fixes carry a margin of error of up to 10%.  So much for long-term TCO.  After reading this BBC article, this internal disagreement made even more sense.
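That kind of margin of error is easy to demonstrate. Below is a minimal, hypothetical sketch of reconciling a group-level well register against field-level records by matching rounded surface coordinates; all IDs, coordinates and field names are invented for illustration, not drawn from any actual NOC system:

```python
# Reconcile a group-level well register against field-level records by
# matching rounded surface coordinates. All data here is invented.

def well_key(rec, tolerance=0.01):
    # Round coordinates so small GPS discrepancies still match.
    return (round(rec["lat"] / tolerance), round(rec["lon"] / tolerance))

def mismatch_rate(group_register, field_records):
    field_keys = {well_key(r) for r in field_records}
    unmatched = [r for r in group_register if well_key(r) not in field_keys]
    return len(unmatched) / len(group_register)

group_register = [
    {"id": "W-001", "lat": 4.512, "lon": 114.080},
    {"id": "W-002", "lat": 4.601, "lon": 114.210},
    {"id": "W-003", "lat": 5.113, "lon": 115.340},  # nobody in the field knows it
]
field_records = [
    {"id": "A7", "lat": 4.514, "lon": 114.083},
    {"id": "B2", "lat": 4.603, "lon": 114.208},
]

print(f"unreconciled share: {mismatch_rate(group_register, field_records):.0%}")
# -> unreconciled share: 33%
```

Real mastering would of course weigh many more attributes (bore count, operator, spud date) and use proper survivorship rules, but even a naive pass like this surfaces the wells nobody can account for.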

If this chasm does not make a case for proper mastering of key operational entities, like wells, I don’t know what does. It also begs the question of how any operation with potentially very negative long-term effects can have no legally culpable party captured in some sort of, dare I say, master register.  Isn’t this the sign of the “rule of law” governing an advanced nation, e.g. having a land register, building permits, wills, etc.?

I rest my case, your honor.  May the garden fairies forgive us for spoiling their perfectly manicured lawn.  With more fracking and public scrutiny on the horizon, maybe regulators need to establish their own “trusted” well master file rather than rely on oil firms’ data dumps.  After all, the next downhole location may be just a foot away from perforating one of these “orphans”, setting your kitchen sink faucet on fire.

Do you think another push for local governments to establish “well registries”, as they did ten years ago for national IDs, is in order?

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

Posted in Big Data, Business Impact / Benefits, Master Data Management

Business Beware! Corporate IT Is “Fixing” YOUR Data

It is troublesome to me to repeatedly get into conversations with IT managers who want to fix data “for the sake of fixing it”.  While this is presumably increasingly rare, due to my department’s role we probably see a higher occurrence than the average software vendor employee does.  Given that, please excuse the inflammatory title of this post.

Nevertheless, once the deal is done, we find increasingly fewer of these instances, yet still enough, as the average implementation consultant or developer cares about this aspect even less.  A few months ago, a petrochemical firm’s G&G IT team lead told me that he does not believe data quality improvements can or should be measured.  He also said, “if we need another application, we buy it.  End of story.”  Good for software vendors, I thought, but in most organizations $1M here or there does not lie around leisurely; plus, decision makers want to see the – dare I say it – ROI.

This is not what a business – IT relationship should feel like

However, IT and business leaders should take note that misalignment due to lack OR disregard of communication is a critical risk factor.  If the business does not get what it needs and wants AND that differs from what Corporate IT is envisioning and working on – and this is what I am talking about here – it makes any IT investment a risky proposition.

Let me illustrate this with 4 recent examples I ran into:

1. Potential for flawed prioritization

A retail customer’s IT department apparently knew that fixing and enriching a customer loyalty record across the enterprise is a good and financially rewarding idea.  They only wanted to understand what the less-risky functional implementation choices were. They indicated that if they wanted to learn what the factual financial impact of “fixing” certain records or attributes was, they would just have to look into their enterprise data warehouse.  This is where the logic falls apart, as the warehouse would be just as unreliable as the “compromised” applications (POS, marketing, ERP) feeding it.

Even if they massaged the data before it hit the next EDW load, there is nothing inherently real-time about this, as all OLTP systems keep running processes on incorrect (no bidirectional linkage) and stale (since the last load) data.

I would question whether the business is now completely aligned with what IT is continuously correcting. After all, IT may go for the “easy or obvious” fixes via a weekly or monthly recurring data scrub exercise without truly knowing which fix delivers the “biggest bang for the buck”, or which other affected business use cases they may not even be aware of yet.  Imagine the productivity impact of all the roundtripping and delay in reporting this creates.  This example also reminds me of a telco client I encountered during my tenure at another tech firm, which fed its customer master from its EDW and only now found out that this pattern is doomed to fail due to data staleness and performance.
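One hedge against the “easy or obvious fixes” trap is to rank candidate fixes by estimated financial impact per unit of effort before scheduling the recurring scrub. A hypothetical sketch; every figure and fix name below is invented for illustration:

```python
# Rank candidate data-quality fixes by estimated annual impact per
# effort-day rather than by how easy they look. All figures invented.

fixes = [
    {"name": "dedupe loyalty records", "annual_impact": 1_200_000, "effort_days": 40},
    {"name": "standardize addresses",  "annual_impact":   300_000, "effort_days": 12},
    {"name": "fix POS product codes",  "annual_impact": 2_500_000, "effort_days": 90},
]

def impact_per_day(fix):
    return fix["annual_impact"] / fix["effort_days"]

for fix in sorted(fixes, key=impact_per_day, reverse=True):
    print(f"{fix['name']}: ${impact_per_day(fix):,.0f} per effort-day")
```

Even this crude ranking forces the conversation with the business about which records actually matter, instead of letting the scrub exercise run on autopilot.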

2. Fix IT issues and business benefits will trickle down

Client number two is a large North American construction company.  An architect built a business case for fixing a variety of data buckets in the organization (CRM, Brand Management, Partner Onboarding, Mobility Services, Quotation & Requisitions, BI & EPM).

Grand vision documents existed and were linked to the case, stating how the data would get better (like a sick patient), but there was no mention of hard facts about how each of the use cases would deliver on this.  After I gave him some serious counseling on what to look out for and how to flesh it out – radio silence. Someone got scared of the math, I guess.

3. Now that we bought it, where do we start

The third culprit was a large petrochemical firm, which apparently sat on some excess funds and thought (rightfully so) it was a good idea to fix their well attributes. More power to them.  However, the IT team is now in a dreadful position having to justify to their boss and ultimately the E&P division head why they prioritized this effort so highly and spent the money.  Well, they had their heart in the right place but are a tad late.   Still, I consider this better late than never.

4. A senior moment

The last example comes from a South American communications provider. They seemingly did everything right, given the results they achieved to date.  This goes to show that misalignment of IT and business does not necessarily wreak havoc – at least initially.

However, they are now in phase 3 of their roll-out and reality has caught up with them.  A senior moment or lapse in judgment, maybe? Whatever it was, once they fixed their CRM, network and billing application data, they had to start talking to the business and financial analysts as complaints and questions started to trickle in. Once again, better late than never.

So what is the take-away from these stories? Why wait until phase 3? Why be forced to cram in some justification after the purchase?  You pick which approach works best for you to fix this age-old issue.  But please heed Sohaib’s words of wisdom recently broadcast on CNN Money: “IT is a mature sector post bubble… now it needs to deliver the goods”.  And here is an action item for you – check out the new way for business users to prepare their own data (30 minutes into the video!).  Agreed?

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, Customer Acquisition & Retention, Customer Services, Data Aggregation, Data Governance, Data Integration, Data Quality, Data Warehousing, Enterprise Data Management, Master Data Management

3 Reasons Why Martial Arts Should Be A Job Requirement For A Chief Data Officer

Forget degrees from Harvard or MIT, forget NoSQL, Hadoop or OBIEE.  These are all powerful tools but they will not win you the face-off.  It starts with who you are (or are not), who or what you are going up against and what has happened in the past. Why should martial arts be a job requirement for Chief Data Officers? I boiled it down to three simple reasons to help you understand.

Does this look like your CDO?

I started practicing Kendo three years ago and it surprises me every single practice how inadequate I still am, how much I can glean from my opponent to determine future behavior and how unimportant “background noise” really is.  Even if I have a good day, some strike from a teenager or a retiree, who has been practicing for a decade or more, will remind me that I got only 1% better compared to last month and I have a long way to go.  At our last practice, one of my Senseis told me that the higher ranks get their Ki-Ken-Tai-Ichi (alignment of spirit, sword, body) right maybe half the time.  It’s a life lesson every time.

These three facts are probably also true for many one-on-one sports where adversaries study each other for more than just a couple of seconds before their next swing or shot.  If you ask me, most Western sports are about endurance, strength, mindset and strategy, with a heavy focus on the physical aspects.  Kendo is 90% strategy and mindset. That is why six- and sixty-year-olds alike can excel in it. It is more akin to chess with baseball bats.

You study your opponent from the second he walks up to the chair in the middle of the podium for a gentleman-like exchange of cerebral willpower, but in the end you will smack him relentlessly with a bat.  Everything you do is directly driven by how you feel, what you think, how your opponent moves, and what your opponent feels and thinks. The goal is not to react to a hand being raised but to anticipate your opponent’s next move based on their most recent actions. By the time your eyes (and, a tenth of a second later, your brain) capture the right hand going up to strike towards you – you’ve already lost, as it is too late to react.

You are effectively analyzing core data domains and key attributes, like posture.  Business data requires the same rigor and focus on the essential. There is also a tremendous amount of process (formalities like repeated bowing) and deeper meaning in everything you do; call it “Governance”.

A Chief Data Officer (CDO) needs to mind the same aspects in his or her existence.

  • You are not the professional you think you are (humility)
  • Someone else always knows something you don’t, so every additional bit helps to predict future actions (willingness to learn)
  • How to eliminate all the noise detracting from the ultimate goal (focus)

In reality, the data problem was not solved long ago; something new can still be learned to combat this age-old issue.  The learning piece comes into play when we are willing to listen to people who have done or seen similar problems being fixed in another environment, not necessarily the same industry or department. The third is that political and technical detractors like procurement processes, M&A, new leadership or transactional volume spikes from more applications will continue to pop up.  However, it is on you, the CDO, to uncover, isolate and preach that a flawed process may not always be the root cause of a business issue and as such needs to be put in perspective.  As I always say, “throwing bad data at a better process” just saved you a step but still renders errors, rework and bad decisions.

So what does this mean in “real” terms:

  • Seek and accept opinions frequently, even if they don’t match your issue perfectly. Often a customer is a customer is a customer….admit it. Your business model may not be that special after all.
  • Watch what the others do on a fundamental level, i.e. becoming data-driven organizations. These could be competitors, partners, organizations you (should) admire.
  • Internalize and socialize what the organization’s core asset and goal is – the one that will move the needle the most. Often it will be your intelligence (read: information or data).

I will leave you with these thoughts and invite you to sit down, cross your legs, close your eyes and get all esoteric on me, young grasshopper – but please envision what your organization should look like and how it should make its money five years from now.   Throwing more resources at new problems, ignoring core data issues and reacting when things bubble up in greater numbers will likely not cut it.

And here is where I will bow out. Take a moment and think about it; how does your take on life influence your assessment of what you encounter in your workplace?

Posted in Business Impact / Benefits, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Hadoop

Becoming a Revenue Driven Business Model through Data is Painful for Government Agencies

Recently, I presented a Business Value Assessment to a client.  The findings were based on a revenue-generating state government agency. Everyone at the presentation was stunned to find out how much money was left on the table by not basing their activities on transactions which could be cleanly tied to the participating citizenry and a variety of channel partners. Over $38 million in annual benefits was at stake, including partially recovered lost revenue, cost avoidance and cost reduction. Making data a bigger part of this revenue-driven business model could have prevented this loss.

Should government leaders go to the “School of Data” to understand where more revenue can be created without necessary tax hikes? (Source:creativecommons.org)

Given the total revenue volume, this may seem small. However, after factoring in the little technology effort required to “collect and connect” data from existing transactions, it is actually extremely high.

The real challenge for this organization will be the required policy transformation to turn it from “data-starved” to “data-intensive”. Today, strategic decisions around new products, locations and customers rely on surveys, which suffer from sampling errors, biases, etc. Additionally, surveys are often delayed, making them practically ineffective in the real-time world we live in today.

Despite no applicable legal restrictions, the leadership’s main concern was that gathering more data would erode the public’s trust and positive image of the organization.

To be clear: by “more” data being collected by this type of government agency, I mean literally 10% of what any commercial retail entity has gathered on all of us for decades.  This is not the next NSA revelation, as any conspiracy theorist may fear.

While I respect their culturally driven self-censorship despite no legal barricades, it raises their stakeholders’ (the state’s citizenry) concern over its performance.  To be clear, there would be no additional revenue for the state’s programs without more citizen data.  You may believe that they already know everything about you, including your income, property value, tax information, etc. However, inter-departmental sharing of criminally-non-relevant information is legally constrained.

Another interesting finding from this evaluation was that they had no sense of conversion rate from email and social media campaigns. Impressions from click-throughs as well as hard/soft bounces were more important than tracking who actually generated revenue.
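Closing that gap is not technically hard once campaign touches can be joined to revenue transactions on a shared party identifier. A minimal sketch under that assumption, with IDs and amounts invented for illustration:

```python
# Join campaign click-throughs to actual payments to compute a true
# conversion rate, instead of stopping at impressions and bounces.
# IDs and amounts are invented for illustration.

clicked_ids = {"c1", "c2", "c3", "c4", "c5"}        # who clicked a campaign link
payments = {"c2": 120.0, "c5": 80.0, "c9": 45.0}    # party ID -> revenue collected

converted = clicked_ids & payments.keys()
conversion_rate = len(converted) / len(clicked_ids)
campaign_revenue = sum(payments[pid] for pid in converted)

print(f"conversion: {conversion_rate:.0%}, attributed revenue: ${campaign_revenue:,.2f}")
# -> conversion: 40%, attributed revenue: $200.00
```

The hard part is not this join; it is mastering the party identifier so the same citizen shows up consistently in the campaign tool and the revenue system.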

This is a very market-driven organization compared to other agencies. It actually does try to measure itself like a commercial enterprise and attempts to change in order to generate additional revenue for state programs benefiting the citizenry. I can only imagine what non-revenue-generating agencies (local, state or federal) do in this respect.  Is revenue-oriented thinking something the DoD, DoJ or Social Security should subscribe to?

Think tanks and political pundits are now looking at the trade-off between bringing democracy to every backyard on our globe and its long-term, budget ramifications. The DoD is looking to reduce the active component to its lowest in decades given the U.S. federal debt level.

Putting the data bits and pieces together for revenue

A recent article in HBR explains that cost cutting has never sustained an organization’s growth over a longer period of time, but new revenue sources have. Is your company or government agency only looking at cost and personnel productivity?

Disclaimer:

Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

 

Posted in Business Impact / Benefits, Data Integration, Data Quality, Operational Efficiency, Real-Time

Data: The Unsung Hero (or Villain) of every Communications Service Provider

The faceless hero of CSPs: Data

Analyzing current business trends helps illustrate how difficult and complex the Communication Service Provider business environment has become. CSPs face many challenges. Clients expect high quality, affordable content that can move between devices with minimum advertising or privacy concerns. To illustrate this phenomenon, here are a few recent examples:

  • Apple is working with Comcast/NBC Universal on a new converged offering
  • Vodafone purchased the Spanish cable operator, Ono, having to quickly separate the wireless customers from the cable ones and cross-sell existing products
  • Net neutrality has been scuttled in the US and upheld in the EU, so now a US CSP can give preferential bandwidth to content providers, generating higher margins
  • Microsoft’s Xbox community collects terabytes of data every day making effective use, storage and disposal based on local data retention regulation a challenge
  • Expensive 4G LTE infrastructure investment by operators such as Reliance is bringing streaming content to tens of millions of new consumers

To quickly capitalize on “new” (often old, but unknown) data sources, there has to be a common understanding of:

  • Where the data is
  • What state it is in
  • What it means
  • What volume and attributes are required to accommodate a one-off project vs. a recurring one
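The first three bullets are, at bottom, a data profiling exercise. Below is a minimal sketch of attribute-level completeness profiling over hypothetical subscriber records; field names and values are invented, and a real profiler would also check validity, conformity and duplication:

```python
# Profile per-attribute completeness across records pulled from many
# source applications. Records and field names are purely illustrative.
from collections import Counter

records = [
    {"msisdn": "447700900001", "email": "a@example.com", "postcode": "SW1A 1AA"},
    {"msisdn": "447700900002", "email": None,            "postcode": "EC1A 1BB"},
    {"msisdn": None,           "email": "c@example.com", "postcode": None},
]

def completeness(records):
    """Return the share of records with a non-empty value per field."""
    filled = Counter()
    for rec in records:
        for field, value in rec.items():
            if value not in (None, ""):
                filled[field] += 1
    return {field: filled[field] / len(records) for field in filled}

for field, share in completeness(records).items():
    print(f"{field}: {share:.0%} complete")
```

Run across a few hundred of those 400-1,500 applications, a profile like this is what turns “where is the data and what state is it in” from a guess into a number.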

When a multitude of departments request data for analytical projects with their one-off, IT-unsanctioned on-premise or cloud applications, how will you go about it? The average European operator has between 400 and 1,500 (known) applications. Imagine what the unknown count is.

A European operator with 20-30 million subscribers incurs an average of $3 million per month in unpaid invoices. This often results from incorrect or incomplete contact information. Imagine how much you would have to add for lost productivity, including gathering, re-formatting, enriching, checking and re-sending invoices. And this does not even account for late invoice payments or extended incorrect credit terms.

Think about all the flawed long-term conclusions that are being drawn from this bad data. This single data problem creates indirect cost in excess of three times the initial, direct impact of the unpaid invoices.
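The back-of-the-envelope math behind that claim, using the figures above (the 3x indirect multiplier is this post's own estimate, not a measured number):

```python
# Annualize the direct impact of unpaid invoices and apply the
# estimated indirect-cost multiplier cited above. Figures are estimates.

direct_monthly = 3_000_000   # unpaid invoices per month, 20-30M subscriber operator
indirect_multiplier = 3      # rework, re-invoicing, flawed long-term decisions

annual_direct = direct_monthly * 12
annual_indirect = annual_direct * indirect_multiplier

print(f"annual direct impact:   ${annual_direct:,}")                    # $36,000,000
print(f"annual indirect impact: ${annual_indirect:,}")                  # $108,000,000
print(f"annual total exposure:  ${annual_direct + annual_indirect:,}")  # $144,000,000
```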

Want to fix your data and overcome the accelerating cost of change? Involve your marketing, CEM, strategy, finance and sales leaders to help them understand data’s impact on the bottom line.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Governance, Data Integration, Data Quality, Operational Efficiency