Tag Archives: Data Quality

Garbage In, Garbage Out? Don’t Take Data for Granted in Analytics Initiatives!

The verdict is in. Data is now broadly perceived as a source of competitive advantage, and we all feel the pressure to deliver good data. It is no wonder organizations view Analytics initiatives as highly strategic. But the big question is: can you really trust your data? Or are you just creating pretty visualizations on top of bad data?

We also know there is a shift toward self-service Analytics. But did you know that, according to Gartner, “through 2016, less than 10% of self-service BI initiatives will be governed sufficiently to prevent inconsistencies that adversely affect the business”?1 This means you may show up at your next big meeting with data that contradicts your colleague’s. Perhaps you are not working off the same version of the truth. Maybe your data is siloed on different systems that are not working in concert. Or is your definition of ‘revenue’ or ‘leads’ different from your colleague’s?

So are we taking our data for granted? Are we just assuming that it’s all available, clean, complete, integrated and consistent? As we work with organizations to support their Analytics journeys, we often find that the harsh realities of data are quite different from perceptions. Let’s investigate this perception gap further.

For one, people may assume they can easily access all data. In reality, if data connectivity is not managed effectively, we often need to beg, borrow, and steal to get the right data from the right person — if we are lucky. In less fortunate scenarios, we may need to settle for partial data or a cheap substitute for the data we really wanted. And you know what they say: the only thing worse than no data is bad data. Right?

Another common misperception is: “Our data is clean. We have no data quality issues.” Wrong again. When we work with organizations to profile their data, they are often quite surprised to learn that their data is full of errors and gaps. One company recently discovered, within one minute of starting its data profiling exercise, that millions of its customer records contained the company’s own address instead of the customers’ addresses… Oops.
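
As a concrete illustration of what a first profiling pass can look like, here is a minimal pandas sketch. It assumes a hypothetical customer extract with an address column and a known headquarters address; the file name, columns, and HQ value are placeholders, not anyone's actual data.

```python
import pandas as pd

# Hypothetical inputs: a customer extract and the company's own HQ address.
customers = pd.read_csv("customer_extract.csv")        # assumed columns: customer_id, name, address
company_hq = "100 Main Street, Springfield, IL 62701"  # placeholder value

total = len(customers)
null_rate = customers.isna().mean().round(3)           # share of missing values per column

# Records that point at the company's own address instead of the customer's.
hq_matches = (customers["address"].fillna("").str.strip().str.lower()
              == company_hq.strip().lower()).sum()

print(f"Records profiled: {total}")
print("Null rate per column:\n", null_rate)
print(f"Records using the company's own address: {hq_matches} ({hq_matches / total:.1%})")
```

Even a throwaway script like this, run before the dashboards get built, tends to surface this kind of systemic surprise within minutes.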

Another myth is that all data is integrated. In reality, your data may reside in multiple locations: in the cloud, on premises, in Hadoop, on mainframes, and everywhere in between. Integrating data from all of these disparate and heterogeneous sources is not a trivial task unless you have the right tools.

And here is one more consideration to mull over. Do you find yourself manually hunting down and combining data to reproduce the same ad hoc report over and over again, perhaps in the wee hours of the night? Why reinvent the wheel? It would be far more productive to automate data ingestion and integration for reusable, shareable reports and Analytics.

Simply put, you need great data for great Analytics. We are excited to host Philip Russom of TDWI in a webinar to discuss how data management best practices can enable successful Analytics initiatives. 

And how about you?  Can you trust your data?  Please join us for this webinar to learn more about building a trust-relationship with your data!

  1. Gartner, “Predicts 2015: Power Shift in Business Intelligence and Analytics Will Fuel Disruption”; Josh Parenteau, Neil Chandler, Rita L. Sallam, Douglas Laney, Alan D. Duncan; November 21, 2014.

Is Your Data Ready to Maximize Value from Your CRM Investments?

A friend of mine recently reached out for advice on CRM solutions in the market. Though I have never worked for a CRM vendor, I have relevant experience, from working at companies that implemented such solutions to my current role interacting with large and small organizations across industries about the data requirements that support their ongoing application investments. As we spoke, memories surfaced from when he and I implemented Salesforce.com (SFDC) many years ago: memories we would rather forget, but worth calling out given his new situation.

We worked together at a large mortgage lending software vendor selling loan origination solutions to brokers and small lenders, mainly through email and direct mail marketing. He was responsible for Marketing Operations, and I ran Product Marketing. The company looked at Salesforce.com to help streamline sales operations and improve how we marketed to and serviced our customers. The existing CRM system dated from the early ’90s, and though it did what the company needed it to do, it was heavily customized, costly to operate, and had served out its useful life. It was time to upgrade in order to grow the business, improve productivity, and enhance customer relationships.

Ninety days after rolling out SFDC, we ran into some old familiar problems across the business. Sales reps still struggled to know who was a current customer using our software, marketing managers could not create quality mailing lists for prospecting, and call center reps could not tell whether the person on the other end was a customer or a prospect. Everyone wondered why this was happening given that we had adopted the best CRM solution in the market. You can imagine the heartburn and ulcers we all had after making such a huge investment in our new CRM solution. C-level executives were questioning our decisions and blaming the applications. The truth was, the issues were not related to SFDC. They were caused by the data we had migrated into the system and by the lack of proper governance and a capable information architecture to support the required data management and integration between systems.

During the implementation phase, IT imported our entire customer database of 200K+ unique customer entities from the old system into SFDC. Unfortunately, the mortgage industry was very transient: on average there were only roughly 55K licensed mortgage brokers and lenders in the market, and because no one ever validated who was really a customer versus someone who had once bought our product, we had serious data quality issues (a rough detection sketch follows the list), including:

  • Trial users whose evaluation copies of our products had expired were tagged as current customers
  • Duplicate records, caused by manual data entry errors, in which companies with similar names entered slightly differently but sharing the same business address were tagged as unique customers
  • Subsidiaries of parent companies in different parts of the country were each tagged as unique customers
  • Lastly, the marketing contact database of prospects we imported was incorrectly counted as customers in the new system
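
For illustration only (and not how we did it at the time), a few lines of pandas could have flagged the first two categories before the migration. The file and the company_name, address, license_type, license_expiry, and status columns are hypothetical stand-ins for whatever your legacy extract actually contains.

```python
import pandas as pd

records = pd.read_csv("legacy_crm_extract.csv")                 # hypothetical migration extract
records["license_expiry"] = pd.to_datetime(records["license_expiry"])

# 1. Expired trial users still tagged as current customers.
expired_trials = records[
    (records["license_type"] == "trial")
    & (records["license_expiry"] < pd.Timestamp.today())
    & (records["status"] == "current_customer")
]

# 2. Likely duplicates: the same normalized business address on multiple records.
records["addr_key"] = (records["address"].str.lower()
                       .str.replace(r"\s+", " ", regex=True)
                       .str.strip())
dupes = records[records.duplicated(subset=["addr_key"], keep=False)]

print(f"Expired trials tagged as customers: {len(expired_trials)}")
print(f"Records sharing a business address (review for duplicates): {len(dupes)}")
```

Checks this simple would not have caught everything, but they would have shrunk the 200K figure to something closer to reality before anyone hit the import button.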

We also failed to integrate real-time purchasing data and information from our procurement systems so that sales and support could handle customer requests. Instead of integrating that data in real time with proper technology, IT manually loaded these records at the end of each week via FTP, resulting in incorrect billing information, flawed statement processing, and a ton of complaints from customers through our call center. The price we paid for not paying attention to our data quality and integration requirements before we rolled out Salesforce.com was significant for a company of our size. For example:

  • Marketing got hit pretty hard. Each quarter we mailed evaluation copies of new products to our customer database of 200K, each costing the company $12 to produce and mail. Total cost = $2.4M annually. Because the data was so bad, 60% of our mailings came back due to invalid addresses or wrong contact information. The cost of bad data to marketing = $1.44M annually.
  • Next, Sales struggled miserably when trying to sell upgrades to customers through cold call campaigns using the names in the database. Sales productivity dropped by 40%, we experienced over 35% sales turnover that year, and within a year of using SFDC our head of sales was let go. Not good!
  • Customer support used SFDC to service customers, and our average call time was 40 minutes per service ticket. We believed that was “business as usual” until we surveyed how reps were spending their time each day; over 50% said it went to dealing with billing issues caused by bad contact information in the CRM system.

At the end of our conversation, this was my advice to my friend:

  • Conduct a data quality audit of the systems that will interact with the CRM system. Audit how complete your critical master and reference data is, including names, addresses, customer IDs, etc. (a minimal audit sketch follows this list).
  • Do this before you invest in a new CRM system. You may find that many of the challenges with your existing applications are caused by data gaps rather than by the legacy application itself.
  • If your company has a data governance program, involve that team in the CRM initiative so they understand your requirements and can see how they can help.
  • If you do decide to modernize, collaborate with and involve your IT teams, especially your Application Development teams and Enterprise Architects, to ensure all of the best options are considered for your data sharing and migration needs.
  • Lastly, consult with your technology partners, including your new CRM vendor; they may be working with solution providers to help address these data issues, as you are probably not the only one in this situation.
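
For the first bullet, a completeness audit can start as small as the hedged pandas sketch below. The export file and the list of critical fields are assumptions; substitute whatever your master and reference data actually calls them.

```python
import pandas as pd

CRITICAL_FIELDS = ["customer_id", "name", "billing_address", "phone", "email"]  # assumed field names

master = pd.read_csv("crm_customer_master.csv")   # hypothetical export from the legacy CRM

audit = pd.DataFrame({
    "populated_pct": (master[CRITICAL_FIELDS].notna().mean() * 100).round(1),
    "distinct_values": master[CRITICAL_FIELDS].nunique(),
})
audit["needs_review"] = audit["populated_pct"] < 95   # arbitrary completeness threshold

print(audit.sort_values("populated_pct"))
```

A table like this, produced per source system, gives the governance team and the CRM project a shared, numeric starting point rather than a gut feel about how bad the data is.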

Looking Ahead!

CRM systems have come a long way in today’s Big Data and Cloud era. Many firms are adopting more flexible solutions offered through the cloud, such as Salesforce.com and Microsoft Dynamics. Regardless of how old or new, on premises or in the cloud, companies invest in CRM not just to serve their sales teams or increase marketing conversion rates, but to improve their business relationships with their customers. Period! It’s about ensuring the data in these systems is trustworthy, complete, up to date, and actionable, so you can improve customer service and drive sales of new products and services to increase wallet share. So how do you maximize the business potential of these critical applications?

Whether you are adopting your first CRM solution or upgrading an existing one, keep in mind that Customer Relationship Management is a business strategy, not just a software purchase. It’s also about having a sound and capable data management and governance strategy supported by people, processes, and technology to ensure you can:

  • Access and migrate data from old systems to new while avoiding development cost overruns and project delays.
  • Identify, detect, and distribute transactional and reference data from existing systems into your front-line business applications in real time.
  • Manage data quality errors, including duplicate records and invalid names and contact information, through proper data governance and proactive data quality monitoring and measurement during and after deployment.
  • Govern and share authoritative master records of customer, contact, product, and other master data between systems in a trusted manner.

Will your data be ready for your new CRM investments?  To learn more:

Follow me on Twitter @DataisGR8


Keep the Ring, Get Me an iPad: Emotional Vs Rational Marketing

According to historical trends, the holidays that just passed weren’t the only thing to celebrate. As we moved from December into 2015, how many of you saw a lot more engagement announcements on Facebook, or even got engaged yourself?

December is the most popular month to get engaged (according to wedding website TheKnot.com), so many of us are likely gearing up for the typical spring and summer calendar full of weekend weddings.

While December is not known as a big month for weddings, it is a big time for jewelers, including the months leading up to it. Diamonds, gold, and other fine jewelry become very popular purchases at this time.

Fine jewelry is an emotional buying decision, which you can see from the jewelry store commercials that evoke sentiment for our loved ones.

But that emotional pull to purchase diamonds, gold and precious stones could be changing significantly.

Diamond sales are down this year – but what is up? Technology-related gifts, including smart phones, tablets, and other functional devices. To understand why, all you have to do is think about the ages of people getting engaged: 18-34 year-olds.

People in that age range who are getting engaged right now just aren’t drawn in by the emotional purchase of fine jewelry anymore, if they ever were. They value technology purchases.

But it’s not just function over form. The emotional motivation behind a purchase (whether technology or fine jewelry or any high-dollar item) is always there.

“Status for this generation isn’t about money — it’s about attention,” said psychology professor Kit Yarrow in a recent Pacific Standard magazine article. Therefore, a smart phone is considered a better gift (and better use for the money) than fine jewelry, since it allows you to share your life and stay connected much more than a gold and diamond ring can do.

As the article notes, using technology to create “an everlasting Facebook album from that scuba diving trip in Bali says so much more than one lone photo of a pave diamond necklace.”

WHAT FUELS YOUR BUSINESS DECISIONS?

The average decision process for a consumer making a purchase is estimated at 80% emotional and 20% rational, according to an annual customer loyalty report from Brand Keys.

It’s interesting to think that the car in your garage, or the shoes on your feet, could have ultimately been something you felt you wanted (80%), and then justified the need for later (20%). Brands, especially in the luxury market, depend on this ratio.

This realization brings us to your business planning as we begin 2015. What guides your business decisions as a data-fueled marketer: emotions, or rationale?

How do brands make decisions about how to operate, which customers to market to, where to locate stores, which marketing campaigns to run, and many more strategic plans? The answer needs to sit much more in the “rational” category – but how do you get there as a data-fueled marketer?

As consumers, we are emotional creatures without even realizing it. That can be a habit we bring to other things in our lives as well, including decisions at work.

Since your customers still have emotional reasons for making a purchase or using a service, the only thing that should be emotional is your messaging to your customers, not your planning. Creating customer profiles and making decisions from them should never be solely a ‘gut feeling’ or based only on your professional instincts.

At the same time, we all know that in our work, over time we develop good instincts about what we do. We learn to trust our sense of what will work or won’t work in the market, or in the supply chain, or within product development – whatever it is you do. You can never ignore that, because no one can completely predict the future with total accuracy. You have to trust your experience and knowledge to lead you.

Turn the 80/20 ratio on its head, and instead focus 20% on emotional thinking and 80% on rational thinking. Make your brand’s business decisions and planning based on good data.

Who are your customers? Where do they live? What do they do and what are their preferences? Basing the answers to these questions only on what has worked in the past, or what you think your customers should want, will only lead to bad business decisions.

The first step, however, is to know that your customer data is valid and complete. Gartner estimates that 40% of failed business initiatives are due to bad data. Validate, correct, and enrich your customer data before you use it. Then as a truly data-fueled marketer, you can use the 20/80 ratio properly and steer your brand to a great 2015.

For more about data quality best practices, check out this white paper written for marketers that goes beyond the basics.


Data as a Service will ensure that 2015 will be known as “The Year of the Customer!”

Not so long ago, customers were simply faceless names and transactions understood through disjointed sales data and potentially inaccurate contact information.

Over the past few years, we’ve seen companies across industries make remarkable business transformations to become customer-centric organizations. These companies understand that customers are no longer loyal to brands or products alone. Instead, they’re loyal to companies who provide the optimal, most personalized customer experiences.

By understanding more about their customers, their interests, and their interaction preferences, organizations can ultimately encourage increased sales and usage of their products and services.

As we begin 2015 and predict what the next trends will be, I believe that this year will finally be the year that customer centricity becomes the norm – and effective management of data will play the most critical role to date in getting companies to reach their customer centricity goals.

But it won’t necessarily happen overnight. So how should companies get started with this effort?

“A requirement behind customer centricity is the ability to understand customers at a fairly granular level and to be able to identify the customers or the segments of customers who are valuable from the ones who aren’t,” writes Peter Fader (Co-Director of the Wharton Customer Analytics Initiative at the University of Pennsylvania). “If you can’t sort out your customers — if you can’t look at them and know who is good and who is bad — then you can’t be customer centric. That’s step one.”

More and more companies are working through strategies for what Peter Fader describes as step one. They understand their data, and explore ways to utilize this information to gain valuable insights. For example, consider the advancements that Citrix achieved (read more in this case study). By better understanding their customer data, they saw a 20% improvement in lead conversion.

The organizations that have a better understanding of their customers are leading the way by utilizing technology to ensure data accuracy. If their contact data (address, email, and phone) is correct, then they can effectively reach that customer without fail. If their contact data is poor, connecting with customers becomes impossible and can ultimately impact their ability to compete.

Companies like BCBG understand this and are utilizing data quality services to reach up to 15% more customers (read more in this case study).

As companies continue to understand their customer data, they’ll look to fill in the gaps. Sometimes, these gaps are obvious. If a customer’s contact profile has a hole in it – for example a missing phone number – it becomes clear that the hole must be filled.

Utilizing Data as a Service enrichment and validation capabilities, organizations have the opportunity to clean up missing data without wasting a high value customer interaction to ask for their phone number. Instead, they can spend their time selling to this customer.

In addition to filling the contact profile gaps, Data as a Service subscription data is also a great way to expand the view of the customer and learn more about them. Companies can enrich their customer profiles with demographic information or industry data to round out their customer profiles, further supporting their customer-centricity goals.
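
Mechanically, an enrichment call is simple to sketch. The endpoint, payload, and enrich_contact helper below are purely hypothetical stand-ins for whatever Data as a Service API you subscribe to; they are not a real product interface.

```python
import requests

ENRICHMENT_URL = "https://api.example-daas.com/v1/contacts/enrich"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                                            # placeholder credential

def enrich_contact(record: dict) -> dict:
    """Fill gaps (e.g., a missing phone number) and append demographic attributes."""
    missing = [f for f in ("phone", "email", "postal_address") if not record.get(f)]
    if not missing:
        return record  # nothing to fill; skip the paid lookup

    response = requests.post(
        ENRICHMENT_URL,
        json={"email": record.get("email"), "name": record.get("name"),
              "fields": missing + ["demographics"]},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    response.raise_for_status()
    enriched = response.json()
    record.update({k: v for k, v in enriched.items() if v})  # only overwrite with non-empty values
    return record

# Usage: complete a profile with a missing phone number without bothering the customer.
customer = {"name": "Jane Doe", "email": "jane@example.com", "phone": None}
print(enrich_contact(customer))
```

The point of the sketch is the workflow, not the API: gaps are detected and filled behind the scenes, so the sales conversation never has to start with “can I confirm your phone number?”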

In 2015, we will see companies utilizing their customer data to form a deeper connection and ultimately increase sales. The habit of “speaking at” customers will give way to true engagement. If customers are the lifeblood of an organization, then, in 2015, we’ll see more and more companies leveraging Data as a Service to increase customer loyalty — and ultimately fuel business growth.

 


8 Information Quality Predictions for 2015 And Beyond

Andy Hayler of Information Difference wrote in October last year that it’s been 10 years since the master data management (MDM) industry emerged. Andy sees MDM technology maturing and project success rates rising. He concluded that MDM has moved past its infancy and has a promising future as it is approaching its teenage years.

The last few months have allowed me to see MDM, data quality, and data governance from a completely different perspective. I sat with other leaders here at Informatica and with analysts who focus on information quality, and spent time talking to partners who work closely with customers on data management initiatives. As we collectively attempted to peer into the crystal ball and forecast what will be hot – and what will not – this year and beyond for MDM and data quality, here are a few top predictions that stood out.

1. MDM will become a single platform for all master entities
“The classical notion of boundaries that existed where we would say, this is MDM versus this is not MDM is going to get blurred,” says Dennis Moore – SVP, Information Quality Solutions (IQS), Informatica. “Today, we master a fairly small number of attributes in MDM. Rather than only mastering core attributes, we need to master business level entities, like customer, product, location, assets, things, etc., and combine all relevant attributes into a single platform which can be used to develop new “data fueled” applications. This platform will allow mastering of data, aggregate data from other sources, and also syndicate that data out into other systems.”

Traditionally MDM was an invisible hub that was connected to all the spokes. Instead, Dennis says – “MDM will become more visible and will act as an application development platform.”

2. PIM is becoming a more integrated environment that covers all product information and related data in a single place
More and more customers want a single interface that allows them to manage all product information. Along with managing a product’s length, width, height, color, cost, and so on, they probably want to see data about its history, credit rating, previous quality ratings, sustainability scorecard, returns, credits, and more. Dennis says, “All the product information in one place helps make better decisions with embedded analytics, giving answers to questions such as:

  • What were my sales last week?
  • Which promotions are performing well and poorly?
  • Which suppliers are not delivering on their SLAs?
  • Which stores aren’t selling according to plan?
  • How are the products performing in specific markets?”
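
To make the “embedded analytics” idea concrete, a couple of those questions reduce to simple aggregations once product, sales, and supplier data sit in one place. The pandas sketch below is a toy illustration with hypothetical files and column names, not a PIM feature.

```python
import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["order_date"])        # assumed: product_id, order_date, revenue
deliveries = pd.read_csv("deliveries.csv",
                         parse_dates=["promised", "delivered"])     # assumed: supplier_id, promised, delivered

# "What were my sales last week?"
cutoff = pd.Timestamp.today().normalize() - pd.Timedelta(days=7)
last_week = sales[sales["order_date"] >= cutoff]
print(last_week.groupby("product_id")["revenue"].sum().sort_values(ascending=False).head())

# "Which suppliers are not delivering on their SLAs?"
deliveries["late"] = deliveries["delivered"] > deliveries["promised"]
late_rate = deliveries.groupby("supplier_id")["late"].mean()
print(late_rate[late_rate > 0.05].sort_values(ascending=False))     # arbitrary 5% late-delivery threshold
```

The value in the prediction is that answers like these appear next to the product record itself, rather than in a separate reporting silo.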

Essentially, PIM will become the sovereign supplier of the product data that goes into your catalog and ecommerce systems, used by merchandisers, buyers, and product and category managers. It will become the buyer’s guide and the desktop for the person whose job is to figure out how to effectively promote products to meet sales targets.

3. MDM will become an integral part of big data analytics projects
“Big data analytics suffers from the same challenges as traditional data warehouses – bad data quality produces sub-optimal intelligence. MDM has traditionally enabled better analysis and reporting with high quality master data. Big data analytics will also immensely benefit from MDM’s most trustworthy information,” said Ravi Shankar, VP of Product Marketing, MDM, Informatica.

Naveen Sharma, who heads the Enterprise Data Management practice at Cognizant, reemphasized what I heard from Dennis. He says, “With big data and information quality coming together, some of the boundaries between a pure MDM system and a pure analytical system will start to soften.” Naveen explains, “MDM is now seen as an integral part of big data analytics projects, and that’s a huge change from a couple of years ago. Two of the large retailers we work with are going down the path of trying to bring not only the customer dimension but the associated transactional data into an extended MDM platform to derive meaning. I see this trend continuing in 2015 and beyond with other verticals as well.”

4. Business requirements are driving ready-to-use MDM solutions
There are several business problems being solved by MDM, such as improving supplier spend management and collaboration with better supplier data. Supply chain, sourcing and procurement teams gain significant cost savings and a boost in productivity by mastering supplier, raw materials and product information and fueling their business and analytical applications with that clean, consistent and connected information. Jakki Geiger, Senior Director of IQS Solutions Marketing at Informatica says, “Business users want more than just the underlying infrastructure to manage business-critical data about suppliers, raw materials, and products. They want to access this information directly through a business-friendly user interface. They want a business process-driven workflow to manage the full supplier lifecycle, including: supplier registration, qualification, verification, onboarding and off-boarding. Instead of IT building these business-user focused solutions on top of an MDM foundation, vendors are starting to build ready-to-use MDM solutions like the Total Supplier Relationship solution.” Read more about Valspar’s raw materials spend management use case.

5. Increased adoption of matching and linking capabilities on Hadoop 
“Many of our customers have significantly increased the amount of data they want to master,” says Dennis Moore. The days when tens of millions of master records seemed like a lot are long gone; hundreds of millions of master records and billions of source records are becoming almost common. An increasing number of master data sources, internal and external to the organization, are contributing significantly to the rise in data volumes. To accommodate these increasing volumes, Dennis predicts that large enterprises will look at running complex matching and linking capabilities on Hadoop, a cost-effective and flexible way to analyze large amounts of data.
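
As a rough illustration of what matching and linking at that scale can look like, here is a minimal PySpark sketch. It assumes a source table with hypothetical record_id, name, address, and zip columns, and it blocks candidate pairs on a phonetic name key plus postal code before comparing them, the usual trick for keeping the number of comparisons tractable. It is a sketch of the general technique, not Informatica's implementation.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("match-link-sketch").getOrCreate()

records = spark.read.parquet("s3://bucket/master_source_records/")   # hypothetical path and schema

# Blocking: only compare records that share a phonetic name key and ZIP code.
keyed = records.withColumn("block_key", F.concat_ws("|", F.soundex("name"), F.col("zip")))

left, right = keyed.alias("l"), keyed.alias("r")
pairs = (left.join(right, on="block_key")
             .where(F.col("l.record_id") < F.col("r.record_id")))    # skip self-pairs and mirror pairs

# Simple similarity rule: small edit distance on name and identical address line.
matches = pairs.where(
    (F.levenshtein(F.col("l.name"), F.col("r.name")) <= 3)
    & (F.col("l.address") == F.col("r.address"))
).select(F.col("l.record_id").alias("record_id_a"),
         F.col("r.record_id").alias("record_id_b"))

matches.write.mode("overwrite").parquet("s3://bucket/candidate_links/")
```

Because the expensive pairwise comparison only happens inside each block, the same pattern scales from millions to billions of source records by adding nodes rather than rewriting logic.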

6. Master insight management is going to be the next big step
“MDM will evolve into master insight management as organizations try to relate trusted data they created in MDM with transactional and social interaction data,” said Rob Karel, VP of Product Strategy and Product Marketing, IQS, Informatica. “The innovations in machine and deep learning techniques will help organizations: healthcare providers can prescribe the next best treatment based on patient history, retailers can suggest the best offers based on customer interest and behavior, the public sector will see big steps in social services, and so on.”

Rob sees MDM at the heart of this innovation bringing together relevant information about multiple master entities and acting as a core system for insight management innovations.

7. MDM and Data Governance
Aaron Zornes, chief research officer at the MDM Institute, predicts that in 2014-15 vendor MDM solutions will move from “passive-aggressive” mode to “proactive” data governance mode. According to Aaron, data governance for MDM will move beyond simple stewardship to a convergence of task management, workflow, policy management, and enforcement.

8. The market will solidify for cloud based MDM adoption
Aaron says, “Cloud-innate services for DQ and DG will be more prevalent; however, enterprise MDM will remain on premise with increasing integration to cloud applications in 2015.”

Naveen sees a lot of synergy around cloud-based MDM offerings and says, “The market is solidifying for MDM on cloud, but the flood gates are yet to open.” Naveen does not see any reason why the MDM market will not move to the cloud, and gives the example of CRM, which was at a similar junction before Salesforce came into play. Naveen sees a similar shift for MDM and says, “The fears companies have about their data security in the cloud are eventually going to fade. If you look closely at any of the recent breaches, they all involved hacks into company networks, not into cloud provider networks. The fact that cloud service providers spend more on data security than any one company can spend on its on-premise security layer will be a major factor in the transition.” Naveen expects the big players in MDM to include cloud offerings as part of their toolkits in the coming years.

Ravi also predicts an increase in cloud adoption for MDM as concerns about placing master data in the cloud diminish, given the security now provided by cloud vendors.

So, what do you predict? I would love to hear your opinions and comments.

~Prash
@MDMGeek
www.mdmgeek.com


The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

Once upon a time, there were three Information Architects working in the financial services industry, each at a different firm and from a different background, but all responsible for recommending the right technology solutions to help their firms comply with industry regulations, including ongoing bank stress testing across the globe. Since 2008, bank regulators have focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew up through the ranks starting as a Database Administrator, a black belt in SQL and COBOL programming. Hand coding had been their DNA for many years and was thought of as the best approach, given how customized their business and systems were compared with other organizations. As such, Architect #1 and their team went down the path of building their data management capabilities through custom hand-coded scripts and manual data extractions and transformations, leaving data quality issues for the business organizations to deal with after the data was delivered. Though their approach delivered on short-term needs, the firm came to realize the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a “gadget guy” at heart who grew up using off-the-shelf tools rather than hand coding to manage data. He and his team decided not to hand code their data management processes and instead built their solution from best-of-breed tools, some open source, others from existing solutions the company had acquired in previous projects for data integration, data quality, and metadata management. Though the tools automated much of the “heavy lifting,” he and his IT team were still responsible for integrating these point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers, but he understood the value of building something once to use across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools, he decided to adopt an integrated platform designed to handle the complexities, sources, and volumes of data required by the business. The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a “no brainer.”

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn’t to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the results of the stress test, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported as part of the audit. Unfortunately, because the first architect had recommended hand coding their data management processes, IT failed to provide explanations and documentation of what had been done; the developers who created the systems were no longer with the firm. As a result, the bank failed miserably, incurring stiff penalties and higher audit costs.

Architect #2’s bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools that had been adopted were never designed nor guaranteed by the vendors to work with one another, resulting in invalid data mappings, flawed data quality rules, and gaps in their technical metadata documentation. As a result, bank #2 also failed its audit and found itself with a pile of one-off tools that automated its data management processes but lacked the integration and sharing of rules and metadata needed to satisfy the regulator’s demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3’s firm. Having seen the results at the first two banks, Mr. Wolf was leery of their ability to pass the stress test audit. He presented similar demands; this time, however, Bank #3 provided detailed and comprehensive metadata documentation of its risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse the data, and detailed information on each counterparty and legal entity used to calculate VaR. Unable to find gaps in the audit, Mr. Wolf, who had expected to “blow” the house down, delivered a passing grade to Bank #3 and its management team, thanks to the right investments they had made to support their enterprise risk data management needs.

The moral of this story, like that of the familiar tale of the three little pigs, is the importance of having a solid foundation to weather market and regulatory storms, or the violent bellow of a big bad wolf: a foundation that covers the required data integration, data quality, master data management, and metadata management needs, but also supports collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today’s financial services industry requires firms to have a solid data management platform, one that is intelligent, comprehensive, and allows Information Architects to mitigate the risks and costs of hand coding or using point tools that only get by in the short term.

Are you prepared to meet Mr. Wolf?


Achieving Great Data in the Oil and Gas Industry

Have you noticed something different this winter season that most people are cheery about? I’ll give you a hint: it’s not the great sales going on at your local shopping mall, but something that makes getting to the mall a lot more affordable than last year. It’s the extremely low gas prices across the globe, fueled by an oversupply of oil relative to demand, driven by geopolitics and the boom in shale oil production in North America and abroad. Like any other commodity, it’s impossible to predict where oil prices are headed. One thing is sure, however: Oil and Gas companies will need timely, quality data as they invest in new technologies to become more agile, innovative, efficient, and competitive, as reported in a recent IDC Energy Insights Predictions report for 2015.

The report predicts:

  1. 80% of the top O&G companies will reengineer processes and systems to optimize logistics, hedge risk and efficiently and safely deliver crude, LNG, and refined products by the end of 2017.
  2. Over the next 3 years, 40% of O&G majors and all software divisions of oilfield services (OFS) will co-innovate on domain specific technical projects with IT professional service firms.
  3. The CEO will expect immediate and accurate information about top Shale Plays to be available by the end of 2015 to improve asset value by 30%.
  4. By 2016, 70% of O&G companies will have invested in programs to evolve the IT environment to a third-platform-driven architecture to support agility and readily adapt to change.
  5. With continued labor shortages and over 1/3 of the O&G workforce under 45 in three years, O&G companies will turn to IT to meet productivity goals.
  6. By the end of 2017, 100% of the top 25 O&G companies will apply modeling and simulation tools and services to optimize oil field development programs and 25% will require these tools.
  7. Spending on connectivity related technologies will increase by 30% between 2014 and 2016, as O&G companies demand vendors provide the right balance of connectivity for a more complex set of data sources.
  8. In 2015, mergers, acquisitions and divestitures, plus new integrated capabilities, will drive 40% of O&G companies to re-evaluate their current deployments of ERP and hydrocarbon accounting.
  9. With a business case built on predictive analytics and optimization in drilling, production and asset integrity, 50% of O&G companies will have advanced analytics capabilities in place by 2016.
  10. With pressures on capital efficiency, by 2015, 25% of the Top 25 O&G companies will apply integrated planning and information to large capital projects, speeding up delivery and reducing over-budget risks by 30%.

Realizing value from these investments will also require Oil and Gas firms to modernize and improve their data management infrastructure and technologies to deliver great data, whether fueling actionable insights from Big Data technology or facilitating post-merger application consolidation and integration activities. Great data is only achievable through great design, supported by capable solutions built to access and deliver timely, trusted, and secure data to those who need it most.

A lack of proper data management investments and competencies has long plagued the oil and gas sector with less-than-acceptable data and higher operating costs. According to the “Upstream Data and Information Management Survey” conducted by Wipro Technologies, 56% of those surveyed felt that business users spent a quarter or more of their time on low-value activities caused by existing data issues (e.g., accessing, cleansing, and preparing data) rather than on high-value activities (e.g., analysis, planning, and decision making). The same survey showed the biggest data management issues were timely access to required data and data quality issues in source systems.

So what can Oil and Gas CIOs and Enterprise Architects do to prepare for the future? Here are some tips for consideration:

  • Migrate and automate legacy hand-coded data transformation processes by adopting tools that streamline the development, testing, deployment, and maintenance of these complex tasks, letting developers build, maintain, and monitor data transformation rules once and deploy them across the enterprise (a tiny sketch of the build-once, reuse-everywhere idea follows this list).
  • Simplify how data is distributed across systems with more modern architectures and solutions, and avoid the cost and complexity of point-to-point integrations.
  • Manage data quality upstream at the source and throughout the data life cycle, rather than having end users fix unforeseen data quality errors manually.
  • Create a centralized source of shared business reference and master data that can maintain a consistent record across heterogeneous systems, covering well and asset/material information (wellhead, field, pump, valve, etc.), employee data (drill/reservoir engineers, technicians), location data (often geo-spatial), and accounting data (for financial roll-ups of cost and production data).
  • Establish standards and repeatable best practices by adopting an Integration Competency Center framework to support the integration and sharing of data between operational and analytical systems.
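
As referenced in the first tip, here is a tiny Python sketch of the build-once, reuse-everywhere idea: one shared transformation rule applied to every feed that carries the field, instead of being re-coded inside each extraction script. The well-ID format, the zero-padding width, and the source files are all hypothetical.

```python
import re
import pandas as pd

def standardize_well_id(raw) -> str:
    """Single shared rule: keep digits only and zero-pad to a fixed width (10 is an arbitrary choice)."""
    digits = re.sub(r"\D", "", str(raw or ""))
    return digits.zfill(10) if digits else ""

SOURCES = ["production_daily.csv", "drilling_events.csv", "maintenance_orders.csv"]  # hypothetical feeds

for path in SOURCES:
    df = pd.read_csv(path)
    df["well_id"] = df["well_id"].map(standardize_well_id)   # the same rule, everywhere the field appears
    df.to_csv(path.replace(".csv", "_standardized.csv"), index=False)
```

Dedicated tooling does this declaratively and at much larger scale, but the principle is the same: define the rule once, monitor it once, and let every consuming system inherit the fix.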

In summary, low oil prices have a direct and positive impact on consumers, especially during the winter season and holidays, and I personally hope they continue for the foreseeable future given that prices were double just a year ago. No one can predict future energy prices; one thing is for sure, however: the demand for great data at Oil and Gas companies will continue to grow. As such, CIOs and Enterprise Architects will need to recognize the importance of improving their data management capabilities and technologies to ensure success in 2015. How ready are you?

Click to learn more about Informatica in today’s Energy Sector:


Time to Celebrate! Informatica is Once Again Positioned as a Leader in Gartner’s Magic Quadrant for Data Quality Tools!

It’s holiday season once again at Informatica and this one feels particularly special because we just received an early present from Gartner: Informatica has just been positioned as a leader in Gartner’s Magic Quadrant for Data Quality Tools report for 2014! Click here to download the full report.

Gartner’s Magic Quadrant for Data Quality Tools, 2014

And as it turns out, this is a gift that keeps on giving. For eight years in a row, Informatica has been ranked as a leader in Gartner’s Magic Quadrant for Data Quality Tools. In fact, for the past two years running, Informatica has been positioned highest and best for ability to execute and completeness of vision, the two dimensions Gartner measures in the report. These results once again validate our operational excellence as well as our prescience with our data quality product offerings. Yes folks, some days it’s hard to be humble.

Consistency and leadership are becoming hallmarks for Informatica in these and other analyst reports, and it’s hardly an accident. Those milestones are the result of our deep understanding of the market, continued innovation in product design, seamless execution on sales and marketing, and relentless dedication to customer success. Our customer loyalty has never been stronger with those essential elements in place. However, while celebrating our achievements, we are equally excited about the success our customers have achieved using our data quality products.

Managing and producing quality data is indispensable in today’s data-centric world. Gaining access to clean, trusted information should be one of a company’s most important tasks, and has previously been shown to be directly linked to growth and continued innovation.

We are truly living in a digital world – a world revolving around the Internet, gadgets and apps – all of which generate data, and lots of it.  Should your organization take advantage of its increasing masses of data? You bet. But remember: only clean, trusted data has real value.  Informatica’s mission is to help you excel by turning your data into valuable information assets that you can put to good use.

To see for yourself what the industry leading data quality tool can do, click here.

And from all of our team at Informatica, Happy holidays to you and yours.


Take These Steps to Avoid Wasting Your Marketing Technology Budget

This year, the irresistible pull of digital marketing met an unstoppable force: Girl Scout cookies. It’s an $800 million-a-year fundraiser that is only expected to increase with a newly announced addition of digital sales.

The New York Times reports that beginning this month and continuing into January, for the first time, the Girl Scouts of America will be able to sell Thin Mints and other favorites online through invite-only websites. The websites will be accompanied by a mobile app, giving customers new digital options.

As the Girl Scouts update from a door-to-door approach to include a newly introduced digital program, it’s just one more sign of where marketing trends are heading.

From digital cookies to digital marketing technology:

If 2014 is the year of the digital cookie, then 2015 will be the year of marketing technology. Here are just a few of the strongest indicators:

  • A study found that 67% of marketing departments plan to increase spending on technology over the next two years, according to the Harvard Business Review.
  • Gartner predicts that by 2017, CMOs will outspend CIOs on IT-related expenses.
  • Also by 2017, one-third of the total marketing budget will be dedicated to digital marketing, according to survey results from Teradata.
  • A new LinkedIn/Salesforce survey found that 56% of marketers see their relationships with the CIO as very important or critical.
  • Social media is a mainstream channel for marketers, making technology for measuring and managing this channel of paramount importance. This is not just true of B2C companies. Of high level executive B2B buyers, 75% used social media to make purchasing decisions, according to a 2014 survey by market research firm IDC.

From social to analytics to email marketing, much of what marketers see in technology offerings is often labeled as “cloud-based.” While cloud technology has many features and benefits, what are we really saying when we talk about the cloud?

What the cloud means… to marketers.

Beginning around 2012, multitudes of businesses in many industries began attaching “the cloud” to their products or services as a feature or a benefit. Whether or not the business truly was cloud-based was not always clear, which led to the term “cloudwashing.” We hear so much about the cloud that it is easy to overlook what it really means and what the benefits really are.

The cloud is more than a buzzword – and in particular, marketers need to know what it truly means to them.

For marketers, “the cloud” has many benefits. A service that is cloud-based gives you amazing flexibility and choices over the way you use a product or service:

  • A cloud-enabled product or service can be integrated into your existing systems. For marketers, this can range from integration into websites, marketing automation systems, CRMs, point-of-sale platforms, and any other business application.
  • You don’t have to learn a new system the way you might when adopting a new application, software package, or other enterprise system, so you won’t have to set aside a lot of time and effort for new training for you or your staff.
  • Due to the flexibility that lets you integrate anywhere, you can deploy a cloud-based product or service across all of your organization’s applications or processes, increasing efficiencies and ensuring that all of your employees have access to the same technology tools at the same time.
  • There’s no need to worry about ongoing system updates, as those happen automatically behind the scenes.

In 2015, marketers should embrace the convenience of cloud-based services, as they help put the focus on benefits instead of spending time managing the technology.

Are you using data quality in the cloud?

If you are planning to move data out of an on-premise application or software to a cloud-based service, you can take advantage of this ideal time to ensure these data quality best practices are in place.

Verify and cleanse your data first, before it is moved to the cloud. Since it’s likely that your move to the cloud will make this data available across your organization — within marketing, sales, customer service, and other departments — applying data quality best practices first will increase operational efficiency and bring down costs from invalid or unusable data.

There may be more to add to this list, depending on the nature of your own business. Make sure that:

  • Postal addresses are valid, accurate, current and complete
  • Email addresses are valid
  • Telephone numbers are valid, accurate, and current
  • All data fields are consistent and every individual data element is clearly defined, to increase the effectiveness of future data analysis
  • Missing data is filled in
  • Duplicate contact and customer records are removed

Once you have cleansed and verified your existing data and moved it to the cloud, use a real-time verification and cleansing solution at the point of entry or point of collection to ensure good data quality across your organization on an ongoing basis.
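
A hedged sketch of what that batch cleanse plus point-of-entry check might look like in plain Python is below. The validation rules are deliberately naive placeholders (a real solution would rely on postal, email, and phone reference data rather than regular expressions), and the file and column names are assumptions.

```python
import re
import pandas as pd

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")   # intentionally naive format check

def normalize_phone(raw) -> str:
    """Keep digits only; anything that is not a plausible 10-digit number comes back empty."""
    digits = re.sub(r"\D", "", str(raw or ""))
    return digits if len(digits) == 10 else ""

# Batch cleanse before the migration.
contacts = pd.read_csv("crm_contacts_export.csv")                      # hypothetical export
contacts["email_valid"] = contacts["email"].map(lambda e: bool(EMAIL_RE.match(str(e or ""))))
contacts["phone"] = contacts["phone"].map(normalize_phone)
contacts = contacts.drop_duplicates(subset=["email"], keep="first")    # crude duplicate removal

# Point-of-entry check, reusing the same rules for every new record.
def validate_record(record: dict) -> list:
    errors = []
    if not EMAIL_RE.match(record.get("email") or ""):
        errors.append("invalid email")
    if not normalize_phone(record.get("phone")):
        errors.append("invalid phone")
    return errors
```

Running the same rules in batch and at the point of entry is what keeps the cleansed data from drifting back to its old state after the migration.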

The biggest roadblock to effective marketing technology is: Bad data.

Budgeting for marketing technology is going to become a bigger and bigger piece of the pie (or cookie, if you prefer) for B2C and B2B organizations alike. The first step all marketers need to take to make sure those investments fully pay off and don’t go to waste is great customer data.

Marketing technology is fueled by data. A recent Harvard Business Review article listed some of the most important marketing technologies. They included tools for analytics, conversion, email, search engine marketing, remarketing, mobile, and marketing automation.

What do they all have in common? These tools all drive customer communication, engagement, and relationships, all of which require valid and actionable customer data to work at all.

You can’t plan your marketing strategy off of data that tells you the wrong things about who your customers are, how they prefer to be contacted, and what messages work the best. Make data quality a major part of your 2015 marketing technology planning to get the most from your investment.

Marketing technology is going to be big in 2015 — where do you start?

With all of this in mind, how can marketers prepare for their technology needs in 2015? Get started with this free virtual conference from MarketingProfs that is totally focused on marketing technology.

This great event includes a keynote from Teradata’s CMO, Lisa Arthur, on “Using Data to Build Strong Marketing Strategies.” Register here for the December 12 Marketing Technology Virtual Conference from MarketingProfs.

Even if you can’t make it live that day at the virtual conference, it’s still smart to sign up so you receive on-demand recordings from the sessions when the event ends. Register now!


When Data Integration Saves Lives

In an article published in Health Informatics, its author, Gabriel Perna, claims that data integration could save lives, as we learn more about illnesses and causal relationships.

According to the article, in Hamilton County, Ohio, it’s not unusual to see kids from the same neighborhoods coming to the hospital for asthma attacks. Researchers wanted to know whether it was fact or mistaken perception that an unusually high number of children in the same neighborhood were experiencing asthma attacks. The next step was to review existing data to determine the extent of the issue, and perhaps how to solve the problem altogether.

“The researchers studied 4,355 children between the ages of 1 and 16 who visited the emergency department or were hospitalized for asthma at Cincinnati Children’s between January 2009 and December 2012. They tracked those kids for 12 months to see if they returned to the ED or were readmitted for asthma.”

Not only were the researchers able to determine a sound correlation between the two data sets, but they were able to advance the research to predict which kids were at high risk based on where they live. Thus, some of the causes and effects have been determined.

This came about when researchers began thinking outside the box about traditional and non-traditional medical data. They integrated housing and census data with the data from the diagnosis and treatment of the patients. These are data sets unlikely to find their way to each other, but together they carry meaning that is much more valuable than if they had stayed in their respective silos.
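
A stripped-down illustration of that kind of join, using hypothetical files keyed by census tract (the study's actual data and methods are described in the article), might look like this:

```python
import pandas as pd

visits = pd.read_csv("asthma_ed_visits.csv")    # assumed: patient_id, census_tract, readmitted_within_12mo (0/1)
housing = pd.read_csv("census_housing.csv")     # assumed: census_tract, median_income, pct_substandard_housing

# Readmission rate per neighborhood, joined to housing and income indicators.
by_tract = (visits.groupby("census_tract")["readmitted_within_12mo"]
                  .mean()
                  .rename("readmit_rate")
                  .reset_index())
combined = by_tract.merge(housing, on="census_tract", how="left")

# Neighborhoods to investigate first: high readmission rates plus poor housing indicators.
high_risk = combined[(combined["readmit_rate"] > 0.2) & (combined["pct_substandard_housing"] > 0.3)]
print(high_risk.sort_values("readmit_rate", ascending=False))         # thresholds here are arbitrary
```

The join itself is trivial; the hard parts, as the article suggests, are getting the data sets out of their silos and agreeing on a common key such as geography.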

“Non-traditional medical data integration has begun to take place in some medical collaborative environments already. The New York-Presbyterian Regional Health Collaborative created a medical village, which ‘goes beyond the established patient-centered medical home mode.’ It not only connects an academic medical center with a large ambulatory network, medical homes, and other providers with each other, but community resources such as school-based clinics and specialty-care centers (the ones that are a part of NYP’s network).”

The fact of the matter is that data is the key to understanding what the heck is going on when clusters of sick people begin to emerge. While researchers and doctors can treat individual patients, there is often not a good understanding of the larger issues that may be at play: in this case, poor air quality in poor neighborhoods. With that understanding, they know what problem needs to be corrected.

The universal sharing of data is really the larger solution here, but one that won’t be approached without a common understanding of the value, and funding.  As we pass laws around the administration of health care, as well as how data is to be handled, perhaps it’s time we look at what the data actually means.  This requires a massive deployment of data integration technology, and the fundamental push to share data with a central data repository, as well as with health care providers.
