Category Archives: Operational Efficiency

The Data-Driven CMO: A Q&A with Glenn Gow (CEO of Crimson Marketing)

Q&A with Crimson Marketing

I recently had the opportunity to have a very interesting discussion with Glenn Gow, the CEO of Crimson Marketing.  I was impressed by how smart and engaging he is, and by the tremendous insight he has into the marketing discipline.  He consults with over 150 CMOs every year, and has a solid understanding of the pains they are facing, the opportunities in front of them, and the approaches the best-of-the-best are taking to reach new levels of success.

I asked Glenn if he would be willing to do a Q&A in order to share some of his insight.  I hope you find his perspective as interesting as I did!


______________________________________________

Q: What do you believe is the single biggest advantage that marketers have today?

A: Being able to use data in marketing is absolutely your single biggest competitive advantage as a marketer.  And therefore your biggest challenge is capturing, leveraging and rationalizing that data.  The marketers we speak with tend to fall into two buckets.

  1. Those who understand that the way they manage data is critical to their marketing success.  These marketers use data to inform their decisions, and then rely on it to measure their effectiveness.
  2. Those who haven’t yet discovered that data is the key to their success. Often these people start with systems in mind – marketing automation, CRM, etc.  But after implementing and beginning to use these systems, they almost always come to the realization that they have a data problem.

______________________________________________

Q:  How has this world of unprecedented data sources and volumes changed the marketing discipline?

A:  In short… dramatically.  The shift has really happened in the last two years. The big impetus for this change has really been the availability of data.  You’ve probably heard this figure, but Google’s Eric Schmidt likes to say that every two days now, we create as much information as we did from the dawn of civilization until 2003.

We believe this is a massive opportunity for marketers.  The question is, how do we leverage this data?  How do we pull out the golden nuggets that will help us do our jobs better?  Marketers now have access to information they’ve never had access to, or even contemplated, before.  This gives them the ability to become more effective marketers. And by the way… they have to!  Customers expect them to!

Take ad re-targeting, for example.  Customers expect to be shown ads that are relevant to them, and if marketers don’t successfully do this, they can actually damage their brand.

In addition, competitors are taking full advantage of data, and are getting better every day at winning the hearts and minds of their customers – so marketers need to act before their competitors do.

Marketers have a tremendous opportunity – rich data is available, and the technology to harness it is now available, so they can win a war they never could before.

______________________________________________

Q:  What barriers are they up against in harnessing this data?

A:  I’d say that barriers can really be broken down into four main buckets: existing architecture, skill sets, relationships, and governance.

  • Existing Architecture: The way that data has historically been collected and stored doesn’t have the CMO’s needs in mind.  The CMO has an abundance of data theoretically at their fingertips, but they cannot do what they want with it.  The CMO needs to insist on, and work together with the CIO to build, an overarching data strategy that meets their needs – both today and tomorrow, because the marketing profession and tool sets are rapidly changing.  That means the CMO and their team need to step into a conversation they’ve never had before with the CIO and his/her team.  And it’s not about systems integration; it’s about data integration.
  • Existing Skill Sets:  The average marketer today is a right-brained individual.  They entered the profession because they are naturally gifted at branding, communications, and outbound perspectives.  And that requirement doesn’t go away – it’s still important.  But today’s marketer now needs to grow their left-brained skills, so they can take advantage of inbound information, marketing technologies, data, etc.  It’s hard to ask a right-brained person to suddenly be effective at managing this data.  The CMO needs to fill this skillset gap primarily by bringing in people who understand it, but they cannot ignore it themselves.  The CMO needs to understand how to manage a team of data scientists and operations people to dig through and analyze this data.  Some CMOs have actually learned to love data analysis themselves (in fact, your CMO at Informatica, Marge Breya, is one of them).
  • Existing Relationships:  In a data-driven marketing world, relationships with the CIO become paramount.  They have historically determined what data is collected, where it is stored, what it is connected to, and how it is managed.  Today’s CMO isn’t just going to the CIO with a simple task, such as asking them to build a new dashboard.  They have to collectively work together to build a data strategy that will work for the organization as a whole.  And marketing is the “new kid on the block” in this discussion – the CIO has been working with finance, manufacturing, etc. for years, so it takes some time (and great data points!) to build that kind of cohesive relationship.  But most CIOs understand that it’s important, if for no other reason than that they see budgets increasingly shifting to marketing and the rest of the Lines of Business.
  • Governance:  Who is ultimately responsible for the data that lives within an organization?  It’s not an easy question to answer.  And since marketing is a relatively new entrant into the data discussion, there are often a lot of questions left to answer. If marketing wants access to the customer data, what are we going to let them do with it? Read it?  Append to it?  How quickly does this happen? Who needs to author or approve changes to a data flow?  Who manages opt-ins/opt-outs and regulatory blacklists?  And how does that impact our responsibility as an organization?  This is a new set of conversations for the CMO – but they’re absolutely critical.

______________________________________________

Q:  Are the CMOs you speak with concerned with measuring marketing success?

A:  Absolutely.  CMOs are feeling tremendous pressure from the CEO to quantify their results.  A recent Duke University study of CMOs asked whether they were feeling pressure from the CEO or board to justify what they’re doing.  64% of respondents said they do feel this pressure, and 63% said this pressure is increasing.

CMOs cannot ignore this.  They need to have access to the right data that they can trust to track the effectiveness of their organizations.  They need to quantitatively demonstrate the impact that their activities have had on corporate revenue – not just ROI or Marketing Qualified Leads.  They need to track data points all the way through the sales cycle to close and revenue, and to show their actual impact on what the CEO really cares about.

______________________________________________

Q:  Do you think marketers who adopt marketing automation products without a solid handle on their data first are getting solid results?

A:  That is a tricky one.  Ideally, yes, they’d have their data in great shape before undertaking a marketing automation project.  The vast majority of companies that have implemented the various marketing technology tools have encountered dramatic data quality issues, often coming to light during the process of implementing their systems.  So data quality and data integration are the ideal first step.

But the truth is, solving a company’s data problem isn’t a simple, straightforward challenge.  It takes time, and it’s not always obvious how to solve the problem.  Marketers need to be part of this conversation.  They need to drive how they’re going to be managing data moving forward.  And they need to involve people who understand data well, whether they be internal (typically in IT) or external (consulting companies like Crimson, and technology providers like Informatica).

So the reality for a CMO is that it has to be a parallel path.  CMOs need to get involved in ensuring that data is managed in a way they can use effectively as marketers, but in the meantime, they cannot stop doing their day-to-day jobs.  So, sure, they may not be getting the most out of their investment in marketing automation, but it’s the beginning of a process that will see tremendous returns over the long term.

______________________________________________

Q:  Is anybody really getting it “right” yet?

A:  This is the best part… yes!  We are starting to see more and more forward-thinking organizations really harnessing their data for competitive advantage, and using technology in very smart ways to tie it all together and make sense of it.  In fact, we are in the process of writing a book entitled “Moneyball for Marketing” that features eleven companies whose marketing strategies and execution plans we feel are leading their industries.

______________________________________________

So readers, what do you think?  Who do you think is getting it “right” by leveraging their data with smart technology and truly getting meaningful and impactful results?


3 Barriers to Delivering Omnichannel Experiences

 

This blog post initially appeared on CMSwire.com and is reblogged here with their consent.


I was recently searching for fishing rods for my 5-year-old son and his friends to use at our neighborhood pond. I know nothing about fishing, so I needed to get educated. First up, a Google search on my laptop at home. Then, I jostled between my phone, tablet and laptop, visiting websites, reading descriptions, looking at photos and reading reviews. Offline, I talked to friends and visited local stores.

The product descriptions weren’t very helpful. What is a “practice casting plug”? Turns out, this was a great feature! Instead of a hook, the rod had a rubber fish to practice casting safely. What a missed opportunity for the retailers who didn’t share this information. I bought the fishing rods from the retailer that educated me with valuable product information and offered free three to five day shipping.

What does this mean for companies who sell products across multiple channels?

Virtually everyone is a cross-channel shopper: 95 percent of consumers frequently or at least occasionally shop a retailer’s website and store, according to the “Omni-Channel Insights” study by CFI Group. In the report, “The Omnichannel Opportunity: Unlocking the Power of the Connected Customer,” Deloitte predicts more than 50 percent of in-store purchases will be influenced digitally by the end of 2014.

Because of all this cross-channel activity, a new term is trending: omnichannel.

What Does Omnichannel Mean?

Let’s take a look back in time. Retailers started with one channel — the brick-and-mortar store. Then they introduced the catalog and call center. Then they built another channel — e-Commerce. Instead of making it an extension of the brick-and-mortar experience, many implemented an independent strategy, including operations, resources, technology and inventory. Retailers recently started integrating brick-and-mortar and e-Commerce channels, but it’s not always consistent. And now they are building another channel — mobile sites and apps.

Multichannel is a retailer-centric, transaction-focused view of operations. Each channel operates and aims to boost sales independently. Omnichannel is a customer-centric view. The goal is to understand through which channels customers want to engage at each stage of the shopping journey and enable a seamless, integrated and consistent brand experience across channels and devices.

Shoppers expect an omnichannel experience, but delivering it efficiently isn’t easy. Those responsible for enabling an omnichannel experience are encountering barriers. Let’s look at the three barriers most relevant for marketing, merchandising, sales, customer experience and information management leaders.

Barrier #1: Shift from product-centric to customer-centric view

Many retailers focus on how many products are sold by channel. Three key questions are:

  1. How can we drive store sales growth?
  2. How can we drive online sales growth?
  3. What’s our mobile strategy?

This is the old way of running a retail business. The new way is analyzing customer data to understand how customers are engaging and transacting across channels.

Why is this difficult? At the Argyle eCommerce Leadership Forum, Jason Allen, Vice President of Multichannel at GameStop Corp., shared the $8.8 billion video game retailer’s approach to overcoming this barrier. While online represented 3 percent of sales, no one had measured how much the online channel was influencing the overall business.

They started by collecting customer data for analytics to find out who their customers were and how they interacted with GameStop online and in 6,600 stores across 15 countries. The analysis revealed customers used multiple channels: 60 percent engaged on the web, and 26 percent of web visitors who didn’t buy online bought in-store within 48 hours.

This insight changed the perception of the online channel as a small contributor. Now they use two metrics to measure performance. While the online channel delivers 3 percent of sales, it influences 22 percent of overall business.
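
To make the influence metric concrete, here is a minimal sketch of one way to credit the web channel for in-store purchases that follow a web visit within 48 hours. The data, field layout and attribution rule are illustrative assumptions, not GameStop’s actual methodology:

```python
from datetime import datetime, timedelta

# Hypothetical event logs: web visits and in-store purchases per customer.
web_visits = [
    ("c1", datetime(2014, 7, 1, 20, 0)),
    ("c2", datetime(2014, 7, 2, 9, 30)),
]
store_purchases = [
    ("c1", datetime(2014, 7, 2, 18, 0), 59.99),  # bought in-store ~22h after a web visit
    ("c3", datetime(2014, 7, 3, 12, 0), 19.99),  # no prior web visit
]

WINDOW = timedelta(hours=48)

def web_influenced(purchases, visits, window=WINDOW):
    """Return purchases preceded by a web visit from the same customer within `window`."""
    by_customer = {}
    for cust, ts in visits:
        by_customer.setdefault(cust, []).append(ts)
    influenced = []
    for cust, ts, amount in purchases:
        if any(0 <= (ts - v).total_seconds() <= window.total_seconds()
               for v in by_customer.get(cust, [])):
            influenced.append((cust, ts, amount))
    return influenced

hits = web_influenced(store_purchases, web_visits)
total = sum(a for _, _, a in store_purchases)
print(f"Web-influenced share of store revenue: {sum(a for _, _, a in hits) / total:.0%}")
```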

Take Action: Start collecting customer data. Analyze it. Learn who your customers are. Find out how they engage and transact with your business across channels.

Barrier #2: Shift from fragmented customer data to centralized customer data everyone can use

Nikki Baird, Managing Partner at Retail Systems Research (RSR), told me she believes the fundamentals of retail are changing from “right product, right price, right place, right time” to:

  1. Who is my customer?
  2. What are they trying to accomplish?
  3. How can we help?

According to RSR, creating a consistent customer experience remains the most valued capability for retailers, but 54 percent indicated their biggest inhibitor was not having a single view of the customer across channels.

Why is this difficult? A $12 billion specialty retailer known for its relentless focus on customer experience, with 200 stores and an online channel, had to overcome this barrier. To deliver a high-touch omnichannel experience, they needed to replace the many views of the customer with one unified customer view. They invested in master data management (MDM) technology and competencies.


 

Now they bring together customer, employee and product data scattered across 30 applications (e.g., e-Commerce, POS, clienteling, customer service, order management) into a central location, where it’s managed and shared on an ongoing basis. Employees’ applications are fueled with clean, consistent and connected customer data. They are able to deliver a high-touch omnichannel experience because they can answer important questions about customers and their valuable relationships, such as:

  • Who is this customer and who’s in their household?
  • Who do they buy for, what do they buy, where do they buy?
  • Which employees do they typically buy from in store?

Take Action: Think of the valuable information customers share when they interact with different parts of your business. Tap into it by bridging customer information silos. Bring fragmented customer information together in one central location. Make it universally accessible. Don’t let it remain locked up in departmental applications. Keep it up-to-date. Automate the process of updating customer information across departmental applications.
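
As a toy illustration of that consolidation step, the sketch below merges customer records from two hypothetical departmental applications into one view keyed on a normalized email address. Real MDM platforms use far more sophisticated matching and survivorship rules:

```python
# A minimal sketch of bridging customer silos: consolidate records from
# several departmental applications into one central view. Field names and
# the merge rule ("last non-empty value wins") are illustrative assumptions.

def norm_email(email):
    return email.strip().lower()

def consolidate(*sources):
    golden = {}
    for source in sources:
        for rec in source:
            merged = golden.setdefault(norm_email(rec["email"]), {})
            for field, value in rec.items():
                if value:                 # last non-empty value wins
                    merged[field] = value
    return golden

ecommerce = [{"email": "Ann@Example.com", "name": "Ann Lee", "household": ""}]
pos       = [{"email": "ann@example.com", "name": "Ann Lee", "household": "Lee"}]

print(consolidate(ecommerce, pos))
# {'ann@example.com': {'email': 'ann@example.com', 'name': 'Ann Lee', 'household': 'Lee'}}
```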

Barrier #3: Shift from fragmented product data to centralized product data everyone can use

Two-thirds of purchase journeys start with a Google search. To have a fighting chance, retailers need rich, high-quality product information to rank higher than the competition.

Take a look at the image on the left. Would you buy this product? Probably not. One-third of shoppers who don’t make a purchase say it’s because they didn’t have enough information to make a purchase decision. What product information does a shopper need to convert in the moment? Rich, high-quality information has conversion power.

Consumers return about 40 percent of all fashion and 15 percent of electronics purchases. That’s not good for retailers or shoppers. Minimize costly returns with complete product information so shoppers can make more informed purchase decisions. Jason Allen’s advice is, “Focus less on the cart and check out. Focus more on search, product information and your store locator. Eighty percent of customers are coming to the web for research.”

Why is this difficult? Crestline is a multichannel direct marketing firm selling promotional products through direct mail and e-Commerce. The barrier to quickly bringing products to market and updating product information across channels was fragmented, complex product information. To replace the manual, time-consuming spreadsheet process used to manage product information, they invested in product information management (PIM) technology.


Now Crestline’s product introduction and update process is 300 percent more efficient. Because they are 100 percent current on top products and over 50 percent current for all products, the company is boosting margins and customer service.

Take Action: Think about all the product information shoppers need to research and make a decision. Tap into it by bridging product information silos. Bring fragmented product information together in one central location. Make it universally usable, not channel-specific. Keep it up-to-date. Automate the process of publishing product information across channels, including the applications used by customer service and store associates.

Key Takeaways

Delivering an omnichannel experience efficiently isn’t easy. The GameStop team collected and analyzed customer data to learn more about who their customers are and how they interact with the company. A specialty retailer centralized fragmented customer data. Crestline centralized product information to accelerate its ability to bring products to market and make updates across channels. Which of these barriers is holding you back from delivering an omnichannel experience?

Title image by Lars Plougmann (Flickr) via a CC BY-SA 2.0 license

 

 


One Search Procurement – For the Purchasing of Indirect Goods and Services


Informatica Procurement is the internal “Amazon” for purchasing MRO items, C-goods, and indirect materials and services. It supports enterprise companies with an industry-independent catalog procurement solution that enables fast, cost-efficient procurement of products and services, along with supplier integration, in an easy-to-use self-service concept.

Informatica Procurement at a glance


Informatica recently announced the availability of Informatica Procurement 7.3, the catalog procurement solution. I met with Melanie Kunz, our product manager, to learn from her what’s new.

Melanie, for our readers and followers, who is using Informatica Procurement, and for what purposes?


Melanie Kunz: Informatica Procurement is industry-independent. Our customers come from different industries – from engineering and automotive to the public sector (e.g., cities). The responsibilities of the people who work with Informatica Procurement differ by company. For some customers, only employees from the purchasing department order items in Informatica Procurement. For others, all employees are allowed to order what they need themselves – for example, employees who need screws to complete their product, or office staff who order business cards for a manager.

What is the most important thing to know about Informatica Procurement 7.3?

Melanie Kunz: In companies where a lot of IT equipment is ordered, it is important to always see current prices. With each price change, the catalog would have to be re-imported into Informatica Procurement. With a punch-out to the IT equipment manufacturer’s online shop, this is much easier and more efficient. The data from these catalogs is all available in Informatica Procurement, but the price can always be retrieved daily from the online shop.

Users no longer need to leave Informatica Procurement to order items from external online shops. Informatica Procurement now enables the user to locate internal and indexed external items in just one search. That means you do not have to use different eShops when you order new office stationery, IT equipment or services.

Great, what is the value for enterprise users and purchasing departments?

Melanie Kunz: All items in Informatica Procurement carry the negotiated prices. Informatica Procurement is so simple and intuitive that each employee can use the system without training. The view concept allows products to be restricted: for each employee (or department), the administrator can define a view that contains only the products that employee can see and order.

When you open the detail view for an indexed external item, the current price is retrieved from the external online shop. This price is saved in the item detail view for a defined period. In this way, the user always gets the current price for the item.
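
A minimal sketch of that cache-for-a-defined-period behavior, with a hypothetical fetch function and TTL standing in for the real punch-out call:

```python
import time

PRICE_TTL_SECONDS = 24 * 60 * 60   # e.g., refresh external prices daily (assumed value)
_price_cache = {}                  # item_id -> (price, fetched_at)

def fetch_price_from_shop(item_id):
    # Placeholder for the real punch-out request to the external online shop.
    return 42.00

def current_price(item_id):
    cached = _price_cache.get(item_id)
    if cached and time.time() - cached[1] < PRICE_TTL_SECONDS:
        return cached[0]                       # still fresh: reuse the cached price
    price = fetch_price_from_shop(item_id)     # stale or missing: re-fetch
    _price_cache[item_id] = (price, time.time())
    return price

print(current_price("monitor-24in"))  # fetched from the shop
print(current_price("monitor-24in"))  # served from the cache
```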

The newly designed detail view has an elegant, clear layout, ensuring a high level of user experience. This also applies to the option to enlarge images in the search result list.

What if I order same products frequently, like my business cards?

Melanie Kunz: The overview of recent shopping carts helps users reorder the same items in an easy and fast way. A shopping cart from a previous order can be used as the basis for a new order.

Large organizations with thousands of employees may have very different needs for their daily business, perhaps tied to their roles or career levels. How do you address this?

Melanie Kunz: The standard assortment feature has been enhanced in Informatica Procurement 7.3. Administrators can define the assortment per user. Furthermore, it is possible to specify whether users have to search the standard assortment first and only search the entire assortment if they do not find the relevant item there.

All of these features and many more minor features not only enhance the user experience, but also reduce the processing time of an order drastically.

Informatica Procurement 7.3 “One Search” at a glance


 

Learn more about Informatica Procurement 7.3 in the latest webinar.


How Much is Disconnected Well Data Costing Your Business?

“Not only do we underestimate the cost for projects up to 150%, but we overestimate the revenue it will generate.” This quotation from an Energy & Petroleum (E&P) company executive illustrates the negative impact of inaccurate, inconsistent and disconnected well data and asset data on revenue potential. 

“Operational Excellence” is a common goal of many E&P company executives pursuing higher growth targets. But inaccurate, inconsistent and disconnected well data and asset data may be holding them back. It obscures the complete picture of the well information lifecycle, making it difficult to maximize production efficiency, reduce Non-Productive Time (NPT), streamline the oilfield supply chain, calculate well-by-well profitability, and mitigate risk.

Well data expert Stephanie Wilkin shares details about the award-winning collaboration between Noah Consulting and Devon Energy.

To explain how E&P companies can better manage well data and asset data, we hosted a webinar, “Attention E&P Executives: Streamlining the Well Information Lifecycle.” Our well data experts, Stephanie Wilkin, Senior Principal Consultant at Noah Consulting, and Stephan Zoder, Director of Value Engineering at Informatica, shared some advice: E&P companies should reevaluate “throwing more bodies at a data cleanup project twice a year.” This approach does not support the pursuit of operational excellence.

In this interview, Stephanie shares details about the award-winning collaboration between Noah Consulting and Devon Energy to create a single trusted source of well data, which is standardized and mastered.

Q. Congratulations on winning the 2014 Innovation Award, Stephanie!
A. Thanks Jakki. It was really exciting working with Devon Energy. Together we put the technology and processes in place to manage and master well data in a central location and share it with downstream systems on an ongoing basis. We were proud to win the 2014 Innovation Award for Best Enterprise Data Platform.

Q. What was the business need for mastering well data?
A. As E&P companies grow, so do their needs for business-critical well data. All departments need clean, consistent and connected well data to fuel their applications. We implemented a master data management (MDM) solution for well data with the goals of improving information management, business productivity, organizational efficiency, and reporting.

Q. How long did it take to implement the MDM solution for well data?
A. The Devon Energy project kicked off in May of 2012. Within five months, we built the complete solution, from gathering business requirements through development and testing.

Q. What were the steps in implementing the MDM solution?
A: The first and most important step was securing buy-in on a common definition for master well data or Unique Well Identifier (UWI). The key was to create a definition that would meet the needs of various business functions. Then we built the well master, which would be consistent across various systems, such as G&G, Drilling, Production, Finance, etc. We used the Professional Petroleum Data Management Association (PPDM) data model and created more than 70 unique attributes for the well, including Lahee Class, Fluid Direction, Trajectory, Role and Business Interest.

As part of the original go-live, we had three source systems of well data and two target systems connected to the MDM solution. Over the course of the next year, we added three additional source systems and four additional target systems. We did a cross-system analysis to make sure every department has the right wells and the right data about those wells. Now the company uses MDM as the single trusted source of well data, which is standardized and mastered, to do analysis and build reports.

Q. What’s been the traditional approach for managing well data?
A. Typically, when a new well is created, employees spend time entering well data into their own systems. For example, one person enters well data into the G&G application. Another person enters the same well data into the Drilling application. A third person enters the same well data into the Finance application. By one estimate, it takes about 30 minutes to enter a single well into a particular financial application.

So imagine if you need to add 500 new wells to your systems. This is common after a merger or acquisition. That translates to roughly 250 hours or 6.25 weeks of employee time saved on the well create process! By automating across systems, you not only save time, you eliminate redundant data entry and possible errors in the process.
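
That estimate is easy to sanity-check. A minimal sketch in Python, assuming a 40-hour work week:

```python
# Back-of-the-envelope check of the numbers above (assumes a 40-hour work
# week; the 30-minutes-per-well figure is the estimate quoted in the interview).
wells = 500
minutes_per_entry = 30

hours = wells * minutes_per_entry / 60   # 250.0 hours of manual entry per system
weeks = hours / 40                       # 6.25 work weeks
print(f"{hours:.0f} hours ~ {weeks} weeks of manual well entry avoided")
```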

Q. That sounds like a painfully slow and error-prone process.
A. It is! But that’s only half the problem. Without a single trusted source of well data, how do you get a complete picture of your wells? When you compare the well data in the G&G system to the well data in the Drilling or Finance systems, it’s typically inconsistent and difficult to reconcile. This leads to the question, “Which one of these systems has the best version of the truth?” Employees spend too much time manually reconciling well data for reporting and decision-making.

Q. So there is a lot to be gained by better managing well data.
A. That’s right. The CFO typically loves the ROI on a master well data project. It’s a huge opportunity to save time and money, boost productivity and get more accurate reporting.

Q: What were some of the business requirements for the MDM solution?
A: We couldn’t build a solution that was narrowly focused on meeting the company’s needs today. We had to keep the future in mind. Our goal was to build a framework that was scalable and supportable as the company’s business environment changed. This allows the company to add additional data domains or attributes to the well data model at any time.

The Noah Consulting MDM Trust Framework was used to build a single trusted source of well data.

Q: Why did you choose Informatica MDM?
A: The decision to use Informatica MDM for the MDM Trust Framework came down to the following capabilities:

  • Match and Merge: With Informatica, we get a lot of flexibility. Some systems carry the API or well government ID, but some don’t. We can match and merge records differently based on the system.
  • X-References: We keep a cross-reference between all the systems. We can go back to the master well data and find out where that data came from and when. We can see where changes have occurred because Informatica MDM tracks the history and lineage.
  • Scalability: This was a key requirement. While we went live after only 5 months, we’ve been continually building out the well master based on the requirements of the target systems.
  • Flexibility: Down the road, if we want to add an additional facet or classification to the well master, the framework allows for that.
  • Simple Integration: Instead of building point-to-point integrations, we use the hub model.

In addition to Informatica MDM, our Noah Consulting MDM Trust Framework includes Informatica PowerCenter for data integration, Informatica Data Quality for data cleansing and Informatica Data Virtualization.
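
As a rough illustration of the source-specific match-and-merge and cross-referencing described above, here is a minimal Python sketch. The matching rules, field names and sample records are hypothetical simplifications, not how Informatica MDM is actually implemented:

```python
def match_rule(record):
    """Pick a match key per source: exact API number where the source
    carries it, otherwise a looser name+state composite."""
    if record.get("api_number"):              # e.g., a G&G or Drilling system
        return ("api", record["api_number"])
    name = record["well_name"].strip().lower()
    return ("fuzzy", name, record["state"])   # e.g., a Finance system without API IDs

def merge(records):
    """Merge matched records into one master, keeping a cross-reference
    back to every contributing source system."""
    master, xref = {}, []
    for rec in records:
        xref.append((rec["source"], rec["source_id"]))
        for field, value in rec.items():
            if field not in ("source", "source_id") and value:
                master.setdefault(field, value)   # first trusted value wins
    master["xref"] = xref
    return master

gg      = {"source": "G&G", "source_id": "gg-17", "api_number": "42-501-20098",
           "well_name": "Smith 1H", "state": "TX"}
finance = {"source": "Finance", "source_id": "fin-903", "api_number": "",
           "well_name": " SMITH 1H ", "state": "TX"}

assert match_rule(gg)[0] == "api" and match_rule(finance)[0] == "fuzzy"
print(merge([gg, finance])["xref"])   # [('G&G', 'gg-17'), ('Finance', 'fin-903')]
```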

Q: Can you give some examples of the business value gained by mastering well data?
A: One person said to me, “I’m so overwhelmed! We’ve never had one place to look at this well data before.” With MDM centrally managing master well data and fueling key business applications, many upstream processes can be optimized to achieve their full potential value.

People spend less time entering well data on the front end and reconciling well data on the back end. Well data is entered once and it’s automatically shared across all systems that need it. People can trust that it’s consistent across systems. Also, because the data across systems is now tied together, it provides business value they were unable to realize before, such as predictive analytics. 

Q. What’s next?
A. There’s a lot of insight that can be gained by understanding the relationships between the well, and the people, equipment and facilities associated with it. Next, we’re planning to add the operational hierarchy. For example, we’ll be able to identify which production engineer, reservoir engineer and foreman are working on a particular well.

We’ve also started gathering business requirements for equipment and facilities to be tied to each well. There’s a lot more business value on the horizon as the company streamlines their well information lifecycle and the valuable relationships around the well.

If you missed the webinar, you can watch the replay now: Attention E&P Executives: Streamlining the Well Information Lifecycle.

FacebookTwitterLinkedInEmailPrintShare
Posted in Business Impact / Benefits, Data Integration, Data Quality, Enterprise Data Management, Master Data Management, Operational Efficiency, PowerCenter, Utilities & Energy | Tagged , , , , , , , | Leave a comment

Marketing in a Data-Driven World… From Mad Men to Mad Scientist

Are Marketers More Mad Men or Mad Scientists?

I have been in marketing for over two decades. As I meet people in social situations, on airplanes, and on the sidelines at children’s soccer games, and they ask what it is I do, I get responses that constantly amuse me and lead me to the conclusion that the general public has absolutely no idea what a marketer does. I am often asked things like “have you created any commercials that I might have seen?” and peppered with questions that evoke visions of Mad Men-esque 1960s-style agency work and late-night, martini-filled creative pitch sessions.

I admit I do love to catch the occasional Mad Men episode, and a few weeks ago, I stumbled upon one that had me chuckling. You may remember the one where Don Draper is pitching a lipstick advertisement and, after persuading the executive to see things his way, says something along the lines of, “We’ll never know, will we? It’s not a science.”

How the times have changed. I would argue that in today’s data-driven world, marketing is no longer an art and is now squarely a science.

Sure, great marketers still understand their buyers at a gut level, but their hunches are no longer the impetus of a marketing campaign. Their hunches are now the impetus for a data-driven, fact-finding mission, and only after the analysis has been completed and confirms or contradicts this hunch, is the campaign designed and launched.

This is only possible because today, marketers have access to enormous amounts of data – not just the basic demographics of years past. Most marketers realize that there is great promise in all of that data, but it’s just too complicated, time-consuming, and costly to truly harness it. How can you really ever make sense of the hundreds of data sources and tens of thousands of variables within these sources? Social media, web analytics, geo-targeting, internal customer and financial systems, in house marketing automation systems, third party data augmentation in the cloud… the list goes on and on!

How can marketers harness the right data, in the right way, right away? The answer starts with making the commitment that your marketing team – and hopefully your organization as a whole – will think “data first”. In the coming weeks, I will focus on what exactly thinking data first means, and how it will pay dividends to marketers.

In the meantime, I will make the personal commitment to be more patient about answering the silly questions and comments about marketers.

Now, it’s your turn to comment… 

What are some of the most amusing misconceptions about marketers that you’ve encountered?

- and -

Do you agree? Is marketing an art? A science? Or somewhere in between?



Health Plans, Create Competitive Differentiation with Risk Adjustment

Exploring Risk Adjustment as a Source of Competitive Differentiation

Risk adjustment is a hot topic in healthcare. Today, I interviewed my colleague, Noreen Hurley, to learn more. Noreen, tell us about your experience with risk adjustment.

Before I joined Informatica, I worked for a health plan in Boston. I managed several programs, including the CMS Five-Star Quality Rating System and Risk Adjustment Redesign. We recognized the need for a robust diagnostic profile of our members in support of risk adjustment. However, because the information resides in multiple sources, gathering and connecting the data presented many challenges. I see an opportunity for health plans to transform risk adjustment.

As risk adjustment becomes an integral component of healthcare, I encourage health plans to create a core competency around the development of diagnostic profiles. This should be the case for health plans and ACOs. This profile is the source of reimbursement for an individual. It is also the basis for clinical care management. Augmented with social and demographic data, the profile can create a roadmap for successfully engaging each member.

Why is risk adjustment important?

Risk Adjustment is increasingly entrenched in the healthcare ecosystem.  Originating in Medicare Advantage, it is now applicable to other areas.  Risk adjustment is mission critical to protect financial viability and identify a clinical baseline for  members.

What are a few examples of the increasing importance of risk adjustment?

  1. The Centers for Medicare and Medicaid Services (CMS) continues to increase its focus on risk adjustment. CMS is evaluating the value provided to the Federal government and beneficiaries. It has questioned the efficacy of home assessments and challenged health plans to provide a value statement beyond the harvesting of diagnosis codes that results solely in revenue enhancement. Illustrating additional value has been a challenge; integrating data across the health plan will help address it and derive value.

  2. Marketplace members will also require risk adjustment calculations. After the first three years, the three “R’s” will dwindle down to one “R”: when Reinsurance and Risk Corridors end, we will be left with Risk Adjustment. To succeed with this new population, health plans need a clear strategy to obtain, analyze and process data. CMS processing delays make risk adjustment even more difficult. A health plan’s ability to manage this information will be critical to success.

  3. Dual Eligibles, Medicaid members and ACOs also rely on risk adjustment for profitability and improved quality.

With an enhanced diagnostic profile — one that is accurate, complete and shared — I believe it is possible to enhance care, deliver appropriate reimbursements and provide coordinated care.

How can payers better enable risk adjustment?

  • Facilitate timely analysis of accurate data from a variety of sources, in any  format.
  • Integrate and reconcile data from initial receipt through adjudication and  submission.
  • Deliver clean and normalized data to business users.
  • Provide an aggregated view of master data about members, providers and the relationships between them to reveal insights and enable a differentiated level of service.
  • Apply natural language processing to capture insights otherwise trapped in text-based notes.

With clean, safe and connected data,  health plans can profile members and identify undocumented diagnoses. With this data, health plans will also be able to create reports identifying providers who would benefit from additional training and support (about coding accuracy and completeness).

What will clean, safe and connected data allow?

  • Allow risk adjustment to become a core competency and source of differentiation.  Revenue impacts are expanding to lines of business representing larger and increasingly complex populations.
  • Educate, motivate and engage providers with accurate reporting.  Obtaining and acting on diagnostic data is best done when the member/patient is meeting with the caregiver.  Clear and trusted feedback to physicians will contribute to a strong partnership.
  • Improve patient care, reduce medical cost, increase quality ratings and engage members.

Application Retirement: Old Applications, and Their Place In The Sun

What springs to mind when you think about old applications? What happens to them when they have outlived their usefulness? Do they finally get to retire and have their day in the sun, or do they tenaciously hang on to life?

Think for a moment about your situation and of those around you. From the time you started working, you have been encouraged – and sometimes forced – to think about, plan for and fund your own retirement. Now consider the portfolio your organization has built up over the years: hundreds or maybe thousands of apps, spread across numerous platforms and locations – a mix of home-grown applications built with best-in-breed tools and others acquired from the leading application vendors.

Evaluating Your Current Situation

  • Do you know how many of those “legacy” systems are still running?
  • Do you know how much these apps are costing?
  • Is there a plan to retire them?
  • How is the execution tracking to plan?

Truth is, even if you have a plan, it probably isn’t going well.

Providing better citizen service at a lower cost

This is something every state and local organization aspires to do. Many organizations are spending 75% or more of their budgets just keeping the lights on – maintaining existing applications and infrastructure. Being able to fully retire some, or many, of these applications saves significant money. Do you know how much these applications are costing your organization? Don’t forget to include the whole range of costs that applications incur – including physical infrastructure costs such as mainframes, networks and storage, as well as the required software licenses and, of course, the time of the people who actually keep them running. What happens when those with COBOL and CICS experience retire? Usually the answer is not good news. There is a lot to consider and many benefits to be gained through an effective application retirement strategy.

An August 2011 report by ESG Global shows that some 68% of organizations had six or more legacy applications running, and that 50% planned to retire at least one of those over the following 12-18 months. It would be interesting to see today’s situation and to evaluate how successful these application retirement plans have been.

A common problem is knowing where to start. You know there are applications that you should be able to retire, but planning, building and executing an effective and successful plan can be tough. To help this process, we have developed a strategy, framework and solution for effective and efficient application retirement. This is a good starting point on your application retirement journey.

To get a speedy overview, take six minutes to watch this video on application retirement.

We have created a community specifically for application managers in our ‘Potential At Work’ site. If you haven’t already signed up, take a moment and join this group of like-minded individuals from across the globe.


Becoming a Revenue Driven Business Model through Data is Painful for Government Agencies

Recently, I presented a Business Value Assessment to a client. The findings were based on a revenue-generating state government agency. Everyone at the presentation was stunned to find out how much money was left on the table by not basing their activities on transactions that could be cleanly tied to the participating citizenry and a variety of channel partners. Over $38 million in annual benefits went unrealized, including partially recovered lost revenue, cost avoidance and cost reduction. A higher data impact in this revenue-driven business model could have prevented this.

Should government leaders go to the “School of Data” to understand where more revenue can be created without necessary tax hikes? (Source: creativecommons.org)

Given the total revenue volume, this may seem small. However, after factoring in the little technology effort required to “collect and connect” data from existing transactions, it is actually extremely high.

The real challenge for this organization will be the required policy transformation to turn the organization from “data-starved” to “data-intensive”. Strategic decisions around new products, locations and customers would no longer rely on surveys, which suffer from sampling errors, biases, etc. Additionally, surveys are often delayed, making them practically ineffective in the real-time world we live in today.

Despite no applicable legal restrictions, the leadership’s main concern was that gathering more data would erode the public’s trust and positive image of the organization.

To be clear: by “more” data being collected by this type of government agency, I mean literally 10% of what any commercial retail entity has gathered on all of us for decades. This is not the next NSA revelation, as any conspiracy theorist may fear.

While I respect their culturally driven self-censorship despite the absence of legal barricades, it raises concerns among their stakeholders (the state’s citizenry) over the agency’s performance. To be clear, there would be no additional revenue for the state’s programs without more citizen data. You may believe that they already know everything about you, including your income, property value, tax information, etc. However, inter-departmental sharing of criminally-non-relevant information is legally constrained.

Another interesting finding from this evaluation was that they had no sense of conversion rate from email and social media campaigns. Impressions from click-throughs as well as hard/soft bounces were more important than tracking who actually generated revenue.

This is a very market-driven organization compared to other agencies. It actually does try to measure itself like a commercial enterprise and attempts to change in order to generate additional revenue for state programs benefiting the citizenry. I can only imagine what non-revenue-generating agencies (local, state or federal) do in this respect.  Is revenue-oriented thinking something the DoD, DoJ or Social Security should subscribe to?

Think tanks and political pundits are now looking at the trade-off between bringing democracy to every backyard on our globe and its long-term, budget ramifications. The DoD is looking to reduce the active component to its lowest in decades given the U.S. federal debt level.

Putting the data bits and pieces together for revenue

A recent article in HBR explains that cost cutting has never sustained an organization’s growth over a longer period of time, but new revenue sources have. Is your company or government agency only looking at cost and personnel productivity?

Disclaimer:

Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

 


When It Comes to Data Integration Skills, Big Data and Cloud Projects Need the Most Expertise

Looking for a data integration expert? Join the club. As cloud computing and big data become more desirable within the Global 2000, an abundance of data integration talent is required to make both cloud and big data work properly.

The fact of the matter is that you can’t deploy a cloud-based system without some sort of data integration as part of the solution. Either from on-premise to cloud, cloud-to-cloud, or even intra-company use of private clouds, these projects need someone who knows what they are doing when it comes to data integration.


While many cloud projects were launched without a clear understanding of the role of data integration, most people understand it now. As companies become more familiar with the cloud, they learn that data integration is key to the solution. For this reason, it’s important for teams to have at least some data integration talent.

The same goes for big data projects. Massive amounts of data need to be loaded into massive databases. You can’t do these projects using ad-hoc technologies anymore. The team needs someone with integration knowledge, including what technologies to bring to the project.

Generally speaking, big data systems are built around data integration solutions. Similar to cloud, the use of data integration architectural expertise should be a core part of the project. I see big data projects succeed and fail, and the biggest cause of failure is the lack of data integration expertise.

The demand for data integration talent has exploded with the growth of both big data and cloud computing. A week does not go by that I’m not asked for the names of people who have data integration, cloud computing and big data skills. I know several people who fit that bill; however, they all have jobs and recently got raises.

The scary thing is, if these jobs go unfilled by qualified personnel, project directors may hire individuals without the proper skills and experience. Or worse, they may not hire anyone at all. If they plod along without the expertise required, in a year they’ll wonder why the systems are not sharing data the way they should, resulting in a big failure.

So, what can organizations do? You can find or build the talent you need before starting important projects. Thus, now is the time to begin the planning process, including how to find and hire the right resources. This might even mean internal training, hiring mentors or outside consultants, or working with data integration technology providers. Do everything necessary to make sure you get data integration done right the first time.


Data: The Unsung Hero (or Villain) of every Communications Service Provider

The faceless hero of CSPs: Data

Analyzing current business trends helps illustrate how difficult and complex the Communications Service Provider (CSP) business environment has become. CSPs face many challenges. Clients expect high-quality, affordable content that can move between devices with minimal advertising or privacy concerns. To illustrate this phenomenon, here are a few recent examples:

  • Apple is working with Comcast/NBC Universal on a new converged offering
  • Vodafone purchased the Spanish cable operator, Ono, having to quickly separate the wireless customers from the cable ones and cross-sell existing products
  • Net neutrality has been scuttled in the US and upheld in the EU so now a US CSP can give preferential bandwidth to content providers, generating higher margins
  • Microsoft’s Xbox community collects terabytes of data every day making effective use, storage and disposal based on local data retention regulation a challenge
  • Expensive 4G LTE infrastructure investment by operators such as Reliance is bringing streaming content to tens of millions of new consumers

To quickly capitalize on “new” (often old, but unknown) data sources, there has to be a common understanding of:

  • Where the data is
  • What state it is in
  • What it means
  • What volume and attributes are required to accommodate a one-off project vs. a recurring one

When a multitude of departments request data for analytical projects with their one-off, IT-unsanctioned on-premise or cloud applications, how will you go about it? The average European operator has between 400 and 1,500 (known) applications. Imagine what the unknown count is.

A European operator with 20-30 million subscribers incurs an average of $3 million per month due to unpaid invoices. This often results from incorrect or incomplete contact information. Imagine how much you would have to add for lost productivity efforts, including gathering, re-formatting, enriching, checking and sending  invoices. And this does not even account for late invoice payments or extended incorrect credit terms.

Think about all the wrong long-term conclusions that are being drawn from this wrong data. This single data problem creates indirect cost in excess of three times the initial, direct impact of unpaid invoices.
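
Using the figures in this post, the exposure compounds quickly. A quick back-of-the-envelope sketch (the 3x indirect multiplier is this post’s estimate, not a measured value):

```python
# Rough illustration of the claim above, using the figures in this post.
direct_monthly = 3_000_000              # unpaid invoices per month
indirect_monthly = 3 * direct_monthly   # productivity loss, rework, bad decisions

annual_total = 12 * (direct_monthly + indirect_monthly)
print(f"Annual exposure: ${annual_total:,}")   # Annual exposure: $144,000,000
```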

Want to fix your data and overcome the accelerating cost of change? Involve your marketing, CEM, strategy, finance and sales leaders to help them understand data’s impact on the bottom line.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.
