Category Archives: B2B Data Exchange

IDMP Field Notes: Compliance Trends in Q1 2015

With the European Medicines Agency (EMA) compliance date for IDMP (Identification of Medicinal Products) looming, Q1 2015 has seen a significant increase in IDMP activity. Both Informatica and HighPoint Solutions' IDMP round table in January and a Marcus Evans conference in Berlin in February provided excellent forums for sharing progress, thoughts and strategies. Additional confidential conversations with pharmaceutical companies show an increase in the number of approved and active projects, although some are still seeking full funding. The following paragraphs sum up the activity and trends that I have witnessed in the first three months of the year.

I’ll start with my favourite quote, which is from Dr. Jörg Stüben of Boehringer Ingelheim, who asked:

“Isn’t part of compliance being in control of your data?” 

I like it because it strikes just the right balance between stating the obvious and questioning the way the majority of pharmaceutical companies approach compliance: as a report that has to be created and submitted. If a company is in control of its data, regulatory compliance would be easier and come at a lower cost. More importantly, the company itself would benefit from easy access to high-quality data.

Dr. Stüben's question was raised during his excellent presentation at the Marcus Evans conference. Not only did he question the status quo, he also proposed an alternative route to IDMP compliance: let Boehringer benefit from its investment in compliance. His approach can be summarised as follows:

  • Embrace a holistic approach to being in control of data, i.e. adopt data governance practices.
  • This is not just about compliance. Include optional attributes that will deliver value to the organisation if correctly managed.
  • Get started by creating simple, clear work packages.

Although Dr. Stüben did not outline his technical solution, it would include data quality tools and a product data hub.

At the same conference, Stefan Fischer Rivera and Stefan Brügger of Bayer, and Guido Claes from Janssen Pharmaceuticals, came out strongly in favour of using a Master Data Management (MDM) approach to achieve compliance. Both companies have MDM technology and processes within their organisations, and recognise the value an MDM approach can bring to compliance in terms of data management and governance. Hearing Mr Claes express how well Informatica's MDM and Data Quality solutions support his existing substance data management program made his presentation even more enjoyable for me.

Whilst the exact approaches of Bayer and Janssen differed, there were some common themes:

  • Consider both the short term (compliance) and the long term (data governance) in the strategy
  • Centralised MDM is ideal, but a federated approach is practical for July 2016
  • High quality data should be available to a wide audience outside of IDMP compliance

The first and third bullet points map very closely to Dr. Stüben’s key points, and in fact show a clear trend in 2015:

IDMP Compliance is an opportunity to invest in your data management solutions and processes for the benefit of the entire organisation.

Although the EMA was not represented at the conference, Andrew Marr presented its approach to IDMP, and to master data in general. The EMA is undergoing a system re-organisation to focus on managing Substance, Product, Organisation and Reference data centrally, rather than within each regulation or program as it does today. MDM will play a key role in managing this data, setting a high standard of data control and management for regulatory purposes. It appears that the EMA, too, is using IDMP to introduce better data management practice.

Depending on the size of the company and the skills and tools available, other non-MDM approaches have been presented or discussed during the first part of 2015, including using XML and SharePoint to manage product data. However, I share a primary concern about this approach with others in the industry: how well can you manage and control change using these tools? Some pharmaceutical companies have openly stated that data contributors often spend more time looking for data than doing their own jobs. An XML/SharePoint approach will do little to ease this burden; an MDM approach will.

Despite the other approaches and solutions being discussed, there is another clear trend in Q1 2015:

MDM is becoming a favoured approach for IDMP compliance due to its strong governance, centralised attribute-level data management and ability to track changes.

Interestingly, the opportunity to invest in data management and the rise of MDM as a favoured approach have been backed up by research from Gens Associates. Messrs Gens and Brolund found a rapid increase during 2014 in investment in what they term Information Architecture, in which MDM plays a key role. IDMP is seen as a major driver for this investment. They go on to state that investment in master data management programs will allow a much easier and more cost-effective approach to data exchange (internally and externally), resulting in substantial benefits. Unfortunately they do not elaborate on these benefits, but I have placed a summary of the benefits of using MDM for IDMP compliance here.

In terms of active projects, the common compliance activities I have seen in the first quarter of 2015 are as follows:

  • Most companies are in the discovery phase: identifying the effort for compliance
  • Some are starting to make technology choices, and have submitted RFPs/RFQs
    • Those furthest along in technology already have MDM programs or initiatives underway
  • Despite getting a start, some are still lacking enough funding for achieving compliance
    • Output from the discovery phase will in some cases be used to request full funding
  • A significant number of projects have a goal to implement better data management practice throughout the company, with IDMP as the first release.

A final trend I have noticed in 2015 is regarding the magnitude of the compliance task ahead:

Those who have made the most progress are those who are most concerned about achieving compliance on time. 

The implication is that the companies who are starting late do not yet realise the magnitude of the task ahead. It is not yet too late to comply and achieve long-term benefits through better data management, despite there being only 15 months before the initial EMA deadline. Informatica has customers who have implemented MDM within six months. Fifteen months is achievable provided the project (or program) gets the focus and resources required.

IDMP compliance is a common challenge to all those in the pharmaceutical industry.  Learning from others will help avoid common mistakes and provide tips on important topics.  For example, how to secure funding and support from senior management is a common concern among those tasked with compliance.  In order to encourage learning and networking, Informatica and HighPoint Solutions will be hosting our third IDMP roundtable in London on May 13th.  Please do join us to share your experiences, and learn from the experiences of others.


The CMO's Role in Delivering Omnichannel Customer Experiences


This article was originally posted on Argyle CMO Journal and is re-posted here with permission.

According to a new global study from SDL, 90% of consumers expect a consistent customer experience across channels and devices when they interact with brands. However, according to these survey results, Gartner Survey Finds Importance of Customer Experience on the Rise — Marketing is on the Hook, fewer than half of the companies surveyed rank their customer experience as exceptional today. The good news is that two-thirds expect it to be exceptional in two years. In fact, 89% plan to compete primarily on the basis of the customer experience by 2016.

So, what role do CMOs play in delivering omnichannel customer experiences?

According to a recent report, Gartner’s Executive Summary for Leadership Accountability and Credibility within the C-Suite, a high percentage of CEOs expect CMOs to lead the integrated cross-functional customer experience. Also, customer experience is one of the top three areas of investment for CMOs in the next two years.

I had the pleasure of participating on a panel discussion at the Argyle CMO Forum in Dallas a few months ago. It focused on the emergence of omnichannel and the need to deliver seamless, integrated and consistent customer experiences across channels.

Lisa Zoellner, Chief Marketing Officer of Golfsmith International, was the dynamic moderator; she kept the conversation lively and the audience engaged. I was a panelist alongside Chris Berg, VP of Store Operations at The Home Depot; Chip Burgard, SVP of Marketing at CitiMortgage; and Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts.

Below are some highlights from the panel.

Lisa Zoellner, CMO, Golfsmith International opened the panel with a statistic. “Fifty-five percent of marketers surveyed feel they are playing catch up to customer expectations. But in that gap is a big opportunity.”

What is your definition of omnichannel?

There was consensus among the group that omnichannel is about seeing your business through the eyes of your customer and delivering seamless, integrated and consistent customer experiences across channels.

Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences. It’s one brand to the customer. But there is a gap between customer expectations and what most businesses can deliver today.

In fact, executives at most organizations I’ve spoken with, including the panelists, believe they are in the very beginning stages of their journey towards delivering omnichannel customer experiences. The majority are still struggling to get a single view of customers, products and inventory across channels.

“Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences.”

What are some of the core challenges standing in your way?

A key takeaway was that omnichannel requires organizations to fundamentally change how they do business. In particular, it requires changing existing business practices and processes. It cannot be done without cross-functional collaboration.

I think Chris Berg, VP, Store Operations at The Home Depot said it well, “One of the core challenges is the annual capital allocation cycle, which makes it difficult for organizations to be nimble. Most companies set strategies and commitments 12-24 months out and approach these strategies in silos. Marketing, operations, and merchandising teams typically ask for capital separately. Rarely does this process start with asking the question, ‘What is the core strategy we want to align ourselves around over the next 24 months?’ If you begin there and make a single capital allocation request to pursue that strategy, you remove one of the largest obstacles standing in the way.”

Chip Burgard, Senior Vice President of Marketing at CitiMortgage focused on two big barriers. “The first one is a systems barrier. I know a lot of companies struggle with this problem. We’re operating with a channel-centric rather than a customer-centric view. Now that we need to deliver omnichannel customer experiences, we realize we’re not as customer-centric as we thought we were. We need to understand what products our customers have across lines-of-business such as, credit cards, banking, investments and mortgage. But, our systems weren’t providing a total customer relationship view across products and channels. Now, we’re making progress on that. The second barrier is compensation. We have a commission-based sales force. How do you compensate the loan officers if a customer starts the transaction with the call center but completes it in the branch? That’s another issue we’re working on.”

Lisa Zoellner, CMO at Golfsmith International added, “I agree that compensation is a big barrier. Companies need to rethink their compensation plans. The sticky question is ‘Who gets credit for the sale?’ It’s easy to say that you’re channel-agnostic, but when someone’s paycheck is tied to the performance of a particular channel, it makes it difficult to drive that type of culture change.”

“We have a complicated business. More than 500 Hyatt hotels and resorts span multiple brands and regions,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “But, customers want a seamless experience no matter where they travel. They expect that the preference they shared during their Hyatt stay at a hotel in Singapore is understood by the person working at the next hotel in Dallas. So, we’re bridging those traditional silos all the way down to the hotel. A guest doesn’t care if the person they’re interacting with is from the building engineering department, from the food and beverage department, or the rooms department. It’s all part of the same customer experience. So we’re looking at how we share the information that’s important to guests to keep the customer the focus of our operations.”

“We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

How are companies powering great customer experiences with great customer data?

Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts, said, “We’re going through a transformation to unleash our colleagues to deliver great customer experiences at every stage of the guest journey. Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”

Hyatt connects the fragmented customer data from numerous applications including sales, marketing, ecommerce, customer service and finance. They bring the core customer profiles together into a single, trusted location, where they are continually managed. Now their customer profiles are clean, de-duplicated, enriched, and validated. They can see the members of a household as well as the connections between corporate hierarchies. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively and hotel teams can extend simple, meaningful gestures that drive guest loyalty.

When he first joined Hyatt, Chris did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay. Mostly just sleep now,” he joked. Those guest profiles have now been successfully consolidated.
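Deduplication like this is typically driven by fuzzy matching on identity fields. Here is a toy sketch of the idea in Python (a simplified illustration with made-up fields and thresholds, not Hyatt's or Informatica's actual matching logic):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def is_same_guest(p1: dict, p2: dict, threshold: float = 0.85) -> bool:
    # Toy rule: weigh name and email equally; real MDM engines use many more
    # fields (address, phone, loyalty ID) plus survivorship rules for merging.
    score = 0.5 * similarity(p1["name"], p2["name"]) \
          + 0.5 * similarity(p1["email"], p2["email"])
    return score >= threshold

profiles = [
    {"name": "Chris Brogan", "email": "cbrogan@example.com"},
    {"name": "Christopher Brogan", "email": "cbrogan@example.com"},
]
print(is_same_guest(profiles[0], profiles[1]))  # True -> candidates to merge
```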

This solid customer data foundation means Hyatt colleagues can more easily personalize a guest’s experience. For example, colleagues at the front desk are now able to use the limited check-in time to congratulate a new Diamond member on just achieving the highest loyalty program tier or offer a better room to those guests most likely to take them up on the offer and appreciate it.

According to Chris, “Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is not in order.”

How are you shifting from channel-centric to customer-centric?

Chip Burgard, SVP of Marketing at CitiMortgage answered, “In the beginning of our omnichannel journey, we were trying to allow customer choice through multi-channel. Our whole organization was designed around people managing different channels. But, we quickly realized that allowing separate experiences that a customer can choose from is not being customer-centric.

Now we have new sales leadership that understands the importance of delivering seamless, integrated and consistent customer experiences across channels. And they are changing incentives to drive that customer-centric behavior. We’re no longer holding people accountable specifically for activity in their channels. We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

Chris Berg, VP of Store Operations at The Home Depot, explained, “For us, it's about transitioning from a store-centric to a customer-centric approach. It's a cultural change. The managers of our 2,000 stores have traditionally been compensated based on their own store's performance. But we are one brand. For example, in the future a store may fulfill an order but, because of the geography of where the order originated, not receive credit for the sale. We're in the process of working through how to better reward that collaboration. Also, we're making investments in our systems so they support an omnichannel, or what we call interconnected, business. We have 40,000 products in store and over 1,000,000 products online. Now that we're on the interconnected journey, we're rethinking how we manage our product information so we can manage inventory across channels more effectively and efficiently.”

Summary

Omnichannel is all about shifting from channel-centric to customer-centric – much more customer-centric than you are today. Knowing who your customers are, and having a view of products and inventory across channels, are the basic requirements for delivering exceptional customer experiences across channels and touch points.

This is not a project. A business transformation is required to empower people to deliver omnichannel customer experiences. The executive team needs to drive it and align compensation and incentives around it. A collaborative cross-functional approach is needed to achieve it.

Omnichannel depends on customer-facing teams such as marketing, sales and call centers having access to a total customer relationship view based on clean, consistent and connected customer, product and inventory information. This is the basic foundation needed to deliver seamless, integrated and consistent customer experiences across channels and touch points, and to improve those teams' effectiveness.


Banks and the Art of the Possible: Disruptors are Re-shaping Banking


The problem many banks encounter today is that they have vast sums of investment tied up in old ways of doing things. Historically, customers chose a bank and remained ‘loyal’ throughout their lifetime; now competition is rife and loyalty is becoming a thing of the past. To stay ahead of the competition and to gain and keep customers, banks need to understand the ever-evolving market, disrupt norms and continue to delight customers. The tradition of staying with one bank out of family convention or ease has been replaced by a more informed customer who understands the variety of choice at their fingertips.

Challenger banks don't build on ideas of tradition and legacy, looking for adjustments they can make at the margins; they embrace change. Longer-established banks can't afford to do nothing and assume their size and stature will attract customers.

Here’s some useful information

Accenture's recent report, The Bank of Things, succinctly explains what ‘Customer 3.0’ is all about. The connected customer isn't necessarily younger; it's everybody. Banks can get to know their customers better by making better use of information. It all depends on using intelligent data rather than all data. Interrogating the wrong data is time-consuming and costly, and results in little actionable information.

When an organisation sets out with the intention of knowing its customers, it can calibrate its data according to where the gold nuggets – the real business insights – come from. What do people do most? Where do they go most? Now that they're using branches and phone banking less and less, what do they look for in a mobile app?

Customer 3.0 wants to know what the bank can offer them all the time, on the move, on their own device. They want offers designed for their lifestyle. Correctly deciphered data can drive the level of customer segmentation that empowers such marketing initiatives. This means an organisation has to have the ability and the agility to move with its customers. It's a journey that never ends: technology will never have a cut-off point, just as customer expectations will never stop evolving.

It’s time for banks to re-shape banking

Informatica have been working with major retail banks globally to redefine banking excellence and realign operations to deliver it. We always start by asking our customers the revealing question “Have you looked at the art of the possible to future-proof your business over the next five to ten years and beyond?” This is where the discussion begins to explore really interesting notions about unlocking potential. No bank can afford to ignore them.


Data Mania: Using REST APIs and Native Connectors to Separate Yourself from the SaaS Pack

Data Mania: An Event for SaaS & ISV Leaders

With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.

Let’s unpack why.

To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it.  SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.

Why this is the case has more to do with the needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.

SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.

Enter REST APIs

REST APIs are the perfect vehicle for today's SaaS businesses and mash-up applications. Sure, they're more loosely defined than SOAP, but when all you want to do is get or place some data, now, in the context you need, nothing is easier or better for the job than a REST API.

With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
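To make that concrete, here is a minimal sketch of what such a call looks like in practice; the endpoint, resource and fields are hypothetical, not any specific vendor's API:

```python
import requests

BASE_URL = "https://api.example.com/v1"  # hypothetical SaaS endpoint

# Fetch a record: a plain HTTP GET with a JSON response -- no WSDL,
# no envelope, and almost no setup required of the calling application
resp = requests.get(f"{BASE_URL}/customers/42",
                    headers={"Accept": "application/json"})
resp.raise_for_status()
customer = resp.json()

# Update one field: if the producer later adds fields to the resource,
# this call keeps working -- the loose contract tolerates change
resp = requests.put(f"{BASE_URL}/customers/42",
                    json={**customer, "email": "new@example.com"})
resp.raise_for_status()
```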

SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.

Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors of fat-fingered data entry. That typically gives you the opposite of stickiness, and is to be avoided at all costs.

While publishing an API as a way to get and receive data from other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.

The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud's SDKs are also richer, making it possible for you to adapt the REST API into something any business user will want to use – which is a huge advantage.

In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, a structural change to a REST API breaks every data flow that interacts with it directly. Building a native connector makes you more agile, and insulates your custom-built integrations from breaking.
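As an illustration of that abstraction layer, here is a deliberately simplified, generic sketch (not the actual Informatica SDK, whose interfaces differ): the connector maps raw API payloads to stable application data objects, so downstream flows never see the raw field names.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    """Stable application data object consumed by integration flows."""
    contact_id: str
    email: str

class ContactConnector:
    """Hypothetical connector: isolates data flows from API changes."""

    # The single place to update when the producer renames or moves fields
    FIELD_MAP = {"contact_id": "id", "email": "emailAddress"}

    def to_object(self, payload: dict) -> Contact:
        return Contact(**{attr: payload[key]
                          for attr, key in self.FIELD_MAP.items()})

# If the API renames "emailAddress" to "email_addr", only FIELD_MAP changes;
# every flow built against Contact keeps working unmodified.
connector = ContactConnector()
print(connector.to_object({"id": "42", "emailAddress": "a@example.com"}))
```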

Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.

Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.

The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.

For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.


Analytics-Casual Versus Analytics-Driven


What does it take to be an analytics-driven business? That's a question that requires a long answer. Recently, Gartner research director Lisa Kart took on this question, noting that the key to becoming an analytics-driven business lies in breaking down silos.

So, the secret of becoming an analytics-driven business is to bust down the silos – easier said than done, of course. The good news, as Kart tells it, is that one doesn't need to cast a wide net across the world in search of the right data for the right occasion. The biggest opportunities are in connecting the data you already have, she says.

Taking Kart's differentiation of just-using-analytics versus analytics-driven culture a step further, here is a brief rundown of how businesses just using analytics approach the challenge, versus their more enlightened counterparts:

Business just using analytics: Lots of data, but no one really understands how much is around, or what to do with it.

Analytics-driven business: The enterprise has a vision and strategy, supported from the top down, closely tied to the business strategy. Management also recognizes that existing data has great value to the business.

Business just using analytics: Every department does its own thing, with varying degrees of success.

Analytics-driven business: Makes connections between all the data – of all types — floating around the organization. For example, gets a cross-channel view of a customer by digging deeper and connecting the silos together to transform the data into something consumable.

Business just using analytics: Some people in marketing have been collecting customer data and making recommendations to their managers.

Analytics-driven business: Marketing departments, through analytics, engage and interact with customers, Kart says. An example would be creating high-end, in-store customer experiences that give customers greater intimacy and interaction.

Business just using analytics: The CFO's staff crunches numbers within their BI tools and arrives at what-if scenarios.

Analytics-driven business: Operations and finance departments share online data to improve performance using analytics. For example, a company may tap into a variety of data, including satellite images, weather patterns, and other factors that may shape business conditions, Kart says.

Business just using analytics: Some quants in the organization pore over the data and crank out reports.

Analytics-driven business: Encourages maximum opportunities for innovation by putting analytics in the hands of all employees. Analytics-driven businesses recognize that more innovation comes from front-line decision-makers than the executive suite.

Business just using analytics: Decision makers put in report requests to IT for analysis.

Analytics-driven business: Decision makers can go to an online interface that enables them to build and display reports with a click (or two).

Business just using analytics: Analytics spits out standard bar charts, perhaps a scattergram.

Analytics-driven business: Decision makers can quickly visualize insights through 3D graphics, also reflecting real-time shifts.


Big Data is Nice to Have, But Big Culture is What Delivers Success


Despite more than $30 billion in annual spending on Big Data, successful big data implementations elude most organizations. That's the sobering assessment of a recent Capgemini study of 226 senior executives, which found that only 13 percent feel they have truly made any headway with their big data efforts.

The reasons for Big Data’s lackluster performance include the following:

  • Data is in silos or legacy systems, scattered across the enterprise
  • No convincing business case
  • Ineffective alignment of Big Data and analytics teams across the organization
  • Most data locked up in petrified, difficult to access legacy systems
  • Lack of Big Data and analytics skills

Actually, there is nothing new about any of these issues – in fact, the perceived issues with Big Data initiatives so far map closely to the failed expectations of many other technology-driven initiatives. First, there's the hype that tends to get way ahead of any actual well-functioning case studies. Second, there's the notion that managers can simply take a solution of impressive magnitude and drop it on top of their organizations, expecting overnight delivery of profits and enhanced competitiveness.

Technology, and Big Data itself, is but a tool that supports the vision, well-designed plans and hard work of forward-looking organizations. Those managers seeking transformative effects need to look deep inside their organizations, at how deeply innovation is allowed to flourish, and in turn, how their employees are allowed to flourish. Think about it: if line employees suddenly have access to alternative ways of doing things, would they be allowed to run with it? If someone discovers through Big Data that customers are using a product differently than intended, do they have the latitude to promote that new use? Or do they have to go through chains of approval?

Big Data may be what everybody is after, but Big Culture is the ultimate key to success.

For its part, Capgemini provides some high-level recommendations for better baking in transformative values as part of Big Data initiatives, based on their observations of best-in-class enterprises:

The vision thing: “It all starts with vision,” says Capgemini’s Ron Tolido. “If the company executive leadership does not actively, demonstrably embrace the power of technology and data as the driver of change and future performance, nothing digitally convincing will happen. We have not even found one single exception to this rule. The CIO may live and breathe Big Data and there may even be a separate Chief Data Officer appointed – expect more of these soon – if they fail to commit their board of executives to data as the engine of success, there will be a dark void beyond the proof of concept.”

Establish a well-defined organizational structure: “Big Data initiatives are rarely, if ever, division-centric,” the Capgemini report states. “They often cut across various departments in an organization. Organizations that have clear organizational structures for managing rollout can minimize the problems of having to engage multiple stakeholders.”

Adopt a systematic implementation approach:  Surprisingly, even the largest and most sophisticated organizations that do everything on process don’t necessarily approach Big Data this way, the report states. “Intuitively, it would seem that a systematic and structured approach should be the way to go in large-scale implementations. However, our survey shows that this philosophy and approach are rare. Seventy-four percent of organizations did not have well-defined criteria to identify, qualify and select Big Data use-cases. Sixty-seven percent of companies did not have clearly defined KPIs to assess initiatives. The lack of a systematic approach affects success rates.”

Adopt a “venture capitalist” approach to securing buy-in and funding: “The returns from investments in emerging digital technologies such as Big Data are often highly speculative, given the lack of historical benchmarks,” the Capgemini report points out. “Consequently, in many organizations, Big Data initiatives get stuck due to the lack of a clear and attributable business case.” To address this challenge, the report urges that Big Data leaders manage investments “by using a similar approach to venture capitalists. This involves making multiple small investments in a variety of proofs of concept, allowing rapid iteration, and then identifying PoCs that have potential and discarding those that do not.”

Leverage multiple channels to secure skills and capabilities: “The Big Data talent gap is something that organizations are increasingly coming face-to-face with. Closing this gap is a larger societal challenge. However, smart organizations realize that they need to adopt a multi-pronged strategy. They not only invest more on hiring and training, but also explore unconventional channels to source talent.” Capgemini advises reaching out to partner organizations for the skills needed to develop Big Data initiatives. These can be employee exchanges, or “setting up innovation labs in high-tech hubs such as Silicon Valley.” Startups may also be another source of Big Data talent.


Informatica Doubled Big Data Business in 2014 As Hadoop Crossed the Chasm


2014 was a pivotal year for Informatica, as our investments in Hadoop and efforts to innovate in big data gathered momentum and became a core part of Informatica's business. Our Hadoop-related big data revenue growth was in the ballpark of the leading Hadoop startups – more than doubling over 2013.

In 2014, Informatica reached about 100 enterprise customers of our big data products, with an increasing number going into production with Informatica together with Hadoop and other big data technologies. Informatica's big data Hadoop customers include companies in financial services, insurance, telecommunications, technology, energy, life sciences, healthcare and business services. These innovative companies are leveraging Informatica to accelerate their time to production and drive greater value from their big data investments.

These customers are in-production or implementing a wide range of use cases leveraging Informatica’s great data pipeline capabilities to better put the scale, efficiency and flexibility of Hadoop to work.  Many Hadoop customers start by optimizing their data warehouse environments by moving data storage, profiling, integration and cleansing to Hadoop in order to free up capacity in their traditional analytics data warehousing systems. Customers that are further along in their big data journeys have expanded to use Informatica on Hadoop for exploratory analytics of new data types, 360 degree customer analytics, fraud detection, predictive maintenance, and analysis of massive amounts of Internet of Things machine data for optimization of energy exploration, manufacturing processes, network data, security and other large scale systems initiatives.
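As a rough illustration of that warehouse-offload pattern, here is a minimal sketch in plain PySpark (the paths and fields are hypothetical, and the code stands in for what a visual pipeline tool would generate): raw data is cleansed and deduplicated on the cluster, so only the refined result reaches the warehouse.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dw-offload").getOrCreate()

# Land raw transactions on Hadoop instead of staging them in the warehouse
raw = spark.read.csv("hdfs:///landing/transactions/*.csv", header=True)

# Cleansing and deduplication that used to consume warehouse capacity
clean = (
    raw.dropDuplicates(["transaction_id"])
       .filter(F.col("amount").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
)

# Only the refined, query-ready result is handed on for analytics
clean.write.mode("overwrite").parquet("hdfs:///refined/transactions")
```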

2014 was not just a year of market momentum for Informatica, but also one of new product development innovations.  We shipped enhanced functionality for entity matching and relationship building at Hadoop scale (a key part of Master Data Management), end-to-end data lineage through Hadoop, as well as high performance real-time streaming of data into Hadoop. We also launched connectors to NoSQL and analytics databases including Datastax Cassandra, MongoDB and Amazon Redshift. Informatica advanced our capabilities to curate great data for self-serve analytics with a connector to output Tableau’s data format and launched our self-service data preparation solution, Informatica Rev.

Customers can now quickly try out Informatica on Hadoop by downloading the free trials for the Big Data Edition and Vibe Data Stream that we launched in 2014.  Now that Informatica supports all five of the leading Hadoop distributions, customers can build their data pipelines on Informatica with confidence that no matter how the underlying Hadoop technologies evolve, their Informatica mappings will run.  Informatica provides highly scalable data processing engines that run natively in Hadoop and leverage the best of open source innovations such as YARN, MapReduce, and more.   Abstracting data pipeline mappings from the underlying Hadoop technologies combined with visual tools enabling team collaboration empowers large organizations to put Hadoop into production with confidence.

As we look ahead into 2015, we have ambitious plans to continue to expand and evolve our product capabilities with enhanced productivity to help customers rapidly get more value from their data in Hadoop. Stay tuned for announcements throughout the year.

Try some of Informatica’s products for Hadoop on the Informatica Marketplace here.


The Supply Chain Impact of Adding an Allergen to a Chocolate Bar

Throughout the lifecycle of a consumable product, many parties are constantly challenged with updating and managing product information, like ingredients and allergens. Since December 13th, 2014, this task has become even more complex for companies producing or selling food and beverage products in the European Union, due to the new EU 1169/2011 rules. As a result, changing the basic formula of a chocolate bar by adding one allergen, like nuts for example, should be carefully considered, as it can have a tremendous impact on the complete supply chain if the manufacturer and retailer(s) want to remain compliant with EU regulation 1169/2011 and inform the consumer about the changes to the product.

Let's say the chocolate bar is available in three (3) varieties: dark, whole milk and white chocolate. Each of the varieties is available in two (2) sizes: normal and mini. The chocolate bar is distributed in twelve (12) countries within the European Union using different packaging. This would require the manufacturer to introduce 72 (3 varieties * 2 sizes * 12 countries) new GTINs at item level. If the chocolate producer does any package or seasonal promotions or multipacks, or introduces a new variety, this number would be even higher. The new attributes, including updated information on allergens, have to be modified for each product, and 72 new GTINs have to be generated for the chocolate bars by the manufacturer's data maintenance department. Trading partners have to be informed about the modifications, too. Assuming the manufacturer uses the Global Data Synchronization Network (GDSN), the new GTINs will have to be registered along with their new attributes eight weeks before the modified product becomes available. In addition, packaging hierarchies have to be taken into consideration as well. If each item has an associated case and pallet, the number of updates sums to 216 (72 GTINs * 3 packaging levels), as the quick calculation below shows.
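For anyone who wants to sanity-check the combinatorics, here is the calculation from the example spelled out (a trivial sketch; the counts come straight from the scenario above):

```python
# GTIN combinatorics for the chocolate bar example
varieties = 3         # dark, whole milk, white
sizes = 2             # normal, mini
countries = 12        # EU markets with distinct packaging
hierarchy_levels = 3  # item, case, pallet

item_gtins = varieties * sizes * countries     # 72 new item-level GTINs
total_updates = item_gtins * hierarchy_levels  # 216 updates incl. cases/pallets

print(item_gtins, total_updates)  # 72 216
```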

Managing the updates related to the product modifications results in high administrative and other costs. Trading partners across the supply chain report significant impact to their annual sales and costs. According to GS1 Europe, one retailer reported 6,000-8,000 GTIN changes per year, leading to 2-3% in additional administrative and support costs. If the GTIN had to change for every new minor variant of a product, they forecast the number of changes per year could rise to 20,000-25,000, leading to a significant further increase in administrative and support costs.

The change to the chocolate bar's recipe also means that a corresponding change to the mandatory product data displayed on the label is required. For the online retailer(s) selling the chocolate bar, this means they have to update the information displayed in their online shops. Considering that a retailer has to deliver exactly the product that is displayed in its online shop, there will be a period of time when the old version of the product and the new version coexist in the supply chain. During this period it is not possible for the retailer to know if the version of the product ordered on a website will be available at the time and place the order is picked.

GS1 Europe suggests handling this issue as follows: Retailers working to GS1 standards use GTINs to pick on-line orders. If the modified chocolate bar with nuts is given a new GTIN it increases the possibility that the correct variant can be made available for picking and, even if it is not available at the pick point, the retailer can recognize automatically if the version being picked is different from the version that was ordered. In this latter case the product can be offered as a substitute when the goods are delivered and the consumer can choose whether to accept it or not. On their websites, GS1 provides comprehensive information on how to comply with the new European Food Information Regulation.

Using the Global Data Synchronization Network (GDSN), suppliers and retailers are able to share standardized product data, cut down the cost of building point-to-point integrations and speed up new product introductions by getting access to the most accurate and most current product information. The Informatica GDSN Accelerator is an add-on to the Informatica Product Information Management (PIM) system that provides an interface to access a GDSN-certified data pool. It is designed to help organizations securely and continuously exchange, update and synchronize product data with trading partners according to the standards defined by Global Standards One (GS1). GDSN ensures that data exchanged between trading partners is accurate and compliant with globally supported standards for maintaining uniqueness, classification and identification of sources and recipients. Integrated into your PIM system, the GDSN Accelerator allows you to leverage product data of the highest standards, exchanged with your trading partners via the GDSN.

Thanks to the automated product data exchange, the effort and costs related to the modification of a product, as demonstrated in the chocolate bar example, can be significantly reduced for both manufacturers and retailers. The product data can be easily transferred to the data pool, and you can fully control the information sharing with a specific trading partner or with all recipients of a target market.

Related blogs:

How GS1 and PIM Help to Fulfill Legal Regulations and Feed Distribution Channels

5 Ways to Comply with the New European Food Information Regulation


Product Intelligence: How To Make Your Product Information Smarter

As we discussed at length in our #HappyHoliData series, no matter what the customer industry or use case, information quality is a key value component to deliver the right services or products to the right customer.

In my blog on 2015 omnichannel trends impacting customer experience, I commented on product trust as a key expectation in the eyes of customers.

For product managers, merchandizers or category managers, this raises questions like: which products shall we offer, and at what price? How is the competition pricing this item? With which content is the competition promoting this SKU? Are my retailers and distributors sticking to my price policy? Companies need quicker insights for taking decisions on their assortment, prices and compelling content, and for better customer-facing service.

Recently, we've been spending time discussing this challenge with the folks at Indix, an innovator in the product intelligence space, to find ways to help businesses improve their product information quality. For background, Indix is building the world's largest database of product information and currently tracks over 600 million products, over 600,000 sellers, over 40,000 brands and over 10,000 attributes across over 6,000 categories. (source: Indix.com)

Indix takes all of that data, then cleanses and normalizes it and breaks it down into two types of product information — offers data and catalog data.  The offers data includes all the dynamic information related to the sale of a product such as the number of stores at which it is sold, price history, promotions, channels, availability, and shipping. The catalog data comprises relatively unchanging product information, such as brand, images, descriptions, specifications, attributes, tags, and facets.
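To picture the split, here is an illustrative sketch of the two record types as simple data structures (the field names are my own shorthand, not Indix's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class OfferData:
    """Dynamic, sale-related information; changes frequently."""
    store_count: int
    price_history: list = field(default_factory=list)
    promotions: list = field(default_factory=list)
    availability: bool = True
    shipping: str = ""

@dataclass
class CatalogData:
    """Relatively unchanging descriptive information."""
    brand: str
    description: str
    images: list = field(default_factory=list)
    attributes: dict = field(default_factory=dict)
    tags: list = field(default_factory=list)
```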

We’ve been talking with the Indix team about how powerful it could be to integrate product intelligence directly into the Informatica PIM.  Just imagine if Informatica customers could seamlessly bring in relevant offers and catalog content into the PIM through a direct connection to the Indix Product Intelligence Platform and begin using market and competitive data immediately.

What do you think?  

We're going to be at NRF, meeting with selected people to discuss this further. If you like the idea, or have some feedback on the concept, let us know. We'd love to see you while we're there and talk about this idea with you.


What Should Come First: Business Processes or Analytics?

As more and more businesses become fully digitized, the instantiation of their business processes and business capabilities becomes based in software. And when businesses implement software, there are choices to be made that can impact whether these processes and capabilities become locked in time or establish themselves as a continuing basis for business differentiation.

Make sure you focus upon the business goals

I want to suggest that whether the software instantiations of business processes and business capabilities deliver business differentiation depends upon whether business goals and analytics are successfully embedded in the software implementation from the start. I learned this first-hand several years ago, when I was involved in helping a significant insurance company with their implementation of analytics software. Everyone in the management team was in favor of the analytics software purchase. However, the project lead wanted the analytics completed after an upgrade had occurred to their transactional processing software. Fortunately, the firm's CIO had a very different perspective. This CIO understood that decisions regarding the transaction processing software implementation could determine whether critical metrics and KPIs could be measured. So instead of treating analytics as an afterthought, this CIO made them a forethought. In other words, he slowed down the transactional software implementation. He got his team to think first about the goals for the software implementation and the business goals for the enterprise. With these in hand, his team determined what metrics and KPIs were needed to measure success and improvement. They then required the transaction software development team to ensure that the software implemented the fields needed to measure the metrics and KPIs. In some cases, this was as simple as turning on a field, or training users to enter a field, as the transaction software went live.
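One lightweight way to operationalize that discipline is to treat KPI definitions as data and check them against the planned transactional schema before go-live. A hypothetical sketch (the KPIs and field names are invented for illustration, not the insurer's actual metrics):

```python
# Each KPI maps to the fields required to compute it
KPI_REQUIRED_FIELDS = {
    "claim_cycle_time": {"claim_opened_at", "claim_closed_at"},
    "first_call_resolution": {"call_id", "resolved_on_first_contact"},
}

# Fields the transactional system is planned to capture
TRANSACTION_SCHEMA = {"claim_opened_at", "claim_closed_at", "call_id"}

for kpi, required in KPI_REQUIRED_FIELDS.items():
    missing = required - TRANSACTION_SCHEMA
    if missing:
        print(f"{kpi}: NOT measurable - add fields {sorted(missing)}")
    else:
        print(f"{kpi}: measurable")
```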

Make the analytics part of everyday business decisions and business processes

The question is how common this perspective is, because it really matters. Tom Davenport says that “if you really want to put analytics to work in an enterprise, you need to make them an integral part of everyday business decisions and business processes—the methods by which work gets done” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). For many, this means turning their application development on its head, like our insurance CIO did. IT implementation teams should no longer be about just slamming in applications. They need to be more deliberate. They need to start by identifying the business problems that they want solved through the software instantiation of a business process. They likewise need to start with how they want the software to improve the process, rather than treating the analytics and data as an afterthought.

Why does this matter so much? Davenport suggests that “embedding analytics into processes improves the ability of the organization to implement new insights. It eliminates gaps between insights, decisions, and actions” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). Tom gives the example of a car rental company that embedded analytics into its reservation system and, with the data provided, was able to expunge long-held shared beliefs. This change resulted in a 2% increase in fleet utilization and returned $19m to the company from just one location.

Look beyond the immediate decision to the business capability

Davenport also suggests that enterprises look beyond their immediate task or decision and appreciate the whole business process, including what happens upstream or downstream. This argues for focusing analytics on the enterprise capability system. Clearly, maximizing the performance of the enterprise capability system requires an enterprise perspective on analytics. A systems perspective also allows business leadership to appreciate how different parts of the business work together as a whole. Analytics, therefore, allow the business to determine how to drive better business outcomes for the entire enterprise.

At the same time, focusing upon the enterprise capability system will, in many cases, over time lead to a reengineering of overarching business processes and a revamping of their supporting information systems. This in turn allows the business to capitalize on the potential of business capability and analytics improvement. From my experience, most organizations need some time to see what a change in analytics performance means. This is why it can make sense to start by measuring baseline process performance before determining enhancements to the business process. Once enhancements are in place, refinements to the improved process can be determined by continuously measuring process performance data.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer
