Tag Archives: B2B

The New Insurance Model

Make it about me
I know I’m not alone in feeling unimportant when I contact large organisations and find they lack the customer view we’re all being told to expect in a digital, multichannel age. I have to be proactive to get things done. I have to ask my insurance provider, for example, whether my car premium reflects my years of loyalty, or whether I’m due a multi-policy discount.

The time has come for insurers to focus on how they use data for true competitive advantage and customer loyalty. In this void, with a lack of tailored service, I will continue to shop around for something better. It doesn’t have to be like this.

Know data – no threat
A new report from KPMG, Transforming Insurance: Securing competitive advantage (download the pdf here), explores the viable use of data for predictive analytics in insurance. The report finds that almost two thirds of insurer respondents to its survey use analytics only for reporting what happened, rather than for driving future customer interactions. This is a process that tends to take place in distinct data silos, focused on an organisation’s internal business divisions rather than on customer engagements.

The report missed a critical point. The discussion for insurers is not around data analytics – to an extent they do that already. The focus needs to shift quickly to understanding the data they already have and using it to augment their capabilities. ‘Transformation’ is a huge step. ‘Augmentation’ can be embarked on with no delay and at relatively low costs. It will keep insurers ahead of new market threats.

New players have no locked-down idea about how insurance models should work, but they do recognise how to identify customer needs through the data their customers freely provide. Tesco made a smooth transition from Club Card to insurance provider because it had the data necessary to market the propositions its customers needed. It knew a lot about them. What is there to stop other data-driven organisations like Amazon, Google, and Facebook from entering the market? The barrier for entry has never been lower, and those with a data-centric understanding of their customers are poised to scramble over it.

Changing the design point – thinking data first
There is an immediate strategic need for the insurance sector to view data as more than functional – information to define risk categories and set premiums. In the light of competitive threats, the insurance industry has to recognise and harness the business value of the vast amounts of data it has collected and continues to gather. A new design point is needed – one that creates a business architecture which thinks Data First.

To adopt a data first architecture is to augment the capabilities a company already has. The ‘nirvana’ business model for the insurer is to expand customer propositions beyond the individual (party, car, house, health, annuity) to the household (similar profiles, easier profiling). Based on the intelligent use of data, a policy-centric model grows into customer-centricity, with a viable evolution path to household-centricity, unconstrained by legacy limitations.
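
The household-centric idea above can be made concrete with a toy sketch: group policy records by a shared attribute such as address. All records and field names below are invented for illustration; real household matching would use far richer signals than an exact address string.

```python
from collections import defaultdict

# Hypothetical policy records, one row per policy with no customer
# or household linkage (field names invented for illustration).
policies = [
    {"policy_id": "P1", "holder": "A. Smith", "address": "12 Elm St", "type": "car"},
    {"policy_id": "P2", "holder": "B. Smith", "address": "12 Elm St", "type": "home"},
    {"policy_id": "P3", "holder": "C. Jones", "address": "7 Oak Ave", "type": "car"},
]

def group_into_households(policies):
    """Group policy records into households keyed on a shared address."""
    households = defaultdict(list)
    for policy in policies:
        households[policy["address"]].append(policy)
    return dict(households)

households = group_into_households(policies)
# The Smiths at 12 Elm St hold two policies: a multi-policy discount
# opportunity that a purely policy-centric view never surfaces.
```

The point is not the grouping logic, which is trivial, but that the raw material for household-level propositions is already sitting in the policy data.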

Win back the customer
Changing the data architecture is a pragmatic solution to a strategic problem. By putting data first, insurers can find the golden nuggets already sitting in their systems. They can make the connections across each customer’s needs and life-stage. By trusting the data, insurers can elevate the quality of their customer service to a level of real personal care, enabling them to secure the loyalty of their customers before the market starts to rumble as new players make their pitch.

Focusing on a data architecture, the organisation also takes complexity out of the eco-system and creates headroom for innovation – fresh ideas around cross-sell and up-sell, delivering more complete and loyalty-generating service offerings to customers. Loyalty fosters trust, driving stronger relationships between insurer and client.

Insurers have the power – they have the data – to ensure that the next time someone like me makes contact, they can impress me, sell me more, make me happier and, above all, make me stay.

Nirvana.

Posted in B2B, Business Impact / Benefits, DaaS, Data First, Data Services

IDMP Field Notes: Compliance Trends in Q1 2015

With the European Medicines Agency (EMA) date for compliance with IDMP (Identification of Medicinal Products) looming, Q1 2015 has seen a significant increase in IDMP activity. Both Informatica and HighPoint Solutions’ IDMP Round Table in January and a February Marcus Evans conference in Berlin provided excellent forums for sharing progress, thoughts and strategies. Additional confidential conversations with pharmaceutical companies show an increase in the number of approved and active projects, although some are still seeking full funding. The following paragraphs sum up the activity and trends I have witnessed in the first three months of the year.

I’ll start with my favourite quote, which is from Dr. Jörg Stüben of Boehringer Ingelheim, who asked:

“Isn’t part of compliance being in control of your data?” 

I like it because, to me, it strikes just the right balance between stating the obvious and questioning the way the majority of pharmaceutical companies approach compliance: as a report that has to be created and submitted. If a company were in control of its data, regulatory compliance would be easier and come at a lower cost. More importantly, the company itself would benefit from easy access to high-quality data.

Dr. Stüben’s question was raised during his excellent presentation at the Marcus Evans conference. Not only did he question the status quo, he also proposed an alternative route to IDMP compliance: let Boehringer benefit from its investment in IDMP compliance. His approach can be summarised as follows:

  • Embrace a holistic approach to being in control of data, i.e. adopt data governance practices.
  • This is not just about compliance. Include optional attributes that will deliver value to the organisation if correctly managed.
  • Get started by creating simple, clear work packages.

Although Dr Stüben did not outline his technical solution, it would include data quality tools and a product data hub.

At the same conference, Stefan Fischer Rivera and Stefan Brügger of Bayer, and Guido Claes from Janssen Pharmaceuticals, came out strongly in favour of using a Master Data Management (MDM) approach to achieving compliance. Both companies have MDM technology and processes within their organisations, and recognise the value an MDM approach can bring to compliance in terms of data management and governance. Hearing Mr Claes express how well Informatica’s MDM and Data Quality solutions support his existing substance data management program made his presentation even more enjoyable for me.

Whilst the exact approaches of Bayer and Janssen differed, there were some common themes:

  • Consider both the short term (compliance) and the long term (data governance) in the strategy
  • Centralised MDM is ideal, but a federated approach is practical for July 2016
  • High quality data should be available to a wide audience outside of IDMP compliance

The first and third bullet points map very closely to Dr. Stüben’s key points, and in fact show a clear trend in 2015:

IDMP Compliance is an opportunity to invest in your data management solutions and processes for the benefit of the entire organisation.

Although the EMA was not represented at the conference, Andrew Marr presented their approach to IDMP, and master data in general.  The EMA is undergoing a system re-organisation to focus on managing Substance, Product, Organisation and Reference data centrally, rather than within each regulation or program as it is today.  MDM will play a key role in managing this data, setting a high standard of data control and management for regulatory purposes.  It appears that the EMA is also using IDMP to introduce better data management practice.

Depending on the size of the company, and the skills and tools available, other non-MDM approaches have been presented or discussed during the first part of 2015. These include using XML and SharePoint to manage product data. However, I share a primary concern with others in the industry about this approach: how well can you manage and control change using these tools? Some pharmaceutical companies have openly stated that data contributors often spend more time looking for data than doing their own jobs. An XML/SharePoint approach will do little to ease this burden, but an MDM approach will.

Despite the other approaches and solutions being discussed, there is another clear trend in Q1 2015:

MDM is becoming a favoured approach for IDMP compliance due to its strong governance, centralised attribute-level data management and ability to track changes.
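
As a generic illustration of the attribute-level change tracking mentioned above, here is a minimal sketch of an audited record. This is not Informatica’s implementation, just the general idea: every change to an attribute is recorded with its old and new value.

```python
import datetime

class AuditedRecord:
    """Master-data record that logs every attribute-level change."""

    def __init__(self, record_id, **attrs):
        self.record_id = record_id
        self.attrs = dict(attrs)
        self.history = []  # (timestamp, attribute, old_value, new_value)

    def update(self, attribute, new_value):
        old = self.attrs.get(attribute)
        if old != new_value:  # only real changes are recorded
            now = datetime.datetime.now(datetime.timezone.utc)
            self.history.append((now, attribute, old, new_value))
            self.attrs[attribute] = new_value

product = AuditedRecord("MED-001", name="Aspirin 100mg", strength="100 mg")
product.update("strength", "75 mg")
# product.history now holds one entry recording that "strength" changed
# from "100 mg" to "75 mg": the kind of audit trail regulators expect.
```

In a real MDM hub the same trail would also capture who made the change and through which process, which is what makes “being in control of your data” demonstrable to a regulator.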

Interestingly, the opportunity to invest in data management, and the rise of MDM as a favoured approach, has been backed up by research from Gens Associates. Messrs Gens and Brolund found a rapid increase during 2014 in investment in what they term Information Architecture, in which MDM plays a key role. IDMP is seen as a major driver for this investment. They go on to state that investment in master data management programs will allow a much easier and more cost-effective approach to data exchange (internally and externally), resulting in substantial benefits. Unfortunately they do not elaborate on these benefits, but I have placed a summary of the benefits of using MDM for IDMP compliance here.

In terms of active projects, the common compliance activities I have seen in the first quarter of 2015 are as follows:

  • Most companies are in the discovery phase: identifying the effort for compliance
  • Some are starting to make technology choices, and have submitted RFPs/RFQs
    • Those furthest along in technology already have MDM programs or initiatives underway
  • Despite having made a start, some still lack sufficient funding to achieve compliance
    • Output from the discovery phase will in some cases be used to request full funding
  • A significant number of projects have a goal of implementing better data management practice throughout the company, with IDMP as the first release.

A final trend I have noticed in 2015 is regarding the magnitude of the compliance task ahead:

Those who have made the most progress are those who are most concerned about achieving compliance on time. 

The implication is that companies starting late do not yet realise the magnitude of the task ahead. It is not yet too late to comply and achieve long-term benefits through better data management, despite there being only 15 months before the initial EMA deadline. Informatica has customers who have implemented MDM within six months. Fifteen months is achievable provided the project (or program) gets the focus and resources required.

IDMP compliance is a common challenge to all those in the pharmaceutical industry.  Learning from others will help avoid common mistakes and provide tips on important topics.  For example, how to secure funding and support from senior management is a common concern among those tasked with compliance.  In order to encourage learning and networking, Informatica and HighPoint Solutions will be hosting our third IDMP roundtable in London on May 13th.  Please do join us to share your experiences, and learn from the experiences of others.

Posted in B2B, B2B Data Exchange, Business Impact / Benefits, Data Security, Healthcare

The CMO’s Role in Delivering Omnichannel Customer Experiences

This article was originally posted on Argyle CMO Journal and is re-posted here with permission.

According to a new global study from SDL, 90% of consumers expect a consistent customer experience across channels and devices when they interact with brands. However, according to these survey results, Gartner Survey Finds Importance of Customer Experience on the Rise — Marketing is on the Hook, fewer than half of the companies surveyed rank their customer experience as exceptional today. The good news is that two-thirds expect it to be exceptional in two years. In fact, 89% plan to compete primarily on the basis of the customer experience by 2016.

So, what role do CMOs play in delivering omnichannel customer experiences?

According to a recent report, Gartner’s Executive Summary for Leadership Accountability and Credibility within the C-Suite, a high percentage of CEOs expect CMOs to lead the integrated cross-functional customer experience. Also, customer experience is one of the top three areas of investment for CMOs in the next two years.

I had the pleasure of participating on a panel discussion at the Argyle CMO Forum in Dallas a few months ago. It focused on the emergence of omnichannel and the need to deliver seamless, integrated and consistent customer experiences across channels.

Lisa Zoellner, Chief Marketing Officer of Golfsmith International, was the dynamic moderator, keeping the conversation lively and the audience engaged. I was a panelist alongside:

Below are some highlights from the panel.

Lisa Zoellner, CMO, Golfsmith International opened the panel with a statistic. “Fifty-five percent of marketers surveyed feel they are playing catch up to customer expectations. But in that gap is a big opportunity.”

What is your definition of omnichannel?

There was consensus among the group that omnichannel is about seeing your business through the eyes of your customer and delivering seamless, integrated and consistent customer experiences across channels.

Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences. It’s one brand to the customer. But there is a gap between customer expectations and what most businesses can deliver today.

In fact, executives at most organizations I’ve spoken with, including the panelists, believe they are in the very beginning stages of their journey towards delivering omnichannel customer experiences. The majority are still struggling to get a single view of customers, products and inventory across channels.

“Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences.”

What are some of the core challenges standing in your way?

A key takeaway was that omnichannel requires organizations to fundamentally change how they do business. In particular, it requires changing existing business practices and processes. It cannot be done without cross-functional collaboration.

I think Chris Berg, VP, Store Operations at The Home Depot said it well, “One of the core challenges is the annual capital allocation cycle, which makes it difficult for organizations to be nimble. Most companies set strategies and commitments 12-24 months out and approach these strategies in silos. Marketing, operations, and merchandising teams typically ask for capital separately. Rarely does this process start with asking the question, ‘What is the core strategy we want to align ourselves around over the next 24 months?’ If you begin there and make a single capital allocation request to pursue that strategy, you remove one of the largest obstacles standing in the way.”

Chip Burgard, Senior Vice President of Marketing at CitiMortgage, focused on two big barriers. “The first one is a systems barrier. I know a lot of companies struggle with this problem. We’re operating with a channel-centric rather than a customer-centric view. Now that we need to deliver omnichannel customer experiences, we realize we’re not as customer-centric as we thought we were. We need to understand what products our customers have across lines of business such as credit cards, banking, investments and mortgage. But our systems weren’t providing a total customer relationship view across products and channels. Now we’re making progress on that. The second barrier is compensation. We have a commission-based sales force. How do you compensate the loan officers if a customer starts the transaction with the call center but completes it in the branch? That’s another issue we’re working on.”

Lisa Zoellner, CMO at Golfsmith International added, “I agree that compensation is a big barrier. Companies need to rethink their compensation plans. The sticky question is ‘Who gets credit for the sale?’ It’s easy to say that you’re channel-agnostic, but when someone’s paycheck is tied to the performance of a particular channel, it makes it difficult to drive that type of culture change.”

“We have a complicated business. More than 500 Hyatt hotels and resorts span multiple brands and regions,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “But, customers want a seamless experience no matter where they travel. They expect that the preference they shared during their Hyatt stay at a hotel in Singapore is understood by the person working at the next hotel in Dallas. So, we’re bridging those traditional silos all the way down to the hotel. A guest doesn’t care if the person they’re interacting with is from the building engineering department, from the food and beverage department, or the rooms department. It’s all part of the same customer experience. So we’re looking at how we share the information that’s important to guests to keep the customer the focus of our operations.”

“We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

How are companies powering great customer experiences with great customer data?

Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts, said, “We’re going through a transformation to unleash our colleagues to deliver great customer experiences at every stage of the guest journey. Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”

Hyatt connects the fragmented customer data from numerous applications including sales, marketing, ecommerce, customer service and finance. They bring the core customer profiles together into a single, trusted location, where they are continually managed. Now their customer profiles are clean, de-duplicated, enriched, and validated. They can see the members of a household as well as the connections between corporate hierarchies. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively and hotel teams can extend simple, meaningful gestures that drive guest loyalty.
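
The consolidation step described above can be sketched in miniature: match duplicate profiles on a simple key and merge them into one “golden” record. Real MDM matching is fuzzy and far more sophisticated than an exact email match, and the profiles below are invented:

```python
def golden_record(profiles, match_key=("email",)):
    """Collapse duplicate customer profiles into one record per match key.
    Survivorship rule (naive): the first non-empty value for a field wins."""
    merged = {}
    for profile in profiles:
        key = tuple(profile.get(k, "").lower() for k in match_key)
        base = merged.setdefault(key, {})
        for field, value in profile.items():
            if value and not base.get(field):
                base[field] = value
    return list(merged.values())

# Invented profiles: two variants of the same person plus one other customer.
profiles = [
    {"name": "Chris Brogan", "email": "cb@example.com", "city": ""},
    {"name": "C. Brogan", "email": "CB@example.com", "city": "Chicago"},
    {"name": "Jane Doe", "email": "jd@example.com", "city": "Dallas"},
]

consolidated = golden_record(profiles)
# The two Chris Brogan variants collapse into one enriched profile;
# Jane stays separate.
```

The survivorship rule here is deliberately naive; production systems weigh source trustworthiness and recency when deciding which value wins.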

When he first joined Hyatt, Chris did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay. Mostly just sleep now,” he joked. Those guest profiles have now been successfully consolidated.

This solid customer data foundation means Hyatt colleagues can more easily personalize a guest’s experience. For example, colleagues at the front desk are now able to use the limited check-in time to congratulate a new Diamond member on just achieving the highest loyalty program tier or offer a better room to those guests most likely to take them up on the offer and appreciate it.

According to Chris, “Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is not in order.”

How are you shifting from channel-centric to customer-centric?

Chip Burgard, SVP of Marketing at CitiMortgage answered, “In the beginning of our omnichannel journey, we were trying to allow customer choice through multi-channel. Our whole organization was designed around people managing different channels. But, we quickly realized that allowing separate experiences that a customer can choose from is not being customer-centric.

Now we have new sales leadership that understands the importance of delivering seamless, integrated and consistent customer experiences across channels. And they are changing incentives to drive that customer-centric behavior. We’re no longer holding people accountable specifically for activity in their channels. We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

Chris Berg, VP of Store Operations at The Home Depot, explained, “For us, it’s about transitioning from a store-centric to a customer-centric approach. It’s a cultural change. The managers of our 2,000 stores have traditionally been compensated based on their own store’s performance. But we are one brand. For example, in the future a store may fulfill an order but, because of the geography of where the order originated, not receive credit for the sale. We’re in the process of working through how to better reward that collaboration. Also, we’re making investments in our systems so they support an omnichannel, or what we call interconnected, business. We have 40,000 products in store and over 1,000,000 products online. Now that we’re on the interconnected journey, we’re rethinking how we manage our product information so we can manage inventory across channels more effectively and efficiently.”

Summary

Omnichannel is all about shifting from channel-centric to customer-centric – much more customer-centric than you are today. Knowing who your customers are and having a view of products and inventory across channels are the basic requirements to delivering exceptional customer experiences across channels and touch points.

This is not a project. A business transformation is required to empower people to deliver omnichannel customer experiences. The executive team needs to drive it and align compensation and incentives around it. A collaborative cross-functional approach is needed to achieve it.

Omnichannel depends on customer-facing teams such as marketing, sales and call centers to have access to a total customer relationship view based on clean, consistent and connected customer, product and inventory information. This is the basic foundation needed to deliver seamless, integrated and consistent customer experiences across channels and touch points and improve their effectiveness.

Posted in B2B, B2B Data Exchange, Business Impact / Benefits, Business/IT Collaboration, Data Services

You Can’t Improve What You Don’t Measure

Register for the Webinar on 19th March, 2015

80% of companies surveyed said that they offer superior customer service, but only 8% of their customers agreed with them. (Source: Bain & Company)

With numbers like that there is plenty of room to improve.  But improve what?

Traditionally, retailers have measured themselves against year-over-year sales increases for like stores, increased margins and lower operating costs. But retailing has changed: customers can interact and transact with you across multiple touch points along their path to purchase and beyond. Poor performance at any one of these interaction points could lose you a customer and damage your brand.

A better measure is to calculate the customer experience across the omni-channel landscape. This will provide better insight into how you are attracting and retaining customers, and how well you are serving them. However, many retailers lack the technology and processes to deliver on a plan to improve the omni-channel customer experience.

Once you have decided to do something, what are you going to measure? Time spent on the website versus sales? Speed to resolve problems in the contact center versus the number of repeat transactions from a customer? The number of touch points before purchase? And what about softer measures, like how well your staff interact with customers in-store or on social channels? How many “Pins” do you have, and how do you assign value to them?

Organizations need to account for churn, attrition, loyalty and lifetime value to be able to evaluate their performance from a holistic view of their customer, not just within the confines of their own operational silo.
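
Once the inputs are defined, the measures named above reduce to simple arithmetic. Here is a minimal sketch with made-up numbers; real models segment customers and discount future revenue, but the skeleton looks like this:

```python
def churn_rate(customers_start, customers_lost):
    """Share of customers lost over the period."""
    return customers_lost / customers_start

def lifetime_value(avg_order_value, purchases_per_year, churn):
    """Naive customer lifetime value: annual revenue times expected
    lifetime, where expected lifetime in years is 1 / churn rate."""
    return avg_order_value * purchases_per_year * (1 / churn)

churn = churn_rate(customers_start=1000, customers_lost=200)
clv = lifetime_value(avg_order_value=50.0, purchases_per_year=4, churn=churn)
# A 20% churn rate implies a five-year expected lifetime,
# so CLV = 50 * 4 * 5 = 1000.
```

Even this toy version makes the dependency visible: every point of churn you shave off directly extends expected lifetime and lifts lifetime value.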

In an upcoming webinar, Arkady Kleyner of Intricity will break apart the key components of the Omni-Channel Customer Experience calculation. Additionally, Arkady will identify the upstream components that keep this measure accurate and current.

Attend this webinar to learn:

  • The foundational calculations of Omni-Channel Customer Experience
  • Common customizations to fit different scenarios
  • Upstream components to keep the calculation current and accurate
Register here to receive a calendar invitation with the webinar details. Join us for a one-hour webinar and Q&A session. The event will occur March 19th at 2:00 PM EST.

Posted in Customer Acquisition & Retention, Customers, Retail

Startup Winners of the Informatica Data Mania Connect-a-Thon

Last week was Informatica’s first ever Data Mania event, held at the Contemporary Jewish Museum in San Francisco. We had an A-list lineup of speakers from leading cloud and data companies, such as Salesforce, Amazon Web Services (AWS), Tableau, Dun & Bradstreet, Marketo, AppDynamics, Birst, Adobe, and Qlik. The event and speakers covered a range of topics all related to data, including Big Data processing in the cloud, data-driven customer success, and cloud analytics.

While these companies are giants today in the world of cloud and have created their own unique ecosystems, we also wanted to take a peek at and hear from the leaders of tomorrow. Before startups can become market leaders in their own realm, they face the challenge of ramping up a stellar roster of customers so that they can get to subsequent rounds of venture funding. But what gets in their way are the numerous data integration challenges of onboarding customer data onto their software platform. When these challenges remain unaddressed, R&D resources are spent on professional services instead of building value-differentiating IP.  Bugs also continue to mount, and technical debt increases.

Enter the Informatica Cloud Connector SDK. Built entirely in Java and able to browse through any cloud application’s API, the Cloud Connector SDK parses the metadata behind each data object and presents it in the context of what a business user should see. We had four startups build a native connector to their application in less than two weeks: BigML, Databricks, FollowAnalytics, and ThoughtSpot. Let’s take a look at each one of them.
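
The SDK itself is Java and proprietary, but the metadata-driven idea can be sketched generically: introspect an application’s object catalog and surface the business-friendly field labels rather than the raw API names. Everything below (the catalog shape and field names) is invented for illustration:

```python
# Hypothetical metadata, as a connector might fetch from a cloud
# application's "describe" endpoint.
API_CATALOG = {
    "contact": {
        "fields": [
            {"api_name": "fname_c", "label": "First Name", "type": "string"},
            {"api_name": "churn_scr", "label": "Churn Score", "type": "float"},
        ]
    }
}

def describe_object(object_name):
    """Return (label, type) pairs a business user would recognise,
    hiding the raw API field names behind their display labels."""
    meta = API_CATALOG[object_name]
    return [(f["label"], f["type"]) for f in meta["fields"]]

fields = describe_object("contact")
# fields == [("First Name", "string"), ("Churn Score", "float")]
```

Because the connector reads this metadata at runtime, new fields added to the application show up automatically, which is what lets a startup build a connector in days rather than months.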

BigML

With predictive analytics becoming a growing imperative, machine-learning algorithms that can have a higher probability of prediction are also becoming increasingly important.  BigML provides an intuitive yet powerful machine-learning platform for actionable and consumable predictive analytics. Watch their demo on how they used Informatica Cloud’s Connector SDK to help them better predict customer churn.

Can’t play the video? Click here: http://youtu.be/lop7m9IH2aw

Databricks

Databricks was founded out of the UC Berkeley AMPLab by the creators of Apache Spark. Databricks Cloud is a hosted end-to-end data platform powered by Spark. It enables organizations to unlock the value of their data, seamlessly transitioning from data ingest through exploration and production. Watch their demo that showcases how the Informatica Cloud connector for Databricks Cloud was used to analyze lead contact rates in Salesforce, and also performing machine learning on a dataset built using either Scala or Python.

Can’t play the video? Click here: http://youtu.be/607ugvhzVnY

FollowAnalytics

With mobile usage growing by leaps and bounds, the area of customer engagement on a mobile app has become a fertile area for marketers. Marketers are charged with acquiring new customers, increasing customer loyalty and driving new revenue streams. But without the technological infrastructure to back them up, their efforts are in vain. FollowAnalytics is a mobile analytics and marketing automation platform for the enterprise that helps companies better understand audience engagement on their mobile apps. Watch this demo where FollowAnalytics first builds a completely native connector to its mobile analytics platform using the Informatica Cloud Connector SDK and then connects it to Microsoft Dynamics CRM Online using Informatica Cloud’s prebuilt connector for it. Then, see FollowAnalytics go one step further by performing even deeper analytics on their engagement data using Informatica Cloud’s prebuilt connector for Salesforce Wave Analytics Cloud.

Can’t play the video? Click here: http://youtu.be/E568vxZ2LAg

ThoughtSpot

Analytics has taken center stage this year due to the rise in cloud applications, but most of the existing BI tools out there still stick to the old way of doing BI. ThoughtSpot brings a consumer-like simplicity to the world of BI by allowing users to search for the information they’re looking for just as if they were using a search engine like Google. Watch this demo where ThoughtSpot uses Informatica Cloud’s vast library of over 100 native connectors to move data into the ThoughtSpot appliance.

Can’t play the video? Click here: http://youtu.be/6gJD6hRD9h4

Posted in B2B, Business Impact / Benefits, Cloud, Data Integration, Data Integration Platform, Data Privacy, Data Quality, Data Services, Data Transformation

Informatica joins new ServiceMax Marketplace – offers rapid, cost-effective integration with ERP and Cloud apps for Field Service Automation

Informatica Partners with ServiceMax

To deliver flawless field service, companies often require integration across multiple applications for various work processes. A good example is automatically ordering and shipping parts through an ERP system so they arrive ahead of a timely field service visit. Informatica has partnered with ServiceMax, the leading field service automation solution, and joined the new ServiceMax Marketplace to offer customers integration solutions for the many ERP and Cloud applications frequently involved in ServiceMax deployments. Comprising Cloud Integration Templates built on Informatica Cloud for frequent customer integration “patterns”, these solutions will speed the ServiceMax implementation cycle, contain its cost, and help customers realize the full potential of their field service initiatives.

Existing members of the ServiceMax Community can see a demo or take advantage of a free 30-day trial that provides the full capabilities of Informatica Cloud Integration for ServiceMax, with prebuilt connectors to hundreds of third-party systems including SAP, Oracle, Salesforce, NetSuite and Workday, powered by the Informatica Vibe virtual data machine for near-universal access to cloud and on-premise data. The Informatica Cloud Integration for ServiceMax solution:

  • Accelerates ERP integration by as much as 85% through prebuilt Cloud templates focused on key work processes and the objects common between systems
  • Synchronizes key master data such as Customer Master, Material Master, Sales Orders, Plant information, Stock history and others
  • Enables simplified implementation and customization through easy-to-use interfaces
  • Eliminates the need for IT intervention during configuration and deployment of ServiceMax integrations
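To make the “template” idea concrete, here is a minimal Python sketch of the kind of field mapping such integration templates encapsulate. The ERP field names and the target schema below are invented for illustration; they are not Informatica's actual template definitions:

```python
# Hypothetical sketch: translating an ERP customer-master record into a
# field-service system's account schema. Integration templates package
# mappings like this for common ERP/ServiceMax integration "patterns".
ERP_TO_FSM_CUSTOMER_MAP = {
    "KUNNR": "account_id",    # ERP customer number -> account id (assumed names)
    "NAME1": "account_name",  # customer name
    "ORT01": "city",          # city
    "PSTLZ": "postal_code",   # postal code
}

def map_record(erp_record: dict, field_map: dict) -> dict:
    """Re-key an ERP record into the target system's schema."""
    return {target: erp_record[source]
            for source, target in field_map.items()
            if source in erp_record}

erp_customer = {"KUNNR": "0000100042", "NAME1": "Acme Corp",
                "ORT01": "Denver", "PSTLZ": "80202"}
fsm_account = map_record(erp_customer, ERP_TO_FSM_CUSTOMER_MAP)
# fsm_account now carries the same data, keyed by the target schema
```

A real template would also handle the reverse direction (synchronizing Sales Orders, Stock history, and so on back to the ERP), but the core value is the same: the mapping is prebuilt, so customers configure rather than code.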

We look forward to working with ServiceMax through the ServiceMax Marketplace to help joint customers deliver Flawless Service!

Posted in 5 Sales Plays, Business Impact / Benefits, Cloud, Cloud Application Integration, Cloud Data Integration, Cloud Data Management, Data Integration, Data Integration Platform, Data Migration, Data Synchronization, Operational Efficiency, Professional Services, SaaS | Tagged , , , | Leave a comment

Data Mania- Using REST APIs and Native Connectors to Separate Yourself from the SaaS Pack


Data Mania: An Event for SaaS & ISV Leaders

With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.

Let’s unpack why.

To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it.  SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.

Why this is the case has more to do with needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.

SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.

Enter REST APIs

REST APIs are the perfect vehicle for today’s SaaS businesses and mash-up applications. Sure, they’re more loosely defined than SOAP, but when all you want to do is get and receive some data, now, in the context you need, nothing is easier or better for the job than a REST API.

With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
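To see how little ceremony that involves, here is a minimal sketch of a REST-style call using only Python's standard library. The endpoint and bearer token are placeholders, not a real API:

```python
import json
import urllib.request

# Hypothetical endpoint -- any REST-style resource URL works the same way.
req = urllib.request.Request(
    "https://api.example.com/v1/accounts/42",
    headers={"Accept": "application/json",
             "Authorization": "Bearer <token>"},  # placeholder credential
    method="GET",
)
# The entire "contract" is the URL, the verb, and a couple of headers --
# no WSDL, no envelope, no generated stubs on either side.
print(req.get_method(), req.full_url)

# The response is typically plain JSON, parsed in one line
# (a canned payload stands in for the network call here):
payload = json.loads('{"id": 42, "name": "Acme Corp", "status": "active"}')
print(payload["status"])
```

The equivalent SOAP exchange would require both parties to agree on a WSDL and build the envelope-handling machinery before the first call could even be validated.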

SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.

Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors caused by fat-finger data movement. That typically will give you the opposite result of stickiness, and is to be avoided at all costs.

While publishing an API as a way to get and receive data from other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.

The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud's SDKs are also richer than a raw REST API and make it possible to adapt that API into something any business user will want to use – which is a huge advantage.

In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, a structural change to a REST API would break every data flow that interacts with it directly until those flows were changed as well. Building a native connector makes you more agile and insulates your custom-built integrations from breaking.
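The insulating effect of that abstraction layer can be sketched in a few lines. This is a toy Python illustration, not the Informatica Cloud Connector SDK: downstream data flows consume a stable `Customer` object, while each connector version absorbs the API's changing field names:

```python
from dataclasses import dataclass

@dataclass
class Customer:
    """Stable application data object that data flows depend on."""
    customer_id: str
    name: str

class ConnectorV1:
    """Maps the original REST payload shape to Customer."""
    def to_customer(self, payload: dict) -> Customer:
        return Customer(payload["id"], payload["name"])

class ConnectorV2:
    """The API renamed its fields; only the connector changes."""
    def to_customer(self, payload: dict) -> Customer:
        return Customer(payload["customer_id"], payload["display_name"])

def load_customers(connector, payloads):
    # A "data flow" written once against the abstraction.
    return [connector.to_customer(p) for p in payloads]

old_api = [{"id": "42", "name": "Acme"}]
new_api = [{"customer_id": "42", "display_name": "Acme"}]
# Same downstream result either way: the flow never noticed the API change.
assert load_customers(ConnectorV1(), old_api) == load_customers(ConnectorV2(), new_api)
```

The design point is simply that the breaking change is contained in one adapter class instead of rippling through every integration that touches the API.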

Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.

Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.

The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.

For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.

Posted in B2B, B2B Data Exchange, Cloud, Cloud Application Integration, Cloud Data Integration, Data Integration, Data Services, SaaS | Tagged , , , , , , , | Leave a comment

Internet of Things (IoT) Changes the Data Integration Game in 2015


As reported by the Economic Times, “In the coming years, enormous volumes of machine-generated data from the Internet of Things (IoT) will emerge. If exploited properly, this data – often dubbed machine or sensor data, and often seen as the next evolution in Big Data – can fuel a wide range of data-driven business process improvements across numerous industries.”

We can all see this happening in our personal lives. Our thermostats are connected now, our cars have been for years, even my toothbrush has a Bluetooth connection with my phone. On the industrial side, devices have also been connected for years, producing megabytes of data per day that have typically been used for monitoring, with the data discarded as quickly as it appears.

So, what changed?  With the advent of big data, cheap cloud, and on-premise storage, we now have the ability to store machine or sensor data spinning out of industrial machines, airliners, health diagnostic devices, etc., and leverage that data for new and valuable uses.

For example, the ability to determine the likelihood that a jet engine will fail, based upon the sensor data gathered and how that data compares with known patterns of failure. Instead of getting an engine-failure light on the flight deck, the pilots can see that the engine has a 20 percent likelihood of failure, and get the engine serviced before it fails completely.
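As a toy illustration of the idea, a likelihood score might be computed by checking how closely current readings track a known failure signature. The sensor names, signature values, and tolerance below are all invented; real predictive-maintenance models are far richer:

```python
# Toy sketch: fraction of sensors whose current readings fall within
# `tolerance` of the values observed in past engine failures.
def failure_likelihood(readings, failure_signature, tolerance):
    matches = sum(
        1 for sensor, failed_value in failure_signature.items()
        if abs(readings.get(sensor, float("inf")) - failed_value) <= tolerance
    )
    return matches / len(failure_signature)

# Hypothetical signature distilled from historical failure data.
failure_signature = {"egt_c": 950.0, "vibration_mm_s": 7.5,
                     "oil_pressure_psi": 28.0, "n1_pct": 101.0,
                     "fuel_flow_kg_h": 3400.0}
# Current in-flight readings: only oil pressure is near the failure pattern.
current = {"egt_c": 905.0, "vibration_mm_s": 2.0,
           "oil_pressure_psi": 29.5, "n1_pct": 90.0,
           "fuel_flow_kg_h": 3000.0}
print(failure_likelihood(current, failure_signature, tolerance=5.0))  # prints 0.2
```

That 0.2 is exactly the kind of “20 percent likelihood of failure” figure the flight deck could surface, in place of a binary warning light.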

The problem with all of this very cool stuff is that we need to once again rethink data integration.  Indeed, if the data can’t get from the machine sensors to a persistent data store for analysis, then none of this has a chance of working.

That’s why those who are moving to IoT-based systems need to do two things. First, they must create a strategy for extracting data from devices, such as industrial robots or an Audi A8. Second, they need a strategy to take all of this disparate data that’s firing out of devices at megabytes per second, and put it where it needs to go, in the right native structure (or in an unstructured data lake), so it can be leveraged in useful ways, and in real time.

The challenge is that machines and devices are not traditional IT systems.  I’ve built connectors for industrial applications in my career.  The fact is, you need to adapt to the way that the machines and devices produce data, and not the other way around.  Data integration technology needs to adapt as well, making sure that it can deal with streaming and unstructured data, including many instances where the data needs to be processed in flight as it moves from the device, to the database.
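A minimal sketch of that in-flight processing, using Python generators so each record is parsed, filtered, and reshaped as it streams past rather than after it lands in a database. The record format here is invented for illustration:

```python
# Hypothetical raw feed: "device,metric,value" lines streaming off sensors.
def parse(lines):
    for line in lines:
        device, metric, value = line.split(",")
        yield {"device": device, "metric": metric, "value": float(value)}

def only_anomalies(records, limit):
    # Filter in flight: normal readings are dropped before storage,
    # instead of being persisted and discarded later.
    for record in records:
        if record["value"] > limit:
            yield record

raw_stream = ["robot-7,temp,81.2", "robot-7,temp,64.0", "robot-9,temp,95.5"]
alerts = list(only_anomalies(parse(raw_stream), limit=80.0))
print([r["device"] for r in alerts])  # only the out-of-range readings remain
```

Because each stage is a generator, nothing is buffered in bulk; the same pipeline shape applies whether the source is three test lines or a message stream firing at megabytes per second.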

This becomes a huge opportunity for data integration providers who understand the special needs of IoT, as well as the technology that those who build IoT-based systems can leverage.  However, the larger value is for those businesses that learn how to leverage IoT to provide better services to their customers by offering insights that have previously been impossible.  Be it jet engine reliability, the fuel efficiency of my car, or feedback to my physician from sensors on my body, this is game changing stuff.  At the heart of its ability to succeed is the ability to move data from place-to-place.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Services | Tagged , , , | 5 Comments

Analytics-Casual Versus Analytics-Driven


What does it take to be an analytics-driven business? That’s a question that requires a long answer. Recently, Gartner research director Lisa Kart took on this question, noting that the key to becoming an analytics-driven business is connecting the data that sits siloed across the organization.

So, the secret of becoming an analytics-driven business is to bust down the silos — easier said than done, of course. The good news, as Kart tells it, is that one doesn’t need to cast a wide net across the world in search of the right data for the right occasion. The biggest opportunities lie in connecting the data you already have, she says.

Taking Kart’s differentiation of just-using-analytics versus analytics-driven culture a step further, here is a brief rundown of how businesses just using analytics approach the challenge, versus their more enlightened counterparts:

Business just using analytics: Lots of data, but no one really understands how much is around, or what to do with it.

Analytics-driven business: The enterprise has a vision and strategy, supported from the top down, closely tied to the business strategy. Management also recognizes that existing data has great value to the business.

Business just using analytics: Every department does its own thing, with varying degrees of success.

Analytics-driven business: Makes connections between all the data – of all types — floating around the organization. For example, gets a cross-channel view of a customer by digging deeper and connecting the silos together to transform the data into something consumable.

Business just using analytics: Some people in marketing have been collecting customer data and making recommendations to their managers.

Analytics-driven business: Marketing departments, through analytics, engage and interact with customers, Kart says. An example would be creating high-end, in-store customer experiences that give customers greater intimacy and interaction.

Business just using analytics: The CFO’s staff crunches numbers within their BI tools and arrives at what-if scenarios.

Analytics-driven business: Operations and finance departments share online data to improve performance using analytics. For example, a company may tap into a variety of data, including satellite images, weather patterns, and other factors that may shape business conditions, Kart says.

Business just using analytics: Some quants in the organization pore over the data and crank out reports.

Analytics-driven business: Encourages maximum opportunities for innovation by putting analytics in the hands of all employees. Analytics-driven businesses recognize that more innovation comes from front-line decision-makers than the executive suite.

Business just using analytics: Decision makers put in report requests to IT for analysis.

Analytics-driven business: Decision makers can go to an online interface that enables them to build and display reports with a click (or two).

Business just using analytics: Analytics spits out standard bar charts, perhaps a scattergram.

Analytics-driven business: Decision makers can quickly visualize insights through 3D graphics, also reflecting real-time shifts.

Posted in B2B, B2B Data Exchange | Tagged , | 1 Comment

Big Data is Nice to Have, But Big Culture is What Delivers Success


Despite more than $30 billion in annual spending on Big Data, successful big data implementations elude most organizations. That’s the sobering assessment of a recent Capgemini study of 226 senior executives, which found that only 13 percent feel they have truly made any headway with their big data efforts.

The reasons for Big Data’s lackluster performance include the following:

  • Data is in silos or legacy systems, scattered across the enterprise
  • No convincing business case
  • Ineffective alignment of Big Data and analytics teams across the organization
  • Most data locked up in petrified, difficult to access legacy systems
  • Lack of Big Data and analytics skills

Actually, there is nothing new about any of these issues – in fact, the perceived issues with Big Data initiatives so far map closely with the failed expectations of many other technology-driven initiatives. First, there’s the hype that tends to get way ahead of any actual well-functioning case studies. Second, there’s the notion that managers can simply take a solution of impressive magnitude and drop it on top of their organizations, expecting overnight delivery of profits and enhanced competitiveness.

Technology, and Big Data itself, is but a tool that supports the vision, well-designed plans and hard work of forward-looking organizations. Those managers seeking transformative effects need to look deep inside their organizations, at how deeply innovation is allowed to flourish, and in turn, how their employees are allowed to flourish. Think about it: if line employees suddenly have access to alternative ways of doing things, would they be allowed to run with it? If someone discovers through Big Data that customers are using a product differently than intended, do they have the latitude to promote that new use? Or do they have to go through chains of approval?

Big Data may be what everybody is after, but Big Culture is the ultimate key to success.

For its part, Capgemini provides some high-level recommendations for better baking in transformative values as part of Big Data initiatives, based on their observations of best-in-class enterprises:

The vision thing: “It all starts with vision,” says Capgemini’s Ron Tolido. “If the company executive leadership does not actively, demonstrably embrace the power of technology and data as the driver of change and future performance, nothing digitally convincing will happen. We have not even found one single exception to this rule. The CIO may live and breathe Big Data and there may even be a separate Chief Data Officer appointed – expect more of these soon – if they fail to commit their board of executives to data as the engine of success, there will be a dark void beyond the proof of concept.”

Establish a well-defined organizational structure: “Big Data initiatives are rarely, if ever, division-centric,” the Capgemini report states. “They often cut across various departments in an organization. Organizations that have clear organizational structures for managing rollout can minimize the problems of having to engage multiple stakeholders.”

Adopt a systematic implementation approach:  Surprisingly, even the largest and most sophisticated organizations that do everything on process don’t necessarily approach Big Data this way, the report states. “Intuitively, it would seem that a systematic and structured approach should be the way to go in large-scale implementations. However, our survey shows that this philosophy and approach are rare. Seventy-four percent of organizations did not have well-defined criteria to identify, qualify and select Big Data use-cases. Sixty-seven percent of companies did not have clearly defined KPIs to assess initiatives. The lack of a systematic approach affects success rates.”

Adopt a “venture capitalist” approach to securing buy-in and funding: “The returns from investments in emerging digital technologies such as Big Data are often highly speculative, given the lack of historical benchmarks,” the Capgemini report points out. “Consequently, in many organizations, Big Data initiatives get stuck due to the lack of a clear and attributable business case.” To address this challenge, the report urges that Big Data leaders manage investments “by using a similar approach to venture capitalists. This involves making multiple small investments in a variety of proofs of concept, allowing rapid iteration, and then identifying PoCs that have potential and discarding those that do not.”

Leverage multiple channels to secure skills and capabilities: “The Big Data talent gap is something that organizations are increasingly coming face-to-face with. Closing this gap is a larger societal challenge. However, smart organizations realize that they need to adopt a multi-pronged strategy. They not only invest more in hiring and training, but also explore unconventional channels to source talent.” Capgemini advises reaching out to partner organizations for the skills needed to develop Big Data initiatives. These can be employee exchanges, or “setting up innovation labs in high-tech hubs such as Silicon Valley.” Startups may also be another source of Big Data talent.

Posted in B2B, B2B Data Exchange, Big Data, Business Impact / Benefits | Tagged , , | Leave a comment