Category Archives: B2B Data Exchange

What’s Next for the Cloud, Coming to a City Near You (If you’re in NYC or San Francisco)

Free Training Seminars for Cloud Application Owners and Salesforce Practitioners

March 20th, 2015 was the official start of spring and, to be honest, it couldn’t have come soon enough for us folks in the Northeast.  After a long, cold and snowy winter, we’re looking forward to the spring thaw and the first green shoots of burgeoning life. Spring is also the time when we like to tackle new projects and start afresh after our winter hibernation.

For those of us in technology, new spring projects often reflect the things we do in everyday life.  Naturally, our minds turn at this time of year to spring cleaning and spring training.  To be honest, we’d have to admit that we haven’t scrubbed our data in three months, so data cleansing is a must; but so is training.  We probably haven’t picked up a book or attended a seminar since last November.  But what training should we do?  And what should we do next?

Luckily, Informatica is providing the answer.  We’ve put together two free, half-day training seminars for cloud application owners and Salesforce practitioners.  That’s two dates, two fantastic locations and dozens of brilliant speakers lined up to give you some new pointers on what’s coming next in the world of cloud and SaaS.

The goals of the event are to give you the tools and knowledge to strengthen your Salesforce implementation and help you delight your customers.  The sessions will include presentations by experts from Salesforce and our partner Bluewolf.  There will also be some best practices presentations and demonstrations from Informatica’s team of very talented engineers.

Just glance at the seminar summary and you’ll see what we mean:

Session 1: Understand the Opportunity of Every Customer Interaction

In this session Eric Berridge, Co-founder and CEO of Bluewolf Inc. will discuss how you can develop a customer obsessed culture and get the most value from every customer interaction.

Session 2: Delight Your Customers by Taking Your Salesforce Implementation to the Next Level

Ajay Gandhi, Informatica’s VP of Product Marketing, is next up, and he’s going to provide a fabulous session on what to look out for, and where to invest, as your Salesforce footprint grows.

Session 3: Anticipate Your Business Needs With a Fresh Approach to Customer Analytics

The seminar wraps up with Benjamin Pruden, Sr. Manager of Product Marketing at Salesforce. Ben’s exciting session touches on one of the hottest topics in the industry today.  He’s going to explain how you can obtain a comprehensive understanding of your most valuable customers with cloud analytics and data-driven dashboards.

I’m sure you’ll agree that it’s a pretty impressive seminar and well worth a couple of hours of your time.

The New York event is happening at Convene (810 Seventh Ave, between 52nd and 53rd) on April 7th. Click here for more details and to reserve your seat.

The San Francisco event is a week later on April 14th at Hotel Nikko (222 Mason Street). Make sure you click here and register today.

Come join us on the 7th or the 14th to learn how to take your cloud business to the next level. Oh, and don’t forget that you’ll also be treating yourself to some well-deserved spring training!

Data Wizard Beta: Paving the Way for Next-Generation Data Loaders

The Data Wizard Changes the Landscape of What Traditional Data Loaders Can Do

The emergence of the business cloud is making the need for data ever more pressing. Whatever your business, if your role is in the sales, marketing or service departments, chances are your productivity depends a great deal on the ability to move data quickly in and out of Salesforce and its ecosystem of applications.

With built-in data transformation intelligence, the Data Wizard (click here to try the Beta version) changes the landscape of what traditional data loaders can do. The Data Wizard takes care of the following aspects, so that you don’t have to:

  1. Data Transformations: We built in over 300 standard data transformations so you don’t have to format the data before bringing it in (e.g., combining first and last names into full names, adding numeric columns for totals, splitting address fields into their separate components; see the sketch below).
  2. Built-in intelligence: We automate the mapping of data into Salesforce for a range of common use cases (e.g., automatically mapping matching fields, intelligently auto-generating date format conversions, concatenating multiple fields).
  3. App-to-app integration: We incorporated pre-built integration templates to encapsulate the logic required for integrating Salesforce with other applications (e.g., single-click update of customer addresses in a cloud ERP application based on Account addresses in Salesforce).
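
To make the first two items concrete, here’s a minimal sketch in plain Java of the kinds of transformations described above. This is an illustration only (it assumes a simple comma-separated address format), not the Data Wizard’s actual implementation:

    import java.util.Arrays;
    import java.util.List;

    public class TransformSketch {
        // Combine first and last names into a full name.
        static String fullName(String first, String last) {
            return (first + " " + last).trim();
        }

        // Split a one-line address into its components (street, city, state, zip),
        // assuming a comma-separated US-style format.
        static List<String> splitAddress(String address) {
            return Arrays.asList(address.split("\\s*,\\s*"));
        }

        public static void main(String[] args) {
            System.out.println(fullName("Ada", "Lovelace"));
            System.out.println(splitAddress("10 Main St, Springfield, IL, 62701"));
        }
    }

The point of the Data Wizard is that users never write code like this: the equivalent transformations are applied automatically as the data is loaded.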

Unlike the other data loading apps out there, the Data Wizard doesn’t presuppose any technical ability on the part of the user. It was purpose-built to meet the needs of every type of user, from the Salesforce administrator to the business analyst.

Despite the simplicity the Data Wizard offers, it is built on the robust Informatica Cloud integration platform, providing the same reliability and performance that is key to the success of Informatica Cloud’s enterprise customers, who integrate over 5 billion rows of data per day. We invite you to try the Data Wizard for free, and contribute to the Beta process by providing us with your feedback.

Reasons to Attend the Secure@Source Launch Event

Introducing Informatica Secure@Source

Security professionals are in dire need of a solution that provides visibility into where sensitive and confidential data resides, as well as visibility into the data’s risk.[1]  This knowledge would allow those responsible to take an effective, proactive approach to combating cybercrime. By focusing on the data, Informatica and our customers, partners and market ecosystem are collaborating to make data-centric security with Data Security Intelligence the next line of defense.

Security technologies that focus on securing the network and perimeter require additional safeguards when sensitive and confidential data traverse beyond these protective controls. Data proliferates to cloud-based applications and mobile devices. Application security and identity access management tools may lack visibility and granular control when data is replicated to Big Data and advanced analytics platforms.

Informatica is filling this need with its data-centric security portfolio, which now includes Secure@Source.  Informatica Secure@Source is the industry’s first data security intelligence solution that delivers insight into where sensitive and confidential data reside, as well as the data’s risk profile.

Join us at our online launch event on April 8th, where we will showcase Secure@Source and share reactions from an amazing panel of industry experts.

The opportunity for Data Security Intelligence is extensive.  In a recently published report, Neuralytix defined Data-Centric Security as “an approach to security that focuses on the data itself; to cover the gaps of traditional network, host and application security solutions.”  A critical element for successful data security is collecting intelligence required to prioritize where to focus security controls and efforts that mitigate risk. This is precisely what Informatica Secure@Source was designed to achieve.

Having emerged from a predominantly manual practice, the data security intelligence software market is expected to reach $800M by 2018, with a CAGR of 27.8%.  We are excited about this opportunity! As a leader in data management software, we are uniquely qualified to take an active role in shaping this emerging market category.

Informatica Secure@Source addresses the need to get smarter about where our sensitive and private data reside and who is accessing them, to prioritize which controls to implement, and to work harmoniously with existing security architectures, policies and procedures. Our customers are asking us for data security intelligence, and the industry deserves it.  With more than 60% of security professionals stating that their biggest challenge is not knowing where their sensitive and confidential data reside, the need for Data Security Intelligence has never been greater.

Neuralytix says “data security is about protecting individual data objects that traverse across networks, in and out of a public or private cloud, from source applications to targets such as partner systems, to back office SaaS applications to data warehouses and analytics platforms”.  We couldn’t agree more.  We believe that the best way to incorporate a data-centric security approach is to begin with data security intelligence.

JOIN US at the online launch event on April 8th for the security industry’s most exciting new Data Security Intelligence solution, Informatica Secure@Source.

[1] “The State of Data Centric Security,” Ponemon Institute, sponsored by Informatica, June 2014

Why Data Integration is Exploding Right Now

Mashing Up Our Business Data with External Data Sources Makes Our Data Even More Valuable.

In case you haven’t noticed, data integration is all the rage right now.  Why?  There are three major reasons for this trend, which we’ll explore below.  But first, consider that a recent USA Today story described corporate data as a much more valuable asset than it was just a few years ago.  Moreover, the sheer volume of data is exploding.

For instance, research company IDC estimated in a published report that the total amount of data created or replicated worldwide in 2012 would add up to 2.8 zettabytes (ZB).  By 2020, IDC expects the annual data-creation total to reach 40 ZB, a 50-fold increase from where things stood at the start of 2010.

But the growth of data is only a part of the story.  Indeed, I see three things happening that drive interest in data integration.

First, the growth of cloud computing.  The rise of data integration alongside cloud computing is logical, considering that we’re relocating data to public clouds, and that data must be synced with systems that remain on-premises.

Data integration providers, such as Informatica, have stepped up.  They provide data integration technology that spans enterprises, managed service providers and clouds, dealing with the special needs of cloud-based systems.  At the same time, data integration improves the way we do data governance and data quality.

Second, the growth of big data.  A recent IDC forecast shows that the big data technology and services market will grow at a 26.4% compound annual growth rate to $41.5 billion through 2018, or, about six times the growth rate of the overall information technology market. Additionally, by 2020, IDC believes that line of business buyers will help drive analytics beyond its historical sweet spot of relational to the double-digit growth rates of real-time intelligence and exploration/discovery of the unstructured worlds.

The world of big data revolves around data integration.  The more that enterprises rely on big data, and the more that data needs to move from place to place, the more a core data integration strategy and technology is needed.  That means you can’t talk about big data without talking about big data integration.

Data integration technology providers have responded with technology that keeps up with the volume of data that moves from place to place.  As with the growth of cloud computing above, providers also build technology with the understanding that data now moves within enterprises, between enterprises and clouds, and even from cloud to cloud.  Finally, data integration providers know how to deal with both structured and unstructured data these days.

Third, a better understanding of the value of information.  Enterprise managers have always known their data was valuable, but perhaps they did not understand the true value it can bring.

With the growth of big data, we now have access to information that helps us drive our business in the right directions.  Predictive analytics, for instance, allows us to take years of historical data and determine patterns that allow us to predict the future.  Mashing up our business data with external data sources makes our data even more valuable.

Of course, data integration drives much of this growth.  Thus the renewed focus on data integration approaches and technology.  There are years and years of evolution still ahead of us, and much to be learned from the data we maintain.

Why “Gut Instincts” Needs to be Brought Back into Data Analytics

Last fall, at a large industry conference, I had the opportunity to conduct a series of discussions with industry leaders in a portable video studio set up in the middle of the conference floor. As part of our exercise, we had a visual artist do freeform storyboarding of the discussion on large swaths of five-foot by five-foot paper, which we then reviewed at the end of the session. For example, in a discussion of cloud computing, the artist drew a rendering of clouds, raining data on a landscape below, illustrated by sketches of office buildings. At a glance, one could get a good read of where the discussion went, and the points that were being made.

Data visualization is one of those up-and-coming areas that has just begun to break into the technology mainstream. There are some powerful front-end tools that help users see, at a glance, trends and outliers through graphical representations – be they scattergrams, histograms, 3D diagrams or something else eye-catching.  The “infographic” that has become so popular in recent years is an amalgamation of data visualization and storytelling. The bottom line is that technology is making it possible to generate these representations almost instantly, enabling relatively quick understanding of what the data may be saying.

The power that data visualization is bringing organizations was recently explored by Benedict Carey in The New York Times, who discussed how data visualization is emerging as the natural solution to “big data overload.”

This is much more than a front-end technology fix, however. Rather, Carey cites a growing body of knowledge emphasizing the development of “perceptual learning,” in which people working with large data sets learn to “see” patterns and interesting variations in the information they are exploring. It’s almost a return of the “gut” feel for answers, but developed for the big data era.

As Carey explains it:

“Scientists working in a little-known branch of psychology called perceptual learning have shown that it is possible to fast-forward a person’s gut instincts both in physical fields, like flying an airplane, and more academic ones, like deciphering advanced chemical notation. The idea is to train specific visual skills, usually with computer-game-like modules that require split-second decisions. Over time, a person develops a ‘good eye’ for the material, and with it an ability to extract meaningful patterns instantaneously.”

Video games may be leading the way in this – Carey cites the work of Dr. Philip Kellman, who developed a video-game-like approach to training pilots to instantly “read” instrument panels as a whole, versus pondering every gauge and dial. He reportedly was able to enable pilots to absorb within one hour what normally took 1,000 hours of training. Such perceptual-learning based training is now employed in medical schools to help prospective doctors become familiar with complicated procedures.

There are interesting applications for business, bringing together a range of talent to help decision-makers better understand the information they are looking at. In Carey’s article, an artist was brought into a medical research center to help scientists look at data in many different ways – to get out of their comfort zones. For businesses, it means getting away from staring at bars and graphs on their screens and perhaps turning data upside down or inside-out to get a different picture.

Becoming Analytics-Driven Requires a Cultural Shift, But It’s Doable

For those hoping to push through a hard-hitting analytics effort that will serve as a beacon of light within an otherwise calcified organization, you probably have your work cut out for you. Evolving into an organization that fully grasps the power and opportunities of data analytics requires cultural change, and this is a challenge organizations have only begun to grasp.

“Sitting down with pizza and coffee can get you around most of the technical challenges,” explained Sam Ransbotham, Ph.D., associate professor at Boston College, at a recent panel webcast hosted by MIT Sloan Management Review, “but the cultural problems are much larger.”

That’s one of the key takeaways from the panel, in which Ransbotham was joined by Tuck Rickards, head of the digital transformation practice at Russell Reynolds Associates, a digital recruiting firm, and Denis Arnaud, senior data scientist at Amadeus Travel Intelligence. The panel, which examined the impact of corporate culture on data analytics, was led by Michael Fitzgerald, contributing editor at MIT Sloan Management Review.

The path to becoming an analytics-driven company is a journey that requires transformation across most or all departments, the panelists agreed. “It’s fundamentally different to be a data-driven decision company than kind of a gut-feel decision-making company,” said Rickards. “Acquiring this capability to do things differently usually requires a massive culture shift.”

That’s because the cultural aspects of the organization – “the values, the behaviors, the decision making norms and the outcomes go hand in hand with data analytics,” said Ransbotham. “It doesn’t do any good to have a whole bunch of data processes if your company doesn’t have the culture to act on them and do something with them.” Rickards adds that bringing this all together requires an agile, open source mindset, with frequent, open communication across the organization.

So how does one go about building and promoting a culture that is conducive to getting the maximum benefit from data analytics? The most important piece is bringing in people who are aware of and skilled in analytics – both from within the enterprise and from outside, the panelists urged. Ransbotham points out that it may seem daunting, but it’s not. “This is not some gee-whizz thing,” he said. “We have to get rid of this mindset that these things are impossible. Everybody who has figured it out has figured it out somehow. We’re a lot more able to pick up on these things than we think — the technology is getting easier, it doesn’t require quite as much as it used to.”

The key to evolving a corporate culture toward being analytics-driven is to identify or recruit enlightened and skilled individuals who can provide the vision and build a collaborative environment. “The most challenging part is looking for someone who can see the business more broadly, and can interface with the various business functions – ideally, someone who can manage change and transformation throughout the organization,” Rickards said.

Arnaud described how his organization – an online travel service – went about building an esprit de corps between data analytics staff and business staff to ensure the success of the company’s analytics efforts. “Every month all the teams would do a hands-on workshop, together in some place in Europe [Amadeus is headquartered in Madrid, Spain].” For example, a workshop might focus on a market analysis for a specific customer, and the participants would explore the entire end-to-end process for working with the customer, “from the data collection all the way through to data acquisition, data crunching and so on. The one knowing the data analysis techniques would explain them, and the one knowing the business would explain that, and so on.” As a result of these monthly workshops, business and analytics team members have found it “much easier to collaborate,” he added.

Web-oriented companies such as Amadeus – or Amazon and eBay for that matter — may be paving the way with analytics-driven operations, but companies in most other industries are not at this stage yet, both Rickards and Ransbotham point out. The more advanced web companies have built “an end-to-end supply chain, wrapped around customer interaction,” said Rickards. “If you think of most traditional businesses, financial services or automotive or healthcare are a million miles away from that. It starts with having analytic capabilities, but it’s a real journey to take that capability across the company.”

The analytics-driven business of the near future – regardless of industry – is likely to be staffed with roles not seen today. “If you are looking to re-architect the business, you may be imagining roles that you don’t have in the company today,” said Rickards. Along with chief analytics officers, data scientists and data analysts, there will be many new roles created. “If you are on the analytics side of this, you can be in an analytics group or a marketing group, with more of a CRM or customer insights title. You can be in planning or business functions. In a similar way, on the technology side, there are people very focused on architecture and security.”

Ultimately, the demand will be for leaders and professionals who understand both the business and technology sides of the opportunity, Rickards continued. “You can have good people building a platform, and you can have good data scientists,” he added. “But you better have someone on the top of that organization knowing the business purpose.”

IDMP Field Notes: Compliance Trends in Q1 2015

With the European Medicines Agency’s (EMA) date for compliance with IDMP (Identification of Medicinal Products) looming, Q1 2015 has seen a significant increase in IDMP activity.  Both Informatica and HighPoint Solutions’ IDMP Round Table in January and a February Marcus Evans conference in Berlin provided excellent forums for sharing progress, thoughts and strategies.  Additional confidential conversations with pharmaceutical companies show an increase in the number of approved and active projects, although some are still seeking full funding.  The following paragraphs sum up the activity and trends that I have witnessed in the first three months of the year.

I’ll start with my favourite quote, which is from Dr. Jörg Stüben of Boehringer Ingelheim, who asked:

“Isn’t part of compliance being in control of your data?” 

I like it because, to me, it strikes just the right balance between stating the obvious and questioning the way the majority of pharmaceutical companies approach compliance: as a report that has to be created and submitted.  If a company is in control of its data, regulatory compliance would be easier and come at a lower cost.  More importantly, the company itself would benefit from easy access to high-quality data.

Dr. Stüben’s question was raised during his excellent presentation at the Marcus Evans conference.  Not only did he question the status quo, but he proposed an alternative route to IDMP compliance: let Boehringer benefit from its investment in IDMP compliance.   His approach can be summarised as follows:

  • Embrace a holistic approach to being in control of data, i.e. adopt data governance practices.
  • This is not about just compliance. Include optional attributes that will deliver value to the organisation if correctly managed.
  • Get started by creating simple, clear work packages.

Although Dr Stüben did not outline his technical solution, it would include data quality tools and a product data hub.

At the same conference, Stefan Fischer Rivera & Stefan Brügger of Bayer and Guido Claes from Janssen Pharmaceuticals both came out strongly in favour of using a Master Data Management (MDM) approach to achieving compliance.  Both companies have MDM technology and processes within their organisations, and realise the value an MDM approach can bring to achieving compliance in terms of data management and governance.  Having Mr Claes express how well Informatica’s MDM and Data Quality solutions support his existing substance data management program made his presentation even more enjoyable for me.

Whilst the exact approaches of Bayer and Janssen differed, there were some common themes:

  • Consider both the short term (compliance) and the long term (data governance) in the strategy
  • Centralised MDM is ideal, but a federated approach is practical for July 2016
  • High quality data should be available to a wide audience outside of IDMP compliance

The first and third bullet points map very closely to Dr. Stüben’s key points, and in fact show a clear trend in 2015:

IDMP Compliance is an opportunity to invest in your data management solutions and processes for the benefit of the entire organisation.

Although the EMA was not represented at the conference, Andrew Marr presented their approach to IDMP, and master data in general.  The EMA is undergoing a system re-organisation to focus on managing Substance, Product, Organisation and Reference data centrally, rather than within each regulation or program as it is today.  MDM will play a key role in managing this data, setting a high standard of data control and management for regulatory purposes.  It appears that the EMA is also using IDMP to introduce better data management practice.

Depending on the size of the company, and the skills & tools available, other, non-MDM approaches have been presented or discussed during the first part of 2015.  These include using XML and SharePoint to manage product data.  However, I share a concern voiced by others in the industry about this approach:  how well can you manage and control change using these tools?  Some pharmaceutical companies have openly stated that data contributors often spend more time looking for data than doing their own jobs.  An XML/SharePoint approach will do little to ease this burden, but an MDM approach will.

Despite these other approaches and solutions being explored, there is another clear trend in Q1 2015:

MDM is becoming a favoured approach for IDMP compliance due to its strong governance, centralised attribute-level data management and ability to track changes.

Interestingly, the opportunity to invest in data management, and the rise of MDM as a favoured approach, has been backed up by research from Gens Associates.  Messrs Gens and Brolund found a rapid increase during 2014 in investment in what they term Information Architecture, in which MDM plays a key role.  IDMP is seen as a major driver for this investment.  They go on to state that investment in master data management programs will allow a much easier and more cost-effective approach to data exchange (internally and externally), resulting in substantial benefits.  Unfortunately they do not elaborate on these benefits, but I have placed a summary of the benefits of using MDM for IDMP compliance here.

In terms of active projects, the common compliance activities I have seen in the first quarter of 2015 are as follows:

  • Most companies are in the discovery phase: identifying the effort for compliance
  • Some are starting to make technology choices, and have submitted RFPs/RFQs
    • Those furthest along in technology already have MDM programs or initiatives underway
  • Despite getting a start, some are still lacking enough funding for achieving compliance
    • Output from the discovery phase will in some cases be used to request full funding
  • A significant number of projects have a goal to implement better data management practice throughout the company, with IDMP as the first release.

A final trend I have noticed in 2015 concerns the magnitude of the compliance task ahead:

Those who have made the most progress are those who are most concerned about achieving compliance on time. 

The implication is that the companies who are starting late do not yet realise the magnitude of the task ahead.  It is not yet too late to comply and achieve long term benefits through better data management, despite only 15 months before the initial EMA deadline.  Informatica has customers who have implemented MDM within 6 months.  15 months is achievable provided the project (or program) gets the focus and resources required.

IDMP compliance is a common challenge to all those in the pharmaceutical industry.  Learning from others will help avoid common mistakes and provide tips on important topics.  For example, how to secure funding and support from senior management is a common concern among those tasked with compliance.  In order to encourage learning and networking, Informatica and HighPoint Solutions will be hosting our third IDMP roundtable in London on May 13th.  Please do join us to share your experiences, and learn from the experiences of others.

The CMO’s Role in Delivering Omnichannel Customer Experiences

This article was originally posted on Argyle CMO Journal and is re-posted here with permission.

According to a new global study from SDL, 90% of consumers expect a consistent customer experience across channels and devices when they interact with brands. However, according to these survey results, Gartner Survey Finds Importance of Customer Experience on the Rise — Marketing is on the Hook, fewer than half of the companies surveyed rank their customer experience as exceptional today. The good news is that two-thirds expect it to be exceptional in two years. In fact, 89% plan to compete primarily on the basis of the customer experience by 2016.

So, what role do CMOs play in delivering omnichannel customer experiences?

According to a recent report, Gartner’s Executive Summary for Leadership Accountability and Credibility within the C-Suite, a high percentage of CEOs expect CMOs to lead the integrated cross-functional customer experience. Also, customer experience is one of the top three areas of investment for CMOs in the next two years.

I had the pleasure of participating on a panel discussion at the Argyle CMO Forum in Dallas a few months ago. It focused on the emergence of omnichannel and the need to deliver seamless, integrated and consistent customer experiences across channels.

Lisa Zoellner, Chief Marketing Officer of Golfsmith International, was the dynamic moderator, keeping the conversation lively and the audience engaged. I was a panelist alongside executives from The Home Depot, CitiMortgage and Hyatt Hotels & Resorts.

Below are some highlights from the panel.

Lisa Zoellner, CMO, Golfsmith International opened the panel with a statistic. “Fifty-five percent of marketers surveyed feel they are playing catch up to customer expectations. But in that gap is a big opportunity.”

What is your definition of omnichannel?

There was consensus among the group that omnichannel is about seeing your business through the eyes of your customer and delivering seamless, integrated and consistent customer experiences across channels.

Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences. It’s one brand to the customer. But there is a gap between customer expectations and what most businesses can deliver today.

In fact, executives at most organizations I’ve spoken with, including the panelists, believe they are in the very beginning stages of their journey towards delivering omnichannel customer experiences. The majority are still struggling to get a single view of customers, products and inventory across channels.

“Customers don’t think in terms of channels and touch points; they just expect seamless, integrated and consistent customer experiences.”

What are some of the core challenges standing in your way?

A key takeaway was that omnichannel requires organizations to fundamentally change how they do business. In particular, it requires changing existing business practices and processes. It cannot be done without cross-functional collaboration.

I think Chris Berg, VP, Store Operations at The Home Depot said it well, “One of the core challenges is the annual capital allocation cycle, which makes it difficult for organizations to be nimble. Most companies set strategies and commitments 12-24 months out and approach these strategies in silos. Marketing, operations, and merchandising teams typically ask for capital separately. Rarely does this process start with asking the question, ‘What is the core strategy we want to align ourselves around over the next 24 months?’ If you begin there and make a single capital allocation request to pursue that strategy, you remove one of the largest obstacles standing in the way.”

Chip Burgard, Senior Vice President of Marketing at CitiMortgage, focused on two big barriers. “The first one is a systems barrier. I know a lot of companies struggle with this problem. We’re operating with a channel-centric rather than a customer-centric view. Now that we need to deliver omnichannel customer experiences, we realize we’re not as customer-centric as we thought we were. We need to understand what products our customers have across lines of business such as credit cards, banking, investments and mortgage. But our systems weren’t providing a total customer relationship view across products and channels. Now, we’re making progress on that. The second barrier is compensation. We have a commission-based sales force. How do you compensate the loan officers if a customer starts the transaction with the call center but completes it in the branch? That’s another issue we’re working on.”

Lisa Zoellner, CMO at Golfsmith International added, “I agree that compensation is a big barrier. Companies need to rethink their compensation plans. The sticky question is ‘Who gets credit for the sale?’ It’s easy to say that you’re channel-agnostic, but when someone’s paycheck is tied to the performance of a particular channel, it makes it difficult to drive that type of culture change.”

“We have a complicated business. More than 500 Hyatt hotels and resorts span multiple brands and regions,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “But, customers want a seamless experience no matter where they travel. They expect that the preference they shared during their Hyatt stay at a hotel in Singapore is understood by the person working at the next hotel in Dallas. So, we’re bridging those traditional silos all the way down to the hotel. A guest doesn’t care if the person they’re interacting with is from the building engineering department, from the food and beverage department, or the rooms department. It’s all part of the same customer experience. So we’re looking at how we share the information that’s important to guests to keep the customer the focus of our operations.”

“We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

How are companies powering great customer experiences with great customer data?

Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts, said, “We’re going through a transformation to unleash our colleagues to deliver great customer experiences at every stage of the guest journey. Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”

Hyatt connects the fragmented customer data from numerous applications including sales, marketing, ecommerce, customer service and finance. They bring the core customer profiles together into a single, trusted location, where they are continually managed. Now their customer profiles are clean, de-duplicated, enriched, and validated. They can see the members of a household as well as the connections between corporate hierarchies. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively and hotel teams can extend simple, meaningful gestures that drive guest loyalty.
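
To get a feel for the de-duplication step alone, here is a toy sketch in Java. It is illustrative only, assuming a naive match key of normalized name plus email; real MDM matching relies on far more sophisticated, probabilistic techniques:

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class DedupeSketch {
        record Profile(String name, String email) {}

        // Group profiles by a normalized match key; each group is one candidate guest.
        static Map<String, List<Profile>> dedupe(List<Profile> profiles) {
            Map<String, List<Profile>> groups = new LinkedHashMap<>();
            for (Profile p : profiles) {
                String key = p.name().toLowerCase().replaceAll("\\s+", " ").trim()
                        + "|" + p.email().toLowerCase().trim();
                groups.computeIfAbsent(key, k -> new ArrayList<>()).add(p);
            }
            return groups;
        }

        public static void main(String[] args) {
            List<Profile> records = List.of(
                    new Profile("Chris  Brogan", "chris@example.com"),
                    new Profile("chris brogan", "CHRIS@example.com"));
            System.out.println(dedupe(records).size()); // 1 candidate guest from 2 records
        }
    }

Survivorship rules, probabilistic matching and household detection are what separate a real MDM platform from a sketch like this, but the basic shape is the same: collapse many source records into one trusted profile.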

When he first joined Hyatt, Chris did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay. Mostly just sleep now,” he joked. Those guest profiles have now been successfully consolidated.

This solid customer data foundation means Hyatt colleagues can more easily personalize a guest’s experience. For example, colleagues at the front desk are now able to use the limited check-in time to congratulate a new Diamond member on just achieving the highest loyalty program tier or offer a better room to those guests most likely to take them up on the offer and appreciate it.

According to Chris, “Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is not in order.”

How are you shifting from channel-centric to customer-centric?

Chip Burgard, SVP of Marketing at CitiMortgage answered, “In the beginning of our omnichannel journey, we were trying to allow customer choice through multi-channel. Our whole organization was designed around people managing different channels. But, we quickly realized that allowing separate experiences that a customer can choose from is not being customer-centric.

Now we have new sales leadership that understands the importance of delivering seamless, integrated and consistent customer experiences across channels. And they are changing incentives to drive that customer-centric behavior. We’re no longer holding people accountable specifically for activity in their channels. We’re working together collectively to meet our customers’ needs across the channels they are using to engage with us.”

Chris Berg, VP of Store Operations at The Home Depot, explained, “For us, it’s about transitioning from a store-centric to customer-centric approach. It’s a cultural change. The managers of our 2,000 stores have traditionally been compensated based on their own store’s performance. But we are one brand. For example in the future, a store may be fulfilling an order, however because of the geography of where the order originated they may not receive credit for the sale. We’re in the process of working through how to better reward that collaboration. Also, we’re making investments in our systems so they support an omnichannel, or what we call interconnected, business. We have 40,000 products in store and over 1,000,000 products online. Now that we’re on the interconnected journey, we’re rethinking how we manage our product information so we can better manage inventory across channels more effectively and efficiently.”

Summary

Omnichannel is all about shifting from channel-centric to customer-centric – much more customer-centric than you are today. Knowing who your customers are and having a view of products and inventory across channels are the basic requirements for delivering exceptional customer experiences across channels and touch points.

This is not a project. A business transformation is required to empower people to deliver omnichannel customer experiences. The executive team needs to drive it and align compensation and incentives around it. A collaborative cross-functional approach is needed to achieve it.

Omnichannel depends on customer-facing teams such as marketing, sales and call centers to have access to a total customer relationship view based on clean, consistent and connected customer, product and inventory information. This is the basic foundation needed to deliver seamless, integrated and consistent customer experiences across channels and touch points and improve their effectiveness.

Banks and the Art of the Possible: Disruptors are Re-shaping Banking

The problem many banks encounter today is that they have vast sums of investment tied up in old ways of doing things. Historically, customers chose a bank and remained ‘loyal’ throughout their lifetime; now competition is rife and loyalty is becoming a thing of the past. In order to stay ahead of the competition, and to gain and keep customers, banks need to understand the ever-evolving market, disrupt norms and continue to delight customers. The tradition of staying with one bank out of family convention or sheer ease has been replaced by a more informed customer who understands the variety of choice at their fingertips.

Challenger banks don’t build on ideas of tradition and legacy, looking for small adjustments to make; they embrace change. Longer-established banks can’t afford to do nothing and assume their size and stature will attract customers.

Here’s some useful information

Accenture’s recent report, The Bank of Things, succinctly explains what ‘Customer 3.0’ is all about. The connected customer isn’t necessarily younger; it’s everybody. Banks can get to know their customers better by making better use of information. It all depends on using intelligent data rather than all data. Interrogating the wrong data is time-consuming and costly, and results in little actionable information.

When an organisation sets out with the intention of knowing its customers, it can calibrate its data according to where the gold nuggets – the real business insights – come from. What do people do most? Where do they go most? Now that they’re using branches and phone banking less and less, what do they look for in a mobile app?

Customer 3.0 wants to know what the bank can offer them all the time, on the move, on their own device. They want offers designed for their lifestyle. Correctly deciphered data can drive the level of customer segmentation that empowers such marketing initiatives. This means an organisation has to have the ability and the agility to move with its customers. It’s a journey that never ends: technology will never have a cut-off point, just as customer expectations will never stop evolving.

It’s time for banks to re-shape banking

Informatica have been working with major retail banks globally to redefine banking excellence and realign operations to deliver it. We always start by asking our customers a revealing question: “Have you looked at the art of the possible to future-proof your business over the next five to ten years and beyond?” This is where the discussion begins to explore really interesting notions about unlocking potential. No bank can afford to ignore them.

Data Mania: Using REST APIs and Native Connectors to Separate Yourself from the SaaS Pack

Data Mania: An Event for SaaS & ISV Leaders

With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.

Let’s unpack why.

To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it.  SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.

Why this is the case has more to do with needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.

SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.

Enter REST APIs

REST APIs are the perfect vehicle for today’s SaaS businesses and mash-up applications. Sure, they’re more loosely defined than SOAP, but when all you want to do is get and receive some data, now, in the context you need, nothing is easier or better for the job than a REST API.

With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
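
To see just how light that structure is, here’s a hedged illustration in Java, using the standard java.net.http client against a hypothetical endpoint (not any particular vendor’s API):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RestCallSketch {
        public static void main(String[] args) throws Exception {
            // A placeholder endpoint and token; real SaaS APIs follow the same shape.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://api.example.com/v1/accounts?limit=10"))
                    .header("Authorization", "Bearer <access-token>")
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            // The response is plain JSON text: no WSDL, no envelope, no generated stubs.
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }

The equivalent SOAP exchange would require a WSDL, an envelope and client stubs agreed on by both sides before the first call could even be validated.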

SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.

Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors caused by fat-finger data movement. That typically will give you the opposite result of stickiness, and is to be avoided at all costs.

While publishing an API as a way to get and receive data from other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.

The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud’s SDKs are also richer, making it possible for you to adapt the REST API into something any business user will want to use – which is a huge advantage.

In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, direct interaction with a REST API that’s been structurally changed would be impossible without also changing the data flows that depend on it. Building a native connector makes you more agile, and insulates your custom-built integration against breaking.
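
To illustrate that layer of abstraction, here is a sketch of the general idea (not the actual Informatica Cloud Connector SDK interface): a connector exposes named application data objects and keeps the raw API calls behind them.

    import java.util.List;
    import java.util.Map;

    // Illustrative connector-style abstraction (not the real SDK).
    // Data flows talk to named objects such as "Account" or "Contact";
    // the raw REST calls live behind this interface, so a structural
    // change to the API is absorbed in one place.
    public interface AppConnector {
        List<Map<String, Object>> read(String objectName, String filter);
        void write(String objectName, List<Map<String, Object>> records);
    }

A data flow written against read("Account", …) keeps working when the underlying endpoint or payload changes; only the connector’s implementation needs updating.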

Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.

Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.

The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.

For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.
