In 2014, Informatica Cloud focused a great deal of attention on the needs and challenges of the citizen integrator. These are the critical business users at the core of every company: The customer-facing sales rep at the front, as well as the tireless admin at the back. We all know and rely on these men and women. And up until very recently, they’ve been almost entirely reliant on IT for the integration tasks and processes needed to be successful at their jobs.
A lot of that has changed over the last year or so. In a succession of releases, we provided these business users with the tools to take matters into their own hands. And with the assistance of key ecosystem partners, such as Salesforce, SAP, Amazon, Workday, NetSuite and the hundreds of application developers that orbit them, we’ve made great progress toward giving business users the self-sufficiency they need and demand. But beyond giving these users the tools to integrate and connect with their apps and information at will, what we’ve really done is give them the ability to focus their attention and efforts on their most valuable customers. In doing so, we have gotten to the core of the real purpose and importance of the whole cloud project or enterprise: the customer relationship.
In a recent Fortune interview, Salesforce CEO and cloud evangelist Marc Benioff echoed that idea when he stated that “The CEO is now in charge of the customer relationship.” What he meant by that is companies now have the ability to tie all aspects of their marketing – website, customer service, email marketing, social, sales, etc. – into “one canonical file” with all the respective customer information. By organizing the enterprise around the customer this way, the company can then pivot all of their efforts toward the customer relationship, which is what is required if a business is going to have and sustain success as we move through the 2010s and beyond.
We are in complete agreement with Marc and think it wouldn’t be too much of a stretch to declare 2015 as the year of the customer relationship. In fact, helping companies and business users focus their attention toward the customer has been a core focus of ours for some time. For an example, you don’t have to look much further than the latest iteration of our real-time application integration capability.
In a short video demo that I recommend to everyone, my colleague Eric does a fantastic job of walking users through the real-time features available through the Informatica Cloud platform.
As the demo demonstrates, the real-time features let you build a workflow process application that interacts with data from cloud and on-premise sources right from the Salesforce user interface (UI). It’s quick and easy, thus allowing you to devote more time to your customers and less time on “plumbing.”
The workflows themselves are created with the help of a drag-and-drop process designer that enables the user to quickly create a new process and configure the parameters, inputs and outputs, and decision steps with the click of a few buttons.
Once the process guide is created, it displays as a window embedded right in the Salesforce UI. So if, for example, you’ve created an opportunity-to-order guide, you can follow a wizard-driven process that walks your users from new opportunity creation through to the order confirmation, and everything in between.
As users move through the process, they can interact in real time with data from any on-premise or cloud-based source they choose. In the example from the video, the user, Eric, chooses a likely prospect from a list of company contacts, and with a few keystrokes creates a new opportunity in Salesforce. In a further demonstration of the real-time capability, Eric performs a NetSuite query, logs a client call, escalates a case to customer service, pulls the latest price book information from an Oracle database, builds out the opportunity items, creates the order in SAP, and syncs it all back to Salesforce, all without leaving the wizard interface.
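To make that sequence concrete, here is a rough sketch of the kind of cross-system orchestration the guide performs behind the scenes. The client objects and method names (salesforce, netsuite, oracle_db, sap and their calls) are hypothetical stand-ins for illustration, not actual Informatica Cloud or vendor APIs.

```python
# Illustrative only: a sketch of an opportunity-to-order flow across systems.
# Every client and method name below is a hypothetical stand-in.

def opportunity_to_order(contact_id, salesforce, netsuite, oracle_db, sap):
    # Look up the prospect and create the opportunity in Salesforce
    contact = salesforce.get_contact(contact_id)
    opportunity = salesforce.create_opportunity(account_id=contact["AccountId"])

    # Query NetSuite in real time for the customer's existing account data
    customer = netsuite.search_customer(email=contact["Email"])

    # Pull the latest price book entries from an on-premise Oracle database
    prices = oracle_db.query(
        "SELECT item_id, unit_price FROM price_book WHERE active = 'Y'"
    )

    # Build out the opportunity line items and create the order in SAP
    line_items = [{"item": p["item_id"], "price": p["unit_price"]} for p in prices[:3]]
    order = sap.create_sales_order(customer_id=customer["id"], items=line_items)

    # Sync the confirmed order back to the Salesforce opportunity
    salesforce.update_opportunity(opportunity["Id"], order_number=order["number"])
    return order
```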
The capabilities available via Informatica Cloud’s application integration are a gigantic leap forward for business users and an evolutionary step toward pivoting the enterprise toward the customer. As 2015 takes hold we will see this become increasingly important as companies continue to invest in the cloud. This is especially true for those cloud applications, like the Salesforce Analytics, Marketing and Sales Clouds, that need immediate access to the latest and most reliable customer data to make them all work — and truly establish you as the CEO in charge of customer relationships.
Back in 2004, we saw the rapid growth of SaaS providers such as Salesforce.com. However, there was typically no consistent data integration strategy to go along with the use of SaaS. In many instances, SaaS-delivered applications became the new data silos in the enterprise, silos that lacked a sound integration plan and integration technology.
Ten years later, we’ve reached a point where we can solve business problems using SaaS, as well as the data integration problems that come with the use of SaaS. However, we typically lack the knowledge and understanding of how to effectively use data integration technology within the enterprise to integrate SaaS problem domains.
Lawson looks at both sides of the SaaS integration argument. “Surveys certainly show that integration is less of a concern for SaaS than in the early days, when nearly 88 percent of SaaS companies said integration concerns would slow down adoption and more than 88 percent said it’s an important or extremely important factor in winning new customers.”
Again, while we’ve certainly gotten better at integration, we’re nowhere near being out of the woods. “A Dimensional Research survey of 350 IT executives showed that 67 percent cited data integration problems as a challenge with SaaS business applications. And as with traditional systems, integration can add hidden costs to your project if you ignore it.”
As I’ve stated many times in this blog, integration requires a bit of planning and the use of solid technology. While this does require some extra effort and money, the return on the value of this work is huge.
SaaS integration requires that you take a bit of a different approach than traditional enterprise integration. SaaS systems typically place your data behind well-defined APIs that can be accessed directly or through a data integration technology. While the information can be consumed by anything that can invoke an API, enterprises still have to deal with structure and content differences, and that’s typically best handled using the right data integration technology.
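As a minimal illustration of that point, the sketch below pulls records from a hypothetical SaaS REST API and reconciles structural differences (field names, formats) before they are handed to a target system. The endpoint, token, and field names are invented for the example; this is not Informatica code.

```python
# A minimal sketch: read records through a SaaS provider's API, then resolve
# structure and content differences against a target schema. All names invented.
import requests
from datetime import datetime

SOURCE_URL = "https://api.example-saas.com/v1/accounts"   # hypothetical endpoint

def fetch_source_accounts(api_token):
    """Read records through the SaaS provider's well-defined API."""
    resp = requests.get(SOURCE_URL, headers={"Authorization": f"Bearer {api_token}"})
    resp.raise_for_status()
    return resp.json()["records"]

def to_target_schema(record):
    """Reconcile field names and formats with the target system's model."""
    return {
        "account_name": record["companyName"].strip(),
        "country_code": record["country"].upper()[:2],
        # Source uses ISO 8601 strings; the target expects date objects
        "created_on": datetime.fromisoformat(record["createdAt"]).date(),
    }

def load(api_token):
    return [to_target_schema(r) for r in fetch_source_accounts(api_token)]
```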
Other things to consider, which are again often overlooked, are data governance and data security around your SaaS integration solution. There should be a centralized control mechanism to support the proper management and security of the data, as well as a mechanism to deal with the data quality issues that often emerge when consuming data from any cloud computing service.
The reality is that SaaS is here to stay. Even enterprise software players that put off the move to SaaS-delivered systems are now standing up SaaS offerings. The economics around the use of SaaS are just too compelling. However, as SaaS-delivered systems become more commonplace, so will the emergence of new silos. This won’t be an issue if you leverage the right SaaS integration approach and technology. What will your approach be?
As we discussed at length in our #HappyHoliData series, no matter what the customer industry or use case, information quality is a key value component to deliver the right services or products to the right customer.
In my blog on 2015 omnichannel trends impacting customer experience I commented on product trust as a key expectation in the eyes of customers.
For product managers, merchandizers or category managers this means asking: Which products shall we offer, and at what price? How is the competition pricing this item? With which content is the competition promoting this SKU? Are my retailers and distributors sticking to my price policy? Companies need quicker insights for making decisions on their assortment, prices and compelling content, and for better customer-facing service.
Recently, we’ve been spending time discussing this challenge with the folks at Indix, an innovator in the product intelligence space, to find ways to help businesses improve their product information quality. For background, Indix is building the world’s largest database of product information and currently tracks over 600 million products, over 600,000 sellers, over 40,000 brands, and over 10,000 attributes across over 6,000 categories. (Source: Indix.com)
Indix takes all of that data, then cleanses and normalizes it and breaks it down into two types of product information — offers data and catalog data. The offers data includes all the dynamic information related to the sale of a product such as the number of stores at which it is sold, price history, promotions, channels, availability, and shipping. The catalog data comprises relatively unchanging product information, such as brand, images, descriptions, specifications, attributes, tags, and facets.
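For illustration only, here is one way those two kinds of product information might be modeled as data structures. The field names are inferred from the examples above and are hypothetical; they are not the Indix API or the Informatica PIM schema.

```python
# Illustrative sketch of offers data vs. catalog data; field names are invented.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class OfferData:
    """Dynamic, sale-related information that changes frequently."""
    sku: str
    store_count: int
    current_price: float
    price_history: List[float] = field(default_factory=list)
    promotions: List[str] = field(default_factory=list)
    channel: Optional[str] = None
    in_stock: bool = True
    shipping_terms: Optional[str] = None

@dataclass
class CatalogData:
    """Relatively unchanging descriptive information about the product."""
    sku: str
    brand: str
    description: str
    image_urls: List[str] = field(default_factory=list)
    specifications: dict = field(default_factory=dict)
    tags: List[str] = field(default_factory=list)
```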
We’ve been talking with the Indix team about how powerful it could be to integrate product intelligence directly into the Informatica PIM. Just imagine if Informatica customers could seamlessly bring in relevant offers and catalog content into the PIM through a direct connection to the Indix Product Intelligence Platform and begin using market and competitive data immediately.
What do you think?
We’re going to be at NRF and will be meeting with selected people to discuss this further. If you like the idea, or have feedback on the concept, let us know. We’d love to see you while we’re there and talk through the idea with you.
As more and more businesses become fully digitized, the instantiation of their business processes and business capabilities becomes based in software. And when businesses implement software, there are choices to be made that can impact whether these processes and capabilities become locked in time or establish themselves as a continuing basis for business differentiation.
Make sure you focus upon the business goals
I want to suggest that whether the software instantiations of business processes and business capabilities deliver business differentiation depends on whether business goals and analytics are successfully embedded in a software implementation from the start. I learned this firsthand several years ago, when I was involved in helping a large insurance company with its implementation of analytics software. Everyone on the management team was in favor of the analytics software purchase. However, the project lead wanted the analytics completed after an upgrade to their transaction processing software. Fortunately, the firm’s CIO had a very different perspective. This CIO understood that decisions regarding the transaction processing software implementation could determine whether critical metrics and KPIs could be measured. So instead of treating analytics as an afterthought, this CIO made them a forethought. In other words, he slowed down the transactional software implementation. He got his team to think first about the goals for the software implementation and the business goals for the enterprise. With these in hand, his team determined what metrics and KPIs were needed to measure success and improvement. They then required the transaction software development team to ensure that the software implemented the fields needed to measure those metrics and KPIs. In some cases, this was as simple as turning on a field or training users to enter a field as the transaction software went live.
Make the analytics part of everyday business decisions and business processes
The question is how common this perspective is, because it really matters. Tom Davenport says that “if you really want to put analytics to work in an enterprise, you need to make them an integral part of everyday business decisions and business processes—the methods by which work gets done” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). For many, this means turning their application development on its head, as our insurance CIO did. In particular, it means that IT implementation teams should no longer be about just slamming in applications. They need to be more deliberate. They need to start by identifying the business problems they want solved through the software instantiation of a business process. They also need to start with how they want the software to improve the process, rather than treating analytics and data as an afterthought.
Why does this matter so much? Davenport suggests that “embedding analytics into processes improves the ability of the organization to implement new insights. It eliminates gaps between insights, decisions, and actions” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). Tom gives the example of a car rental company that embedded analytics into its reservation system and, with the data provided, was able to expunge long-held shared beliefs. This change resulted in a 2% increase in fleet utilization and returned $19 million to the company from just one location.
Look beyond the immediate decision to the business capability
Davenport also suggests that enterprises need to look beyond their immediate task or decision and appreciate the whole business process, including what happens upstream and downstream. This argues for focusing analytics on the enterprise capability system. Clearly, maximizing the performance of the enterprise capability system requires an enterprise perspective on analytics. It should also be noted that a systems perspective allows business leadership to appreciate how different parts of the business work together as a whole. Analytics, therefore, allow the business to determine how to drive better business outcomes for the entire enterprise.
At the same time, focusing on the enterprise capability system will in many cases lead, over time, to a reengineering of overarching business processes and a revamping of their supporting information systems. This in turn allows the business to capitalize on the potential of business capability and analytics improvement. From my experience, most organizations need some time to see what a change in analytics performance means. This is why it can make sense to start by measuring baseline process performance before determining enhancements to the business process. Once that is complete, refinements to the enhanced process can be determined by continuously measuring process performance data.
Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”
Solution Brief: The Intelligent Data Platform
Author Twitter: @MylesSuer
It’s true. Data integration is a whole new game compared to five years ago, or, in some organizations, five minutes ago. The right approaches to data integration continue to evolve around a few principal forces: first, the growth of cloud computing, as pointed out by Stafford; second, the growing use of big data systems; and third, the emerging use of data as a strategic asset for the business.
These forces combine to drive us to the understanding that old approaches to data integration won’t provide the value that they once did. As someone who was a CTO of three different data integration companies, I’ve seen these patterns change over the time that I was building technology, and that change has accelerated in the last 7 years.
The core opportunities lie with enterprise architects and their ability to drive an understanding of the value of data integration, as well as to drive change within their organizations. After all, they, or the enterprise’s CTOs and CIOs (whoever makes decisions about technological approaches), are supposed to steer the organization in the technical directions that will provide the best support for the business. While most enterprise architects follow the latest hype, such as cloud computing and big data, many have missed the underlying data integration strategies and technologies that will support these changes.
“The integration challenges of cloud adoption alone give architects and developers a once in a lifetime opportunity to retool their skillsets for a long-term, successful career, according to both analysts. With the right skills, they’ll be valued leaders as businesses transition from traditional application architectures, deployment methodologies and sourcing arrangements.”
The problem is that, while most agree that data integration is important, they typically don’t understand what it is or the value it can bring. These days, many developers live in a world of instant updates. With emerging DevOps approaches and infrastructure, they really don’t grasp the need, or the mechanisms required, to share data between application and database silos. In many instances, they resort to hand-coding interfaces between source and target systems. This leads to brittle and unreliable integration solutions, and thus hurts rather than helps new cloud application and big data deployments.
The message is clear: Those charged with defining technology strategies within enterprises need to also focus on data integration approaches, methods, patterns, and technologies. Failing to do so means that the investments made in new and emerging technology, such as cloud computing and big data, will fail to provide the anticipated value. At the same time, enterprise architects need to be empowered to make such changes. Most enterprises are behind on this effort. Now it’s time to get to work.
2014 was the year Big Data went mainstream, with conversations shifting from “What is Big Data?” to “How do we harness the power of Big Data to solve real business problems?” It seemed like everyone jumped on the Big Data bandwagon, from new software start-ups offering the “next generation” of predictive analytic applications to traditional database, data quality, business intelligence, and data integration vendors, all calling themselves Big Data providers. The truth is, they all play a role in this Big Data movement.
Earlier in 2014, Wikibon estimated that the Big Data market is on pace to top $50 billion in 2017, which translates to a 38% compound annual growth rate over the six-year period from 2011 (the first year Wikibon sized the Big Data market) to 2017. Most of the excitement around Big Data has centered on Hadoop, as early adopters who experimented with open source versions quickly grew to adopt enterprise-class solutions from companies like Cloudera™, Hortonworks™, MapR™, and Amazon’s Redshift™ to address real-world business problems.
In the report, Gartner states: “Global-scale scandals around sensitive data losses have highlighted the need for effective data protection, especially from insider attacks. Data masking, which is focused on protecting data from insiders and outsiders, is a must-have technology in enterprises’ and governments’ security portfolios.”
Organizations realize that data protection must be hardened to protect against the inevitable breach, whether it originates from internal or external threats. Data masking covers gaps in data protection, in both production and non-production environments, that can be exploited by attackers.
Informatica customers are elevating the importance of data security initiatives in 2015, given the high exposure of recent breaches and the shift from simply stealing identities and intellectual property to politically motivated attacks. This raises the concern that existing security controls are insufficient and that a more data-centric security approach is necessary.
Recent enforcement by the Federal Trade Commission in the US and emerging legislation worldwide have clearly indicated that sensitive data access and sharing should be tightly controlled; this is the strength of data masking.
Data Masking de-identifies and/or de-sensitizes private and confidential data by hiding it from those who are unauthorized to access it. Other terms for data masking include data obfuscation, sanitization, scrambling, de-identification, and anonymization.
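As a generic illustration of the idea (not Informatica’s masking engine), the sketch below de-identifies two common sensitive fields: it pseudonymizes an email address and substitutes a random, format-preserving value for a Social Security number.

```python
# A simple, generic sketch of data masking for illustration only.
import hashlib
import random

def mask_email(email: str) -> str:
    """Obfuscate the local part while keeping a realistic-looking address."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{digest}@{domain}"

def mask_ssn(_: str) -> str:
    """Replace a real SSN with a random, format-preserving value."""
    return f"{random.randint(100, 999)}-{random.randint(10, 99)}-{random.randint(1000, 9999)}"

record = {"name": "Jane Doe", "email": "jane.doe@example.com", "ssn": "123-45-6789"}
masked = {**record, "email": mask_email(record["email"]), "ssn": mask_ssn(record["ssn"])}
```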
To learn more, download the Gartner Magic Quadrant Data Masking report now, and visit the Informatica website for data masking product information.
About the Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Happy Holidays, Happy HoliData
In case you missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices that are unleashing information potential. Simply scroll and click on the case study that is relevant to you and your business. The series touches on different industries and use cases, but all have one thing in common: each treats information quality as key to delivering the right services or products to the right customer.
Thanks a lot to all my great teammates, who made this series happen.
Happy Holidays, Happy HoliData.
I think this new capability, Salesforce Lightning Connect, is an innovative development and gives OData, an OASIS standard, a leg up on its W3C-defined competitor, Linked Data. OData is a REST-based protocol that provides access to data over the web. Its fundamental data model is relational, and the query language closely resembles a stripped-down SQL. This is much more familiar to most people than the RDF-based model used by Linked Data or its SPARQL query language.
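To show why OData feels familiar to anyone who knows SQL, here is a small sketch of an OData read request. The service URL is hypothetical; the $select, $filter and $top query options are standard OData.

```python
# Roughly equivalent to: SELECT Name, Price FROM Products WHERE Price < 20 LIMIT 10
import requests

response = requests.get(
    "https://services.example.com/odata/Products",   # hypothetical OData endpoint
    params={
        "$select": "Name,Price",
        "$filter": "Price lt 20",
        "$top": "10",
        "$format": "json",
    },
)
for product in response.json()["value"]:
    print(product["Name"], product["Price"])
```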
Standardization of OData has been going on for years (the committee is working on version 4), but it has suffered from a bit of a chicken-and-egg problem. Applications haven’t put a high priority on supporting the consumption of OData because there haven’t been enough OData providers, and data providers haven’t prioritized making their data available through OData because there haven’t been enough consumers. With Salesforce, a cloud leader, declaring that it will consume OData, the equation changes significantly.
But these things take time. What does a user of Salesforce (or any other OData consumer) do if most of the data sources they have cannot be accessed through an OData provider? It is the old last-mile problem faced by any communications or integration technology: it is fine to standardize, but how do you get all the existing endpoints to conform to the standard? You need someone to do the labor-intensive work of converting lots of endpoints to the standard representation.
Informatica has been in the last-mile business for years. As it happens, the canonical model we have always used is a relational model that lines up very well with the model used by OData. For us to host an OData provider for any of the data sources we already support, we only needed to do one conversion, from the internal format we’ve always used to the OData standard. This OData provider capability will be available soon.
But there is also the firewall issue. The consumer of the OData has to be able to access the OData provider. So, if you want Salesforce to be able to show data from your Oracle database, you would have to open up a hole in your firewall that provides access to your database. Not many people are interested in doing that – for good reason.
Informatica Cloud’s Vibe secure agent architecture is a solution to the firewall issue that will also work with the new OData provider. The OData provider will be hosted on Informatica’s Cloud servers, but will have access to any installed secure agents. Agents require a one-time install on-premise, but are thereafter managed from the cloud and are automatically kept up to date with the latest version by Informatica. An agent doesn’t require a port to be opened; instead, it opens an outbound connection to the Informatica Cloud servers through which all communication occurs. The agent then has access to any on-premise applications or data sources.
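For readers who want a mental model of that outbound-connection pattern, here is a conceptual sketch. The URLs, payloads and polling loop are invented for illustration and are not the actual Vibe secure agent protocol.

```python
# Conceptual sketch: the on-premise agent initiates every connection to the
# cloud service over HTTPS, so no inbound firewall port has to be opened.
import time
import requests

CLOUD_URL = "https://cloud.example-integration.com/agent"   # hypothetical

def run_agent(agent_id, execute_task):
    while True:
        # The agent polls outward; the cloud never connects inward.
        task = requests.get(f"{CLOUD_URL}/{agent_id}/next-task", timeout=30).json()
        if task:
            result = execute_task(task)   # e.g. query a local database or app
            requests.post(f"{CLOUD_URL}/{agent_id}/results", json=result)
        time.sleep(5)
```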
OData is especially well suited to reading external data. However, there are better ways to create or update external data. One problem is that Salesforce Lightning Connect only handles reads; but even if it did handle writes, it isn’t usually appropriate to add data to most applications by simply inserting records into tables. Usually a collection of related information must be provided for the update to make sense. To facilitate this, applications provide APIs that offer a higher level of abstraction for updates. Informatica Cloud Application Integration can be used today to read or write data to external applications from within Salesforce through the use of guides that can be displayed from any Salesforce screen. Guides make it easy to generate a friendly user interface that shows exactly the data you want your users to see and to guide them through the collection of new or updated data that needs to be written back to your app.
This is a continuation from Part 1 of the Blog which you can read here.
Now, if you are in IT, reading about how Informatica Rev enables the everyday business users in your company to participate in the Data Democracy might feel like treachery. You are likely thinking that Informatica is letting the bull loose in your fine china shop. You likely feel, first, that Informatica is supporting the systemic bypass of all the data governance that IT has worked hard to put in place, and second, that Informatica is alienating the very IT people who have approved of and invested in Informatica for decades.
While I can understand this thought process, I am here to proudly inform you that these thoughts could not be further from the truth! In fact, in the not-too-distant future, Informatica will be in a very strong position to create a unique technology solution that ensures you can better govern all the data in your enterprise, and do it in a way that allows you to proactively deliver the right data to the business, yes, before the masses of everyday business users have started knocking your door down to even ask for it. Informatica’s unique solution will ensure that the IT and business divide that has existed in your company for decades actually becomes a match made in heaven. And you in IT get the credit for leading this transformation of your company to a Data Democracy. Listen to this webinar to hear Justin Glatz, Executive Director of Information Technology at Conde Nast, speak about how he will be leading Conde Nast’s transformation to Data Democracy.
“How?” you might ask. Well, first let’s face it: today you do not have any visibility into how the business is procuring and using most data, and therefore you are not governing most of it. Without a change in your tooling, your ability to gain this visibility is diminishing greatly, especially since the business does not have to come to you to procure and use its cloud-based applications. By having all of your everyday business users use Informatica Rev, you will, for the first time, have the potential to gain a truly complete picture of how data is being used in your company, even the data they do not come to you to procure.
In the not-too-distant future, you will gain this visibility through an IT companion application to Informatica Rev. You will then gain the ability to easily operationalize your business users’ exact transformation logic, or Recipe as we call it in Informatica Rev, into your existing repositories, be they your enterprise data warehouse, data mart or master data management repository, for example. And, by the way, you are likely already using Informatica PowerCenter, Informatica Cloud or Informatica MDM to manage these repositories, so you already have the infrastructure we will be integrating Informatica Rev with. And if you are not using Informatica to manage these repositories, the draw of becoming proactive with your business and leading the transformation of your company to a Data Democracy will be enough to make you want to go get Informatica.
Just as these professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can, too. You can try Informatica Rev for free by clicking here.