Category Archives: B2B
Back in 2004, we saw the rapid growth of SaaS providers such as Salesforce.com. However, there was typically no consistent data integration strategy to go along with the use of SaaS. In many instances, SaaS-delivered applications became the new data silos in the enterprise, silos that lacked a sound integration plan and integration technology.
Ten years later, we've reached a point where we can solve business problems using SaaS, and also solve the data integration problems that come with it. However, we typically lack the knowledge and understanding of how to use data integration technology effectively within an enterprise to integrate SaaS-based systems.
Lawson looks at both sides of the SaaS integration argument. “Surveys certainly show that integration is less of a concern for SaaS than in the early days, when nearly 88 percent of SaaS companies said integration concerns would slow down adoption and more than 88 percent said it’s an important or extremely important factor in winning new customers.”
Again, while we’ve certainly gotten better at integration, we’re nowhere near being out of the woods. “A Dimensional Research survey of 350 IT executives showed that 67 percent cited data integration problems as a challenge with SaaS business applications. And as with traditional systems, integration can add hidden costs to your project if you ignore it.”
As I’ve stated many times in this blog, integration requires a bit of planning and the use of solid technology. While this does require some extra effort and money, the return on the value of this work is huge.
SaaS integration requires that you take a bit of a different approach than traditional enterprise integration. SaaS systems typically place your data behind well-defined APIs that can be accessed directly or through a data integration technology. While the information can be consumed by anything that can invoke an API, enterprises still have to deal with structure and content differences, and that’s typically best handled using the right data integration technology.
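To make those structure and content differences concrete, here is a minimal, purely illustrative sketch. All field names and payload shapes below are hypothetical, not taken from any real SaaS API; the point is only that a small mapping layer can normalize differently shaped records into one canonical structure.

```python
# Illustrative only: two hypothetical SaaS APIs return the same customer
# entity with different field names and formats. A small mapping layer
# normalizes both into one canonical record.

def from_crm(record):
    # Hypothetical CRM payload: {"FirstName": ..., "LastName": ..., "Phone": ...}
    return {
        "name": f"{record['FirstName']} {record['LastName']}",
        "phone": record["Phone"].replace("-", ""),
    }

def from_billing(record):
    # Hypothetical billing payload: {"customer_name": ..., "contact_phone": ...}
    return {
        "name": record["customer_name"],
        "phone": record["contact_phone"].replace("-", ""),
    }

crm_rec = {"FirstName": "Ada", "LastName": "Lovelace", "Phone": "555-0100"}
billing_rec = {"customer_name": "Ada Lovelace", "contact_phone": "5550100"}

# Both sources now yield the same canonical record.
assert from_crm(crm_rec) == from_billing(billing_rec)
```

A real data integration tool does exactly this kind of mapping, plus type conversion, validation, and error handling, at scale.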
Other things to consider, things that are again often overlooked, is the need for both data governance and data security around your SaaS integration solution. There should be a centralized control mechanism to support the proper management and security of the data, as well as a mechanism to deal with data quality issues that often emerge when consuming data from any cloud computing services.
The reality is that SaaS is here to stay. Even enterprise software players that put off the move to SaaS-delivered systems are now standing up SaaS offerings. The economics around the use of SaaS are just way too compelling. However, as SaaS-delivered systems become more commonplace, so will the emergence of new silos. This will not be an issue if you leverage the right SaaS integration approach and technology. What will your approach be?
As we discussed at length in our #HappyHoliData series, no matter what the customer industry or use case, information quality is a key value component to deliver the right services or products to the right customer.
In my blog on 2015 omnichannel trends impacting customer experience, I commented on product trust as a key expectation in the eyes of customers.
For product managers, merchandizers or category managers this means: Which products shall we offer at which price? How is the competition pricing this item? With which content is the competition promoting this SKU? Are my retailers and distributors sticking to my price policy? Companies need quicker insights for making decisions about their assortment, prices and compelling content, and for better customer-facing service.
Recently, we’ve been spending time discussing this challenge with the folks at Indix, an innovator in the product intelligence space, to find ways to help businesses improve their product information quality. For background, Indix is building the world’s largest database of product information and currently tracks over 600 million products, over 600,000 sellers, over 40,000 brands, and over 10,000 attributes across over 6,000 categories. (source: Indix.com)
Indix takes all of that data, then cleanses and normalizes it and breaks it down into two types of product information — offers data and catalog data. The offers data includes all the dynamic information related to the sale of a product such as the number of stores at which it is sold, price history, promotions, channels, availability, and shipping. The catalog data comprises relatively unchanging product information, such as brand, images, descriptions, specifications, attributes, tags, and facets.
We’ve been talking with the Indix team about how powerful it could be to integrate product intelligence directly into the Informatica PIM. Just imagine if Informatica customers could seamlessly bring in relevant offers and catalog content into the PIM through a direct connection to the Indix Product Intelligence Platform and begin using market and competitive data immediately.
What do you think?
We’re going to be at NRF to meet with selected people and discuss further. If you like the idea, or have feedback on the concept, let us know. We’d love to see you while we’re there and talk about this idea with you.
As more and more businesses become fully digitized, the instantiation of their business processes and business capabilities becomes based in software. And when businesses implement software, there are choices to be made that can impact whether these processes and capabilities become locked in time or establish themselves as a continuing basis for business differentiation.
Make sure you focus upon the business goals
I want to suggest that whether the software instantiations of business processes and business capabilities deliver business differentiation depends upon whether business goals and analytics are successfully embedded in a software implementation from the start. I learned this firsthand several years ago, when I was involved in helping a significant insurance company with their implementation of analytics software. Everyone on the management team was in favor of the analytics software purchase. However, the project lead wanted the analytics completed after an upgrade had occurred to their transaction processing software.

Fortunately, the firm’s CIO had a very different perspective. This CIO understood that decisions regarding the transaction processing software implementation could determine whether critical metrics and KPIs could be measured. So instead of doing analytics as an afterthought, this CIO had the analytics done as a forethought. In other words, he slowed down the transactional software implementation. He got his team to think first about the goals for the software implementation and the business goals for the enterprise. With these in hand, his team determined what metrics and KPIs were needed to measure success and improvement. They then required the transaction software development team to ensure that the software implemented the fields needed to measure the metrics and KPIs. In some cases, this was as simple as turning on a field, or training users to enter a field, as the transaction software went live.
Make the analytics part of everyday business decisions and business processes
The question is how common this perspective is, because it really matters. Tom Davenport says that “if you really want to put analytics to work in an enterprise, you need to make them an integral part of everyday business decisions and business processes—the methods by which work gets done” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). For many, this means turning their application development on its head, like our insurance CIO did. It means, in particular, that IT implementation teams should no longer just be slamming in applications. They need to be more deliberate. They need to start by identifying the business problems they want solved through the software instantiation of a business process. They need, as well, to start with how they want the software to improve the process, rather than treating analytics and data as an afterthought.
Why does this matter so much? Davenport suggests that “embedding analytics into processes improves the ability of the organization to implement new insights. It eliminates gaps between insights, decisions, and actions” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). Tom gives the example of a car rental company that embedded analytics into its reservation system and, with the data provided, was able to expunge long-held shared beliefs. This change resulted in a 2% increase in fleet utilization and returned $19m to the company from just one location.
Look beyond the immediate decision to the business capability
Davenport also suggests that enterprises need to look beyond their immediate task or decision and appreciate the whole business process, including what happens upstream or downstream. This argues that analytics be focused on the enterprise capability system. Clearly, maximizing performance of the enterprise capability system requires an enterprise perspective on analytics. It should also be noted that a systems perspective allows business leadership to appreciate how different parts of the business work together as a whole. Analytics, therefore, allow the business to determine how to drive better business outcomes for the entire enterprise.
At the same time, focusing on the enterprise capability system will in many cases lead over time to a reengineering of overarching business processes and a revamping of their supporting information systems. This in turn allows the business to capitalize on the potential of business capability and analytics improvement. From my experience, most organizations need some time to see what a change in analytics performance means. This is why it can make sense to start by measuring baseline process performance before determining enhancements to the business process. Once completed, however, refinement of the enhanced process can be driven by continuously measuring process performance data.
Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”
Solution Brief: The Intelligent Data Platform
Author Twitter: @MylesSuer
It’s true. Data integration is a whole new game, compared to five years ago, or, in some organizations, five minutes ago. The right approaches to data integration continue to evolve around a few principal forces: first, the growth of cloud computing, as pointed out by Stafford; second, the growing use of big data systems; and third, the emerging use of data as a strategic asset for the business.
These forces combine to drive us to the understanding that old approaches to data integration won’t provide the value that they once did. As someone who was a CTO of three different data integration companies, I’ve seen these patterns change over the time that I was building technology, and that change has accelerated in the last 7 years.
The core opportunities lie with enterprise architects and their ability to drive an understanding of the value of data integration, as well as drive change within their organizations. After all, they, or the enterprise’s CTOs and CIOs (whoever makes decisions about technological approaches), are supposed to steer the organization in the right technical directions to provide the best support for the business. While most enterprise architects follow the latest hype, such as cloud computing and big data, many have missed the underlying data integration strategies and technologies that will support these changes.
“The integration challenges of cloud adoption alone give architects and developers a once in a lifetime opportunity to retool their skillsets for a long-term, successful career, according to both analysts. With the right skills, they’ll be valued leaders as businesses transition from traditional application architectures, deployment methodologies and sourcing arrangements.”
The problem is that, while most agree that data integration is important, they typically don’t understand what it is or the value it can bring. These days, many developers live in a world of instant updates. With emerging DevOps approaches and infrastructure, they really don’t understand the need for, or the mechanisms required for, sharing data between application and database silos. In many instances, they resort to hand-coding interfaces between source and target systems. This leads to brittle and unreliable integration solutions, and thus hurts rather than helps new cloud application and big data deployments.
The message is clear: Those charged with defining technology strategies within enterprises need to also focus on data integration approaches, methods, patterns, and technologies. Failing to do so means that the investments made in new and emerging technology, such as cloud computing and big data, will fail to provide the anticipated value. At the same time, enterprise architects need to be empowered to make such changes. Most enterprises are behind on this effort. Now it’s time to get to work.
2014 was the year that Big Data went mainstream, from conversations asking “What is Big Data?” to “How do we harness the power of Big Data to solve real business problems?” It seemed like everyone jumped on the Big Data bandwagon, from new software start-ups offering the “next generation” of predictive analytic applications to traditional database, data quality, business intelligence, and data integration vendors, all calling themselves Big Data providers. The truth is, they all play a role in this Big Data movement.
Earlier in 2014, Wikibon estimated that the Big Data market is on pace to top $50 billion in 2017, which translates to a 38% compound annual growth rate over the six-year period from 2011 (the first year Wikibon sized the Big Data market) to 2017. Most of the excitement around Big Data has centered on Hadoop, as early adopters who experimented with open source versions quickly grew to adopt enterprise-class solutions from companies like Cloudera™, Hortonworks™, MapR™, and Amazon’s Redshift™ to address real-world business problems.
In the report, Gartner states: “Global-scale scandals around sensitive data losses have highlighted the need for effective data protection, especially from insider attacks. Data masking, which is focused on protecting data from insiders and outsiders, is a must-have technology in enterprises’ and governments’ security portfolios.”
Organizations realize that data protection must be hardened to protect against the inevitable breach, whether it originates from internal or external threats. Data masking covers gaps in data protection, in both production and non-production environments, that can be exploited by attackers.
Informatica customers are elevating the importance of data security initiatives in 2015 given the high exposure of recent breaches and the shift from just stealing identities and intellectual property, to politically charged platforms. This raises the concern that existing security controls are insufficient and a more data-centric security approach is necessary.
Recent enforcement by the Federal Trade Commission in the US and emerging legislation worldwide has clearly indicated that sensitive data access and sharing should be tightly controlled; this is the strength of data masking.
Data Masking de-identifies and/or de-sensitizes private and confidential data by hiding it from those who are unauthorized to access it. Other terms for data masking include data obfuscation, sanitization, scrambling, de-identification, and anonymization.
To learn more, Download the Gartner Magic Quadrant Data Masking Report now. And visit the Informatica website for data masking product information.
About the Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Happy Holidays, Happy HoliData
In case you have missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices which are unleashing information potential. Simply scroll and click on the case study which is relevant for you and your business. The series touches on different industries and use cases. But all have one thing in common: All consider information quality as key value to their business to deliver the right services or products to the right customer.
Thanks a lot to all my great teammates, who made this series happen.
Happy Holidays, Happy HoliData.
I think this new capability, Salesforce Lightning Connect, is an innovative development and gives OData, an OASIS standard, a leg-up on its W3C-defined competitor Linked Data. OData is a REST-based protocol that provides access to data over the web. The fundamental data model is relational, and the query language closely resembles what is possible with stripped-down SQL. This is much more familiar to most people than the RDF-based model used by Linked Data or its SPARQL query language.
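As a purely illustrative sketch (the service root URL below is hypothetical), an OData read query is just a URL with system query options, and it maps closely onto stripped-down SQL:

```python
# Build a hypothetical OData read query. $select, $filter, and $top are
# standard OData system query options; the service root is made up.
from urllib.parse import urlencode

base = "https://example.com/odata/Products"  # hypothetical service root
params = {
    "$select": "Name,Price",     # like SQL: SELECT Name, Price
    "$filter": "Price gt 100",   # like SQL: WHERE Price > 100
    "$top": "10",                # like SQL: LIMIT 10
}
url = base + "?" + urlencode(params)
print(url)
# The rough SQL equivalent:
#   SELECT Name, Price FROM Products WHERE Price > 100 LIMIT 10
```

Any HTTP client can issue this GET request against a real OData provider; the response is a plain JSON (or Atom) representation of the matching rows.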
Standardization of OData has been going on for years (version 4 is in the works), but it has suffered from a bit of a chicken-and-egg problem. Applications haven’t put a high priority on supporting the consumption of OData because there haven’t been enough OData providers, and data providers haven’t prioritized making their data available through OData because there haven’t been enough consumers. With Salesforce, a cloud leader, declaring that it will consume OData, the equation changes significantly.
But these things take time – what does a user of Salesforce (or any other OData consumer) do if most of the data sources they have cannot be accessed through an OData provider? It is the old last-mile problem faced by any communications or integration technology. It is fine to standardize, but how do you get all the existing endpoints to conform to the standard? You need someone to do the labor-intensive work of converting lots of endpoints to the standard representation.
Informatica has been in the last-mile business for years. As it happens, the canonical model we have always used is a relational model that lines up very well with the model used by OData. For us to host an OData provider for any of the data sources we already support, we only needed to do one conversion, from the internal format we’ve always used to the OData standard. This OData provider capability will be available soon.
But there is also the firewall issue. The consumer of the OData has to be able to access the OData provider. So, if you want Salesforce to be able to show data from your Oracle database, you would have to open up a hole in your firewall that provides access to your database. Not many people are interested in doing that – for good reason.
Informatica Cloud’s Vibe secure agent architecture is a solution to the firewall issue that will also work with the new OData provider. The OData provider will be hosted on Informatica’s Cloud servers, but will have access to any installed secure agents. Agents require a one-time install on-premise, but are thereafter managed from the cloud and are automatically kept up-to-date with the latest version by Informatica. An agent doesn’t require a port to be opened; instead, it opens an outbound connection to the Informatica Cloud servers through which all communication occurs. The agent then has access to any on-premise applications or data sources.
OData is especially well suited to reading external data. However, there are better ways to create or update external data. One limitation is that Salesforce Lightning Connect only handles reads; and even where OData does handle writes, it isn’t usually appropriate to add data to most applications by just inserting records into tables. Usually a collection of related information must be provided for the update to make sense. To facilitate this, applications provide APIs that offer a higher level of abstraction for updates. Informatica Cloud Application Integration can be used now to read or write data to external applications from within Salesforce through the use of guides that can be displayed from any Salesforce screen. Guides make it easy to generate a friendly user interface that shows exactly the data you want your users to see and guides them through the collection of new or updated data that needs to be written back to your app.
This is a continuation from Part 1 of the Blog which you can read here.
Now, if you are in IT, reading about how Informatica Rev enables the everyday business users in your company to participate in the Data Democracy might feel like treachery. You are likely thinking that Informatica is letting the bull loose in your own fine china shop. You likely feel, first, that Informatica is supporting the systemic bypass of all the data governance that IT has worked hard to put in place, and second, that Informatica is supporting the alienation of the very IT people who have approved of and invested in Informatica for decades.
While I can understand this thought process, I am here to proudly inform you that these thoughts could not be further from the truth! In fact, in the not-too-distant future, Informatica will be in a very strong position to offer a unique technology solution that ensures you can better govern all the data in your enterprise, and do it in a way that allows you to proactively deliver the right data to the business, yes, before the masses of everyday business users have started to knock your door down to even ask for it. Informatica’s unique solution will ensure that the IT and business divide that has existed in your company for decades actually becomes a match made in heaven. And you in IT get the credit for leading this transformation of your company to a Data Democracy. Listen to this webinar to hear Justin Glatz, Executive Director of Information Technology at Conde Nast, speak about how he will be leading Conde Nast’s transformation to Data Democracy.
“How?” you might ask. Well, first, let’s face it: today you do not have any visibility into how the business is procuring and using most data, and therefore you are not governing most of it. Without a change in your tooling, your ability to gain this visibility is diminishing greatly, especially since the business does not have to come to you to procure and use its cloud-based applications. By having all of your everyday business users use Informatica Rev, you will, for the first time, have the potential to gain a truly complete picture of how data is being used in your company, even the data they do not come to you to procure.
In the not-too-distant future, you will gain this visibility through an IT companion application to Informatica Rev. You will then gain the ability to easily operationalize your business users’ exact transformation logic, or Recipe as we call it in Informatica Rev, into your existing repositories, be they your enterprise data warehouse, data mart or master data management repository, for example. And by the way, you are likely already using Informatica PowerCenter, Informatica Cloud or Informatica MDM to manage these repositories, so you already have the infrastructure we will be integrating Informatica Rev with. And if you are not using Informatica to manage these repositories, the draw of becoming proactive with your business and leading the transformation of your company to a Data Democracy will be enough to make you want to go get Informatica.
Just as these Professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can do so, too. You can try Informatica Rev for free by clicking here.
Informatica Cloud Data Preparation has launched! Those are the words we aspired to hear, the words that served as our rallying cry, when all we had was an idea coupled with a ton of talent, passion and drive. Well, today, we launch Informatica Rev, as business users refer to it. (Check out the press release here.)
As we launch today, we have over 3,500 individual users across over 800 logos. These users are everyday business users who just want to improve the speed and quality of their business decisions. By doing so, they help their corporations find success in the marketplace, and in turn they also find success in their own careers. You can hear more from customers talking about their experience using Informatica Rev during our December 16 Future of Work webinar.
These users are people who, previously, were locked out of the exclusive Data Club because they did not have the time to be Excel jocks or know how to code. But now, these are people who have found success by turning their backs on this Club and aggressively participating in the Data Democracy.
And they are able to finally participate in the “Data Democracy” because of Informatica Rev. You can try Informatica Rev for free by clicking here.
These people play every conceivable role in the information economy. They are marketing managers, marketing operations leads, tradeshow managers, sales people, sales operations leads, accounting analysts, recruiting leads, and benefits managers, to mention a few. They work for everything from large companies to small and mid-size companies, and even sole proprietorships. They are even IT leads who might have more technical knowledge than their business counterparts, but who are increasingly getting barraged by requests from their business-side counterparts and are just looking to be more productive with these requests. Let’s take a peek into how Informatica Rev allows them to participate in the Data Democracy, and changes their lives for the better.
Before Informatica Rev, a marketing analyst was simply unable to respond to rapid changes in competitor prices, because by the time the competitor pricing data was assembled by the people or tools they relied on, the competitor prices had changed. This led to lost revenue opportunities for the company. I almost don’t need to state that this end result is not an insignificant repercussion of the inability to respond at the rapid pace of business.
Let’s explore what a marketing analyst does today. When a file with competitor prices arrives, the initial questions the analyst asks are “Which of my SKUs is each competitive price for?” and “Do the prices vary by geography?” To answer these questions, they use Excel VLOOKUPs and some complex macros. By the time the Excel work is done, if they even know what a VLOOKUP is, the competitor data is old. At some point, there is no reason to continue the analysis, and the analyst just accepts the inability to capture this revenue.
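The lookup at the heart of that Excel work is just a keyed join. Here is a purely illustrative sketch with made-up SKUs and prices, where a missing competitor entry plays the role of VLOOKUP’s #N/A:

```python
# Illustrative only: joining competitor prices (keyed by SKU) to our own
# price list, mirroring =VLOOKUP(sku, competitor_range, 2, FALSE) in Excel.

our_prices = [
    {"sku": "A100", "our_price": 19.99},
    {"sku": "B200", "our_price": 34.50},
    {"sku": "C300", "our_price": 12.00},
]
competitor_prices = {"A100": 18.49, "B200": 36.00}  # no entry for C300

joined = [
    # .get() returns None when the SKU is missing, like #N/A in Excel
    {**row, "competitor_price": competitor_prices.get(row["sku"])}
    for row in our_prices
]

for row in joined:
    print(row["sku"], row["our_price"], row["competitor_price"])
```

The point of a tool like Rev is that this combine happens through a guided interface rather than formulas, but the underlying operation is the same.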
With Informatica Rev, a marketing analyst can use Intelligent Guidance to understand the competitor data file and determine its completeness, and then with Smart Combine easily combine the competitor data with their own. This requires no code and no formal training, and takes only a few minutes, all by themselves. And with Tableau as their BI tool, they can then use the Export to TDE capability to seamlessly export to Tableau and analyze trends in price changes to decide on their strategy. Voila!
Before Informatica Rev, a tradeshow manager used to spend an inordinate amount of time trying to validate leads so that they could then load them into a marketing automation system. After a tradeshow, time is of the essence: leads need to be processed rapidly or they will decay, and fewer opportunities will result for the company. Again, I almost don’t need to state that this end result is not an insignificant repercussion of the inability to respond at the rapid pace of business. But the tradeshow manager finds themselves using Excel VLOOKUPs and other creative but time-consuming ways to validate the lead information. They simply want to know, “Which leads have missing titles or phone numbers?”, “What is the correct phone number?”, “How many are new leads?” and “How many are in accounts closing this quarter?”
All of these are questions that can be answered, but they used to take a lot of time in Excel, and even after all that Excel work, the final lead list was still error prone, causing missed sales opportunities. With Informatica Rev, a tradeshow manager can answer these questions rapidly, with no code, no formal training, and in a few minutes, all by themselves. With the Intelligent Guidance capability they can easily surface where the missing data lies. With Fast Combine they can access their opportunity information in Salesforce and be guided through the process of combining tradeshow and Salesforce data to correctly replace the missing data. Again, voila!
Before Informatica Rev, an accounting analyst spent inordinate amounts of time every month processing trade partner data, reconciling it with the trade partner’s receivables to determine whether they had been paid the correct amount by their trade partner. Not only was this process time consuming, it was error prone, and after all of the effort, they actually left millions in earned revenue unreceived. And again, I almost don’t need to state that this end result is not an insignificant repercussion of the inability to respond at the rapid pace of business, or of failing to effectively manage operational costs within the analyst’s company. So, let’s take a look at what the accounting analyst does today. Every trade partner sends large files with different structures of purchase data in them. The accounting analyst initially asks, “What data is in them?”, “For what time period?”, “How many transactions?”, “From which products?”, “Which of our actual products does their name for our product tie to?”
Then, after they get these answers, they need to combine the data with the payments data they received from the trade partner in order to answer the question, “Have we been paid the right amount, and if not, what is the difference?” All of these questions can be answered, but they used to take a lot of time with Excel VLOOKUPs and complex macros. And often, the reconciliation was performed incorrectly, leaving receivables, well, un-received. With Informatica Rev, an accounting analyst can benefit from Intelligent Guidance, where they are led through the process of rapidly answering their questions about the trade partner files with a few simple clicks. Furthermore, Informatica Rev’s Smart Combine capability suggests how to combine receivables data with trade partner data. So there you have it: now they know whether the correct amount has been paid. And the best part is that they were able to answer these questions rapidly, with no code, no formal training, and in a few minutes, all by themselves. Now, this process has to be done every month. Using Recipes, every step the accounting analyst took last month is recorded, so they do not have to repeat it this month. Just re-import the new trade partner data and you are reconciled. And again, voila!
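At its core, the monthly reconciliation boils down to a grouped sum and a comparison. Here is a minimal, purely illustrative sketch with made-up products and figures:

```python
# Illustrative only: sum partner-reported payments per product and compare
# them to our own receivables. A nonzero difference is an un-received amount.
from collections import defaultdict

receivables = {"widget": 1200.00, "gadget": 850.00}  # what we are owed

partner_payments = [                                  # what the partner reports
    {"product": "widget", "amount": 700.00},
    {"product": "widget", "amount": 500.00},
    {"product": "gadget", "amount": 800.00},
]

paid = defaultdict(float)
for payment in partner_payments:
    paid[payment["product"]] += payment["amount"]

differences = {prod: owed - paid[prod] for prod, owed in receivables.items()}
print(differences)  # widget is fully paid; gadget is short
```

A Recipe in effect records this sequence of steps so it can be replayed against next month’s files instead of being rebuilt by hand.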
One more thing for you, the everyday business user. In the future, you will be able to send this Recipe to IT. This capability will allow you to communicate your exact data requirement to IT, just as you created it, with no misinterpretation on anyone’s behalf. IT can then rapidly institutionalize your logic, exactly as you defined it, into the enterprise data warehouse, data mart or some other repository of your or your IT department’s liking. Perhaps this means the end of those requirements-gathering sessions?
More importantly, I feel this means that you just got your exact requirement added into a central repository in a matter of minutes. And you did not need to make a case to be part of an enterprise project either. This capability is a necessary part for you to participate in the Data Democracy and maintain your rapid pace of business. This is a piece that Informatica is uniquely positioned to solve for you as your IT department likely already has Informatica.
Just as these Professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can do so, too.
Please look for Part 2 of this Blog, tomorrow, where I will discuss how Informatica Rev elegantly bridges the IT and Business divide, empowering IT to lead the charge into Data Democracy. But in the meantime check out Informatica Rev for yourself and let me know what you think.