Tag Archives: Integration
Now in its third year (following 2012 and 2013 editions), The State of Salesforce Annual Review continues to be the most comprehensive report on the Salesforce ecosystem. Based on data from over 1,000 global Salesforce users, the report highlights how companies are using the Salesforce platform, where resources are being allocated, and where industry hype meets reality. Over the past three years, the report has evolved much like the technology itself, shifting and transforming to address recent advancements as well as tracking longitudinal trends in the space.
We’ve found that key integration partners like Informatica Cloud continue to grow in importance within the Salesforce ecosystem. Beyond the core platform offerings from Salesforce, third-party apps and integration technologies have received considerable attention as companies look to extend the value of their initial investments and unite systems. Syncing multiple platforms and applications is an emerging need in the Salesforce ecosystem, one that will be highlighted in the 2014 report.
As Salesforce usage expands, so does our approach to survey execution. In line with this evolution, here’s what we’ve learned over the last three years from data collection:
Functions, Departments Make a Difference
Sales, Marketing, IT, and Service all have their own needs and pain points. As Salesforce moves quickly across the enterprise, we want to recognize the values, priorities, and investments of each department. Not only are the primary clouds for each function at different stages of maturity, but the ways in which each department uses its cloud are unique. We expect this year’s survey to reveal how enterprises are collaborating across functions and clouds.
Focus on Region
As our international data set continues to grow, we are investing in regionalized reports for the US, UK, France, and Australia. While we saw indications of differences between regions in last year’s survey, they were not statistically significant.
Customer Engagement is a Top Priority
Everyone agrees that customer engagement is important, but what are companies actually doing about it? This year’s survey includes a section on predictive analytics and department-specific questions about engagement. We suspect the results will validate the recent trend of companies empowering employees with a combination of data and mobile.
Variation Across Industries
As an added bonus, we will build a report targeting specific insights from the Financial Services industry.
We Need Your Help
Our dataset depends on input from Salesforce users spanning all functions, roles, industries, and regions. Every response matters. Please take 15 minutes to share your Salesforce experiences, and you will receive a personalized report, comparing your responses to the aggregate survey results.
Eighteen months ago, I was sitting in a conference room that was unremarkable except for its great view down 6th Avenue toward the Empire State Building. The pre-sales consultant across from me had just given a visually appealing demonstration to the CIO of a multinational insurance corporation: fancy graphics and colorful charts, sharply displayed on an iPad and refreshing every few seconds. The CIO asked how long the presentation had taken to put together. The consultant excitedly shared that it took only four to five hours, to which the CIO responded, “Well, if that took you less than five hours, we should be able to get a production version in about two to three weeks, right?”
The facts of the matter, however, were completely different. The demo, while running with the firm’s own data, had been driven from a spreadsheet housed on the consultant’s laptop, one procured after several weeks of scrubbing, formatting, and aggregating data from the CIO’s team, to say nothing of the data procurement process that preceded it. And so, as the expert in the room, the voice of reason, the CIO turned to me wanting to know how long it would take to implement the solution. At least six months, was my assessment. I had seen their data, and it was a mess. I had seen the data flow (not a model architecture), and the sheer volume of data was daunting. If it was not architected correctly, the pretty colors and graphs would take much longer to refresh. This was not the answer he wanted to hear.
The advancement of social media, new web experiences, and cutting-edge mobile technology has driven users to expect more of their applications. As enterprises push to drive value and unlock more potential in their data, insurers of all sizes have attempted to implement analytical and business intelligence systems. But here’s the truth: by and large, most insurance enterprises are not in a place with their data to make effective use of the new technologies in BI, mobile, or social. The reality is that data cleansing, fitting for purpose, movement, and aggregation are being done in the BI layer, when they should be done lower down in the stack so that all applications can take advantage of them.
Let’s face it – quality data is important. Movement and shaping of data in the enterprise is important. Identification of master data and metadata is important, and data governance is important. It brings to mind episode 165 of the mega-hit show Seinfeld, “The Apology,” in which George Costanza accuses erstwhile friend Jason Hanky of being a “step skipper.” What I have seen in enterprise data is the same step skipping: users clamor for new and better experiences, but the underlying infrastructure and data are less than ready for consumption. So the enterprise bootstraps, duct-tapes, and otherwise creates customizations where they don’t architecturally belong.
Clearly this calls for a better solution: a more robust and architecturally sustainable data ecosystem, one that shepherds data from acquisition through to consumption and all points in between. It must also be attainable by even modestly sized insurance firms.
First, you need to bring the data under your control. That may mean external data integration, or simply moving it from transactional, web, or client-server systems into warehouses, marts, or other large data storage schemes and back again. But remember, the data is in various stages of readiness. This means the data needs to be processed, enhanced, and stored, through out-of-the-box or custom cleansing steps, in a way that aligns with corporate goals for governing the quality of that data. And that says nothing of the need to change data normalization between source and target. When implemented as a “factory,” the ability to bring new data streams online, integrate them quickly, and maintain high standards becomes a matter of small incremental changes rather than a ground-up monumental task. Move your data shaping, cleansing, standardization, and aggregation further down in the stack, and many applications will benefit from the architecture.
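To make the “factory” idea concrete, here is a minimal sketch (hypothetical function and field names, not any particular product’s API) of cleansing implemented as small, reusable steps that can be composed per data stream:

```python
# Sketch of a "factory" style cleansing pipeline: each step is a small
# reusable function, so bringing a new data stream online means composing
# existing steps rather than rebuilding from the ground up.

def trim_whitespace(record):
    # Strip stray whitespace from every string field.
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def normalize_state(record):
    # Standardize a (hypothetical) state field to an upper-case code.
    if isinstance(record.get("state"), str):
        record["state"] = record["state"].upper()[:2]
    return record

def run_pipeline(records, steps):
    # Apply each cleansing step to each record, in order.
    for record in records:
        for step in steps:
            record = step(record)
        yield record

# Adding a new stream is an incremental change: reuse the same steps,
# in whatever order that stream requires.
cleaned = list(run_pipeline(
    [{"name": "  Jane Doe ", "state": "california"}],
    [trim_whitespace, normalize_state],
))
print(cleaned)  # [{'name': 'Jane Doe', 'state': 'CA'}]
```

Because every downstream application consumes the pipeline’s output rather than cleansing for itself, the shaping and standardization happen once, lower in the stack, exactly as described above.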
Critical to this process is that insurance enterprises need to ensure the data remains secure, private and is managed in accordance with rules and regulations. They must also govern the archival, retention and other portions of the data lifecycle.
At any point in the life of your information, you are likely sending or receiving data from an agent, broker, MGA, or service provider, and that data needs to be processed using the robust ecosystem described above. Once an effective data exchange infrastructure is implemented, the steps to process the data can nicely complement your setup as information flows to and from your trading partners.
Finally, as your enterprise determines “how” to implement these solutions, you may look to a cloud based system for speed to market and cost effectiveness compared to on-premises solutions.
And don’t forget to register for Informatica World 2014 in Las Vegas, where you can take part in sessions and networking tailored specifically for insurers.
Unlike some of my friends, I truly enjoyed History as a subject in high school and college. I particularly appreciated biographies of favorite historical figures because they painted a human face on the past and gave it meaning and color. I also vowed at that time to navigate my life and future by the principle attributed to Harvard professor Jorge Agustín Nicolás Ruiz de Santayana y Borrás (George Santayana): “Those who cannot remember the past are condemned to repeat it.”
So that’s a little ditty about my history with history.
Fast-forwarding to the present, in which I have carved out my career in technology, and in particular enterprise software, I’m afforded a great platform to talk with lots of IT and business leaders. When I do, I usually ask them, “How are you implementing advanced projects that help the business become more agile, effective, or opportunistically proactive?” They usually answer something along the lines of “this is the age and renaissance of data science and analytics,” then end up talking exclusively about their meat-and-potatoes business intelligence projects and how 300 reports now run their business.
Then, when I probe and hear their answers in more depth, I am once again reminded of THE history quote and think to myself that there’s an amusing irony at play here. Most Business Intelligence systems today are designed to “remember” and report on the historical past through large data warehouses holding a gazillion transactions, along with basic but numerous shipping and billing histories and maybe assorted support records.
But when it comes right down to it, business intelligence “history” is still just that: history. Nothing is really learned and applied right when and where it counts, when it would have made all the difference had the company been able to react in time.
So, in essence, by using standalone BI systems as they are designed today, companies are indeed condemned to repeat what they have already learned because they are too late – so the same mistakes will be repeated again and again.
This means the challenge for BI is to reduce latency, measure the pertinent data, sensors, and events, and get scalable: extremely scalable, and flexible enough to handle the volume and variety of the forthcoming data onslaught.
There’s a part 2 to this story, so keep an eye out for my next blog post, History Repeats Itself (Part 2).
The cost of 1GB of magnetic disk storage 20 years ago was $1,000; now it’s eight cents. 1GB is enough to store about 20 thousand letter-size scanned documents. Storing the same number of paper documents would require two four-drawer filing cabinets, which would cost about $400. The cost of electronic data storage is thus five thousand times less than paper storage.
Costs have dropped consistently by about 40% per year, which accounts for the more-than-12,000-times reduction in cost since 1992. The cost of RAID or mainframe disk storage is somewhat greater, but the historical trend for other storage devices has been similar, and the forecast for the foreseeable future is that costs will continue to decrease at the same rate. Twenty years from now we will be able to buy one terabyte of storage for a penny. (more…)
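As a quick back-of-the-envelope check on those figures, using only the numbers quoted above ($1,000 per GB in 1992, eight cents per GB today, a 20-year span):

```python
# Back-of-the-envelope check of the storage cost trend.
cost_1992 = 1000.0   # dollars per GB in 1992
cost_2012 = 0.08     # dollars per GB today (eight cents)
years = 20

# Total reduction over 20 years: the "more than 12,000 times" figure.
reduction = cost_1992 / cost_2012
print(round(reduction))  # 12500

# Implied average annual decline, close to the 40%-per-year figure.
annual_decline = 1 - (cost_2012 / cost_1992) ** (1 / years)
print(round(annual_decline, 3))  # 0.376

# Projecting the same rate 20 more years forward, cost of one terabyte.
cost_2032_per_tb = cost_2012 * (1 - annual_decline) ** years * 1000
print(round(cost_2032_per_tb, 4))  # 0.0064 dollars, i.e. under a penny
```

The compounding works out: a steady decline of roughly 38–40% per year turns $1,000 per gigabyte into eight cents, and eight cents per gigabyte into well under a penny per terabyte.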
The Informatica World pre-conference sessions kicked off on Monday at the Gaylord National Harbor, Washington, DC. One of the four sessions was “Leveraging the Flexibility of Informatica MDM – An Architecture Deep Dive,” conducted by Dmitri Korablev, VP of MDM Strategy; Ron Matusof, VP of MDM Solution Architecture; and Steve Hoskin, MDM Chief Architect. The key objective was to explain the advanced architectural concepts of Informatica MDM relating to security, high performance, high availability, concurrency, and integration.
Dmitri started off by quoting Albert Einstein, “make everything as simple as possible, but not simpler,” as the guiding principle that drove the design of Informatica MDM. He and Ron presented a six-step process for designing an MDM solution: defining usage scenarios, selecting solution options, evaluating consumption patterns, defining the data model, defining the solution architecture, and applying non-functional requirements to the solution. Deeper conversations in each of the steps emphasized the guiding principle: keep MDM solution design simple! (more…)
The August 23rd issue of ComputerWorld contains a series of articles related to 2020 vision, megatrends, and careers. Predicting the future is always a risky business, but nonetheless the articles provide some interesting food for thought, and useful suggestions, for actions that should be taken now. (more…)
I spent last week at EMCWorld, and most of my time was spent engaging with customers in a variety of ways. One thing I always find interesting is the amazing consistency of priorities across our global customers.
For example, we held a session for executives, and I simply asked the open-ended question, “Regardless of whether it involves our products or not, what is your top IT priority this year?” The answer was clear, overwhelming, and simple, yet also rather surprising.
But before I tell you the answer, I want to tell you why it was surprising to me. I keep up with the CIO surveys and the trends and buzzwords. This particular trend seems to be invisible in the media hype and yet this group of CIO’s and senior execs were almost in unanimous agreement that it was at the top of their priority list. (more…)
OK, I’m excited. What do AAA, Dolby and Bax Global all have in common besides all being Informatica customers? Give up? Let me tell you … they’re all presenting at this week’s San Francisco Dreamforce event alongside our On Demand general manager, Ron Papas (you know the one who’s been ‘drinking the Kool-Aid’). Guess what they’ll be talking about … give up? Their presentation is called “Salesforce Integration – It’s not just for IT any more”.
Responsibility for SaaS integration often lies outside of IT; in fact, SaaS administrators have different skill sets from IT admins, and these presenters are going to explain why software (like ours) that addresses the needs of SaaS admins is imperative for success in today’s business environment.
By the way, did I tell you I was excited?! Well, I have reason to be. For the first time in a long time, the industry is witnessing a momentous shift in the way companies manage their data integration processes – and you know what? Without us, it wouldn’t be possible!
Informatica data integration is critical for SaaS!
A blog at IT-Director.com caught my eye yesterday. Entitled “IT Budgets, Clouds and Virtualization” it included the following comment:
“For Cloud computing, chief amongst these concerns is the readiness of commercial organisations to trust significant proportions of their essential, and hence incredibly valuable, corporate information to platforms and suppliers over whom they have little control and who might hold the data wherever they wish. Such a leap of faith is today beyond consideration in many business scenarios.”
This is spot-on. There is so much talk about “cloud computing this, and cloud computing that.” When it comes to corporations, there are many examples of outsourcing non-core business processes to the cloud. Here at Informatica we use over 17 different cloud services ourselves. I’d say the most mission-critical of these is our email marketing system (can’t tell you whose, or I’d have to shoot you!). We’re rolling it out worldwide across our marketing team and have spent the last few months integrating it with our own on-premise CRM system, contact hub, and data warehouse. Not a trivial task, but incredibly important for me (well, I’m a marketeer), though probably less mission-critical to our CFO!
At the end of the day, corporations WILL move data into the clouds, so whilst I agree with Tony in the above-mentioned article, I also disagree with him (OK, bit of a split personality here). I agree that it is foolish to simply “go to the clouds,” but I disagree about the state of the industry. It is possible to keep the data secure, and we, among others, have proven that with our on-demand integration service. It is also possible to integrate such services into core business processes. My statement would be: don’t overlook the integration. You do it at your peril. We’ve had a LOT of experience helping companies do this effectively; after all, we’re the data integration company!