But, as Billy Macinnes, in his July MicroScope article, reminds us, the opportunities come with many challenges, and so far only a few ISVs have risen high enough to truly meet them all. While the article itself is more concerned about where ISVs are headed, Macinnes and the industry experts he references, such as Mike West and Philip Howard, make it clear that no one is going anywhere far without a cloud strategy that meaningfully addresses data integration.
As a business app consumer myself, I too am excited by the possibilities that exist. I am intrigued by the way in which the new applications embrace the user-first ethos and deliver consumer-app-like interfaces and visual experiences. What concerns me is what happens next, once you get beyond the pretty design and actually try to solve the business use case for which the app is intended. This, unfortunately, is where many business apps fail. While most can access data from a single specific application, few can successfully interact with external data coming from multiple sources.
Like many of the challenges (such as licensing and provisioning) faced by today’s ISVs, data integration lies outside a typical app developer’s area of expertise. Let’s say, for example, you’ve just come up with a new way to anticipate customer needs and match them with excess inventory. While the developer expertise and art of the app may be, say, in a new algorithm, the user experience ultimately depends equally on your ability to surface data – inventory, pricing, SKU numbers, etc. – that may be held in SaaS and on-premises systems and seamlessly marry it – behind the scenes – to cloud-based customer information.
The bottom line is that regardless of the genius behind your idea or user interface, if you can’t feed relevant data into your application and ensure its completeness, quality and security for meaningful consumption, your app will be dead in the water. As a result, app developers are spending an inordinate amount of time – in some cases up to 80% of their development cycle – working through data issues. Even with that, many still get stuck and end up with little more for their effort than a hard lesson in the difficulties of enterprise data integration long understood by every integration developer.
Fortunately, there is a better way: cloud integration.
Cloud integration enables the developer to focus on their app and core business. The ISV can offer cloud integration to its customers as an external resource or as an embedded part of its app. While some may see this as a choice, any ISV looking to provide the best possible user experience has no real option other than to embed the integration services as part of their application.
Look at any successful business app, and chances are you’ll find something that empowers users to work independently, without having to rely on other teams or tools for solutions. Take, for example, the common use case of bringing data into an app via a CSV file. With integration built directly into the app, users can upload the file and resolve any semantic conflicts themselves, with no assistance from IT. Without it, they are reliant on others to do their jobs, and ultimately less productive. Clearly, the better experience is the one that provides users with easy access to everything needed – including data from multiple sources – to get the work done themselves. And the most effective way you can do that is by embedding integration into the application.
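The CSV scenario above can be sketched in a few lines. This is a minimal, illustrative example of the header-mapping step an embedded integration might perform when a user uploads a file; the target fields and aliases are hypothetical, not any product’s actual schema.

```python
import csv
import io

# Hypothetical target schema with accepted header aliases.
TARGET_FIELDS = {
    "account_name": {"account", "company", "account name"},
    "contact_email": {"email", "e-mail", "contact email"},
}

def map_headers(csv_text):
    """Match uploaded CSV headers to target fields; collect conflicts."""
    headers = [h.strip().lower() for h in next(csv.reader(io.StringIO(csv_text)))]
    mapping, unresolved = {}, []
    for header in headers:
        for field, aliases in TARGET_FIELDS.items():
            if header == field or header in aliases:
                mapping[header] = field
                break
        else:
            unresolved.append(header)  # surfaced to the user to resolve manually
    return mapping, unresolved
```

Headers the mapper cannot place are returned to the user for manual resolution – the in-app experience described above – rather than being thrown over the wall to IT.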
Now that we’ve settled why cloud integration works best as an embedded capability, let’s take a closer look at how it works within the application context.
With cloud integration embedded into your app, you can work behind the scenes to connect different data sources and incorporate the mapping and workflows between your app and the universe of enterprise data sources. The embedded integration accomplishes this through abstraction. By abstracting connectivity to these data sources, you take the complexities involved in bringing data from an external source – such as SAP or Salesforce – and place them within a well-defined integration template, or Vibe Integration Package (VIP). Once these abstractions are defined, you can then, as an application developer, access these templates through a REST API and bring the specified data into your application.
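The abstraction idea can be sketched as code. The classes below are illustrative only – a sketch of the pattern, not Informatica’s actual VIP internals: each source hides its access details behind one connector interface, and a template pairs a connector with a field mapping.

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """Common interface that hides each source's connectivity details."""
    @abstractmethod
    def fetch(self, object_name):
        """Return rows for a named object from the underlying source."""

class InMemoryConnector(Connector):
    """Stand-in for a real SAP or Salesforce connector."""
    def __init__(self, data):
        self.data = data

    def fetch(self, object_name):
        return self.data[object_name]

class IntegrationTemplate:
    """Pairs a connector with a source-to-app field mapping."""
    def __init__(self, connector, field_map):
        self.connector = connector
        self.field_map = field_map

    def run(self, object_name):
        # Translate each source row into the app's field names.
        return [{app_field: row[src_field]
                 for src_field, app_field in self.field_map.items()}
                for row in self.connector.fetch(object_name)]
```

Because the template owns both connectivity and mapping, the application code that calls `run()` never needs to know which backend is on the other side.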
While connectivity abstraction and REST APIs are important on their own, like all great pairings, it is only in combination that their true utility is realized. In fact, taken separately, neither is of much value to the application developer. Alone, a REST API can access the raw data type, but without the abstraction, the information is too unintelligible and incomplete to be of any use. And without the REST API, the abstracted data has no way of getting from the source to the application.
The value that REST APIs together with connectivity abstraction bring cannot be overstated, especially when the connectivity can span multiple integration templates. The mechanism for accomplishing integration is, like an automobile transmission, incredibly complex. But just as a car’s shift lever exposes a simple interface to move the gears from Park to Drive, activating a series of complex mechanisms under the hood, the integration templates allow the user to work with the data in any way they want without ever having to understand the complexities going on underneath.
As the leading cloud integration solution and platform, Informatica Cloud has long recognized the importance of pairing REST APIs and connectivity abstraction.
The first and most important function within our REST API is administration. It enables you to set up your organization and administer users and permissions. The second function allows you to run and monitor integration tasks. And with the third, end users can configure the integration templates themselves and enforce the business rules that apply to their specific process. You can view the entire set of Informatica Cloud REST API capabilities here.
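The three interaction patterns can be sketched as request builders. Note that the base URL, paths and payload fields below are illustrative assumptions, not the exact Informatica Cloud API contract – consult the linked API documentation for the real endpoints.

```python
import json

# Placeholder base URL; a real client would use the URL returned at login.
BASE = "https://icloud.example.invalid/api/v2"

def login_request(username, password):
    """Administration: authenticate and obtain a session."""
    return ("POST", f"{BASE}/user/login",
            json.dumps({"username": username, "password": password}))

def run_task_request(session_id, task_id):
    """Run (and later monitor) a previously defined integration task."""
    return ("POST", f"{BASE}/job",
            json.dumps({"taskId": task_id, "taskType": "DSS"}),
            {"icSessionId": session_id})

def configure_template_request(session_id, template_id, params):
    """Configure a template's parameters (e.g. filters, field mappings)."""
    return ("POST", f"{BASE}/template/{template_id}",
            json.dumps({"parameters": params}),
            {"icSessionId": session_id})
```

Separating request construction from transport like this keeps the administration, execution and configuration flows testable without touching the network.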
It is in this last area – integration configurability – where we are truly setting ourselves apart. The Vibe Integration Packages (VIPs) not only abstract backend connectivity but also ensure that the data is complete – with the needed attributes from the underlying apps – and is of high quality and formatted for easy consumption in the end-user application. With the Packages, we’ve put together many of the most common integrations with reusable integration logic that is configurable through a variety of parameters. Our configurable templates enable your app users to customize and fine-tune their integrations – with custom fields, objects, etc. – to meet the specific behavior and functionality of their integrations. For example, the Salesforce to SAP VIP includes all the integration templates you need to solve different business use cases, such as integrating product, order and account information.
With their reusability and groupings encompassing many of the common integration use cases, our Vibe Integration Packages really are revolutionizing work for everyone. Using Informatica Cloud’s Visual Designer, developers can quickly create new, reusable VIPs, with parameterized values, for business users to consume. And SaaS administrators and business analysts can perform complex business integrations in a fraction of the time it took previously, and customize new integrations on the fly, without IT’s help.
More and more, developers are building great-looking apps with even greater aspirations. In many cases, the only thing holding them back is the ability to access back-office data without using external tools and interfaces, or outside assistance. With Informatica Cloud, data integration need no longer take a backseat to design, or anything else. Through our REST API, abstractions and Vibe Integration Packages, we help developers put an end to the compromise on user experience by bringing in the data directly through the application – for the benefit of everyone.
Amazon Redshift, one of the fast-rising stars in the AWS ecosystem, has taken the data warehousing world by storm ever since it was introduced almost two years ago. Amazon Redshift operates completely in the cloud, and allows you to provision nodes on-demand. This model allows you to overcome many of the pains associated with traditional data warehousing techniques, such as provisioning extra server hardware, sizing and preparing databases for loading, or extensive SQL scripting.
However, you may find it challenging to load data into Redshift in a timely manner. Reducing load times can require a tremendous amount of time spent writing SQL optimization queries, which takes away from the value proposition of using Redshift in the first place.
Informatica Cloud helps you load this data into Redshift in just a few minutes. To start using Informatica Cloud, you’ll first need to establish connections to Redshift and your other data sources. Here are a few easy steps to help you get started with establishing connections to both a relational database, such as MySQL, and Redshift in Informatica Cloud:
- Log in to your Informatica Cloud account, go to Configure -> Connections, click “New”, and select “MySQL” for “Type”
- Select your Secure Agent and fill in the rest of the database details:
- Test your connection and then click ‘OK’ to save and exit
- Now, log in to your AWS account and go to the Redshift service page
- Go to your cluster configuration page and make a note of the cluster and cluster database properties: Number of Nodes, Endpoint, Port, Database Name, JDBC URL. You also will need:
- The Redshift database user name and password (which are different from your AWS account credentials)
- AWS account Access Key
- AWS account Secret Key
- Exit the AWS console.
- Now, back in your Informatica Cloud account, go to Configure -> Connections and click “New”.
- Select “AWS Redshift (Informatica)” for “Type” and fill in the rest of the details from the information you have from above
- Test the connection and then click ‘OK’ to save and exit
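Once both connections are in place, the fastest bulk-load path into Redshift is the COPY command reading files staged in S3 – the standard Redshift mechanism that integration tools automate behind the scenes. The sketch below simply assembles such a statement; the table, bucket and credential values are placeholders.

```python
def build_copy_statement(table, s3_path, access_key, secret_key):
    """Assemble a Redshift COPY statement for a CSV file staged in S3."""
    return (
        f"COPY {table} FROM '{s3_path}' "
        f"CREDENTIALS 'aws_access_key_id={access_key};"
        f"aws_secret_access_key={secret_key}' "
        "FORMAT AS CSV IGNOREHEADER 1;"  # skip the header row
    )
```

COPY loads data in parallel across the cluster’s nodes, which is why staged bulk loads beat row-by-row inserts by a wide margin.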
As you can see, establishing connections is extremely easy and can be done in less than 5 minutes. To learn how customers such as UBM used Informatica Cloud to deliver next-generation customer insights with Amazon Redshift, please join us on September 16 for a webinar where product experts from Amazon and UBM will explain how your company can benefit from cloud data warehousing for petabyte-scale analytics using Amazon Redshift.
What I love about the cloud is that it has something of value to offer practically any government organization, regardless of size, maturity, point of view or approach. Even for the most conservative IT shops, there are use cases that just plain make sense. And with the growing availability of FedRAMP-certified offerings, it’s becoming easier to procure. But, thinking realistically, for reasons of law, budget, time and architecture, we know the cloud will not be the solution for every public sector problem. Some applications, and some data, will never leave your agency’s premises. And herein lies the new complexity. You have applications and data on-prem. You have applications and data in the cloud. And you have business requirements that require these apps to work together, to share data.
So, now that you have a hybrid environment, what can you do about it? Let’s face it: we can talk about technology, architecture and approaches all day long, but it always comes down to this – what should be done with the data? You need answers to questions such as: Is it safe? Is it accessible? Is it reliable? How do I know if its integrity has been compromised? What about the quality? How error-prone is the data? How complete is it? How do we manage it across this new hybrid landscape? How can I get data from a public cloud application to my on-prem data warehouse? How can I leverage the flexibility of public IaaS to build a new application that needs access to data also required by an on-prem legacy application?
I know many government IT professionals are wrestling with these questions and seeking solutions. So, here’s an interesting thought: most of these questions are not exactly new; they are just taking on the added context of the cloud. Prior to the cloud, many agencies discovered answers in the form of a data integration platform. The platform is used to ensure every application and every user has access to the data needed to perform their mission or job. I think of it this way: the platform is a “standardized” abstraction layer that ensures all your data gets to where it needs to be, when it needs to be there, in the form it needs to be in. There are hundreds of government IT shops using such an approach.
Here’s the good news. This approach to integrating data can be extended to include the cloud. Imagine placing “agents” in all the places where your data needs to live, the agents capable of communicating with each other to integrate, alter or move data. Now add to this the idea of a cloud-based remote control that allows you to control all the functions of the agents. Using such a platform now enables your agency to tie on-prem systems to cloud systems, minimizing the effect of having multiple silos of information. Now government workers and warfighters will have the ability to more quickly get complete, accurate data, regardless of where it originates and citizens will benefit from more effectively delivered services.
How would such an approach change your ideas on how to leverage the cloud for your agency? If you live near the Washington, DC area, you may wish to drop in on the Government Cloud Computing and Data Center Conference & Expo. One of my colleagues, Ronen Schwartz, will be discussing this topic. For those not in the vicinity, you can learn more here.
Informatica Cloud Summer ’14 Release Breaks Down Barriers with Unified Data Integration and Application Integration for Real Time and Bulk Patterns
This past week, Informatica Cloud marked an important milestone with the Summer 2014 release of the Informatica Cloud platform. This was the 20th Cloud release, and I am extremely proud of what our team has accomplished.
“SDL’s vision is to help our customers use data insights to create meaningful experiences, regardless of where or how the engagement occurs. It’s multilingual, multichannel and on a global scale. Being able to deliver the right information at the right time to the right customer with Informatica Cloud Summer 2014 is critical to our business and will continue to set us apart from our competition.”
– Paul Harris, Global Business Applications Director, SDL plc
When I joined Informatica Cloud, I knew that it had the broadest cloud integration portfolio in the marketplace: leading data integration and analytic capabilities for bulk integration, comprehensive cloud master data management and test data management, and over a hundred connectors for cloud apps, enterprise systems and legacy data sources – all delivered in a self-service design with point-and-click wizards for citizen integrators, without the need for complex and costly manual custom coding.
But, I also learned that our broad portfolio belies another structural advantage: because of Informatica Cloud’s unique, unified platform architecture, it has the ability to surface application (or real time) integration capabilities alongside its data integration capabilities with shared metadata across real time and batch workflows.
With the Summer 2014 release, we’ve brought our application integration capabilities to the forefront. We now provide the most complete cloud app integration capability in the marketplace. With a design environment that’s meant not just for developers but also for line-of-business IT, app admins can now build real time process workflows that cut across on-premise and cloud and include built-in human workflows. And with the capability to translate these process workflows instantly into mobile apps for iPhone and Android devices, we’re not just setting ourselves apart but also giving customers the unique capabilities they need for their increasingly mobile employees.
“Schneider’s strategic initiative to improve front-office performance relied on recording and measuring sales person engagement in real time on any mobile device or desktop. The enhanced real time cloud application integration features of Informatica Cloud Summer 2014 makes it all possible and was key to the success of a highly visible and transformative initiative.”
– Mark Nardella, Global Sales Process Director, Schneider Electric SE
With this release, we’re also giving customers the ability to create workflows around data sharing that mix and match batch and real time integration patterns. This is really important. Because unlike the past, where you had to choose between batch and real time, in today’s world of on-premise, cloud-based, transactional and social data, you’re now more than ever having to deal with both real time interactions and the processing of large volumes of data. For example, let’s consider a typical scenario these days at high-end retail stores. Using a clienteling iPad app, the sales rep looks up bulk purchase history and inventory availability data in SAP, confirms availability and delivery date, and then processes the customer’s order via real time integration with NetSuite. And if you ask any customer, having a single workflow to unify all of that for instant and actionable insights is a huge advantage.
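The retail scenario can be reduced to a sketch that mixes the two patterns in one flow: customer context arrives via a nightly batch load, while the order itself is placed through a real-time call. The system and field names below are stand-ins, not actual SAP or NetSuite interfaces.

```python
# Context loaded in bulk from the back-office system overnight (batch leg).
NIGHTLY_BATCH = {
    "CUST-42": {"purchase_history": 12, "in_stock": True},
}

def place_order(customer_id, sku, submit_realtime):
    """Combine batch-loaded context with a real-time order submission."""
    profile = NIGHTLY_BATCH.get(customer_id)
    if profile is None or not profile["in_stock"]:
        return {"status": "rejected"}
    # submit_realtime is the real-time leg, e.g. an order-entry API call.
    return submit_realtime({"customer": customer_id, "sku": sku})
```

The point of a unified platform is that both legs live in one workflow, so the sales rep never sees the batch/real-time seam.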
“Our industry demands absolute efficiency, speed and trust when dealing with financial information, and the new cloud application integration feature in the latest release of Informatica Cloud will help us service our customers more effectively by delivering the data they require in a timely fashion. Keeping call-times to a minimum and improving customer satisfaction in real time.”
– Kimberly Jansen, Director CRM, Misys PLC
We’ve also included some exciting new Vibe Integration Packages, or VIPs. VIPs deliver pre-built business process mappings between front-office and back-office applications. The Summer 2014 release includes new bidirectional VIPs for Siebel to Salesforce and SAP to Salesforce that make it easier for customers to connect Salesforce with these mission-critical business applications.
And last, but not least, the release includes a critical upgrade to our API Framework that gives the Informatica Cloud iPaaS end-to-end support for connectivity to any company’s internal or external APIs. With the newly available API creation, definition and consumption patterns, developers and citizen integrators can now easily expose integrations as APIs, and users can consume them via integration workflows or apps, without the need for any additional custom code.
The features and capabilities released this summer are available to all existing Informatica Cloud customers, and everyone else through our free 30-day trial offer.
In the journey from a single-purpose cloud CRM app to the behemoth that it is today, Salesforce has made many smart acquisitions. However, the recent purchase of RelateIQ may have just been its most ingenious. Although a relatively small startup, RelateIQ has gained a big reputation for its innovative use of data science and predictive analytics, which would be highly beneficial to Salesforce customers.
As is evident from the acquisition, there is little doubt that the cloud application world is making a tectonic shift toward data science, and the appetite for piecing Big Data together to fuel the highly desired 360-degree view is only growing stronger.
But while looking ahead is certainly important, those of us who live in the present still have much work to accomplish in the here and now. For many, that means figuring out the best way to leverage data integration strategies and the Salesforce platform to gain actionable intelligence for our sales, marketing and CRM projects – today. Until recently, this has involved manual, IT-heavy processes.
We need look no further than three common use cases where typical data integration strategies and technologies fail today’s business users:
Automated Metadata Discovery
The first, and perhaps most frustrating, has to do with discovering related objects. For example, objects, such as Accounts, don’t exist in a vacuum. They have related objects, such as Contacts, that can provide the business user – be it a salesperson or a customer service manager – with context about the customer.
Now, during the typical data-integration process, these related objects are obscured from view because most integration technologies in the market today cannot automatically recognize the object metadata that could potentially relate all of the customer data together. The result is an incomplete view of the customer in Salesforce, and a lost opportunity to leverage the platform’s capability to strengthen a relationship or close a deal.
The Informatica Cloud platform is engineered from the ground up to be aware of the application ecosystem’s APIs and understand their metadata. As a result, our mapping engine can automatically discover metadata and relate objects to one another. This automated metadata discovery gives business users the ability to discover, choose and bring all of the related objects together into one mapping flow. Now, with just a few clicks, business users can quickly piece together relevant information in Salesforce and take the appropriate action.
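The discovery step can be sketched as a graph walk over relationship metadata of the kind an application API can expose (Salesforce’s describe calls, for instance). The object graph below is illustrative, not a real org’s metadata.

```python
# Illustrative relationship metadata: object -> related child objects.
RELATIONSHIPS = {
    "Account": ["Contact", "Opportunity"],
    "Opportunity": ["OpportunityLineItem"],
    "Contact": [],
}

def discover_related(root, metadata):
    """Return every object reachable from `root` via relationship metadata."""
    seen, stack = set(), [root]
    while stack:
        obj = stack.pop()
        if obj in seen:
            continue
        seen.add(obj)
        stack.extend(metadata.get(obj, []))  # follow child relationships
    return seen
```

Everything the walk returns can then be offered to the user as candidates for a single mapping flow, instead of leaving related objects obscured from view.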
Bulk Preparation of Data
The second instance where most data integration solutions typically fall short is with respect to the bulk preparation of data for analytic runs prior to the data transformation process. With the majority of Line-of-Business (LOB) applications now running in the cloud, the typical business user has multiple terabytes of cloud data that need to be warehoused, either on-premises or in the cloud, for a BI app to perform its analytics.
As a result, the best practice for bringing in data for analytics requires the ability to select and aggregate multiple data records from multiple data sources into a single batch, to speed up transformation and loading and – ultimately – intelligence gathering. Unfortunately, advanced transformations such as aggregations are simply beyond the capabilities of most other integration technologies. Instead, most bring the data in one record or message at a time, inhibiting speedy data loading and delivery of critical intelligence to business users when they need it most.
Informatica, by contrast, has leveraged its intellectual property in the market-leading data integration product, PowerCenter, and developed data aggregation transformation functionality within Informatica Cloud. This enables business users to pick a select group of important data points from multiple data sources – for example, country or dollar size of revenue opportunities – to aggregate and quickly process huge volumes in a single batch.
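The aggregation idea itself is simple to sketch: records from multiple sources are grouped and summed into a single batch before loading, rather than moved one record at a time. The field names below are illustrative only.

```python
from collections import defaultdict

def aggregate_revenue(*sources):
    """Group opportunity records from several sources by country, summing revenue."""
    totals = defaultdict(float)
    for source in sources:          # e.g. one list per LOB application
        for record in source:
            totals[record["country"]] += record["revenue"]
    return dict(totals)
```

Pushing this group-and-sum step ahead of the load means the warehouse receives one compact batch instead of millions of individual messages.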
In-App Business Accelerators
Similar to what we’ve experienced with the mobile iOS and Android platforms, there recently has been an explosion of new, single-purpose business applications being developed and released for the cloud. While the platform-specific mobile model is predicated on and has flourished because of the “build it and they will come” premise, the paradigm does not work in the same way for cloud-based applications that are trying to become a platform.
The reason for this is that, unlike with iOS and Android platforms, in the cloud world, business users have a main LOB application that they are familiar with and rely on packaged integrations to bring in data from other LOB cloud applications. However, with Salesforce extending its reach to become a cloud platform, the center of data gravity is shifting towards it being used for more than just CRM, and the success of this depends upon these packaged integrations. Up until now, these integrations have just consisted of sample code and have been incredibly complex (and IT-heavy) to build and very cumbersome to use. As a result, business users lack the agility to easily customize fields or their workflows to match their unique business processes. Ultimately, the packages that were intended to accelerate business ended up inhibiting it.
Informatica Cloud’s Vibe Integration Packages (or VIPs) have made the promise of the integration package as a business accelerator into a reality, for all business users. Unlike sample code, VIPs are sophisticated templates that encapsulate the intelligence or logic of how you integrate the data between apps. While VIPs abstract complexity to give users out-of-the box integration, their pre-built mapping also provides great flexibility. Now, with just a few keystrokes, business users can map custom fields or leverage their unique business model into their Salesforce workflows.
A few paragraphs back I began this discussion with the recent acquisition of RelateIQ by Salesforce. While we can make an educated guess as to what that will bring in the future, no one knows for sure. What we do know is that, at present, Salesforce’s completeness as a platform – and source of meaningful analytics – requires something in addition to run your relevant business applications and solutions through it. A complete iPaaS (Integration Platform as a Service) solution such as Informatica Cloud has the power to turn that vision into reality. Informatica enables this through a metadata-rich platform for data discovery, industry-leading data and application integration capabilities, and business accelerators that put the power back in the hands of citizen application integrators and business users.
Join our webinar August 14 to learn more: Informatica Cloud Summer 2014: The Most Complete iPaaS
SOA was born out of purposeful intent, to solve a specific problem in a particularly novel way: standards-based and interoperable service-based integration driven by the WS-* standardization efforts. It foreshadowed the fragmentation of the monolithic on-premise software providers and pre-dated the rise of a new cloud-centric world – and it arguably arrived too fast for many organizations to take advantage of it on-premise. The constant churn of WS-* specifications didn’t help the cause either.
Some IT shops got bogged down in religious arguments over WS-* vs. REST while others pushed on, bolting on service interfaces to existing application stacks and protocols and building new service infrastructure as an investment for the future. The result, as we all know, was a lot of hype and dashed expectations for some.
Fast forward five years, and the future foreshadowed by SOA is almost a reality. And while SOA (the acronym) may be dead, the need for a service-oriented architecture is very much alive.
We now live in a hybrid world, populated by cloud, social and on-premise applications, and the move to the cloud for business is a fait accompli — or at the least, inevitable. Cloud initiatives are fueling a new type of service-oriented integration – one where, unlike in the past, the approach is no longer strictly defined by protocols but rather by application services and event-based integration.
In this new world, IT no longer controls the architecture of the apps its business users use (or where they execute), and so consumers and providers – cloud apps, on-premise apps and systems – need to interact in loosely-coupled service-oriented ways. This evolution forces new integration realities that had for many been hidden from sight and kept within the domain of application owners.
Eight or nine years ago, when SOA fever was at its height, everyone was running around trying to transform their internal systems and build new and complex infrastructure to meet an incomplete technological imperative.
Today, the landscape has completely changed. The need for ESBs and tightly coupled integrations that expose the innards of your infrastructure no longer apply. Eventually, as applications move to the Cloud, there will no longer be much infrastructure left to expose. Instead, the integrations are and will increasingly be occurring in the cloud, over an open framework, through high-level service-centric APIs.
At Informatica, we’ve taken the lessons and imperatives of SOA – simplicity, data consistency and accessibility and security – and incorporated it into a platform that makes the promise of service-oriented, hybrid, event-driven integration a reality.
We’ve innovated, and now deliver tooling that both enables technically savvy application owners to implement integrations themselves and IT to assist. And we’ve also made it possible for application owners to consume data and business services and processes in an intuitive user interface that abstracts the underlying details of our hybrid integration platform.
The result is an integration platform that empowers application owners. This is what makes what we’re currently doing at Informatica Cloud so particularly exciting, and potentially disruptive.
With practically every on-premise application having a counterpart in the SaaS world, enterprise IT departments have truly made the leap to a new way of computing that is transforming their organizations. The last mile of cloud transformation lies in the field of integration, and it is for this purpose that Informatica had a dedicated Cloud Day this year at Informatica World 2014.
The day kicked off with an introduction by Ronen Schwartz, VP and GM of Informatica Cloud, to the themes of intelligent data integration, comprehensive cloud data management, and cloud process automation. The point was made that with SaaS applications being customized frequently, and the need for more data insights from these apps, it is important to have a single platform that can excel at both batch and real-time integration. A whole series of exciting panel discussions followed, ranging from mission critical Salesforce.com integration, to cloud data warehouses, to hybrid integration use cases involving Informatica PowerCenter and Informatica Cloud.
In the mission critical Salesforce.com integration panel, we had speakers from Intuit, InsideTrack, and Cloud Sherpas. Intuit talked about how they went live with Informatica Cloud in under four weeks, with only two developers on hand. InsideTrack had an interesting use case, wherein they were using the force.com platform to build a native app that tracked performance of students and the impact of coaching on them. InsideTrack connected to several databases outside the Salesforce platform to perform sophisticated analytics and bring them into their app through the power of Informatica Cloud. Cloud Sherpas, a premier systems integrator and close partner of both Salesforce.com and Informatica, outlined three customer case studies of how they used Informatica Cloud to solve complex integration challenges. The first was a medical devices company that was trying to receive up-to-the-minute price quotes by integrating Salesforce and SAP, the second was a global pharmaceuticals company that was using Salesforce to capture data about their research subjects and needed to synchronize that information with their databases, and the third was Salesforce.com itself.
The die-hard data geeks came out in full force for the cloud data warehousing panel. Accomplished speakers from Microstrategy, Amazon, and The Weather Channel discussed data warehousing using Amazon Web Services. A first-time attendee to this panel would have assumed that cloud data warehousing simply dealt with running relational databases on virtual machines spun up on EC2, but instead participants were enthralled to learn that Amazon Redshift was a relational database that ran 100% in the cloud. The Weather Channel uses Amazon Redshift to perform analytics on almost 750 million rows of data. Using Informatica Cloud, they can load this data into Redshift in a mere half hour. Microstrategy talked about their cloud analytics initiatives and how they looked at it holistically from a hybrid standpoint.
On that note, it was time for the panel of hybrid integration practitioners to take the stage, with Qualcomm and Conde Nast discussing their use of PowerCenter and Cloud. Qualcomm emphasized that the value of Informatica Cloud was the easy access to a variety of connectors, and that they were using connectors for Salesforce, NetSuite, several relational databases, and web services. Conde Nast mentioned that it was extremely easy to port mappings between PowerCenter and Cloud due to the common code base between the two.
Salesforce.com is one of the most widely used cloud applications across every industry. Initially, Salesforce gained dominance among mid-market customers due to the agility and ease of deployment that the SaaS approach delivered. A cloud-based CRM system enabled SMB companies to easily automate sales processes that recorded customer interactions during the sales cycle, and to scale without costly infrastructure to maintain. This resulted in faster growth, and in most cases demonstrated rapid ROI on a Salesforce deployment.
The Eye of the Enterprise
When larger enterprises saw the rapid growth that mid-market players had achieved, they realized that Salesforce was a unique technology enabler capable of helping their businesses to also speed time to market and scale more effectively. In most enterprises, the Salesforce deployments were driven by line-of-business units such as Sales and Customer Service, with varying degrees of coordination with central IT groups – in fact, most initial Salesforce orgs were deployed fairly autonomously from central IT.
With Great Growth Comes Greater Integration Challenges
When these business units needed to engage with each other to run cross-functional tasks, the lack of a single customer view across the siloed Salesforce instances became a problem. Each individual Salesforce org had its own version of the truth, and it was impossible to locate where in the sales cycle each customer was with respect to each business unit. As a consequence, cross-selling and upselling became very difficult. In short, the very application that had been a key technology enabler for growth was now posing challenges to meeting business objectives.
Scaling for Growth with Custom Apps
While many companies use the pre-packaged functionality in Salesforce, ISVs have also begun building custom apps on the Force.com platform due to its extensibility and rapid customization features. By using Salesforce to build native applications from the ground up, they could design innovative user interfaces that expose powerful functionality to end users. However, to truly add value, it was not just the user interface that mattered, but also the back end of the technology stack. This was especially evident when it came to aggregating data from several sources and surfacing it in the custom Force.com apps.
On April 23rd at 10am PDT, you'll hear how CIOs from two different companies tackled the above integration challenges with Salesforce: Eric Johnson of Informatica, a Rising Star finalist in the 2013 Silicon Valley Business Journal CIO Awards, and Derald Sue of InsideTrack, one of Computerworld's 2014 Premier 100 IT Leaders.
SaaS companies are growing rapidly and becoming the top priority for most CIOs. With such high growth expectations, many SaaS vendors are investing in sales and marketing to acquire new customers even if it means having a negative net profit margin as a result. Moreover, with the pressure to grow rapidly, there is an increased urgency to ensure that the Average Sales Price (ASP) of every transaction increases in order to meet revenue targets.
The nature of the cloud allows these SaaS companies to release new features every few months, which sales reps can then promote to new customers. When new functionality is neither used nor understood, customers often feel that they have overpaid for a SaaS product. In such cases, customers usually downgrade to a lower-priced edition or, worse, leave the vendor entirely. To make up for this loss, sales representatives must work harder to acquire new leads, which results in less attention for existing customers. Preventing customer churn is therefore critical: the Cost to Acquire a Customer (CAC) for upsells is only 19% of the CAC for new-customer dollars, and the CAC to renew existing customers is just 15%.
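To make those ratios concrete, here is a quick sketch of what they imply for the cost of bringing in the same revenue through each sales motion. The 19% and 15% figures come from the text above; the $1.00 baseline CAC and the $100,000 revenue target are illustrative assumptions.

```python
# CAC ratios from the text: upsell CAC is 19% of new-customer CAC,
# and renewal CAC is 15%. The baseline of $1.00 spent per $1 of
# new-customer revenue is an assumption for illustration only.
new_customer_cac = 1.00
upsell_cac = 0.19 * new_customer_cac
renewal_cac = 0.15 * new_customer_cac

# Cost to generate $100k of revenue through each motion
revenue_target = 100_000
print(f"New business: ${revenue_target * new_customer_cac:,.0f}")
print(f"Upsells:      ${revenue_target * upsell_cac:,.0f}")
print(f"Renewals:     ${revenue_target * renewal_cac:,.0f}")
```

In other words, under these assumptions, revenue retained or expanded from the existing base costs roughly a fifth as much to secure as the same revenue from brand-new customers.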
Accurate customer usage data helps determine which features customers use and which are underutilized. Gathering this data can help pinpoint high-value features that go unused, especially for customers that have recently upgraded to a higher edition. The process of collecting this data involves several touch points – from recording clicks within the app to analyzing the open rate of entire modules. This is where embedded cloud integration comes into play.
Embedding integration within a SaaS application allows vendors to gain operational insights into each aspect of how their app is being used. With this data, vendors are able to provide feedback to product management in regards to further improvements. Additionally, embedding integration can alert the customer success management team of potential churn, thereby allowing them to implement preventative measures.
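A minimal sketch of the kind of analysis described above – flagging entitled-but-untouched features as churn signals. The customer names, feature names, and event records here are entirely hypothetical and not tied to any Informatica or Gainsight API:

```python
# Hypothetical usage events captured inside a SaaS app:
# (customer_id, feature_name) pairs, one per recorded click/touch point.
events = [
    ("acme", "reports"), ("acme", "reports"), ("acme", "export"),
    ("globex", "reports"),
]

# Features included in the edition each customer pays for (assumed data).
entitled_features = {
    "acme": {"reports", "export", "forecasting"},
    "globex": {"reports", "forecasting"},
}

def underutilized(customer):
    """Return entitled features the customer has never touched --
    candidates for outreach by the customer success team."""
    used = {feature for cust, feature in events if cust == customer}
    return entitled_features[customer] - used

print(underutilized("acme"))  # {'forecasting'}
```

In practice the event stream would be fed by the embedded integration layer into an analytics store rather than held in memory, but the core question – which paid-for features show zero usage – stays the same.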
To learn more about how a specialized analytics environment can be set up for SaaS apps, join Informatica and Gainsight on April 9th at 10am PDT for an informational webinar Powering Customer Analytics with Embedded Cloud Integration.
An explosion in mobile devices and social media usage has been the driving force behind large brands using big data solutions for deep, insightful analytics. In fact, a recent mobile consumer survey found that 71% of people used their mobile devices to access social media.
With social media becoming a major avenue for advertising, and mobile devices being the medium of access, there are numerous data points that global brands can cross-reference to get a more complete picture of their consumer, and their buying propensities. Analyzing these multitudes of data points is the reason behind the rise of big data solutions such as Hadoop.
However, Hadoop itself is only one Big Data framework, and it comes in several different flavors. Facebook, which has called itself the owner of the world's largest Hadoop cluster at 100 petabytes, has outgrown Hadoop's capabilities and is looking into technology that would allow it to abstract its Hadoop workloads across several geographically dispersed datacenters.
When it comes to analytics projects that require intensive data warehousing, there is no one-size-fits-all answer for Big Data, as the use cases can be extremely varied, ranging from short-term to long-term. Deploying Hadoop clusters requires specialized skills and proper capacity planning. In contrast, Big Data solutions in the cloud such as Amazon Redshift allow users to provision database nodes on demand and in a matter of minutes, without large outlays for infrastructure such as servers and datacenter space. As a result, cloud-based Big Data can be a viable alternative for short-term analytics projects, as well as for fulfilling sandbox requirements to test out larger Big Data integration projects. Cloud-based Big Data may also make sense when only a subset of the data is required for analysis, as opposed to the entire dataset.
With cloud integration, much of the complexity of connecting to data sources and targets is abstracted away. Consequently, when a cloud-based Big Data deployment is combined with a cloud integration solution, it can result in even more time and cost savings and get the projects off the ground much faster.
We’ll be discussing several use cases around cloud-based Big Data in our webinar on August 22nd, Big Data in the Cloud with Informatica Cloud and Amazon Redshift, featuring special guests from Amazon.