Last week was Informatica’s first ever Data Mania event, held at the Contemporary Jewish Museum in San Francisco. We had an A-list lineup of speakers from leading cloud and data companies, such as Salesforce, Amazon Web Services (AWS), Tableau, Dun & Bradstreet, Marketo, AppDynamics, Birst, Adobe, and Qlik. The event and speakers covered a range of topics all related to data, including Big Data processing in the cloud, data-driven customer success, and cloud analytics.
While these companies are giants today in the world of cloud and have created their own unique ecosystems, we also wanted to take a peek at and hear from the leaders of tomorrow. Before startups can become market leaders in their own realm, they face the challenge of ramping up a stellar roster of customers so that they can get to subsequent rounds of venture funding. But what gets in their way are the numerous data integration challenges of onboarding customer data onto their software platform. When these challenges remain unaddressed, R&D resources are spent on professional services instead of building value-differentiating IP. Bugs also continue to mount, and technical debt increases.
Enter the Informatica Cloud Connector SDK. Built entirely in Java and able to browse through any cloud application’s API, the Cloud Connector SDK parses the metadata behind each data object and presents it in the context of what a business user should see. We had four startups build a native connector to their application in less than two weeks: BigML, Databricks, FollowAnalytics, and ThoughtSpot. Let’s take a look at each one of them.
With predictive analytics becoming a growing imperative, machine-learning algorithms that can have a higher probability of prediction are also becoming increasingly important. BigML provides an intuitive yet powerful machine-learning platform for actionable and consumable predictive analytics. Watch their demo on how they used Informatica Cloud’s Connector SDK to help them better predict customer churn.
Can’t play the video? Click here, http://youtu.be/lop7m9IH2aw
Databricks was founded out of the UC Berkeley AMPLab by the creators of Apache Spark. Databricks Cloud is a hosted end-to-end data platform powered by Spark. It enables organizations to unlock the value of their data, seamlessly transitioning from data ingest through exploration and production. Watch their demo, which showcases how the Informatica Cloud connector for Databricks Cloud was used to analyze lead contact rates in Salesforce and to perform machine learning on a dataset using either Scala or Python.
Can’t play the video? Click here, http://youtu.be/607ugvhzVnY
With mobile usage growing by leaps and bounds, the area of customer engagement on a mobile app has become a fertile area for marketers. Marketers are charged with acquiring new customers, increasing customer loyalty and driving new revenue streams. But without the technological infrastructure to back them up, their efforts are in vain. FollowAnalytics is a mobile analytics and marketing automation platform for the enterprise that helps companies better understand audience engagement on their mobile apps. Watch this demo where FollowAnalytics first builds a completely native connector to its mobile analytics platform using the Informatica Cloud Connector SDK and then connects it to Microsoft Dynamics CRM Online using Informatica Cloud’s prebuilt connector for it. Then, see FollowAnalytics go one step further by performing even deeper analytics on their engagement data using Informatica Cloud’s prebuilt connector for Salesforce Wave Analytics Cloud.
Can’t play the video? Click here, http://youtu.be/E568vxZ2LAg
Analytics has taken center stage this year due to the rise in cloud applications, but most of the existing BI tools out there still stick to the old way of doing BI. ThoughtSpot brings a consumer-like simplicity to the world of BI by allowing users to search for the information they’re looking for just as if they were using a search engine like Google. Watch this demo where ThoughtSpot uses Informatica Cloud’s vast library of over 100 native connectors to move data into the ThoughtSpot appliance.
Can’t play the video? Click here, http://youtu.be/6gJD6hRD9h4
The problem many banks encounter today is that they have vast sums of investment tied up in old ways of doing things. Historically, customers chose a bank and remained ‘loyal’ throughout their lifetime. Now competition is rife and loyalty is becoming a thing of the past. To stay ahead of the competition and to gain and keep customers, banks need to understand the ever-evolving market, disrupt norms and continue to delight customers. The tradition of staying with one bank out of family convention or ease has been replaced by a more informed customer who understands the variety of choice at their fingertips.
Challenger banks don’t build on ideas of tradition and legacy, looking for small adjustments to make. They embrace change. Longer-established banks can’t afford to do nothing and assume their size and stature will attract customers.
Here’s some useful information
Accenture’s recent report, The Bank of Things, succinctly explains what ‘Customer 3.0’ is all about. The connected customer isn’t necessarily younger. It’s everybody. Banks can get to know their customers better by making better use of information. It all depends on using intelligent data rather than all data. Interrogating the wrong data can be time-consuming, costly and results in little actionable information.
When an organisation sets out with the intention of knowing its customers, it can calibrate its data according to where the gold nuggets – the real business insights – come from. What do people do most? Where do they go most? Now that they’re using branches and phone banking less and less – what do they look for in a mobile app?
Customer 3.0 wants to know what the bank can offer them all the time, on the move, on their own device. They want offers designed for their lifestyle. Correctly deciphered data can drive the level of customer segmentation that empowers such marketing initiatives. This means an organisation has to have the ability and the agility to move with its customers. It’s a journey that never ends – technology will never have a cut-off point, just as customer expectations will never stop evolving.
It’s time for banks to re-shape banking
Informatica has been working with major retail banks globally to redefine banking excellence and realign operations to deliver it. We always start by asking our customers the revealing question: “Have you looked at the art of the possible to future-proof your business over the next five to ten years and beyond?” This is where the discussion begins to explore really interesting notions about unlocking potential. No bank can afford to ignore them.
With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.
Let’s unpack why.
To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it. SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.
Why this is the case has more to do with needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.
SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.
Enter REST APIs
REST APIs are the perfect vehicle for today’s SaaS businesses and mash-up applications. Sure, they’re more loosely defined than SOAP, but when all you want to do is send and receive some data, now, in the context you need, nothing is easier or better for the job than a REST API.
With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
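To make the contrast concrete, here is a minimal sketch in Python of what a REST call involves: a verb, a URL, a couple of headers, and some JSON to parse on the way back. The endpoint, token, and response payload are all hypothetical, purely for illustration.

```python
import json
from urllib.request import Request

# A REST call is just an HTTP verb plus a URL -- no WSDL, no SOAP envelope.
# The endpoint and token below are hypothetical, for illustration only.
req = Request(
    "https://api.example-saas.com/v1/contacts?status=active",
    headers={"Authorization": "Bearer <token>", "Accept": "application/json"},
    method="GET",
)

# The response is typically loosely structured JSON; the caller consumes
# only the fields it cares about and ignores everything else.
sample_response = '{"records": [{"id": 1, "email": "ada@example.com"}], "next": null}'
payload = json.loads(sample_response)
emails = [r["email"] for r in payload["records"]]
print(emails)  # ['ada@example.com']
```

No contract negotiation, no code generation on either side: the calling application only needs to know the URL and the handful of fields it wants.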
SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.
Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors caused by fat-finger data movement. That typically will give you the opposite result of stickiness, and is to be avoided at all costs.
While publishing an API as a way to exchange data with other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.
The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud’s SDKs are also richer, making it possible for you to adapt the REST API into something any business user will want to use – which is a huge advantage.
In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, any structural change to a REST API forces a matching change in every data flow that interacts with it directly. Building a native connector makes you more agile and insulates your custom-built integrations from breaking.
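To illustrate the idea, here is a small Python sketch of the abstraction a connector provides; the class, field names, and API versions are entirely hypothetical. Downstream data flows read a stable business object, so when the underlying REST API renames its fields, only the mapping inside the connector changes.

```python
# Hypothetical sketch: a connector maps stable business-object fields to
# whatever the underlying REST API currently exposes. When the API changes
# (here, v2 renames every field), only the mapping is updated -- downstream
# data flows keep reading the same object shape.

V1_MAPPING = {"name": "contact_name", "email": "contact_email"}
V2_MAPPING = {"name": "fullName", "email": "emailAddress"}  # breaking rename

class ContactConnector:
    def __init__(self, field_mapping):
        self.mapping = field_mapping

    def to_business_object(self, raw_record):
        """Translate a raw API record into the stable object a data flow sees."""
        return {field: raw_record[api_key] for field, api_key in self.mapping.items()}

v1_record = {"contact_name": "Ada", "contact_email": "ada@example.com"}
v2_record = {"fullName": "Ada", "emailAddress": "ada@example.com"}

# Same downstream shape from either API version:
obj_v1 = ContactConnector(V1_MAPPING).to_business_object(v1_record)
obj_v2 = ContactConnector(V2_MAPPING).to_business_object(v2_record)
print(obj_v1 == obj_v2)  # True
```

The data flows depend only on `to_business_object`’s output, which is the portability the paragraph above describes.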
Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.
Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.
The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.
For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.
Informatica’s Redshift connector is a state-of-the-art Bulk-Load type connector which allows users to perform all CRUD operations on Amazon Redshift. It makes use of AWS best practices to load data at high throughput in a safe and secure manner and is available on Informatica Cloud and PowerCenter.
Today we are excited to announce the support of Amazon’s newly launched custom JDBC and ODBC drivers for Redshift. Both the drivers are certified for Linux and Windows environments.
Informatica’s Redshift connector will package the JDBC 4.1 driver, which further enhances our metadata fetch capabilities for tables and views in Redshift. This improves our overall design-time responsiveness by over 25%. It also allows us to query multiple tables/views and retrieve the result set using primary and foreign key relationships.
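To give a feel for what driver-level metadata fetch makes possible at design time, here is a schematic Python sketch using the built-in sqlite3 module as a stand-in database. The real connector of course talks to Redshift through Amazon’s JDBC driver, and the table names below are invented; the point is just the pattern of enumerating tables and discovering key relationships.

```python
import sqlite3

# Stand-in database with a primary/foreign key relationship between two
# hypothetical tables, mimicking what a design-time UI would introspect.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id)
    );
""")

# Enumerate tables, as a design-time tool would when listing sources/targets.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)  # ['customers', 'orders']

# Discover the foreign-key relationship, so related tables can be queried
# together without the user hand-writing the join.
fk = conn.execute("PRAGMA foreign_key_list(orders)").fetchone()
print(fk[2], fk[3], fk[4])  # customers customer_id id
```

With that metadata in hand, a tool can present `orders` joined to `customers` as one logical object rather than two raw tables.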
Amazon’s ODBC driver enhances our FULL Push Down Optimization capabilities on Redshift. Key differentiators include support for the SYSDATE variable and for functions such as ADD_TO_DATE(), ASCII(), CONCAT(), LENGTH(), TO_DATE(), and VARIANCE(), which weren’t possible before.
Amazon’s ODBC driver is not pre-packaged but can be directly downloaded from Amazon’s S3 store.
Once installed, the user can change the default ODBC System DSN in the ODBC Data Source Administrator.
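On Linux, for example, a System DSN typically lives in odbc.ini; an illustrative entry might look like the following, where the DSN name, driver path, and cluster endpoint are all placeholders to adapt to your own installation.

```ini
; Illustrative /etc/odbc.ini entry -- DSN name, driver path, and host are
; placeholders; check your driver's documentation for the exact key names.
[Redshift-DSN]
Driver=/opt/amazon/redshiftodbc/lib/64/libamazonredshiftodbc64.so
Host=examplecluster.abc123xyz789.us-west-2.redshift.amazonaws.com
Port=5439
Database=dev
```

On Windows the same settings are entered through the ODBC Data Source Administrator dialog rather than a file.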
Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning
Are you ready to answer “Yes” to the questions:
a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”
I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.
The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20 with the theme “Make Data Work”. I want to focus this blog post on two topics related to making data work and business insights:
- How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
- How you can make data work in new and advanced ways for every user in your company.
Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.
“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”
Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects. The fully managed cloud service for building predictive analytics solutions from Azure and the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud help overcome the challenges most users face in making their data work effectively and efficiently for analytics use cases.
The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.
The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.
Success in using machine learning requires not only great analytics models, but also end-to-end cloud integration and data management capabilities: a wide breadth of data sources, data quality and data views that match the requirements of machine-learning modeling, and an ease of use that speeds iteration while providing high-performance, scalable data processing.
For example, the Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach and support advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.
Using the Informatica Cloud connector for Azure Storage with Informatica Cloud Data Integration enables optimized reading and writing of data to blobs in Azure Storage. Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry-leading cloud integration solution.
As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.
My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!
To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.
It’s no secret that the explosion of software-as-a-service (SaaS) apps has revolutionized the way businesses operate. From humble beginnings, the titans of SaaS today include companies such as Salesforce.com, NetSuite, Marketo, and Workday that have gone public and attained multi-billion dollar valuations. The success of these SaaS leaders has had a domino effect in adjacent areas of the cloud – infrastructure, databases, and analytics.
Amazon Web Services (AWS), which originally had only six services in 2006 with the launch of Amazon EC2, now has over 30 ranging from storage, relational databases, data warehousing, Big Data, and more. Salesforce.com’s Wave platform, Tableau Software, and Qlik have made great advances in the cloud analytics arena, to give better visibility to line-of-business users. And as SaaS applications embrace new software design paradigms that extend their functionality, application performance monitoring (APM) analytics has emerged as a specialized field from vendors such as New Relic and AppDynamics.
So, how exactly did the growth of SaaS contribute to these adjacent sectors taking off?
The growth of SaaS coincided with the growth of powerful smartphones and tablets. Seeing this form factor as important to the end user, SaaS companies rushed to produce mobile apps that offered their core functionality on a mobile device. Measuring adoption of these mobile apps was necessary to ensure that future releases met all the needs of the end user. Mobile apps generate a ton of information, such as app responsiveness, features utilized, and data consumed. As always, there were several types of users, with some preferring a laptop form factor over a smartphone or tablet. With the ever-increasing number of data points to measure within a SaaS app, the area of application performance monitoring analytics really took off.
Simultaneously, the growth of the SaaS titans cemented their reputation as not just applications for a certain line-of-business, but into full-fledged platforms. This growth emboldened a number of SaaS startups to develop apps that solved specialized or even vertical business problems in healthcare, warranty-and-repair, quote-to-cash, and banking. To get started quickly and scale rapidly, these startups leveraged AWS and its plethora of services.
The final sector that has taken off thanks to the growth of SaaS is the area of cloud analytics. SaaS grew by leaps and bounds because of its ease of use, and rapid deployment that could be achieved by business users. Cloud analytics aims to provide the same ease of use for business users when providing deep insights into data in an interactive manner.
In all these different sectors, what’s common is the fact that SaaS growth has created an uptick in the volume of data and the technologies that serve to make it easier to understand. During Informatica’s Data Mania event (March 4th, San Francisco) you’ll find several esteemed executives from Salesforce, Amazon, Adobe, Microsoft, Dun & Bradstreet, Qlik, Marketo, and AppDynamics talk about the importance of data in the world of SaaS.
In my discussions with CIOs, opinions differ widely about the future of the CIO role. While most feel the CIO role will remain an important function, they also feel a sea change is in process. According to Tim Crawford, a former CIO and strategic advisor to CIOs, “CIOs are getting out of the data center business”. In my discussions, not all yet see the complete demise of their data centers. However, it is becoming more common for CIOs to see themselves “becoming an orchestrator of business services versus a builder of new operational services”. One CIO put it this way: “the building stuff is now really table stakes. Cloud and loosely oriented partnerships are bringing vendor management to the forefront”.
As more and more of the service portfolio is provided by third parties in either infrastructure-as-a-service (IaaS) or software-as-a-service (SaaS) modes, the CIO needs to take on what will become an increasingly important role – the service broker. An element of the service broker role that will have increasing importance is the ability to glue together business systems whether they are on premises, cloud managed (IaaS), or software as a service (SaaS). Regardless of who creates or manages the applications of the enterprise, it is important to remember that integration is to a large degree the nervous system that connects applications into business capabilities. As such, the CIO’s team has a critical and continuing role in managing this linkage. For example, spaghetti-code integrations can easily touch 20 or more systems for ERP or expense management alone.
Brokering integration services
As CIOs start to consider the move to cloud, they need to determine how this nervous system is connected, maintained, and improved. In particular, they need to determine, perhaps for the first time, how to integrate their cloud systems with the rest of their enterprise systems. They can clearly continue to do so by building and maintaining hand-coded integrations or by using their existing ETL tools. This can work where one takes on an infrastructure-as-a-service model, but it falls apart when looking at the total cost of ownership of managing the change of a SaaS model. This fact begs an interesting question: shouldn’t the advantages of SaaS apply to integration as well? Shouldn’t there be Cloud Data Management (integration-as-a-service) options? The answer is yes. Instead of investing in maintaining integrations of SaaS systems, which because of agile methodologies can change more frequently than traditional software, why not let someone else manage this mess for you?
The advantage of the SaaS model is total cost of ownership and faster time to value. Instead of your team managing the integration between SaaS applications and historical environments, it can be maintained by the cloud data management vendor. This saves both cost and time. As well, it frees you to focus your team’s energy on cleaning up the integrations among historical systems. This is a big advantage for organizations trying to get on the SaaS bandwagon without incurring significantly increased costs as a result.
Infrastructure as a Service (IaaS)—Provides processors, databases, etc. remotely, but you control and maintain what goes on them
Software as a Service (SaaS)—Provides software applications and underlying infrastructure as a service
Cloud Data Management—Provides integration of applications, in particular SaaS applications, as a service
CIOs are embarking upon big changes. Building stuff is becoming less and less relevant. However, even as more and more services are managed remotely (even by other parties), it remains critical that CIOs and their teams manage the glue between applications. With SaaS applications in particular, this is where Cloud Data Management can really help you control integrations with less time and cost.
Author Twitter: @MylesSuer
If you work for or with the government and you care about the cloud, you’ve probably already read the recent MeriTalk report, “Cloud Without the Commitment”. As well, you’ve probably also read numerous opinions about the report. In fact, one of Informatica’s guest bloggers, David Linthicum, just posted his thoughts. As I read the report and the various opinions, I was struck by the, perhaps unintentional, suggestion that (sticking with MeriTalk’s dating metaphor) the “commitment issues” are a government problem. Mr. Linthicum’s perspective is that “there is really no excuse for the government to delay migration to cloud-based platforms” and that “It’s time to see some more progress”, suggesting that the onus is on government to move forward.
I do agree that, leveraged properly, there’s much more value to be extracted from the cloud by government. Further, I agree that cloud technologies have sufficiently matured to the point that it is feasible to consider migrating mission critical applications. Yet, is it possible that the government’s “fear of commitment” is, in some ways, justified?
Consider this stat from the MeriTalk report – only half (53%) of the respondents rate their experience with the cloud as very successful. That suggests the experience of the other half, as MeriTalk words it, “leave(s) something to be desired.” If I’m a government decision maker and I’m tasked with keeping mission critical systems up and sensitive data safe, am I going to jump at the opportunity to leverage an approach that only half of my peers are satisfied with? Maybe, maybe not.
Now factor this in:
- 53% are concerned about being locked into a contract where the average term is 3.6 years
- 58% believe cloud providers do not provide standardized services, thus creating lock in
Back to playing government decision maker, if I do opt to move applications to the cloud, once I get there, I’m bound to that particular provider – contractually and, at least to some extent, technologically. How comfortable am I with the notion of rewriting/rehosting my mission-critical, custom application to run in XYZ cloud? Good question, right?
Inevitably, government agencies will end up with mission-critical systems and sensitive data in the cloud. However, successful “marriages” are hard, making them a bit of a rare commodity.
Do I believe government has a “fear of commitment”? Nah, I just see their behavior as prudent caution on their way to the altar.
A lot of the trends we are seeing in enterprise integration today are being driven by the adoption of cloud-based technologies, from IaaS and PaaS to SaaS. I was just reading this story about a recent survey on cloud adoption and thought that a lot of this sounds very similar to things we have seen before in enterprise IT.
Why discuss this? What can we learn? A couple of competing quotes come to mind.
Those who forget the past are bound to repeat it. – Edmund Burke
We are doomed to repeat the past no matter what. – Kurt Vonnegut
While every enterprise has its own complexities, there are several past technology-adoption patterns that can be used to frame the discussion, compare today’s issues, and drive decisions about how a company designs and deploys its current enterprise cloud architecture. Flexibility in design should be a key goal, in addition to satisfying current business and technical requirements. So, what are the big patterns we have seen in the last 25 years that have shaped the cloud integration discussion?
1. 90s: Migration and replacement at the solution or application level. A big trend of the 90s was replacing older homegrown systems or mainframe-based solutions with new packaged software solutions. SAP really started a lot of this with ERP, and then we saw the rise of additional solutions for CRM, SCM, HRM, etc.
This kept a lot of people who do data integration very busy. From my point of view, this era was very focused on replacement of technologies, which drove a lot of focus on data migration. While there were some scenarios where data integration left solutions in place, these tended to be systems that required transactional integrity and a high level of messaging, or back-office solutions. On classic front-office solutions, enterprises in large numbers did rip-and-replace migrations to new solutions.
2. 00s: Embrace and extend existing solutions with web applications. The rise of the internet browser, combined with a popular and powerful standard programming language in Java, shaped and drove enterprise integration in this period. In addition, due to many of the mistakes and issues that IT groups had in the 90s, there was a very strong drive to extend existing investments rather than rip and replace. IT and businesses were trying to figure out how to add new solutions to what they had in place. A lot of enterprise integration, service bus, and what we consider classic application development and deployment solutions came to market and were put in place.
3. 00s: Adoption of new web-application-based packaged solutions. A big part of this trend was driven by .NET and Java becoming more or less the de facto desired languages of enterprise IT. Software vendors not on these platforms were for the most part forced to re-platform or lose customers. New software vendors in many ways had an advantage because enterprises were already looking at large data migrations to upgrade the solutions they had in place. In either case, IT shops were looking to be either a .NET or a Java shop, and it caused a lot of churn.
4. 00s: First-generation cloud applications and platforms. The first adoption of cloud applications and platforms was driven by projects and specific company needs: from Salesforce.com being used just for sales management before it became a platform, to Amazon being used as just a runtime to develop and deploy applications before it became a full-scale platform, and an ever-growing list of examples as every vendor wants to be the cloud platform of choice. The integration needs were originally often on the light side because so many enterprises treated the cloud as an experiment at first, or as a one-off for a specific set of users. This has changed a lot in the last 10 years as many companies repeated their on-premises data-silo problems in the cloud while their usage went from one cloud app to 2, 5, 10+, etc. In fact, if you strip away where a solution happens to be deployed (on premises or in the cloud), the reality is that an enterprise with a poorly planned on-premises architecture and solution portfolio probably has a just-as-poorly-planned cloud architecture and portfolio. Adding them together just leads to disjoint solutions that are hard to integrate, hard to maintain, and hard to evolve. In other words, the opposite of the flexibility goal.
5. 10s: Consolidation of technology and the battle of the cloud platforms. It appears we are just getting started in the next great market consolidation, and every enterprise IT group is going to need to decide its own criteria for balancing current and future investments. Today we have Salesforce, Amazon, Google, Apple, SAP, and a few others. In 10 years, some of these will either not exist as they do today or be marginalized. No one can say which ones for sure, and this is why it is important to prioritize flexibility in your architecture for cloud adoption.
For me, the main takeaways from the past 25 years of technology adoption trends, for anyone who thinks about enterprise and data integration, are the following:
a) It all starts and ends with data. Yes, applications, processes, and people are important, but it's about the data.
b) Coarse-grained and loosely coupled approaches to integration are the most flexible (e.g., avoid point-to-point at all costs).
c) Design with the knowledge of what data is critical and what data might or should be accessible or movable.
d) Identify data and applications that might have to stay where they are no matter what (e.g., the mainframe is never dying).
e) Make sure your integration and application groups have access to, or include, someone who understands security. A lot of integration developers think they understand security, but it's usually after the fact that you find out they really do not.
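Takeaway (b) can be made concrete with a minimal sketch (not Informatica code; all names here, such as Hub and the topic string, are hypothetical): with point-to-point integration, every producer must know every consumer, so each new system multiplies the connections; with a coarse-grained hub, producers and consumers only agree on a topic and a canonical record shape.

```python
# Illustrative sketch of loosely coupled, coarse-grained integration via a
# tiny in-memory publish/subscribe hub. Producers and consumers never
# reference each other directly -- only the topic and the record shape.
from collections import defaultdict

class Hub:
    """Minimal message hub: routes canonical records by topic."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, record):
        for handler in self._subscribers[topic]:
            handler(record)

hub = Hub()
received = []

# Two independent consumers; adding a third requires no change to any producer.
hub.subscribe("customer.updated", lambda r: received.append(("billing", r["id"])))
hub.subscribe("customer.updated", lambda r: received.append(("support", r["id"])))

# One producer publishes a single coarse-grained canonical record.
hub.publish("customer.updated", {"id": 42, "name": "Acme Corp"})

print(received)  # [('billing', 42), ('support', 42)]
```

The flexibility in the takeaway comes from this shape: swapping out the billing system, or adding a fifth consumer, touches one subscription rather than a web of point-to-point links.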
So, it's possible to shape your cloud adoption and architecture future by at least understanding how past technology and solution adoption has shaped the present. For me, it is important to remember that it is all about the data, and to prioritize flexibility as a technology requirement at least at the same level as features and functions. Good luck.
In 2014, Informatica Cloud focused a great deal of attention on the needs and challenges of the citizen integrator. These are the critical business users at the core of every company: The customer-facing sales rep at the front, as well as the tireless admin at the back. We all know and rely on these men and women. And up until very recently, they’ve been almost entirely reliant on IT for the integration tasks and processes needed to be successful at their jobs.
A lot of that has changed over the last year or so. In a succession of releases, we provided these business users with the tools to take matters into their own hands. And with the assistance of key ecosystem partners, such as Salesforce, SAP, Amazon, Workday, NetSuite, and the hundreds of application developers that orbit them, we've made great progress toward giving business users the self-sufficiency they need and demand. But beyond giving these users the tools to integrate and connect with their apps and information at will, what we've really done is give them the ability to focus their attention and efforts on their most valuable customers. In doing so, we have gotten to the core of the real purpose and importance of the whole cloud project or enterprise: the customer relationship.
In a recent Fortune interview, Salesforce CEO and cloud evangelist Marc Benioff echoed that idea when he stated that “The CEO is now in charge of the customer relationship.” What he meant by that is companies now have the ability to tie all aspects of their marketing – website, customer service, email marketing, social, sales, etc. – into “one canonical file” with all the respective customer information. By organizing the enterprise around the customer this way, the company can then pivot all of their efforts toward the customer relationship, which is what is required if a business is going to have and sustain success as we move through the 2010s and beyond.
We are in complete agreement with Marc and think it wouldn't be too much of a stretch to declare 2015 the year of the customer relationship. In fact, helping companies and business users focus their attention on the customer has been a core focus of ours for some time. For an example, you don't have to look much further than the latest iteration of our real-time application integration capability.
In a short video demo that I recommend to everyone, my colleague Eric does a fantastic job of walking users through the real-time features available through the Informatica Cloud platform.
As the demo shows, the real-time features let you build a workflow process application that interacts with data from cloud and on-premise sources right from the Salesforce user interface (UI). It's quick and easy, allowing you to devote more time to your customers and less time to "plumbing."
The workflows themselves are created with the help of a drag-and-drop process designer that enables the user to quickly create a new process and configure the parameters, inputs and outputs, and decision steps with the click of a few buttons.
Once the process guide is created, it displays as a window embedded right in the Salesforce UI. So if, for example, you’ve created an opportunity-to-order guide, you can follow a wizard-driven process that walks your users from new opportunity creation through to the order confirmation, and everything in between.
As users move through the process, they can interact in real time with data from any on-premise or cloud-based source they choose. In the example from the video, the user, Eric, chooses a likely prospect from a list of company contacts, and with a few keystrokes creates a new opportunity in Salesforce. In a further demonstration of the real-time capability, Eric performs a NetSuite query, logs a client call, escalates a case to customer service, pulls the latest price book information from an Oracle database, builds out the opportunity items, creates the order in SAP, and syncs it all back to Salesforce, all without leaving the wizard interface.
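The flow in Eric's demo can be sketched generically as an ordered chain of steps, each touching a different system, with a decision step choosing the branch. This is a hypothetical illustration only, not Informatica's process designer or API; every function name is invented and the system calls are stubbed.

```python
# Hypothetical sketch of a wizard-style process guide: each step does its
# work against a (stubbed) system and returns the name of the next step,
# until the guide ends. Decision steps branch on data gathered so far.

def query_netsuite(ctx):
    ctx["account"] = {"id": "NS-1001"}   # stub: look up the account in NetSuite
    return "log_call"

def log_call(ctx):
    ctx["call_logged"] = True            # stub: log the client call
    return "check_escalation"

def check_escalation(ctx):
    # Decision step: escalate to customer service only if support is needed.
    return "escalate_case" if ctx.get("needs_support") else "create_order"

def escalate_case(ctx):
    ctx["case_id"] = "CASE-7"            # stub: open a support case
    return "create_order"

def create_order(ctx):
    ctx["order_id"] = "SAP-555"          # stub: create the order in SAP
    return None                          # end of the guide

STEPS = {f.__name__: f for f in
         (query_netsuite, log_call, check_escalation, escalate_case, create_order)}

def run_guide(start, ctx):
    step = start
    while step is not None:
        step = STEPS[step](ctx)
    return ctx

result = run_guide("query_netsuite", {"needs_support": False})
print(result["order_id"])  # SAP-555
```

The appeal of the wizard approach is visible even in this toy: the user follows one linear guide, while the branching and the per-system plumbing stay behind the scenes.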
The capabilities available via Informatica Cloud’s application integration are a gigantic leap forward for business users and an evolutionary step toward pivoting the enterprise toward the customer. As 2015 takes hold we will see this become increasingly important as companies continue to invest in the cloud. This is especially true for those cloud applications, like the Salesforce Analytics, Marketing and Sales Clouds, that need immediate access to the latest and most reliable customer data to make them all work — and truly establish you as the CEO in charge of customer relationships.