
The State of Salesforce Report: Trends, Data, and What’s Next

A guest post by Jonathan Staley, Product Marketing Manager at Bluewolf Beyond, a consulting practice focused on innovating live cloud environments.

Guest blogger Jonathan Staley

Now in its third year (2012, 2013), The State of Salesforce Annual Review continues to be the most comprehensive report on the Salesforce ecosystem. Based on data from over 1,000 global Salesforce users, the report highlights how companies are using the Salesforce platform, where resources are being allocated, and where industry hype meets reality. Over the past three years, the report has evolved much like the technology, shifting and transforming to address recent advancements, as well as tracking longitudinal trends in the space.

We’ve found that key integration partners like Informatica Cloud continue to grow in importance within the Salesforce ecosystem. Beyond the core platform offerings from Salesforce, third-party apps and integration technologies have received considerable attention as companies look to extend the value of their initial investments and unite systems. Syncing multiple platforms and applications is an emerging need in the Salesforce ecosystem, one that will be highlighted in the 2014 report.

As Salesforce usage expands, so does our approach to survey execution. In line with this evolution, here’s what we’ve learned over the last three years from data collection:

Functions, Departments Make a Difference

Sales, Marketing, IT, and Service all have their own needs and pain points. As Salesforce moves quickly across the enterprise, we want to recognize the values, priorities, and investments of each department. Not only are the primary clouds for each function at different stages of maturity, but the ways in which each department uses its cloud are unique. We anticipate discovering how enterprises are collaborating across functions and clouds.

Focus on Region

As our international data set continues to grow, we are investing in regionalized reports for the US, UK, France, and Australia. While we saw indications of differences between regions in last year’s survey, they were not statistically significant.

Customer Engagement is a Top Priority

Everyone agrees that customer engagement is important, but what are companies actually doing about it? This year’s survey includes a section on predictive analytics and department-specific questions about engagement. We suspect the results will validate the recent trend of companies empowering employees with a combination of data and mobile.

Variation Across Industries

As an added bonus, we will build a report targeting specific insights from the Financial Services industry.

We Need Your Help

Our dataset depends on input from Salesforce users spanning all functions, roles, industries, and regions. Every response matters. Please take 15 minutes to share your Salesforce experiences, and you will receive a personalized report comparing your responses to the aggregate survey results.


From the Ashes of SOA: Service-Oriented Integration for the Hybrid App World

History is full of instances where a new technology or idea seemingly arrives before its time and has difficulty taking hold because the organizational, cultural or technological foundation simply isn’t there to support it. One such infamous case I keep coming back to is the spectacular rise and gradual fall from grace of SOA. It’s been over five years since Anne Thomas Manes put a nail in its coffin with her provocative (and widely interpreted) SOA obituary. Anne’s point was simple: SOA as “an acronym” got in the way. Too much time was devoted to technology debates (e.g., ‘what’s the best ESB?’ or ‘WS-* vs. REST’), and everyone missed the important issue: architecture and services.

SOA was born out of purposeful intent, to solve a specific problem in a particularly novel way: standards-based and interoperable service-based integration driven by the WS-* standardization efforts. It foreshadowed the fragmentation of the monolithic on-premise software providers and pre-dated the rise of a new cloud-centric world – and it arguably arrived too fast for many organizations to take advantage of it on-premise. The constant churn of WS-* specifications didn’t help the cause either.

Some IT shops got bogged down in religious arguments over WS-* vs. REST while others pushed on, bolting on service interfaces to existing application stacks and protocols and building new service infrastructure as an investment for the future. The result, as we all know, was a lot of hype and dashed expectations for some.

Fast forward five years, and the future foreshadowed by SOA is almost a reality. And while SOA (the acronym) may be dead, the need for a service-oriented architecture is very much alive.

We now live in a hybrid world, populated by cloud, social and on-premise applications, and the move to the cloud for business is a fait accompli — or at the least, inevitable. Cloud initiatives are fueling a new type of service-oriented integration – one where, unlike in the past, the approach is no longer strictly defined by protocols but rather by application services and event-based integration.

In this new world, IT no longer controls the architecture of the apps its business users use (or where they execute), and so consumers and providers – cloud apps, on-premise apps and systems – need to interact in loosely-coupled service-oriented ways. This evolution forces new integration realities that had for many been hidden from sight and kept within the domain of application owners.

Eight or nine years ago, when SOA fever was at its height, everyone was running around trying to transform their internal systems and build new and complex infrastructure to meet an incomplete technological imperative.

Today, the landscape has completely changed. The need for ESBs and tightly coupled integrations that expose the innards of your infrastructure no longer applies. Eventually, as applications move to the Cloud, there will no longer be much infrastructure left to expose. Instead, integrations are, and will increasingly be, occurring in the cloud, over an open framework, through high-level service-centric APIs.

At Informatica, we’ve taken the lessons and imperatives of SOA – simplicity, data consistency, accessibility, and security – and incorporated them into a platform that makes the promise of service-oriented, hybrid, event-driven integration a reality.

We’ve innovated, and now deliver tooling that enables technically savvy application owners to implement integrations themselves, with IT assisting where needed. We’ve also made it possible for application owners to consume data, business services, and processes in an intuitive user interface that abstracts the underlying details of our hybrid integration platform.

The result is an integration platform that empowers application owners. This is what makes what we’re currently doing at Informatica Cloud so particularly exciting, and potentially disruptive.


How Parallel Data Loading and Amazon Redshift Redefine Data Warehousing Performance

As Informatica Cloud product managers, we spend a lot of our time thinking about things like relational databases. Recently, we’ve been considering their limitations and, specifically, how difficult and expensive it is to provision an on-premise data warehouse to handle the petabytes of fluid data generated by cloud applications and social media. As a result, companies often have to make tradeoffs and decide which data is worth putting into their data warehouse.

Certainly, relational databases have enormous value. They’ve been around for several decades and have served as a bulwark for storing and analyzing structured data. Without them, we wouldn’t be able to extract and store data from on-premise CRM, ERP and HR applications and push it downstream for BI applications to consume.

With the advent of cloud applications and social media, however, we are now faced with managing a daily barrage of massive amounts of rapidly changing data, as well as the complexities of analyzing it within the same context as data from on-premise applications. Add to that the stream of data coming from Big Data sources such as Hadoop, which then needs to be organized into a structured format so that various correlation analyses can be run by BI applications, and you can begin to understand the enormity of the problem.

Up until now, the only solution has been to throw development resources at legacy on-premise databases, and hope for the best. But given the cost and complexity, this is clearly not a sustainable long-term strategy.

As an alternative, Amazon Redshift, a petabyte-scale data warehouse service in the cloud, has the right combination of performance and capabilities to handle the demands of social media and cloud app data, without the additional complexity or expense. Its Massively Parallel Processing (MPP) architecture allows for lightning-fast loading and querying of data. It also features a larger block size, which reduces the number of I/O requests needed to load data and leads to better performance.

By combining Informatica Cloud with Amazon Redshift’s parallel loading architecture, you can make use of push-down optimization algorithms, which process data transformations in the optimal source or target database engine. Informatica Cloud also offers native connectivity to cloud and social media apps, such as Salesforce, NetSuite, Workday, LinkedIn, and Twitter, to name a few, which makes it easy to funnel data from these apps into your Amazon Redshift cluster at faster speeds.
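
To ground the parallel-loading point, here is a minimal sketch, at the SQL level, of the mechanism Redshift exposes: a single COPY command aimed at a prefix of staged, compressed files, which the cluster splits across its slices and loads in parallel. The cluster endpoint, credentials, table, bucket, and IAM role below are hypothetical, and this shows the raw COPY path rather than Informatica Cloud’s connector:

```python
# Minimal sketch: bulk-loading staged files into Amazon Redshift with COPY.
# Endpoint, credentials, bucket, table, and IAM role are all hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="loader", password="..."
)
conn.autocommit = True

copy_sql = """
    COPY social_events
    FROM 's3://example-bucket/staged/social_events/'  -- prefix: every file under it is loaded
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV
    GZIP;
"""

with conn.cursor() as cur:
    # Redshift distributes the staged files across its slices, so splitting
    # the extract into multiple compressed files is what enables the
    # parallel load described above.
    cur.execute(copy_sql)

conn.close()
```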

If you’re at the Amazon Web Services Summit today in New York City, then you heard our announcement that Informatica Cloud is offering a free 60-day trial for Amazon Redshift with no limitations on the number of rows, jobs, application endpoints, or scheduling. If you’d like to learn more, please visit our Redshift Trial page or go directly to the trial.


Harness the Flow of Valuable Data Files Throughout Your System

Managing the recovery and flow of data files throughout your enterprise is much like managing the flow of oil from well to refinery – a wide range of tasks must be carefully completed to ensure optimal resource recovery. If these tasks are not handled properly, or are not addressed in the correct order, valuable resources may be lost. When the process involves multiple pipelines, systems, and variables, managing the flow of data can be difficult.

Organizations have many options to automate the processes of gathering data, transferring files, and executing key IT jobs. These options include home-built scheduling solutions, system-integrated schedulers, and enterprise schedulers. Enterprise schedulers, such as Skybot Scheduler, often provide the most control over the organization’s workflow, with the ability to create schedules connecting various applications, systems, and platforms.

In this way, the enterprise scheduler facilitates the transfer of data into and out of Informatica PowerCenter and Informatica Cloud, and ensures that raw materials are refined into valuable resources.

Enterprise Scheduling Automates Your Workflow

Think of an enterprise scheduler as the pipeline bearing data from its source to the refinery. Rather than allowing jobs or processes to execute randomly or to sit idle, the enterprise scheduler automates your organization’s workflow, ensuring that tasks are executed under the appropriate conditions without the need for manual monitoring or the risk of data loss.

Skybot Scheduler addresses the most common pain points associated with data recovery, including:

  • Scheduling dependencies: Before PowerCenter or Cloud can complete the data gathering process, other dependencies must be addressed. Information must be swept and updated, and files may need to be reformatted. Skybot Scheduler automates these tasks, keeping the data recovery process consistently moving forward.
  • Reacting to key events: As with oil recovery, small details can derail the successful mining of data. Key events, such as directory changes, file arrivals, and evaluation requirements, can lead to a clog in the pipeline. Skybot Scheduler maintains the flow of data by recognizing these key events and reacting to them automatically (see the sketch after this list).
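
To make “reacting to a key event” concrete, here is a toy Python sketch of the file-arrival pattern that an enterprise scheduler such as Skybot automates at scale, with cross-platform dependencies, retries, and auditing layered on top. The watched directory and downstream command are hypothetical:

```python
# Illustrative sketch of the file-arrival pattern an enterprise scheduler
# automates: watch a drop directory and kick off the next job in the chain
# when a new file lands. The path and job command are hypothetical.
import subprocess
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

class FileArrivalHandler(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        # React to the key event: hand the new file to the downstream job.
        subprocess.run(["python", "load_to_powercenter.py", event.src_path], check=True)

observer = Observer()
observer.schedule(FileArrivalHandler(), "/data/incoming", recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)  # keep watching until interrupted
finally:
    observer.stop()
    observer.join()
```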

Choose the Best Pipeline Available

Skybot Scheduler is one of the most powerful enterprise scheduling solutions available today, and is the only enterprise scheduler integrated with PowerCenter and Cloud.

Capable of creating comprehensive cross-platform automation schedules, Skybot Scheduler manages the many steps in the process of extracting, transforming, and loading data. Skybot maintains the flow of data by recognizing directory changes and other key events, and reacting to them automatically.

In short, by managing your workflow, Skybot Scheduler increases the efficiency of ETL processes and reduces the potential for costly errors.

To learn more about the power of enterprise scheduling and the Skybot Scheduler, check out this webinar: Improving Informatica ETL Processing with Enterprise Job Scheduling, or download the Free Trial.


It’s the Most Wonderful Time of the Year (If You’re Into Cloud)

I know it’s far too soon to be thinking of Christmas, but May is the most wonderful time of the year. That’s because it’s Informatica World week, and I get the same tingle of anticipation watching the keynotes as I do on Christmas morning just before the big reveal. “You definitely live a sheltered life!” I hear you cry, but it’s true. In years past, Informatica World has proven to be the zenith of events in our cloud calendar, and this year, so far, has not disappointed.

Monday

We began on Monday, or “Cloud Day,” with two banner events. The morning kicked off with the convening of the “Informatica Cloud Product Advisory Council,” hosted by the cloud product team. This exclusive event consisted of a select group of customers who volunteered their time to speak at “Cloud Day.”

Monday afternoon’s session was open to all Informatica World attendees and proved to be a great networking opportunity as industry peers, cloud integration experts and practitioners all mingled. But it didn’t stop there. There were also keynotes delivered by industry experts on topics such as cloud security, analytics, and Salesforce integration. If you missed Cloud Day, take a look at Ashwin Viswanath’s entertaining recap here.

Tuesday

The cloud momentum continued on Tuesday with some fabulous breakout sessions. In particular, Conde Nast and Informatica held a session that discussed strategies for moving to a hybrid IT architecture. Informatica product management delivered a well-attended “Informatica Cloud 101” session. The day ended with Qualcomm and InsideTrack hosting a lively discussion that explained how they use Informatica Cloud to manage data synchronization between Salesforce and other applications.

Wednesday

Wednesday was the day that kept on giving. It was packed full of sessions on the cloud path and highlighted the breadth of the solution. It started with a fantastic case study from The Weather Channel (yes, THAT Weather Channel). They explained what “cloud first” means to them, and in particular how they built a cloud data warehouse with Informatica Cloud that runs on Amazon Redshift.

Next came a roadmap session from the Informatica Cloud Product Management team. This session gave a look at what’s next for the world’s leading cloud data management platform. It focused on the new web-based process designer that will enable integration developers to build real-time process integrations. Next, ConocoPhillips held a session called “Journey to the Internet of Things with PowerCenter and Informatica Cloud”. ConocoPhillips explained how they use PowerCenter and Informatica Cloud to aggregate data from sensor-bearing oil wells across Alaska and Calgary!

Wednesday’s packed Cloud agenda came to a close with a session from Schneider Electric and Corvisa Cloud, who talked about how they use Informatica Cloud Real Time to improve the operational efficiency of their Salesforce.com implementations by leveraging real-time data.

“Phew! That’s a lot of Cloud expertise in one day,” you say. And I would totally agree. That goes to show that I wasn’t lying when I said that I’m excited to hear about all things cloud this week. And I’ve only got to Wednesday. For those of you keeping track, there’s still ANOTHER conference day left.

Thursday

Today, Thursday, plays host to the last two cloud sessions. If your business relies on embedding data into your own applications, then you might want to check out “How Embedded iPaaS Can Transform Your Business”. This session is delivered by RMS, the vendor of a financial trading application. They will explain how they create seamless and secure connections between their cloud-based suite of financial applications and their clients’ on-premise data sources by embedding Informatica Cloud. They will also discuss in detail how they interface with Informatica Cloud’s REST APIs.

The RMS session should provide enough technical meat for anyone, but if not, then you should make your way to the last Informatica session of Informatica World 2014 (let’s all give a collective sigh that the week is almost over. AAAAWWWWWWHHHHHH). The honor of presenting the last session goes to the Informatica Cloud product management team. It seems as if we’ve come full circle.

The final session is called “Developing Advanced Integrations in the Cloud” and will show how the new cloud designer helps you aggregate and merge multiple data sources, use Vibe Integration Packages, and connect cloud applications to on-premise applications like Oracle EBS and SAP. The Cloud PM team always does fantastic demos, so this session will be worth missing your flight for.

Phew, what a week! Hopefully by now you’ve realized why I think May is the “Most Wonderful Time of the Year”. All sing along now… “It’s the most wonderful time of the year. The cloud team is going and everyone’s knowing it’s nearly here… In-form-atica-World. Is nearly here!”


Cloud Day Tidings and Tidbits at Informatica World 2014

With practically every on-premise application having a counterpart in the SaaS world, enterprise IT departments have truly made the leap to a new way of computing that is transforming their organizations. The last mile of cloud transformation lies in integration, and it is for this purpose that Informatica held a dedicated Cloud Day at Informatica World 2014.

The day kicked off with an introduction by Ronen Schwartz, VP and GM of Informatica Cloud, to the themes of intelligent data integration, comprehensive cloud data management, and cloud process automation. The point was made that with SaaS applications being customized frequently, and the need for more data insights from these apps, it is important to have a single platform that can excel at both batch and real-time integration. A whole series of exciting panel discussions followed, ranging from mission-critical Salesforce.com integration, to cloud data warehouses, to hybrid integration use cases involving Informatica PowerCenter and Informatica Cloud.

In the mission-critical Salesforce.com integration panel, we had speakers from Intuit, InsideTrack, and Cloud Sherpas. Intuit talked about how they went live with Informatica Cloud in under four weeks, with only two developers on hand. InsideTrack had an interesting use case wherein they used the Force.com platform to build a native app that tracked the performance of students and the impact of coaching on them. InsideTrack connected to several databases outside the Salesforce platform to perform sophisticated analytics and bring the results into their app through the power of Informatica Cloud. Cloud Sherpas, a premier system integrator and close partner of both Salesforce.com and Informatica, outlined three customer case studies of how they used Informatica Cloud to solve complex integration challenges. The first was a medical devices company that was trying to receive up-to-the-minute price quotes by integrating Salesforce and SAP; the second was a global pharmaceuticals company that was using Salesforce to capture data about their research subjects and needed to synchronize that information with their databases; and the third was Salesforce.com itself.

The die-hard data geeks came out in full force for the cloud data warehousing panel. Accomplished speakers from MicroStrategy, Amazon, and The Weather Channel discussed data warehousing using Amazon Web Services. A first-time attendee to this panel might have assumed that cloud data warehousing simply dealt with running relational databases on virtual machines spun up from EC2, but participants were instead enthralled to learn that Amazon Redshift is a relational database that runs 100% in the cloud. The Weather Channel uses Amazon Redshift to perform analytics on almost 750 million rows of data. Using Informatica Cloud, they can load this data into Redshift in a mere half hour. MicroStrategy talked about their cloud analytics initiatives and how they look at them holistically from a hybrid standpoint.

On that note, it was time for the panel of hybrid integration practitioners to take the stage, with Qualcomm and Conde Nast discussing their use of PowerCenter and Cloud. Qualcomm emphasized that the value of Informatica Cloud was the easy access to a variety of connectors, and that they were using connectors for Salesforce, NetSuite, several relational databases, and web services. Conde Nast mentioned that it was extremely easy to port mappings between PowerCenter and Cloud due to the common code base between the two.


What do CIOs think about when integrating Salesforce?

Salesforce.com is one of the most widely used cloud applications across every industry. Initially, Salesforce gained dominance among mid-market customers due to the agility and ease of deployment that the SaaS approach delivered. A cloud-based CRM system enabled SMB companies to easily automate sales processes that recorded customer interactions during the sales cycle, and to scale without costly infrastructure to maintain. This resulted in faster growth, thereby showing rapid ROI on a Salesforce deployment in most cases.

The Eye of the Enterprise

When larger enterprises saw the rapid growth that mid-market players had achieved, they realized that Salesforce was a unique technology enabler capable of helping their businesses also speed time to market and scale more effectively. In most enterprises, Salesforce deployments were driven by line-of-business units such as Sales and Customer Service, with varying degrees of coordination with central IT groups – in fact, most initial deployments of Salesforce orgs were done fairly autonomously from central IT.

With Great Growth Comes Greater Integration Challenges

When these business units needed to engage with each other to run cross-functional tasks, the lack of a single customer view across the siloed Salesforce instances became a problem. Each individual Salesforce org had its own version of the truth, and it was impossible to locate where each customer was in the sales cycle with respect to each business unit. As a consequence, cross-selling and upselling became very difficult. In short, the very application that was a key technology enabler for growth was now posing challenges to meeting business objectives.

Scaling for Growth with Custom Apps

While many companies use the pre-packaged functionality in Salesforce, ISVs have also begun building custom apps on the Force.com platform due to its extensibility and rapid customization features. By using Salesforce to build native applications from the ground up, they could design innovative user interfaces that expose powerful functionality to end users. However, to truly add value, it was not just the user interface that mattered, but also the back-end of the technology stack. This was especially evident when it came to aggregating data from several sources and surfacing it in the custom Force.com apps.

On April 23rd at 10am PDT, you’ll hear how two CIOs from two different companies tackled the above integration challenges with Salesforce: Eric Johnson of Informatica, a Rising Star finalist in the 2013 Silicon Valley Business Journal CIO Awards, and Derald Sue of InsideTrack, one of Computerworld’s 2014 Premier 100 IT Leaders.


The Need for Specialized SaaS Analytics

SaaS companies are growing rapidly and becoming the top priority for most CIOs. With such high growth expectations, many SaaS vendors are investing in sales and marketing to acquire new customers even if it means having a negative net profit margin as a result. Moreover, with the pressure to grow rapidly, there is an increased urgency to ensure that the Average Sales Price (ASP) of every transaction increases in order to meet revenue targets.

The nature of the cloud allows these SaaS companies to release new features every few months, which sales reps can then promote to new customers. When new functionality is neither used nor understood, customers often feel that they have overpaid for a SaaS product. In such cases, customers usually downgrade to a lower-priced edition or, worse, leave the vendor entirely. To make up for this loss, sales representatives must work harder to acquire new leads, which results in less attention for existing customers. Preventing customer churn is therefore very important: the Cost to Acquire a Customer (CAC) for a dollar of upsell revenue is 19% of the CAC for a dollar of new-customer revenue, and the CAC for a dollar of renewal revenue is only 15%.

Accurate customer usage data helps determine which features customers use and which are underutilized. Gathering this data can help pinpoint high-value features that are not used, especially for customers that have recently upgraded to a higher edition. The process of collecting this data involves several touch points – from recording clicks within the app to analyzing the open rate of entire modules. This is where embedded cloud integration comes into play.

Embedding integration within a SaaS application allows vendors to gain operational insights into each aspect of how their app is being used. With this data, vendors can give product management feedback on further improvements. Additionally, embedded integration can alert the customer success management team to potential churn, allowing them to implement preventative measures; the sketch below illustrates the kind of signal involved.
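
As a rough illustration, here is a small Python sketch that rolls raw in-app click events up into per-customer feature usage and flags accounts paying for high-value features they never touch. The event schema and feature names are illustrative assumptions, not any product’s actual data model:

```python
# Hedged sketch: turning raw in-app click events into the feature-adoption
# and churn-risk signals described above. Schema and names are illustrative.
from collections import defaultdict

events = [
    {"customer": "acme",   "feature": "forecasting"},
    {"customer": "acme",   "feature": "reports"},
    {"customer": "globex", "feature": "reports"},
]

# High-value features included in the upgraded edition (assumed names).
PAID_FEATURES = {"forecasting", "territory_mgmt"}

used = defaultdict(set)
for e in events:
    used[e["customer"]].add(e["feature"])

for customer, features in sorted(used.items()):
    unused = PAID_FEATURES - features
    if unused:
        # A customer paying for features they never use is a churn signal
        # worth routing to the customer success team.
        print(f"{customer}: not using {sorted(unused)}")
```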

To learn more about how a specialized analytics environment can be set up for SaaS apps, join Informatica and Gainsight on April 9th at 10am PDT for an informational webinar, Powering Customer Analytics with Embedded Cloud Integration.


The Typical Journey of SaaS Adoption

Within most organizations today, it is not a question of if SaaS applications should be deployed, but how quickly. The era of having to justify adoption of SaaS applications is long over, and the focus has shifted towards deciding which SaaS applications to deploy, in which departments, and in what timeframes. With this view in mind, let us explore the typical journey that most companies take when deciding which SaaS applications to implement first.

Related: Learn more about customer-facing processes vs. customer fulfillment processes in the March 25th webinar, ‘Accelerate Business Velocity with NetSuite and Salesforce Integration’

Customer Facing Processes

The main impetus behind switching to a SaaS application is the agility the cloud brings. Customizations that normally take weeks to implement take minutes or days, and can be performed by employees who do not possess in-depth knowledge of the technical infrastructure of the SaaS system. That said, it is customer-facing processes that require application customizations almost immediately, because optimizing these processes brings revenue into the company quickly, thereby enabling CIOs to show rapid ROI on a SaaS application.

It is no wonder that Salesforce, with its front-office applications, has become one of the largest SaaS vendors out there today. The entire process of converting a lead to a closed opportunity has several steps in between, and may require multiple workflows in parallel. But the journey doesn’t stop there. To keep customers satisfied and retain them, the product needs to be fulfilled, and this is where customer fulfillment processes come into play.

Customer Fulfillment Processes

Once an opportunity has been closed, the process of getting the product to the customer begins. Traditionally, this role has been filled by large-scale on-premise ERP vendors, but leading cloud ERP companies such as NetSuite are showing how the complex task of fulfilling orders and realizing revenue can be done faster. Processes such as applying category-specific price and quantity discounts, special tax regulations involving several regions and nations, and fulfillment through multiple delivery options all have several moving parts. Moreover, the task of invoicing the customer, collecting payment, and recording numerous financial transactions is an entire sub-process in and of itself, and the only way it can be streamlined is through cloud ERP applications.

Optimizing the Entire Lead-to-Cash Process with Cloud Integration

When looking at customer-facing and customer-fulfillment processes together, it is very clear that SaaS apps in both categories need to work hand-in-hand to ensure that an organization’s customers are satisfied and continue to engage in repeat business. This is why organizations that are starting the rollout of front-office SaaS applications also need to be thinking about rolling out back-office ERP SaaS applications, along with a cloud integration solution to tie it all together. In the March 25th webinar, ‘Accelerate Business Velocity with NetSuite and Salesforce Integration’, we’ll talk about a blueprint for integrating these two types of apps and how the Australian Institute of Management deployed them as part of a multi-million dollar IT transformation project.


Cloud Designer and Dynamic Rules Linking in Informatica Cloud Spring 2014

Informatica Cloud Spring 2014: Advanced Integration, Simplified for All!

Once upon a time, database schema changes were rare and handled with scrutiny. The stability of source data led to the development of the traditional Data Integration model. In this traditional model, a developer pulled a fixed number of source fields into an integration, transformed these fields, and then mapped the data into appropriate target fields.

The world of data has profoundly changed. Today’s Cloud applications allow an administrator to add custom fields to an object at a moment’s notice. Because source data is increasingly malleable, the traditional Data Integration model is no longer optimal. The Data Integration model must evolve.

Today’s integrations must dynamically adapt to ever-changing environments.

To meet these demands, Informatica has built the Informatica Cloud Mapping Designer. The Mapping Designer provides power and adaptability to integrations through its “link rules” and “incoming field rules” features. Integration developers no longer need to deal with fields on a one-by-one basis. Instead, Cloud Designer allows the integration developer to specify a set of dynamic “rules” that tell the mapping how fields should be handled.

For example, the default rule is “Include all fields”, which is both simple and powerful. The “all fields” rule dynamically resolves to bring in as many fields as exist at the source at run time. Regardless of how many new fields the application developer or database administrator may have thrown into the source after the integration was developed, this simple rule brings all the new fields into the integration dynamically. This exponentially increases developer productivity, as the integration developer is not making modifications just to keep up with changes to the integration endpoints. Instead, the integration is “future-proofed”.

Link rules can be defined in combination using both “includes” and “excludes” criteria. The rules can be of four types:

  • Include or Exclude All fields
  • Include or Exclude fields of a particular datatype (example: string, numeric, decimal, datetime, blob, etc.)
  • Include or Exclude fields that fit a name pattern (example: any field that ends with “__c” or any field that starts with “Shipping_”)
  • Include or Exclude fields by a particular name (example: “Id”, “Name”, etc.)

Any combination of the link rules can be put together to create sophisticated dynamic rules for fields to flow.

Each transformation in the integration can specify the set of rules that determine which fields flow into that particular transformation. For example, if I need all custom fields from a Salesforce source to flow into a target, I would simply “Include fields by name pattern: suffixed with ‘__c’” – which is the naming convention for custom field names in Salesforce. In another example, if I need to standardize date formats for all datetime fields in an expression, I can define a rule to “Include fields by datatype: datetime”.
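
To show how such rules might resolve at run time, here is a conceptual Python sketch, not Informatica’s implementation, that applies an ordered list of include/exclude rules to whatever schema the source presents when the integration runs:

```python
# Conceptual sketch of link-rule resolution. The rule format and field
# names are illustrative; this is not Informatica's code.

source_fields = {            # schema discovered from the source at run time
    "Id": "string",
    "Name": "string",
    "CloseDate": "datetime",
    "Discount__c": "decimal",
    "Region__c": "string",
}

rules = [
    ("include", "all", None),             # start from every source field
    ("exclude", "datatype", "blob"),      # drop fields of an unsupported type
    ("include", "name_pattern", "__c"),   # keep Salesforce custom fields
    ("exclude", "name", "Id"),            # drop one specific field
]

def resolve(fields, rules):
    selected = set()
    for action, kind, arg in rules:
        if kind == "all":
            matched = set(fields)
        elif kind == "datatype":
            matched = {f for f, t in fields.items() if t == arg}
        elif kind == "name_pattern":
            matched = {f for f in fields if f.endswith(arg)}
        else:  # kind == "name"
            matched = {f for f in fields if f == arg}
        selected = selected | matched if action == "include" else selected - matched
    return selected

print(sorted(resolve(source_fields, rules)))
# A custom field added to the source tomorrow would match the "__c"
# pattern and flow through without the mapping being touched.
```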

The dynamic nature of the link rules is what empowers a mapping created in Informatica Cloud Designer to be easily converted into a highly reusable integration template through parameterization.

For example, the entire source object can be parameterized, and the integration developer can focus on the core integration logic without having to worry about individual fields. For instance, I can build an integration for bringing data into a slowly changing dimension table in a data warehouse, and this integration can apply to any source object. When the integration is executed by substituting different source objects for the source parameter, it works as expected, since the logical rules can dynamically bring in the fields regardless of the source object’s structure. All of a sudden, an integration developer is only required to build one reusable integration template for replicating multiple objects to the data warehouse, and NOT dozens or even hundreds of repeated integration mappings. Needless to say, maintenance is hugely optimized.

With the power of logically defining field propagation through an integration, combined with the ability to parameterize just about any part of the integration logic, the Cloud Mapping Designer provides a unique and powerful platform for developing reusable end-to-end integration solutions (such as Opportunity to Order, Accounts load to Salesforce, SAP product catalog to Salesforce, File load to Amazon Redshift, etc.). Such prebuilt end-to-end solutions, or VIPs (Vibe Integration Packages), can be easily customized by any consuming customer to adapt to their unique environments and business needs by tweaking only certain configurations while largely reusing the core integration logic.

What could be better than building integrations? Building far fewer integrations that are reusable and self-adapting.

To learn more, join the upcoming Cloud Spring release Webinar on Thursday, March 13.
