Category Archives: Cloud Application Integration

Connected Data for SaaS

Over the last two years, Informatica successfully transformed from running 80% of its application portfolio on premises to running 80% in the cloud. That success was based on two key criteria:

  • Ensuring that SaaS-based processes stay integrated with no disruption
  • Ensuring that data in the cloud stays available and accessible for analytics

With industry analysts predicting that the majority of new application deployments will be SaaS-based by 2017, connected data is not negotiable; it is a must-have. Most SaaS applications keep business processes integrated by sharing connected data through application programming interfaces (APIs).

If you are a consumer of SaaS applications, you probably know the importance of having clean, connected and secure data from the cloud. The promise of SaaS is improved agility. When data is not easily accessible, that promise is broken. With the plethora of options available in the SaaS ecosystem and marketplace, not having clean, connected and safe data is a compelling event for switching SaaS vendors.

If you are in the SaaS application development industry, you probably know that building these APIs and connectors is a critical requirement for success. However, how do you decide which applications you should build connectors for when the ecosystem keeps changing? Investment in developing connectors and interfaces consumes resources and competes with developing competitive and differentiating features.

This week, Informatica launched its inaugural Data Mania event in San Francisco, where the leading topic was SaaS application and data integration. Speakers from AWS, Adobe, AppDynamics, Dun & Bradstreet, and Marketo – to name a few – contributed to the discussion and confirmed that we are entering the era of the Data Ready Enterprise. Also during the event, Informatica announced the Connect-a-thon, a hackathon-like event where SaaS vendors can get connected to hundreds of cloud and on-premises apps.

Without a doubt, transitioning to a cloud and SaaS-based application architecture can only be successful if the applications are easily connectable with shared data. Here at Informatica, this was absolutely the case. Whether you are in the business of building SaaS applications or a consumer of them, consider the benefits of using a standard library of connectors, such as the one Informatica Cloud offers, so you can focus your time and energy on innovation and the more strategic parts of your business.


Informatica joins new ServiceMax Marketplace – offers rapid, cost effective integration with ERP and Cloud apps for Field Service Automation

To deliver flawless field service, companies often require integration across multiple applications for various work processes. A good example is automatically ordering and shipping parts through an ERP system so they arrive ahead of a timely field service visit. Informatica has partnered with ServiceMax, the leading field service automation solution, and subsequently joined the new ServiceMax Marketplace to offer customers integration solutions for many ERP and Cloud applications frequently involved in ServiceMax deployments. Composed of Cloud Integration Templates built on Informatica Cloud for common customer integration “patterns,” these solutions speed up the ServiceMax implementation cycle, contain its cost, and help customers realize the full potential of their field service initiatives.

Existing members of the ServiceMax Community can see a demo or take advantage of a free 30-day trial that provides the full capabilities of Informatica Cloud Integration for ServiceMax, with prebuilt connectors to hundreds of third-party systems including SAP, Oracle, Salesforce, NetSuite and Workday, powered by the Informatica Vibe virtual data machine for near-universal access to cloud and on-premises data. The Informatica Cloud Integration for ServiceMax solution:

  • Accelerates ERP integration by as much as 85% through prebuilt Cloud templates focused on key work processes and the objects common between systems
  • Synchronizes key master data such as Customer Master, Material Master, Sales Orders, Plant information, Stock history and others
  • Enables simplified implementation and customization through easy-to-use interfaces
  • Eliminates the need for IT intervention during configuration and deployment of ServiceMax integrations.

We look forward to working with ServiceMax through the ServiceMax Marketplace to help joint customers deliver Flawless Service!


Data Mania – Using REST APIs and Native Connectors to Separate Yourself from the SaaS Pack

Data Mania: An Event for SaaS & ISV Leaders

With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.

Let’s unpack why.

To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it.  SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.

Why this is the case has more to do with needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.

SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.

Enter REST APIs

REST APIs are the perfect vehicle for today’s SaaS businesses and mash-up applications. Sure, they’re more loosely defined than SOAP, but when all you want to do is send or receive some data, now, in the context you need, nothing is easier or better for the job than a REST API.

With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
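To make that concrete, here is a minimal sketch of what such a call looks like in plain Java. The endpoint, token, and JSON shape are illustrative placeholders, not any particular vendor’s API:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RestCallSketch {
        public static void main(String[] args) throws Exception {
            // A REST call is just an HTTP request: no WSDL, no envelope, no generated stubs.
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://api.example.com/v1/accounts?limit=10")) // placeholder endpoint
                    .header("Authorization", "Bearer <access-token>")                // placeholder credential
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            // The response is loosely structured JSON that the caller interprets as needed.
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }

Compare that with the WSDL, generated stubs, and envelope handling a comparable SOAP call would require, and the appeal to a SaaS development team is obvious.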

SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.

Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors caused by fat-finger data movement. That typically will give you the opposite result of stickiness, and is to be avoided at all costs.

While publishing an API as a way to get and receive data from other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.

The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud’s SDKs are also richer and make it possible for you to adapt the REST API to something any business user will want to use – which is a huge advantage.
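As a rough illustration of the pattern only – the interface and method names below are hypothetical, not the actual SDK API – a connector essentially hides the REST plumbing behind typed application data objects:

    import java.util.List;
    import java.util.Map;

    // Hypothetical stand-in for the kind of contract a connector SDK asks you to implement.
    interface RecordSource {
        List<Map<String, Object>> read(String objectName, List<String> fields);
    }

    // The connector wraps the SaaS REST API so that mappings and business processes
    // work with named objects and fields instead of raw HTTP and JSON.
    class ExampleSaasConnector implements RecordSource {
        private final String baseUrl;
        private final String apiToken;

        ExampleSaasConnector(String baseUrl, String apiToken) {
            this.baseUrl = baseUrl;
            this.apiToken = apiToken;
        }

        @Override
        public List<Map<String, Object>> read(String objectName, List<String> fields) {
            // In a real connector, this is where the REST call from the earlier sketch would go:
            // GET baseUrl/objectName?fields=..., authenticated with apiToken, with the JSON
            // response translated into rows keyed by the declared field names.
            return List.of(); // placeholder result
        }
    }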

In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, a structural change to a REST API would break the data flows that depend on it until they, too, were changed. Building a native connector makes you more agile and insulates your custom-built integrations from breaking.

Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.

Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.

The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.

For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.


Informatica Supports New Custom ODBC/JDBC Drivers for Amazon Redshift

Informatica’s Redshift connector is a state-of-the-art Bulk-Load type connector which allows users to perform all CRUD operations on Amazon Redshift. It makes use of AWS best practices to load data at high throughput in a safe and secure manner and is available on Informatica Cloud and PowerCenter.
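For context, the AWS-recommended way to bulk load Redshift at high throughput is to stage files in Amazon S3 and issue a COPY command. A minimal JDBC sketch of that pattern follows; the cluster endpoint, credentials, bucket, IAM role, and table name are all placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class RedshiftCopySketch {
        public static void main(String[] args) throws Exception {
            // Placeholder cluster endpoint and credentials.
            String url = "jdbc:redshift://examplecluster.abc123.us-west-2.redshift.amazonaws.com:5439/dev";
            try (Connection conn = DriverManager.getConnection(url, "awsuser", "<password>");
                 Statement stmt = conn.createStatement()) {
                // COPY loads the staged S3 files into the target table in parallel across the cluster.
                stmt.execute(
                    "COPY public.sales "
                    + "FROM 's3://example-bucket/sales/' "                       // placeholder bucket/prefix
                    + "IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopy' "  // placeholder role
                    + "FORMAT AS CSV GZIP");
            }
        }
    }

This staging-and-COPY flow is the kind of work the connector automates so users don’t have to script it by hand.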

Today we are excited to announce support for Amazon’s newly launched custom JDBC and ODBC drivers for Redshift. Both drivers are certified for Linux and Windows environments.

Informatica’s Redshift connector will package the JDBC 4.1 driver, which further enhances our metadata fetch capabilities for tables and views in Redshift. That improves our overall design-time responsiveness by over 25%. It also allows us to query multiple tables/views and retrieve the result set using primary and foreign key relationships.
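As a rough sketch of what that metadata access looks like over JDBC (connection details and table names are placeholders), the driver exposes tables, views, and key relationships through the standard DatabaseMetaData interface:

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class RedshiftMetadataSketch {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:redshift://examplecluster.abc123.us-west-2.redshift.amazonaws.com:5439/dev"; // placeholder
            try (Connection conn = DriverManager.getConnection(url, "awsuser", "<password>")) {
                DatabaseMetaData md = conn.getMetaData();

                // List tables and views in the public schema.
                try (ResultSet rs = md.getTables(null, "public", "%", new String[] {"TABLE", "VIEW"})) {
                    while (rs.next()) {
                        System.out.println(rs.getString("TABLE_NAME") + " (" + rs.getString("TABLE_TYPE") + ")");
                    }
                }

                // Foreign keys on a sample table, used to relate result sets across tables.
                try (ResultSet rs = md.getImportedKeys(null, "public", "orders")) {
                    while (rs.next()) {
                        System.out.println(rs.getString("FKCOLUMN_NAME") + " -> "
                                + rs.getString("PKTABLE_NAME") + "." + rs.getString("PKCOLUMN_NAME"));
                    }
                }
            }
        }
    }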

Amazon’s ODBC driver enhances our FULL Push Down Optimization capabilities on Redshift. Key differentiators include support for the SYSDATE variable and functions such as ADD_TO_DATE(), ASCII(), CONCAT(), LENGTH(), TO_DATE(), and VARIANCE(), which weren’t possible before.

Amazon’s ODBC driver is not pre-packaged but can be directly downloaded from Amazon’s S3 store.

Once installed, the user can change the default ODBC System DSN in ODBC Data Source Administrator.

To learn more, sign up for the free trial of Informatica’s Redshift connector for Informatica Cloud or PowerCenter.


Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning

Are you ready to answer “Yes” to the questions:

a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”

I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.

The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20 with the theme “Make Data Work”. I want to focus this blog post on two topics related to making data work and business insights:

  • How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
  • How you can make data work in new and advanced ways for every user in your company.

Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.

“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”

Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects. The fully-managed cloud service for building predictive analytics solutions from Azure and the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud helps overcome the challenges most users have in making their data work effectively and efficiently for analytics use cases.

The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.

The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.

Success in using machine learning requires not only great analytics models, but also an end-to-end cloud integration and data management capability that brings in a wide breadth of data sources, ensures that data quality and data views match the requirements for machine learning modeling, and offers an ease of use that facilitates speed of iteration while providing high-performance, scalable data processing.

For example, the Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach and support advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.

Using the Informatica Cloud Connector for Microsoft Azure Storage with Informatica Cloud Data Integration enables optimized reads and writes of data to blobs in Azure Storage. Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry-leading cloud integration solution.
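To give a feel for the blob read/write the connector handles for you, here is a small sketch that stages a CSV extract as a blob using the Azure Storage SDK for Java directly; the connection string, container, and blob names are placeholders:

    import com.azure.storage.blob.BlobClient;
    import com.azure.storage.blob.BlobContainerClient;
    import com.azure.storage.blob.BlobServiceClient;
    import com.azure.storage.blob.BlobServiceClientBuilder;

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;

    public class AzureBlobStagingSketch {
        public static void main(String[] args) {
            // Placeholder connection string; in practice this comes from the Azure portal.
            String connectionString = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net";

            BlobServiceClient service = new BlobServiceClientBuilder()
                    .connectionString(connectionString)
                    .buildClient();

            // Container and blob names are illustrative.
            BlobContainerClient container = service.getBlobContainerClient("staging");
            BlobClient blob = container.getBlobClient("accounts-extract.csv");

            // A tiny CSV payload standing in for data synchronized from an on-premises source.
            byte[] csv = "id,name\n1,Acme\n2,Globex\n".getBytes(StandardCharsets.UTF_8);
            blob.upload(new ByteArrayInputStream(csv), csv.length, true); // overwrite = true
        }
    }

In the connector scenario, of course, the data synchronization task does this staging for you; the point is simply that Azure Storage blobs become ordinary sources and targets.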

As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.

My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!

To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.


Payers – What They Are Good At, And What They Need Help With

In our house when we paint a room, my husband does the big rolling of the walls or ceiling, I do the cut-in work. I am good at prepping the room, taping all the trim and deliberately painting the corners. However, I am thrifty and constantly concerned that we won’t have enough paint to finish a room. My husband isn’t afraid to use enough paint and is extremely efficient at painting a wall in a single even coat. As a result, I don’t do the big rolling and he doesn’t do the cutting in. It took us a while to figure this out, and a few rooms had to be repainted while we were figuring it out. Now we know what we are good at, and what we need help with.

Payers’ roles are changing. Payers were previously focused on risk assessment, setting and collecting premiums, analyzing claims and making payments – all while optimizing revenues. Payers are pretty good at selling to employers, figuring out the cost/benefit ratio from an employer’s perspective and ensuring a good, profitable product. With the advent of the Affordable Care Act, along with a much more transient insured population, payers now must focus more on the individual insured and be able to communicate with individuals in a more nimble manner than in the past.

Individual members will shop for insurance based on consumer feedback and price. They are interested in ease of enrollment and the ability to submit and substantiate claims quickly and intuitively. Payers are discovering that they need to help manage population health at an individual-member level. And population health management requires less of a business-data analytics approach and more social-media and gaming-style logic to understand patients. In this way, payers can help develop interventions to sustain behavioral changes for better health.

When designing such analytics, payers should consider the following key design steps:

Due to their mature predictive analytics competencies, payers will have a much easier time with this next generation of population-behavior analytics than their provider counterparts. Because clinical content is often unstructured compared to claims data, payers need to pay extra attention to context and semantics when deciphering clinical content submitted by providers. Payers can get help from vendors that understand unstructured data at the individual-member level, and can then use that data to create strong predictive analytics solutions.


What’s All the Mania Around SaaS Data?

It’s no secret that the explosion of software-as-a-service (SaaS) apps has revolutionized the way businesses operate. From humble beginnings, the titans of SaaS today include companies such as Salesforce.com, NetSuite, Marketo, and Workday that have gone public and attained multi-billion dollar valuations. The success of these SaaS leaders has had a domino effect in adjacent areas of the cloud – infrastructure, databases, and analytics.

Amazon Web Services (AWS), which originally had only six services in 2006 with the launch of Amazon EC2, now has over 30, spanning storage, relational databases, data warehousing, Big Data, and more. Salesforce.com’s Wave platform, Tableau Software, and Qlik have made great advances in the cloud analytics arena, giving better visibility to line-of-business users. And as SaaS applications embrace new software design paradigms that extend their functionality, application performance monitoring (APM) analytics has emerged as a specialized field from vendors such as New Relic and AppDynamics.

So, how exactly did the growth of SaaS contribute to these adjacent sectors taking off?

The growth of SaaS coincided with the growth of powerful smartphones and tablets. Seeing this form factor as important to the end user, SaaS companies rushed to produce mobile apps that offered core functionality on mobile devices. Measuring adoption of these mobile apps was necessary to ensure that future releases met all the needs of the end user. Mobile apps generate a ton of information, such as app responsiveness, features utilized, and data consumed. As always, there were several types of users, with some preferring a laptop form factor over a smartphone or tablet. With the ever-increasing number of data points to measure within a SaaS app, the area of application performance monitoring analytics really took off.

Simultaneously, the growth of the SaaS titans cemented their reputation as not just applications for a certain line-of-business, but into full-fledged platforms. This growth emboldened a number of SaaS startups to develop apps that solved specialized or even vertical business problems in healthcare, warranty-and-repair, quote-to-cash, and banking. To get started quickly and scale rapidly, these startups leveraged AWS and its plethora of services.

The final sector that has taken off thanks to the growth of SaaS is the area of cloud analytics. SaaS grew by leaps and bounds because of its ease of use, and rapid deployment that could be achieved by business users. Cloud analytics aims to provide the same ease of use for business users when providing deep insights into data in an interactive manner.

In all these different sectors, what’s common is the fact that SaaS growth has created an uptick in the volume of data and the technologies that serve to make it easier to understand. During Informatica’s Data Mania event (March 4th, San Francisco) you’ll find several esteemed executives from Salesforce, Amazon, Adobe, Microsoft, Dun & Bradstreet, Qlik, Marketo, and AppDynamics talk about the importance of data in the world of SaaS.


Being a CIO Today Requires More Than a Service Broker Orientation

In my discussions with CIOs, opinions differ widely about the go-forward nature of the CIO role. While most feel the CIO role will remain an important function, they also feel a sea change is underway. According to Tim Crawford, a former CIO and strategic advisor to CIOs, “CIOs are getting out of the data center business”. In my discussions, not all yet see the complete demise of their data centers. However, it is becoming more common for CIOs to see themselves “becoming an orchestrator of business services versus a builder of new operational services”. One CIO put it this way: “the building stuff is now really table stakes. Cloud and loosely oriented partnerships are bringing vendor management to the forefront”.

As more and more of the service portfolio is provided by third parties in either infrastructure-as-a-service (IaaS) or software-as-a-service (SaaS) modes, the CIO needs to take on what will become an increasingly important role: the service broker. An element of the service broker role that will have increasing importance is the ability to glue together business systems, whether they are on premises, cloud managed (IaaS), or software as a service (SaaS). Regardless of who creates or manages the applications of the enterprise, it is important to remember that integration is to a large degree the nervous system that connects applications into business capabilities. As such, the CIO’s team has a critical and continuing role in managing this linkage. For example, spaghetti-code integrations can easily touch 20 or more systems for ERP or expense management systems.

Brokering integration services

As CIOs start to consider the move to cloud, they need to determine how this nervous system is connected, maintained, and improved. In particular, they need to determine, perhaps for the first time, how to integrate their cloud systems with the rest of their enterprise systems. They can clearly continue to do so by building and maintaining hand-coded integrations or by using their existing ETL tools. This can work where one takes on an infrastructure-as-a-service model. But it falls apart when looking at the total cost of ownership of managing the change of a SaaS model. This fact raises an interesting question. Shouldn’t the advantages of SaaS apply to integration as well? Shouldn’t there be Cloud Data Management (Integration as a Service) options? The answer is yes. Instead of investing in maintaining integrations to SaaS systems, which because of agile methodologies can change more frequently than traditionally developed software, couldn’t someone else manage this mess for you?

The advantage of the SaaS model is total cost of ownership and faster time to value. Instead of you managing the integration between SaaS and historical environments, the integration between SaaS applications and historical applications can be maintained by the Cloud Data Management vendor. This would save both cost and time. As well, it would free you to focus your team’s energy on cleaning up the integrations between historical systems. This is a big advantage for organizations trying to get on the SaaS bandwagon without incurring significantly increased costs as a result.

Definitions please

Infrastructure as a Service (IaaS)—Provides processors, databases, and other infrastructure remotely, but you control and maintain what runs on them

Software as a Service (SaaS)—Provides software applications and the underlying infrastructure as a service

Cloud Data Management—Provides integration of applications, in particular SaaS applications, as a service

Parting remarks

CIOs are embarking upon big changes. Building stuff is becoming less and less relevant. However, even as more and more services are managed remotely (even by other parties), it remains critical that CIOs and their teams manage the glue between applications. With SaaS applications in particular, this is where Cloud Data Management can really help you control integrations with less time and cost.

Related links

Solution Brief:

Hybrid IT Integration

Related Blogs

What is the Role of the CIO in Driving Enterprise Analytics?

IT Is All About Data!

The Secret To Being A Successful CIO

Driving IT Business Alignment: One CIOs Journey

How Is The CIO Role Starting To Change?

The CIO Challenged

Author Twitter: @MylesSuer

Federal Migration to Cloud Computing, Hindered by Data Issues

As reviewed by Loraine Lawson, a MeriTalk survey about cloud adoption found that “in the latest survey of 150 federal executives, nearly one in five say one-quarter of their IT services are fully or partially delivered via the cloud.”

For the most part, the shifts are more tactical in nature.  These federal managers are shifting email (50 percent), web hosting (45 percent) and servers/storage (43 percent).  Most interesting is that they’re not moving traditional business applications, custom business apps, or middleware. Why? Data, and data integration issues.

“Federal agencies are worried about what happens to data in the cloud, assuming they can get it there in the first place:

  • 58 percent of executives fret about cloud-to-legacy system integration as a barrier.
  • 57 percent are worried about migration challenges, suggesting they’re not sure the data can be moved at all.
  • 54 percent are concerned about data portability once the data is in the cloud.
  • 53 percent are worried about ‘contract lock-in.’ ”

The reality is that the government does not get much out of the movement to cloud without committing core business applications, and thus core data. While e-mail, Web hosting, and some storage are a good start, the real cloud computing money is made when moving away from expensive hardware and software. Failing to do that, you fail to find the value and, in this case, spend more taxpayer dollars than you should.

Data issues are not just a concern in the government. Most larger enterprises have the same issues as well. However, a few are able to get around these issues with good planning approaches and the right data management and data integration technology. It’s just a matter of making the initial leap, which most Federal IT executives are unwilling to do.

In working with CIOs of Federal agencies over the last few years, the larger issue is that of funding. While everyone understands that moving to cloud-based systems will save money, getting there means hiring government integrators and living with redundant systems for a time. That involves some major money. If most of the existing budget goes to existing IT operations, then the move may not be practical. Thus, there should be funds made available to work on the cloud projects with the greatest potential to reduce spending and increase efficiencies.

The shame of this situation is that the government was pretty much on the leading edge of cloud computing back in 2008 and 2009. The CIO of the US Government, Vivek Kundra, promoted the use of cloud computing, and NIST drove the initial definitions of “The Cloud,” including IaaS, SaaS, and PaaS. But, when it came down to making the leap, most agencies balked at the opportunity, citing issues with data.

Now that the technology has evolved even more, there is really no excuse for the government to delay migration to cloud-based platforms. The clouds are ready, and the data integration tools have cloud integration capabilities baked in. It’s time to see some more progress.


How Great Data in the Cloud Can Make for Greater Business Outcomes

The technology you use in your business can either help or hinder your business objectives.

In the past, slow and manual processes had an inhibiting effect on customer services and sales interactions, thus dragging down the bottom line.

Now, with cloud technology and customers interacting at record speeds, companies expect greater returns from each business outcome. What do I mean when I say business outcome?

Well, according to Bluewolf’s State of Salesforce Report, you can split these into four categories: acquisition, expansion, retention and cost reduction.

With the right technology and planning, a business can speedily acquire more customers, expand to new markets, increase customer retention and ensure they are doing all of this efficiently and cost effectively. But what happens when the data, or the way you’re interacting with these technologies, grows unchecked or becomes corrupted and unreliable?

With data being the new fuel for decision-making, you need to make sure it’s clean, safe and reliable.

With clean data, Salesforce customers, in the above-referenced Bluewolf survey, reported efficiency and productivity gains (66%), improved customer experience (34%), revenue growth (32%) and cost reduction (21%) in 2014.

It’s been said that it costs a business 10X more to acquire new customers than it does to retain existing ones. But, despite the additional cost, real continued growth requires the acquisition of new customers.

Gaining new customers, however, requires a great sales team who knows what and to whom they’re selling. With Salesforce, you have that information at your fingertips, and the chance to let your sales team be as good as they can possibly be.

And this is where having good data fits in and becomes critically important. Because, well, you can have great technology, but it’s only going to be as good as the data you’re feeding it.

The same “garbage in, garbage out” maxim holds true for practically any data-driven or data-reliant business process or outcome, whether it’s attracting new customers or building a brand. And with the Salesforce Sales Cloud and Marketing Cloud you have the technology to both attract new customers and build great brands, but if you’re feeding your Clouds with inconsistent and fragmented data, you can’t trust that you’ve made the right investments or decisions in the right places.

The combination of good data and technology can help to answer so many of your critical business questions. How do I target my audience without knowledge of previous successes? What does my ideal customer look like? What did they buy? Why did they buy it?

For better or worse, but mainly better, answering those questions with just your intuition and/or experience is pretty much out of the question. Without the tools to look at, for example, past campaigns and sales, and to combine those views to see who your real market is, you’ll never be fully effective.

The same is true for sales. Without the right Leads, and without the ability to interact with those Leads effectively (having the right contact details and company, and knowing there’s only one version of that record), the discovery process can be a long and painful one.

But customer acquisition isn’t the only place where data plays a vital role.

When expanding to new markets or upselling and cross selling to existing customers, it’s the data you collect and report on that will help inform where you should focus your efforts.

Knowing what existing relationships you can leverage can make the difference between proactively offering solutions to your customers and losing them to a competitor. With Salesforce’s Analytics Cloud, visibility that used to take weeks and months to assemble can now be put together in a matter of minutes. But how do you make strategic decisions on what market to tap into or what relationships to leverage, if you can only see one or two regions? What if you could truly visualize how you interact with your customers? Or see beyond the hairball of interconnected business hierarchies and interactions to know definitively what subsidiary, household or distributor has what? Seeing the connections you have with your customers can help uncover the white space that you could tap into.

Naturally this entire process means nothing if you’re not actually retaining these customers. Again, this is another area that is fuelled by data. Knowing who your customers are, what issues they’re having and what they could want next could help ensure you are always providing your customer with the ultimate experience.

Last, but by no means least, there is cost reduction. Only by ensuring that all of this data is clean — and continuously cleansed — and your Cloud technologies are being fully utilized, can you then help ensure the maximum return on your Cloud investment.

Learn more about how Informatica Cloud can help you maximize your business outcomes through ensuring your data is trusted in the Cloud.
