Tag Archives: Cloud Computing

Federal Migration to Cloud Computing, Hindered by Data Issues


Moving towards Cloud Computing

As reviewed by Loraine Lawson, a MeriTalk survey about cloud adoption found that, “In the latest survey of 150 federal executives, nearly one in five say one-quarter of their IT services are fully or partially delivered via the cloud.”

For the most part, the shifts are more tactical in nature.  These federal managers are shifting email (50 percent), web hosting (45 percent) and servers/storage (43 percent).  Most interesting is that they’re not moving traditional business applications, custom business apps, or middleware. Why? Data, and data integration issues.

“Federal agencies are worried about what happens to data in the cloud, assuming they can get it there in the first place:

  • 58 percent of executives fret about cloud-to-legacy system integration as a barrier.
  • 57 percent are worried about migration challenges, suggesting they’re not sure the data can be moved at all.
  • 54 percent are concerned about data portability once the data is in the cloud.
  • 53 percent are worried about ‘contract lock-in.’ ”

The reality is that the government does not get much out of the move to the cloud without committing core business applications, and thus core data.  While moving e-mail, Web hosting, and some storage is a good start, the real cloud computing savings come from moving away from expensive hardware and software.  Failing to do that means failing to find the value and, in this case, spending more taxpayer dollars than necessary.

Data issues are not just a concern in the government.  Most larger enterprises have the same issues as well.  However, a few are able to get around them with good planning approaches and the right data management and data integration technology.  It’s just a matter of making the initial leap, which most federal IT executives are unwilling to do.

In working with CIOs of federal agencies over the last few years, the larger issue is funding.  While everyone understands that moving to cloud-based systems will save money, getting there means hiring government integrators and living with redundant systems for a time.  That involves some major money.  If most of the existing budget goes to existing IT operations, then the move may not be practical.  Thus, funds should be made available for the cloud projects with the greatest potential to reduce spending and increase efficiencies.

The shame of this situation is that the government was pretty much on the leading edge of cloud computing back in 2008 and 2009.  The CIO of the US Government, Vivek Kundra, promoted the use of cloud computing, and NIST drove the initial definitions of “The Cloud,” including IaaS, SaaS, and PaaS.  But when it came down to making the leap, most agencies balked at the opportunity, citing issues with data.

Now that the technology has evolved even further, there is really no excuse for the government to delay migration to cloud-based platforms.  The clouds are ready, and the data integration tools have cloud integration capabilities baked in.  It’s time to see some more progress.


Is Your Data Ready to Maximize Value from Your CRM Investments?


A friend of mine recently reached out to me for advice on CRM solutions in the market.  Though I have never worked for a CRM vendor, I’ve had plenty of relevant experience, from working for companies that implemented such solutions to my current role interacting with large and small organizations across industries about the data requirements supporting their ongoing application investments. As we spoke, memories surfaced from when he and I worked on implementing Salesforce.com (SFDC) many years ago: memories we wanted to forget, but important to call out given his new situation.

We worked together at a large mortgage lending software vendor selling loan origination solutions to brokers and small lenders, mainly through email and snail mail marketing.  He was responsible for Marketing Operations, and I ran Product Marketing. The company looked to Salesforce.com to help streamline our sales operations and improve how we marketed to and serviced our customers.  The existing CRM system dated from the early ’90s, and though it did what the company needed it to do, it was heavily customized, costly to operate, and had served its life. It was time to upgrade, to help grow the business, improve business productivity, and enhance customer relationships.

Within 90 days of rolling out SFDC, we ran into some old familiar problems across the business.  Sales reps still struggled to know who was a current customer using our software, marketing managers could not create quality mailing lists for prospecting, and call center reps could not tell whether the person on the other end was a customer or a prospect. Everyone wondered why this was happening given that we had adopted the best CRM solution in the market.  You can imagine the heartburn and ulcers we all had after making such a huge investment in our new CRM solution.  C-level executives questioned our decisions and blamed the applications. The truth was, the issues were not caused by SFDC but by the data we had migrated into the system, combined with the lack of proper governance and a capable information architecture to support the required data management and integration between systems.

During the implementation phase, IT imported our entire customer database of 200K+ unique customer entities from the old system into SFDC. Unfortunately, the mortgage industry was very transient: on average there were only roughly 55K licensed mortgage brokers and lenders in the market, and because no one had ever validated who was really a current customer versus someone who had ever bought our product, we had serious data quality issues, including:

  • Trial users who had purchased evaluation copies of our products were tagged as current customers even after their evaluations expired
  • Duplicate records, caused by manual data entry errors, in which companies with similar names entered slightly differently but sharing the same business address were tagged as unique customers
  • Subsidiaries of parent companies in different parts of the country were each tagged as unique customers
  • Lastly, the marketing contact database of prospects we imported was incorrectly accounted for as customers in the new system
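Problems like these are exactly what a pre-migration cleansing pass is meant to catch. As a rough sketch only (the field names and matching rules below are hypothetical, not from our actual system), a minimal normalize-and-dedupe step might look like this:

```python
from collections import defaultdict

def normalize(record):
    """Normalize name and address so near-duplicate entries collapse to one key."""
    name = " ".join(record["company"].lower().replace(",", "").replace(".", "").split())
    addr = " ".join(record["address"].lower().split())
    return (name, addr)

def cleanse(records):
    """Drop expired trial users and collapse duplicate company records."""
    buckets = defaultdict(list)
    for r in records:
        if r.get("license_status") == "trial_expired":
            continue  # trial users are prospects, not current customers
        buckets[normalize(r)].append(r)
    # Keep one survivor per normalized key (a real tool would merge attributes).
    return [group[0] for group in buckets.values()]

customers = [
    {"company": "Acme Lending, Inc.", "address": "1 Main St", "license_status": "active"},
    {"company": "ACME Lending Inc",   "address": "1 Main St", "license_status": "active"},
    {"company": "Trial Broker LLC",   "address": "9 Elm Ave", "license_status": "trial_expired"},
]
print(len(cleanse(customers)))  # the two Acme variants collapse to one record
```

A real data quality tool adds fuzzy matching and survivorship rules on top of this idea, but the shape of the problem is the same.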

We also failed to integrate real-time purchasing data and information from our procurement systems, which sales and support needed to handle customer requests. Instead of integrating that data in real time with the proper technology, IT manually loaded these records at the end of each week via FTP, resulting in incorrect billing information, broken statement processing, and a ton of complaints from customers through our call center. The price we paid for ignoring our data quality and integration requirements before rolling out Salesforce.com was significant for a company of our size. For example:

  • Marketing got hit pretty hard. Each quarter we mailed evaluation copies of new products to our customer database of 200K contacts, each piece costing the company $12 to produce and mail. Total cost = $2.4M annually.  Because the data was so bad, 60 percent of our mailings came back due to invalid addresses or wrong contact information. The cost of bad data to marketing = $1.44M annually.
  • Next, Sales struggled miserably when trying to upgrade customers by running cold call campaigns using the names in the database. As a result, sales productivity dropped by 40 percent, and the team experienced over 35 percent turnover that year. Within a year of using SFDC, our head of sales was let go. Not good!
  • Customer support used SFDC to service customers, and our average call times ran 40 minutes per service ticket. We believed that was “business as usual” until we surveyed how reps were spending their time each day: over 50 percent said it went to billing issues caused by bad contact information in the CRM system.
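For what it’s worth, the marketing arithmetic above is easy to check:

```python
# Figures from the story above: a 200K-record customer database,
# $12 per piece to produce and mail, 60% of mailings returned due to bad data.
db_size = 200_000
cost_per_piece = 12
annual_mailing_cost = db_size * cost_per_piece        # $2,400,000 annually

return_rate = 0.60
cost_of_bad_data = annual_mailing_cost * return_rate  # $1,440,000 wasted annually
print(annual_mailing_cost, cost_of_bad_data)
```

In other words, more than half of the mailing budget bought nothing but returned mail.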

At the end of our conversation, this was my advice to my friend:

  • Conduct a data quality audit of the systems that will interact with the CRM system. Audit how complete your critical master and reference data is, including names, addresses, customer IDs, etc.
  • Do this before you invest in a new CRM system. You may find that many of the challenges with your existing applications are caused by data gaps rather than by the legacy application itself.
  • If you have a data governance program, involve that team in the CRM initiative to ensure they understand your requirements and can see how to help.
  • However, if you do decide to modernize, collaborate with and involve your IT teams, especially your Application Development teams and your Enterprise Architects, to ensure all of the best options are considered for your data sharing and migration needs.
  • Lastly, consult with your technology partners, including your new CRM vendor; they may be working with solution providers to address these data issues, as you are probably not the only one in this situation.
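The first piece of advice, the data quality audit, can start very simply. A minimal completeness check, with hypothetical field names, is just a per-field count of usable values:

```python
def completeness_report(records, critical_fields):
    """Return the fraction of records with a non-empty value for each critical field."""
    total = len(records)
    report = {}
    for field in critical_fields:
        filled = sum(1 for r in records if str(r.get(field) or "").strip())
        report[field] = filled / total if total else 0.0
    return report

rows = [
    {"name": "Jo Smith", "address": "1 Main St", "customer_id": "C001"},
    {"name": "",         "address": "9 Elm Ave", "customer_id": "C002"},
    {"name": "A. Lee",   "address": None,        "customer_id": ""},
]
print(completeness_report(rows, ["name", "address", "customer_id"]))
```

Running a report like this against each system that will feed the CRM gives you a concrete baseline before any migration decision is made.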

Looking Ahead!

CRM systems have come a long way in today’s Big Data and Cloud era. Many firms are adopting more flexible solutions offered through the cloud, like Salesforce.com, Microsoft Dynamics, and others. Regardless of how old or new, on premise or in the cloud, companies invest in CRM not just to serve their sales teams or increase marketing conversion rates, but to improve their business relationships with their customers. Period! It’s about ensuring the data in these systems is trustworthy, complete, up to date, and actionable, so it can improve customer service and help drive sales of new products and services to increase wallet share. So how do you maximize the business potential of these critical business applications?

Whether you are adopting your first CRM solution or upgrading an existing one, keep in mind that Customer Relationship Management is a business strategy, not just a software purchase. It’s also about having a sound and capable data management and governance strategy supported by people, processes, and technology to ensure you can:

  • Access and migrate data from the old system to the new, avoiding development cost overruns and project delays.
  • Identify, detect, and distribute transactional and reference data from existing systems into your front-line business applications in real time!
  • Manage data quality errors, including duplicate records and invalid names and contact information, through proper data governance and proactive data quality monitoring and measurement during and after deployment.
  • Govern and share authoritative master records of customer, contact, product, and other master data between systems in a trusted manner.

Will your data be ready for your new CRM investments?  To learn more:

Follow me on Twitter @DataisGR8


Cloud Integration Issues? Look to the Enterprise Architects


According to an article by Jan Stafford, “When enterprises adopt cloud computing, many of their legacy methods of software integration are instantly obsolete. Hanging on to old integration methods is like trying to fit square pegs into round holes…”

It’s true.  Data integration is a whole new game, compared to five years ago, or, in some organizations, five minutes ago.  The right approaches to data integration continue to evolve around a few principal forces: First, the growth of cloud computing, as pointed out by Stafford.  Second, the growing use of big data systems, and the emerging use of data as a strategic asset for the business.

These forces combine to drive us to the understanding that old approaches to data integration won’t provide the value that they once did.  As someone who was a CTO of three different data integration companies, I’ve seen these patterns change over the time that I was building technology, and that change has accelerated in the last 7 years.

The core opportunities lie with enterprise architects and their ability to drive an understanding of the value of data integration, as well as to drive change within their organizations.  After all, they, or the enterprise’s CTOs and CIOs (whoever makes decisions about technological approaches), are supposed to steer the organization in the technical directions that best support the business.  While most enterprise architects follow the latest hype, such as cloud computing and big data, many have missed the underlying data integration strategies and technologies that will support these changes.

“The integration challenges of cloud adoption alone give architects and developers a once in a lifetime opportunity to retool their skillsets for a long-term, successful career, according to both analysts. With the right skills, they’ll be valued leaders as businesses transition from traditional application architectures, deployment methodologies and sourcing arrangements.”

The problem is that, while most agree that data integration is important, they typically don’t understand what it is or the value it can bring.  These days, many developers live in a world of instant updates.  With emerging DevOps approaches and infrastructure, they really don’t get the need, or the mechanisms, required to share data between application and database silos.  In many instances, they resort to hand-coding interfaces between source and target systems.  This leads to brittle and unreliable integration solutions, which hurt rather than help new cloud application and big data deployments.
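To make that brittleness concrete, here is the kind of hand-coded, point-to-point mapping the paragraph describes (all field names are invented for illustration). Every assumption about the source schema is hard-wired, so any upstream change quietly breaks the interface:

```python
def sync_customer(source_row):
    """Hand-coded interface: copies a CRM row into the billing system's shape.
    Works only as long as both schemas stay exactly as assumed."""
    return {
        "BILL_NAME": source_row["FirstName"] + " " + source_row["LastName"],
        "BILL_ADDR": source_row["Street"],   # breaks if the source renames Street -> Address1
        "BILL_ZIP":  source_row["Zip"][:5],  # silently truncates if the Zip format changes
    }

row = {"FirstName": "Pat", "LastName": "Jones", "Street": "1 Main St", "Zip": "94105-1234"}
print(sync_customer(row))
```

Multiply this by dozens of source-target pairs and every schema change becomes a maintenance fire drill, which is exactly what a metadata-driven integration platform is meant to avoid.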

The message is clear: Those charged with defining technology strategies within enterprises need to also focus on data integration approaches, methods, patterns, and technologies.  Failing to do so means that the investments made in new and emerging technology, such as cloud computing and big data, will fail to provide the anticipated value.  At the same time, enterprise architects need to be empowered to make such changes.  Most enterprises are behind on this effort.  Now it’s time to get to work.


Amazon Web Services and Informatica Deliver Data-Ready Cloud Computing Infrastructure for Every Business

At re:Invent 2014 in Las Vegas,  Informatica and AWS announced a broad strategic partnership to deliver data-ready cloud computing infrastructure to any type or size of business.

Informatica’s comprehensive portfolio across Informatica Cloud and PowerCenter solutions connect to multiple AWS Data Services including Amazon Redshift, RDS, DynamoDB, S3, EMR and Kinesis – the broadest pre-built connectivity available to AWS Data Services. Informatica and AWS offerings are pre-integrated, enabling customers to rapidly and cost-effectively implement data warehousing, large scale analytics, lift and shift, and other key use cases in cloud-first and hybrid IT environments. Now, any company can use Informatica’s portfolio to get a plug-and-play on-ramp to the cloud with AWS.

Economical and Flexible Path to the Cloud

As business information needs intensify and data environments become more complex, the combination of AWS and Informatica enables organizations to increase the flexibility and reduce the costs of their information infrastructures through:

  • More cost-effective data warehousing and analytics – Customers benefit from lower costs and increased agility when unlocking the value of their data with no on-premise data warehousing/analytics environment to design, deploy and manage.
  • Broad, easy connectivity to AWS – Customers gain full flexibility in integrating data from any Informatica-supported data source (the broadest set of sources supported by any integration vendor) through the use of pre-built connectors for AWS.
  • Seamless hybrid integration – Hybrid integration scenarios across Informatica PowerCenter and Informatica Cloud data integration deployments are able to connect seamlessly to AWS services.
  • Comprehensive use case coverage – Informatica solutions for data integration and warehousing, data archiving, data streaming and big data across cloud and on-premise applications mesh with AWS solutions such as RDS, Redshift, Kinesis, S3, DynamoDB, EMR and other AWS ecosystem services to drive new and rapid value for customers.

New Support for AWS Services

Informatica introduced a number of new Informatica Cloud integrations with AWS services, including connectors for Amazon DynamoDB, Amazon Elastic MapReduce (Amazon EMR) and Amazon Simple Storage Service (Amazon S3), to complement the existing connectors for Amazon Redshift and Amazon Relational Database Service (Amazon RDS).
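For context on what such connectors do under the hood, the standard way to bulk-load Amazon Redshift is a COPY statement reading staged files from Amazon S3. The sketch below only builds the SQL; the table, bucket, and IAM role names are placeholders, and this is a generic illustration, not Informatica’s implementation:

```python
def build_redshift_copy(table, s3_path, iam_role):
    """Build a Redshift COPY statement that bulk-loads CSV data staged in S3."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS CSV IGNOREHEADER 1;"
    )

sql = build_redshift_copy(
    table="analytics.orders",
    s3_path="s3://example-staging/orders/2014-11-11.csv",
    iam_role="arn:aws:iam::123456789012:role/RedshiftLoadRole",
)
print(sql)  # execute via any PostgreSQL-protocol client connected to the cluster
```

A pre-built connector essentially automates this staging-and-COPY pattern, along with error handling and incremental loads, so the developer never writes it by hand.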

Additionally, the latest Informatica PowerCenter release for Amazon Elastic Compute Cloud (Amazon EC2) includes support for:

  • PowerCenter Standard Edition and Data Quality Standard Edition
  • Scaling options – Grid, high availability, pushdown optimization, partitioning
  • Connectivity to Amazon RDS and Amazon Redshift
  • Domain and repository databases in Amazon RDS, per the current database PAM (Product Availability Matrix)

To learn more, try our 60-day free Informatica Cloud trial for Amazon Redshift.

If you’re in Vegas, please come by our booth at re:Invent, Nov. 11-14: Booth #1031, Venetian/Sands Expo Hall.


Elevating Your App by Embedding Advanced, REST-ful Cloud Integration


While the great migration to the cloud has created opportunities for practically everyone in tech, no one group stands to benefit from it more than application developers. In partnership with category-leading SaaS companies like Salesforce, NetSuite, Ultimate Software and others, ISVs have not only easy access to open APIs to leverage but also complete marketplaces to establish themselves and exploit for market and mind share. And within those ecosystems, there stands a whole new class of users looking to these applications to solve their specific business problems.

But, as Billy Macinnes, in his July MicroScope article, reminds us, the opportunities come with many challenges, and so far only a few ISVs have risen high enough to truly meet them all. While the article itself is more concerned about where ISVs are headed, Macinnes and the industry experts he references, such as Mike West and Philip Howard, make it clear that no one is going anywhere far without a cloud strategy that meaningfully addresses data integration.

As a business app consumer myself, I too am excited by the possibilities that exist. I am intrigued by the way in which the new applications embrace the user-first ethos and deliver consumer-app-like interfaces and visual experiences. What concerns me is what happens next, once you get beyond the pretty design and actually try to solve the business use case for which the app is intended. This, unfortunately, is where many business apps fail. While most can access data from a single specific application, few can successfully interact with external data coming from multiple sources.

Like many of the challenges (such as licensing and provisioning) faced by today’s ISVs, data integration is something that lies outside of the expertise area of a typical app developer. Let’s say, for example, you’ve just come up with a new way to anticipate customer needs and match it with excess inventory. While the developer expertise and art of the app may be, say, in a new algorithm, the user experience, ultimately, is equally dependent on your ability to surface data – inventory, pricing, SKU numbers, etc. – that may be held in SaaS and on-premises systems and seamlessly marry it – behind the scenes – to cloud-based customer information.

The bottom line is that regardless of the genius behind your idea or user interface, if you can’t feed relevant data into your application and ensure its completeness, quality and security for meaningful consumption, your app will be dead in the water. As a result, app developers are spending an inordinate amount of time – in some cases up to 80% of their development cycle – working through data issues. Even with that, many still get stuck and end up with little more for their effort than a hard lesson in the difficulties of enterprise data integration long understood by every integration developer.

Fortunately, there is a better way: cloud integration.

Cloud integration enables the developer to focus on their app and core business. The ISV can offer cloud integration to its customers as an external resource or as an embedded part of its app. While some may see this as a choice, any ISV looking to provide the best possible user experience has no real option other than to embed the integration services as part of their application.

Look at any successful business app, and chances are you’ll find something that empowers users to work independently, without having to rely on other teams or tools for solutions. Take, for example, the common use case of bringing data into an app via a CSV file. With integration built directly into the app, users can upload the file and resolve any semantic conflicts themselves, with no assistance from IT. Without it, users are reliant on others to do their jobs, and ultimately less productive. Clearly, the better experience is the one that provides users with easy access to everything needed, including data from multiple sources, to get the work done themselves. And the most effective way to do that is by embedding integration into the application.
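To make the CSV use case concrete: “resolving semantic conflicts” largely means mapping the file’s column headers onto the app’s field names before loading. A stripped-down sketch, with the mapping rules invented for illustration:

```python
import csv
import io

# User-confirmed mapping from the CSV's headers to the app's field names.
COLUMN_MAP = {"Company Name": "account", "E-mail": "email", "Phone #": "phone"}

def load_csv(text, column_map):
    """Parse an uploaded CSV and rename columns so they match the app's schema."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {column_map[h]: value for h, value in row.items() if h in column_map}
        for row in reader
    ]

uploaded = "Company Name,E-mail,Phone #\nAcme,info@acme.example,555-0100\n"
print(load_csv(uploaded, COLUMN_MAP))
```

An embedded integration capability wraps exactly this kind of mapping in a guided UI, so the business user confirms the column matches instead of waiting on IT.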

Now that we’ve settled why cloud integration works best as an embedded capability, let’s take a closer look at how it works within the application context.

With cloud integration embedded into your app, you can essentially work behind the scenes to connect different data sources and incorporate the mapping and workflows between your app and the universe of enterprise data sources. How it accomplishes that is through abstraction. By abstracting connectivity to these data sources, you take the complexities involved with bringing data from an external source – such as SAP or Salesforce – and place it within a well-defined integration template or Vibe Integration Package (VIP). Once these abstractions are defined, you can then, as an application developer, access these templates through REST API and bring the specified data into your application.

While connectivity abstraction and REST APIs are important on their own, like all great pairings, it is only in combination that their true utility is realized. In fact, taken separately, neither is of much value to the application developer. Alone, a REST API can access the raw data type, but without the abstraction, the information is too unintelligible and incomplete to be of any use. And without the REST API, the abstracted data has no way of getting from the source to the application.

The value that REST APIs bring together with connectivity abstraction cannot be overstated, especially when the connectivity can span multiple integration templates. The mechanism for accomplishing integration is, like an automobile transmission, incredibly complex. But just as a car’s shift lever exposes a simple interface for moving the gears from Park to Drive, setting off a series of complex motions under the hood, the integration templates allow users to work with the data in any way they want without ever having to understand or know about the complexities underneath.

As the leading cloud integration solution and platform, Informatica Cloud has long recognized the importance of pairing REST APIs and connectivity abstraction.

The first and most important function within our REST API is administration: it enables you to set up your organization and administer your users and permissions. The second function allows you to run and monitor integration tasks. And with the third, end users can configure the integration templates themselves and enforce the business rules that apply to their specific process. You can view the entire set of Informatica Cloud REST API capabilities here.
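As a feel for what the second function, running and monitoring tasks, looks like from a developer’s seat, here is a sketch against a hypothetical endpoint. The base URL, path, and payload here are illustrative only, not the actual Informatica Cloud API; consult the published reference for the real calls:

```python
import json
from urllib import request

API_BASE = "https://api.example-integration.test/v1"  # placeholder base URL

def start_task_request(session_id, task_name):
    """Build (but don't send) the HTTP request that kicks off an integration task."""
    body = json.dumps({"taskName": task_name}).encode()
    return request.Request(
        f"{API_BASE}/job/start",
        data=body,
        headers={"Content-Type": "application/json", "icSessionId": session_id},
        method="POST",
    )

req = start_task_request("demo-session", "Nightly_Orders_Sync")
print(req.full_url, req.get_method())
# In a real app you would send it with urllib.request.urlopen(req)
# and then poll a corresponding status endpoint to monitor the run.
```

The point is the shape: authenticate once, then drive task execution and monitoring with a couple of small, scriptable HTTP calls from inside your own application.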

It is in this last area – integration configurability – where we are truly setting ourselves apart. The Vibe Integration Packages (VIPs) not only abstract backend connectivity but also ensure that the data is complete – with the needed attributes from the underlying apps – and is of high quality and formatted for easy consumption in the end-user application. With the Packages, we’ve put together many of the most common integrations with reusable integration logic that is configurable through a variety of parameters. Our configurable templates enable your app users to customize and fine-tune their integrations – with custom fields, objects, etc. – to meet the specific behavior and functionality of their integrations. For example, the Salesforce to SAP VIP includes all the integration templates you need to solve different business use cases, such as integrating product, order and account information.

With their reusability and groupings encompassing many of the common integration use cases, our Vibe Integration Packages really are revolutionizing work for everyone. Using Informatica Cloud’s Visual Designer, developers can quickly create new, reusable VIPs, with parameterized values, for business users to consume. And SaaS administrators and business analysts can perform complex business integrations in a fraction of the time it took previously, and customize new integrations on the fly, without IT’s help.

More and more, developers are building great-looking apps with even greater aspirations. In many cases, the only thing holding them back is the ability to access back-office data without using external tools and interfaces, or outside assistance. With Informatica Cloud, data integration need no longer take a backseat to design, or anything else. Through our REST API, abstractions and Vibe Integration Packages, we help developers put an end to the compromise on user experience by bringing in the data directly through the application – for the benefit of everyone.


Making the Hybrid Cloud Work for Public Sector


If you’ve been working in the government sector for any amount of time, you had to see the advent of the “hybrid cloud” coming. Like all new technologies, when first introduced, “the cloud” was the answer to all your IT woes: cheaper, more reliable, infinitely scalable, instantly adaptable, and so on. But as time has gone by and many of you have dipped your toes in the water, the reality is beginning to surface, and challenges are beginning to appear. Sure, moving email to the cloud was a great first step, and it certainly gave most agencies the ability to show progress in leveraging the cloud. Yes, archiving data to the cloud is also a good use case and is showing progress. But what’s next? There are plenty of new SaaS offerings popping up, purpose-built to solve various public sector challenges, and yes, they are generally decent applications. Yet, would it be fair to suggest that new challenges are arising as your agency begins to adopt new cloud solutions? In particular, has the advent of specialized applications for government made your overall IT portfolio simpler or more complex? Government has always struggled with a vast array of siloed systems, and isn’t the cloud creating yet more challenges in this regard? Well, maybe. Let’s take a look.

What I love about the cloud is that it has something of value to offer practically any government organization, regardless of size, maturity, point of view, or approach. Even for the most conservative IT shops, there are use cases that just plain make sense. And with the growing availability of FedRAMP certified offerings, it’s becoming easier to procure. But, thinking realistically, for reasons of law, budget, time, and architecture, we know the cloud will not be the solution for every public sector problem. Some applications and some data will never leave your agency’s premises. And herein lies the new complexity. You have applications and data on-prem. You have applications and data in the cloud. And you have business requirements that require these apps to work together, to share data.

So, now that you have a hybrid environment, what can you do about it? Let’s face it: we can talk about technology, architecture, and approaches all day long, but it always comes down to this: what should be done with the data? You need answers to questions such as: Is it safe? Is it accessible? Is it reliable? How do I know if its integrity has been compromised? What about the quality? How error-prone is the data? How complete is the data? How do we manage it across this new hybrid landscape? How can I get data from a public cloud application to my on-prem data warehouse? How can I leverage the flexibility of public IaaS to build a new application that needs access to data also required by an on-prem legacy application?

I know many government IT professionals are wrestling with these questions and seeking solutions. So, here’s an interesting thought: most of these questions are not exactly new; they are just taking on the added context of the cloud. Prior to the cloud, many agencies found answers in the form of a data integration platform. The platform is used to ensure every application and every user has access to the data they need to perform their mission or job. I think of it this way: the platform is a “standardized” abstraction layer that ensures all your data gets to where it needs to be, when it needs to be there, in the form it needs to be in. There are hundreds of government IT shops using such an approach.

Here’s the good news: this approach to integrating data can be extended to include the cloud.  Imagine placing “agents” in all the places where your data needs to live, with the agents capable of communicating with each other to integrate, alter, or move data. Now add to this the idea of a cloud-based remote control that allows you to control all the functions of the agents. Using such a platform enables your agency to tie on-prem systems to cloud systems, minimizing the effect of having multiple silos of information. Government workers and warfighters will be able to get complete, accurate data more quickly, regardless of where it originates, and citizens will benefit from more effectively delivered services.
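The agents-plus-remote-control idea maps onto a simple control-plane pattern: a central controller keeps a registry of agents, one per location where data lives, and routes each task to the agent closest to the data. A toy sketch, with all names invented:

```python
class Agent:
    """Runs next to a data source (on-prem or cloud) and executes tasks locally."""
    def __init__(self, location):
        self.location = location
        self.log = []

    def run(self, task):
        self.log.append(task)
        return f"{task} executed at {self.location}"

class RemoteControl:
    """Cloud-based controller: registers agents and dispatches work to them."""
    def __init__(self):
        self.agents = {}

    def register(self, agent):
        self.agents[agent.location] = agent

    def dispatch(self, location, task):
        return self.agents[location].run(task)

control = RemoteControl()
for loc in ("agency-datacenter", "public-cloud"):
    control.register(Agent(loc))

# Move data between silos by driving both agents from one place.
print(control.dispatch("agency-datacenter", "extract customer records"))
print(control.dispatch("public-cloud", "load customer records"))
```

The real products add secure channels, scheduling, and transformation engines, but the architectural payoff is the same: the data never has to detour through the controller, only the instructions do.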

How would such an approach change your ideas on how to leverage the cloud for your agency? If you live near the Washington, DC area, you may wish to drop in on the Government Cloud Computing and Data Center Conference & Expo, where one of my colleagues, Ronen Schwartz, will be discussing this topic. For those not in the vicinity, you can learn more here.


Informatica Cloud Summer ’14 Release Breaks Down Barriers with Unified Data Integration and Application Integration for Real Time and Bulk Patterns

This past week, Informatica Cloud marked an important milestone with the Summer 2014 release of the Informatica Cloud platform. This was the 20th Cloud release, and I am extremely proud of what our team has accomplished.

“SDL’s vision is to help our customers use data insights to create meaningful experiences, regardless of where or how the engagement occurs. It’s multilingual, multichannel and on a global scale. Being able to deliver the right information at the right time to the right customer with Informatica Cloud Summer 2014 is critical to our business and will continue to set us apart from our competition.”

– Paul Harris, Global Business Applications Director, SDL Pic

When I joined Informatica Cloud, I knew that it had the broadest cloud integration portfolio in the marketplace: leading data integration and analytic capabilities for bulk integration, comprehensive cloud master data management and test data management, and over a hundred connectors for cloud apps, enterprise systems and legacy data sources, all delivered in a self-service design with point-and-click wizards for citizen integrators, without the need for complex and costly manual custom coding.

But I also learned that our broad portfolio conceals another structural advantage: because of Informatica Cloud’s unique, unified platform architecture, it can surface application (or real time) integration capabilities alongside its data integration capabilities, with shared metadata across real time and batch workflows.

With the Summer 2014 release, we’ve brought our application integration capabilities to the forefront. We now provide the most complete cloud app integration capability in the marketplace. With a design environment that’s meant not just for developers but also for line-of-business IT, app admins can now build real time process workflows that cut across on-premise and cloud systems and include built-in human workflows. And with the capability to translate these process workflows instantly into mobile apps for iPhone and Android devices, we’re not just setting ourselves apart but also giving customers the unique capabilities they need for their increasingly mobile employees.

Informatica Cloud Summer Release Webinar Replay

“Schneider’s strategic initiative to improve front-office performance relied on recording and measuring sales person engagement in real time on any mobile device or desktop. The enhanced real time cloud application integration features of Informatica Cloud Summer 2014 makes it all possible and was key to the success of a highly visible and transformative initiative.”

– Mark Nardella, Global Sales Process Director, Schneider Electric SE

With this release, we’re also giving customers the ability to create workflows around data sharing that mix and match batch and real time integration patterns. This is really important, because unlike the past, when you had to choose between batch and real time, in today’s world of on-premise, cloud-based, transactional and social data you are, more than ever, having to deal with both real time interactions and the processing of large volumes of data. For example, let’s consider a typical scenario these days at a high-end retail store. Using a clienteling iPad app, the sales rep looks up bulk purchase history and inventory availability data in SAP, confirms availability and delivery date, and then processes the customer’s order via real time integration with NetSuite. And if you ask any customer, having a single workflow to unify all of that for instant and actionable insights is a huge advantage.
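The retail scenario above can be sketched as one workflow that chains a batch step with a real time step. The function names and the stub endpoints standing in for SAP and NetSuite below are illustrative assumptions, not Informatica Cloud's API:

```python
# Sketch: one workflow mixing a batch pattern (scan a bulk extract for a
# customer's purchase history) with a real-time pattern (a synchronous
# call to the order system).

def batch_purchase_history(erp_records, customer_id):
    """Batch pattern: filter a bulk ERP extract down to one customer."""
    return [r for r in erp_records if r["customer"] == customer_id]

def realtime_place_order(order_api, customer_id, sku):
    """Real-time pattern: synchronous call to the order endpoint."""
    return order_api(customer_id, sku)

def clienteling_workflow(erp_records, order_api, customer_id, sku):
    history = batch_purchase_history(erp_records, customer_id)        # batch
    confirmation = realtime_place_order(order_api, customer_id, sku)  # real time
    return {"history": history, "order": confirmation}

# Stub data and endpoint standing in for SAP and NetSuite.
erp = [{"customer": "C1", "sku": "BAG-9"},
       {"customer": "C2", "sku": "SHOE-1"}]
fake_order_api = lambda cust, sku: {"customer": cust, "sku": sku,
                                    "status": "confirmed"}

result = clienteling_workflow(erp, fake_order_api, "C1", "BAG-9")
```

The value of unifying both patterns in one workflow is exactly what the single `clienteling_workflow` function shows: the caller gets history and order confirmation from one invocation instead of stitching two separate tools together.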

“Our industry demands absolute efficiency, speed and trust when dealing with financial information, and the new cloud application integration feature in the latest release of Informatica Cloud will help us service our customers more effectively by delivering the data they require in a timely fashion. Keeping call-times to a minimum and improving customer satisfaction in real time.”

– Kimberly Jansen, Director CRM, Misys PLC

We’ve also included some exciting new Vibe Integration packages or VIPs. VIPs deliver pre-built business process mappings between front-office and back-office applications. The Summer 2014 release includes new bidirectional VIPs for Siebel to Salesforce and SAP to Salesforce that make it easier for customers to connect their Salesforce with these mission-critical business applications.

And last, but not least, the release includes a critical upgrade to our API Framework that gives the Informatica Cloud iPaaS end-to-end support for connectivity to any company’s internal or external APIs. With the newly available API creation, definition and consumption patterns, developers or citizen integrators can now easily expose integrations as APIs, and users can consume them via integration workflows or apps, without the need for any additional custom code.
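The "expose an integration as an API" idea can be reduced to a small registry pattern: a named API maps to an integration function, and callers invoke the workflow by name. The registry, decorator and `sync-contacts` integration below are assumptions for illustration; in the real platform this is configured, not hand-coded:

```python
# Sketch: publish integration workflows under API names so that apps and
# other workflows can consume them by name, with no extra custom code on
# the consuming side.

API_REGISTRY = {}

def expose_as_api(name):
    """Decorator that publishes an integration function under an API name."""
    def register(func):
        API_REGISTRY[name] = func
        return func
    return register

@expose_as_api("sync-contacts")
def sync_contacts(payload):
    # Placeholder integration: normalize the contact records it is given.
    return [{"email": rec["email"].lower()} for rec in payload]

def call_api(name, payload):
    """What a consuming app or workflow does to invoke an exposed API."""
    return API_REGISTRY[name](payload)

response = call_api("sync-contacts", [{"email": "Ada@Example.COM"}])
```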

The features and capabilities released this summer are available to all existing Informatica Cloud customers, and everyone else through our free 30-day trial offer.


Bulletproof Tips to Optimize Data Transformations for Salesforce

Optimize Data Transformations for Salesforce

In the journey from a single-purpose cloud CRM app to the behemoth that it is today, Salesforce has made many smart acquisitions. However, the recent purchase of RelateIQ may have just been its most ingenious. Although a relatively small startup, RelateIQ has gained a big reputation for its innovative use of data science and predictive analytics, which would be highly beneficial to Salesforce customers.

As is evident from the acquisition, there is little doubt that the cloud application world is making a tectonic shift to data science, and the appetite for Big Data to be pieced together to fuel the highly desired 360-degree view is only growing stronger.

But while looking ahead is certainly important, those of us who live in the present still have much work to accomplish in the here and now. For many, that means figuring out the best way to leverage data integration strategies and the Salesforce platform to gain actionable intelligence for our sales, marketing and CRM projects – today. Until recently, this has involved manual, IT-heavy processes.

We need look no further than three common use cases where typical data integration strategies and technologies fail today’s business users:

Automated Metadata Discovery
The first, and perhaps most frustrating, has to do with discovering related objects. Objects such as Accounts don’t exist in a vacuum. They have related objects, such as Contacts, that can provide the business user – be it a salesperson or a customer service manager – with context about the customer.

Now, during the typical data-integration process, these related objects are obscured from view because most integration technologies in the market today cannot automatically recognize the object metadata that could potentially relate all of the customer data together. The result is an incomplete view of the customer in Salesforce, and a lost opportunity to leverage the platform’s capability to strengthen a relationship or close a deal.

The Informatica Cloud platform is engineered from the ground up to be aware of the application ecosystem API and understand its metadata. As a result, our mapping engine can automatically discover metadata and relate objects to one another. This automated metadata discovery gives business users the ability to discover, choose and bring all of the related objects together into one mapping flow. Now, with just a few clicks, business users can quickly piece together relevant information in Salesforce and take the appropriate action.
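The core of automated metadata discovery is walking the reference fields in describe-style metadata to find every object related to a starting object. The toy metadata dictionary below is modeled loosely on what CRM APIs such as Salesforce expose, but it is an invented example, not a real API response:

```python
# Sketch: given describe-style object metadata, find the objects related
# to a starting object by following reference fields in both directions.

METADATA = {
    "Account": {"fields": [{"name": "Id", "references": None},
                           {"name": "OwnerId", "references": "User"}]},
    "Contact": {"fields": [{"name": "AccountId", "references": "Account"}]},
    "Opportunity": {"fields": [{"name": "AccountId", "references": "Account"}]},
    "User": {"fields": []},
}

def related_objects(metadata, obj):
    """Objects that `obj` points at, plus objects that point back at it."""
    outgoing = {f["references"]
                for f in metadata[obj]["fields"] if f["references"]}
    incoming = {name for name, meta in metadata.items()
                for f in meta["fields"] if f["references"] == obj}
    return sorted(outgoing | incoming)

# Everything a mapping flow could pull in alongside Account:
account_related = related_objects(METADATA, "Account")
```

A mapping engine that does this walk automatically is what turns "pick an object" into "pick an object and everything that gives it context," which is the incomplete-view problem described above.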

Bulk Preparation of Data
The second instance where most data integration solutions typically fall short is with respect to the bulk preparation of data for analytic runs prior to the data transformation process. With the majority of Line-of-Business (LOB) applications now being run in the cloud, the typical business user has multiple terabytes of cloud data that need to be warehoused, either on-premise or in the cloud, for a BI app to perform its analytics.

As a result, the best practice for bringing in data for analytics requires the ability to select and aggregate multiple data records from multiple data sources into a single batch, to speed up transformation and loading and – ultimately – intelligence gathering. Unfortunately, advanced transformations such as aggregations are simply beyond the capabilities of most other integration technologies. Instead, most bring the data in one record or message at a time, inhibiting speedy data loading and the delivery of critical intelligence to business users when they need it most.

Informatica, by contrast, has leveraged its intellectual property in the market-leading data integration product, PowerCenter, and developed data aggregation transformation functionality within Informatica Cloud. This enables business users to pick a select group of important data points from multiple data sources – for example, country or dollar size of revenue opportunities – to aggregate and quickly process huge volumes in a single batch.
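The country/revenue example above boils down to a group-and-sum before loading: instead of pushing opportunity records one at a time, group them by a chosen key and load one row per group. This is a hedged sketch of the idea; the field names are invented:

```python
# Sketch: aggregate records by a key field before the load step, so a
# single batch row is loaded per group instead of one row per record.

from collections import defaultdict

def aggregate_by(records, key, amount_field):
    """Group records on `key` and sum `amount_field` within each group."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[key]] += rec[amount_field]
    # One row per group, ready for a single bulk load.
    return [{key: k, amount_field: v} for k, v in sorted(totals.items())]

opportunities = [
    {"country": "DE", "revenue": 120000.0},
    {"country": "US", "revenue": 250000.0},
    {"country": "DE", "revenue": 80000.0},
]
batch = aggregate_by(opportunities, "country", "revenue")
```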

In-App Business Accelerators
Similar to what we’ve experienced with the mobile iOS and Android platforms, there recently has been an explosion of new, single-purpose business applications being developed and released for the cloud. While the platform-specific mobile model is predicated on and has flourished because of the “build it and they will come” premise, the paradigm does not work in the same way for cloud-based applications that are trying to become a platform.

The reason for this is that, unlike with the iOS and Android platforms, in the cloud world business users have a main LOB application that they are familiar with and rely on packaged integrations to bring in data from other LOB cloud applications. However, with Salesforce extending its reach to become a cloud platform, the center of data gravity is shifting towards it being used for more than just CRM, and the success of this depends upon these packaged integrations. Up until now, these integrations have consisted of little more than sample code and have been incredibly complex (and IT-heavy) to build and very cumbersome to use. As a result, business users lack the agility to easily customize fields or workflows to match their unique business processes. Ultimately, the packages that were intended to accelerate business ended up inhibiting it.

Informatica Cloud’s Vibe Integration Packages (or VIPs) have made the promise of the integration package as a business accelerator into a reality, for all business users. Unlike sample code, VIPs are sophisticated templates that encapsulate the intelligence, or logic, of how you integrate the data between apps. While VIPs abstract complexity to give users out-of-the-box integration, their pre-built mappings also provide great flexibility. Now, with just a few keystrokes, business users can map custom fields or fold their unique business model into their Salesforce workflows.

A few paragraphs back I began this discussion with the recent acquisition of RelateIQ by Salesforce. While we can make an educated guess as to what that will bring in the future, no one knows for sure. What we do know is that, at present, Salesforce’s completeness as a platform – and as a source of meaningful analytics – requires something in addition to run your relevant business applications and solutions through it. A complete iPaaS (Integration Platform as a Service) solution such as Informatica Cloud has the power to turn that vision into reality. Informatica enables this through its metadata-rich platform for data discovery, industry-leading data and application integration capabilities, and business accelerators that put the power back in the hands of citizen application integrators and business users.

Join our webinar August 14 to learn more: Informatica Cloud Summer 2014: The Most Complete iPaaS


How Parallel Data Loading and Amazon Redshift Redefine Data Warehousing Performance

As Informatica Cloud product managers, we spend a lot of our time thinking about things like relational databases. Recently, we’ve been considering their limitations and, specifically, how difficult and expensive it is to provision an on-premise data warehouse to handle the petabytes of fluid data generated by cloud applications and social media. As a result, companies often have to make tradeoffs and decide which data is worth putting into their data warehouse.

Certainly, relational databases have enormous value. They’ve been around for several decades and have served as a bulwark for storing and analyzing structured data. Without them, we wouldn’t be able to extract and store data from on-premise CRM, ERP and HR applications and push it downstream for BI applications to consume.

With the advent of cloud applications and social media however, we are now faced with managing a daily barrage of massive amounts of rapidly changing data, as well as the complexities of analyzing it within the same context as data from on-premise applications. Add to that the stream of data coming from Big Data sources such as Hadoop which then needs to be organized into a structured format so that various correlation analyses can be run by BI applications – and you can begin to understand the enormity of the problem.

Up until now, the only solution has been to throw development resources at legacy on-premise databases, and hope for the best. But given the cost and complexity, this is clearly not a sustainable long-term strategy.

As an alternative, Amazon Redshift, a petabyte-scale data warehouse service in the cloud, has the right combination of performance and capabilities to handle the demands of social media and cloud app data, without the additional complexity or expense. Its Massively Parallel Processing (MPP) architecture allows for the lightning-fast loading and querying of data. It also features a larger block size, which reduces the number of I/O requests needed to load data and leads to better performance.

By combining Informatica Cloud with Amazon Redshift’s parallel loading architecture, you can make use of push-down optimization algorithms, which process data transformations in the most optimal source or target database engines. Informatica Cloud also offers native connectivity to cloud and social media apps, such as Salesforce, NetSuite, Workday, LinkedIn, and Twitter, to name a few, which makes it easy to funnel data from these apps into your Amazon Redshift cluster at faster speeds.
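To make the parallel-loading point concrete, here is a rough sketch of the pattern Redshift's MPP architecture rewards: split the input into multiple files (commonly a multiple of the cluster's slice count) under one S3 prefix, then issue a single COPY against that prefix so the slices load the parts in parallel. The bucket name, table name and IAM role ARN below are placeholders, and the generated SQL is a simplified COPY statement rather than a tuned production one:

```python
# Sketch: prepare data for a parallel Redshift load by splitting it into
# one chunk per slice, then generate a single COPY over the shared S3
# prefix that loads every part file in parallel.

def split_for_parallel_load(rows, num_slices):
    """Round-robin rows into one chunk per cluster slice."""
    chunks = [[] for _ in range(num_slices)]
    for i, row in enumerate(rows):
        chunks[i % num_slices].append(row)
    return chunks

def copy_statement(table, bucket, prefix):
    # One COPY over the common prefix; Redshift fans the part files
    # out across slices on its own.
    return (f"COPY {table} FROM 's3://{bucket}/{prefix}' "
            "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load' "
            "FORMAT AS CSV;")

rows = [f"record-{i}" for i in range(10)]
chunks = split_for_parallel_load(rows, num_slices=4)
sql = copy_statement("events", "my-bucket", "loads/events/part_")
```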

If you’re at the Amazon Web Services Summit today in New York City, then you heard our announcement that Informatica Cloud is offering a free 60-day trial for Amazon Redshift with no limitations on the number of rows, jobs, application endpoints, or scheduling. If you’d like to learn more, please visit our Redshift Trial page or go directly to the trial.


The Three Ingredients of Enterprise Information Management

Enterprise Information Management

There is no shortage of buzzwords that speak to the upside and downside of data: Big Data, Data as an Asset, the Internet of Things, Cloud Computing, One Version of the Truth, Data Breach, Black Hat Hacking, and so on. Clearly we are in the Information Age, as described by Alvin Toffler in The Third Wave. Yet most organizations are neither effectively dealing with the risks of a data-driven economy nor getting the full benefits of all that data. They are stuck in a fire-fighting mode where each information management opportunity or problem is a one-time event that is man-handled with heroic efforts. There is no repeatability. The organization doesn’t learn from prior lessons, and each business unit re-invents similar solutions. IT projects are typically late, over budget, and under-delivered. There is a way to break out of this rut.
