Tag Archives: Cloud Data Integration

Salesforce Lightning Connect and OData: What You Need to Know

Last month, Salesforce announced that they are democratizing integration through the introduction of Salesforce1 Lightning Connect. This new capability makes it possible to work with data that is stored outside of Salesforce using the same Force.com constructs (SOQL, Apex, Visualforce, etc.) that are used with Salesforce objects. The important caveat is that the external data has to be available through the OData protocol, and the provider of that protocol has to be accessible from the internet.
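To make that concrete, here is a rough sketch of what querying external data can look like once it is surfaced through Lightning Connect: OData entities are mapped to external objects (API names ending in __x), which can then be queried with ordinary SOQL. The object and field names, the credentials, and the use of the simple-salesforce Python library below are illustrative assumptions, not part of the announcement.

```python
# Illustrative sketch only: a hypothetical external object Invoice__x, mapped from
# an OData source via Lightning Connect, queried with the same SOQL used for
# native Salesforce objects. Credentials and field names are placeholders.
from simple_salesforce import Salesforce  # pip install simple-salesforce

sf = Salesforce(username="user@example.com",
                password="********",
                security_token="token")

# External objects carry the __x suffix; the query syntax is plain SOQL.
result = sf.query("SELECT Invoice_Number__c, Amount__c "
                  "FROM Invoice__x WHERE Amount__c > 1000")

for record in result["records"]:
    print(record["Invoice_Number__c"], record["Amount__c"])
```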

I think this new capability, Salesforce Lightning Connect, is an innovative development and gives OData, an OASIS standard, a leg up on its W3C-defined competitor, Linked Data. OData is a REST-based protocol that provides access to data over the web. The fundamental data model is relational, and the query language closely resembles a stripped-down SQL. This is much more familiar to most people than the RDF-based model used by Linked Data or its SPARQL query language.
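For readers who haven't seen OData in practice, the sketch below shows roughly how a consumer reads data from an OData service over plain HTTP. The service root URL and entity set are made up; the $select, $filter and $top system query options are standard OData and map closely onto a stripped-down SQL SELECT.

```python
# A rough illustration of how an OData consumer reads data over HTTP.
# The service root URL is hypothetical; the query options are standard OData.
import requests

service_root = "https://example.com/odata/v4"   # hypothetical OData provider

# Roughly: SELECT FirstName, LastName FROM Customers WHERE Country = 'US' LIMIT 10
resp = requests.get(
    f"{service_root}/Customers",
    params={
        "$select": "FirstName,LastName",
        "$filter": "Country eq 'US'",
        "$top": "10",
    },
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

# OData's JSON responses wrap the rows in a "value" array.
for customer in resp.json()["value"]:
    print(customer["FirstName"], customer["LastName"])
```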

Standardization of OData has been going on for years (work on version 4 is under way), but it has suffered from a bit of a chicken-and-egg problem. Applications haven’t put a high priority on supporting the consumption of OData because there haven’t been enough OData providers, and data providers haven’t prioritized making their data available through OData because there haven’t been enough consumers. With Salesforce, a cloud leader, declaring that it will consume OData, the equation changes significantly.

But these things take time. What does a user of Salesforce (or any other OData consumer) do if most of the data sources they have cannot be accessed through an OData provider? It is the old last-mile problem faced by any communications or integration technology: it is fine to standardize, but how do you get all the existing endpoints to conform to the standard? You need someone to do the labor-intensive work of converting lots of endpoints to the standard representation.

Informatica has been in the last-mile business for years. As it happens, the canonical model we have always used is a relational model that lines up very well with the model used by OData. For us to host an OData provider for any of the data sources we already support, we only needed to do one conversion, from the internal format we have always used to the OData standard. This OData provider capability will be available soon.

But there is also the firewall issue. The consumer of the OData feed has to be able to access the OData provider. So, if you want Salesforce to be able to show data from your Oracle database, you would have to open a hole in your firewall that provides access to that database. Not many people are interested in doing that – for good reason.

Informatica Cloud’s Vibe secure agent architecture is a solution to the firewall issue that will also work with the new OData provider. The OData provider will be hosted on Informatica’s Cloud servers but will have access to any installed secure agents. Agents require a one-time on-premise install, but are thereafter managed from the cloud and are automatically kept up to date with the latest version by Informatica. An agent doesn’t require a port to be opened; instead, it opens an outbound connection to the Informatica Cloud servers through which all communication occurs. The agent then has access to any on-premise applications or data sources.
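The outbound-connection idea is easiest to see in a toy example. The sketch below is not Informatica's actual agent protocol, just a minimal illustration of the pattern: the agent initiates all traffic from inside the firewall, so no inbound port ever needs to be opened.

```python
# A toy sketch of the outbound-connection pattern described above: the on-premise
# agent never listens on an inbound port; it polls the cloud service over an
# outbound HTTPS connection, executes any work it receives against local systems,
# and posts the results back. URLs and payload shapes are invented for illustration.
import time
import requests

CLOUD_URL = "https://cloud.example.com/agent"   # hypothetical cloud endpoint
AGENT_ID = "agent-123"

def fetch_local_rows(query):
    """Placeholder for running a query against an on-premise database."""
    return [{"id": 1, "name": "sample row"}]

while True:
    # Outbound poll: ask the cloud service whether any work is queued for this agent.
    task = requests.get(f"{CLOUD_URL}/tasks", params={"agent": AGENT_ID}, timeout=60).json()
    if task.get("query"):
        rows = fetch_local_rows(task["query"])
        # Results travel back over the same outbound channel; no inbound firewall hole.
        requests.post(f"{CLOUD_URL}/results", json={"task": task["id"], "rows": rows})
    else:
        time.sleep(5)   # nothing to do yet; poll again shortly
```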

OData is especially well suited to reading external data, but there are better ways to create or update external data. One issue is that Lightning Connect currently handles only reads; and even where writes are possible, it usually isn’t appropriate to add data to an application by simply inserting records into tables. A collection of related information usually has to be provided for the update to make sense. To facilitate this, applications expose APIs that offer a higher level of abstraction for updates. Informatica Cloud Application Integration can be used today to read or write data to external applications from within Salesforce through guides that can be displayed from any Salesforce screen. Guides make it easy to generate a friendly user interface that shows exactly the data you want your users to see and that walks them through collecting the new or updated data that needs to be written back to your app.
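As a hedged illustration of that point, compare inserting raw rows with calling an application-level API: one request carries the whole business object, and the application takes care of writing the related records consistently. The endpoint and payload below are invented for illustration and are not a specific Informatica or Salesforce API.

```python
# Illustrating the point about updates: rather than inserting rows into several
# tables directly, a caller sends one coherent payload to an application API that
# understands how the pieces relate. The endpoint and payload are hypothetical.
import requests

order = {
    "customer": {"externalId": "C-1001", "name": "Acme Corp"},
    "lines": [
        {"sku": "WIDGET-1", "quantity": 5, "unitPrice": 19.99},
        {"sku": "WIDGET-2", "quantity": 2, "unitPrice": 44.50},
    ],
    "requestedShipDate": "2014-12-01",
}

# One call carries the whole business object; the application validates it and
# writes the underlying records (header, lines, customer link) consistently.
resp = requests.post("https://erp.example.com/api/orders", json=order)
resp.raise_for_status()
print("Created order", resp.json().get("orderNumber"))
```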


Government Cloud Data Integration: Some Helpful Advice

Recently, a study found that government cloud data integration efforts have not been particularly effective. This post offers some helpful advice.

As covered in Loraine Lawson’s blog, MeriTalk surveyed federal government IT professionals about their use of cloud computing. As it turns out, “89 percent out of 153 surveyed expressed ‘some apprehension about losing control of their IT services,’ according to MeriTalk.”

Loraine and I agree on what the survey says about the government’s data integration, management, and governance: agencies don’t seem to be very good at cloud data management…yet. Some of the other gruesome details include:

  • 61 percent do not have quality, documented metadata.
  • 52 percent do not have well understood data integration processes.
  • 50 percent have not identified data owners.
  • 49 percent do not have known systems of record.

“Overall, respondents did not express confidence about the success of their data governance and management efforts, with 41 percent saying their data integration management efforts were some degree of ‘not successful.’ This lead MeriTalk to conclude, ‘Data integration and remediation need work.’”

The problem with the government is that data integration, data governance, data management, and even data security have not been priorities. The government has a huge amount of data to manage, and they have not taken the necessary steps to adopt the best practices and technology that would allow them to manage it properly.

Now that everyone is moving to the cloud, the government included, questions are popping up about the proper way to manage data within the government, from the traditional government enterprises to the public cloud. Clearly, there is much work to be done to get the government ready for the cloud, or even ready for emerging best practices around data management and data integration.

If the government is to move in the right direction, it must first come to terms with its data. This means understanding where the data is, what it does, who owns it, how it is accessed, secured, and governed, and applying this understanding holistically to most of the data under management.

The problem within the government is that the data is so complex, distributed, and, in many cases, unique that it’s difficult for the government to keep good track of it. Moreover, the way the government does procurement, typically in silos, leads to a much larger data integration problem. I have worked with government agencies that had over 5,000 siloed systems, each with its own database or databases, and most did not leverage data integration technology to exchange data.

There are ad-hoc data integration approaches and some technology in place, but nowhere close to what’s needed to support the amount and complexity of the data. Now that government agencies are looking to move to the cloud, the issues around data management are beginning to be better understood.

So, what’s the government to do? This is a huge issue that can’t be fixed overnight; the changes will have to come incrementally over the next several years. That also means allocating more resources to data management and data integration than have been allocated in the past, and moving them much higher up the priority list.

These are not insurmountable problems. However, they require a great deal of focus before things will get better. The movement to the cloud seems to be providing that focus.


Informatica Cloud Summer ’14 Release Breaks Down Barriers with Unified Data Integration and Application Integration for Real Time and Bulk Patterns

This past week, Informatica Cloud marked an important milestone with the Summer 2014 release of the Informatica Cloud platform. This was the 20th Cloud release, and I am extremely proud of what our team has accomplished.

“SDL’s vision is to help our customers use data insights to create meaningful experiences, regardless of where or how the engagement occurs. It’s multilingual, multichannel and on a global scale. Being able to deliver the right information at the right time to the right customer with Informatica Cloud Summer 2014 is critical to our business and will continue to set us apart from our competition.”

– Paul Harris, Global Business Applications Director, SDL Plc

When I joined Informatica Cloud, I knew that it had the broadest cloud integration portfolio in the marketplace: leading data integration and analytic capabilities for bulk integration, comprehensive cloud master data management and test data management, and over a hundred connectors for cloud apps, enterprise systems and legacy data sources – all delivered in a self-service design with point-and-click wizards for citizen integrators, without the need for complex and costly manual custom coding.

But I also learned that our broad portfolio conceals another structural advantage: because of Informatica Cloud’s unique, unified platform architecture, it can surface application (or real-time) integration capabilities alongside its data integration capabilities, with shared metadata across real-time and batch workflows.

With the Summer 2014 release, we’ve brought our application integration capabilities to the forefront. We now provide the most complete cloud app integration capability in the marketplace. With a design environment meant not just for developers but also for line-of-business IT, app admins can now build real-time process workflows that cut across on-premise and cloud systems and include built-in human workflows. And with the capability to translate these process workflows instantly into mobile apps for iPhone and Android devices, we’re not just setting ourselves apart but also giving customers the unique capabilities they need for their increasingly mobile employees.

Informatica Cloud Summer Release Webinar Replay

“Schneider’s strategic initiative to improve front-office performance relied on recording and measuring sales person engagement in real time on any mobile device or desktop. The enhanced real time cloud application integration features of Informatica Cloud Summer 2014 makes it all possible and was key to the success of a highly visible and transformative initiative.”

– Mark Nardella, Global Sales Process Director, Schneider Electric SE

With this release, we’re also giving customers the ability to create workflows around data sharing that mix and match batch and real-time integration patterns. This is really important, because unlike the past, when you had to choose between batch and real time, in today’s world of on-premise, cloud-based, transactional and social data you increasingly have to handle both real-time interactions and the processing of large volumes of data. For example, consider a typical scenario at a high-end retail store. Using a clienteling iPad app, the sales rep looks up bulk purchase history and inventory availability data in SAP, confirms availability and delivery date, and then processes the customer’s order via real-time integration with NetSuite. Ask any customer: having a single workflow to unify all of that into instant, actionable insights is a huge advantage.
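A highly simplified sketch of that clienteling scenario, purely for illustration: the purchase-history lookup reads from a store filled by a scheduled bulk load (the batch pattern), while the order is placed through a real-time call. Table names, endpoints and payloads below are all hypothetical, not actual SAP or NetSuite interfaces.

```python
# Mixed batch and real-time patterns in one flow: a batch-populated local store
# for history lookups, and a real-time HTTP call to place the order.
import sqlite3
import requests

def lookup_purchase_history(customer_id):
    """Read from a local store populated by a scheduled bulk sync (e.g., from SAP)."""
    conn = sqlite3.connect("warehouse.db")
    rows = conn.execute(
        "SELECT sku, quantity, purchased_on FROM purchase_history WHERE customer_id = ?",
        (customer_id,),
    ).fetchall()
    conn.close()
    return rows

def place_order(customer_id, sku, quantity):
    """Real-time call to the order system (e.g., NetSuite) at the moment of sale."""
    resp = requests.post(
        "https://orders.example.com/api/salesorders",   # hypothetical endpoint
        json={"customerId": customer_id, "sku": sku, "quantity": quantity},
    )
    resp.raise_for_status()
    return resp.json()

history = lookup_purchase_history("C-1001")          # batch-sourced data
confirmation = place_order("C-1001", "WIDGET-1", 3)  # real-time transaction
```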

“Our industry demands absolute efficiency, speed and trust when dealing with financial information, and the new cloud application integration feature in the latest release of Informatica Cloud will help us service our customers more effectively by delivering the data they require in a timely fashion. Keeping call-times to a minimum and improving customer satisfaction in real time.”

– Kimberly Jansen, Director CRM, Misys PLC

We’ve also included some exciting new Vibe Integration Packages, or VIPs. VIPs deliver pre-built business process mappings between front-office and back-office applications. The Summer 2014 release includes new bidirectional VIPs for Siebel to Salesforce and SAP to Salesforce that make it easier for customers to connect Salesforce with these mission-critical business applications.

And last, but not least, the release includes a critical upgrade to our API framework that gives the Informatica Cloud iPaaS end-to-end support for connectivity to any company’s internal or external APIs. With the newly available API creation, definition and consumption patterns, developers and citizen integrators can now easily expose integrations as APIs, and users can consume them via integration workflows or apps without the need for any additional custom code.
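To give a feel for the consumption side, here is a minimal, hypothetical example of invoking an integration that has been exposed as a REST API from a script. The URL, authentication scheme and payload are placeholders rather than the actual API framework's interface.

```python
# A sketch of the consumption side described above: an integration exposed as a
# REST API can be invoked from any app or workflow with a plain HTTP call.
import requests

API_URL = "https://apis.example.com/v1/sync-contacts"   # hypothetical exposed integration
TOKEN = "replace-with-a-real-token"

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"source": "salesforce", "target": "sap", "since": "2014-07-01"},
)
resp.raise_for_status()
print("Integration run accepted:", resp.json().get("runId"))
```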

The features and capabilities released this summer are available to all existing Informatica Cloud customers, and to everyone else through our free 30-day trial offer.


Informatica Cloud Wins Best Cloud Management CODiE Award

I spent last week at the annual SIIA All About the Cloud conference in San Francisco. The Software & Information Industry Association (SIIA) “is the principal trade association for the software and digital content industries.” It is also the organization behind the annual CODiE Awards, “the industry’s only peer-reviewed awards program,” which acknowledges innovation and excellence for products and services in the business software, digital content and education technology industries.

I’m pleased to be able to say that Informatica Cloud has been recognized this year as the “Best Cloud Management Solution.”



Cloud Integration Predictions in Review

As is the custom, before I dive into my predictions for next year, I thought I’d look back and see how I did with my 2010 cloud integration predictions. Last year I predicted that:

1) There would be an increased need for best-of-breed cloud data integration solutions. I also predicted that we’d see the emergence of online marketplaces that allow you to purchase prepackaged integration services.

Results: In Loraine Lawson’s recent data integration predictions post, she noted, “this year, integration seems to have shifted from a liability to an asset for the cloud.” Another publication referred to cloud integration as “the capability that will drive mainstream Cloud adoption.” Clearly, the importance of data integration to cloud computing success has been recognized. I believe 2010 was the year we crossed the chasm.

Bring on the tornado!

Adopting Cloud Applications: Evaluating TCO

Last week I sat down with Scott Geffre, IT Director at Informatica, to learn more about our internal implementation of salesforce.com and our approach to cloud integration. It’s a topic that will also be featured in a number of Informatica World 2010 sessions in November, but as part of this cloud integration best practices interview series, I wanted to share the discussion here. Scott is focused on managing CRM applications at Informatica and has been with the company for over five years. Here is part one of our discussion: evaluating SaaS/Cloud TCO.

Introducing Informatica Cloud 9 – The Defining Capability For Cloud Computing

Today we made an announcement called Informatica Cloud 9. This is the culmination of many years of hard work and effort, and it builds on the Informatica 9 announcement we made last week. So what is so special about Informatica Cloud 9? Is it the new Platform-as-a-Service offering? Or is it the new Cloud Services we delivered? Or is it the new capabilities on Amazon EC2? What are all these things and why are they important?

Let me explain:


Informatica On Amazon EC2 – Why?

Last week we made a significant announcement relating to the availability of our flagship product “PowerCenter” on the Amazon EC2 infrastructure. Why?

The answer lies in the profound advancements that are happening in the area of “cloud infrastructure.” For many years we have talked about running applications “on-premise” and “on-demand.” Companies like Salesforce.com have pioneered the delivery of applications on their own infrastructure, and we’ve become very familiar with the concept of software-as-a-service. So where has Amazon come from?


20 Years Old: The Web, Social Media And Our Reliance On “Data”

I was reading an article from CNET last week about the web. The title was “It was 20 years ago today: the web.” The crux of the article was to discuss the impact of that famous research report authored by Tim Berners-Lee called “Information Management: A Proposal.”

What amazes me is how far we have come…

Today the web is everywhere.  It has made the world flat – something that ancient civilizations believed, but something now being delivered electronically through the ether.  Combine that with the improvements in telecommunications and we have a world in which data is king.

It is the lifeblood of every organization and is pumped across networks, through supply chains and between organizations. Data rules; it is all-powerful; it is the currency of the 21st century; and it is the web that has released it to grow to its full potential.



Data Integration Platforms: Not Just “What”, but “How” – Part 2

As I discussed before, it’s not enough to walk through a functional checklist for a data integration platform.  It’s important to make sure that it works in the right way.   In my last posting, I discussed the concept of a “unified” platform and its implications for the user experience.

The second key aspect of how a data integration platform works is its openness—how much it is designed to work with the broader IT environment.  Data integration, by definition, touches a large portion of the IT environment, which could mean thousands of different applications and data sources in large organizations.  Moreover, it’s not just the systems inside the firewall you need to be concerned with.

In most cases, it’s important to also support integration with the systems of B2B partners such as customers, suppliers and distributors, as well as any SaaS partners. And the platform has to support any technology standards that have been instituted, such as those for operating systems or databases. Frankly, there’s not much use in a data integration platform that isn’t designed to work with as broad a range of applications and systems as possible.
