
The Achilles Heel Of Cloud Computing – Data Integration

Loraine Lawson did a great job covering the integration challenges around cloud computing and virtualization. She reports that “…a recent Internet Evolution column [by David Vellante] looks more broadly at the cloud integration question and concludes that insufficient integration is holding up both cloud computing and virtualization.”

In fact, what currently limits the number of cloud deployments is the lack of a clear understanding of data integration in the context of cloud computing. This is a rather easy problem to solve, but it’s often an afterthought.

The core issue is that cloud computing providers, other than Salesforce.com, don’t consider integration. Perhaps they are thinking, “If you use our cloud, then there is no reason to sync your data back to your enterprise. After all, we’re the final destination for your enterprise data, right?” Wrong.

Cloud computing is a great way to go in many instances, but if you think you don’t have to provide core data integration from the cloud computing platforms to the on-premise systems, you have another thing coming.

Indeed, there are many reasons to provide data integration within cloud computing problem domains. The first and foremost is that you need to maintain an up-to-date copy of your enterprise data on-premise in case of trouble that could come in the form of cloud computing outages, clouds going out of business, or clouds purchased by companies that have no interest in staying in that business. There are many examples of these occurrences now, and they will only get worse in the future.

However, the primary purpose of data integration within the context of cloud computing is to assist in driving processes between systems, on-premise and cloud-delivered, and to manage data integration across those very different and geographically dispersed platforms. Those who only had to deal with systems talking intra-data center have some new challenges when they consider cloud computing.

These are the problem domains that require, dare I say it, data integration architecture and strategic technology. This means that you need to consider all of the source and target schemas, and how you’re going to securely and reliably move data between those points, accounting for the differences on the fly. Moreover, you need to consider master data management (MDM) issues, as well as security and data governance.
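To make this concrete, here is a minimal sketch of that kind of source-to-target movement. It assumes a hypothetical cloud CRM REST endpoint and an illustrative on-premise SQLite table; the URL, field names, and mapping are placeholders rather than any particular vendor’s API.

# Sketch: pull records from a hypothetical cloud CRM endpoint, map the
# source schema to the on-premise schema, and upsert the rows locally.
import sqlite3
import requests

CLOUD_API = "https://crm.example.com/api/v1/accounts"  # hypothetical source endpoint
FIELD_MAP = {                                          # source field -> target column
    "AccountName": "name",
    "BillingCity": "city",
    "LastModifiedDate": "updated_at",
}

def sync_accounts(db_path="onpremise.db"):
    """Copy cloud records into the on-premise store, translating schemas on the fly."""
    records = requests.get(CLOUD_API, timeout=30).json()

    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS accounts "
        "(id TEXT PRIMARY KEY, name TEXT, city TEXT, updated_at TEXT)"
    )
    for rec in records:
        row = {target: rec.get(source) for source, target in FIELD_MAP.items()}
        conn.execute(
            "INSERT OR REPLACE INTO accounts (id, name, city, updated_at) "
            "VALUES (?, ?, ?, ?)",
            (rec["Id"], row["name"], row["city"], row["updated_at"]),
        )
    conn.commit()
    conn.close()
    return len(records)

In a real deployment you would layer change capture, error handling, security, and data governance on top of this, but the schema-mapping step at the center is the piece that is most often underestimated.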

Here are a few words of advice:

First, consider the overall requirements of the business. Sounds obvious, but many who deploy cloud computing systems do not have a complete understanding of the overall business requirements.

Second, focus on the holistic architecture, on-premise and cloud-delivered, including how they will and should exchange data to support the core business.

Finally, select the right data integration technology for the job, and do so only after taking everything into account. You’ll find that there are both on-premise and on-demand options, and in many instances you may have to mix and match solutions.

You can’t do the cloud without data integration, and until we get data integration right, you can’t do the cloud. Pretty simple, if you think about it.


12 Responses to The Achilles Heel Of Cloud Computing – Data Integration

  1. Graham Perry says:

    Integration with external data also needs to be considered. B2B portals for example.

  2. Pingback: How Do You Handle Data Integration in the Cloud? | Yooxe

  3. Pingback: 3/18/2010 Update « CloudRoad

  4. David,

    With regard to data integration in any cloud (public or private), how does one address this problem without granular virtualization across heterogeneous data sources such that:

    1. data substrate is comprised of discrete data objects (items or entities)
    2. each data object is endowed with an unambiguous identifier applicable to its host network
    3. each data object identifier resolves to a structured representation of said object’s description (basically an Entity-Attribute-Value graph)
    4. data object collections are query-able via query languages capable of path-navigation-oriented expressions
    5. data access policies based on rules that also leverage an EAV substrate.

    What I describe above is achievable if HTTP is used as the data access middleware protocol. The entire process of de-referencing a data object’s structured description would simply come down to a conversation that terminates in the delivery of data object metadata in a form deliverable by the server and palatable to the requesting client.

    The Linked Data meme is fundamentally about the above, when applied to the age-old challenge of data access and integration. The same applies to Microsoft’s recent OData protocol (albeit with less fidelity).
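    As a rough illustration of that conversation (the URI and media types below are placeholders, not any specific product’s endpoint), de-referencing such an identifier with content negotiation could look like this:

    # Sketch: de-reference a data object identifier over HTTP and negotiate
    # a structured representation of its description.
    import requests

    OBJECT_URI = "https://data.example.com/resource/Person/42"  # hypothetical identifier

    def describe(uri, preferred="text/turtle"):
        """Ask the server for a structured description of the object behind `uri`."""
        # The client states what it prefers via the Accept header; the server
        # answers with a serialization it can actually deliver.
        response = requests.get(uri, headers={"Accept": preferred}, timeout=30)
        response.raise_for_status()
        return response.text  # e.g. a Turtle or JSON document describing the entity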

    What do you think?

    Regards,
    Kingsley

  5. David Linthicum says:

    Kingsley

    Sounds like a technology approach for a product, or one defined at a much lower level of abstraction than the one I’m referencing. For instance, the integration patterns I describe in my book take varying directions, but the core patterns are much the same. You may be making this a bit overly complex, and I would love to have you define the advantages as well as the approach. It’s not clear from this post.

    Kind regards.

    Dave

  6. David,

    Basically this is about data virtualization, where the granularity of the data objects transcends underlying infrastructure heterogeneity. In a nutshell: how we can leverage HTTP as the engine of data virtualization.

    Advantages of HTTP come down to its purity regarding separation of layers and its compatibility with RESTful client-server patterns (which ultimately provide the model for scalable distributed solutions riding the virtualized data substrate).

    URI abstraction combined with HTTP ensures the following are completely distinct:

    1. Data Storage/Persistence
    2. Datum Identification (via URIs for Identifiers)
    3. Data Representation (structured description via an Entity-Attribute-Value model, where serialization formats are negotiated between clients and servers)
    4. Messaging Protocol (includes rich request and response metadata)
    5. Data Presentation.

    The above implies that a single generic HTTP URI (not a URL) placed anywhere on a network ends up delivering a powerful conceptual representation when de-referenced (e.g. a description of a Person culled from eCRM, ERP, and other line-of-business applications, plus other HTTP-accessible data sources such as the Web).
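    A short sketch of that last point (the URI is illustrative; the media types are common Linked Data serializations), showing one generic identifier yielding several negotiated representations:

    import requests

    PERSON_URI = "https://data.example.com/resource/Person/42"  # an identifier, not a storage location

    for media_type in ("text/turtle", "application/json"):
        resp = requests.get(PERSON_URI, headers={"Accept": media_type}, timeout=30)
        # The identifier stays constant; only the negotiated serialization of the
        # description changes, keeping identification, representation, and
        # presentation as separate concerns.
        print(media_type, "->", resp.headers.get("Content-Type"))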

    Kingsley

  7. Pingback: How Do You Handle Data Integration in the Cloud? — JOSHUASCOTT.NET
