Tag Archives: Cloud Data Integration
At the recent Bosch Connected World conference in Berlin, Stefan Bungart, Software Leader Europe at GE, presented a very interesting keynote, “How Data Eats the World”—which I assume refers to Marc Andreessen’s statement that “software is eating the world”. One of the key points he addressed was that generating actionable insight from Big Data, securely, in real time, at every level from local to global, and at industrial scale will be the key to survival. Companies that do not invest in data now will eventually end up like the consumer companies that missed the Internet: it will be too late.
As software and the value of data become a larger part of the business value chain, the lines between industries blur, or as GE’s Chairman and CEO Jeff Immelt once stated: “If you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company.” This is true not only for industrial companies, but for many companies that produce “things”: cars, jet engines, boats, trains, lawnmowers, toothbrushes, nut-runners, computers, network equipment, and so on. GE, Bosch, Technicolor and Cisco are just a few of the industrial companies that offer an Internet of Things (IoT) platform, and by doing so they enter the domain of companies such as Amazon (AWS) and Google. As Google and Apple move into new areas such as manufacturing cars and watches and offering insurance, the industry lines blur further and service becomes the key differentiator. The best service offerings will be contingent upon the best analytics, and the best analytics require a complete and reliable data platform. Only companies that can leverage data will be able to compete and thrive in the future.
The idea of this “servitization” is that instead of selling assets, companies offer a service that utilizes those assets. For example, Siemens offers hospitals a body-scanning service instead of selling them the MRI scanner, and Philips sells lighting services to cities and large companies, not light bulbs. These business models push suppliers to minimize disruption and repairs, because downtime now costs them money. It also becomes attractive to put as much device functionality as possible into software, so that upgrades or adjustments can be made without replacing physical components. All of this is made possible by the fact that the devices are connected, generate data, and can be monitored and managed remotely. The data is used to analyse functionality, power consumption and usage, but can also be utilised to predict malfunctions, plan proactive maintenance, and so on.
So what impact does this have on data and on IT? First of all, the volumes are immense. Whereas the total global volume of, for example, Twitter messages is around 150GB, ONE gas turbine with around 200 sensors generates close to 600GB per day! Yet according to IDC, only 3% of potentially useful data is tagged and less than 1% is currently analysed. Secondly, the structure of the data is not always straightforward: even similar devices produce different content (messages), because they can be on different software levels. This has an impact on the backend processing and on the reliability of any analysis of the data.
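A quick back-of-the-envelope check makes the scale gap concrete (the figures come from the paragraph above; the per-sensor split is only an illustrative average):

```python
# Rough comparison of the data volumes cited above.
# Figures are from the text; the per-sensor rate is an illustrative average.

TWITTER_GB = 150          # cited global volume of Twitter messages
TURBINE_GB_PER_DAY = 600  # one gas turbine, per day
SENSORS = 200             # sensors on that turbine

gb_per_sensor_per_day = TURBINE_GB_PER_DAY / SENSORS   # 3.0 GB per sensor per day
turbine_vs_twitter = TURBINE_GB_PER_DAY / TWITTER_GB   # 4.0x

print(f"{gb_per_sensor_per_day} GB per sensor per day")
print(f"One turbine: {turbine_vs_twitter:.0f}x the cited Twitter volume, every day")
```

In other words, a single machine out-produces a global social network by a factor of four, which is why tagging and analysing only 1 to 3 percent of such data leaves so much insight on the table.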
The data also often needs to be put into context with other master data, such as locations or customers, for real-time decision making. This is a non-trivial task. Next, governance is an aspect that needs top-level support. Questions like: Who owns the data? Who may see or use the data? What data needs to be kept or archived, and for how long? These need to be answered and governed in IoT projects with the same priority as the data in more traditional applications.
To summarize, managing data and mastering data governance is becoming one of the most important pillars for companies that want to lead in the digital age. Companies that fail to do so risk becoming the next Blockbuster or Kodak: companies that didn’t adapt quickly enough. To avoid this, companies need to evaluate a data platform that can support a comprehensive data strategy encompassing scalability, quality, governance, security, ease of use and flexibility, and that enables them to choose the most appropriate data processing infrastructure, whether that is on premises, in the cloud, or, most likely, a hybrid combination of the two.
March 20th 2015 was the official start of spring and to be honest, it couldn’t have come soon enough for us folks in the North East. After a long, cold and snowy winter we’re looking forward to the spring thaw and the first green shoots of burgeoning life. Spring is also the time that we like to tackle new projects and start afresh after our winter hibernation.
For those of us in technology, new spring projects often reflect the things we do in everyday life. Naturally, our minds turn at this time to spring cleaning and spring training. To be honest, we’d have to admit that we haven’t scrubbed our data in three months, so data cleansing is a must, but so too is training. We probably haven’t picked up a book or attended a seminar since last November. But what training should we do? And what should we do next?
Luckily, Informatica is providing the answer. We’ve put together two free, half day training seminars for cloud application owners and Salesforce practitioners. That’s two dates, two fantastic locations and dozens of brilliant speakers lined up to give you some new pointers for what’s coming next in the world of cloud and SaaS.
The goals of the event are to give you the tools and knowledge to strengthen your Salesforce implementation and help you delight your customers. The sessions will include presentations by experts from Salesforce and our partner Bluewolf. There will also be some best practices presentations and demonstrations from Informatica’s team of very talented engineers.
Just glance at the seminar summary and you’ll see what we mean:
Session 1: Understand the Opportunity of Every Customer Interaction
In this session Eric Berridge, Co-founder and CEO of Bluewolf Inc. will discuss how you can develop a customer obsessed culture and get the most value from every customer interaction.
Session 2: Delight Your Customers by Taking Your Salesforce Implementation to the Next Level
Ajay Gandhi, Informatica’s VP of Product Marketing, is next up, and he’s going to provide a fabulous session on what to look out for and where to invest as your Salesforce footprint grows.
Session 3: Anticipate Your Business Needs With a Fresh Approach to Customer Analytics
The seminar wraps up with Benjamin Pruden, Sr. Manager, Product Marketing at Salesforce. Ben’s exciting session touches on one of the hottest topics in the industry today. He’s going to explain how you can obtain a comprehensive understanding of your most valuable customers with cloud analytics and data-driven dashboards.
I’m sure you’ll agree that it’s a pretty impressive seminar and well worth a couple of hours of your time.
The New York event is happening at Convene (810 Seventh Ave, between 52nd and 53rd) on April 7th. Click here for more details and to reserve your seat.
The San Francisco event is a week later on April 14th at Hotel Nikko (222 Mason Street). Make sure you click here and register today.
Come join us on the 7th or the 14th to learn how to take your cloud business to the next level. Oh, and don’t forget that you’ll also be treating yourself to some well-deserved spring training!
Informatica, over the last two years, successfully transformed from running 80% of its application portfolio on premises to 80% in the cloud. Success was based on two key criteria:
- Ensuring the SaaS-based processes are integrated with no disruption
- Ensuring data in the cloud continues to be available and accessible for analytics
With industry analysts predicting that the majority of new application deployments will be SaaS-based by 2017, having connected data is not negotiable. It is a must-have. Most SaaS applications let businesses keep processes integrated, using connected and shared data, through application programming interfaces (APIs).
If you are a consumer of SaaS applications, you probably know the importance of having clean, connected and secure data from the cloud. The promise of SaaS is improved agility. When data is not easily accessible, that promise is broken. With the plethora of options available in the SaaS ecosystem and marketplace, not having clean, connected and safe data is a compelling event for switching SaaS vendors.
If you are in the SaaS application development industry, you probably know that building these APIs and connectors is a critical requirement for success. However, how do you decide which applications you should build connectors for when the ecosystem keeps changing? Investment in developing connectors and interfaces consumes resources and competes with developing competitive and differentiating features.
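One way off that treadmill is a shared connector abstraction: each endpoint is wrapped once, and any pair of endpoints can then be integrated generically. The sketch below illustrates the idea in Python; all names here are hypothetical and not an actual Informatica API.

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """Hypothetical connector contract: wrap each SaaS endpoint once,
    instead of hand-building an interface for every vendor pair."""

    @abstractmethod
    def read(self):
        """Return all records as a list of dicts."""

    @abstractmethod
    def write(self, records):
        """Store the given records; return how many were written."""

class InMemoryConnector(Connector):
    """Toy stand-in for a real SaaS endpoint (CRM, warehouse, ...)."""

    def __init__(self):
        self.store = []

    def read(self):
        return list(self.store)

    def write(self, records):
        self.store.extend(records)
        return len(records)

def sync(source, target):
    """One generic routine now covers any source/target combination."""
    return target.write(source.read())

crm, warehouse = InMemoryConnector(), InMemoryConnector()
crm.write([{"id": 1, "name": "Acme"}])
moved = sync(crm, warehouse)
print(moved, warehouse.read())  # 1 [{'id': 1, 'name': 'Acme'}]
```

With N connectors, the same `sync` covers all N x N pairings, which is why a standard connector library frees a SaaS vendor’s engineers to work on differentiating features instead.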
This week, Informatica launched its inaugural DataMania event in San Francisco, where the leading topic was SaaS application and data integration. Speakers from AWS, Adobe, AppDynamics, Dun & Bradstreet, and Marketo – to name a few – contributed to the discussion and confirmed that we are entering the era of the Data Ready Enterprise. Also during the event, Informatica announced the Connect-a-thon, a hackathon-like event where SaaS vendors can get connected to hundreds of cloud and on-premises apps.
Without a doubt, transitioning to a cloud and SaaS-based application architecture can only be successful if the applications are easily connectable with shared data. Here at Informatica, this was absolutely the case. Whether you build SaaS applications or consume them, consider the benefits of using a standard library of connectors, such as the one Informatica Cloud offers, so you can focus your time and energy on innovation and the more strategic parts of your business.
Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning
Are you ready to answer “Yes” to the questions:
a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”
I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.
The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20 with the theme “Make Data Work”. I want to focus this blog post on two topics related to making data work and business insights:
- How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
- How you can make data work in new and advanced ways for every user in your company.
Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.
“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”
Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects. The fully managed cloud service for building predictive analytics solutions from Azure, combined with the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud, helps overcome the challenges most users face in making their data work effectively and efficiently for analytics use cases.
The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.
The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.
Success in using machine learning requires not only great analytics models, but also end-to-end cloud integration and data management capabilities: the ability to bring in a wide breadth of data sources, assurance that data quality and data views match the requirements of machine learning modeling, and an ease of use that speeds iteration while providing high-performance, scalable data processing.
The Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach, supporting advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.
Using the Informatica Cloud connector for Azure Storage with Informatica Cloud Data Integration enables optimized reads and writes of data to blobs in Azure Storage. Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry-leading cloud integration solution.
As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.
My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!
To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.
As reviewed by Loraine Lawson, a MeriTalk survey about cloud adoption found that “In the latest survey of 150 federal executives, nearly one in five say one-quarter of their IT services are fully or partially delivered via the cloud.”
For the most part, the shifts are more tactical in nature. These federal managers are shifting email (50 percent), web hosting (45 percent) and servers/storage (43 percent). Most interesting is that they’re not moving traditional business applications, custom business apps, or middleware. Why? Data, and data integration issues.
“Federal agencies are worried about what happens to data in the cloud, assuming they can get it there in the first place:
- 58 percent of executives fret about cloud-to-legacy system integration as a barrier.
- 57 percent are worried about migration challenges, suggesting they’re not sure the data can be moved at all.
- 54 percent are concerned about data portability once the data is in the cloud.
- 53 percent are worried about ‘contract lock-in.’ ”
The reality is that the government does not get much out of the move to the cloud without committing core business applications, and thus core data. While e-mail, Web hosting, and some storage are a good start, the real cloud computing money is saved by moving away from expensive hardware and software. Failing to do that, you fail to find the value and, in this case, spend more taxpayer dollars than you should.
Data issues are not just a concern in government. Most larger enterprises have the same issues as well. However, a few are able to get around them with good planning approaches and the right data management and data integration technology. It’s just a matter of making the initial leap, which most federal IT executives are unwilling to do.
In working with CIOs of federal agencies over the last few years, the larger issue is that of funding. While everyone understands that moving to cloud-based systems will save money, getting there means hiring government integrators and living with redundant systems for a time. That involves some major money. If most of the existing budget goes to existing IT operations, then the move may not be practical. Thus, funds should be made available for the cloud projects with the greatest potential to reduce spending and increase efficiencies.
The shame of this situation is that the government was pretty much on the leading edge of cloud computing back in 2008 and 2009. The CIO of the US Government, Vivek Kundra, promoted the use of cloud computing, and NIST drove the initial definitions of “The Cloud,” including IaaS, SaaS, and PaaS. But when it came down to making the leap, most agencies balked, citing issues with data.
Now that the technology has evolved even further, there is really no excuse for the government to delay migration to cloud-based platforms. The clouds are ready, and the data integration tools have cloud integration capabilities baked in. It’s time to see some more progress.
It’s true. Data integration is a whole new game compared to five years ago, or, in some organizations, five minutes ago. The right approaches to data integration continue to evolve around a few principal forces: first, the growth of cloud computing, as pointed out by Stafford; second, the growing use of big data systems; and third, the emerging use of data as a strategic asset for the business.
These forces combine to drive us to the understanding that old approaches to data integration won’t provide the value that they once did. As someone who was a CTO of three different data integration companies, I’ve seen these patterns change over the time that I was building technology, and that change has accelerated in the last 7 years.
The core opportunities lie with enterprise architects and their ability to drive an understanding of the value of data integration, as well as to drive change within their organizations. After all, they, or the enterprise’s CTOs and CIOs (whoever makes decisions about technological approaches), are supposed to steer the organization in the technical directions that will best support the business. While most enterprise architects follow the latest hype, such as cloud computing and big data, many have missed the underlying data integration strategies and technologies that will support these changes.
“The integration challenges of cloud adoption alone give architects and developers a once in a lifetime opportunity to retool their skillsets for a long-term, successful career, according to both analysts. With the right skills, they’ll be valued leaders as businesses transition from traditional application architectures, deployment methodologies and sourcing arrangements.”
The problem is that, while most agree that data integration is important, they typically don’t understand what it is or the value it can bring. These days, many developers live in a world of instant updates. With emerging DevOps approaches and infrastructure, they really don’t get the need, or the mechanisms, required to share data between application or database silos. In many instances, they resort to hand-coding interfaces between source and target systems. This leads to brittle and unreliable integration solutions, and thus hurts rather than helps new cloud application and big data deployments.
The message is clear: Those charged with defining technology strategies within enterprises need to also focus on data integration approaches, methods, patterns, and technologies. Failing to do so means that the investments made in new and emerging technology, such as cloud computing and big data, will fail to provide the anticipated value. At the same time, enterprise architects need to be empowered to make such changes. Most enterprises are behind on this effort. Now it’s time to get to work.
I think this new capability, Salesforce Lightning Connect, is an innovative development and gives OData, an OASIS standard, a leg up on its W3C-defined competitor, Linked Data. OData is a REST-based protocol that provides access to data over the web. Its fundamental data model is relational, and its query language closely resembles what is possible with stripped-down SQL. This is much more familiar to most people than the RDF-based model used by Linked Data or its SPARQL query language.
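To make the SQL comparison concrete, here is a small sketch that assembles an OData request URL. The `$filter`, `$select`, `$orderby` and `$top` system query options are standard OData; the service root and entity set are made-up examples.

```python
from urllib.parse import urlencode

# Hypothetical OData service root; "Accounts" is an example entity set.
SERVICE_ROOT = "https://example.com/odata/Accounts"

# Roughly equivalent to:
#   SELECT Name, Revenue FROM Accounts
#   WHERE Revenue > 1000000 ORDER BY Name LIMIT 10
options = {
    "$filter": "Revenue gt 1000000",
    "$select": "Name,Revenue",
    "$orderby": "Name",
    "$top": "10",
}
url = SERVICE_ROOT + "?" + urlencode(options)
print(url)  # an HTTP GET on this URL returns the matching rows
```

A consumer such as Lightning Connect issues ordinary HTTP GETs like this one and renders the response, which is what makes OData endpoints so approachable for relational-minded developers.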
Standardization of OData has been going on for years (they are working on version 4), but it has suffered from a bit of a chicken-egg problem. Applications haven’t put a large priority on supporting the consumption of OData because there haven’t been enough OData providers, and data providers haven’t prioritized making their data available through OData because there haven’t been enough consumers. With Salesforce, a cloud leader declaring that they will consume OData, the equation changes significantly.
But these things take time – what does a user of Salesforce (or any other OData consumer) do if most of their data sources cannot be accessed as OData providers? It is the old last-mile problem faced by any communications or integration technology. It is fine to standardize, but how do you get all the existing endpoints to conform to the standard? You need someone to do the labor-intensive work of converting lots of endpoints to the standard representation.
Informatica has been in the last-mile business for years. As it happens, the canonical model that we always used has been a relational model that lines up very well with the model used by OData. For us to host an OData provider for any of the data sources that we already support, we only needed to do one conversion from the internal format that we’ve always used to the OData standard. This OData provider capability will be available soon.
But there is also the firewall issue. The consumer of the OData has to be able to access the OData provider. So, if you want Salesforce to be able to show data from your Oracle database, you would have to open up a hole in your firewall that provides access to your database. Not many people are interested in doing that – for good reason.
Informatica Cloud’s Vibe secure agent architecture is a solution to the firewall issue that will also work with the new OData provider. The OData provider will be hosted on Informatica’s Cloud servers, but will have access to any installed secure agents. Agents require a one-time on-premises install, but are thereafter managed from the cloud and are automatically kept up to date with the latest version by Informatica. An agent doesn’t require a port to be opened; instead, it opens an outbound connection to the Informatica Cloud servers through which all communication occurs. The agent then has access to any on-premises applications or data sources.
OData is especially well suited to reading external data. There are, however, better ways to create or update external data. One limitation is that Lightning Connect only handles reads. But even where writes are supported, it usually isn’t appropriate to add data to an application by just inserting records into tables: a collection of related information usually must be provided for the update to make sense. To facilitate this, applications provide APIs that offer a higher level of abstraction for updates. Informatica Cloud Application Integration can be used today to read or write data to external applications from within Salesforce through the use of guides that can be displayed from any Salesforce screen. Guides make it easy to generate a friendly user interface that shows exactly the data you want your users to see and to guide them through the collection of new or updated data that needs to be written back to your app.
As covered in Loraine Lawson’s blog, MeriTalk surveyed federal government IT professionals about their use of cloud computing. As it turns out, “89 percent out of 153 surveyed expressed ‘some apprehension about losing control of their IT services,’ according to MeriTalk.”
Loraine and I agree that what the survey says about the government’s data integration, management, and governance, is that they don’t seem to be very good at cloud data management…yet. Some of the other gruesome details include:
- 61 percent do not have quality, documented metadata.
- 52 percent do not have well understood data integration processes.
- 50 percent have not identified data owners.
- 49 percent do not have known systems of record.
“Overall, respondents did not express confidence about the success of their data governance and management efforts, with 41 percent saying their data integration management efforts were some degree of ‘not successful.’ This led MeriTalk to conclude, ‘Data integration and remediation need work.’”
The problem with the government is that data integration, data governance, data management, and even data security have not been priorities. The government has a huge amount of data to manage, and they have not taken the necessary steps to adopt the best practices and technology that would allow them to manage it properly.
Now that everyone is moving to the cloud, the government included, questions are popping up about the proper way to manage data within the government, from the traditional government enterprises to the public cloud. Clearly, there is much work to be done to get the government ready for the cloud, or even ready for emerging best practices around data management and data integration.
If the government is to move in the right direction, it must first come to terms with its data. This means understanding where the data is, what it does, who owns it, its access mechanisms, security, governance, etc., and applying this understanding holistically to most of the data under management.
The problem within the government is that the data is so complex, distributed, and, in many cases, unique, that it’s difficult for the government to keep good track of it. Moreover, the way the government does procurement, typically in silos, leads to a much larger data integration problem. I have worked with government agencies that had over 5,000 siloed systems, each with its own database or databases, most of which do not leverage data integration technology to exchange data.
There are ad-hoc data integration approaches and some technology in place, but nowhere close to what’s needed to support the amount and complexity of the data. Now that government agencies are looking to move to the cloud, the issues around data management are beginning to be better understood.
So, what’s the government to do? This is a huge issue that can’t be fixed overnight; it will take incremental changes over the next several years. It also means allocating more resources to data management and data integration than in the past, and moving them much higher up the priority list.
These are not insurmountable problems. However, they require a great deal of focus before things will get better. The movement to the cloud seems to be providing that focus.
Informatica Cloud Summer ’14 Release Breaks Down Barriers with Unified Data Integration and Application Integration for Real Time and Bulk Patterns
This past week, Informatica Cloud marked an important milestone with the Summer 2014 release of the Informatica Cloud platform. This was the 20th Cloud release, and I am extremely proud of what our team has accomplished.
“SDL’s vision is to help our customers use data insights to create meaningful experiences, regardless of where or how the engagement occurs. It’s multilingual, multichannel and on a global scale. Being able to deliver the right information at the right time to the right customer with Informatica Cloud Summer 2014 is critical to our business and will continue to set us apart from our competition.”
– Paul Harris, Global Business Applications Director, SDL Pic
When I joined Informatica Cloud, I knew that it had the broadest cloud integration portfolio in the marketplace: leading data integration and analytic capabilities for bulk integration, comprehensive cloud master data management and test data management, and over a hundred connectors for cloud apps, enterprise systems and legacy data sources, all delivered in a self-service design with point-and-click wizards for citizen integrators, without the need for complex and costly manual custom coding.
But I also learned that our broad portfolio conceals another structural advantage: because of Informatica Cloud’s unique, unified platform architecture, it can surface application (real-time) integration capabilities alongside its data integration capabilities, with shared metadata across real-time and batch workflows.
With the Summer 2014 release, we’ve brought our application integration capabilities to the forefront. We now provide the most complete cloud app integration capability in the marketplace. With a design environment that’s meant not just for developers but also for line-of-business IT, app admins can now build real-time process workflows that cut across on-premises and cloud systems and include built-in human workflows. And with the capability to translate these process workflows instantly into mobile apps for iPhone and Android devices, we’re not just setting ourselves apart but also giving customers the unique capabilities they need for their increasingly mobile employees.
“Schneider’s strategic initiative to improve front-office performance relied on recording and measuring sales person engagement in real time on any mobile device or desktop. The enhanced real time cloud application integration features of Informatica Cloud Summer 2014 makes it all possible and was key to the success of a highly visible and transformative initiative.”
– Mark Nardella, Global Sales Process Director, Schneider Electric SE
With this release, we’re also giving customers the ability to create data-sharing workflows that mix and match batch and real-time integration patterns. This is really important, because unlike the past, when you had to choose between batch and real time, in today’s world of on-premises, cloud-based, transactional and social data you increasingly have to deal with both real-time interactions and the processing of large volumes of data. For example, let’s consider a typical scenario at a high-end retail store. Using a clienteling iPad app, the sales rep looks up bulk purchase history and inventory availability data in SAP, confirms availability and delivery date, and then processes the customer’s order via real-time integration with NetSuite. And if you ask any customer, having a single workflow to unify all of that into instant and actionable insights is a huge advantage.
“Our industry demands absolute efficiency, speed and trust when dealing with financial information, and the new cloud application integration feature in the latest release of Informatica Cloud will help us service our customers more effectively by delivering the data they require in a timely fashion, keeping call times to a minimum and improving customer satisfaction in real time.”
– Kimberly Jansen, Director CRM, Misys PLC
We’ve also included some exciting new Vibe Integration Packages, or VIPs. VIPs deliver pre-built business process mappings between front-office and back-office applications. The Summer 2014 release includes new bidirectional VIPs for Siebel to Salesforce and SAP to Salesforce that make it easier for customers to connect Salesforce with these mission-critical business applications.
And last, but not least, the release includes a critical upgrade to our API framework that gives the Informatica Cloud iPaaS end-to-end support for connectivity to any company’s internal or external APIs. With the newly available API creation, definition and consumption patterns, developers and citizen integrators can now easily expose integrations as APIs, and users can consume them via integration workflows or apps, without the need for additional custom code.
The features and capabilities released this summer are available to all existing Informatica Cloud customers, and everyone else through our free 30-day trial offer.
I spent last week at the annual SIIA All About the Cloud Conference in San Francisco. The Software & Information Industry Association (SIIA) “is the principal trade association for the software and digital content industries.” It is also the organization behind the annual CODiE Awards, “the industry’s only peer-review awards program,” which acknowledges innovation and excellence in products and services across the business software, digital content and education technology industries.
I’m pleased to be able to say that Informatica Cloud has been recognized this year as the “Best Cloud Management Solution.”