Category Archives: Cloud Computing

Bringing the “Local Experience” Online: Today’s Farm Store is Data Driven

Have you ever found yourself walking into a store to buy one thing, only to leave two hours later with enough items to fill two minivans? I certainly have. Now, imagine the same scenario, except this time you walk into a store to buy ranch supplies, like fencing materials or work boots, but end up leaving with an outdoor fire pit, fancy spurs, a pair of Toms shoes, a ski rack and a jar of pickled eggs. If you had no idea these products could be purchased at the same place, you clearly haven't been to North 40 Outfitters.

Established in the Northwestern United States, North 40 Outfitters, a family-owned and operated business, has been outfitting the hardworking and hard-playing populace of the region. Understanding the diverse needs of its hardworking customers, North 40 Outfitters carries everything from fencing for cattle and livestock to tools and trailers. They have gear for camping and hunting, even fly fishing.

Named after the Homestead Act of 1862, which holds strong significance in the region, North 40 Outfitters has built its heritage on community involvement and support of local small businesses. The company's 700 employees could be regarded as family: at this year's Thanksgiving, every employee was given a locally raised free-range turkey to bring home. And while North 40 Outfitters opened its doors on Black Friday, it eschewed the usual practice of opening as early as 3 AM in favor of a more reasonable 7 o'clock hour, offering patrons donuts and coffee from a local roaster.

North 40 Outfitters aims to be different, and it achieves that differentiation by being data driven. While the products they sell cannot all be sourced locally, the experience they deliver aims to be exactly that: local.

The Problem

Prior to operating under the name North 40 Outfitters, the company ran under the banner of "Big R", which it shared with several other members of the same buying group. The decision to change the name was the result of a move into the digital realm: they needed a name to distinguish themselves. Now, as North 40 Outfitters, they can focus on what matters rather than dealing with the confusion of a shared name. They can provide the "local store" experience while investing in a digital strategy that lets them do business online and bring the unique North 40 Outfitters experience and value nationwide.

Beneath those organizational changes lay an even greater challenge. With over 150,000 SKUs and no digital database for their product information, North 40 Outfitters had to find a solution and build everything from the ground up. Moreover, with customers demanding a way to buy products online, especially customers living in rural areas, it became clear that North 40 Outfitters would have to address its data concerns.

Along with the fresh rebrand and restructure, North 40 Outfitters needed to tame its product information, a critical step toward building a digital product database and launching an ecommerce store.

The Solution

North 40 Outfitters was clear about the outcome of the recent rebranding, and they knew that investments needed to be made if they were to add value to their existing business. Building the capability to take their business to new channels, ecommerce in this case, meant finding the best solution to start on the right foot. Consequently, wishing to become masters of their own data for both online and in-store uses, North 40 Outfitters determined that they needed a PIM application to act as a single repository for product information.

It's important to note that North 40 Outfitters' environment is not typical of traditional retailers. The difference lies in the wide variation of product types they sell. Some of their suppliers operate at local, boutique production scales, while others are large multinational distributors. Furthermore, a large portion of North 40 Outfitters' customers live in rural regions, in some cases a day's drive from the nearest store. With the ability to leverage both a PIM and an ecommerce solution, North 40 Outfitters is now a step closer to outfitting everyone in the Northwestern region.

Results

It is still very early to talk about results, since North 40 Outfitters has only recently entered the implementation phase. What can be said is that they are very excited. Having reclaimed their territory, and now equipped with a PIM solution and an ecommerce solution, they have all the right tools to till and plow the playing field.

The meaning of North 40 Outfitters

To the uninitiated, the name North 40 Outfitters might not mean much. However, there is a lot of local heritage and history behind the newly rebranded name. "North 40" is derived from the Homestead Act of 1862: the "north forty" refers to the northernmost block of a homesteader's property, and to this day the term holds significance for the local community. The second half of the brand, "Outfitters", reflects the company's focus on outfitting its customers both for work and play. On the one hand, you can visit North 40 Outfitters to purchase goods for running your ranch, such as fencing material, horse-related goods or quality tools. At the same time, you can buy camping and backpacking goods; they even sell ice fishing huts.

North 40 Outfitters ensures their customers have what they need to work the land, get the most back from it, and ultimately go out and play just as hard, if not harder.


Salesforce Lightning Connect and OData: What You Need to Know

Last month, Salesforce announced that they are democratizing integration through the introduction of Salesforce1 Lightning Connect. This new capability makes it possible to work with data that is stored outside of Salesforce using the same Force.com constructs (SOQL, Apex, Visualforce, etc.) that are used with Salesforce objects. The important caveat is that the external data has to be available through the OData protocol, and the provider of that protocol has to be accessible from the internet.

I think this new capability, Salesforce Lightning Connect, is an innovative development that gives OData, an OASIS standard, a leg up on its W3C-defined competitor, Linked Data. OData is a REST-based protocol that provides access to data over the web. Its fundamental data model is relational, and its query language closely resembles stripped-down SQL. This is much more familiar to most people than the RDF-based model used by Linked Data or its SPARQL query language.
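To make that comparison concrete, here is a minimal sketch of what an OData read looks like from Python. The service root URL and entity set below are placeholders, but the $select, $filter and $top query options are standard OData system query options.

```python
import requests

# Placeholder OData v4 service root; substitute your provider's actual URL.
SERVICE_ROOT = "https://example.com/odata/v4/ProductService"

# Standard OData system query options: project two properties, filter rows,
# and cap the result size, roughly "SELECT Name, Price ... WHERE Price < 20 LIMIT 10".
params = {
    "$select": "Name,Price",
    "$filter": "Price lt 20",
    "$top": "10",
}

response = requests.get(f"{SERVICE_ROOT}/Products", params=params,
                        headers={"Accept": "application/json"})
response.raise_for_status()

# OData JSON responses wrap the result set in a "value" array.
for product in response.json().get("value", []):
    print(product["Name"], product["Price"])
```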

Standardization of OData has been going on for years (version 4 is in the works), but it has suffered from a bit of a chicken-and-egg problem. Applications haven't put a high priority on consuming OData because there haven't been enough OData providers, and data providers haven't prioritized making their data available through OData because there haven't been enough consumers. With Salesforce, a cloud leader, declaring that it will consume OData, the equation changes significantly.

But these things take time. What does a Salesforce user (or the user of any other OData consumer) do if most of their data sources cannot be accessed as an OData provider? It is the old last-mile problem faced by any communications or integration technology: it is fine to standardize, but how do you get all the existing endpoints to conform to the standard? You need someone to do the labor-intensive work of converting lots of endpoints to the standard representation.

Informatica has been in the last-mile business for years. As it happens, the canonical model we have always used is a relational model that lines up very well with the model used by OData. For us to host an OData provider for any of the data sources we already support, we only need to do one conversion: from the internal format we have always used to the OData standard. This OData provider capability will be available soon.

But there is also the firewall issue. The consumer of the OData has to be able to access the OData provider. So, if you want Salesforce to be able to show data from your Oracle database, you would have to open up a hole in your firewall that provides access to your database. Not many people are interested in doing that – for good reason.

Informatica Cloud's Vibe secure agent architecture is a solution to the firewall issue that will also work with the new OData provider. The OData provider will be hosted on Informatica's Cloud servers, but will have access to any installed secure agents. Agents require a one-time install on-premise, but are thereafter managed from the cloud and automatically kept up to date with the latest version by Informatica. An agent doesn't require a port to be opened; instead, it opens an outbound connection to the Informatica Cloud servers through which all communication occurs. The agent then has access to any on-premise applications or data sources.

OData is especially well suited to reading external data. However, there are better ways to create or update external data. One limitation is that Lightning Connect only handles reads; but even if it handled writes, it usually isn't appropriate to add data to applications by simply inserting records into tables. Usually a collection of related information must be provided for the update to make sense. To facilitate this, applications expose APIs that offer a higher level of abstraction for updates. Informatica Cloud Application Integration can be used today to read or write data to external applications from within Salesforce through the use of guides that can be displayed from any Salesforce screen. Guides make it easy to generate a friendly user interface that shows exactly the data you want your users to see and to walk them through the collection of new or updated data that needs to be written back to your app.


Amazon re:Invent 2014 Recap: “Cloud has Become the New Normal”

It's amazing how fast a year goes by. Last year, Informatica Cloud exhibited at Amazon re:Invent for the very first time, showcasing our connector for Amazon Redshift. At the time, customers were simply kicking the tires on Amazon's newest cloud data warehousing service and trying to learn where Amazon Redshift might fit into their overall architecture. This year, it was clear that customers had adopted several AWS services and were truly "all-in" on the cloud. In the words of Andy Jassy, Senior VP of Amazon Web Services, "Cloud has become the new normal".

During the Day 1 keynote, Andy outlined several areas of growth across the AWS ecosystem, such as a 137% YoY increase in data transfer to and from Amazon S3 and a 99% YoY increase in Amazon EC2 instance usage. During the Day 2 keynote, Werner Vogels, CTO of Amazon, made the case that there has never been a better time to build apps on AWS because of all the enterprise-grade features. Several customers came on stage during both keynotes to demonstrate their use of AWS:

  • Major League Baseball’s Statcast application consumed 17PB of raw data
  • Philips Healthcare used over a petabyte a month
  • Intuit revealed their plan to move the rest of their applications to AWS over the next few years
  • Johnson & Johnson outlined their use of Amazon’s Virtual Private Cloud (VPC) and referred to their use of hybrid cloud as the “borderless datacenter”
  • Omnifone illustrated how AWS has the network bandwidth required to deliver their hi-res audio offerings
  • The Weather Company scaled AWS across 4 regions to deliver 15 billion forecast publications a day

Informatica was also mentioned on stage by Andy Jassy as one of the premier ISVs that had built solutions on top of the AWS platform. Indeed, from having one connector in the AWS ecosystem last year (for Amazon Redshift), Informatica has released native connectors for Amazon DynamoDB, Elastic MapReduce (EMR), S3, Kinesis, and RDS.

With so many customers using AWS, it becomes hard for them to track their usage at a granular level. This is especially true for enterprises, where a multitude of departments and business units use several AWS services. Informatica Cloud and Tableau developed a joint solution, showcased at the Amazon re:Invent Partner Theater, that lets an IT Operations individual drill down into several dimensions to answer questions about AWS usage and cost. IT Ops personnel can identify the relevant data points in their data model, such as availability zone, rate and usage type, to name a few, and use Amazon Redshift as the cloud data warehouse to aggregate this data. Informatica Cloud's Vibe Integration Packages, combined with its native connectivity to Amazon Redshift and S3, allow the data model to be reflected as the correct set of tables in Redshift. Tableau's robust visualization capabilities then allow users to drill down into the data model to extract whatever insights they require. Look for more to come from Informatica Cloud and Tableau on this joint solution in the upcoming weeks and months.
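As a rough, purely illustrative sketch of the kind of roll-up that drill-down depends on, the snippet below aggregates a few hypothetical AWS usage line items by availability zone and usage type with pandas. The column names and figures are invented for the example, not the schema or data used by the joint solution.

```python
import pandas as pd

# Hypothetical AWS usage line items; the column names and values are invented
# for illustration, not the actual detailed billing schema.
usage = pd.DataFrame([
    {"availability_zone": "us-east-1a", "usage_type": "BoxUsage:m3.large", "rate": 0.140, "usage_amount": 720},
    {"availability_zone": "us-east-1a", "usage_type": "DataTransfer-Out",  "rate": 0.090, "usage_amount": 510},
    {"availability_zone": "us-west-2b", "usage_type": "BoxUsage:m3.large", "rate": 0.140, "usage_amount": 360},
])

# Cost per line item, then a roll-up along the dimensions an IT Ops user
# would drill into (availability zone and usage type).
usage["cost"] = usage["rate"] * usage["usage_amount"]
summary = (usage.groupby(["availability_zone", "usage_type"], as_index=False)["cost"]
                .sum()
                .sort_values("cost", ascending=False))
print(summary)
```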


Amazon Web Services and Informatica Deliver Data-Ready Cloud Computing Infrastructure for Every Business

At re:Invent 2014 in Las Vegas,  Informatica and AWS announced a broad strategic partnership to deliver data-ready cloud computing infrastructure to any type or size of business.

Informatica's comprehensive portfolio across Informatica Cloud and PowerCenter solutions connects to multiple AWS Data Services, including Amazon Redshift, RDS, DynamoDB, S3, EMR and Kinesis – the broadest pre-built connectivity available to AWS Data Services. Informatica and AWS offerings are pre-integrated, enabling customers to rapidly and cost-effectively implement data warehousing, large-scale analytics, lift and shift, and other key use cases in cloud-first and hybrid IT environments. Now, any company can use Informatica's portfolio to get a plug-and-play on-ramp to the cloud with AWS.

Economical and Flexible Path to the Cloud

As business information needs intensify and data environments become more complex, the combination of AWS and Informatica enables organizations to increase the flexibility and reduce the costs of their information infrastructures through:

  • More cost-effective data warehousing and analytics – Customers benefit from lower costs and increased agility when unlocking the value of their data with no on-premise data warehousing/analytics environment to design, deploy and manage.
  • Broad, easy connectivity to AWS – Customers gain full flexibility in integrating data from any Informatica-supported data source (the broadest set of sources supported by any integration vendor) through the use of pre-built connectors for AWS.
  • Seamless hybrid integration – Hybrid integration scenarios across Informatica PowerCenter and Informatica Cloud data integration deployments are able to connect seamlessly to AWS services.
  • Comprehensive use case coverage – Informatica solutions for data integration and warehousing, data archiving, data streaming and big data across cloud and on-premise applications mesh with AWS solutions such as RDS, Redshift, Kinesis, S3, DynamoDB, EMR and other AWS ecosystem services to drive new and rapid value for customers.

New Support for AWS Services

Informatica introduced a number of new Informatica Cloud integrations with AWS services, including connectors for Amazon DynamoDB, Amazon Elastic MapReduce (Amazon EMR) and Amazon Simple Storage Service (Amazon S3), to complement the existing connectors for Amazon Redshift and Amazon Relational Database Service (Amazon RDS).

Additionally, the latest Informatica PowerCenter release for Amazon Elastic Compute Cloud (Amazon EC2) includes support for:

  • PowerCenter Standard Edition and Data Quality Standard Edition
  • Scaling options – Grid, high availability, pushdown optimization, partitioning
  • Connectivity to Amazon RDS and Amazon Redshift
  • Domain and repository databases hosted in Amazon RDS, for databases on the current PAM (Product Availability Matrix)

To learn more, try our 60-day free Informatica Cloud trial for Amazon Redshift.

If you're in Vegas, please come by our booth at re:Invent, Nov. 11-14: Booth #1031, Venetian/Sands Hall.


Informatica’s Response To the POODLE SSL v3 Vulnerability

The information security industry is an "arms race", with attacks always getting better. To that end, it's important that security controls and implementations be designed with flexibility and agility in mind. The SSL protocol was originally developed by Netscape Communications, the company that helped fuel the Internet generation. Over the past 20 years, we've seen advances in computing power and novel new attacks against SSL implementations. As new vulnerabilities were discovered, enhancements were proposed, and TLS was developed to replace SSL. The cycle continues: we're now at TLS 1.2, and with this "POODLE" vulnerability it's finally time to say goodbye forever to the SSL foundations that brought us here.

The recent announcement of POODLE should be another wake-up call to security practitioners and implementors that patches and updates are now the “New Normal” for infrastructure that handles sensitive data. Even foundational protocols like SSL require care and feeding, regular maintenance and updates. For Informatica customers, this note is our response to help you keep your software patched via important updates provided by our Global Customer Support organization.

1 – What you need to know

On October 14, 2014, Google security researchers released details of a vulnerability in the design of the SSL version 3 protocol. An attacker in a privileged position on a network could intercept encrypted traffic and methodically decrypt the messages to reveal sensitive information such as credentials. This is an industry-wide issue, affecting nearly every system that implements or supports SSL. TLS is its replacement, but not every product is guaranteed to be compatible with an SSL-to-TLS upgrade, so patches need to be applied and tested carefully.

Informatica's cloud-hosted products, including Informatica Cloud Services (ICS) and our recently launched Project Springbok beta, have already been patched to address this issue and will only support TLS going forward. We continue to monitor for relevant updates to both vulnerabilities and available patches.

Because this vulnerability affects connectivity to other systems, it is important that our customers carefully assess (a) the level of risk they are actually subject to, and (b) the impact disabling SSLv3 will have against other clients, servers, and API endpoints in their ecosystem. It may be acceptable, for example, to leave SSLv3 enabled if sufficient compensating controls are enforced and the risk of change is too great.
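For teams making that assessment on the client side, here is a minimal sketch of how an application running on a modern Python runtime can explicitly refuse SSLv3 and negotiate only TLS. The hostname is a placeholder, and your own stack's configuration knobs will differ; treat this as an illustration rather than an Informatica-specific remediation.

```python
import socket
import ssl

# Build a client-side context that negotiates TLS only. Current Python versions
# already disable SSLv2/SSLv3 by default; the explicit flag documents the intent.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.options |= ssl.OP_NO_SSLv3
context.check_hostname = True
context.verify_mode = ssl.CERT_REQUIRED
context.load_default_certs()

host = "example.com"  # placeholder endpoint
with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls_sock:
        print("Negotiated protocol:", tls_sock.version())  # e.g. 'TLSv1.2'
```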

2 – What you need to do

Informatica’s Information Security team coordinated an internal response with developers to assess the vulnerability within our products and cloud services.

Some Informatica products require patches or configuration changes before SSL parameters, such as the enabled protocol versions, can be tuned. Please contact Informatica Global Customer Support or your account executive for more information or technical questions.

Informatica cloud-based services were patched by our operations team. Please check that your connectivity to these services is operating normally, especially any SSL or TLS settings.

Cloud Service (Version) – Patch / Remediation

  • Springbok (Beta) – No action necessary. The Springbok infrastructure has been patched by Informatica Cloud Operations.
  • ActiveVOS/Cloud (All) – No action necessary. The ActiveVOS/Cloud infrastructure has been patched by Informatica Cloud Operations.
  • Cloud/ICS (All) – No action necessary. The ICS infrastructure has been patched by Informatica Cloud Operations.

Informatica takes the security of our customers' data very seriously. Please refer to this Informatica Knowledge Base article, or contact our Global Customer Support team, if you have any questions or concerns about our product SSL configurations in your environment. The Informatica support portal is always available at http://mysupport.informatica.com.

3 – How to contact Informatica about security

If you are a security researcher and have identified a potential vulnerability in an Informatica product or service, please follow our Responsible Disclosure Program.

 


Informatica Cloud Powers a New Era in Cloud Analytics with Salesforce Wave Analytics Cloud at Dreamforce 2014

We are halfway through Dreamforce, and it's been an eventful and awesome couple of days so far. The biggest launch by far was the announcement of Wave, the Salesforce Analytics Cloud, Salesforce's new entry into cloud analytics and business intelligence. Informatica has been the integration leader for enterprise analytics for 20 years, and our leadership continues with cloud analytics: the Informatica Cloud portfolio is the only solution that completes Salesforce Analytics Cloud for big data, fully enabling companies to use Salesforce Analytics Cloud to understand their customers like never before. But don't take our word for it: view the Analytics Cloud keynote from Dreamforce 2014 and see Alex Dayon call out Informatica as a key integration partner during his keynote.

The Informatica Cloud Portfolio delivers a broad set of analytics-centric services for the Salesforce Analytics Cloud, including bulk and real time application integration, data integration, data preparation, test data management, data quality and master data management (MDM) services. The portfolio is designed for high volume data sets from transactional applications such as SAP, cloud applications like Workday and new data sources such as Hadoop, Microsoft Azure and Amazon Web Services.

We have a great booth in the Analytics Zone, Moscone West, 3rd floor, where you can see demos of Informatica Cloud for Salesforce Wave Analytics and get lots more details from product experts.

And, you don’t need to wait till Dreamforce is over to try out Informatica Cloud for Salesforce Analytics. The free trial of Informatica Cloud, including Springbok, for Salesforce Analytics Cloud is available now. Trial users have unlimited usage of Informatica Cloud capabilities for Salesforce Analytics Cloud for 60 days, free of charge.

Aside from new product launches and tons of partner activities, we've also got some great customers speaking at Dreamforce. Today, we have a great session, "Get Closer to Your Customers Using Agile Data Management with Salesforce", with executive speakers from BT, Dolby and Travel Corporation explaining how they achieve customer insight, with use cases ranging from integrating nine Salesforce orgs into a single business dashboard to unifying 30+ acquired travel brands into a single customer view.

On Monday, Qualcomm and Warranty Group presented how their companies have moved to the cloud using Salesforce and Informatica Cloud to meet the agility needs of their businesses while resolving the challenges of data scaling, organizational complexity and an evolving technology strategy.

Drop by our main booth in Moscone North, N1216 to see live demos showcasing solutions for Customer Centricity, Salesforce Data Lifecycle and Analytics Cloud. If you want a preview of our Informatica Cloud solutions for the Salesforce ecosystem, click here.

During Dreamforce, we also announced a significant milestone for Informatica Cloud, which now processes over 100 Billion transactions per month, on behalf of our 3,000+ joint customers with Salesforce.

Oh, and one more thing we announced at Dreamforce: the Informatica Cloud Data Wizard, our next-generation data loader for Salesforce. It delivers a beautifully simple user experience natively inside Salesforce, so non-technical business analysts and admins can easily bring external data into Salesforce with a one-touch UI. Really!

For more information on how you can connect with Informatica at Dreamforce 2014, get all the details at informaticacloud.com/dreamforce


Embracing the Hybrid IT World through Cloud Integration

Being here at Oracle OpenWorld, it's hard not to think about Oracle's broad scope in enterprise software and the huge influence it wields over our daily work. But even as all-encompassing as Oracle has become, the emergence of the cloud is making us equally reliant on a whole new class of complementary applications and services. During the early era of on-premise apps, different lines of business (LOBs) selected the leading application for CRM, ERP, HCM, and so on. In the cloud, it feels like we have come full circle: best-of-breed cloud applications have been deployed across the enterprise, except that their data models, services and operations are not under our direct control. As a result, hybrid IT – the ability to integrate major on-premises applications such as Oracle E-Business, PeopleSoft and Siebel, to name a few, with cloud applications such as Oracle Cloud Applications, Salesforce, Workday, Marketo, SAP Cloud Applications and Microsoft cloud apps – has become one of businesses' greatest imperatives and challenges.

With Informatica Cloud, we've long tracked the growth of the various cloud apps and their adoption in the enterprise. Common business patterns – such as opportunity-to-order, employee onboarding, data migration and business intelligence – that once took place solely on-premises are now being conducted both in the cloud and on-premises.

The fact is that we are well on our way to a world where our business needs are best met by a mix of on-premises and cloud applications. Regardless of what we do or make, we can no longer get away with just on-premises applications – or at least not for long.  As we become more reliant on cloud services, such as those offered by Oracle, Salesforce, SAP, NetSuite, Workday, we are embracing the reality of a new hybrid world, and the imperative for simpler integration it demands.

So, as the ground shifts beneath us, moving us toward the hybrid world, we, as business and IT users, are left standing with a choice: Continue to seek solutions in our existing on-premises integration stacks, or go beyond, to find them with the newer and simpler cloud solution. Let us briefly look at five business patterns we’ve been tracking.

One of the first things we've noticed in the hybrid environment is the incredible frequency with which data moves back and forth between the on-premises and cloud environments. We call this the data integration pattern, and it is best represented by getting data, such as a price list or inventory from Oracle E-Business, into a cloud app so that the user of the cloud app can view the most up-to-date information. Here the data (usually master data) is copied to serve a certain purpose. Data integration also involves the typical need to transform data before it can be inserted or updated. Understanding the metadata and data models of the involved applications is key to doing this effectively and repeatably.
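As a minimal, purely illustrative sketch of that pattern, the function below reshapes a price-list record from an invented on-premises schema into the field names an invented cloud app expects. The field mappings here are hypothetical; capturing and managing exactly this kind of mapping, at scale, is what the metadata understanding above refers to.

```python
# Illustrative only: both the source and target field names are invented,
# standing in for the mapping metadata an integration platform would manage.
SOURCE_TO_TARGET = {
    "INVENTORY_ITEM_ID": "ProductCode",
    "LIST_PRICE":        "UnitPrice",
    "CURRENCY_CODE":     "CurrencyIsoCode",
}

def transform_price_record(erp_row: dict) -> dict:
    """Map an ERP price-list row to the cloud app's expected fields."""
    cloud_row = {target: erp_row[source] for source, target in SOURCE_TO_TARGET.items()}
    # A typical transformation step: normalize types before insert or update.
    cloud_row["UnitPrice"] = round(float(cloud_row["UnitPrice"]), 2)
    return cloud_row

print(transform_price_record(
    {"INVENTORY_ITEM_ID": "FP-2175", "LIST_PRICE": "129.5", "CURRENCY_CODE": "USD"}
))
```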

The second is the application integration pattern, or the real time transaction flow between your on-premises and cloud environment, where you have business processes and services that need to communicate with one another. Here, the data needs to be referenced in real time for a knowledge worker to take action.

The third, data warehousing in the cloud, is an emerging pattern that is gaining importance for both mid- and large-size companies. In this pattern, businesses are moving massive amounts of data in bulk from both on-premises and cloud sources into a cloud data warehouse, such as Amazon Redshift, for BI analysis.

The fourth, the Internet of Things (IoT) pattern, is also emerging and becoming more important, especially as new technologies and products, such as Nest, let us push streaming data (sensor data, web logs, etc.) and combine it with other cloud and on-premises data sources in a cloud data store. The data is often unstructured, so it is critical for an integration platform to deal effectively with unstructured data.

The fifth and final pattern, API integration, is gaining prominence in the cloud. Here, an on-premise or cloud application exposes the data or service as an external API that can be consumed directly by applications or by a higher-level composite app in an orchestration.

While there are certainly different approaches to the challenges brought by Hybrid IT, cloud integration is often best-suited to solving them.

Here’s why.

First, while the integration problems are more or less similar to those of the on-premise world, the patterns now overlap between cloud and on-premise. Second, integration responsibility is now picked up at the edge, closer to the users, whom we call "citizen integrators". Third, time to market and agility demand that any integration platform you work with live up to your expectations of speed; there are no longer multiyear integration initiatives in the era of the cloud. Finally, the same values that made cloud application adoption attractive (such as time-to-value, manageability and low operational overhead) also apply to cloud integration.

One of the most important forces driving cloud adoption is the need for companies to put more power into the hands of the business user. These users often need to access data in other systems, and they are quite comfortable going through the motions of doing so without being aware that they are performing integration. We call this class of users "citizen integrators". For example, if a user uploads an Excel file to Salesforce, it's not something they would call "integration". It is an out-of-the-box action that is integrated with their user experience, simple to use from a tooling point of view, and oftentimes native within the application they are working with.

Cloud Integration Convergence is driving many integration use cases. The most common integrations – such as employee onboarding – can span multiple integration patterns, involving data integration, application integration and often data warehousing for business intelligence. If we agree that doing this in the cloud makes sense, the question is whether you need three different integration stacks in the cloud, one for each integration pattern. And even if you have three different stacks, what happens when an integration flow involves the commingling of multiple patterns? What we are seeing is a single cloud integration platform addressing more and more of these use cases and providing the tooling for both the citizen integrator and the experienced integration developer.

The bottom line is that in the new hybrid world we are seeing a convergence, where the industry is moving towards streamlined and lighter weight solutions that can handle multiple patterns with one platform.

The concept of Cloud Integration Convergence is an important one, and we have built its imperatives into our products. With our cloud integration platform, we combine the ability to handle any integration pattern with an easy-to-use interface that empowers citizen integrators and frees integration developers for more rigorous projects. And because we're Informatica, we've designed it to work in tandem with PowerCenter, which means anything you've developed for PowerCenter can be leveraged for Informatica Cloud and vice versa, thereby fulfilling Informatica's promise of Map Once, Deploy Anywhere.

In closing, I invite you to visit the Informatica booth, #3512 in Moscone West, at Oracle OpenWorld. I'll be there with some of my colleagues, and we would be happy to meet and talk with you about your experiences and challenges with the new hybrid IT world.


Elevating Your App by Embedding Advanced, REST-ful Cloud Integration

While the great migration to the cloud has created opportunities for practically everyone in tech, no group stands to benefit from it more than application developers. In partnership with category-leading SaaS companies like Salesforce, NetSuite, Ultimate Software and others, ISVs have not only easy access to open APIs but also complete marketplaces in which to establish themselves and compete for market and mind share. And within those ecosystems stands a whole new class of users looking to these applications to solve their specific business problems.

But, as Billy Macinnes reminds us in his July MicroScope article, the opportunities come with many challenges, and so far only a few ISVs have risen high enough to truly meet them all. While the article itself is more concerned with where ISVs are headed, Macinnes and the industry experts he references, such as Mike West and Philip Howard, make it clear that no one is going far without a cloud strategy that meaningfully addresses data integration.

As a business app consumer myself, I too am excited by the possibilities that exist. I am intrigued by the way in which the new applications embrace the user-first ethos and deliver consumer-app-like interfaces and visual experiences. What concerns me is what happens next, once you get beyond the pretty design and actually try to solve the business use case for which the app is intended. This, unfortunately, is where many business apps fail. While most can access data from a single specific application, few can successfully interact with external data coming from multiple sources.

Like many of the challenges (such as licensing and provisioning) faced by today's ISVs, data integration lies outside the expertise of a typical app developer. Let's say, for example, you've just come up with a new way to anticipate customer needs and match them with excess inventory. While the developer expertise and the art of the app may lie in, say, a new algorithm, the user experience ultimately is equally dependent on your ability to surface data – inventory, pricing, SKU numbers, etc. – that may be held in SaaS and on-premises systems and seamlessly marry it, behind the scenes, to cloud-based customer information.

The bottom line is that regardless of the genius behind your idea or user interface, if you can’t feed relevant data into your application and ensure its completeness, quality and security for meaningful consumption, your app will be dead in the water. As a result, app developers are spending an inordinate amount of time – in some cases up to 80% of their development cycle – working through data issues. Even with that, many still get stuck and end up with little more for their effort than a hard lesson in the difficulties of enterprise data integration long understood by every integration developer.

Fortunately, there is a better way: cloud integration.

Cloud integration enables the developer to focus on their app and core business. The ISV can offer cloud integration to its customers as an external resource or as an embedded part of its app. While some may see this as a choice, any ISV looking to provide the best possible user experience has no real option other than to embed the integration services as part of their application.

Look at any successful business app, and chances are you’ll find something that empowers users to work independently, without having to rely on other teams or tools for solutions. Take, for example, the common use case of bringing data into an app via a CSV file. With integration built directly into the app, the user can upload the file and resolve any semantic conflicts herself, with no assistance from IT. Without it, the user is now reliant on others to do his or her job, and ultimately less productive. Clearly, the better experience is the one that provides users with easy access to everything needed – including data from multiple sources – to get the work done themselves. And the most effective way you can do that is by embedding integration into the application.
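To make that CSV scenario concrete, here is a small sketch, with invented column and field names, of the kind of column mapping an embedded upload flow lets the end user confirm before the rows ever reach the app.

```python
import csv
import io

# Invented example: the CSV headers don't match the app's field names, and the
# embedded integration lets the user confirm this mapping without IT's help.
COLUMN_MAP = {"Acct Name": "account_name", "Phone #": "phone", "ZIP": "postal_code"}

csv_text = "Acct Name,Phone #,ZIP\nNorth 40 Outfitters,555-0100,99201\n"

records = []
for row in csv.DictReader(io.StringIO(csv_text)):
    # Rename columns per the user-confirmed mapping; unmapped columns are dropped.
    records.append({COLUMN_MAP[col]: value for col, value in row.items() if col in COLUMN_MAP})

print(records)  # ready to hand to the app's create/update logic
```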

Now that we’ve settled why cloud integration works best as an embedded capability, let’s take a closer look at how it works within the application context.

With cloud integration embedded into your app, you can essentially work behind the scenes to connect different data sources and incorporate the mappings and workflows between your app and the universe of enterprise data sources. How it accomplishes that is through abstraction. By abstracting connectivity to these data sources, you take the complexities involved in bringing data from an external source – such as SAP or Salesforce – and place them within a well-defined integration template, or Vibe Integration Package (VIP). Once these abstractions are defined, you can then, as an application developer, access these templates through a REST API and bring the specified data into your application.

While connectivity abstraction and REST APIs are important on their own, like all great pairings, it is only in combination that their true utility is realized. In fact, taken separately, neither is of much value to the application developer. Alone, a REST API can access the raw data type, but without the abstraction, the information is too unintelligible and incomplete to be of any use. And without the REST API, the abstracted data has no way of getting from the source to the application.

The value that REST APIs together with connectivity abstraction bring cannot be overstated, especially when the connectivity can span multiple integration templates. The mechanism for accomplishing integration is, like an automobile transmission, incredibly complex. But just as a car's shift lever exposes a simple interface for moving from Park to Drive, triggering a series of complex motions under the hood, the integration templates allow users to work with the data any way they want without ever having to understand or know about the complexities going on underneath.

As the leading cloud integration solution and platform, Informatica Cloud has long recognized the importance of pairing REST APIs and connectivity abstraction.

The first and most important function within our REST API is administration. It enables you to set up your organization and administer your users and permissions. The second function allows you to run and monitor integration tasks. And with the third, end users can configure the integration templates themselves and apply the business rules for their specific process. You can view the entire set of Informatica Cloud REST API capabilities here.
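To show the general shape of the second function, here is a short sketch of running and monitoring a task over REST from Python. The base URL, endpoint paths, payload fields and session header are illustrative placeholders, not the documented Informatica Cloud REST API; refer to the capabilities page linked above for the real contract.

```python
import requests

# Placeholder values throughout: these paths, payloads and headers are
# illustrative stand-ins, not the documented Informatica Cloud REST API.
BASE_URL = "https://cloud.example.com/api"

# 1. Administration: authenticate and obtain a session token.
login = requests.post(f"{BASE_URL}/login",
                      json={"username": "ops@example.com", "password": "secret"})
login.raise_for_status()
headers = {"X-Session-Id": login.json()["sessionId"]}

# 2. Run an integration task, then poll its status.
run = requests.post(f"{BASE_URL}/tasks/salesforce_to_sap_orders/run", headers=headers)
run.raise_for_status()
job_id = run.json()["jobId"]

status = requests.get(f"{BASE_URL}/jobs/{job_id}", headers=headers)
print("Job state:", status.json().get("state"))
```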

It is in this last area – integration configurability – where we are truly setting ourselves apart. The Vibe Integration Packages (VIPs) not only abstract backend connectivity but also ensure that the data is complete – with the needed attributes from the underlying apps – and is of high quality and formatted for easy consumption in the end-user application. With the Packages, we’ve put together many of the most common integrations with reusable integration logic that is configurable through a variety of parameters. Our configurable templates enable your app users to customize and fine-tune their integrations – with custom fields, objects, etc. – to meet the specific behavior and functionality of their integrations. For example, the Salesforce to SAP VIP includes all the integration templates you need to solve different business use cases, such as integrating product, order and account information.

With their reusability and groupings encompassing many of the common integration use cases, our Vibe Integration Packages really are revolutionizing work for everyone. Using Informatica Cloud’s Visual Designer, developers can quickly create new, reusable VIPs, with parameterized values, for business users to consume. And SaaS administrators and business analysts can perform complex business integrations in a fraction of the time it took previously, and customize new integrations on the fly, without IT’s help.

More and more, developers are building great-looking apps with even greater aspirations. In many cases, the only thing holding them back is the ability to access back-office data without using external tools and interfaces, or outside assistance. With Informatica Cloud, data integration need no longer take a backseat to design, or anything else. Through our REST API, abstractions and Vibe Integration Packages, we help developers put an end to the compromise on user experience by bringing in the data directly through the application – for the benefit of everyone.


Informatica Cloud Summer ’14 Release Breaks Down Barriers with Unified Data Integration and Application Integration for Real Time and Bulk Patterns

This past week, Informatica Cloud marked an important milestone with the Summer 2014 release of the Informatica Cloud platform. This was the 20th Cloud release, and I am extremely proud of what our team has accomplished.

“SDL’s vision is to help our customers use data insights to create meaningful experiences, regardless of where or how the engagement occurs. It’s multilingual, multichannel and on a global scale. Being able to deliver the right information at the right time to the right customer with Informatica Cloud Summer 2014 is critical to our business and will continue to set us apart from our competition.”

– Paul Harris, Global Business Applications Director, SDL Pic

When I joined Informatica Cloud, I knew that it had the broadest cloud integration portfolio in the marketplace: leading data integration and analytic capabilities for bulk integration, comprehensive cloud master data management and test data management, and over a hundred connectors for cloud apps, enterprise systems and legacy data sources, all delivered in a self-service design with point-and-click wizards for citizen integrators, without the need for complex and costly manual custom coding.

But I also learned that our broad portfolio conceals another structural advantage: because of Informatica Cloud's unique, unified platform architecture, it can surface application (or real-time) integration capabilities alongside its data integration capabilities, with shared metadata across real-time and batch workflows.

With the Summer 2014 release, we've brought our application integration capabilities to the forefront. We now provide the most complete cloud app integration capability in the marketplace. With a design environment meant not just for developers but also for line-of-business IT, app admins can now build real-time process workflows that cut across on-premise and cloud systems and include built-in human workflows. And with the capability to translate these process workflows instantly into mobile apps for iPhone and Android devices, we're not just setting ourselves apart but also giving customers the unique capabilities they need for their increasingly mobile employees.

Informatica Cloud Summer Release Webinar Replay

“Schneider’s strategic initiative to improve front-office performance relied on recording and measuring sales person engagement in real time on any mobile device or desktop. The enhanced real time cloud application integration features of Informatica Cloud Summer 2014 makes it all possible and was key to the success of a highly visible and transformative initiative.”

– Mark Nardella, Global Sales Process Director, Schneider Electric SE

With this release, we're also giving customers the ability to create workflows around data sharing that mix and match batch and real-time integration patterns. This is really important. Unlike the past, where you had to choose between batch and real time, in today's world of on-premise, cloud-based, transactional and social data you now, more than ever, have to deal with both real-time interactions and the processing of large volumes of data. For example, consider a typical scenario these days at a high-end retail store. Using a clienteling iPad app, the sales rep looks up bulk purchase history and inventory availability data in SAP, confirms availability and delivery date, and then processes the customer's order via real-time integration with NetSuite. And if you ask any customer, having a single workflow to unify all of that for instant and actionable insights is a huge advantage.

“Our industry demands absolute efficiency, speed and trust when dealing with financial information, and the new cloud application integration feature in the latest release of Informatica Cloud will help us service our customers more effectively by delivering the data they require in a timely fashion. Keeping call-times to a minimum and improving customer satisfaction in real time.”

– Kimberly Jansen, Director CRM, Misys PLC

We've also included some exciting new Vibe Integration Packages, or VIPs. VIPs deliver pre-built business process mappings between front-office and back-office applications. The Summer 2014 release includes new bidirectional VIPs for Siebel to Salesforce and SAP to Salesforce that make it easier for customers to connect Salesforce with these mission-critical business applications.

And last but not least, the release includes a critical upgrade to our API framework that gives the Informatica Cloud iPaaS end-to-end support for connectivity to any company's internal or external APIs. With the newly available API creation, definition and consumption patterns, developers and citizen integrators can now easily expose integrations as APIs, and users can consume them via integration workflows or apps without the need for any additional custom code.

The features and capabilities released this summer are available to all existing Informatica Cloud customers, and everyone else through our free 30-day trial offer.


Moving to the Cloud: 3 Data Integration Facts That Every Enterprise Should Understand

According to a survey conducted by Dimensional Research and commissioned by Host Analytics, “CIOs continue to grow more and more bullish about cloud solutions, with a whopping 92% saying that cloud provides business benefits, according to a recent survey. Nonetheless, IT execs remain concerned over how to avoid SaaS-based data silos.”

Since the survey was published, many enterprises have, indeed, leveraged the cloud to host business data in both IaaS and SaaS incarnations. Overall, there seem to be two types of enterprises. First are the enterprises that get the value of data integration: they leverage the value of cloud-based systems and do not create additional data silos. Second are the enterprises that build cloud-based data silos without a sound data integration strategy, and thus take a few steps backward in terms of effectively leveraging enterprise data.

There are facts about data integration that most in enterprise IT don’t yet understand, and the use of cloud-based resources actually makes things worse.  The shame of it all is that, with a bit of work and some investment, the value should come back to the enterprises 10 to 20 times over.  Let’s consider the facts.

Fact 1: Implement new systems, such as those being stood up on public cloud platforms, and any data integration investment comes back 10 to 20 fold. When building a data integration strategy and investing in data integration technology, the focus is typically too much on cost and not enough on the benefit.

Many in enterprise IT point out that their problem domain is unique, and thus their circumstances need special consideration.  While I always perform domain-specific calculations, the patterns of value typically remain the same.  You should determine the metrics that are right for your enterprise, but the positive values will be fairly consistent, with some varying degrees.

Fact 2: It's not just about moving data from place to place; it's also about the proper management of data. This includes a central understanding of data semantics (metadata), and a place to manage a "single version of the truth" when dealing with the massive amounts of distributed data that enterprises must typically manage, data that is now also distributed within public clouds.

Most of those who manage enterprise data, cloud or no-cloud, have no common mechanism to deal with the meaning of the data, or even the physical location of the data.  While data integration is about moving data from place to place to support core business processes, it should come with a way to manage the data as well.  This means understanding, protecting, governing, and leveraging the enterprise data, both locally and within public cloud providers.

Fact 3: Some data belongs in the cloud, and some data belongs in the enterprise. Some in enterprise IT have pushed back on cloud computing, stating that data outside the firewall is a bad idea due to security, performance, legal issues…you name it. Others try to move all data to the cloud. The point of value is somewhere in between.

The fact of the matter is that the public cloud is not the right fit for all data. Enterprise IT must carefully consider the tradeoffs between cloud-based and in-house, including performance, security and compliance. Finding the best location for the data is the same problem we've dealt with for years; now we have cloud computing as an option. Work from your requirements to the target platform, and you'll find what I've found: the cloud is a fit some of the time, but not all of the time.

 
