Category Archives: B2B Data Exchange

The Supply Chain Impact of Adding an Allergen to a Chocolate Bar

Throughout the lifecycle of a consumable product, many parties are constantly challenged with updating and managing product information, like ingredients and allergens. Since December 13th, 2014, this task has become even more complex for companies producing or selling food and beverage products in the European Union, due to the new EU 1169/2011 rules. As a result, changing the basic formula of a chocolate bar by adding one allergen, such as nuts, should be carefully considered, as it can have a tremendous impact on the complete supply chain if the manufacturer and retailer(s) want to remain compliant with EU regulation 1169/2011 and inform the consumer about the changes to the product.

Let’s say the chocolate bar is available in three (3) varieties: dark, whole milk, and white chocolate. Each of the varieties is available in two (2) sizes: normal and mini. The chocolate bar is distributed in twelve (12) countries within the European Union using different packaging. This would require the manufacturer to introduce 72 (3 varieties * 2 sizes * 12 countries) new GTINs at the item level. If the chocolate producer decides to run any packaging or seasonal promotions, offer multipacks, or introduce a new variety, this number would be even higher. The new attributes, including updated information on allergens, have to be modified for each product, and 72 new GTINs have to be generated for the chocolate bars by the chocolate manufacturer’s data maintenance department. Trading partners have to be notified of the modifications, too. Assuming the manufacturer uses the Global Data Synchronization Network (GDSN), the new GTINs will have to be registered along with their new attributes eight weeks before the modified product becomes available. In addition, packaging hierarchies have to be taken into consideration as well. If each item has an associated case and pallet, the number of updates would sum up to 216 (72 updates * 3 product hierarchies).
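The arithmetic behind these numbers is worth making explicit, because it scales multiplicatively with every new variety, size, or market. A quick sketch using the figures from the example above:

```python
# Fan-out of GTIN updates caused by a single recipe change,
# using the figures from the chocolate bar example above.
varieties = 3         # dark, whole milk, white
sizes = 2             # normal, mini
countries = 12        # EU target markets with distinct packaging
hierarchy_levels = 3  # item, case, pallet

item_gtins = varieties * sizes * countries
total_updates = item_gtins * hierarchy_levels

print(f"New item-level GTINs: {item_gtins}")                   # 72
print(f"Updates across packaging hierarchy: {total_updates}")  # 216
```

Add a fourth dimension, such as seasonal promotion packaging, and the count multiplies again; this is why a seemingly small recipe change ripples through the whole supply chain.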

Managing the updates related to these product modifications results in high administrative and other costs. Trading partners across the supply chain report significant impact on their annual sales and costs. According to GS1 Europe, one retailer reported 6,000-8,000 GTIN changes per year, leading to 2-3% of additional administrative and support costs. If the GTIN had to change for every new minor variant of a product, they forecast the number of changes per year could rise to 20,000-25,000, leading to a further significant increase in administrative and support costs.

The change to the chocolate bar’s recipe also means that a corresponding change to the mandatory product data displayed on the label is required. For the online retailer(s) selling the chocolate bar, this means updating the information displayed in their online shops. Considering that a retailer has to deliver exactly the product that is displayed in its online shop, there will be a period of time when the old version of the product and the new version coexist in the supply chain. During this period it is not possible for the retailer to know if the version of the product ordered on a website will be available at the time and place the order is picked.

GS1 Europe suggests handling this issue as follows: Retailers working to GS1 standards use GTINs to pick online orders. If the modified chocolate bar is given a new GTIN, it increases the possibility that the correct variant can be made available for picking and, even if it is not available at the pick point, the retailer can automatically recognize that the version being picked is different from the version that was ordered. In this latter case the product can be offered as a substitute when the goods are delivered, and the consumer can choose whether to accept it or not. On its websites, GS1 provides comprehensive information on how to comply with the new European Food Information Regulation.

Using the Global Data Synchronization Network (GDSN), suppliers and retailers are able to share standardized product data, cut down the cost of building point-to-point integrations, and speed up new product introductions by getting access to the most accurate and most current product information. The Informatica GDSN Accelerator is an add-on to the Informatica Product Information Management (PIM) system that provides an interface to a GDSN-certified data pool. It is designed to help organizations securely and continuously exchange, update, and synchronize product data with trading partners according to the standards defined by Global Standards One (GS1). GDSN ensures that data exchanged between trading partners is accurate and compliant with globally supported standards for maintaining uniqueness, classification, and identification of source and recipients. Integrated into your PIM system, the GDSN Accelerator lets you exchange product data of the highest standard with your trading partners via the GDSN.

Thanks to the automated product data exchange, the efforts and costs related to modifying a product, as demonstrated in the chocolate bar example, can be significantly reduced for both manufacturers and retailers. The product data can be easily transferred to the data pool, and you can fully control whether information is shared with a specific trading partner or with all recipients of a target market.

Related blogs:

http://blogs.informatica.com/perspectives/2014/08/01/how-gs1-and-pim-help-fullfil-legal-regulations-and-feed-distribution-channels/

http://blogs.informatica.com/perspectives/2014/12/15/new-eu-food-information-regulation-5-things-to-consider/


Product Intelligence: How To Make Your Product Information Smarter

As we discussed at length in our #HappyHoliData series, no matter the customer’s industry or use case, information quality is key to delivering the right services or products to the right customer.

In my blog on 2015 omnichannel trends impacting customer experience, I noted product trust as a key expectation in the eyes of customers.

For product managers, merchandisers, and category managers this raises questions such as: Which products shall we offer at which price? How is the competition pricing this item? With which content is the competition promoting this SKU? Are my retailers and distributors sticking to my price policy? Companies need quicker insights to make decisions on their assortment, prices, and compelling content, and to provide better customer-facing service.

Recently, we’ve been spending time discussing this challenge with the folks at Indix, an innovator in the product intelligence space, to find ways to help businesses improve their product information quality. For background, Indix is building the world’s largest database of product information and currently tracks over 600 million products, over 600,000 sellers, over 40,000 brands, and over 10,000 attributes across over 6,000 categories. (source: Indix.com)

Indix takes all of that data, cleanses and normalizes it, and breaks it down into two types of product information: offers data and catalog data. The offers data includes all the dynamic information related to the sale of a product, such as the number of stores at which it is sold, price history, promotions, channels, availability, and shipping. The catalog data comprises relatively unchanging product information, such as brand, images, descriptions, specifications, attributes, tags, and facets.
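As a rough illustration of that split (the field names below are our own shorthand, not Indix’s actual schema), the two record types might be modeled like this:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CatalogData:
    """Relatively unchanging product information."""
    brand: str
    description: str
    images: List[str] = field(default_factory=list)
    attributes: dict = field(default_factory=dict)  # specs, tags, facets

@dataclass
class OffersData:
    """Dynamic information tied to the sale of a product."""
    store_count: int            # number of stores at which it is sold
    price_history: List[float]  # observed prices over time
    availability: bool
    shipping: str               # e.g. "free 2-day" (placeholder)
```

The design point is that catalog data changes rarely and can be cached aggressively, while offers data changes constantly and has to be refreshed on a much shorter cycle.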


We’ve been talking with the Indix team about how powerful it could be to integrate product intelligence directly into the Informatica PIM. Just imagine if Informatica customers could seamlessly bring relevant offers and catalog content into the PIM through a direct connection to the Indix Product Intelligence Platform and begin using market and competitive data immediately.

What do you think?  

We’re going to be at NRF to discuss this with selected people. If you like the idea, or have feedback on the concept, let us know. We’d love to see you while we’re there and explore this idea further with you.


What Should Come First: Business Processes or Analytics?

As more and more businesses become fully digitized, the instantiation of their business processes and business capabilities becomes based in software. And when businesses implement software, there are choices to be made that can impact whether these processes and capabilities become locked in time or establish themselves as a continuing basis for business differentiation.

Make sure you focus upon the business goals

I want to suggest that whether the software instantiations of business processes and business capabilities deliver business differentiation depends upon whether business goals and analytics are successfully embedded in a software implementation from the start. I learned this firsthand several years ago, when I was involved in helping a significant insurance company with their implementation of analytics software. Everyone in the management team was in favor of the analytics software purchase. However, the project lead wanted the analytics completed after an upgrade had occurred to their transactional processing software. Fortunately, the firm’s CIO had a very different perspective. This CIO understood that decisions regarding the transaction processing software implementation could determine whether critical metrics and KPIs could be measured. So instead of doing analytics as an afterthought, this CIO had the analytics done as a forethought. In other words, he slowed down the transactional software implementation. He got his team to think first about the goals for the software implementation and the business goals for the enterprise. With these in hand, his team determined what metrics and KPIs were needed to measure success and improvement. They then required the transaction software development team to ensure that the software implemented the fields needed to measure the metrics and KPIs. In some cases, this was as simple as turning on a field, or training users to enter a field, as the transaction software went live.
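The point about turning on fields is very concrete: a KPI can only be computed if the transactional system actually captures its inputs. A minimal sketch, with hypothetical field names for an insurance claims system:

```python
from datetime import date

def claims_cycle_time_days(claim: dict) -> int:
    """Cycle-time KPI: only computable if both fields were captured
    by the transaction system at entry time."""
    if "filed_on" not in claim or "settled_on" not in claim:
        raise ValueError("KPI inputs missing: enable these fields before go-live")
    return (claim["settled_on"] - claim["filed_on"]).days

# If the transaction software never records 'settled_on', this KPI
# is simply unmeasurable after the fact.
print(claims_cycle_time_days({"filed_on": date(2015, 1, 5),
                              "settled_on": date(2015, 1, 19)}))  # 14
```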

Make the analytics part of everyday business decisions and business processes

The question is how common this perspective is, because it really matters. Tom Davenport says that “if you really want to put analytics to work in an enterprise, you need to make them an integral part of everyday business decisions and business processes—the methods by which work gets done” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). For many, this means turning their application development on its head, like our insurance CIO did. It means in particular that IT implementation teams should no longer just slam in applications. They need to be more deliberate. They need to start by identifying the business problems that they want solved through the software instantiation of a business process. They likewise need to start with how they want the software to improve the process, rather than treating analytics and data as an afterthought.

Why does this matter so much? Davenport suggests that “embedding analytics into processes improves the ability of the organization to implement new insights. It eliminates gaps between insights, decisions, and actions” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). Tom gives the example of a car rental company that embedded analytics into its reservation system and, with the data provided, was able to overturn long-held shared beliefs. This change resulted in a 2% increase in fleet utilization and returned $19m to the company from just one location.

Look beyond the immediate decision to the business capability

Davenport also suggests that enterprises need to look beyond their immediate task or decision and appreciate the whole business process, including what happens upstream or downstream. This argues that analytics be focused on the enterprise capability system. Clearly, maximizing the performance of the enterprise capability system requires an enterprise perspective on analytics. It should also be noted that a systems perspective allows business leadership to appreciate how different parts of the business work together as a whole. Analytics, therefore, allow the business to determine how to drive better business outcomes for the entire enterprise.

At the same time, focusing upon the enterprise capability system will in many cases over time lead to a reengineering of overarching business processes and a revamping of their supporting information systems. This in turn allows the business to capitalize on the potential of business capability and analytics improvement. From my experience, most organizations need some time to see what a change in analytics performance means. This is why it can make sense to start by measuring baseline process performance before determining enhancements to the business process. Once enhancements are in place, the improved process can be refined by continuously measuring process performance data.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer


2015 – The Year of Data Integration?


I love the data integration coverage by Loraine Lawson in IT Business Edge, especially her December 12th posting that focuses on the trends that will emerge in 2015. The best quote from the post is: “Oddly, organizations still tended to focus on point-solutions in the cloud. As more infrastructure and data moves to the cloud, they’re experiencing similar pain points and relearning old lessons.”

The article cites research from Ovum, which predicts that many enterprises will begin moving toward data integration, driven largely by the rise of cloud computing and big data. However, enterprises need to invest both in modernizing their existing data management infrastructure and in data integration technology. “All of these new investments will push the middleware software market up 9 percent to a $16.3 billion industry, Information Management reports.” This projection is for 2015.

I suspect that’s a bit conservative. In my travels, I see much more interest in data integration strategies, approaches, and technology as cloud computing continues to grow and enterprises better understand the strategic use of data. So, I would put the growth at 15 percent for 2015.

There are many factors driving this growth, beyond mere interest in cloud computing and big data.

The first consideration is that data is more strategic than initially understood. While businesses have always considered data a huge asset, it has not been until the last few years that businesses have seen the true value of understanding what’s going on inside and outside of their business.

Manufacturing companies want to see the current state of production, as well as production history. Management can now use that data to predict trends to address, such as future issues around employee productivity, or even a piece of equipment that is likely to fail and the impact of that failure on revenue. Healthcare companies are learning how to better monitor patient health, such as spotting likely health problems before they are diagnosed, or leveraging large data sets to understand when patterns emerge around health issues, such as areas of the country that are more prone to asthma based upon air quality.

Second, there is the need to deal with compliance issues. The new health care regulations, and even the new regulations around managing a publicly traded company, raise a great deal of data management issues, including data integration.

As these laws emerge, and are altered over time, the reporting requirements are always more complex and far reaching than they were before.  Those who want to avoid fines, or even avoid stock drops around mistakes, are paying close attention to this area.

Finally, there is an expectation from customers and employees that you will have a good handle on your data. Ten years ago you could tell a customer on the phone that you needed to check different systems to answer their question. Those days are over. Today’s customers and employees want immediate access to the data they need, and there is no good excuse for not being able to produce that data. If you can’t, your competition will.

The interest in data integration will experience solid growth in 2015, around cloud and big data, for sure.  However, other factors will drive this growth, and enterprises will finally understand that data integration is core to an IT strategy, and should never be an afterthought.


Happy Holidays, Happy HoliData.

In case you missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices which are unleashing information potential. Simply scroll and click on the case study which is relevant for you and your business. The series touches on different industries and use cases, but all have one thing in common: all consider information quality a key value to their business for delivering the right services or products to the right customer.

[Image gallery: the 24 #HappyHoliData case studies]

Thanks a lot to all my great teammates who made this series happen.

Happy Holidays, Happy HoliData.


Informatica Rev: Data Democracy At Last – Part 2

This is a continuation of Part 1 of this blog, which you can read here.

Now, if you are in IT, reading about how Informatica Rev enables the everyday business users in your company to participate in the Data Democracy might feel like treachery. You are likely thinking that Informatica is letting the bull loose in your own fine china shop. You likely feel, first, that Informatica is supporting the systemic bypass of all the data governance that IT has worked hard to put in place, and second, that Informatica is supporting the alienation of the very IT people who have approved of and invested in Informatica for decades.

While I can understand this thought process, I am here to proudly inform you that these thoughts could not be further from the truth! In fact, in the not too distant future, Informatica is in a very strong position to create a unique technology solution to ensure you can better govern all the data in your enterprise, and do it in a way that will allow you to proactively deliver the right data to the business, yes, before the masses of everyday business users have started to knock your door down to even ask for it. Informatica’s unique solution will ensure the IT and business divide that has existed in your company for decades actually becomes a match made in heaven. And you in IT get the credit for leading this transformation of your company to a Data Democracy. Listen to this webinar to hear Justin Glatz, Executive Director of Information Technology at Conde Nast, speak about how he will be leading Conde Nast’s transformation to Data Democracy.


“How?” you might ask. Well, first let’s face it: today you do not have any visibility into how the business is procuring and using most data, and therefore you are not governing most of it. Without a change in your tooling, your ability to gain this visibility is diminishing greatly, especially since the business does not have to come to you to procure and use their cloud-based applications. By having all of your everyday business users use Informatica Rev, you will, for the first time, have the potential to gain a truly complete picture of how data is being used in your company, even for data they do not come to you to procure.

In the not too distant future, you will gain this visibility through an IT companion application to Informatica Rev. You will then gain the ability to easily operationalize your business users’ exact transformation logic, or Recipe as we call it in Informatica Rev, into your existing repositories, be they your enterprise data warehouse, data mart, or master data management repository, for example. And, by the way, you are likely already using Informatica PowerCenter, Informatica Cloud, or Informatica MDM to manage these repositories, so you already have the infrastructure we will be integrating Informatica Rev with. And if you are not using Informatica for managing these repositories, the draw of becoming proactive with your business and leading the transformation of your company to a Data Democracy will be enough to make you want to get Informatica.

Just as these professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can do so, too. You can try Informatica Rev for free by clicking here.


Building an Enterprise Data Hub: Choosing the Data Integration Solution


Data flows into the enterprise from many sources, in many formats, sizes, and levels of complexity. And as enterprise architectures have evolved over the years, traditional data warehouses have become less a final staging center for data and more one component of the enterprise that interfaces with significant data flows. But since data warehouses should focus on being powerful engines for high-value analytics, they should not be the central hub for data movement and data preparation (e.g. ETL/ELT), especially for the newer data types in use today, such as social media, clickstream data, sensor data, Internet-of-Things data, etc.

When you see data warehouse capacity being consumed too quickly, performance degrading to the point where end users complain about slower response times, and a risk of not meeting your service-level agreements, it might be time to consider an enterprise data hub (EDH). With an EDH, especially one built on Apache™ Hadoop®, you can plan a strategy around data warehouse optimization to get better use out of your entire enterprise architecture.

Of course, whenever you add a new technology to your data center, you care about interoperability. And since many systems in today’s architectures interoperate via data flows, it’s clear that sophisticated data integration technologies will be an important part of your EDH strategy. Today’s big data presents new challenges related to a wide variety of data types and formats, and the right technologies are needed to glue all the pieces together, whether those pieces are data warehouses, relational databases, Hadoop, or NoSQL databases.

Choosing a Data Integration Solution

Data integration software, at a high level, has one broad responsibility: to help you process and prepare your data with the right technology. This means it has to get your data to the right place in the right format in a timely manner. So it actually includes many tasks, but the end result is that timely, trusted data can be used for decision-making and risk management throughout the enterprise. You end up with a complete, ready-for-analysis picture of your business, as opposed to segmented snapshots based on a limited data set.

When evaluating a data integration solution for the enterprise, look for:

  • Ease of use to boost developer productivity
  • A proven track record in the industry
  • Widely available technology expertise
  • Experience with production deployments with newer technologies like Hadoop
  • Ability to reuse data pipelines across different technologies (e.g. data warehouse, RDBMS, Hadoop, and other NoSQL databases); a conceptual sketch follows this list
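To make that last point tangible, here is a conceptual sketch of a reusable pipeline: the transformation logic is written once, and only the read/write adapters change per technology. This illustrates the idea, not Informatica’s actual API:

```python
from typing import Callable, Iterable

Record = dict

def run_pipeline(read: Callable[[], Iterable[Record]],
                 transform: Callable[[Record], Record],
                 write: Callable[[Iterable[Record]], None]) -> None:
    """The transformation step stays constant; endpoints are pluggable."""
    write(transform(r) for r in read())

def normalize(rec: Record) -> Record:
    # Written once, reused whether the target is a warehouse table,
    # an HDFS file, or a NoSQL collection.
    rec["customer_id"] = str(rec["customer_id"]).strip()
    return rec

records = [{"customer_id": " 42 "}, {"customer_id": 7}]
run_pipeline(read=lambda: iter(records),
             transform=normalize,
             write=lambda rows: print(list(rows)))
```

Swapping a warehouse writer for a Hadoop writer then changes one adapter, not the pipeline logic, which is exactly the reuse property worth evaluating in a vendor tool.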

Trustworthy data

Data integration is only part of the story. When you’re depending on data to drive business decisions and risk management, you clearly want to ensure the data is reliable. Data governance, data lineage, data quality, and data auditing remain important topics in an EDH. Oftentimes, data privacy regulatory demands must be met, and the enterprise’s own intellectual property must be protected from accidental exposure.

To help ensure that data is sound and secure, look for a solution that provides:

  • Centralized management and control
  • Data certification prior to publication, transparent data and integration processes, and the ability to track data lineage
  • Granular security, access controls, and data masking to protect data both in transit and at the source, preventing unauthorized access to specific data sets; a simple masking sketch follows this list
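As a conceptual illustration of the masking requirement (not Informatica’s implementation), a deterministic mask can hide a raw value while preserving joinability across data sets:

```python
import hashlib

def mask_identifier(value: str, salt: str = "per-environment-secret") -> str:
    """Deterministic mask: hides the raw value but preserves joinability,
    since equal inputs always yield equal tokens."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

# The masked token can still serve as a join key across data sets
# without exposing the underlying identifier.
print(mask_identifier("4111-1111-1111-1111"))
```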

Informatica is the data integration solution selected by many enterprises. Informatica’s family of enterprise data integration, data quality, and other data management products can manage data of any format, complexity level, or size, from any business system, and then deliver that data across the enterprise at the desired speed.

Watch the latest Gartner video to see Todd Goldman, Vice President and General Manager for Enterprise Data Integration at Informatica, as well as executives from Cisco and MapR, give their perspective on how businesses today can gain even more value from big data.


Building Engagement through Service and Support

This is a guest post by Tom Petrocelli, Research Director of Enterprise Social, Mobile and Cloud Applications at Neuralytix


A product is not just an object or the bits that comprise software or digital media; it is an entire experience. The complete customer experience is vital to the overall value a customer derives from their product and the ongoing relationship between the customer and the vendor. The customer experience is enhanced through a series of engagements over a variety of digital, social, and personal channels. Each point of contact between a vendor and customer is an opportunity for engagement. These engagements over time affect the level of satisfaction customers have with the vendor relationship.

Service and support is a critical part of this engagement strategy. Retail and consumer goods companies recognize the importance of support to the overall customer relationship. Accordingly, these companies have integrated their before- and after-purchase support into their multi-channel and omni-channel marketing strategies. While retail and consumer products companies have led the way in making support an integral part of ongoing customer engagement, B2B companies have begun to do the same. Enterprise IT companies, which are primarily B2B companies, have been expanding their service and support capabilities to create more engagement between their customers and themselves. Service offerings have expanded to include mobile tools, analytics-driven self-help, and support over social media and other digital channels. The goal of these investments has been to make interactions more productive for the customer, strengthen relationships through positive engagement, and gather data that drives improvements in both the product and the service.

A great example of an enterprise software company that understands the value in customer engagement through support is Informatica. Known primarily for their data integration products, Informatica has been quickly expanding their portfolio of data management and data access products over the past few years. This growth in their product portfolio has introduced many new types of customers to Informatica and created more complex customer relationships. For example, the new Springbok product is aimed at making data accessible to the business user, a new type of interaction for Informatica. Informatica has responded with a collection of new service enhancements that augment and extend existing service channels and capabilities.

What these moves say to me is that Informatica has made a commitment to deeper engagement with customers. For example, Informatica has expanded the avenues from which customers can get support. By adding social media and mobile capabilities, they are creating additional points of presence that address customer issues when and where customers are. Informatica provides support on the customers’ terms instead of requiring customers to do what is convenient for Informatica. Ultimately, Informatica is creating more value by making it easier for customers to interact with them. The best support is that which solves the problem quickest with the least amount of effort. Intuitive knowledge base systems, online support, sourcing answers from peers, and other tools that help find solutions immediately are valued more than traditional phone support. This is the philosophy that drives the new self-help portal, predictive escalation, and product adoption services.

Informatica is also shifting the support focus from products to business outcomes. They manage problems holistically rather than simply creating product band-aids. This shows a recognition that technical problems with data are actually business problems that have broad effects on a customer’s business. Contrast this with the traditional approach to support, which focuses on fixing a technical issue but doesn’t necessarily address the wider organizational effects of those problems.

More than anything, these changes are preparation for a very different support landscape. With the launch of the Springbok data analytics tool, Informatica’s support organization is clearly positioning itself to help business analysts and similar semi-technical end-users. The expectations of these end-users have been set by consumer applications. They expect more automation and more online resources that help them use and derive value from their software, and are less enamored with fixing technical problems.

In the past, technical support was mostly charged with solving immediate technical issues. That’s still important, since the products have to work first to be useful. Now, however, support organizations have an expanded mission to be part of the overall customer experience and to enhance overall engagement. The latest enhancements to the Informatica support portfolio reflect this mission and prepare the company for the next generation of non-IT Informatica customers.


With the Winter 2015 Release, Informatica Cloud Advances Real Time and Batch Integration for Citizen Integrators Everywhere


For those who work in tech, or even have a passing interest in the latest computing trends, it was hard to miss the buzz coming out of Dreamforce and Amazon re:Invent. As a partner to both companies, engaged on a parallel path, Informatica Cloud is equally excited about these new developments. With the upcoming Winter 2015 release, we have three new platform enhancements that will take those capabilities even further.

The first of these is in the area of connectivity and brings a whole new set of features and capabilities to those who use our platform to connect with Salesforce, Amazon Redshift, NetSuite and SAP.

Starting with Amazon, the Winter 2015 release leverages the new Redshift Unload Command, giving any user the ability to securely perform bulk queries, and quickly scan and place multiple columns of data in the intended target, without the need for ODBC or JDBC connectors.  We are also ensuring the data is encrypted at rest on the S3 bucket while loading data into Redshift tables; this provides an additional layer of security around your data.
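For context, Redshift’s UNLOAD command exports query results to S3 in parallel. A minimal sketch of what such a bulk export looks like when issued directly (cluster endpoint, table, bucket, and credential values are all placeholders):

```python
import psycopg2  # standard PostgreSQL driver; Redshift speaks the pg protocol

conn = psycopg2.connect(
    host="examplecluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
    port=5439, dbname="dev", user="awsuser", password="...")
with conn, conn.cursor() as cur:
    # Bulk-export query results to S3 in parallel; ENCRYPTED writes the
    # files encrypted at rest (key supplied in the CREDENTIALS string).
    cur.execute("""
        UNLOAD ('SELECT order_id, amount FROM orders')
        TO 's3://example-bucket/unload/orders_'
        CREDENTIALS 'aws_access_key_id=...;aws_secret_access_key=...;master_symmetric_key=...'
        ENCRYPTED
        DELIMITER ','
    """)
conn.close()
```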

For SAP, we’ve added the ability to balance the load across all application servers. With the new enhancement, we use a Type B connection to route our integration workflows through an SAP message server, which then connects with any available SAP application server. Now if an application server goes down, your integration workflows won’t go down with it. Instead, you’ll automatically be connected to the next available application server.
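For readers curious what message-server (Type B, load-balanced) logon looks like in general, here is a sketch using the open-source pyrfc library. This illustrates the SAP-side concept only, not Informatica’s internal configuration, and every connection value is a placeholder:

```python
from pyrfc import Connection  # SAP NW RFC SDK wrapper

# Message-server logon: the message server assigns an available
# application server from the logon group, rather than pinning one
# application server via ashost/sysnr (the direct, Type A variant).
conn = Connection(
    mshost="sapms.example.com",  # message server host (placeholder)
    msserv="3600",               # message server service/port
    sysid="PRD",                 # SAP system ID
    group="PUBLIC",              # logon group of application servers
    client="100",
    user="RFC_USER",
    passwd="...",                # placeholder
)
conn.ping()  # round-trip test against whichever app server was assigned
```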

Additionally, we’ve expanded the capability of our SAP connector by adding support for ECC5. While our connector came out of the box with ECC6, ECC5 is still used by a number of our enterprise customers. The expanded support now provides them with the full coverage they and many other larger companies need.

Finally, for Salesforce, we’re updating to the newest versions of their APIs (Version 31) to ensure you have access to the latest features and capabilities. The upgrades are part of an aggressive roadmap strategy, which places updates of connectors to the latest APIs on our development schedule the instant they are announced.

The second major platform enhancement for the Winter 2015 release has to do with our Cloud Mapping Designer and is sure to please those familiar with PowerCenter. With the new release, PowerCenter users can perform secure hybrid data transformations – and sharpen their cloud data warehousing and data analytic skills – through a familiar mapping and design environment and interface.

Specifically, the new enhancement enables you to take a mapplet you’ve built in PowerCenter and bring it directly into the Cloud Mapping Designer, without any additional steps or manipulations. With the PowerCenter mapplets, you can perform multi-group transformations on objects, such as BAPIs. When you access the Mapplet via the Cloud Mapping Designer, the groupings are retained, enabling you to quickly visualize what you need, and navigate and map the fields.

Additional productivity enhancements to the Cloud Mapping Designer extend the lookup and sorting capabilities and give you the ability to upload or delete data automatically based on specific conditions you establish for each target. And with the new feature supporting fully parameterized, unconnected lookups, you’ll have increased flexibility in runtime to do your configurations.

The third and final major Winter release enhancement is to our Real Time capability. Most notable is the addition of three new features that improve the usability and functionality of the Process Designer.

The first of these is a new “Wait” step type. This new feature applies to both processes and guides and enables the user to add a time-based condition to an action within a service or process call step, and indicate how long to wait for a response before performing an action.

When used in combination with the Boundary timer event variation, the Wait step can be added to a service call step or sub-process step to interrupt the process or enable it to continue.

The second is a new selection feature in the Process Designer that lets users create their own service connectors. Now, when a user is presented with multiple process objects created when XML or JSON is returned from a service, he or she can select the exact ones to include in the connector.

An additional Generate Process Objects feature automates the creation of objects, eliminating the tedious task of replicating whole service responses containing hierarchical XML and JSON data for large structures. These can now be conveniently auto-generated when testing a Service Connector, saving integration developers a lot of time.

The final enhancement for the Process Designer makes it simpler to work with XML-based services. The new “Simplified XML” feature for the “Get From” field treats attributes as children, removing the namespaces and making sibling elements into an object list. Now if a user only needs part of the returned XML, they just have to indicate the starting point for the simplified XML.
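To illustrate the general idea of attributes-as-children and repeated sibling elements becoming an object list, here is a generic sketch in Python; it mimics the concept, not the Process Designer’s exact output:

```python
import xml.etree.ElementTree as ET

def simplify(elem):
    """Attributes become child entries; repeated siblings become a list."""
    out = dict(elem.attrib)                      # attributes treated as children
    for child in elem:
        tag = child.tag.split('}')[-1]           # strip any XML namespace
        value = simplify(child) if (len(child) or child.attrib) else child.text
        if tag in out:                           # repeated sibling -> object list
            if not isinstance(out[tag], list):
                out[tag] = [out[tag]]
            out[tag].append(value)
        else:
            out[tag] = value
    return out

doc = ET.fromstring('<order id="7"><item sku="A"/><item sku="B"/></order>')
print(simplify(doc))  # {'id': '7', 'item': [{'sku': 'A'}, {'sku': 'B'}]}
```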

While those conclude the major enhancements, additional improvements include:

  • A JMS Enqueue step is now available to submit an XML or JSON message to a JMS queue or topic accessible via the Secure Agent.
  • Dequeuing (queues and topics) of XML or JSON request payloads is now fully supported.

Amazon re:Invent 2014 Recap: “Cloud has Become the New Normal”

It’s amazing how fast a year goes by. Last year, Informatica Cloud exhibited at Amazon re:Invent for the very first time where we showcased our connector for Amazon Redshift. At the time, customers were simply kicking the tires on Amazon’s newest cloud data warehousing service, and trying to learn where it might make sense to fit Amazon Redshift into their overall architecture. This year, it was clear that customers had adopted several AWS services and were truly “all-in” on the cloud. In the words of Andy Jassy, Senior VP of Amazon Web Services, “Cloud has become the new normal”.

During Day 1 of the keynote, Andy outlined several areas of growth across the AWS ecosystem, such as a 137% YoY increase in data transfer to and from Amazon S3 and a 99% YoY increase in Amazon EC2 instance usage. On Day 2 of the keynote, Werner Vogels, CTO of Amazon, made the case that there has never been a better time to build apps on AWS because of all the enterprise-grade features. Several customers came on stage during both keynotes to demonstrate their use of AWS:

  • Major League Baseball’s Statcast application consumed 17PB of raw data
  • Philips Healthcare used over a petabyte a month
  • Intuit revealed their plan to move the rest of their applications to AWS over the next few years
  • Johnson & Johnson outlined their use of Amazon’s Virtual Private Cloud (VPC) and referred to their use of hybrid cloud as the “borderless datacenter”
  • Omnifone illustrated how AWS has the network bandwidth required to deliver their hi-res audio offerings
  • The Weather Company scaled AWS across 4 regions to deliver 15 billion forecast publications a day

Informatica was also mentioned on stage by Andy Jassy as one of the premier ISVs that had built solutions on top of the AWS platform. Indeed, from having one connector in the AWS ecosystem last year (for Amazon Redshift), Informatica has released native connectors for Amazon DynamoDB, Elastic MapReduce (EMR), S3, Kinesis, and RDS.

With so many customers using AWS, it becomes hard for them to track their usage at a more granular level. This is especially true for enterprise companies using AWS because of the multitude of departments and business units using several AWS services. Informatica Cloud and Tableau developed a joint solution, showcased at the Amazon re:Invent Partner Theater, that makes it possible for an IT Operations individual to drill down into several dimensions to find the answers they need around AWS usage and cost. IT Ops personnel can point out the relevant data points in their data model, such as availability zone, rate, and usage type, to name a few, and use Amazon Redshift as the cloud data warehouse to aggregate this data. Informatica Cloud’s Vibe Integration Packages, combined with its native connectivity to Amazon Redshift and S3, allow the data model to be reflected as the correct set of tables in Redshift. Tableau’s robust visualization capabilities then allow users to drill down into the data model to extract whatever insights they require. Look for more to come from Informatica Cloud and Tableau on this joint solution in the upcoming weeks and months.
