Category Archives: B2B

Optimizing Supply Chain with Data Governance

Last week I met with a customer who recently completed a few data-fueled supply chain projects. Informatica data integration and master data management are part of the solution architecture. I'm not going to name the client at this early stage, but I want to share highlights from our Q&A.

Q: What was the driver for this project?

A: The project grew out of a procure-to-pay (P2P) initiative. We engaged a consulting firm to help centralize Accounts Payable operations. One required deliverable was an executive P2P dashboard, which would provide enterprise insights by relying on our enterprise data warehousing and business intelligence platform.

Q: What did the dashboard illustrate?

A: The dashboard integrated data from many sources to provide a single view of information about all of our suppliers. By visualizing this information in one place, we were able to rapidly gain operational insights. There are approximately 30,000 suppliers in the supplier master who manufacture, distribute, or both manufacture and distribute more than 150,000 unique products.

Q: From which sources is Informatica consuming data to power the P2P dashboard?

A: There are 8 sources of data:

3 ERP Systems:

  1. Lawson
  2. HBOC STAR
  3. Meditech

5 Enrichment Sources:

  1. Dun & Bradstreet – for associating suppliers together from disparate sources.
  2. GDSN – Global Data Pool for helping to cleanse healthcare products.
  3. McKesson Pharmacy Spend – spend file from a third-party pharmaceutical distributor. Helps capture detailed spend on pharmacy products we procure through this third party.
  4. Office Depot Spend – spend file from a third-party office supply distributor. Helps capture detailed office supply spend.
  5. MedAssets – third party group purchasing organization (GPO) who provides detailed contract pricing.

Q: Did you tackle clinical scenarios first?

A: No. While we certainly have many clinical scenarios we want to explore, like cost per procedure per patient, we knew that we should establish a few quick operational wins to gain traction and credibility.

Q: Great idea – capturing quick wins is certainly the way we are seeing customers have the most success in these transformative projects. Where did you start?

A: We started with supply chain cost containment; increasing pressure on healthcare organizations to reduce cost made this low-hanging fruit the right place to start. There may be as much as 20% waste to be eliminated through strategic and actionable analytics.

Q: What did you discover?

A: Through the P2P dashboard, we gained insights into days to pay on invoices, as well as early payment discounts and late payment penalties. With the visualization we quickly saw that we were paying a large amount of late fees. With this awareness, we dug into why the late fees were so high. We discovered that, with one large supplier, the original payment terms were net 30, but in later negotiations the terms were changed to 20 days, so late fees were accruing after 20 days. Through this complete view we were able to rapidly home in on the issue and change operations, avoiding costly late fees.
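As an illustration of the kind of check behind that insight, here is a small sketch that flags invoices paid past their negotiated terms; the supplier name, amounts and the monthly fee rate below are invented, not the customer's data:

```python
from datetime import date

# Hypothetical invoice records; note the negotiated terms are net 20,
# while payments were still being made on a net-30 schedule.
invoices = [
    {"supplier": "Acme Med", "amount": 12000.0,
     "invoiced": date(2014, 9, 1), "paid": date(2014, 9, 28), "terms_days": 20},
    {"supplier": "Acme Med", "amount": 8000.0,
     "invoiced": date(2014, 9, 5), "paid": date(2014, 9, 20), "terms_days": 20},
]

def late_fee(inv, monthly_rate=0.015):
    """Fee accrued for days paid past the negotiated terms (rate is illustrative)."""
    days_late = (inv["paid"] - inv["invoiced"]).days - inv["terms_days"]
    if days_late <= 0:
        return 0.0
    return inv["amount"] * monthly_rate * (days_late / 30)

fees = {inv["supplier"]: 0.0 for inv in invoices}
for inv in invoices:
    fees[inv["supplier"]] += late_fee(inv)

print(fees)  # -> {'Acme Med': 42.0}: the first invoice was paid within 30 days but past 20
```

Aggregating this per supplier is exactly the view that made the net-30 versus net-20 mismatch visible.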

Q: That’s a great example of straightforward analytics powered by an integrated view of data, thank you. What’s a more complex use case you plan to tackle?

A: Now that we have the systems in place along with data stewardship, we will start to focus on clinical supply chain scenarios like cost per procedure per patient. We have all of the data in one data warehouse to answer questions like – which procedures are costing the most, do procedure costs vary by clinician? By location? By supply? – and what is the outcome of each of these procedures? We always want to take the right and best action for the patient.

We were also able to identify where negotiated payment discounts were not being taken advantage of or where there were opportunities to negotiate discounts.

These insights were revealed through the dashboard, and immediate value was realized on the first day.

Fueling knowledge with data is helping procurement negotiate the right discounts; for example, they can seek discounts on the most-used supplies rather than on supplies that are rarely used. Think of it this way: you don't want a discount on OJ if you are buying milk.

Q: Excellent example and metaphor. Let’s talk more about stewardship. You have a data governance organization within IT that is governing supply chain?

A: No, we have a data governance team within supply chain. Supply chain staff who used to be called “content managers” are now “data stewards.” They were already doing the stewardship work of defining data, its use, its source and its quality, but it wasn't a formally recognized part of their jobs; now it is. Armed with Informatica Data Director, they are managing the quality of supply chain data across four domains: suppliers/vendors, locations, contracts and items. Data from each of these domains resides in our EMR, our ERP applications and our ambulatory EMR/practice management application, creating redundancy and manual reconciliation effort.

By adding Master Data Management (MDM) to the architecture, we were able to centralize management of master data about suppliers/vendors, items, contracts and locations, augment this data with enrichment data like that from D&B, reduce redundancy and reduce manual effort.

MDM shares this complete and accurate information with the enterprise data warehouse, where we can run analytics against it. Having a confident, complete view of master data allows us to trust the analytical insights revealed through the P2P dashboard.

Q: What lessons learned would you offer?

A: Having recognized operational value, I'd encourage health systems to focus on a data-driven supply chain, because there are savings opportunities through easier identification of unmanaged spend.

I really enjoyed learning more about this project with valuable, tangible and nearly immediate results. I will keep you posted as the customer moves on to the next phase. If you have comments or questions, leave them here.

Data Integration Webinar Follow-Up: By Our First Strange and Fatal Interview

How to Maximize Value of Data Management Investments

This is a guest author post by Philip Howard, Research Director, Bloor Research.

I recently posted a blog about an interview-style webcast I was doing with Informatica on the uses and costs associated with data integration tools.

I’m not sure that the poet John Donne was right when he said that it was strange, let alone fatal. Somewhat surprisingly, I have had a significant amount of feedback following this webinar. I say “surprisingly” because the truth is that I very rarely get direct feedback. Most of it, I assume, goes to the vendor. So, when a number of people commented to me that the research we conducted was both unique and valuable, it was a bit of a thrill. (Yes, I know, I’m easily pleased).

There were a number of questions that arose as a result of our discussions. Probably the most interesting was whether moving data into Hadoop (or some other NoSQL database) should be treated as a separate use case. We certainly didn't include it as such in our original research. In hindsight, I'm not sure that the answer I gave at the time was fully correct. I acknowledged that you certainly need some different functionality to integrate with a Hadoop environment, that some vendors have more comprehensive capabilities than others when it comes to Hadoop, and that the same also applies (but with different suppliers) when it comes to integrating with, say, MongoDB, Cassandra or graph databases. However, as I pointed out in my previous blog, functionality is ephemeral. And just because a particular capability isn't supported today doesn't mean it won't be supported tomorrow. So that doesn't really affect use cases.

However, where I was inadequate in my reply was that I only referenced Hadoop as a platform for data warehousing, stating that moving data into Hadoop was not essentially different from moving it into Oracle Exadata or Teradata or HP Vertica. And that’s true. What I forgot was the use of Hadoop as an archiving platform. As it happens we didn’t have an archiving use case in our survey either. Why not? Because archiving is essentially a form of data migration – you have some information lifecycle management and access and security issues that are relevant to archiving once it is in place but that is after the fact: the process of discovering and moving the data is exactly the same as with data migration. So: my bad.

Aside from that little caveat, I quite enjoyed the whole event. Somebody or other (there's always one!) didn't quite get how quantifying the number of end points in a data integration scenario serves as a surrogate measure for complexity (something we took into account), so I had to explain that. Of course, it's not perfect as a metric, but the only alternative is to ask eye-of-the-beholder questions, which aren't very satisfactory.

Anyway, if you want to listen to the whole thing, you can find it HERE.

Data First: Five Tips To Reduce the Risk of A Breach

Reduce the Risk of A Breach

This article was originally published on www.federaltimes.com

November – that time of the year. This year, November 1 was the start of Election Day weekend and the associated endless barrage of political ads. It also marked the end of Daylight Saving Time. But, perhaps more prominently, it marked the beginning of the holiday shopping season. Winter holiday decorations erupted in stores even before Halloween decorations were taken down. There were commercials and ads, free shipping on this, sales on that, singing, and even the first appearance of Santa Claus.

However, it’s not all joy and jingle bells. The kickoff to this holiday shopping season may also remind many of the countless credit card breaches at retailers that plagued last year’s shopping season and beyond. The breaches at Target, where almost 100 million credit cards were compromised, Neiman Marcus, Home Depot and Michael’s exemplify the urgent need for retailers to aggressively protect customer information.

In addition to the holiday shopping season, November also marks the next round of open enrollment for the ACA healthcare exchanges. Therefore, to avoid falling victim to the next data breach, government organizations, as much as retailers, need to keep data security top of mind.

According to the New York Times (Sept. 4, 2014), “for months, cyber security professionals have been warning that the healthcare site was a ripe target for hackers eager to gain access to personal data that could be sold on the black market. A week before federal officials discovered the breach at HealthCare.gov, a hospital operator in Tennessee said that Chinese hackers had stolen personal data for 4.5 million patients.”

Acknowledging the inevitability of further attacks, companies and organizations are taking action. For example, the National Retail Federation created the NRF IT Council, which is made up of 130 technology-security experts focused on safeguarding personal and company data.

Is government doing enough to protect personal, financial and health data in light of these increasing and persistent threats? The quick answer: no. The federal government as a whole is not meeting the data privacy and security challenge. Reports of cyber attacks and breaches are becoming commonplace, and warnings of new privacy concerns in many federal agencies and programs are being discussed in Congress, Inspector General reports and the media. According to a recent Government Accountability Office report, 18 out of 24 major federal agencies in the United States reported inadequate information security controls. Further, FISMA and HIPAA are falling short, and antiquated security protocols, such as encryption, are also not keeping up with the sophistication of attacks. Government must follow the lead of industry and look to new and advanced data protection technologies, such as dynamic data masking and continuous data monitoring, to prevent and thwart potential attacks.
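To make the data masking idea concrete, here is a minimal sketch, with invented field names and roles, of returning role-appropriate views of a record instead of raw values (real masking products enforce this dynamically at the query layer, not in application code):

```python
import re

def mask_record(record, role):
    """Return a copy of the record with sensitive fields masked for most roles."""
    masked = dict(record)
    if role != "claims_auditor":  # hypothetical privileged role
        # keep only the last four digits of the card number
        masked["card"] = re.sub(r"\d(?=\d{4})", "*", record["card"])
        masked["ssn"] = "***-**-" + record["ssn"][-4:]
    return masked

patient = {"name": "J. Doe", "card": "4111111111111111", "ssn": "123-45-6789"}
print(mask_record(patient, "support_rep"))
# card -> "************1111", ssn -> "***-**-6789"
```

The point is that the raw values never leave the data layer for roles that do not need them, which shrinks the blast radius of a breach.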

These five principles can be implemented by any agency to curb the likelihood of a breach:

  1. Expand the appointment and authority of CSOs and CISOs at the agency level.

  2. Centralize the agency's data privacy policy definition and implement it at the enterprise level.

  3. Protect all environments, from development to production, including backups and archives.

  4. Prioritize data and application security at the same level as network and perimeter security.

  5. Ensure data security follows data through downstream systems and reporting.

So, as the season of voting, rollbacks, online shopping events, free shipping, Black Friday, Cyber Monday and healthcare enrollment begins, so does the time for protecting personally identifiable information, financial information, credit cards and health information. Individuals, retailers, industry and government need to think about data first and stay vigilant and focused.

Decrease Salesforce Data Prep Time With Project Springbok

Account executives update opportunities in Salesforce all the time. As opportunities close, payment information is received in the financial system. Normally, they spend hours trying to combine the two sets of data to prepare it for differential analysis. Often, there is a prolonged, back-and-forth dialogue with IT. This takes time and effort, and can delay the sales process.

What if you could spend less time preparing your Salesforce data and more time analyzing it?

Informatica has a vision to solve this challenge by providing self-service data to non-technical users. Earlier this year, we announced our Intelligent Data Platform. One of the key projects in the IDP, code-named “Springbok”, uses an Excel-like search interface to let business users find and shape the data they need.

Informatica’s Project Springbok is a faster, better and, most importantly, easier way to intelligently work with data for any purpose. Springbok guides non-technical users through a data preparation process in a self-service manner. It makes intelligent recommendations and suggestions, based on the specific data they’re using.

To see this in action, we welcome you to join us as we partner with Halak Consulting, LLC for an informative webinar. The webinar will take place on November 18th at 10am PST. You will hear from the Springbok VP of Strategy and from an experienced Springbok user about how Springbok can benefit you.

So REGISTER for the webinar today!

For another perspective, see the post “Imagine Not Needing to do a VLookup ever again!” from Deepa Patel, Salesforce.com MVP.

What is the Silver Lining in Cloud for Financial Services?

This was a great week of excitement and innovation here in San Francisco. It started with the San Francisco Giants winning the National League Pennant for the 3rd time in 5 years, on the same day that Salesforce's Dreamforce 2014, their largest customer conference yet, wrapped up with over 140,000 attendees from all over the world talking about the new Customer Success Platform.

Salesforce has come a long way from its humble beginnings as the new kid on the cloud front for CRM. The integrated sales, marketing, support, collaboration, application and analytics capabilities of the Salesforce Customer Success Platform exemplify innovation and significant business value for various industries, and I see it as especially promising for today's financial services industry. However, like any new business application, the value a business gains from it depends on having the right data available.

The reality is that SaaS adoption by financial institutions has not been as quick as in other industries, due to privacy concerns; regulations that govern what data can reside in public infrastructures; the ability to customize to fit business needs; cultural barriers within larger institutions that insist critical business applications reside on-premise for control and management purposes; and the challenges of integrating data to and from existing systems with SaaS applications. However, experts are optimistic that the industry may have turned the corner. Gartner (NYSE:IT) asserts more than 60 percent of banks worldwide will process the majority of their transactions in the cloud by 2016. Let's take a closer look at some of the challenges, and what's required to overcome these obstacles when adopting cloud solutions to power your business.

Challenge #1: Integrating and sharing data between SaaS and on-premise systems must not be taken lightly

Most banks and insurance companies considering new SaaS-based CRM, marketing and support applications, with solutions from Salesforce and others, must weigh the importance of migrating and sharing data between cloud and on-premise applications in their investment decisions. Migrating existing customer, account and transaction history data is often done by IT staff through custom extracts, scripts and manual data validations, which can carry invalid information over from legacy systems, making these new application investments useless in many cases.

For example, customer type descriptions from one or many existing systems may each be correct in their respective databases, and collapsing them into a common field in the target application seems easy to do. Unfortunately, these transformation rules can be complex, and that complexity increases when dealing with tens if not hundreds of applications during the migration and synchronization phases. Having capable solutions to support the testing, development, quality management, validation and delivery of existing data from old systems to new is not only good practice, but a proven way of avoiding costly workarounds and business pain in the future.
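As a hedged illustration of such a transformation rule, here is a sketch that collapses invented, source-specific customer-type codes into one target value, and refuses to silently load codes that no rule covers:

```python
# Mapping table: (source system, source code) -> target customer type.
# Systems and codes are invented for illustration.
CUSTOMER_TYPE_MAP = {
    ("CRM", "IND"): "Individual",
    ("CRM", "ORG"): "Organization",
    ("BILLING", "P"): "Individual",
    ("BILLING", "C"): "Organization",
}

def to_target_type(source_system, source_code):
    """Collapse a source-specific code into the common target field."""
    try:
        return CUSTOMER_TYPE_MAP[(source_system, source_code.strip().upper())]
    except KeyError:
        # route unmapped codes to data-quality review instead of loading bad data
        raise ValueError(f"unmapped customer type {source_code!r} from {source_system}")

print(to_target_type("BILLING", "p"))  # -> Individual
```

Even this toy version shows why validation matters: multiply the mapping table by hundreds of applications and dozens of fields, and unmanaged rules become a migration risk.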

Challenge #2: Managing and sharing a trusted source of business information across the enterprise

As new SaaS applications are adopted, it is critical to understand how to best govern and synchronize common business information, such as customer contact information (e.g. address, phone, email), across the enterprise. Most banks and insurance companies have multiple systems that create and update critical customer contact information, many of which reside on-premise. For example, when an insurance customer updates contact information such as a phone number or email address while filing a claim, the claims specialist will often enter the update only in the claims system, given the siloed nature of many traditional banking and insurance companies. This is where Master Data Management comes in: it is purpose-built to identify changes to master data, including customer records, in one or many systems, update the master record, and share that update across the other systems that house the data, which is essential for business continuity and success.
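The propagation pattern just described can be sketched as a toy publish-and-subscribe loop; the system names and fields here are invented for illustration, and a real MDM hub also matches and merges records rather than just copying fields:

```python
# Golden records keyed by a master customer id, plus the systems holding copies.
master = {"cust-42": {"phone": "555-0100"}}
subscribers = {"claims": {}, "crm": {}, "billing": {}}

def publish_update(cust_id, field, value, source_system):
    """Capture a change from one system, update the master, and sync the rest."""
    master[cust_id][field] = value                 # update the golden record
    for name, system in subscribers.items():
        if name != source_system:                  # the source already has the value
            system.setdefault(cust_id, {})[field] = value

# A claims specialist records a new phone number during a claim:
publish_update("cust-42", "phone", "555-0199", "claims")
print(subscribers["crm"]["cust-42"]["phone"])  # -> 555-0199
```

Without this hub-and-spoke step, the update stays trapped in the claims silo, which is exactly the scenario the paragraph above describes.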

In conclusion, SaaS adoption will continue to grow in financial services and across other industries. The silver lining in the cloud is your data, and the technology that supports its consumption and distribution across the enterprise. Banks and insurance companies investing in new SaaS solutions will operate in a hybrid environment made up of cloud applications and core transaction systems that reside on-premise. To ensure these investments yield value, it is important to invest in a capable and scalable data integration platform to integrate, govern and share data in a hybrid ecosystem. To learn more about how to deal with these challenges, click here and download a complimentary copy of the new “Salesforce Integration for Dummies”.

Go On, Flip Your Division of Labor: More Time Analyzing and Less Time Prepping Data

Are you in sales operations or marketing operations, or are you a sales representative, sales manager or marketing professional? If so, it's no secret that you benefit greatly from the power of performing your own analysis, at your own rapid pace. When you have a hunch, you can easily test it out by visually analyzing data in Tableau without involving IT. When you are faced with tight timeframes in which to gain business insight from data, being able to do it yourself in the time you have available, and without technical roadblocks, makes all the difference.

Self-service business intelligence is powerful! However, we all know it can be even more powerful. When putting together an analysis, you typically spend about 80% of your time assembling data and just 20% of your time analyzing it to test your hunch or gain your business insight. You don't need to accept this anymore. We want you to know that there is a better way!

We want you to flip your division of labor: spend more than 80% of your time analyzing data to test your hunch or gain your business insight, and less than 20% of your time assembling data for your Tableau analysis! That's right. You like it. No, you love it. No, you are ready to run laps around your chair in sheer joy!! And you should feel this way. You can now spend more time on the higher-value activity of gaining business insight from the data, and even find copious time to spend with your family. How's that?

Project Springbok is a visionary new product designed by Informatica with the goal of making data access and data quality obstacles a thing of the past. Springbok guides non-technical users through data preparation in a self-service manner. It is meant for the Tableau user: a data person who would rather spend their time visually exploring information and finding insight than struggling with complex calculations or waiting for IT. Project Springbok allows you to put together your data rapidly for subsequent analysis in Tableau, and it can even tell you things about your data that you may not have known, through Intelligent Suggestions it presents to the user.

Let’s take a quick tour:

  • Project Springbok tells you that you have a date column and that you likely want the Year and Quarter for your analysis (Fig 1). If you so wish, with a single click, voila: you have your corresponding years and even the quarters. And it all happens in mere seconds, a far cry from the 45 minutes it would take a fluent Excel user with VLOOKUPs.

Fig. 1

VALUE TO A MARKETING CAMPAIGN PROFESSIONAL: Rapidly validate and accurately complete your segmentation list, before you analyze your segments in Tableau. Base your segments on trusted data that did not take you days to validate and enrich.
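The Year/Quarter suggestion above amounts to a simple derivation. Here is a stdlib Python sketch, with invented column names, of what that one click produces:

```python
from datetime import date

# Hypothetical rows with a date column, as Springbok might detect them.
rows = [{"opportunity": "A-100", "closed": date(2014, 2, 14)},
        {"opportunity": "B-200", "closed": date(2014, 11, 3)}]

# Derive Year and Quarter columns in one pass over the rows.
for row in rows:
    row["year"] = row["closed"].year
    row["quarter"] = "Q%d" % ((row["closed"].month - 1) // 3 + 1)

print([(r["year"], r["quarter"]) for r in rows])  # -> [(2014, 'Q1'), (2014, 'Q4')]
```

The arithmetic is trivial; the value lies in the tool spotting the date column and offering the derivation unprompted.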

  • Then Project Springbok will tell you that you have two datasets that could be joined on a common key, email for example, present in each dataset, and ask whether you would like to join them (Fig 2). If you agree with Project Springbok's suggestion, voila: the datasets are joined in a few seconds. Again, a far cry from the 45 minutes it would take a fluent Excel user with VLOOKUPs.

Fig. 2

VALUE TO A SALES REPRESENTATIVE OR SALES MANAGER: You can now access your Salesforce.com data (Fig 3) and effortlessly combine it with ERP data to understand your true quota attainment. Never miss quota again due to a revenue split, be it territory or otherwise. Best of all, keep your attainment dataset refreshed and even know exactly which data point changed when your true attainment changes.

Fig. 3
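The suggested join can be sketched in plain Python; the field names (Salesforce opportunity amount, ERP invoiced amount, shared email key) are illustrative assumptions:

```python
# Hypothetical records from the two systems, sharing an email key.
sfdc = [{"email": "ann@example.com", "opportunity": 50000}]
erp = [{"email": "ANN@example.com", "invoiced": 48000}]

# Index the ERP side by normalized email, then join the Salesforce rows to it.
erp_by_email = {row["email"].lower(): row for row in erp}
joined = [{**s, **erp_by_email[s["email"].lower()]}
          for s in sfdc if s["email"].lower() in erp_by_email]

print(joined[0]["opportunity"] - joined[0]["invoiced"])  # -> 2000, the attainment gap
```

Normalizing the key before matching (here, lowercasing) is the part spreadsheets quietly get wrong, and one reason tool-suggested joins save time.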

  • Then, if you want, Project Springbok will tell you that you have emails in the dataset, which you may or may not have known, but more importantly it will ask whether you wish to determine which emails can actually be mailed to. If you proceed, not only will Springbok check each email for correct structure (Fig 4), but it will very soon also determine whether the email is active and one you can expect a response from. How long would that have taken you to do?

VALUE TO A TELESALES REPRESENTATIVE OR MARKETING EMAIL CAMPAIGN SPECIALIST: Ever thought you had a great email list and then found out most emails bounced? Now, confidently determine which addresses you will truly be able to email before you send the message. Email prospects who you know are actually at the company, and be confident you have their correct email addresses. You can then easily push the dataset into Tableau to analyze trends in email list health.

Fig. 4

And, in case you were wondering, there is no training or installation required for Project Springbok. The 80% of your time you used to spend on data preparation shrinks considerably, and that is after using only a few of Springbok's capabilities. One more thing: you can even export directly from Project Springbok into Tableau via the “Export to Tableau TDE” menu item (Fig 5). Project Springbok creates a Tableau TDE file; just double-click it to open Tableau and test your hunch or gain your business insight.

Fig. 5

Here are some other things you should know, to convince you that you, too, can spend no more than 20% of your time putting together data for your subsequent Tableau analysis:

  • Springbok Sign-Up is Free
  • Springbok automatically finds problems with your data, and lets you fix them with a single click
  • Springbok suggests useful ways for you to combine different datasets, and lets you combine them effortlessly
  • Springbok suggests useful summarizations of your data, and lets you follow through on the summarizations with a single click
  • Springbok allows you to access data from your cloud or on-premise systems with a few clicks, and it automatically keeps the data refreshed. It will even tell you what data changed since the last time you saw it
  • Springbok allows you to collaborate by sharing your prepared data with others
  • Springbok easily exports your prepared data directly into Tableau for immediate analysis. You do not have to tell Tableau how to interpret the prepared data
  • Springbok requires no training or installation

Go on. Shift your division of labor in the right direction, fast. Sign up for Springbok and stop wasting precious time on data preparation. http://bit.ly/TabBlogs

———-

Are you going to be at Dreamforce this week in San Francisco? Interested in seeing Project Springbok working with Tableau in a live demonstration? Visit the Informatica or Tableau booths and see the power of these two solutions working hand in hand. Informatica is at Booth #N1216 and Booth #9 in the Analytics Zone. Tableau is located in Booth N2112.

The Catalog is Dead – Long Live the Catalog?

The Catalog is Dead.

Print solution provider Werk II came up with a provocative marketing campaign in 2012. Their ads were designed like an obituary notice for the “Main Catalog”, which is “no longer with us”…

According to the Multi Channel Merchant Outlook 2014 survey, the eCommerce website (not a surprise ;-) ) is the top channel through which merchants market (90%). The social media (87.2%) and email (83%) channels follow close behind. Although catalogs may have declined as a marketing tool, 51.7% of retailers said they still use the catalog to market their brands.

importance of channels chart

Source: MCM Outlook 2014

The Changing Role of the Catalog

Merchants are still using catalogs to sell products. However, their role has changed from transactional vehicle to sales tool. On a scale of 1 to 10, with 10 being the most important, merchant respondents said that using catalogs as mobile traffic drivers and customer retention tools were the most important activities (both scored 8.25). At 7.85, web traffic driver was a close third.

methods of prospecting chart

Source: MCM Outlook 2014

Long Live the Catalog: Prospecting 

More than three-quarters of merchant respondents (77.7%) said catalogs were their top choice of prospecting method for the next 12 months. Catalog was the most popular answer, followed by Facebook (68%), email (66%), Twitter (42.7%) and Pinterest (40.8%).

What is your point of view?

How have catalogs changed in your business? What are your plans and outlook for 2015? It would be very interesting to hear points of view from different industries and countries… I'd be happy to discuss here or on Twitter @benrund. My favorite fashion retailer keeps sending me a stylish catalog, which makes me order online. Brands, retailers, consumers – how do you act, and what do you expect?

In a Data First World, IT must Empower Business Change!

You probably know this already, but I'm going to say it anyway: it's time you changed your infrastructure. I say this because most companies are still running infrastructure optimized for ERP, CRM and other transactional systems. That's all well and good for running IT-intensive, back-office tasks. Unfortunately, this sort of infrastructure isn't great for today's business imperatives of mobility, cloud computing and Big Data analytics.

Virtually all of these imperatives are fueled by information gleaned from potentially dozens of sources to reveal our users’ and customers’ activities, relationships and likes. Forward-thinking companies are using such data to find new customers, retain existing ones and increase their market share. The trick lies in translating all this disparate data into useful meaning. And to do that, IT needs to move beyond focusing solely on transactions, and instead shine a light on the interactions that matter to their customers, their products and their business processes.

They need what we at Informatica call a “Data First” perspective. You can check out my first blog about being Data First here.

A Data First point of view changes everything, from product development to business processes to how IT organizes itself and, most especially, the impact IT has on your company's business. That's because cloud computing, Big Data and mobile app development shift IT's responsibilities away from running and administering equipment, onto aggregating, organizing and improving the myriad data types pulled in from internal and external databases, online posts and public sources. And that shift makes IT a more empowering force for business change. Think about it: the ability to connect and relate the dots across data from multiple sources finally gives you real power to improve entire business processes, departments and organizations.

I like to say that the role of IT is now “big I, little t,” with the lowercase “t” representing both technology and transactions. But that role requires a new set of priorities. They are:

  1. Think about information infrastructure first and application infrastructure second.
  2. Create great data by design. Architect for connectivity, cleanliness and security. Check out the eBook Data Integration for Dummies.
  3. Optimize for speed and ease of use – SaaS and mobile applications change often. Click here to try Informatica Cloud for free for 30 days.
  4. Make data a team sport. Get tools into your users’ hands so they can prepare and interact with the data.

I never said this would be easy, and there’s no blueprint for how to go about doing it. Still, I recognize that a little guidance will be helpful. In a few weeks, Informatica’s CIO Eric Johnson and I will talk about how we at Informatica practice what we preach.

Posted in B2B, B2B Data Exchange, Big Data, Business Impact / Benefits, Data Integration, Data Security, Data Services, Enterprise Data Management

Right Product, Right Customer, Right Place – The Informed Purchase Journey

The Informed Purchase Journey

The way we shop has changed. It’s hard to keep up with customer demands in a single channel, much less many. Selling products has changed too, and always will. The video below shows how today’s customer takes The Informed Purchase Journey:

“Customers expect a seamless experience that makes it easy for them to engage at every touchpoint on their ‘decision journey.’ Informatica PIM is a key component in the transformation from a product-centric view to consumer-experience-driven marketing, with more efficiency.” – Heather Hanson, Global Head of Marketing Technology at Electrolux

Selling products today is:

  • Shopper-controlled. It’s never been easier for consumers to compare products and prices. This has eroded old customer loyalty and means you have to earn every sale.
  • Global. If you’re selling your products in different regions, you’re facing complex localization and supply chain coordination.
  • Fast. Product lifecycles are short. Time-to-market is critical (and gets tougher the more channels you’re selling through).
  • SKU-heavy. Endless-aisle assortments are great for margins. That’s a huge opportunity, but product data overload due to the large number of SKUs and their attributes adds up to a huge admin burden.
  • Data driven. Product data alone is more than a handful to deal with. But you also need to know as much about your customers as you know about your products. And the explosion of channels and touch points doesn’t make it any easier to connect the dots.

Conversion Power – From Deal Breaker To Deal Maker

For years, a customer’s purchase journey was something of “An Unexpected Journey.” Lack of insight into the journey was a struggle for retailers and brands. The journey is fraught with more questions about product than ever before, even for fast moving consumer goods.

Consumer behavior and the role of product information have changed with the advent of high bandwidth and social buying. To understand this shift, let’s examine the way shoppers buy today.

  • According to Google’s Zero Moment of Truth (ZMOT) research, shoppers consult 10.4 sources on average.
  • Mobile shoppers who view customer content such as reviews show a 133% higher conversion rate.
  • Digital devices will influence 50% of in-store purchase behavior by the end of 2014 (Deloitte’s Digital Divide).

How Informatica PIM 7.1 turns information from deal breaker to deal maker

PIM 7.1 comes with new data quality dashboards, helping users such as category managers, marketing copywriters, managers and ecommerce specialists do the right things. The quality dashboards point users to what they have to do next in order to get the data right, out and ready for sales.
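At its core, a data quality dashboard like this is a scoring function over product records plus a prioritized worklist. Here is a minimal sketch in Python; the field names and weights are invented for illustration and are not Informatica PIM’s actual schema:

```python
# Sketch: per-record completeness scoring, the kind of metric a data
# quality dashboard surfaces to point users at what to fix next.
# REQUIRED_FIELDS maps an (invented) field name to its importance weight.

REQUIRED_FIELDS = {"sku": 3, "title": 3, "description": 2, "image_url": 2, "price": 3}

def completeness(record: dict) -> float:
    """Return a weighted completeness score between 0.0 and 1.0."""
    total = sum(REQUIRED_FIELDS.values())
    filled = sum(w for f, w in REQUIRED_FIELDS.items() if record.get(f))
    return filled / total

def next_actions(record: dict) -> list:
    """List missing fields, heaviest weight first: the 'do this next' hint."""
    missing = [(f, w) for f, w in REQUIRED_FIELDS.items() if not record.get(f)]
    return [f for f, w in sorted(missing, key=lambda p: (-p[1], p[0]))]

record = {"sku": "A-100", "title": "Cordless Drill", "price": 89.99}
# completeness(record) -> 9/13, about 0.69
# next_actions(record) -> ['description', 'image_url']
```

Sorting the missing fields by weight is what turns a raw quality score into an actionable worklist for a category manager.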

Eliminate Shelf Lag: The Early Product Closes the Sale

For vendors, this effectively means time-to-market: the availability of a product plus the time it takes to collect all relevant product information so you can display it to the customer (product introduction time).
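That definition reduces to a simple calculation: shelf lag is the gap between a product being physically available and its content being display-ready. A small illustrative sketch, with function and field names invented rather than taken from any Informatica product:

```python
# Sketch: computing "shelf lag" per product -- the days between physical
# availability and the product content being ready to display.
from datetime import date

def shelf_lag_days(available: date, content_ready: date) -> int:
    """Days a sellable product sat undisplayable for lack of product content."""
    return max((content_ready - available).days, 0)

# A product available May 1 whose copy, images and attributes were only
# complete on May 19 lost 18 selling days:
lag = shelf_lag_days(date(2014, 5, 1), date(2014, 5, 19))  # -> 18
```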

The biggest threat is not the competition – it’s your own time-consuming, internal processes. We call this Shelf Lag, and it’s a big inhibitor of retailer profits. Here’s why:

  • You can’t sell what you can’t display.
  • Be ready to spin up new channels.
  • Watch your margins.

How Informatica PIM 7.1 speeds up product introduction and customer experience

“By 2017… customer experience is what buyers are going to use to make purchase decisions.” (Source: Gartner’s Hype Cycle for E-Commerce, 2013) PIM 7.1 comes with new editable channel previews. These help business users such as marketers, translators, merchandisers and product managers envision how the product looks in the customer-facing webshop, catalog or other touchpoint. Getting products live online within seconds is key, because the customer always wants it now. For eCommerce product data, Informatica PIM is certified for IBM WebSphere Commerce to get products ready for ecommerce within seconds.

The editable channel previews help professionals in product management, merchandizing, marketing and ecommerce envision their products as customers see them. This “what you see is what you get” (WYSIWYG) approach to product data management improves the customer shopping experience with the best and most authentic information. With the new eCommerce integration, Informatica speeds up time to market in eBusiness. The new standard (certified by IBM WebSphere Commerce) enables a live update of eShops with real-time integration.

The growing need for fast and secure collaboration across globally acting enterprises is addressed by Informatica’s Business Process Management tool, which can now be used by PIM customers.

Intelligent insights: How relevant is our offering to your customers?

This is the age of annoyance and information overload. Each day, the average person has to handle more than 7,000 pieces of information. Only 25% of Americans say they are brand loyal. That means brands and retailers have to earn every new sale in a transparent world. In this context, information needs to be relevant to the recipient.

  • Where does the data come from? How can product information be auto-cleansed and classified into a taxonomy?
  • Is supplier performance hitting our standards?
  • How can we mitigate risks like hidden costs and work only with trusted suppliers?
  • How can we build customer segmentation for marketing?
  • How can we build product personalization and predict the customer’s next logical buy?
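The first question above, auto-classifying product information into a taxonomy, can be illustrated with a deliberately simple keyword-rule classifier. The taxonomy nodes and rules below are invented for illustration and are far cruder than a real PIM classification engine:

```python
# Sketch: keyword-rule classification of product titles into taxonomy nodes.
# Both the taxonomy and the keyword rules are invented examples, not a real
# GS1/GPC hierarchy or Informatica's classification logic.

TAXONOMY_RULES = {
    "Food > Snacks": ["chips", "crisps", "pretzel"],
    "Food > Beverages": ["juice", "soda", "water"],
    "Office > Paper": ["notebook", "copy paper", "envelope"],
}

def classify(title: str, default: str = "Unclassified") -> str:
    """Return the first taxonomy node whose keyword appears in the title."""
    lowered = title.lower()
    for node, keywords in TAXONOMY_RULES.items():
        if any(kw in lowered for kw in keywords):
            return node
    return default

# classify("Organic Apple Juice 1L") -> "Food > Beverages"
# classify("Mystery Gadget")         -> "Unclassified"
```

A production engine would add cleansing (spelling, units, casing) before matching and use richer signals than title keywords, but the shape of the problem, i.e. mapping noisy records onto a controlled hierarchy, is the same.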

It is all about The Right Product. To the Right Person. In the Right Way. Learn more about the vision of the Intelligent Data Platform.

Informatica PIM Builds the Basis of Real Time Commerce Information

All these innovations massively speed up new product introduction and collaboration. As buyers today are always online and connected, PIM helps our customers serve the informed purchase journey with the right information, at the right touchpoint, in real time.

  1. Real-time commerce (certification with IBM WebSphere Commerce), which eliminates shelf lag
  2. Editable channel previews, which help to envision how customers view the product
  3. Data quality dashboards for improved conversion power, which means selling more with better information
  4. Business Process Management for better collaboration throughout the enterprise
  5. Accelerator for global data synchronization (GDSN, e.g. GS1 for food and CPG), which helps to improve data quality and fulfill legal requirements

All this makes merchandizers more productive and increases average spend per customer.

Find out how the new release of Informatica PIM 7.1 helps you to unleash conversion power on the customer’s informed purchase journey.

Posted in B2B, B2B Data Exchange, CMO, Manufacturing, Master Data Management, PiM, Product Information Management, Retail

How GS1 and PIM Help to Fulfill Legal Regulations and Feed Distribution Channels

Manufacturers and retailers are constantly being challenged by the market. They continually seek ways to optimize their business processes and improve their margins, and they face a number of challenges, including the following:

  • Delays in getting products ordered
  • Delays in getting products displayed on the shelf
  • Out of stock issues
  • Constant pressure to comply with how information is exchanged with local partners
  • Pressure to comply with how information is exchanged with international distribution partners

Recently, new regulations have been mandated by governing bodies. These bodies include the US Food and Drug Administration (FDA) as well as European Union (EU) entities. One example of these regulations is EU Regulation 1169/2011. This regulation focuses on nutrition and contents for food labels.

How much would it mean to a supplier if they could reduce their “time to shelf?” What would it mean if they could improve their order and item administration?

GS1 and PIM

If you’re a supplier, and if these improvements would benefit you, you’ll want to explore solutions. In particular, you’d benefit from a solution which could do the following:

  • Make your business available to the widest possible audience, both locally and internationally
  • Eliminate the need to build individual “point to point” interfaces
  • Provide the ability to communicate both “one on one” with a partner and broadly with others
  • Eliminate product data inconsistencies
  • Improve data quality
  • Improve productivity

One such solution that can accomplish these things is Informatica’s combination of PIM and GDSN.

For Manufacturers

Manufacturers of CPG or food products have to adhere to strict compliance regulations. The new EU Regulation 1169/2011 on the provision of food information to consumers changes existing legislation on food labeling. The new rules take effect on December 13, 2014; the obligation to provide nutrition information will apply from December 13, 2016. The US Food & Drug Administration (FDA) enforces record keeping and Hazard Analysis & Critical Control Points (HACCP) requirements.

In addition, information standards are a key factor in feeding distributors and retailers, as our customer Vitakraft says:

“For us as a manufacturer of pet food, the retailers and distributors are key distribution channels. With the GS1 Accelerator for Informatica PIM we connect with the Global Data Synchronization Network (GDSN). Leveraging GDSN we serve our retail and distribution partners with product information for all sales channels. Informatica helps us to meet the expectations of our business partners and customers in the e-business.”

Heiko Cichala, Product & Electronic Data Interchange Management

For Retailers

On one side, retailers such as supermarkets expect their vendors and manufacturers to provide all legally required information; on the other, they are looking for strategies to leverage information for better customer service and experience (check out “the supermarket of tomorrow”).

Companies like the German food retailer Edeka offer an app for push marketing, or help match customers’ dietary or allergy profiles with QR-code-scanned products on the shopping list within the supermarket app.

The Informatica GS1 Accelerator

GS1, GDSN for PIM

The GS1 Accelerator from Informatica offers suppliers and manufacturers the capability to ensure their data is not only of high quality but also conforms to GS1 standards. The Informatica GDSN accelerator makes it possible to provide this high-quality data directly to a certified data pool for synchronization with their trading partners.

The quality of the data can be ensured by the Data Quality rules engine of the PIM system. It leverages the Global Product Classification hierarchy that conforms to GS1 standards for communication with the data pools.
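One concrete, public GS1 rule that any such validation must enforce is the GTIN check digit, defined in the GS1 General Specifications. The sketch below implements that public algorithm in Python; it is independent of, and much simpler than, Informatica’s actual rules engine:

```python
# Sketch: GS1 check digit validation for GTIN-8/12/13/14 codes.
# This implements the public GS1 algorithm, not Informatica's rules engine.

def gtin_check_digit(body: str) -> int:
    """Check digit for a GTIN, given all of its digits except the last one.

    Walking right to left, digits are weighted 3, 1, 3, 1, ...; the check
    digit brings the weighted sum up to the next multiple of ten.
    """
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10

def is_valid_gtin(gtin: str) -> bool:
    """True if the string is a digit-only GTIN with a correct check digit."""
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return False
    return gtin_check_digit(gtin[:-1]) == int(gtin[-1])

# is_valid_gtin("4006381333931") -> True  (a well-known valid EAN-13)
# is_valid_gtin("4006381333932") -> False (wrong check digit)
```

A real rules engine layers many more checks, such as GPC brick codes and mandatory attributes per target market, on top of primitives like this one.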

All GDSN-related activities are encapsulated within PIM and can be initiated from there. The product data can easily be transferred to the data pool and released to a specific trading partner, or made public for all recipients of a Target Market.

Posted in B2B, B2B Data Exchange, PiM, Product Information Management