Tag Archives: supply chain

New Supplier360 View for Optimized Supplier Information Management

If your organization works with a large number of vendors, you know how challenging it is to successfully manage your suppliers’ information across the entire enterprise.

It starts with quickly onboarding the right suppliers at the right location to introduce a new product or initiate a project – a process that can take several weeks if it isn’t automated.

Once suppliers are qualified and onboarded, you need to be able to assess their risk and compliance and analyze their performance. Do my suppliers deliver on time? Do they meet our quality standards? Do we benefit from negotiated corporate discounts and payment terms? Answering these questions requires a trusted Supplier360 view of your supplier relationships worldwide.

To enable organizations to manage vendors and their performance more effectively, negotiate better prices and payment terms, and reduce operating costs and risk, Informatica has just launched the Total Supplier Relationship Application (TSR), a master data-fueled solution that optimizes supplier information management with a Supplier360 view across the enterprise.

Sodexo is a multinational food services and facilities management corporation. Frank Griffith, Senior Director of Integrated Procurement Systems at Sodexo, says: “The Informatica Total Supplier Relationship application will empower Sodexo to manage our supplier master data and vendor items more effectively and streamline our current supply chain processes. Based on one, single trusted view of our vendor and product data, we will be able to better track our vendors’ compliance, and manage vendor performance and certifications, while leveraging the GDSN (Global Data Synchronization Network) for nutritional product attribute data.”

Combining the industry-leading Informatica Master Data Management (MDM) hub with Informatica Data Integration and Informatica Data Quality, TSR is a master data-fueled application with a configurable, business-friendly user interface. It can be enhanced with Informatica Product Information Management (PIM) and Data-as-a-Service for contact data verification and enrichment.

Visit www.informatica.com/tsr to learn more or read the press release.

Posted in Master Data Management

The Supply Chain Impact of Adding an Allergen to a Chocolate Bar

Throughout the lifecycle of a consumable product, many parties are constantly challenged with updating and managing product information such as ingredients and allergens. Since December 13, 2014, this task has become even more complex for companies producing or selling food and beverage products in the European Union due to the new EU 1169/2011 rules. As a result, changing the basic formula of a chocolate bar by adding one allergen, such as nuts, should be carefully considered: it can have a tremendous impact on the complete supply chain if the manufacturer and retailer(s) want to remain compliant with EU regulation 1169/2011 and inform the consumer about the changes to the product.

Let’s say the chocolate bar is available in three (3) varieties: dark, whole milk and white chocolate. Each variety is available in two (2) sizes: normal and mini. The chocolate bar is distributed in twelve (12) countries within the European Union using different packaging. This would require the manufacturer to introduce 72 (3 varieties * 2 sizes * 12 countries) new GTINs at the item level. If the chocolate producer decides to run any package or seasonal promotions, offer multipacks or introduce a new variety, this number would be even higher. The new attributes, including updated information on allergens, have to be modified for each product, and 72 new GTINs have to be generated for the chocolate bars by the chocolate manufacturer’s data maintenance department. Trading partners have to be informed about the modifications, too. Assuming the manufacturer uses the Global Data Synchronization Network (GDSN), the new GTINs will have to be registered along with their new attributes eight weeks before the modified product becomes available. In addition, packaging hierarchies have to be taken into consideration as well. If each item has an associated case and pallet, the number of updates adds up to 216 (72 GTINs * 3 packaging levels).
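
For readers who want the arithmetic spelled out, here is a minimal sketch of the calculation behind those numbers; the variety, size, country and packaging-level counts are simply the ones assumed in this example:

```python
# Illustrative calculation of the GTIN counts from the chocolate bar example.
varieties = 3          # dark, whole milk, white
sizes = 2              # normal, mini
countries = 12         # EU target markets with distinct packaging
packaging_levels = 3   # item, case, pallet

new_item_gtins = varieties * sizes * countries          # 72 new item-level GTINs
total_gdsn_updates = new_item_gtins * packaging_levels  # 216 updates in total

print(f"New item GTINs: {new_item_gtins}, total updates: {total_gdsn_updates}")
```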

Managing the updates related to product modifications results in high administrative and other costs. Trading partners across the supply chain report a significant impact on their annual sales and costs. According to GS1 Europe, one retailer reported 6,000-8,000 GTIN changes per year, leading to 2-3% of additional administrative and support costs. If the GTIN had to change for every new minor variant of a product, they forecast the number of changes per year could rise to 20,000-25,000, leading to a significant further increase in administrative and support costs.

The change to the chocolate bar’s recipe also means that a corresponding change to the mandatory product data displayed on the label is required. For the online retailer(s) selling the chocolate bar, this means they have to update the information displayed in their online shops. Considering that a retailer has to deliver exactly the product that is displayed in its online shop, there will be a period of time when the old version of the product and the new version coexist in the supply chain. During this period it is not possible for the retailer to know whether the version of the product ordered on a website will be available at the time and place the order is picked.

GS1 Europe suggests handling this issue as follows: retailers working to GS1 standards use GTINs to pick online orders. If the modified chocolate bar with nuts is given a new GTIN, it increases the likelihood that the correct variant can be made available for picking, and even if it is not available at the pick point, the retailer can recognize automatically that the version being picked differs from the version that was ordered. In the latter case, the product can be offered as a substitute when the goods are delivered, and the consumer can choose whether or not to accept it. On its website, GS1 provides comprehensive information on how to comply with the new European Food Information Regulation.
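
As a rough illustration of that comparison, here is a minimal sketch assuming hypothetical order and pick records; the function and field values are invented for the example and are not part of any GS1 specification:

```python
# Toy GTIN comparison at picking time; illustrative only.
def check_picked_item(ordered_gtin: str, picked_gtin: str) -> str:
    if picked_gtin == ordered_gtin:
        return "exact match: deliver as ordered"
    # A different GTIN signals a different product version (e.g. the new
    # recipe with nuts), so it must be offered to the consumer as a substitute.
    return "different variant picked: offer as a substitute the consumer can accept or reject"

print(check_picked_item("04012345678901", "04012345678918"))
```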

Using the Global Data Synchronization Network (GDSN), suppliers and retailers are able to share standardized product data, cut down the cost of building point-to-point integrations and speed up new product introductions by getting access to the most accurate and most current product information. The Informatica GDSN Accelerator is an add-on to the Informatica Product Information Management (PIM) system that provides an interface to a GDSN-certified data pool. It is designed to help organizations securely and continuously exchange, update and synchronize product data with trading partners according to the standards defined by Global Standards One (GS1). GDSN ensures that data exchanged between trading partners is accurate and compliant with globally supported standards for uniqueness, classification and identification of sources and recipients. Integrated into your PIM system, the GDSN Accelerator lets you exchange high-quality product data with your trading partners via the GDSN.
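
To make the idea concrete, here is a simplified, hypothetical item record of the kind a supplier might publish to a data pool for the modified chocolate bar; the field names are illustrative and do not reflect the actual GS1 message format or the Informatica schema:

```python
# Hypothetical, simplified item record; real GDSN catalogue item messages
# are XML-based and far richer. Field names here are illustrative only.
chocolate_bar_item = {
    "gtin": "04012345678901",           # new GTIN for the recipe with nuts
    "target_market": "DE",              # one record per target market
    "trade_item_description": "Dark chocolate bar with hazelnuts, 100 g",
    "allergens": ["nuts"],              # newly added allergen information
    "ingredients": ["cocoa mass", "sugar", "hazelnuts", "cocoa butter"],
    "packaging_level": "item",          # separate records exist for case and pallet
    "availability_date": "2015-03-01",  # registered ~8 weeks before availability
}
```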

Thanks to automated product data exchange, the effort and costs related to modifying a product, as demonstrated in the chocolate bar example, can be significantly reduced for both manufacturers and retailers. Product data can be easily transferred to the data pool, and you can fully control whether information is shared with a specific trading partner or with all recipients of a target market.

Related blogs:

How GS1 and PIM Help to Fulfill Legal Regulations and Feed Distribution Channels

5 Ways to Comply with the New European Food Information Regulation

Posted in B2B Data Exchange, PIM

Optimizing Supply Chain with Data Governance

Last week I met with a customer who recently completed a few data-fueled supply chain projects. Informatica data integration and master data management are part of the solution architecture. I’m not going to name the client at this early stage, but I want to share highlights from our Q&A.

Q: What was the driver for this project?

A: The project grew out of a procure-to-pay (P2P) initiative. We engaged a consulting firm to help centralize Accounts Payable operations. One required deliverable was an executive P2P dashboard. This dashboard would provide enterprise insights by relying on the enterprise data warehousing and business intelligence platform.

Q: What did the dashboard illustrate?

A: The dashboard integrated data from many sources to provide a single view of information about all of our suppliers. By visualizing this information in one place, we were able to rapidly gain operational insights. There are approximately 30,000 suppliers in the supplier master who manufacture and/or distribute over 150,000 unique products.

Q: From which sources is Informatica consuming data to power the P2P dashboard?

A: There are 8 sources of data:

3 ERP Systems:

  1. Lawson
  2. HBOC STAR
  3. Meditech

5 Enrichment Sources:

  1. Dun & Bradstreet – for associating suppliers together from disparate sources.
  2. GDSN – Global Data Pool for helping to cleanse healthcare products.
  3. McKesson Pharmacy Spend – spend file from a third-party pharmaceutical distributor. Helps capture the detailed pharmacy spend we procure from this third party.
  4. Office Depot Spend – spend file from a third-party office supply distributor. Helps capture detailed office supply spend.
  5. MedAssets – third-party group purchasing organization (GPO) that provides detailed contract pricing.

Q: Did you tackle clinical scenarios first?

A: No. While we certainly have many clinical scenarios we want to explore, like cost per procedure per patient, we knew that we should establish a few quick operational wins to gain traction and credibility.

Q: Great idea – capturing quick wins is certainly the way we are seeing customers have the most success in these transformative projects. Where did you start?

A: We started with supply chain cost containment; increasing pressure on healthcare organizations to reduce cost made this low-hanging fruit the right place to start. There may be as much as 20% waste to be eliminated through strategic and actionable analytics.

Q: What did you discover?

A: Through the P2P dashboard, insights were gained into days to pay on invoices as well as early payment discounts and late payment penalties. With the visualization we quickly saw that we were paying a large amount in late fees. With this awareness, we dug into why the late fees were so high. What we discovered is that, with one large supplier, the original payment terms were net 30, but in later negotiations the terms were changed to 20 days. Late fees were accruing after 20 days. Through this complete view we were able to rapidly home in on the issue and change operations, avoiding costly late fees.
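
As a simple illustration of the kind of check the dashboard makes possible, here is a minimal sketch, assuming hypothetical invoice records and the renegotiated 20-day terms; it is not the customer’s actual data model or logic:

```python
# Minimal late-payment check; invoice records and terms are invented.
from datetime import date

PAYMENT_TERMS_DAYS = 20  # renegotiated terms (the old assumption was net 30)

invoices = [
    {"id": "INV-1001", "invoice_date": date(2015, 1, 5), "paid_date": date(2015, 1, 28)},
    {"id": "INV-1002", "invoice_date": date(2015, 1, 7), "paid_date": date(2015, 1, 26)},
]

for inv in invoices:
    days_to_pay = (inv["paid_date"] - inv["invoice_date"]).days
    status = "late fee accrued" if days_to_pay > PAYMENT_TERMS_DAYS else "within terms"
    print(f"{inv['id']}: paid in {days_to_pay} days -> {status}")
```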

Q: That’s a great example of straightforward analytics powered by an integrated view of data, thank you. What’s a more complex use case you plan to tackle?

A: Now that we have the systems in place along with data stewardship, we will start to focus on clinical supply chain scenarios like cost per procedure per patient. We have all of the data in one data warehouse to answer questions like: Which procedures are costing the most? Do procedure costs vary by clinician? By location? By supply? And what is the outcome of each of these procedures? We always want to take the right and best action for the patient.

We were also able to identify where negotiated payment discounts were not being taken advantage of or where there were opportunities to negotiate discounts.

These insights were revealed through the dashboard and immediate value was realized the first day.

Fueling knowledge with data is helping procurement negotiate the right discounts, i.e., they can seek discounts on the most used supplies rather than on supplies that are rarely used. Think of it this way… you don’t want a discount on OJ if you are buying milk.

Q: Excellent example and metaphor. Let’s talk more about stewardship. You have a data governance organization within IT that is governing supply chain?

A: No, we have a data governance team within supply chain. Supply chain staff who used to be called “content managers” are now “data stewards.” They were doing the stewardship work of defining data, its use, its source and its quality before, but it wasn’t a formally recognized part of their jobs… now it is. Armed with Informatica Data Director, they are managing the quality of supply chain data across four domains: suppliers/vendors, locations, contracts and items. Data from each of these domains resides in our EMR, our ERP applications and our ambulatory EMR/Practice Management application, creating redundancy and manual reconciliation effort.

By adding Master Data Management (MDM) to the architecture, we were able to centralize management of master data about suppliers/vendors, items, contracts and locations, augment this data with enrichment data like that from D&B, reduce redundancy and reduce manual effort.

MDM shares this complete and accurate information with the enterprise data warehouse, where we can run analytics against it. Having a confident, complete view of master data allows us to trust the analytical insights revealed through the P2P dashboard.

Q: What lessons learned would you offer?

A: Having recognized operational value, I’d encourage health systems to focus on a data-driven supply chain, because there are savings opportunities through easier identification of unmanaged spend.

I really enjoyed learning more about this project, with its valuable, tangible and nearly immediate results. I will keep you posted as the customer moves on to the next phase. If you have comments or questions, leave them here.

Posted in B2B, Data First, Data Governance, Retail

How Much is Poorly Managed Supplier Information Costing Your Business?

“Inaccurate, inconsistent and disconnected supplier information prohibits us from doing accurate supplier spend analysis, leveraging discounts, comparing and choosing the best prices, and enforcing corporate standards.”

This is a quotation from a manufacturing company executive. It illustrates the negative impact that poorly managed supplier information can have on a company’s ability to cut costs and achieve revenue targets.

Many supply chain and procurement teams at large companies struggle to see the total relationship they have with suppliers across product lines, business units and regions. Why? Supplier information is scattered across dozens or hundreds of Enterprise Resource Planning (ERP) and Accounts Payable (AP) applications. Too much valuable time is spent manually reconciling inaccurate, inconsistent and disconnected supplier information in an effort to see the big picture. All this manual effort results in back office administrative costs that are higher than they should be.

Do these quotations from supply chain leaders and their teams sound familiar?

  • “We have 500,000 suppliers. 15-20% of our supplier records are duplicates. 5% are inaccurate.”
  • “I get 100 e-mails a day questioning which supplier to use.”
  • “To consolidate vendor reporting for a single supplier between divisions is really just a guess.”
  • “Every year 1099 tax mailings get returned to us because of invalid addresses, and we pay a lot of Schedule B fines to the IRS.”
  • “Two years ago we spent a significant amount of time and money cleansing supplier data. Now we are back where we started.”

Join us for a Webinar to find out how to supercharge your supply chain applications with clean, consistent and connected supplier information

Please join me and Naveen Sharma, Director of the Master Data Management (MDM) Practice at Cognizant, for a Webinar, Supercharge Your Supply Chain Applications with Better Supplier Information, on Tuesday, July 29th at 11 a.m. PT.

During the Webinar, we’ll explain how better managing supplier information can help you achieve the following goals:

  1. Accelerate supplier onboarding
  2. Mitigate the risk of supply disruption
  3. Better manage supplier performance
  4. Streamline billing and payment processes
  5. Improve supplier relationship management and collaboration
  6. Make it easier to evaluate non-compliance with Service Level Agreements (SLAs)
  7. Decrease costs by negotiating favorable payment terms and SLAs

I hope you can join us for this upcoming Webinar!

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Quality, Manufacturing, Master Data Management

CIO to CIO: A New Community to Help IT Leaders Succeed in a Data-Centric World

IT is evolving like never before. Get the best up-to-date career insights and best practices to help you succeed in a data-centric world. Check out my exciting new Potential at Work Community for IT Leaders.

Posted in CIO

Matching for Management: Business Problems

So now that you understand the terminology and concepts, let’s talk about the business problems that can be addressed with this technology.

Inability to get to single view of customer because of matching issues

In the examples above, you can see where it can be a challenge to get the correct customer records into a single cluster. If you do not get all of a customer’s records grouped together properly, you may not be treating particular customers appropriately. One example is not identifying your top customers because they are represented by multiple account numbers. Worse is treating a very good customer poorly because you think they have only had one small transaction with you, when in reality they just did not log in or use their frequent shopper card. This poor service could jeopardize the entire account.
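
As a rough illustration of the kind of matching this refers to, here is a minimal sketch that groups customer records by a crude name-and-email key; real matching engines use far more sophisticated fuzzy and probabilistic rules, and the records shown are invented for the example:

```python
# Toy match-key clustering; illustrative only. Real MDM matching uses
# fuzzy, probabilistic and rule-based comparisons across many attributes.
from collections import defaultdict

customers = [
    {"account": "A-100", "name": "Jane P. Smith",  "email": "JANE.SMITH@EXAMPLE.COM"},
    {"account": "A-207", "name": "Smith, Jane",    "email": "jane.smith@example.com"},
    {"account": "A-311", "name": "John Q. Public", "email": "jq.public@example.com"},
]

def match_key(record: dict) -> tuple:
    """Build a crude match key: last name + normalized email."""
    parts = record["name"].replace(",", " ").split()
    last_name = parts[0] if "," in record["name"] else parts[-1]
    return (last_name.lower(), record["email"].lower())

clusters = defaultdict(list)
for record in customers:
    clusters[match_key(record)].append(record["account"])

for key, accounts in clusters.items():
    print(key, "->", accounts)   # A-100 and A-207 land in the same cluster
```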

Posted in Data Quality

Putting Master Data In The Hands Of Business Users

As companies increasingly explore master data management (MDM), we often hear inquiries about the usability of master data by business users.

Common questions include: Do business users need to learn and use a separate MDM application? Do they need support from IT to access master data? Can master data fit into the everyday business applications they use for CRM, SFA, ERP, supply chain management, and so forth?

If your organization has ever asked these questions, you should take a look at our new white paper, “Drive Business User Adoption of Master Data.”

Posted in Business Impact / Benefits, Business/IT Collaboration, Customers, Data Governance, Data Quality, Enterprise Data Management, Master Data Management, Operational Efficiency

Moo…MDM…Moo

In the world of Master Data Management (MDM), it is quite common to see a single view of customer, product, supplier, etc., but one of our customers is building a Single View of Cow.

Moo…

The customer is in the animal management business, and they are responsible for managing the life cycle (literally!) of a cow from birth to beef stroganoff. Tracking starts with parentage, siblings and relatives, and cows in proximity on the various ranches. Tracking continues to the abattoir and then follows the various beef components as they travel through the supply chain all the way to the end consumers. The main business driver is food safety regulation. If a disease shows up at some point in the supply chain, you need to be able to quickly track upstream to where the cow came from and downstream all the way to supermarket shelves.
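
A crude way to picture that traceability requirement is a lineage graph you can walk in either direction; the sketch below uses invented animal and batch IDs purely for illustration:

```python
# Toy traceability graph; IDs and relationships are invented for illustration.
# Edges point downstream: cow -> carcass -> beef batches -> retail units.
downstream = {
    "cow:DE123":     ["carcass:C-001"],
    "carcass:C-001": ["batch:B-17", "batch:B-18"],
    "batch:B-17":    ["sku:stroganoff-500g"],
    "batch:B-18":    ["sku:mince-1kg"],
}

def trace_downstream(node: str) -> list:
    """Return everything reachable downstream of a node (e.g. a sick cow)."""
    reached = []
    for child in downstream.get(node, []):
        reached.append(child)
        reached.extend(trace_downstream(child))
    return reached

# If a disease is found in cow DE123, these products must be recalled:
print(trace_downstream("cow:DE123"))
```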

The Single View of Cow use case provides some interesting best practices for the broader MDM community…

Posted in Customers, Data Integration, Master Data Management