Category Archives: Master Data Management

Is the Internet of Things relevant for the government?

Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fundraising for at least one charity. Similarly, IoT could be transformational to the business of the public sector, should government step up to the challenge.



Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming, televisions, watches, Google glasses, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations, which are connecting people to places, data and other people at a record pace.

It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connected to the Internet interact and share data, regardless of location, manufacturer or format, and to make or find connections that were previously undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.

One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.
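
To make the “golden record” idea concrete, here is a minimal sketch of survivorship-style merging of duplicate records from two siloed systems. The field names, sample data and merge rule are hypothetical illustrations, not any particular MDM product’s logic.

```python
# Minimal golden-record sketch: merge duplicate citizen records from two
# siloed systems using a simple survivorship rule (most recent non-empty wins).
# Field names and data are hypothetical, for illustration only.
from datetime import date

records = [
    {"source": "benefits", "ssn": "123-45-6789", "name": "J. Smith",
     "email": "", "updated": date(2013, 5, 1)},
    {"source": "licensing", "ssn": "123-45-6789", "name": "John Smith",
     "email": "jsmith@example.com", "updated": date(2014, 8, 12)},
]

def golden_record(matches):
    """Pick, per attribute, the most recently updated non-empty value."""
    ordered = sorted(matches, key=lambda r: r["updated"], reverse=True)
    merged = {}
    for field in ("ssn", "name", "email"):
        merged[field] = next((r[field] for r in ordered if r[field]), "")
    return merged

# Records are matched here on an exact SSN key; real MDM matching is fuzzier.
print(golden_record(records))
# {'ssn': '123-45-6789', 'name': 'John Smith', 'email': 'jsmith@example.com'}
```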

This is the Internet of Things for government. This is the challenge. This is transformation.

This article was originally published on www.federaltimes.com. Please view the original listing here

 

Posted in Big Data, Business Impact / Benefits, Data Integration, Data Security, Master Data Management, Public Sector, Uncategorized

The Catalog is Dead – Long Live the Catalog?

The Catalog is Dead.

Print solution provider Werk II came up with a provocative marketing campaign in 2012. Their ads were designed like an obituary notice for the “Main Catalog,” which is “no longer with us”…

According to the Multi Channel Merchant Outlook 2014 survey, the eCommerce website (not a surprise ;-) ) is the top channel through which merchants market (90%). Social media (87.2%) and email (83%) follow close behind. Although the catalog has slipped as a marketing tool, 51.7% of retailers said they still use it to market their brands.

importance of channels chart

Source: MCM Outlook 2014

The Changing Role of the Catalog

Merchants are still using catalogs to sell products. However, the catalog’s role has changed from transactional vehicle to sales tool. On a scale of 1 to 10, with 10 being the most important, merchant respondents said that using catalogs as mobile traffic drivers and customer retention tools were the most important activities (both scored an 8.25). At 7.85, web traffic driver was a close third.

methods of prospecting chart

Source: MCM Outlook 2014

Long Live the Catalog: Prospecting 

More than three-quarters of merchant respondents said catalogs were the top choice for the method of prospecting they will use in the next 12 months (77.7%). Catalog was the most popular answer, followed by Facebook (68%), email (66%), Twitter (42.7%) and Pinterest (40.8%).

What is your point of view?

How have catalogs changed in your business? What are your plans and outlook for 2015? It would be very interesting to hear points of view from different industries and countries… I’d be happy to discuss here or on Twitter @benrund. My favorite fashion retailer keeps sending me a stylish catalog, which makes me order online. Brands, retailers, consumers – how do you act, and what do you expect?

Posted in B2B, Manufacturing, Master Data Management, PiM, Product Information Management, Retail, Uncategorized

What’s In A Name?

Sometimes, the choice of a name has unexpected consequences. Often these consequences aren’t fair. But they exist, nonetheless. For an example of this, consider the well-known National Bureau of Economic Research study that compares the hiring prospects of candidates with identical resumes but different names. During the study, titled a “Field Experiment on Labor Market Discrimination,” employers were found to be more likely to reply to candidates with popular, traditionally Caucasian names than to candidates with either unique, eclectic names or with traditionally African-American names. Though these biases are clearly unfair to the candidates, they do illustrate a key point: One’s choice when naming something can come with perceptions that influence outcomes.

For an example from the IT world, consider my recent engagement at a regional retail bank. In this engagement, half of the meeting time was consumed by IT and business leaders debating how to label their Master Data Management (MDM) Initiative.  Consider these excerpts:

  • Should we even call it MDM? Answer: No. Why? Because nobody on the business side will understand what that means. Also, as we just implemented a Data Warehouse/Mart last year and we are in the middle of our new CRM roll-out, everybody in business and retail banking will assume their data is already mastered in both of these. On a side note, telcos understand MDM as Mobile Device Management.
  • Should we call it “Enterprise Data Master”? Answer: No. Why? Because unless you roll out all data domains and all functionality (standardization, matching, governance, hierarchy management, etc.) to the whole enterprise, you cannot. And doing so is a bad idea, as it is with any IT project. Boiling the ocean and going live with a big bang is high cost and high risk; given shifting organizational strategies and leadership, quick successes are needed to sustain the momentum.
  • Should we call it “Data Warehouse – Release 2”? Answer: No. Why? Because it is neither a data warehouse, nor a version 2 of one. It is a backbone component required to manage a key organizational ingredient – data – in a way that makes it useful to many use cases, processes, applications and people, not just analytics, although analytics is often the starting block. Data warehouses were neither conceived nor designed to facilitate data quality (they assume it is already there), nor are they designed for real-time interactions. Did anybody ask if ETL is “Pneumatic Tubes – Version 2”?
  • Should we call it “CRM Plus”? Answer: No. Why? Because it was never intended or designed to handle the transactional volume and attribution breadth of high-volume use cases, which are driven by complex business processes. Also, if it were a CRM system, it would have a more intricate UI capability beyond comparatively simple data governance workflows and UIs.

Consider this: any data quality solution, MDM included, makes any existing workflow or application better at what it does best: manage customer interactions, create orders, generate correct invoices, etc. To quote a colleague, “we are the BASF of software.” Few people understand what a chemical looks like or does, but it makes a plastic container sturdy, transparent, flexible and light.

I also explained hierarchy management in a similar way. Consider it the LinkedIn network of your company, to which you can attach every interaction and transaction. I see one view, people in my network see a different one, and LinkedIn probably has the most comprehensive view, but ultimately we are all looking at the same core data and structures.
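
To stretch that analogy a little further, here is a tiny sketch of what hierarchy management boils down to technically: a parent-child structure of entities with transactions attached to each node, which can be rolled up along the hierarchy. The entity names and amounts are made up for illustration.

```python
# Sketch of MDM-style hierarchy management: a parent-child structure of legal
# entities with transactions attached to each node, rolled up to any ancestor.
# Entity names and amounts are hypothetical.
parents = {            # child -> parent
    "Acme US": "Acme Holdings",
    "Acme EU": "Acme Holdings",
    "Acme UK": "Acme EU",
}
transactions = {       # node -> booked revenue
    "Acme US": 120_000,
    "Acme EU": 80_000,
    "Acme UK": 45_000,
}

def rollup(node):
    """Sum transactions for a node and everything underneath it."""
    total = transactions.get(node, 0)
    for child, parent in parents.items():
        if parent == node:
            total += rollup(child)
    return total

print(rollup("Acme EU"))        # 125000 -> EU plus its UK child
print(rollup("Acme Holdings"))  # 245000 -> the whole corporate family
```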

So let’s call the “use” of your MDM “Mr. Clean”, aka Meister Proper, because it keeps everything clean.

While naming is definitely a critical point to consider given the expectations, fears and reservations that come with MDM and the underlying change management, it was hilarious to see how important it suddenly became. However, it was puzzling to me (maybe a naïve perspective) why mostly recent IT hires had to categorize everything into new, unique functional boxes, while business and legacy IT people wanted to re-purpose existing boxes. I guess recent IT hires used their approach to showcase that they were familiar with new technologies and techniques, which was likely a reason for their employment. Business leaders, often with the exception of highly accomplished and well-regarded ones, as well as legacy IT leaders, needed to signal continuity and no threat of disruption or change. Moreover, they also needed to justify the value proposition of their prior software investments.

Aside from company financial performance and regulatory screw-ups, legions of careers will be decided by whether, how and how successfully this initiative plays out.

Naming a new car model for a 100,000-unit production run or a shampoo for worldwide sales could not face much more scrutiny. Software vendors give their future releases internal names of cities like Atlanta or famous people like Socrates instead of descriptive terms like “Gamification User Interface Release” or “Unstructured Content Miner”. This may be a good avenue for banks and retailers to explore. It would avoid the expectation pitfalls associated with names like “Customer Success Data Mart”, “Enterprise Data Factory”, “Data Aggregator” or “Central Property Repository”. In reality, there will be many applications that can claim bits and pieces of the same data, data volume or functionality. Who will make the call on which one gets renamed or replaced, and explain to the various consumers what happened to it and why?

You can surely name a customer-facing app something more descriptive like “Payment Central” or “Customer Success Point”, but the reason you can do this is that the user will only have one or maybe two points to interface with the organization. Internal data consumers will interact with many more repositories. Similarly, I guess this is also the reason why I call my kids by their first name while strangers label them by their full name, “Junior”, “Butter Fingers” or “The Fast Runner”.

I would love to hear some other good reasons why naming conventions should be more scrutinized.  Maybe you have some guidance on what should and should not be done and the reasons for it?

Posted in Business/IT Collaboration, CIO, Data Quality, Master Data Management

The King of Benchmarks Rules the Realm of Averages

A mid-sized insurer recently approached our team for help. They wanted to understand where they fell short in making their case to their executives. Specifically, they proposed that fixing their customer data was key to supporting the executive team’s highly aggressive 3-year growth plan (3x today’s revenue). Given this core organizational mission – aside from being a warm and fuzzy place to work supporting its local community – the slam-dunk solution to help here is simple. Just reducing the data migration effort around the next acquisition or avoiding the ritual annual, one-off data clean-up project already pays for any tool set enhancing data acquisition, integration and hygiene. Will it get you to 3x today’s revenue? It probably won’t. What will help are the following:


Making the Math Work (courtesy of Scott Adams)

Hard cost avoidance via software maintenance or consulting elimination is the easy part of the exercise. That is why CFOs love it and focus so much on it.  It is easy to grasp and immediate (aka next quarter).

Soft cost reductions, like staff redundancies, are a bit harder. Although they are viable, in my experience very few decision makers want to work on a business case to lay off staff; my team has had one so far. Most look at these savings as freed-up capacity, which can be redeployed more productively. Productivity is also a bit harder to quantify, as you typically have to understand how data travels and gets worked on between departments.

Revenue effects, however, are even harder to quantify and esoteric to many people, as they include projections. They are often considered “soft” benefits, although they outweigh the other areas by 2-3 times in terms of impact. Ultimately, every organization runs its strategy based on projections (see the insurer in my first paragraph).

The hardest to quantify is risk. Not only is it based on projections – often from a third party (Moody’s, TransUnion, etc.) – but few people understand it. More often than not, clients won’t even accept you investigating this area if you don’t have an advanced degree in insurance math. Nevertheless, risk can generate extra “soft” cost avoidance (a beefed-up reserve account balance creates opportunity cost) but also revenue (realizing a risk premium previously ignored). Risk profiles often change due to relationships, which can be links to new “horizontal” information (transactional attributes) or “vertical” (hierarchical) information from an entity’s parent-child relationships and the parent’s or children’s transactions.

Given the above, my initial advice to the insurer would be to look at the heartache of their last acquisition, use a benchmark for IT productivity gains from improved data management capabilities (typically 20-26% – Yankee Group) and there you go. This is just the IT side, so consider increasing the upper range by 1.4x (Harvard Business School), as every attribute change (e.g., last mobile view date) requires additional meetings at the manager, director and VP level. These people’s time gets increasingly more expensive. You could also use Aberdeen’s benchmark of 13 hours per average master data attribute fix instead.
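
As a rough, back-of-the-envelope illustration of how those benchmarks combine (the baseline migration cost is a hypothetical input; the percentages and multiplier are the figures cited above):

```python
# Back-of-the-envelope use of the cited benchmarks. The baseline migration
# cost is a hypothetical input; the percentages are the figures quoted above.
last_migration_cost = 2_000_000          # assumed cost of the last acquisition's data migration

it_productivity_gain = (0.20, 0.26)      # Yankee Group: 20-26% IT productivity improvement
business_multiplier = 1.4                # Harvard Business School: add business-side effort

low  = last_migration_cost * it_productivity_gain[0]
high = last_migration_cost * it_productivity_gain[1] * business_multiplier

print(f"Estimated savings range: ${low:,.0f} - ${high:,.0f}")
# Estimated savings range: $400,000 - $728,000
```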

You can also look at productivity areas, which are typically overly measured. Let’s assume a call center rep spends 20% of the average call time of 12 minutes (depending on the call type – account or bill inquiry, dispute, etc.) understanding:

  • Who the customer is
  • What he bought online and in-store
  • If he tried to resolve his issue on the website or store
  • How he uses equipment
  • What he cares about
  • If he prefers call backs, SMS or email confirmations
  • His response rate to offers
  • His/her value to the company

If he spends that 20% of every call stringing together insights from five applications and twelve screens – the same information in every application he touches – when he could get it from one frame in seconds, you have just freed up 20% of his hourly compensation.
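
Putting sample numbers on that (the rep count, wage and call volume are assumptions; the 20% and 12 minutes come from the example above):

```python
# Call-center productivity example from the paragraph above. Headcount, wage
# and call volume are hypothetical inputs; 20% and 12 minutes are as stated.
reps = 200                      # assumed number of call-center reps
hourly_cost = 25.00             # assumed fully loaded hourly compensation
calls_per_rep_per_day = 40      # assumed call volume per rep
avg_call_minutes = 12
share_spent_gathering = 0.20    # 20% of each call spent stitching data together
work_days = 250

wasted_hours = (reps * calls_per_rep_per_day * work_days
                * avg_call_minutes * share_spent_gathering) / 60
print(f"Hours freed per year: {wasted_hours:,.0f}")
print(f"Annual value: ${wasted_hours * hourly_cost:,.0f}")
# Hours freed per year: 80,000
# Annual value: $2,000,000
```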

Then look at the software, hardware, maintenance and ongoing management of the likely customer record sources (pick the worst and best quality one based on your current understanding), which will end up in a centrally governed instance.  Per DAMA, every duplicate record will cost you between $0.45 (party) and $0.85 (product) per transaction (edit touch).  At the very least each record will be touched once a year (likely 3-5 times), so multiply your duplicated record count by that and you have your savings from just de-duplication.  You can also use Aberdeen’s benchmark of 71 serious errors per 1,000 records, meaning the chance of transactional failure and required effort (% of one or more FTE’s daily workday) to fix is high.  If this does not work for you, run a data profile with one of the many tools out there.
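
The de-duplication math might look something like this, with the record count and duplicate rate as hypothetical inputs and the per-touch cost and touch frequency taken from the DAMA figures above:

```python
# De-duplication savings using the DAMA per-touch cost cited above.
# The record count and duplicate rate are hypothetical inputs.
total_records = 1_000_000
duplicate_rate = 0.18                 # assumed share of duplicate records
cost_per_touch = (0.45, 0.85)         # DAMA: per transaction/edit touch (party, product)
touches_per_year = (1, 5)             # at least once a year, likely 3-5 times

duplicates = total_records * duplicate_rate
low  = duplicates * cost_per_touch[0] * touches_per_year[0]
high = duplicates * cost_per_touch[1] * touches_per_year[1]
print(f"Annual de-duplication savings: ${low:,.0f} - ${high:,.0f}")
# Annual de-duplication savings: $81,000 - $765,000
```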

If the sign says it – do it!

If standardization of records (zip codes, billing codes, currency, etc.) is the problem, ask your business partner how many customer contacts (calls, mailings, emails, orders, invoices or account statements) fail outright and/or require validation because of these attributes. Once again, if you apply the productivity gains mentioned earlier, there are your savings. If you look at the number of orders whose payment or revenue recognition gets delayed by a week or a month, together with the average order amount, you can quantify how much profit (multiply by operating margin) you would be able to pull into the current financial year from the next one.
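
A quick sketch of that pull-forward calculation, with the order count, order value and margin as hypothetical inputs:

```python
# Profit pulled forward by removing standardization-related order delays.
# Order counts, amounts and margin are hypothetical inputs.
delayed_orders_per_year = 2_400        # assumed orders delayed by data issues
avg_order_amount = 5_000.00            # assumed average order value
operating_margin = 0.08                # assumed operating margin

profit_pulled_forward = delayed_orders_per_year * avg_order_amount * operating_margin
print(f"Profit recognized this year instead of next: ${profit_pulled_forward:,.0f}")
# Profit recognized this year instead of next: $960,000
```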

The same is true for speeding up the introduction of a new product, or a change to one, so that it generates profits earlier. Note that the time value of funds realized earlier is too small to matter in most instances, especially in the current interest environment.

If emails bounce back or snail mail gets returned (no such address, no such name at this address, no such domain, no such user at this domain), (e)mail verification tools can help reduce the bounces. If every mail piece (forget email due to its minuscule cost) costs $1.25 – and this will vary by type of mailing (catalog, promotion post card, statement letter) – then incorrect or incomplete records are wasted cost. If you can, use the fully loaded print cost, including 3rd-party data prep and returns handling. You will never capture all cost inputs, but take a conservative stab.
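
For the wasted-mail side, a conservative stab could look like this; the mailing size and bad-address rate are assumptions, while the $1.25 per piece is the figure above:

```python
# Wasted print-mail cost from undeliverable records, using the $1.25/piece
# figure cited above. Mailing size and bad-address rate are hypothetical.
pieces_mailed = 250_000
cost_per_piece = 1.25                  # fully loaded print cost, as cited
undeliverable_rate = 0.05              # assumed share of incorrect/incomplete addresses

wasted = pieces_mailed * cost_per_piece * undeliverable_rate
print(f"Wasted mailing cost per campaign: ${wasted:,.0f}")
# Wasted mailing cost per campaign: $15,625
```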

If it was an offer, reduced bounces should also improve your response rate (also true for email now). Prospect mail response rates are typically around 1.2% (Direct Marketing Association), whereas phone response rates are around 8.2%. If you know that your current response rate is half that (for argument’s sake) and you send out 100,000 emails of which 1.3% (Silverpop) have customer data issues, then fixing 81-93% of them (our experience) will drop the bounce rate to under 0.3%, meaning more emails will arrive and be relevant. This, multiplied by a standard conversion rate of 3% (MarketingSherpa; industry and channel specific) and the average order (your data) multiplied by operating margin, gets you a benefit value for revenue.
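
Here is one way to string those figures together; the average order value and operating margin are hypothetical inputs, and the rates are the ones cited above:

```python
# One way to combine the figures above for a single campaign. Average order
# value and operating margin are hypothetical inputs; the rates are as cited.
emails_sent = 100_000
data_issue_rate = 0.013          # Silverpop: 1.3% of records have data issues
fix_rate = 0.87                  # midpoint of the 81-93% fix rate cited
conversion_rate = 0.03           # MarketingSherpa standard conversion rate
avg_order = 150.00               # assumed average order value
operating_margin = 0.10          # assumed operating margin

recovered_emails = emails_sent * data_issue_rate * fix_rate
extra_profit = recovered_emails * conversion_rate * avg_order * operating_margin
print(f"Emails recovered: {recovered_emails:,.0f}")
print(f"Incremental profit from this one campaign: ${extra_profit:,.2f}")
# Emails recovered: 1,131
# Incremental profit from this one campaign: $508.95
```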

If product data and inventory carrying cost or supplier spend are your issue, find out how many supplier shipments you receive every month and the average cost of a part (or cost range), then apply the Aberdeen master data failure rate (71 in 1,000) to use cases around missing or incorrect supersession or alternate part data, to assess the value of a single shipment’s overspend. You can also just use the ending inventory amount from the 10-K report and apply a 3-10% improvement (Aberdeen) in a top-down approach. Alternatively, apply 3.2-4.9% to your annual supplier spend (KPMG).
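
Applied top-down, that might look like the following; the ending inventory and annual supplier spend are hypothetical stand-ins for the 10-K figures, and the improvement ranges are the Aberdeen and KPMG benchmarks above:

```python
# Top-down application of the benchmarks cited above. The ending inventory
# and annual supplier spend are hypothetical inputs (taken from a 10-K in practice).
ending_inventory = 50_000_000
annual_supplier_spend = 200_000_000

inventory_improvement = (0.03, 0.10)    # Aberdeen: 3-10% inventory reduction
spend_improvement = (0.032, 0.049)      # KPMG: 3.2-4.9% of supplier spend

inv_low, inv_high = (ending_inventory * r for r in inventory_improvement)
spd_low, spd_high = (annual_supplier_spend * r for r in spend_improvement)
print(f"Inventory carrying reduction: ${inv_low:,.0f} - ${inv_high:,.0f}")
print(f"Supplier spend reduction:     ${spd_low:,.0f} - ${spd_high:,.0f}")
# Inventory carrying reduction: $1,500,000 - $5,000,000
# Supplier spend reduction:     $6,400,000 - $9,800,000
```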

You could also investigate the expediting or return cost of shipments in a period due to incorrectly aggregated customer forecasts, wrong or incomplete product information or wrong shipment instructions in a product or location profile. Apply Aberdeen’s 5% improvement rate and there you go.

Consider that a North American utility told us that just fixing their 200 Tier 1 suppliers’ product information achieved an increase in discounts from $14 to $120 million. They also found that fixing one basic attribute out of sixty in one part category saves them over $200,000 annually.

So what ROI percentages would you find tolerable or justifiable for, say, an EDW project, a CRM project, a new claims system, etc.? What would the annual savings or new revenue be that you would be comfortable with? What was the craziest improvement you have seen come to fruition that nobody expected?

Next time, I will add some more “use cases” to the list and look at some philosophical implications of averages.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Migration, Data Quality, Enterprise Data Management, Master Data Management

Right Product, Right Customer, Right Place – The Informed Purchase Journey


The way we shop has changed. It’s hard to keep up with customer demands in a single channel, much less many. The way products are sold has changed, too, and will keep changing. The video below shows how today’s customer takes The Informed Purchase Journey:

“Customers expect a seamless experience that makes it easy for them to engage at every touchpoint on their ‘decision journey.’ Informatica PIM is a key component in the transformation from a product-centric view to consumer-experience-driven marketing with more efficiency.” – Heather Hanson – Global Head of Marketing Technology at Electrolux

Selling products today is:

  • Shopper-controlled. It’s never been easier for consumers to compare products and prices. This has eroded old customer loyalty and means you have to earn every sale.
  • Global. If you’re selling your products in different regions, you’re facing complex localization and supply chain coordination.
  • Fast. Product lifecycles are short. Time-to-market is critical (and gets tougher the more channels you’re selling through).
  • SKU-heavy. Endless-aisle assortments are great for margins. That’s a huge opportunity, but product data overload due to the large number of SKUs and their attributes adds up to a huge admin burden.
  • Data driven. Product data alone is more than a handful to deal with. But you also need to know as much about your customers as you know about your products. And the explosion of channels and touch points doesn’t make it any easier to connect the dots.

Conversion Power – From Deal Breaker To Deal Maker

For years, a customer’s purchase journey was something of “An Unexpected Journey.” Lack of insight into the journey was a struggle for retailers and brands. The journey is fraught with more questions about product than ever before, even for fast moving consumer goods.

Today, consumer behavior and the role of product information have changed with the advent of high bandwidth and social buying. Let’s examine the way shoppers buy today:

  • According to Google’s Zero Moment of Truth (ZMOT) research, shoppers use 10.4 sources on average.
  • Mobile shoppers who view customer content such as reviews show a 133% higher conversion rate.
  • Digital devices will influence 50% of in-store purchase behavior by the end of 2014 (Deloitte’s Digital Divide).

How Informatica PIM 7.1 turns information from deal breaker to deal maker

PIM 7.1 comes with new data quality dashboards, helping users such as category managers, marketing copywriters, managers or e-commerce specialists do the right things. The quality dashboards point users to the things they have to do next in order to get the data right, out and ready for sales.
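
Conceptually, such a dashboard boils down to a completeness or readiness score per product that tells the user what to fix next. A minimal sketch of that idea follows; the required attributes and sample records are made up, and this is not the actual PIM implementation:

```python
# Minimal sketch of a data-quality readiness check per product: which required
# attributes are still missing before the item can go out to a sales channel?
# Attribute names and records are hypothetical; this is not the PIM logic itself.
required_for_webshop = ["name", "description", "image", "price", "category"]

products = [
    {"sku": "1001", "name": "Espresso Machine", "description": "1350 W, 15 bar",
     "image": "esp.jpg", "price": 199.0, "category": "Kitchen"},
    {"sku": "1002", "name": "Hand Mixer", "description": "", "image": "",
     "price": 39.0, "category": "Kitchen"},
]

for p in products:
    missing = [a for a in required_for_webshop if not p.get(a)]
    completeness = 1 - len(missing) / len(required_for_webshop)
    status = "ready" if not missing else f"fix next: {', '.join(missing)}"
    print(f"SKU {p['sku']}: {completeness:.0%} complete - {status}")
# SKU 1001: 100% complete - ready
# SKU 1002: 60% complete - fix next: description, image
```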

Eliminate Shelf Lag: The Early Product Closes the Sale

For vendors, this effectively means time-to-market: the availability of a product plus the time it takes to collect all relevant product information so you can display it to the customer (product introduction time).

The biggest threat is not the competition – it’s your own time-consuming, internal processes. We call this Shelf Lag, and it’s a big inhibitor of retailer profits. Here’s why:

  • You can’t sell what you can’t display.
  • Be ready to spin up new channels
  • Watch your margins.

How Informatica PIM 7.1 speeds up product introduction and customer experience

“By 2017… customer experience is what buyers are going to use to make purchase decisions.” (Source: Gartner’s Hype Cycle for E-Commerce, 2013) PIM 7.1 comes with new editable channel previews. These help business users such as marketers, translators, merchandisers or product managers to envision how the product looks in the customer-facing webshop, catalog or other touchpoint. Getting products live online within seconds is key, because the customer always wants it now. For eCommerce product data, Informatica PIM is certified for IBM WebSphere Commerce to get products ready for ecommerce within seconds.

The editable channel previews help professionals in product management, merchandising, marketing and ecommerce to envision their products as customers see them. This “what you see is what you get” (WYSIWYG) approach to product data management improves the customer shopping experience with the best and most authentic information. With the new eCommerce integration, Informatica speeds up time to market in eBusiness. The new standard (certified by IBM WebSphere Commerce) enables live updates of eShops with real-time integration.

The growing need for fast and secure collaboration across global enterprises is addressed by Informatica’s Business Process Management tool, which can now be used by PIM customers.

Intelligent insights: How relevant is our offering to your customers?

This is the age of annoyance and information overload. Each day, the average person has to handle more than 7,000 pieces of information. Only 25% of Americans say they are brand loyal. That means brands and retailers have to earn every new sale in a transparent world. In this context, information needs to be relevant to the recipient.

  • Where does the data come from? How can product information be auto-cleansed and classified into a taxonomy?
  • Is supplier performance hitting our standards?
  • How can we mitigate risks like hidden costs and work with trusted suppliers only?
  • How can we build customer segmentations for marketing?
  • How can we build product personalization and predict the customer’s next logical buy?

It is all about the Right Product. To the Right Person. In the Right Way. Learn more about the vision of the Intelligent Data Platform.

Informatica PIM Builds the Basis of Real Time Commerce Information

All these innovations speed up new product introduction and collaboration massively. As buyers today are always online and connected, PIM helps our customers serve the informed purchase journey with the right information, at the right touchpoint, in real time:

  1. Real-time commerce (certification with IBM WebSphere Commerce), which eliminates shelf lag
  2. Editable channel previews, which help to envision how customers view the product
  3. Data quality dashboards for improved conversion power, which means selling more with better information
  4. Business Process Management for better collaboration throughout the enterprise
  5. An accelerator for global data synchronization (GDSN, e.g. GS1 for food and CPG), which helps to improve data quality and fulfill legal requirements

All this makes merchandizers more productive and increases average spend per customer.

Find out how the new release of Informatica PIM 7.1 helps you to unleash conversion power on the customer’s informed purchase journey.

Posted in B2B, B2B Data Exchange, CMO, Manufacturing, Master Data Management, PiM, Product Information Management, Retail

The Five C’s of Data Management


A few days ago, I came across a post, 5 C’s of MDM (Case, Content, Connecting, Cleansing, and Controlling), by Peter Krensky, Sr. Research Associate, Aberdeen Group and this response by Alan Duncan with his 5 C’s (Communicate, Co-operate, Collaborate, Cajole and Coerce). I like Alan’s list much better. Even though I work for a product company specializing in information management technology, the secret to successful enterprise information management (EIM) is in tackling the business and organizational issues, not the technology challenges. Fundamentally, data management at the enterprise level is an agreement problem, not a technology problem.

So, here I go with my 5 C’s: (more…)

Posted in Application ILM, Big Data, Data Governance, Data Integration, Enterprise Data Management, Integration Competency Centers, Master Data Management

The Information Difference Pegs Informatica as a Top MDM Vendor

This blog post feels a little bit like bragging… and OK, I guess it is pretty self-congratulatory to announce that this year, Informatica was again chosen as a leader in MDM and PIM by The Information Difference. As you may know, The Information Difference is an independent research firm that specializes in the MDM industry and each year surveys, analyzes and ranks MDM and PIM providers and customers around the world. This year, like last year, The Information Difference named Informatica tops in the space.

Why do I feel especially chuffed about this?  Because of our customers.

(more…)

Posted in Data Governance, Data Quality, Enterprise Data Management, Master Data Management

One Search Procurement – For the Purchasing of Indirect Goods and Services


Informatica Procurement is the internal Amazon for purchasing MRO items, C-goods, indirect materials and services. It supports enterprise companies with an industry-independent catalog procurement solution that enables fast, cost-efficient procurement of products and services and supplier integration in an easy-to-use self-service concept.

Informatica Procurement at a glance


Informatica recently announced the availability of Informatica Procurement 7.3, the catalog procurement solution. I met with Melanie Kunz, our product manager, to learn from her what’s new.

Melanie, for our readers and followers: who is using Informatica Procurement, and for which purposes?


Melanie Kunz: Informatica Procurement is industry-independent. Our customers come from different industries – from engineering and automotive to the public sector (e.g. cities). The responsibilities of the people who work with Informatica Procurement differ from company to company. For some customers, only employees from the purchasing department order items in Informatica Procurement. For other customers, all employees are allowed to order what they need themselves. Examples are employees who need screws to complete their product, or office staff who order business cards for a manager.

What is the most important thing to know about Informatica Procurement 7.3?

Melanie Kunz: In companies where a lot of IT equipment is ordered, it is important to always see current prices. With each price change, the catalog would otherwise have to be re-imported into Informatica Procurement. With a punch-out to the IT equipment manufacturer’s online shop, this is much easier and more efficient. The data from these catalogs is all available in Informatica Procurement, but the price can always be retrieved from the online shop on a daily basis.

Users no longer need to leave Informatica Procurement to order items from external online shops. Informatica Procurement now enables the user to locate internal and indexed external items in just one search. That means you do not have to use different eShops when you order new office stationery, IT equipment or services.

Great, what is the value for enterprise users and purchasing departments?

Melanie Kunz: All items in Informatica Procurement have the negotiated prices. Informatica Procurement is so simple and intuitive that each employee can use the system without training. The view concept allows products to be restricted: for each employee (or each department), the administrator can define a view. This view contains only the products that may be seen and ordered.

When you open the detail view for an indexed external item, the current price is determined from the external online shop. This price is saved in the item detail view for a defined period. In this way, the user always gets the current price for the item.
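
That “fetch it live, keep it for a defined period” behavior is essentially a time-to-live cache on the external price. A rough sketch of the pattern follows; the fetch function, TTL and item IDs are hypothetical placeholders, not Informatica Procurement’s internals:

```python
# Sketch of a time-to-live cache for prices pulled from an external shop,
# mirroring the "saved for a defined period" behavior described above.
# fetch_external_price() and the TTL value are hypothetical placeholders.
import time

PRICE_TTL_SECONDS = 24 * 60 * 60      # assumed: keep a fetched price for one day
_price_cache = {}                     # item_id -> (price, fetched_at)

def fetch_external_price(item_id):
    """Placeholder for the punch-out call to the manufacturer's online shop."""
    return 499.00                      # would be a live lookup in reality

def current_price(item_id):
    cached = _price_cache.get(item_id)
    if cached and time.time() - cached[1] < PRICE_TTL_SECONDS:
        return cached[0]               # still fresh: reuse the stored price
    price = fetch_external_price(item_id)
    _price_cache[item_id] = (price, time.time())
    return price

print(current_price("laptop-x1"))      # fetched from the external shop
print(current_price("laptop-x1"))      # served from the cache within the TTL
```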

The newly designed detail view has an elegant and clear layout, so a high level of user experience is ensured. This also applies to the ability to enlarge images in the search result list.

What if I order the same products frequently, like my business cards?

Melanie Kunz: The overview of recent shopping carts helps users reorder the same items in an easy and fast way. A shopping cart from a previous order can be used as the basis for a new order.

Large organizations with thousands of employees may have very different needs for their daily business, possibly depending on their role or career level. How do you address this?

Melanie Kunz: The standard assortment feature has been enhanced in Informatica Procurement 7.3. Administrators can define the assortment per user. Furthermore, it is possible to specify whether users have to search the standard assortment first and only search in the entire assortment if they do not find the relevant item in the standard assortment.
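
The search order Melanie describes is a simple two-tier fallback: look in the user’s standard assortment first and only widen the search to the entire assortment when nothing matches. A small sketch, with made-up catalog items and simple substring matching:

```python
# Two-tier search sketch: standard assortment first, full assortment only as
# a fallback, as described above. Items and matching are made-up examples.
standard_assortment = [
    {"id": "P-100", "name": "Ballpoint pen blue"},
    {"id": "P-200", "name": "Copy paper A4"},
]
full_assortment = standard_assortment + [
    {"id": "P-300", "name": "Laser pointer"},
    {"id": "P-400", "name": "Whiteboard marker set"},
]

def search(term):
    """Return standard-assortment hits if any, otherwise search everything."""
    hits = [i for i in standard_assortment if term.lower() in i["name"].lower()]
    if hits:
        return hits, "standard assortment"
    hits = [i for i in full_assortment if term.lower() in i["name"].lower()]
    return hits, "entire assortment"

print(search("pen"))      # found in the standard assortment
print(search("pointer"))  # falls back to the entire assortment
```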

All of these features and many more minor features not only enhance the user experience, but also reduce the processing time of an order drastically.

Informatica Procurement 7.3 “One Search” at a glance


 

Learn more about Informatica Procurement 7.3 with the latest webinar.

Posted in Data Integration, Enterprise Data Management, Life Sciences, Manufacturing, Marketplace, Master Data Management, News & Announcements, Operational Efficiency, PiM, Public Sector

How Much is Poorly Managed Supplier Information Costing Your Business?

“Inaccurate, inconsistent and disconnected supplier information prohibits us from doing accurate supplier spend analysis, leveraging discounts, comparing and choosing the best prices, and enforcing corporate standards.”

This is a quotation from a manufacturing company executive. It illustrates the negative impact that poorly managed supplier information can have on a company’s ability to cut costs and achieve revenue targets.

Many supply chain and procurement teams at large companies struggle to see the total relationship they have with suppliers across product lines, business units and regions. Why? Supplier information is scattered across dozens or hundreds of Enterprise Resource Planning (ERP) and Accounts Payable (AP) applications. Too much valuable time is spent manually reconciling inaccurate, inconsistent and disconnected supplier information in an effort to see the big picture. All this manual effort results in back office administrative costs that are higher than they should be.

Do these quotations from supply chain leaders and their teams sound familiar?

  • “We have 500,000 suppliers. 15-20% of our supplier records are duplicates. 5% are inaccurate.”
  • “I get 100 e-mails a day questioning which supplier to use.”
  • “To consolidate vendor reporting for a single supplier between divisions is really just a guess.”
  • “Every year 1099 tax mailings get returned to us because of invalid addresses, and we pay a lot of Schedule B fines to the IRS.”
  • “Two years ago we spent a significant amount of time and money cleansing supplier data. Now we are back where we started.”

Join us for a Webinar to find out how to supercharge your supply chain applications with clean, consistent and connected supplier information

Please join me and Naveen Sharma, Director of the Master Data Management (MDM) Practice at Cognizant, for a Webinar, Supercharge Your Supply Chain Applications with Better Supplier Information, on Tuesday, July 29th at 11 am PT.

During the Webinar, we’ll explain how better managing supplier information can help you achieve the following goals:

  1. Accelerate supplier onboarding
  2. Mitigate the risk of supply disruption
  3. Better manage supplier performance
  4. Streamline billing and payment processes
  5. Improve supplier relationship management and collaboration
  6. Make it easier to evaluate non-compliance with Service Level Agreements (SLAs)
  7. Decrease costs by negotiating favorable payment terms and SLAs

I hope you can join us for this upcoming Webinar!

 

 

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Quality, Manufacturing, Master Data Management