The Funny Side of Poor Master Data Management

In almost all cases, poor-quality master data is no laughing matter. It can directly lead to losing millions, either through excess costs or lost opportunity. However, there are occasions where poor data has a lighter side. One such case happened in the mid-'90s, whilst I was implementing ERP for a living.

On the first phase of a major project, I was on the master data team – in particular, developing the processes that enabled master data creation and maintenance. Since both MDM and workflow tools were in their infancy, we conjured up a manual process supported by green-screen scripts and email. Not great, but workable. However, before go-live, system test was proving painful, with every other team wanting lots of material numbers for testing. Manual entry excited no one, so we created a product master file, populated by cut-and-paste, which the IT guys happily automated for us to create hundreds of material masters.


Testing was a success, but then something went wrong. Actually, two things went wrong. The first was that the script was designed to load test data and had no data quality reviews, yet somehow it was used to load production data. This brought on the second problem, which our master data team found highly amusing.

A couple of days after go-live, I got a call from ‘Phil’ – the shift supervisor in shipping.

After the usual pleasantries, it was clear that something was really bothering Phil; he had been grappling with a problem for 45 minutes or so. It came down to this:

“No matter how hard we try, we cannot fit 32 desktops on a pallet, even if we shrink wrap it”

I was a bit confused – pallets were sized for 32 laptops or 16 desktops. Why would Phil be trying to put 32 desktops on a single pallet? Some brief queries showed there was an error in the master data: the script used to load data into the new ERP had defined every product as 32 per pallet. Since system test did not involve any physical actions in the real world, nobody had noticed the inaccuracy.

I suggested Phil continue as usual (16 per pallet), regardless of what the system said about items per pallet, and I would raise a support request to get the data fixed.

I still try to imagine how many warehouse employees it takes to hold 32 desktops on a pallet designed for 16, whilst another, armed with a portable shrink-wrapper, desperately tries to wrap desktops (but not hands or whole people) onto the pallet. I imagine all of them cursing 'the new system' loudly during the process.

There are a few important lessons to take from this, some of which are:

  • Without the correct tools and care, your data will quickly be infested with inaccuracies
  • New systems are not immune to poor data quality (and may be at greater risk)
  • Appointed data custodians should care about the data and have a stake in its accuracy
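Even a very simple pre-load check would have caught the 32-per-pallet error before it reached production. Here is a minimal sketch of that idea; the field names, product types, and pallet limits are illustrative assumptions, not the actual system's data model:

```python
# Illustrative pre-load data quality check for bulk-loaded material masters.
# Field names and per-pallet limits are assumptions for the example.

EXPECTED_PER_PALLET = {"laptop": 32, "desktop": 16}

def validate_material(record):
    """Return a list of data quality errors for one material master record."""
    errors = []
    ptype = record.get("product_type")
    qty = record.get("items_per_pallet")
    if ptype not in EXPECTED_PER_PALLET:
        errors.append(f"unknown product type: {ptype!r}")
    elif qty != EXPECTED_PER_PALLET[ptype]:
        errors.append(
            f"{ptype}: items_per_pallet is {qty}, "
            f"expected {EXPECTED_PER_PALLET[ptype]}"
        )
    return errors

if __name__ == "__main__":
    records = [
        {"material": "M-1001", "product_type": "laptop", "items_per_pallet": 32},
        {"material": "M-1002", "product_type": "desktop", "items_per_pallet": 32},
    ]
    for rec in records:
        for err in validate_material(rec):
            print(f"{rec['material']}: {err}")
```

Run against the sample records, only the second one (a desktop wrongly set to 32 per pallet) is flagged. A gate like this between the spreadsheet and the load script costs almost nothing and would have kept the test-data shortcut from polluting production.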

And perhaps most interestingly:

People will believe the system, even if their instinct tells them otherwise.

Which is probably one of the best reasons to ensure your data is correct. This was a clear demonstration of how poor data quality can directly and negatively affect daily business processes.

A real pity that this incident predated smartphones, as well as Master Data Management and workflow tools. If it happened today, I might have a great photo to remind me of the importance of accurate master data.