Category Archives: Master Data Management

Are You Looking For A New Information Governance Framework?

A few months ago, while addressing a room full of IT and business professionals at an Information Governance conference, a CFO said, “… if we designed our systems today from scratch, they would look nothing like the environment we own.” He went on to elaborate that they arrived there by layering thousands of good and valid decisions on top of one another.

Similarly, Information Governance evolved out of the good work done by those who preceded us, into something only a few could have envisioned at the time. Along the way, technology evolved and changed the way we interact with data to manage our daily tasks. What started as good engineering practices for mainframes gave way to data management.

Then, with technological advances, we encountered new problems, introduced new tasks and disciplines, and created Information Governance in the process. We were standing on the shoulders of data management, armed with new solutions to new problems. Now we face the four Vs of big data, and each of these new data system characteristics has introduced a new set of challenges, driving the need for Big Data Information Governance as a response to changing volume, velocity, variety, and veracity.

Do you think we need a different framework?

Before I answer this question, I must ask you: “How comprehensive is the framework you are using today, and how well does it scale to address the new challenges?”

There are several frameworks in the marketplace to choose from. In this blog, I will tell you what questions you need to ask yourself before replacing your old framework with a new one:

Q. Is it nimble?

The focus of data governance practices must allow for nimble responses to changes in technology, customer needs, and internal processes. The organization must be able to respond to emergent technology.

Q. Will it enable you to apply policies and regulations to data brought into the organization by a person or process?

  • Public company: Meet the obligation to protect the investment of the shareholders and manage risk while creating value.
  • Private company: Meet privacy laws even if financial regulations are not applicable.
  • Fulfill the obligations of external regulations from international, national, regional, and local governments.

Q. How does it manage quality?

For big data, the data must be fit for purpose; context might need to be hypothesized for evaluation. Quality does not imply cleansing activities, which might mask the results.
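To make “fit for purpose” concrete, here is a minimal sketch (my own illustration, not from any particular product; the field names are hypothetical) of profiling raw data for completeness without cleansing it, so the evaluation does not mask what the data really looks like:

```python
# Hedged sketch: profile raw records for fitness-for-purpose
# without cleansing them. All field names are hypothetical.

def profile(records, required_fields):
    """Report per-field completeness; leaves the data untouched."""
    total = len(records)
    report = {}
    for field in required_fields:
        present = sum(1 for r in records if r.get(field) not in (None, ""))
        report[field] = present / total if total else 0.0
    return report

records = [
    {"device_id": "A1", "region": "EU", "reading": 4.2},
    {"device_id": "A2", "region": "", "reading": 3.9},   # empty region
    {"device_id": "A3", "region": "US"},                 # missing reading
]

print(profile(records, ["device_id", "region", "reading"]))
```

A report like this lets the business judge whether the data is fit for the intended analysis, without the masking effect that premature cleansing would introduce.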

Q. Does it understand your complete business and information flow?

Attribution and lineage are very important in big data. Knowing the source and the destination is crucial to validating analytics results as fit for purpose.

Q. Does it understand the language that you use, and can the framework manage it actively to reduce ambiguity, redundancy, and inconsistency?

Big data might not have a logical data model, so any structured data should be mapped to the enterprise model. Big data still has context and thus modeling becomes increasingly important to creating knowledge and understanding. The definitions evolve over time and the enterprise must plan to manage the shifting meaning.

Q. Does it manage classification?

It is critical for the business steward to classify the overall source and its contents as soon as the data is brought in by its owner, in support of information lifecycle management, access control, and regulatory compliance.

Q. How does it protect data quality and access?

Your information protection must not be compromised for the sake of expediency, convenience, or deadlines. Protect not just what you bring in, but what you join/link it to, and what you derive. Your customers will fault you for failing to protect them from malicious links. The enterprise must formulate the strategy to deal with more data, longer retention periods, more data subject to experimentation, and less process around it, all while trying to derive more value over longer periods.

Q. Does it foster stewardship?

Ensuring the appropriate use and reuse of data requires the action of an employee. That is, this role cannot be automated, and it requires the active involvement of a member of the business organization to serve as the steward over the data element or source.

Q. Does it manage long-term requirements?

Policies and standards are the mechanism by which management communicates their long-range business requirements. They are essential to an effective governance program.

Q. How does it manage feedback?

As a companion to policies and standards, an escalation and exception process enables communication throughout the organization when policies and standards conflict with new business requirements. It forms the core process to drive improvements to the policy and standard documents.

Q. Does it foster innovation?

Governance must not squelch innovation. Governance can and should make accommodations for new ideas and growth. This is managed through management of the infrastructure environments as part of the architecture.

Q. How does it control third-party content?

Third-party data plays an expanding role in big data. It comes in three types, and governance controls must be adequate for the circumstances. They must account for applicable regulations in the operating geographic regions; therefore, you must understand and manage those obligations.

Posted in Big Data, Data Governance, Master Data Management

Brandspiration – The Story of the Gap Stick and an Award Winner

This is the story of a great speaker, a simple but funny product, and a Ventana Award-winning company that practices “Brandspiration”.

When I invited Dale Denham, CIO of Geiger, to speak at Informatica World this year, I was not sure what I would get. I only knew that Dale is known as an entertaining speaker. What could we expect from a person who calls himself “the selling CIO”?

And Dale delivered. He opened his session “How product information in ecommerce improved Geiger’s ability to promote and sell promotional products” with a video.

What I liked about it: it is a simple product, addressing an everyday problem everybody knows. And this is the business of Geiger & Crestline, two brands in one company that sell promotional products to help companies inspire with their brand. They call it “Brandspiration”.

What does this have to do with PIM?

Well, the business need for Geiger was to sell hundreds of thousands of products more efficiently, which includes updating products faster and more accurately and adding more products. Geiger was also planning to:

  • Eliminate reliance on ERP
  • Launch new web properties
  • Improve SEO
  • Centralize product management & control
  • Standardize business processes & workflows
  • Produce the print catalog faster

Before working with Informatica PIM, it took a week to launch a new product. And Geiger/Crestline has complex price management for bundles, brands, packages and more under its two brands, which serve two different target groups: low-price products with aggressive pricing, and higher-quality promotional products.

Business Outcomes

With PIM, product entry time was reduced by about an hour. Geiger achieved 25% time savings on catalog creation and implemented PIM in about six months (by the way, with the integrator “Ideosity”). Another fact that made me proud of our offering: Dale told me his company was able to upgrade to the latest PIM version within hours.

“PIM has allowed us to be more proactive instead of being handcuffed to a system that made us reactive. A great investment for this company. I can’t believe we survived for as long as we did without this software.”
Dale Denham, CIO

Watch the video of Dale and how his company Geiger realizes Brandspiration with Informatica PIM. Did you know Geiger is a proud winner of the Ventana Research Innovation Award for its PIM initiative?

Posted in Master Data Management, PIM, Product Information Management, Retail

We Are Sports – SportScheck Omnichannel Retail

Are you a manager dedicated to fashion, B2C or retail? This blog provides an overview of what companies can learn about omnichannel retail from SportScheck.

SportScheck is one of Germany’s most successful multichannel businesses. SportScheck (by the way, a Ventana Research Innovation Award winner) is an equipment and clothing specialist for almost every sport, and its website gets over 52 million hits per year, making it one of the most successful online stores in Germany.

Each year, more than a million customers sign up for the mail-order business, while over 17 million customers visit its brick-and-mortar stores (Source). These figures undoubtedly describe the success of SportScheck’s multichannel strategy. SportScheck also strives to deliver innovative concepts in all of its sales channels, while always aiming to provide customers with the best shopping experience possible. This philosophy can be carried out only in conjunction with modern systems landscapes and optimized processes.

Complete, reliable, and attractive information – across every channel – is the key to a great customer experience and better sales. It’s hard to keep up with customer demands in a single channel, much less multiple channels. Download The Informed Purchase Journey: the informed purchase journey requires the right product, for the right customer, at the right place. Enjoy the video!

What is the Business Initiative at SportScheck?

  • Provide customers the same deals across all sales channels with a centralized location for all product information
  • Improve customer service in all sales channels with perfect product data
  • Make sure customers have enough product information to make a purchase without the order being returned

Intelligent and Agile Processes are Key to Success

“Good customer service, whether online, in-store, or in print, needs perfect product data” said Alexander Pischetsrieder in an interview. At the Munich-based sporting goods retailer, there had been no centralized system for product data before now. After extensive research and evaluation, the company decided to implement the product information management (PIM) system from Informatica.

The main reason for the introduction of Informatica Product Information Management (PIM) solutions was its support for a true multichannel strategy. Customers should have access to the same deals across all sales channels. In addition to making a breadth of information available, customer service still remains key.

In times when information is THE killer app, the key challenges are keeping information up to date and ensuring efficient processes. In a retail scenario, product catalog onboarding starts with PIM to get the latest product information. An always up-to-date dataset in the relevant systems is a further foundation, allowing companies to react immediately to market movements and implement marketing requirements as quickly as possible. Data must be exchanged between the systems practically in real time. If you want to learn more details about how SportScheck solved the technical integration between SAP ERP and Informatica PIM, see the PIM case study.

Product Data Equals Demonstrated Expertise

“I am convinced that a well-presented product with lots of pictures and details sells better. For us, this signals knowing our product. That sets us apart from the large discount stores,” notes Alexander Pischetsrieder. “In the end, we have to ask: who is the customer going to trust? We gain trust here with our product knowledge and our love of sports in general. Just like our motto says, ‘We get our fans excited.’ By offering a professional search engine, product comparisons, and many other features, PIM adds value not only in ecommerce – and that gets us excited!”

Benefits for SportScheck

  • Centralized location for all product information across all sales channels
  • An agile system that is capable of interweaving the different retail processes across sales channels into a smooth, cross-channel function
  • Self-Service portal for agencies and suppliers with direct upload to the PIM system

For German readers I can highly recommend this video on the customer use case. If you are interested in more details, ask me on Twitter @benrund.

PS: This blog is based on the PIM case study on SportScheck.

Posted in Master Data Management, PIM, Product Information Management, Real-Time, Retail, Uncategorized

8 Information Management Challenges for UDI Compliance

“My team spends far too much time pulling together medical device data that’s scattered across different systems and reconciling it in spreadsheets to create compliance reports.” This quotation from a regulatory affairs leader at a medical device manufacturer highlights the impact of poorly managed medical device data on compliance reporting, such as the reports needed for the FDA’s Unique Device Identification (UDI) regulation. In fact, an overreliance on manual, time-consuming processes brings an increased risk of human error in UDI compliance reports.

Is your compliance team manually reconciling data for UDI compliance reports?

If you are an information management leader working for a medical device manufacturer, and your compliance team needs quick and easy access to medical device data for UDI compliance reporting, I have five questions for you:

1) How many Class III and Class II devices do you have?
2) How many systems or reporting data stores contain data about these medical devices?
3) How much time do employees spend manually fixing data errors before the data can be used for reporting?
4) How do you plan to manage medical device data so the compliance team can quickly and easily produce accurate reports for UDI Compliance?
5) How do you plan to help the compliance team manage the multi-step submission process?

Watch this on-demand webinar “3 EIM Best Practices for UDI Compliance”

For some helpful advice from data management experts, watch this on-demand webinar “3 Enterprise Information Management (EIM) Best Practices for UDI Compliance.”

The deadline to submit the first UDI compliance report to the FDA for Class III devices is September 24, 2014. But the medical device data needed to produce the report is typically scattered among different internal systems, such as Enterprise Resource Planning (ERP) systems (e.g., SAP and JD Edwards), Product Lifecycle Management (PLM), Manufacturing Execution Systems (MES), and external third-party device identifiers.

The traditional approach to poorly managed data is for the compliance team to burn the midnight oil, bringing together and then manually reconciling all the medical device data in a spreadsheet. And they have to do this each and every time a compliance report is due. The good news is your compliance team doesn’t have to.

Many medical device manufacturers are leveraging their existing data governance programs, supported by a combination of data integration, data quality and master data management (MDM) technology, to eliminate the need for manual data reconciliation. They are centralizing their medical device data management, so they have a single source of trusted medical device data for UDI compliance reporting as well as other compliance and revenue-generating initiatives.
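As a rough illustration of the centralization idea (a hedged sketch with hypothetical system and field names, not the Informatica solution itself), consolidating device records keyed on a shared identifier can surface conflicts for a data steward instead of burying them in spreadsheets:

```python
# Hedged sketch: consolidate device records from multiple systems
# into one record per device ID, flagging conflicting values for a
# steward rather than silently overwriting them.
# System and field names are hypothetical.

def consolidate(sources):
    golden, conflicts = {}, []
    for system, records in sources.items():
        for rec in records:
            key = rec["device_id"]
            merged = golden.setdefault(key, {"device_id": key})
            for field, value in rec.items():
                if field == "device_id":
                    continue
                if field in merged and merged[field] != value:
                    # keep the first value, record the disagreement
                    conflicts.append((key, field, merged[field], value))
                else:
                    merged[field] = value
    return golden, conflicts

erp = [{"device_id": "D-100", "class": "III", "name": "Stent X"}]
plm = [{"device_id": "D-100", "class": "II"}]  # conflicting class
golden, conflicts = consolidate({"ERP": erp, "PLM": plm})
print(conflicts)  # the class mismatch surfaces for steward review
```

In a real MDM deployment, matching would use survivorship rules and fuzzy matching rather than a first-wins merge; the point here is only that conflicts become visible and governable.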

Get UDI data management advice from data experts Kelle O’Neal, Managing Partner at First San Francisco Partners, and Bryan Balding, MDM Specialist at Informatica

During this on-demand webinar, Kelle O’Neal, Managing Partner at First San Francisco Partners, covers the eight information management challenges for UDI compliance as well as best practices for medical device data management.

Bryan Balding, MDM Solution Specialist at Informatica, shows you how to apply these best practices with the Informatica UDI Compliance Solution.

You’ll learn how to automate the process of capturing, managing and sharing medical device data to make it quicker and easier to create the reports needed for UDI compliance on an ongoing basis.

20 Questions & Answers about Complying with the FDA Requirement for Unique Device Identification (UDI)

Also, we just published a joint whitepaper with First San Francisco Partners, Information Management FAQ for UDI: 20 Questions & Answers about Complying with the FDA Requirement for Unique Device Identification (UDI). Get answers to questions such as:

  • What is needed to support an EIM strategy for UDI compliance?
  • What role does data governance play in UDI compliance?
  • What are the components of a successful data governance program?
  • Why should I centralize my business-critical medical device data?
  • What does the architecture of a UDI compliance solution look like?

I invite you to download the UDI compliance FAQ now and share your feedback in the comments section below.

Posted in Data Governance, Data Integration, Data Quality, Enterprise Data Management, Life Sciences, Manufacturing, Master Data Management, Vertical

CSI: “Enter Location Here”

Last time I talked about how benchmark data can be used in IT and business use cases to illustrate the financial value of data management technologies.  This time, let’s look at additional use cases, and at how to philosophically interpret the findings.

We have all philosophies covered

So here are some additional areas of investigation for justifying a data quality based data management initiative:

  • Compliance or other audit data and report preparation and rebuttal (FTE cost as above)
  • Excess insurance premiums on incorrect asset or party information
  • Excess tax payments due to incorrect asset configuration or location
  • Excess travel or idle time between jobs due to incorrect location information
  • Excess equipment downtime (not revenue generating) or MTTR due to incorrect asset profile or misaligned reference data not triggering timely repairs
  • Incorrect equipment location or ownership data splitting service costs or revenues incorrectly
  • Party relationship data not tied together creating duplicate contacts or less relevant offers and lower response rates
  • Lower than industry average cross-sell conversion ratio due to inability to match and link departmental customer records and underlying transactions and expose them to all POS channels
  • Lower than industry average customer retention rate due to lack of full client transactional profile across channels or product lines to improve service experience or apply discounts
  • Low annual supplier discounts due to incorrect or missing alternate product data or aggregated channel purchase data

I could go on forever, but allow me to touch on a sensitive topic – fines. Fines, or performance penalties by private or government entities, only make sense to bake into your analysis if they happen repeatedly in fairly predictable intervals and are “relatively” small per incidence.  They should be treated like M&A activity. Nobody will buy into cost savings in the gazillions if a transaction only happens once every ten years. That’s like building a business case for a lottery win or a life insurance payout with a sample size of a family.  Sure, if it happens you just made the case but will it happen…soon?

Use benchmarks and ranges wisely but don’t over-think the exercise either.  It will become paralysis by analysis.  If you want to make it super-scientific, hire an expensive consulting firm for a 3 month $250,000 to $500,000 engagement and have every staffer spend a few days with them away from their day job to make you feel 10% better about the numbers.  Was that worth half a million dollars just in 3rd party cost?  You be the judge.

In the end, you are trying to find out and position whether a technology will fix a $50,000, $5 million or $50 million problem. You are also trying to gauge where the key areas of improvement are in terms of value, and correlate the associated cost (higher value normally equals higher cost due to higher complexity) and risk. After all, who wants to stand before a budget committee, prophesy massive savings in one area, and then fail because it would have been smarter to start with a simpler, quicker win to build upon?

The secret sauce to avoiding this consulting expense and risk is a natural curiosity, willingness to do the legwork of finding industry benchmark data, knowing what goes into them (process versus data improvement capabilities) to avoid inappropriate extrapolation and using sensitivity analysis to hedge your bets.  Moreover, trust an (internal?) expert to indicate wider implications and trade-offs.  Most importantly, you have to be a communicator willing to talk to many folks on the business side and have criminal interrogation qualities, not unlike in your run-of-the-mill crime show.  Some folks just don’t want to talk, often because they have ulterior motives (protecting their legacy investment or process) or hiding skeletons in the closet (recent bad performance).  In this case, find more amenable people to quiz or pry the information out of these tough nuts, if you can.
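The sensitivity-analysis step mentioned above can be sketched in a few lines; every figure below is an illustrative placeholder, not a real industry benchmark:

```python
# Hedged sketch of ROI sensitivity analysis: run the same savings
# model under low / base / high benchmark assumptions to produce a
# defensible range instead of a single point estimate.
# All figures are illustrative placeholders.

def roi(annual_savings, cost):
    """Simple first-year ROI: net benefit over cost."""
    return (annual_savings - cost) / cost

cost = 500_000            # hypothetical project cost
transactions = 1_000_000  # hypothetical annual transaction volume
scenarios = {
    "low":  {"error_rate": 0.02, "value_per_error": 50},
    "base": {"error_rate": 0.05, "value_per_error": 75},
    "high": {"error_rate": 0.08, "value_per_error": 100},
}

for name, s in scenarios.items():
    savings = transactions * s["error_rate"] * s["value_per_error"]
    print(f"{name}: savings=${savings:,.0f}, ROI={roi(savings, cost):.0%}")
```

Presenting the low/base/high range, rather than one astronomical number, is exactly the hedge that keeps a budget committee from dismissing the whole analysis.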

CSI: “Enter Location Here”

Lastly, if you find ROI numbers that appear astronomical at first, remember that leverage is a key factor. If a technical capability touches one application (credit risk scoring engine), one process (quotation), one type of transaction (talent management self-service), or a limited set of people (procurement), the ROI will be lower than for a technology touching multiple of each. If your business model drives thousands of high-value (thousands of dollars) transactions rather than ten twenty-million-dollar ones or twenty million one-dollar ones, your ROI will be higher. After all, consider this: retail e-mail marketing campaigns average an ROI of 578% (softwareprojects.com), and that is with really bad data. Imagine what improved data can do on that front alone.

I found massive differences between what improved asset data can deliver in a petrochemical or utility company versus product data in a fashion retailer or customer (loyalty) data in a hospitality chain. The assertion of cum hoc ergo propter hoc is a key assumption in how technology delivers financial value. As long as the business folks agree or can fence in the relationship, you are on the right path.

What were your best and worst attempts at justifying someone giving you money to invest? Share that story.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Quality, Governance, Risk and Compliance, Master Data Management, Mergers and Acquisitions

Is the Internet of Things relevant for the government?

Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fundraising for at least one charity. Similarly, IoT may prove transformational to the business of the public sector, should government step up to the challenge.


Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming, televisions, watches, Google glasses, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations, which are connecting people to places, data and other people at a record pace.

It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connecting to the Internet interact and share data, regardless of their location, manufacturer or format and make or find connections that may have been previously undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.

One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.

This is the Internet of Things for government. This is the challenge. This is transformation.

This article was originally published on www.federaltimes.com. Please view the original listing here

 

Posted in Big Data, Business Impact / Benefits, Data Integration, Data Security, Master Data Management, Public Sector, Uncategorized

The Catalog is Dead – Long Live the Catalog?

The Catalog is Dead.

Print solution provider Werk II came up with a provocative marketing campaign in 2012. Their ads have been designed like the obituary notice for the “Main Catalog” which is “no longer with us”…

According to the Multi Channel Merchant Outlook 2014 survey, the eCommerce website (not a surprise ;-) ) is the top channel through which merchants market (90%). The social media (87.2%) and email (83%) channels follow close behind. Although the catalog may have declined as a marketing tool, 51.7% of retailers said they still use it to market their brands.

importance of channels chart

Source: MCM Outlook 2014

The Changing Role of the Catalog

Merchants are still using catalogs to sell products. However, the catalog’s role has changed from transactional vehicle to sales tool. On a scale of 1 to 10, with 10 being the most important, merchant respondents said that using catalogs as mobile traffic drivers and customer retention tools were the most important activities (both scored 8.25). At 7.85, web traffic driver was a close third.

methods of prospecting chart

Source: MCM Outlook 2014

Long Live the Catalog: Prospecting 

More than three-quarters of merchant respondents said catalogs were their top choice of prospecting method for the next 12 months (77.7%). Catalog was the most popular answer, followed by Facebook (68%), email (66%), Twitter (42.7%) and Pinterest (40.8%).

What is your point of view?

How have catalogs changed in your business? What are your plans and outlook for 2015? It would be very interesting to hear points of view from different industries and countries… I’d be happy to discuss here or on Twitter @benrund. My favorite fashion retailer keeps sending me a stylish catalog, which makes me order online. Brands, retailers, consumers – how do you act, what do you expect?

Posted in B2B, Manufacturing, Master Data Management, PIM, Product Information Management, Retail, Uncategorized

What’s In A Name?

Sometimes, the choice of a name has unexpected consequences. Often these consequences aren’t fair, but they exist nonetheless. For an example, consider the well-known National Bureau of Economic Research study that compares the hiring prospects of candidates with identical resumes but different names. In the study, titled a “Field Experiment on Labor Market Discrimination,” employers were found to be more likely to reply to candidates with popular, traditionally Caucasian names than to candidates with either unique, eclectic names or traditionally African-American names. Though these biases are clearly unfair to the candidates, they do illustrate a key point: one’s choice when naming something comes with perceptions that influence outcomes.

For an example from the IT world, consider my recent engagement at a regional retail bank, where half of the meeting time was consumed by IT and business leaders debating how to label their Master Data Management (MDM) initiative. Consider these excerpts:

  • Should we even call it MDM? Answer: No. Why? Because nobody on the business side will understand what that means. Also, as we just implemented a data warehouse/mart last year and are in the middle of our new CRM roll-out, everybody in business and retail banking will assume their data is already mastered in both. On a side note, telcos understand MDM as Mobile Device Management.
  • Should we call it “Enterprise Data Master”? Answer: No. Why? Because unless you roll out all data domains and all functionality (standardization, matching, governance, hierarchy management, etc.) to the whole enterprise, you cannot. And doing so is a bad idea, as it is with every IT project: boiling the ocean and going live with a big bang is high cost and high risk, and given shifting organizational strategies and leadership, quick successes are needed to sustain the momentum.
  • Should we call it “Data Warehouse – Release 2”? Answer: No. Why? Because it is neither a data warehouse nor version 2 of one. It is a backbone component required to manage a key organizational ingredient – data – in a way that makes it useful to many use cases, processes, applications and people, not just analytics, although analytics is often the starting block. Data warehouses were neither conceived nor designed to facilitate data quality (they assume it is there already), nor are they designed for real-time interactions. Did anybody ask if ETL is “Pneumatic Tubes – Version 2”?
  • Should we call it “CRM Plus”? Answer: No. Why? Because it was never intended or designed to handle the transactional volume and attribution breadth of high-volume use cases driven by complex business processes. Also, if it were a CRM system, it would have a more intricate UI capability beyond comparatively simple data governance workflows and UIs.

Consider this: any data quality solution, like MDM, makes any existing workflow or application better at what it does best: managing customer interactions, creating orders, generating correct invoices, etc. To quote a colleague, “we are the BASF of software.” Few people understand what a chemical looks like or does, but it makes a plastic container sturdy, transparent, flexible and light.

I also explained hierarchy management in a similar way: consider it the LinkedIn network of your company, to which you can attach every interaction and transaction. I see one view, people in my network see a different one, and LinkedIn probably has the most comprehensive view, but ultimately we are all looking at the same core data and structures.

So let’s call the “use” of your MDM “Mr. Clean”, aka Meister Proper, because it keeps everything clean.

While naming is definitely a critical point to consider, given the expectations, fears and reservations that come with MDM and the underlying change management, it was hilarious to see how important it suddenly became. However, it was puzzling to me (maybe a naïve perspective) why mostly recent IT hires had to categorize everything into new, unique functional boxes, while business and legacy IT people wanted to re-purpose existing boxes. I guess the recent hires used their approach to showcase familiarity with new technologies and techniques, which was likely a reason for their employment. Business leaders, often with the exception of highly accomplished and well-regarded ones, as well as legacy IT leaders, needed to be reassured of continuity and the absence of disruption or change. Moreover, they also needed to justify the value proposition of their prior software investments.

Aside from company financial performance and regulatory screw-ups, legions of careers will be decided by whether, how, and how successfully this initiative is executed.

Naming a new car model for a 100,000-unit production run, or a shampoo for worldwide sales, could not face much more scrutiny. Software vendors give their future releases internal names of cities like Atlanta or famous people like Socrates, instead of descriptive terms like “Gamification User Interface Release” or “Unstructured Content Miner”. This may be a good avenue for banks and retailers to explore. It would avoid the expectation pitfalls associated with names like “Customer Success Data Mart”, “Enterprise Data Factory”, “Data Aggregator” or “Central Property Repository”. In reality, there will be many applications that can claim bits and pieces of the same data, data volume or functionality. Who will make the call on which one gets renamed or replaced, and explain to the various consumers what happened to it and why?

You can surely name a customer-facing app something more descriptive like “Payment Central” or “Customer Success Point”, but the reason you can do this is that the user will only have one or maybe two points of interface with the organization. Internal data consumers will interact with many more repositories. Similarly, I guess this is why I call my kids by their first name while strangers label them by their full name, “Junior”, “Butter Fingers” or “The Fast Runner”.

I would love to hear some other good reasons why naming conventions should be more scrutinized. Maybe you have some guidance on what should and should not be done, and the reasons for it?

Posted in Business/IT Collaboration, CIO, Data Quality, Master Data Management