Tag Archives: Operational Efficiency
Product Information Management (PIM) is an investment, not a “cost.” This is so important it’s worth repeating: “PIM is an investment.” If you are a retailer of any size (small or big, it makes little difference), it’s likely your most painful challenges include the following:
- Lost sales from out-of-stock issues
- Lost time haggling over return disputes
- Squandered hours reconciling product information and promotion discrepancies across all channels
And, if you’re comfortable experiencing the following, then no change is needed:
- Slower time to market for new product introductions and the corresponding wasted dollars
- Wasted time spent relaying basic product information to customers and partners
- Overall brand erosion
However, for those who want to embark on a journey toward redemption, I will use this post to lay a good foundation and to demystify PIM.
Here are 6 reasons to consider transforming your business using PIM. Note that the priority order will vary by industry and situation.
1) PIM for Operational Efficiency
This boils down to fewer call center questions about basic item information, fewer instances of insufficient inventory levels, and fewer purchase order errors that result in incorrect shipments or adjustments.
2) PIM for New Product Introduction (NPI)
Introducing a new product requires the coordinated efforts of dozens of internal and external staff. It can be a fairly complex task. Even a simple product may require hundreds of attributes, all derived from multiple systems residing within and outside the organization.
3) PIM to Reduce the “Time to Market”
Studies have shown that high-performing companies generate, on average, 61 percent of their sales from successful introductions of new products and services. PIM helps you streamline the process of creating new products and distributing them to eCommerce and other channels in the ecosystem. The faster, the better. Why? First, you will have products on the market before anyone else; second, you will have more time to sell them.
4) PIM for Business Growth and Improved Customer Satisfaction
The instantaneous nature of online retail impacts consistency and adds an additional layer of complexity to the management of product information. Customer satisfaction is (also) correlated with rich, contextual, and consistent product information across sales channels. Companies that lack this discipline experience brand erosion, with a consequent detrimental impact on overall business performance.
5) PIM to Improve Your Supplier Performance
How would you answer your CFO if she asked for the average cost to on-board a product from a supplier? All things considered, my bet is that it would be in the neighborhood of $500-$700. But even if it is $200, you are still in deep water. With a bit of math you can see how this cost compounds across the tens of thousands of SKUs introduced into the market every year. Forward-thinking retailers are leveraging an integrated supplier portal to enhance supplier collaboration. This saves precious time and manpower, often bringing that cost down to less than $5.
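To make that math concrete, here is a minimal sketch using the illustrative figures above (the SKU count is a hypothetical example, not a benchmark):

```python
def annual_onboarding_cost(skus_per_year, cost_per_sku):
    """Total yearly spend on on-boarding supplier products."""
    return skus_per_year * cost_per_sku

# Hypothetical retailer introducing 20,000 SKUs per year.
skus = 20_000
manual = annual_onboarding_cost(skus, 500)  # manual on-boarding at $500/SKU
portal = annual_onboarding_cost(skus, 5)    # supplier portal at $5/SKU

print(f"Manual:  ${manual:,}")            # $10,000,000
print(f"Portal:  ${portal:,}")            # $100,000
print(f"Savings: ${manual - portal:,}")   # $9,900,000
```

Even at the conservative $200 figure, the same arithmetic yields $4 million a year, which is why the per-SKU cost is worth tracking.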
6) PIM for Omni-Channel Enablement
A retail omni-channel strategy cannot ignore the management of product information. Retailers need to collect information from multiple sources, optimize content, and facilitate timely distribution of that content across multiple channels. Very often, though, information in stores, eCommerce sites, mobile apps, and print catalogs simply doesn’t match up. Often, it is difficult to connect products and customers, resulting in poor customer experiences. This is a large topic that will be covered in upcoming posts.
Looking for more information on how PIM can actually lay the foundations to deliver on these promises? There is more in my previous blog posts here, here, and here. There is also a SlideShare presentation here. Finally, we have a new eBook called The Informed Purchase Journey available here.
Analyzing current business trends helps illustrate how difficult and complex the Communication Service Provider business environment has become. CSPs face many challenges. Clients expect high quality, affordable content that can move between devices with minimum advertising or privacy concerns. To illustrate this phenomenon, here are a few recent examples:
- Apple is working with Comcast/NBC Universal on a new converged offering
- Vodafone purchased the Spanish cable operator Ono and must quickly separate the wireless customers from the cable ones while cross-selling existing products
- Net neutrality has been scuttled in the US and upheld in the EU so now a US CSP can give preferential bandwidth to content providers, generating higher margins
- Microsoft’s Xbox community collects terabytes of data every day, making effective use, storage and disposal in line with local data retention regulations a challenge
- Expensive 4G LTE infrastructure investment by operators such as Reliance is bringing streaming content to tens of millions of new consumers
To quickly capitalize on “new” (often old, but unknown) data sources, there has to be a common understanding of:
- Where the data is
- What state it is in
- What it means
- What volume and attributes are required to accommodate a one-off project vs. a recurring one
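One way to start building that common understanding is to profile each source as you encounter it. The sketch below (field names and records are hypothetical) computes a row count and per-field fill rate, a first pass at "where the data is" and "what state it is in":

```python
def profile(records):
    """Summarize a data source: row count plus per-field fill rate
    (the share of rows where the field is actually populated)."""
    fields = {}
    for row in records:
        for key, value in row.items():
            present, total = fields.get(key, (0, 0))
            fields[key] = (present + (value not in (None, "")), total + 1)
    return {
        "rows": len(records),
        "fill_rate": {k: present / total
                      for k, (present, total) in fields.items()},
    }

# Hypothetical subscriber extract with one missing email address.
rows = [
    {"msisdn": "447700900001", "email": "a@example.com"},
    {"msisdn": "447700900002", "email": ""},
]
print(profile(rows))  # {'rows': 2, 'fill_rate': {'msisdn': 1.0, 'email': 0.5}}
```

A summary like this, run per source, is often enough to separate a one-off extract from a source fit for a recurring project.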
When a multitude of departments request data for analytical projects with their one-off, IT-unsanctioned on-premise or cloud applications, how will you go about it? The average European operator has between 400 and 1,500 (known) applications. Imagine what the unknown count is.
A European operator with 20-30 million subscribers incurs an average of $3 million per month due to unpaid invoices. This often results from incorrect or incomplete contact information. Imagine how much you would have to add for lost productivity efforts, including gathering, re-formatting, enriching, checking and sending invoices. And this does not even account for late invoice payments or extended incorrect credit terms.
Think about all the flawed long-term conclusions being drawn from this bad data. This single data problem creates indirect costs in excess of three times the initial, direct impact of unpaid invoices.
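To put rough numbers on that, here is a minimal sketch using the figures above (the three-times multiplier is this post's estimate, not a general rule):

```python
def bad_data_cost(direct_monthly, indirect_multiplier=3):
    """Yearly direct cost of unpaid invoices, plus the total once
    indirect effects (estimated above at 3x direct) are included."""
    direct_yearly = direct_monthly * 12
    total_yearly = direct_yearly * (1 + indirect_multiplier)
    return direct_yearly, total_yearly

# $3M/month in unpaid invoices, per the European-operator figure above.
direct, total = bad_data_cost(3_000_000)
print(f"Direct cost of unpaid invoices: ${direct:,}/year")  # $36,000,000/year
print(f"Direct + indirect cost:         ${total:,}/year")   # $144,000,000/year
```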
Want to fix your data and overcome the accelerating cost of change? Involve your marketing, CEM, strategy, finance and sales leaders to help them understand data’s impact on the bottom line.
Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.
In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:
- Vision and Business Case to deliver business value
- Tools and Architecture to support architectural scope of data governance
- Policies that make up the data governance function (security, archiving, etc.)
- Measurement: measuring the level of influence of a data governance initiative and measuring its effectiveness (business value metrics, ROI metrics, such as increasing revenue, improving operational efficiency, reducing risk, reducing cost or improving customer satisfaction)
- Change Management: incentives for the workforce, partners and customers to get better-quality data in, and potential repercussions if data quality is poor
- Organizational Alignment: how the organization will work together across silos
- Dependent Processes: identifying data lifecycles (capturing, reporting, purchasing and updating data in your environment), all processes consuming the data, and the processes to store and manage the data
- Program Management: effective program management skills to build out communication strategy, measurement strategy and a focal point to escalate issues to senior management when necessary
- Defined Processes that make up the data governance function (discovery, definition, application, and measuring and monitoring)
For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.
Friday August 5, 2011 set new records for trading volume around the world. According to this FT.com story: “The amount of data generated by the day’s trading in US futures and equities alone saw over 130m trades on Friday, generating 950 gigabytes of data, according to Nanex, a market data provider.” In London, “some exchanges with older technology could not cope”. And so Big Data strikes again.
But market data volume has been exploding for months, even years. This is just one more chapter in a long story, illustrating the types of problems that a business could encounter if they neglect their technical infrastructure in the face of data volume growth. (more…)
Enterprises use Hadoop in data-science applications that improve operational efficiency, grow revenues or reduce risk. Many of these data-intensive applications use Hadoop for log analysis, data mining, machine learning or image processing.
Commercial, open source or internally developed data-science applications have to tackle a lot of semi-structured, unstructured or raw data. They benefit from Hadoop’s combination of storage and processing in each data node, spread across a cluster of cost-effective commodity hardware. Hadoop’s lack of a fixed schema works particularly well for answering ad-hoc queries and exploratory “what if” scenarios.
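As a minimal illustration of the log-analysis pattern, here is a Hadoop Streaming-style mapper and reducer sketched in Python. The log format and field positions are assumptions for the example; in a real job, the two functions run as separate processes with Hadoop's shuffle phase sorting between them:

```python
from itertools import groupby

def mapper(lines):
    """Map step: emit (status_code, 1) for each access-log line.
    Assumes common log format, where the HTTP status code is the
    ninth whitespace-separated field -- an assumption for this sketch."""
    for line in lines:
        fields = line.split()
        if len(fields) > 8:
            yield fields[8], 1

def reducer(pairs):
    """Reduce step: sum the counts for each key. Input must arrive
    sorted by key, which Hadoop's shuffle phase guarantees."""
    for key, group in groupby(pairs, key=lambda kv: kv[0]):
        yield key, sum(count for _, count in group)

# Chain the two stages in-process on a few made-up log lines to
# show the data flow (Hadoop would do this across the cluster).
logs = [
    '127.0.0.1 - - [05/Aug/2011:10:00:00 +0000] "GET / HTTP/1.1" 200 1024',
    '127.0.0.1 - - [05/Aug/2011:10:00:01 +0000] "GET /a HTTP/1.1" 404 512',
    '10.0.0.2 - - [05/Aug/2011:10:00:02 +0000] "GET / HTTP/1.1" 200 2048',
]
print(dict(reducer(sorted(mapper(logs)))))  # {'200': 2, '404': 1}
```

Because storage and processing live together on each node, the same pattern scales from this toy list to terabytes of logs without changing the mapper or reducer.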
With our Informatica9 World Tour in full swing, I have found the last month to be extremely exhilarating meeting clients across North America, South America, Europe and Australia. I have met with banking clients in New York, major telco operators in Brazil, government institutions in Sweden and retailers in Australia.
Several people have asked me if there have been any major differences from region to region. My observation is “yes … and no”. Let me explain:
I had the pleasure recently to attend a briefing by CxO media (publishers of CIO magazine). They had completed their annual CIO survey looking into the world of the CIO with the objective of understanding how the role of the CIO continues to evolve in today’s business climate and to help define the CIO agenda for 2009. They had over 500 CIOs from North American headquartered midsize and enterprise companies participate.
There were a few very interesting trends that jumped out at me which I wanted to share:
So you’ve managed to reduce the portion of your IT budget devoted to KTLO (keep-the-lights-on) work by automating many of the manual-intensive integration processes using the latest data integration platform.
In doing this, you lowered your upfront TCO through ease-of-use, prebuilt connectivity, reusable logic and rules, and are benefiting from lower ongoing costs through scalability and performance, ease of administration and seamless upgrades.
You found the staff required to manage the myriad integration challenges across your IT shop by tapping into the large community of available Informatica specialists, and you implemented an Integration Competency Center (ICC), mirroring companies like T. Rowe Price, HP and Avaya that achieved significant savings through their ICCs. (This might have come about from your conversation with Gartner, whose research on ICCs convinced you that you could achieve 25% reuse of integration components, 30% savings in integration development time and costs, and 20% savings in maintenance costs.)
By the way, have you tried the ICC calculator using this data to show how much you could save?
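For a back-of-the-envelope version of that calculation, here is a minimal sketch using the Gartner benchmark rates cited above (the budget figures are hypothetical, not drawn from the calculator itself):

```python
def icc_savings(dev_budget, maintenance_budget,
                dev_savings_rate=0.30, maint_savings_rate=0.20):
    """Estimated annual ICC savings using the benchmark rates cited
    above: 30% of development cost and 20% of maintenance cost."""
    return (dev_budget * dev_savings_rate
            + maintenance_budget * maint_savings_rate)

# Hypothetical integration budget: $2M development, $1M maintenance.
print(f"${icc_savings(2_000_000, 1_000_000):,.0f}")  # $800,000
```

The real calculator factors in more variables, but even this rough pass shows why the ICC business case tends to write itself at enterprise budget levels.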
OK, so you did all that, and you have a more efficient IT shop with savings enough to roll out that single critical application you’ve been asked to deliver for the last six months. Now what? (more…)
People often ask me how Informatica adds value to our customers – it’s a pretty simple question that I’ve answered a thousand times. Rather than become bogged down in industry acronyms or discussing the technical aspects of our solutions, I thought I’d look at it in a broader light of operational efficiency. I took great delight in a recent posting about how two customers of ours (KPN and RBS) had won industry awards for their Informatica-based projects.
We talk about data integration and data quality, data warehousing, data migration and Integration Competency Centers. What does this all mean at the end of the day to our customers? (more…)