Tag Archives: Operational Efficiency
Analyzing current business trends helps illustrate how difficult and complex the Communication Service Provider (CSP) business environment has become. Customers expect high-quality, affordable content that can move between devices with minimal advertising or privacy concerns. To illustrate this phenomenon, here are a few recent examples:
- Apple is working with Comcast/NBC Universal on a new converged offering
- Vodafone purchased the Spanish cable operator Ono and must now quickly separate the wireless customers from the cable ones while cross-selling existing products
- Net neutrality has been scuttled in the US and upheld in the EU, so a US CSP can now give preferential bandwidth to content providers, generating higher margins
- Microsoft’s Xbox community collects terabytes of data every day, making effective use, storage and disposal of that data, in line with local data-retention regulations, a challenge
- Expensive 4G LTE infrastructure investment by operators such as Reliance is bringing streaming content to tens of millions of new consumers
To quickly capitalize on “new” (often old, but unknown) data sources, there has to be a common understanding of:
- Where the data is
- What state it is in
- What it means
- What volume and attributes are required to accommodate a one-off project vs. a recurring one
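One lightweight way to capture that common understanding is a shared inventory record per data source. The sketch below is purely illustrative; all names, fields and values are assumptions, not part of any particular product:

```python
from dataclasses import dataclass, field

# A minimal, hypothetical inventory record answering the four questions
# above: where the data is, what state it is in, what it means, and what
# volume/attributes a project would need from it.
@dataclass
class DataSourceEntry:
    name: str
    location: str               # where the data lives (system, path, URI)
    state: str                  # e.g. "raw", "cleansed", "certified"
    definition: str             # what the data means in business terms
    monthly_volume_gb: float    # rough size, to plan one-off vs. recurring use
    attributes: list = field(default_factory=list)

    def covers(self, needed_attributes):
        """Check whether this source covers a project's attribute needs."""
        return set(needed_attributes).issubset(self.attributes)

crm = DataSourceEntry(
    name="CRM contacts",
    location="crm-db.internal/contacts",
    state="raw",
    definition="Subscriber contact and billing details",
    monthly_volume_gb=120.0,
    attributes=["name", "address", "email", "msisdn"],
)
print(crm.covers(["email", "msisdn"]))   # True: the source has these fields
```

Even a registry this simple lets a requesting department check fitness for purpose before IT gets pulled into yet another one-off extract.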
When a multitude of departments request data for analytical projects through their one-off, IT-unsanctioned on-premises or cloud applications, how will you go about it? The average European operator has between 400 and 1,500 known applications. Imagine what the unknown count is.
A European operator with 20-30 million subscribers loses an average of $3 million per month to unpaid invoices, often the result of incorrect or incomplete contact information. Imagine how much you would have to add for lost productivity, including gathering, re-formatting, enriching, checking and sending invoices. And this does not even account for late invoice payments or extended incorrect credit terms.
Think about all the flawed long-term conclusions being drawn from this bad data. This single data problem creates indirect costs in excess of three times the initial, direct impact of unpaid invoices.
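The arithmetic behind that claim is simple. This back-of-the-envelope sketch uses only the figures cited above and assumes the three-times multiplier holds month over month:

```python
# Back-of-the-envelope arithmetic using the figures cited above.
direct_monthly = 3_000_000        # unpaid invoices per month (USD)
indirect_multiplier = 3           # indirect cost "in excess of three times"

indirect_monthly = direct_monthly * indirect_multiplier
total_annual = (direct_monthly + indirect_monthly) * 12

print(f"Indirect cost per month: ${indirect_monthly:,}")   # $9,000,000
print(f"Total exposure per year: ${total_annual:,}")       # $144,000,000
```

In other words, a single contact-data problem can quietly compound into nine figures of annual exposure.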
Want to fix your data and overcome the accelerating cost of change? Involve your marketing, CEM, strategy, finance and sales leaders to help them understand data’s impact on the bottom line.
Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.
In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:
- Vision and Business Case to deliver business value
- Tools and Architecture to support architectural scope of data governance
- Policies that make up the data governance function (security, archiving, etc.)
- Measurement: measuring the level of influence of a data governance initiative and measuring its effectiveness (business value metrics, ROI metrics, such as increasing revenue, improving operational efficiency, reducing risk, reducing cost or improving customer satisfaction)
- Change Management: incentives for the workforce, partners and customers to get better-quality data in, and potential repercussions if data quality is poor
- Organizational Alignment: how the organization will work together across silos
- Dependent Processes: identifying data lifecycles (how data is captured, reported, purchased and updated in your environment), all processes consuming the data, and the processes that store and manage it
- Program Management: effective program management skills to build out communication strategy, measurement strategy and a focal point to escalate issues to senior management when necessary
- Define Processes that make up the data governance function (discovery, definition, application, and measuring and monitoring)
For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.
Friday August 5, 2011 set new records for trading volume around the world. According to this FT.com story: “The amount of data generated by the day’s trading in US futures and equities alone saw over 130m trades on Friday, generating 950 gigabytes of data, according to Nanex, a market data provider.” In London, “some exchanges with older technology could not cope”. And so Big Data strikes again.
But market data volumes have been exploding for months, even years. This is just one more chapter in a long story, illustrating the kinds of problems a business can encounter if it neglects its technical infrastructure in the face of data-volume growth.
Enterprises use Hadoop in data-science applications that improve operational efficiency, grow revenues or reduce risk. Many of these data-intensive applications use Hadoop for log analysis, data mining, machine learning or image processing.
Commercial, open-source or internally developed data-science applications have to tackle large volumes of semi-structured, unstructured or raw data. They benefit from Hadoop’s combination of storage and processing in each data node, spread across a cluster of cost-effective commodity hardware. Hadoop’s lack of a fixed schema works particularly well for answering ad-hoc queries and exploratory “what if” scenarios.
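To illustrate the schema-on-read idea, here is a small hypothetical Python sketch: no structure is imposed on the raw log lines when they are stored, and each query decides how to interpret them at read time, which is essentially what a Hadoop log-analysis job does at much larger scale. The log format and field positions are assumptions for illustration only:

```python
from collections import Counter

# Hypothetical raw web-server log lines, stored as-is with no schema.
raw_logs = [
    '2011-08-05 10:01:22 GET /home 200 1532',
    '2011-08-05 10:01:23 GET /cart 500 87',
    '2011-08-05 10:01:25 POST /checkout 200 2210',
    '2011-08-05 10:01:26 GET /cart 500 91',
]

def status_counts(lines):
    """Schema-on-read: the query interprets each raw line as it reads it."""
    counts = Counter()
    for line in lines:
        fields = line.split()     # structure is imposed only now
        status = fields[4]        # assumed position of the HTTP status code
        counts[status] += 1
    return counts

print(status_counts(raw_logs))
```

A different ad-hoc question, say bytes transferred per URL, would simply parse the same raw lines differently, with no schema migration needed beforehand.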
With our Informatica9 World Tour in full swing, I have found the last month exhilarating, meeting clients across North America, South America, Europe and Australia: banking clients in New York, major telco operators in Brazil, government institutions in Sweden and retailers in Australia.
Several people have asked me if there have been any major differences from region to region. My observation is “yes … and no”. Let me explain:
I recently had the pleasure of attending a briefing by CxO media (publishers of CIO magazine). They had completed their annual CIO survey, with the objective of understanding how the role of the CIO continues to evolve in today’s business climate and of helping define the CIO agenda for 2009. More than 500 CIOs from midsize and enterprise companies headquartered in North America participated.
A few very interesting trends jumped out at me, which I wanted to share:
So you’ve managed to reduce the portion of your IT budget devoted to keeping the lights on (KTLO) by automating many of the manual-intensive integration processes using the latest data integration platform.
In doing this, you lowered your upfront TCO through ease-of-use, prebuilt connectivity, reusable logic and rules, and are benefiting from lower ongoing costs through scalability and performance, ease of administration and seamless upgrades.
You found the staff required to manage the myriad integration challenges across your IT shop by tapping into the large community of Informatica specialists, and you implemented an Integration Competency Center (ICC), mirroring companies like T. Rowe Price, HP and Avaya, which achieved significant savings through their ICCs. (This might have come about from your conversation with Gartner, whose research on ICCs convinced you that you could achieve 25% reuse of integration components, 30% savings in integration development time and costs, and 20% savings in maintenance costs.)
By the way, have you tried the ICC calculator using this data to show how much you could save?
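The Gartner benchmarks above lend themselves to a quick back-of-the-envelope model. The sketch below is illustrative only; the benchmark percentages come from the research cited above, while the baseline budget figures are hypothetical inputs:

```python
# Rough ICC savings model using the Gartner benchmarks cited above
# (30% development savings, 20% maintenance savings); the budget
# figures passed in are hypothetical examples, not benchmarks.
def icc_savings(dev_budget, maintenance_budget,
                dev_savings=0.30, maintenance_savings=0.20):
    """Estimate annual savings from an Integration Competency Center."""
    return dev_budget * dev_savings + maintenance_budget * maintenance_savings

# Example: a $5M integration development budget and a $2M maintenance budget.
print(f"${icc_savings(5_000_000, 2_000_000):,.0f}")   # $1,900,000
```

Even at modest budgets, the reuse and maintenance effects add up to real money, which is exactly what the ICC calculator is meant to demonstrate with your own numbers.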
OK, so you did all that, and you now have a more efficient IT shop, with enough savings to roll out that single critical application you’ve been asked to complete for the last six months. Now what?
People often ask me how Informatica adds value to our customers – it’s a pretty simple question that I’ve answered a thousand times. Rather than become bogged down in industry acronyms or discussing the technical aspects of our solutions, I thought I’d look at it in a broader light of operational efficiency. I took great delight in a recent posting about how two customers of ours (KPN and RBS) had won industry awards for their Informatica-based projects.
We talk about data integration and data quality, data warehousing, data migration and Integration Competency Centers. What does this all mean at the end of the day to our customers?
If you have been following blog circles lately, there is a big buzz about SOA being dead. It all started with a recent blog post by Anne Thomas Manes, in which she says that “although the word ‘SOA’ is dead, the requirement for service-oriented architecture is stronger than ever.”
SOA at its very core is simply an architectural approach and not a technology stack nor a vendor-recommended product or platform. As Anne says, “they missed the important stuff: architecture and services.”
As I have always maintained, an SOA implementation can be as simple as a few business services that wrap business or application logic; in its most complex form, it can be an entire ecosystem of technologies selected after thorough analysis of needs, all of which, most importantly, support service-orientation principles.
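As a toy illustration of the simple end of that spectrum, the sketch below (hypothetical names, Python used purely for brevity) shows a business service defined as a contract, with one implementation wrapping existing application logic. Callers depend only on the interface, so the in-house implementation could later be swapped for a remote binding without changing them:

```python
from abc import ABC, abstractmethod

# The service contract: callers depend on this interface, not on any
# particular technology or implementation behind it.
class CreditCheckService(ABC):
    @abstractmethod
    def approve(self, customer_id: str, amount: float) -> bool: ...

# One implementation that wraps existing in-house application logic.
class InHouseCreditCheck(CreditCheckService):
    def __init__(self, credit_limits):
        self._limits = credit_limits   # e.g. loaded from a legacy system

    def approve(self, customer_id, amount):
        return amount <= self._limits.get(customer_id, 0.0)

service: CreditCheckService = InHouseCreditCheck({"C-42": 5000.0})
print(service.approve("C-42", 1200.0))   # True: within the credit limit
```

The service-orientation is in the boundary and the contract, not in any product stack, which is precisely the "architecture and services" point above.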