Tag Archives: Operational Efficiency
In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:
- Vision and Business Case to deliver business value
- Tools and Architecture to support the architectural scope of data governance
- Policies that make up the data governance function (security, archiving, etc.)
- Measurement: gauging both a data governance initiative's level of influence and its effectiveness, using business-value and ROI metrics such as increased revenue, improved operational efficiency, reduced risk, lower costs or improved customer satisfaction
- Change Management: incentives for the workforce, partners and customers to get better-quality data in, and the potential repercussions when data quality is poor
- Organizational Alignment: how the organization will work together across silos
- Dependent Processes: identifying data lifecycles (how data is captured, reported, purchased and updated in your environment), all processes that consume the data, and the processes that store and manage it
- Program Management: effective program management skills to build out a communication strategy and a measurement strategy, and to serve as a focal point for escalating issues to senior management when necessary
- Defined Processes that make up the data governance function (discovery, definition, application, and measurement and monitoring).
For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.
Friday, August 5, 2011 set new records for trading volume around the world. According to this FT.com story, US futures and equities alone “saw over 130m trades on Friday, generating 950 gigabytes of data, according to Nanex, a market data provider.” In London, “some exchanges with older technology could not cope”. And so Big Data strikes again.
But market data volume has been exploding for months, even years. This is just one more chapter in a long story, illustrating the kinds of problems a business can run into if it neglects its technical infrastructure in the face of data volume growth. (more…)
Enterprises use Hadoop in data-science applications that improve operational efficiency, grow revenues or reduce risk. Many of these data-intensive applications use Hadoop for log analysis, data mining, machine learning or image processing.
Commercial, open source or internally developed data-science applications have to tackle large volumes of semi-structured, unstructured or raw data. They benefit from Hadoop’s combination of storage and processing in each data node, spread across a cluster of cost-effective commodity hardware. Hadoop’s lack of a fixed schema works particularly well for answering ad-hoc queries and exploratory “what if” scenarios.
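To make the log-analysis use case above concrete, here is a minimal sketch of the map/reduce pattern a Hadoop Streaming job would follow. The log format (a status code in the third field) and the function names are assumptions for illustration, not a specific customer's implementation; the shuffle-and-reduce step is simulated locally.

```python
# Minimal sketch of the map/reduce log-analysis pattern, in the style of a
# Hadoop Streaming job. Log format (status code in the third field) is an
# assumed example.
from collections import defaultdict

def map_line(line):
    """Mapper: emit (status_code, 1) for each log line."""
    fields = line.split()
    if len(fields) >= 3:
        yield fields[2], 1

def reduce_counts(pairs):
    """Reducer: sum the counts for each status code."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

# Local stand-in for the cluster-wide shuffle and reduce:
log = [
    "10.0.0.1 GET 200",
    "10.0.0.2 GET 404",
    "10.0.0.1 POST 200",
]
pairs = [kv for line in log for kv in map_line(line)]
print(reduce_counts(pairs))  # {'200': 2, '404': 1}
```

In a real cluster the mapper and reducer would read from stdin and write to stdout on nodes spread across the cluster, which is exactly the storage-plus-processing colocation the paragraph describes.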
With our Informatica9 World Tour in full swing, the last month has been exhilarating: meeting clients across North America, South America, Europe and Australia. I have met with banking clients in New York, major telco operators in Brazil, government institutions in Sweden and retailers in Australia.
Several people have asked me if there have been any major differences from region to region. My observation is “yes … and no”. Let me explain:
I recently had the pleasure of attending a briefing by CxO media (publishers of CIO magazine) on their annual CIO survey, which looks into the world of the CIO with the objective of understanding how the role continues to evolve in today’s business climate and helping define the CIO agenda for 2009. More than 500 CIOs from midsize and enterprise companies headquartered in North America participated.
There were a few very interesting trends that jumped out at me which I wanted to share:
So you’ve managed to reduce the KTLO (keep-the-lights-on) portion of your IT budget by automating many of the manual, labor-intensive integration processes with the latest data integration platform.
In doing this, you lowered your upfront TCO through ease-of-use, prebuilt connectivity, reusable logic and rules, and are benefiting from lower ongoing costs through scalability and performance, ease of administration and seamless upgrades.
You found the staff needed to manage the myriad integration challenges across your IT shop by tapping into the large community of available Informatica specialists, and you implemented an ICC, mirroring companies like T. Rowe Price, HP and Avaya that achieved significant savings through their ICCs. (This might have come about from your conversation with Gartner, whose research on ICCs convinced you that you could achieve 25% reuse of integration components, 30% savings in integration development time and costs, and 20% savings in maintenance costs.)
By the way, have you tried the ICC calculator using this data to show how much you could save?
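As a back-of-the-envelope version of that calculation, here is a sketch using the Gartner percentages quoted above. The dollar amounts are hypothetical inputs for illustration; this is not the actual ICC calculator's logic.

```python
# Rough ICC savings estimate using the Gartner figures cited above
# (30% development savings, 20% maintenance savings). The baseline
# dollar amounts below are hypothetical illustration inputs.
def icc_savings(dev_cost, maintenance_cost,
                dev_savings_rate=0.30, maint_savings_rate=0.20):
    """Return estimated annual savings from an Integration Competency Center."""
    dev_saved = dev_cost * dev_savings_rate
    maint_saved = maintenance_cost * maint_savings_rate
    return dev_saved + maint_saved

# Hypothetical shop: $2M integration development, $1M maintenance per year.
print(icc_savings(2_000_000, 1_000_000))  # 800000.0
```

Even with conservative baselines, the compounding of development and maintenance savings is what funds the "now what?" question below.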
OK, so you did all that, and you have a more efficient IT shop with enough savings to roll out that critical application you’ve been asked to deliver for the last six months. Now what? (more…)
People often ask me how Informatica adds value to our customers – it’s a pretty simple question that I’ve answered a thousand times. Rather than become bogged down in industry acronyms or discussing the technical aspects of our solutions, I thought I’d look at it in a broader light of operational efficiency. I took great delight in a recent posting about how two customers of ours (KPN and RBS) had won industry awards for their Informatica-based projects.
We talk about data integration and data quality, data warehousing, data migration and Integration Competency Centers. What does this all mean at the end of the day to our customers? (more…)
If you have been following the blog circles lately, there is a big buzz about SOA being dead. It all started with a recent blog post by Anne Thomas Manes in which she says “although the word ‘SOA’ is dead, the requirement for service-oriented architecture is stronger than ever.”
SOA at its very core is simply an architectural approach, not a technology stack or a vendor-recommended product or platform. As Anne says, “they missed the important stuff: architecture and services.”
As I have always maintained, an SOA implementation can be as simple as a few business services that wrap business or application logic; in its most complex form it can be an entire ecosystem of technologies, selected after a thorough analysis of needs, that, most importantly, support service-orientation principles. (more…)
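The "simple end" of that spectrum can be sketched in a few lines: a business service that puts a stable, coarse-grained interface in front of existing application logic, so callers depend on the contract rather than the implementation. All class and method names below are invented for illustration.

```python
# Toy sketch of a business service wrapping existing application logic --
# the simple end of SOA described above. All names are hypothetical.
class LegacyBillingSystem:
    """Stand-in for existing application logic callers shouldn't touch directly."""
    def raw_invoice_total(self, line_items):
        return sum(price * qty for price, qty in line_items)

class InvoiceService:
    """Business service: a stable, coarse-grained contract over the legacy code.
    Consumers bind to this interface, not to the system behind it."""
    def __init__(self, backend):
        self._backend = backend

    def get_invoice_total(self, line_items, tax_rate=0.0):
        subtotal = self._backend.raw_invoice_total(line_items)
        return round(subtotal * (1 + tax_rate), 2)

service = InvoiceService(LegacyBillingSystem())
print(service.get_invoice_total([(10.0, 2), (5.0, 1)], tax_rate=0.10))  # 27.5
```

The point is the architectural move, not the code: the service boundary is what survives when the backend is replaced, which is exactly the "architecture and services" emphasis Anne describes.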
As 2008 draws to a close, it’s time to look into the future and consider the top predictions for IT in 2009. IDC have made their predictions and Neil Raden posted his 2009 warehouse predictions on Intelligent Enterprise. As I find additional predictions I’ll keep you updated. In the meantime, here are a few of my own:
- Data will take center stage as the most strategic focus for all CIOs. As data volumes continue to explode, delivering trusted data across the enterprise will be their number one focus.
- Enterprises will evolve their computing architectures from the application-centric computing model of the last 20 years to a more data-centric computing model. As data takes center stage, companies will look at how they build new applications on their data model, instead of building an integration hairball between applications to try to control their data model. (more…)