Tag Archives: Data Management
Financial services is one of the most data-centric industries in the world. Clean, connected, and secure data is critical to satisfying regulatory requirements, improving customer experience, growing revenue, avoiding fines, and ultimately changing the world of banking and insurance. Data management has made real strides, and several of the leading companies behind those improvements are empowered by Informatica.
Who are these companies and what are they doing with Informatica?
Fifteen of the top financial services companies will share their success stories of leveraging Informatica for their most critical business needs. These include:
- Capital One
- Bank of New Zealand
- Fannie Mae
- Fidelity Investments
- Morgan Stanley
- Thomson Reuters
- YAPI KREDI BANKASI A.S.
- Navy Federal Credit Union
- Wells Fargo Bank
- Westpac Banking Corporation
- Great American Insurance Group, Property & Casualty Group
- Liberty Mutual
Informatica World 2014 will have over 100 breakout sessions covering a wide range of topics for Line of Business Executives, IT decision makers, Architects, Developers, and Data Administrators. Our great keynote lineup includes Informatica executives Sohaib Abbasi (Chief Executive Officer), Ivan Chong (Chief Strategy Officer), Marge Breya (Chief Marketing Officer) and Anil Chakravarthy (Chief Product Officer). These speakers will share Informatica’s vision for this new data-centric world and explain innovations that will propel the concept of a data platform to an entirely new level.
Register today so you don’t miss out.
We look forward to seeing you in May!
“If I had my way, I’d fire the statisticians – all of them – they don’t add value”.
Surely not? Why would you fire the very people employed to make sense of the vast volumes of manufacturing data and guide future production? But he was right: at the time, data management was so poor that the data was simply not available for the statisticians to analyze.
So, perhaps this title should be re-written to be:
Fire your Data Scientists – They Aren’t Able to Add Value.
Although this statement is a bit extreme, the same situation may still exist. Data scientists frequently share frustrations such as:
- “I’m told our data is 60% accurate, which means I can’t trust any of it.”
- “We achieved our goal of an answer within a week by working 24 hours a day.”
- “Each quarter we manually prepare 300 slides to anticipate all questions the CFO may ask.”
- “Fred manually audits 10% of the invoices. When he is on holiday, we just don’t do the audit.”
This is why I think the original quote is so insightful. Value from data is not automatically delivered by hiring a statistician, analyst or data scientist. Even with the latest data mining technology, one person cannot positively influence a business without the proper data to support them.
Most organizations are unfamiliar with the structure required to deliver value from their data. New storage technologies will be introduced, and a variety of analytics tools will be tried and tested. This change is crucial to success. For statisticians to add value to a company, they must have access to high-quality data that is easily sourced and integrated, and that data must be available through the latest analytics technology. This new ecosystem should provide insights that can shape future production. Staff will need to be trained as this new data is incorporated into daily decision making.
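The point about analysts needing trustworthy data can be made concrete with a small sketch: before a statistician touches a data set, profile it so that "60% accurate" becomes a per-field measurement rather than a rumor. The record layout, field names, and 90% threshold below are illustrative assumptions, not anything from the post.

```python
# A minimal data-quality profiling sketch, assuming a list of customer
# records with "name" and "country" fields (hypothetical example data).

def profile(records, required_fields):
    """Return per-field completeness: the fraction of non-empty values."""
    totals = {f: 0 for f in required_fields}
    for rec in records:
        for f in required_fields:
            if rec.get(f) not in (None, ""):
                totals[f] += 1
    n = len(records)
    return {f: totals[f] / n for f in required_fields}

records = [
    {"id": 1, "name": "Acme Corp", "country": "UK"},
    {"id": 2, "name": "", "country": "US"},
    {"id": 3, "name": "Globex", "country": None},
    {"id": 4, "name": "Initech", "country": "DE"},
]

scores = profile(records, ["name", "country"])
for field, score in sorted(scores.items()):
    # 90% is an assumed quality bar, purely for illustration.
    flag = "OK" if score >= 0.9 else "NEEDS REMEDIATION"
    print(f"{field}: {score:.0%} complete -> {flag}")
```

Scores like these give the analytics team a concrete starting point: fields below the bar go back to the data management process for remediation instead of silently undermining every downstream model.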
With a rich 20-year history, Informatica understands data ecosystems. Employees become wasted investments when they do not have access to the trusted data they need in order to deliver their true value.
Who wants to spend their time recreating data sets to find a nugget of value only to discover it can’t be implemented?
Build an analytical ecosystem with a balanced focus on all aspects of data management, and value delivery will be limited only by the imagination of your employees. Rather than questioning the value of an analytics team, you will attract some of the best and the brightest. Then you will finally be able to deliver on the promised value of your data.
One thing we all have in common in this modern world is that we have all, at some point in our lives, been on the receiving end of poor customer service.
Don’t get me wrong: a career in customer service is not an easy one, and I’m sure there are many service providers out there who have wrongly borne the brunt of an angry customer for reasons outside the business’s hands, but that’s another topic in itself. It is hard, however, to ignore that one thing companies often fail at is providing a timely, easy-to-access, and appropriate level of service for their customers.
Some interesting news hit UK headlines last year: companies could be made to give the public greater access to their personal transaction data in an electronic, portable, and machine-readable format. That is, if the midata project has its way.
Launched in April 2011, midata is part of the UK Government’s consumer empowerment strategy, Better Choices: Better Deals. Essentially, it is a partnership between government, consumer groups, and major businesses. Its aim is to give consumers access to the data they produce, from household utilities and banking to internet transactions and high street loyalty cards.
Thomas Davenport, visiting professor at Harvard University and author of the watershed book Competing on Analytics, is once again making waves across the datasphere with his proclamation of data scientist as the “sexiest job of the 21st century.”
To many readers here at the Perspectives site, of course, this is not news: data professionals have increasingly been recognizing, and being recognized for, the power of information in driving new insights and business opportunities.
There are those who look at the emerging world of cloud computing as a path to efficiency. It gives them the ability to leverage resources using much more cost-effective models, where those resources are provisioned and shared among many consumers.
However, in the quest for efficiency, we often overlook the functionality of “the cloud”: the usefulness of placing core features in a centralized location, which provides better control and governance as well as efficiency. This leads to the placement of core enterprise data management services in the cloud, such as Master Data Management (MDM).
In my previous post I discussed effective stakeholder management and communications as a key enabler of successful data quality delivery. In this blog, I will discuss the importance of demonstrated project management fundamentals.
Large-scale, complex enterprise Data Quality and Data Management efforts are characterized by numerous activities and tasks performed iteratively by multiple resources, across multiple work streams, with high-volume units of work (i.e. dozens of source systems and data objects, hundreds of tables, thousands of data elements, hundreds of thousands of data defects, and millions of records). Without the means to effectively define, plan, and manage these efforts, success is nearly impossible.
We have been looking at how data management issues can be classified. In my last post I provided five categories, broken down into two groups: Systemic and System. Systemic issues are ones in which process or management gaps allow data flaws to be introduced. A good example occurs when consumers of reports from the data warehouse insist that the data sets are incomplete, and the root cause is that the processes in which the data is initially collected or created do not comply with the downstream requirement for capturing the missing values.