Tag Archives: Financial Services
Data is one of the most important and valuable assets banks and insurance companies across the globe rely on to comply with industry regulations, improve customer experience, find new revenue opportunities, and reduce the cost of doing business. These are universal needs and challenges, and Informatica’s industry-leading solutions have helped over 780 financial services institutions increase their potential to achieve business success.
At Informatica World 2013, June 4-7 at the Aria Resort and Casino in Las Vegas, Nevada, we will be showcasing a wealth of valuable information on maximizing the value of your data assets and technology investments. The event includes over 100 interactive and informative breakout sessions across six dedicated tracks: Platform & Products, Architecture, Best Practices, Big Data, Hybrid IT, and Tech Talk.
There will also be a dedicated financial services path featuring guest speakers from the banking and insurance industries as well as our own Financial Services experts, including:
- Morgan Stanley Wealth Management: Accelerating Business Growth While Protecting Sensitive Data: Find out how Morgan Stanley built one of the largest Informatica platforms to mask and process more than 150,000 objects used by more than 1,000 applications globally and comply with today’s data privacy regulations.
- Wells Fargo Bank’s Data Governance Journey with Informatica: Learn about Wells Fargo’s data governance strategy and program, and how Informatica is used to deliver actionable, transparent, and trusted data to the business.
- Liberty Mutual Insurance: Architecture and Best Practices with Informatica Data Integration: Learn how Informatica Data Integration’s metadata-driven architecture scales to support large data volumes and meet Liberty Mutual’s enterprise demands for performance and compliance.
- Addressing Top Business Priorities in Banking and Insurance with MDM: Peter Ku, Senior Director of Financial Services Industry Solutions, shares how Master Data Management is being used in banking and insurance to address top business imperatives, from regulatory compliance to finding new revenue opportunities.
Register today at www.informaticaworld.com and I look forward to seeing you there!
The need to be more customer-centric in financial services is greater than ever as banks and insurance companies look for ways to reduce churn; those in the industry know that loyal customers spend more on higher-margin products and are more likely to refer additional customers. Bankers and insurers who understand this, and get it right, are in a better position to maintain profitable, lasting customer loyalty and reap significant financial rewards. Current market conditions remain challenging and will be difficult to overcome without the right information management architecture to help companies become truly customer-centric. Here’s why:
- Customer satisfaction with retail banks has decreased for four consecutive years, with particularly low scores in customer service. In one industry survey, thirty-seven percent of customers who switched their primary banking relationship cited poor customer service as the main reason.
- The commoditization of traditional banking and insurance products has rapidly increased client attrition and decreased acquisition rates. Industry reports estimate that banks are losing customers at an average rate of 12.5% per year, while average acquisition rates are at 13.5%, making acquisition nearly a zero-sum game. Further, the cost of acquiring a new customer is estimated at five times the cost of retaining an existing one.
- Switching is easier than ever before. Customer churn is at an all-time high in most European countries. According to an industry survey, 42 percent of German banking customers had been with their main bank for less than a year. With customer acquisition costs running between €200 and €400, bankers and insurers need to keep their clients for at least 5 to 7 years simply to break even (a rough break-even calculation is sketched after this list).
- Mergers and acquisitions further increase the complexity and risk of maintaining customer relationships. According to a recent study, 17 percent of respondents who had gone through a merger or acquisition had switched at least one of their accounts to another institution after their bank was acquired, while an additional 31 percent said they were at least somewhat likely to switch over the next year.
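To make the break-even arithmetic above concrete, here is a minimal sketch in Python. The per-customer annual margin is an assumed figure chosen for illustration, not a number from the survey:

```python
# Rough break-even illustration for customer acquisition (assumed figures).
acquisition_cost = 300.0  # EUR, midpoint of the cited EUR 200-400 range
annual_margin = 50.0      # EUR of profit per customer per year -- assumed

break_even_years = acquisition_cost / annual_margin
print(f"Break-even after ~{break_even_years:.1f} years")
# ~6 years, consistent with the 5-to-7-year range cited above
```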
Financial services professionals have long recognized the need to manage customer relationships vs. account relationships by shifting away from a product-centric culture toward a customer-centric model to maintain client loyalty and grow their bottom lines organically. Here are some reasons why:
- A 5% increase in customer retention can increase profitability by 35% in banking, 50% in brokerage, and 125% in the consumer credit card market.
- Banks can add more than $1 million to the profitability of their commercial banking business line simply by extending 16 large corporate relationships by one year, or by saving two such clients from defecting. In the insurance sector, a one percent increase in customer retention results in $1 million in revenue.
- The average company has between a 60% and 70% probability of success selling more services to a current customer, a 20% to 40% probability of selling to a former customer, and a 5% to 20% probability of making a sale to a prospect.
- Up to 66% of current users of financial institutions’ social media sites use them to receive information about financial services, 32% use them to retrieve information about offers or promotions, and 30% to conduct customer service-related activities.
So what does it take to become more customer-centric?
Companies with successful customer-centric business models share similar traits: cultures that place the customer first, people who are willing to go the extra mile, business processes designed with the customer’s needs in mind, product and marketing strategies designed to meet customers’ needs, and technology solutions that help access and deliver trusted, timely, and comprehensive information and intelligence across the business. These technologies include data integration, data quality, and master data management.
Why is data integration important? Customer centricity begins with the ability to access and integrate your data regardless of format, source system, structure, volume, or latency, from any location including the cloud and social media sites. The data the business needs originates from many different systems across and outside the organization, including new Software-as-a-Service solutions and cloud-based technologies. Traditional hand-coded methods, one-off tools, and open source data integration tools cannot scale and perform well enough to effectively and efficiently access, manage, and deliver the right data to the systems and applications on the front lines. At the same time, we live in the Big Data era: increasing transaction volumes and new channels, including mobile devices and social media, together generate petabytes of data. Supporting a capable and sustainable customer-centric business model requires technology that can handle this complexity and scale with the business while reducing costs and improving productivity.
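As a minimal sketch of what accessing and integrating data from disparate sources can look like in practice, here is a toy consolidation of customer records from an on-premise CSV extract and a hypothetical cloud API. The file name, URL, and field names are illustrative assumptions, not part of any Informatica product:

```python
import csv
import json
from urllib.request import urlopen

def load_crm_customers(path):
    """Read customer records from an on-premise CSV extract."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def load_cloud_customers(url):
    """Fetch customer records from a hypothetical SaaS REST endpoint."""
    with urlopen(url) as resp:
        return json.load(resp)

def integrate(*sources):
    """Merge records from all sources, keyed on a shared customer_id field."""
    merged = {}
    for source in sources:
        for rec in source:
            merged.setdefault(rec["customer_id"], {}).update(rec)
    return list(merged.values())

# Example usage (illustrative file name and endpoint):
# customers = integrate(load_crm_customers("crm_extract.csv"),
#                       load_cloud_customers("https://api.example.com/customers"))
```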
Data quality issues must be dealt with proactively and managed by both business and technology stakeholders. Though technology itself cannot prevent all data quality errors from happening, it is a critical part of your customer information management process, ensuring that any issues are identified and dealt with expeditiously. Specifically, a data quality solution should detect data quality errors in any source, allow business users to define data quality rules, let developers consume and execute those rules seamlessly, provide dashboards and reports for business stakeholders, and monitor quality on an ongoing basis to deal with time- and business-sensitive exceptions. Data quality management can only scale and deliver value if an organization believes in and manages data as an asset. It also helps to have a data governance framework consisting of processes, policies, standards, and people from business and IT working together.
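A minimal sketch of that rule-based approach, with business-defined rules expressed as simple predicates that developers can execute and report on (the rule names and record fields are illustrative assumptions):

```python
import re

# Business-defined data quality rules: rule name -> predicate over a record.
RULES = {
    "email_valid":         lambda r: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+",
                                                   r.get("email", ""))),
    "postal_code_present": lambda r: bool(r.get("postal_code")),
    "birth_date_present":  lambda r: bool(r.get("birth_date")),
}

def run_rules(records):
    """Execute every rule against every record; return exceptions for review."""
    exceptions = []
    for i, rec in enumerate(records):
        for name, rule in RULES.items():
            if not rule(rec):
                exceptions.append({"record": i, "failed_rule": name})
    return exceptions

# Per-rule exception counts from run_rules() could feed the dashboards and
# ongoing monitoring described above.
```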
Lastly, growing your business, improving wallet share, retaining profitable relationships, and lowering the cost of managing customer relationships require a single, trusted, holistic, and authoritative source of customer information. Customer information has historically been managed in applications across traditional business silos that lacked common processes to reconcile duplicate and conflicting information across business systems. Master Data Management solutions are purpose-built to help break down those application and business silos and deliver a single view of the truth for all systems to benefit from. Master Data Management allows banks and insurance companies to identify unique customer entities, relate accounts to each customer, and extend that relationship view across other customers and employees, from relationship bankers and financial advisors to existing agents and brokers.
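Here is a toy sketch of the entity-resolution step at the heart of MDM: recognizing that account records from different systems belong to the same customer. The matching logic is deliberately naive and the field names are assumptions; real MDM products use far richer probabilistic matching:

```python
from difflib import SequenceMatcher

def similar(a, b, threshold=0.85):
    """Naive fuzzy string comparison; real MDM matching is far more robust."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def match_customers(records):
    """Group account records into candidate customer entities by name and DOB."""
    entities = []  # each entity is a list of records for one customer
    for rec in records:
        for entity in entities:
            ref = entity[0]
            if rec["birth_date"] == ref["birth_date"] and similar(rec["name"], ref["name"]):
                entity.append(rec)
                break
        else:
            entities.append([rec])
    return entities

# Each resulting entity becomes the anchor for relating accounts, advisors,
# agents, and brokers into a single customer view.
```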
The need to attract and retain customers is a continuous journey for the financial industry, but that need is greater now than ever before. The foundation of successful customer centricity is technology that can access and deliver trusted, timely, consistent, and comprehensive customer information and insight across all channels, helping you avoid the mistakes of the past, stay ahead of your competition, and maximize value for your shareholders.
Sources: 2010 UK Retail Banking Satisfaction Study, J.D. Power and Associates, October 2010; “Customer Winback”; Mortgage Servicing News.
According to the IDC Financial Insights 2013 Predictions report, financial institutions across most regions are getting serious about updating their legacy systems to reduce operating costs, automate labor-intensive processes, improve customer experiences, and avoid costly disruptions. Transforming a bank’s core systems or an insurance provider’s main business systems is a strategic decision with far-reaching implications for the firm’s future business strategies and success. Done right, the capabilities offered in today’s modern banking and insurance platforms can propel a company ahead of its competition; done wrong, they can be the nail in the coffin if data is not migrated correctly, safeguards are not in place to protect against unwanted data breaches, and old systems cannot be decommissioned as planned.
One of the most critical phases of any legacy modernization project is the process of migrating data from the old system to the new one. Migrating data involves the following steps (sketched in code after this list):
- Accessing existing data in the legacy systems
- Understanding the data structures that need to be migrated
- Transforming data and executing one-to-one mappings to the relevant fields in the new system
- Identifying data quality errors and other gaps in the data
- Validating what is entered into the new system by identifying transformation or mapping errors
- Seamlessly connecting to the target tables and fields in the new system
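A minimal sketch of the mapping-and-validation steps above. The legacy and target field names are hypothetical placeholders:

```python
# One-to-one field mapping from the legacy schema to the new system's schema.
FIELD_MAP = {
    "CUST_NM":   "customer_name",
    "ADDR_LN_1": "address_line1",
    "POL_CD":    "policy_code",
}

def transform(legacy_record):
    """Apply the one-to-one mapping, flagging legacy fields with no target."""
    migrated, unmapped = {}, []
    for old_field, value in legacy_record.items():
        if old_field in FIELD_MAP:
            migrated[FIELD_MAP[old_field]] = value
        else:
            unmapped.append(old_field)  # a gap to resolve before go-live
    return migrated, unmapped

def validate(migrated, required=("customer_name", "policy_code")):
    """Identify missing required values before loading into the new system."""
    return [f for f in required if not migrated.get(f)]
```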
Sounds easy enough right? Not so fast! (more…)
Like most Americans last week, I was glued to the news for several days before Hurricane Sandy made landfall on the East Coast of the United States, hoping it would pass with minimal damage. Having lived in Hawaii and Florida for most of my life, I have personally experienced three hurricanes and know how devastating these natural disasters can be during the storm, and the hardships people go through afterwards. My thoughts are with all those who lost their lives and their belongings in this disaster.
Hurricane Sandy has been described as one of the largest storms both in size and in property damage to homes and businesses. According to the New York Times, the total economic damage from Hurricane Sandy will range between $10 billion and $20 billion, with insurance companies paying $5 billion to $10 billion in insurance claims. At the high end of that range, Sandy would become the third-most expensive storm for insurers in U.S. history. As property, casualty, and flood insurance companies prepare to face a significant wave of calls and claims requests from policyholders, I wonder what the implications and costs will be for those companies that lack reliable, trusted, and accurate data, a problem that has plagued the industry for years.
Reliable, trusted, and accurate data is critical in helping insurance companies manage their business, from satisfying regulatory requirements, maintaining and growing customer relationships, and combating fraud to reducing the cost of doing business. Unfortunately, many insurance companies, large and small, have long operated on paper-based processes to onboard new customers, manage policy changes, and process claim requests. Though some firms have invested in data quality and governance practices in recent years, much of today’s insurance industry has ignored the importance of managing and governing good quality data and dealing with the root causes of bad data, including:
- Inadequate verification of data stored in legacy systems
- Non-validated data leaks and data entry errors made by human beings
- Inadequate or manual integration of data between systems
- Redundant data sources/stores that cause data corruption to dependent applications
- Direct back-end updates with little to no data verification and impact analysis
Because of this, the data in core insurance systems can contain serious quality errors in areas including:
- Invalid property addresses
- Policyholder contact details (Name, Address, Phone numbers)
- Policy codes and descriptions (e.g. motor or home property)
- Risk rating codes
- Flood zone information
- Property assessment values and codes
- Loss ratios
- Claims adjuster estimates and contact information
- Lack of a comprehensive view of existing policyholder information across different policy coverage categories and lines of business
The cost of bad data can be measured in the following areas as firms gear up to deal with the fallout of Hurricane Sandy (a back-of-the-envelope cost model follows the list):
- Number of claims errors multiplied by the time and cost to resolve these errors
- Number of phone calls and emails concerning claims processing delays multiplied by the time per phone call and the cost per Customer Service Rep or field agents handling those requests
- Number of fraudulent claims and the loss of funds from those criminal activities
- Number of policy cancellations caused by poor customer service experienced by existing policy holders
- Not to mention the reputational damage caused by poor customer service
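The list above is effectively a cost model. Here is a back-of-the-envelope version; all volumes and rates are placeholder assumptions, not industry figures:

```python
# Back-of-the-envelope cost of bad data during a claims surge (placeholders).
claim_errors     = 5_000        # claims containing data errors
hours_per_error  = 0.5          # rework time to resolve each error
loaded_rate      = 60.0         # USD per hour for a claims handler
extra_contacts   = 20_000       # calls/emails caused by processing delays
minutes_per_call = 8
csr_rate_per_min = 0.75         # USD per CSR minute
fraud_losses     = 2_000_000.0  # funds lost to undetected fraudulent claims

rework_cost  = claim_errors * hours_per_error * loaded_rate      # 150,000
contact_cost = extra_contacts * minutes_per_call * csr_rate_per_min  # 120,000
total = rework_cost + contact_cost + fraud_losses
print(f"Estimated cost of bad data: ${total:,.0f}")
# Cancellations and reputational damage would come on top of this figure.
```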
Having a sound data quality practice requires a well-defined data governance framework consisting of the following elements:
- Data quality policies that spell out what data are required, how they should be used, managed, updated and retired. More importantly, these policies should be aligned to the company’s goals, defined and maintained by the business, not IT.
- Data quality processes that involve documented steps to implement and enforce the policies described above.
- Specific roles, including data stewards who represent business organizations or core systems (e.g., an Underwriting Data Steward), or data category stewards who understand the business definition, requirements, and usage of key data assets by the business.
Finally, in addition to the points listed above, firms must not discount or ignore the importance of industry-leading data quality software solutions in enabling an effective and sustainable data quality practice, including (the first capability is sketched in code after this list):
- Data profiling and auditing to identify existing data errors in source systems, during data entry processes and as data is extracted and shared between systems.
- Data quality and cleansing to build and execute data quality rules to enforce the policies set forth by the business.
- Address validation solutions to ensure accurate address information for flood zone mapping and loss analysis.
- Data Quality dashboards and monitoring solutions to analyze the performance and quality levels of data and escalate data errors that require immediate attention.
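A minimal sketch of the profiling capability listed first above: scanning a source extract for missing and malformed values before they propagate downstream. The field names and the ZIP-code format check are illustrative assumptions:

```python
import re

US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")

def profile(records, fields=("name", "address", "zip_code")):
    """Count nulls and malformed values per field in a source extract."""
    stats = {f: {"null": 0, "invalid": 0} for f in fields}
    for rec in records:
        for f in fields:
            value = rec.get(f)
            if not value:
                stats[f]["null"] += 1
            elif f == "zip_code" and not US_ZIP.match(value):
                stats[f]["invalid"] += 1
    return stats

# The resulting counts per field can feed the data quality dashboards and
# monitoring described above.
```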
As cleanup activities progress and people get back on their feet from Hurricane Sandy, insurance companies should take the time to measure how well they are managing their data quality challenges and start looking at addressing them in preparation for these inevitable events caused by Mother Nature.
Financial Stability Board Pushes Legal Entity Identifier to the G20 – Vote Expected this Month – What’s Next?
Hot off the press! The Financial Stability Board (FSB) today (June 8, 2012) published a report entitled “A Global Legal Entity Identifier for Financial Markets” for G20 supervisors’ consideration, responding to the mandate issued by the G20 at the Cannes Summit ahead of a final vote at the end of the month in Mexico. The report sets out 35 recommendations for the development and implementation of the global LEI system, guided by a set of “High Level Principles” that outline the objectives a global LEI system should meet.
The proposed global Legal Entity Identifier (LEI) is expected to help regulators identify unique counterparties across the financial system and monitor the impact of risky counterparties holding positions with the banks. Assuming LEI is approved by the G20 this month, it will be the first of these infrastructure standards to be implemented globally requiring firms to integrate, reconcile and cross-reference the new LEI with existing counterparty identifiers and information, as well as manage accurate and current legal hierarchies. (more…)
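To make the cross-referencing task concrete, here is a toy sketch of an internal-identifier-to-LEI cross-reference. The system names, identifiers, and LEI value are fabricated placeholders, not real data:

```python
# Cross-reference of internal counterparty IDs to the global LEI (placeholders).
LEI_XREF = {
    # internal system -> {internal_id: LEI (20-character ISO 17442 code)}
    "trading": {"CPTY-00042": "5493001KJTIIGC8Y1R12"},
    "risk":    {"RK-9981":    "5493001KJTIIGC8Y1R12"},
}

def resolve_lei(system, internal_id):
    """Look up the LEI for a counterparty known by an internal identifier."""
    return LEI_XREF.get(system, {}).get(internal_id)

# Once every system resolves to a common LEI, exposures to the same legal
# entity can be aggregated and legal hierarchies maintained in one place.
```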
The need for more robust data retention management and enforcement is about more than just good data management practice. It is a legal requirement for financial services organizations across the globe to comply with the myriad local, federal, and international laws that mandate the retention of certain types of data, for example (a minimal policy-enforcement sketch follows the list):
- Dodd-Frank Act: Under Dodd-Frank, firms are required to maintain records for no less than five years.
- Basel Accord: The Basel guidelines call for the retention of risk and transaction data over a period of three to seven years. Noncompliance can result in significant fines and penalties.
- MiFID II: Transactional data must be stored in a way that meets the new records retention requirements for such data (which must now be retained for up to five years) and can be easily retrieved, in context, to prove best execution.
- Bank Secrecy Act: All BSA records must be retained for a period of five years and must be filed or stored in such a way as to be accessible within a reasonable period of time.
- Payment Card Industry Data Security Standard (PCI): PCI requires card issuers and acquirers to retain an audit trail history for a period that is consistent with its effective use, as well as legal regulations. An audit history usually covers a period of at least one year, with a minimum of three months available on-line.
- Sarbanes-Oxley: Section 103 requires firms to prepare and maintain, for a period of not less than seven years, audit work papers and other information related to any audit report, in sufficient detail to support the conclusions reached and reported to external regulators.
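Here is a minimal sketch of how such requirements might be encoded as machine-enforceable retention policies. The periods restate the list above; where a regulation gives a range (e.g., Basel’s three to seven years), the conservative upper bound is used, which is my assumption rather than regulatory guidance:

```python
from datetime import date, timedelta

# Retention periods in years, per the regulations listed above.
RETENTION_YEARS = {
    "dodd_frank": 5,
    "basel":      7,   # upper bound of the 3-7 year range
    "mifid_ii":   5,
    "bsa":        5,
    "pci_audit":  1,
    "sox_s103":   7,
}

def eligible_for_purge(record_date, regulations, today=None):
    """A record may be purged only after every applicable period has lapsed."""
    today = today or date.today()
    longest = max(RETENTION_YEARS[r] for r in regulations)
    return today >= record_date + timedelta(days=365 * longest)

# Example: a transaction subject to both Dodd-Frank and SOX Section 103
# must be kept for the longer of the two periods (seven years).
# eligible_for_purge(date(2006, 1, 15), ["dodd_frank", "sox_s103"])
```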
Each of these laws has distinct data collection, analysis, and retention requirements that must be factored into existing information management practices. Unfortunately, existing data archiving methods, including traditional database and tape backups, lack the capabilities required to effectively enforce and automate data retention policies in compliance with industry regulations. In addition, a number of internal and external challenges make it even more difficult for financial institutions to archive and retain required data, due to the following trends: (more…)
In the second of two videos, Peter Ku, Director of Financial Services Solutions Marketing, Informatica, talks about the latest trends regarding counterparty data and how the legal entity identifier (LEI) system will impact banks across the globe.
Specifically, he answers the following questions:
- What are the latest trends regarding counterparty information and how will the legal entity identifier system impact banks across the globe?
- How does Informatica help solve the challenges regarding counterparty information and help banks prepare for the new legal entity identifier system?
Also watch Peter’s first video (http://youtu.be/KvyDPzOTnUY) to learn about counterparty data and its challenges.
In a recent post: Informatica Ultra Messaging Software Supports Capital Markets Reforms, I discussed the technology implications of the OTC derivatives (swaps) market moving to electronic trading as mandated by the Dodd-Frank Act (DFA) in the US and the European Market Infrastructure Regulation (EMIR) in Europe. One area where new technology infrastructure will be especially critical is in the creation and operation of “exchanges” for electronic swaps trading, similar to what is used for equities and other asset classes. In the language of the DFA, such exchange venues are called Swap Execution Facilities (SEFs) and are defined as “a facility, trading system or platform in which multiple participants have the ability to execute or trade swaps by accepting bids and offers made by other participants that are open to multiple participants in the facility or system, through any means of interstate commerce.” This of course includes capturing orders electronically, matching bids and offers, executing the trades, and providing connections to central clearing houses. And perhaps nowhere else in the new ecosystem is the expected growth in message volumes and associated need for new messaging middleware technology more evident than here. (more…)
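To make the DFA’s definition concrete, here is a toy sketch of the core of such a venue: accepting bids and offers from multiple participants and matching them on price. A real SEF’s matching, quantity handling, clearing connectivity, and messaging layers are vastly more involved, and this is not a description of any Informatica product:

```python
import heapq

class ToyMatchingEngine:
    """Toy price-priority matching of swap bids and offers (illustrative only)."""

    def __init__(self):
        self.bids = []    # max-heap via negated price: (-price, participant)
        self.offers = []  # min-heap: (price, participant)

    def submit(self, side, price, participant):
        """Accept a bid or offer from a participant, then attempt to match."""
        if side == "bid":
            heapq.heappush(self.bids, (-price, participant))
        else:
            heapq.heappush(self.offers, (price, participant))
        return self._match()

    def _match(self):
        """Cross the book while the best bid meets or exceeds the best offer."""
        trades = []
        while self.bids and self.offers and -self.bids[0][0] >= self.offers[0][0]:
            _, buyer = heapq.heappop(self.bids)
            ask_px, seller = heapq.heappop(self.offers)
            trades.append((buyer, seller, ask_px))  # execute at the resting offer
        return trades
```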
Europe might have started a little later than the U.S. with master data management, but if the inaugural Gartner MDM Summit for EMEA is any indication, it’s catching up quickly. Well over 350 registrants attended the event in London in early February, with strong representation from the UK, France, the Netherlands and other EMEA countries. Gartner Research VP Andrew White called the event a “major hit,” and I have to agree. (more…)