It was a week of excitement and innovation here in San Francisco: the San Francisco Giants won the National League Pennant for the third time in five years on the same day that Salesforce wrapped up Dreamforce 2014, its largest customer conference yet, with more than 140,000 attendees from all over the world talking about the new Customer Success Platform.
Salesforce has come a long way from its humble beginnings as the new kid on the cloud front for CRM. The integrated sales, marketing, support, collaboration, application, and analytics capabilities of the Salesforce Customer Success Platform exemplify innovation and significant business value for many industries, and I see it as especially promising for today's financial services industry. However, like any new business application, the value a business gains from it depends on having the right data available.
The reality is that SaaS adoption by financial institutions has not been as quick as in other industries, due to privacy concerns, regulations governing what data can reside in public infrastructures, limited ability to customize solutions to fit business needs, cultural barriers within larger institutions holding that critical business applications must reside on-premise for control and management purposes, and the challenges of integrating data between existing systems and SaaS applications. However, experts are optimistic that the industry may have turned the corner. Gartner (NYSE:IT) asserts that more than 60 percent of banks worldwide will process the majority of their transactions in the cloud by 2016. Let's take a closer look at some of the challenges, and at what's required to overcome these obstacles when adopting cloud solutions to power your business.
Challenge #1: Integrating and sharing data between SaaS and on-premise must not be taken lightly
Most banks and insurance companies considering new SaaS-based CRM, marketing, and support applications from Salesforce and others must weigh the importance of migrating and sharing data between cloud and on-premise applications in their investment decisions. Migrating existing customer, account, and transaction history data is often done by IT staff through custom extracts, scripts, and manual data validations, which can carry invalid information over from legacy systems and render these new application investments useless in many cases.
For example, customer type descriptions from one or many existing systems may each be correct in their respective databases, and collapsing them into a common field in the target application seems easy to do. Unfortunately, these transformation rules can be complex, and that complexity increases when dealing with tens if not hundreds of applications during the migration and synchronization phase. Having capable solutions to support the testing, development, quality management, validation, and delivery of existing data from old systems to new is not only good practice, but a proven way of avoiding costly workarounds and business pain in the future.
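To make the transformation-rule complexity concrete, here is a minimal sketch, in Python, of collapsing customer type codes from two hypothetical source systems into one target field. The system names and code values are invented for illustration, not taken from any real migration:

```python
# Hypothetical example: two legacy systems encode "customer type"
# differently, and both must collapse into one common field in the new
# SaaS application. All codes below are invented for illustration.
CRM_CODES = {"IND": "Individual", "CORP": "Corporate", "TR": "Trust"}
CORE_BANKING_CODES = {"01": "Individual", "02": "Corporate", "09": "Unknown"}

def map_customer_type(source_system: str, raw_code: str) -> str:
    """Translate a source-system code into the target application's value.

    Unmapped codes are flagged rather than silently carried over, so
    invalid legacy data is caught during migration instead of after go-live.
    """
    lookups = {"crm": CRM_CODES, "core_banking": CORE_BANKING_CODES}
    try:
        return lookups[source_system][raw_code.strip().upper()]
    except KeyError:
        raise ValueError(
            f"Unmapped customer type {raw_code!r} from {source_system}"
        )
```

Even this toy version shows where the complexity comes from: every additional source system adds a lookup table, and every unmapped value is a decision someone has to make before go-live.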
Challenge #2: Managing and sharing a trusted source of shared business information across the enterprise
As new SaaS applications are adopted, it is critical to understand how to best govern and synchronize common business information, such as customer contact details (e.g. address, phone, email), across the enterprise. Most banks and insurance companies have multiple systems that create and update critical customer contact information, many of which reside on-premise. For example, when an insurance customer updates a phone number or email address while filing a claim, the claims specialist will often enter the change only in the claims system, given the siloed nature of many traditional banking and insurance companies. This is where Master Data Management comes in: it is purpose-built to identify changes to master data, including customer records, in one or many systems, update the customer master record, and share that update with every other system that requires it, which is essential for business continuity and success.
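The master data flow described above can be sketched in a few lines. This is an illustrative toy, not how any particular MDM product works, and all system and field names are hypothetical:

```python
# Toy sketch of the MDM pattern: a change captured in one siloed system
# (claims) updates the golden customer record and is propagated to every
# other system that holds a copy. Names and values are invented.
master = {"cust-42": {"phone": "555-0100", "email": "ana@example.com"}}
subscribers = {"crm": {}, "billing": {}, "claims": {}}

def on_contact_change(customer_id: str, field: str, value: str) -> None:
    """Apply a detected change to the master record, then fan it out."""
    master[customer_id][field] = value
    for system in subscribers.values():
        system.setdefault(customer_id, {})[field] = value

# A claims specialist updates a phone number; CRM and billing stay in sync.
on_contact_change("cust-42", "phone", "555-0199")
```

The real work in MDM is, of course, the matching and survivorship logic that decides which record is the golden one; the sketch only shows the propagation step.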
In conclusion, SaaS adoption will continue to grow in financial services and across other industries. The silver lining in the cloud is your data, and the technology that supports its consumption and distribution across the enterprise. Banks and insurance companies investing in new SaaS solutions will operate in a hybrid environment made up of cloud applications and core transaction systems that reside on-premise. To ensure these investments yield value, it is important to invest in a capable and scalable data integration platform to integrate, govern, and share data across this hybrid ecosystem. To learn more about how to deal with these challenges, click here to download a complimentary copy of the new "Salesforce Integration for Dummies."
I recently participated in an EDM Council panel on BCBS 239 earlier this month in London and New York. The panel consisted of Chief Risk Officers, Chief Data Officers, and information management experts from the financial industry. BCBS 239 sets out 14 key principles requiring banks to aggregate their risk data, so that banking regulators can help avoid another 2008 crisis, with a deadline of January 1, 2016. Earlier this year, the Basel Committee on Banking Supervision released findings from a self-assessment by the Global Systemically Important Banks (G-SIBs) of their readiness against 11 of the 14 BCBS 239 principles.
Given all of the investments the banking industry has made in data management and governance practices to improve ongoing risk measurement and management, I was expecting to hear signs of significant progress. Unfortunately, judging from what I heard, there is still much work to be done to satisfy BCBS 239. Here is what we discussed in London and New York.
- It was clear that the "data agenda" has shifted considerably from IT to the business, as evidenced by the number of risk, compliance, and data governance executives in the room. Though it is a good sign that the business is taking more ownership of data requirements, there was limited discussion of the importance of capable data management technology, infrastructure, and architecture to support a successful data governance practice: specifically, data integration, data quality and validation, master and reference data management, metadata to support data lineage and transparency, and business glossary and data ontology solutions to govern the terms and definitions of required data across the enterprise.
- Accessing, aggregating, and streamlining the delivery of risk data from disparate systems across the enterprise remains difficult. Today's point-to-point integrations access the same data from the same systems over and over again, creating points of failure and increasing the cost of maintaining the current state. The idea of replacing those point-to-point integrations with a centralized, scalable, and flexible data hub was clearly recognized as a need; however, it is difficult to envision given the enormous work required to modernize the current state.
- Data accuracy and integrity continue to be a concern when generating accurate and reliable risk data to meet normal and stress/crisis reporting requirements. Many in the room acknowledged a heavy reliance on manual methods implemented over the years. Automating the integration and onboarding of risk data from disparate systems across the enterprise is important under Principle 3; however, much of what is in place today was built as one-off projects against the same systems, accessing the same data and delivering it to hundreds if not thousands of downstream applications in an inconsistent and costly way.
- Data transparency and auditability was a popular conversation point. The need for comprehensive data lineage reports to explain how data is captured, from where, how it is transformed, and how it is used remains a concern, because advancements in technical metadata solutions have not been integrated with existing risk management data infrastructure.
- Lastly, big concerns remain about the ability to capture and aggregate all material risk data across the banking group and deliver it by business line, legal entity, asset type, industry, region, and other groupings, to support identifying and reporting risk exposures, concentrations, and emerging risks. Unfortunately, this master and reference data challenge cannot be solved by external data utility providers alone: banks have legal entity, client, counterparty, and securities instrument data residing in existing systems, and any external identifier must be cross-referenced against those systems for consistent reporting and risk measurement.
To sum it up, most banks admit they have a lot of work to do. Specifically, they must address gaps across their data governance and technology infrastructure. BCBS 239 is the latest and biggest data challenge facing the banking industry, and not just for the G-SIBs: mid-size firms will also be required to provide similar transparency to regional regulators who are adopting BCBS 239 as a framework for their local markets. BCBS 239 is not just a deadline; the principles it sets forth are a key requirement for banks to ensure they have the right data to manage risk and to give industry regulators the transparency to monitor systemic risk across the global markets. How ready are you?
I recently read a Time Magazine article titled "World War Zero: How Hackers Fight to Steal Your Secrets." The article discussed a new generation of software companies made up of former hackers. These firms help other software companies by identifying potential security holes before they can be used in malicious exploits.
How are the hackers accomplishing this? A new generation of hackers has learned to reverse engineer popular software programs (e.g. Windows, Outlook, Java) in order to find so-called "holes." Once those holes are found, the hackers develop "bugs" that infiltrate computer systems, search for sensitive data, and return it to the bad guys. These bugs are then sold on the black market to the highest bidder. When successful, these hackers can wreak havoc across the globe.
This constant battle between good (data and software security firms) and bad (smart young programmers looking to make a quick, big buck) is happening every day. Unfortunately, average consumers (you and I) are the innocent victims of this crazy and costly war. As a consumer in today's digital and data-centric age, I worry when I see headlines of ongoing data breaches, from the Targets of the world to my local bank down the street. I wonder not "if" but "when" I will become the next victim. According to the Ponemon Institute, the average cost of a data breach to a company was $3.5 million, 15 percent more than it cost the year before.
As a 20-year software industry veteran, I've worked with many firms across the global financial services industry. As a result, my concerns about data security exceed those of the average consumer. Here are the reasons:
- Everything is Digital: I remember when ATMs were introduced, eliminating the need to wait in long teller lines. Nowadays, most of what we do with our financial institutions is digital and online, whether on mobile devices or desktop browsers. As such, every interaction and transaction creates sensitive data that gets dispersed across tens, hundreds, sometimes thousands of databases and systems in these firms.
- The Big Data Phenomenon: I’m not talking about sexy next generation analytic applications that promise to provide the best answer to run your business. What I am talking about is the volume of data that is being generated and collected from the countless number of computer systems (on-premise and in the cloud) that run today’s global financial services industry.
- Increased Use of Off-Shore and On-Shore Development: Financial services firms have long leveraged off-shore and on-shore development partners to offset their operational and technology costs, and each new technology initiative puts sensitive data in more hands and more places.
Now here is the hard part. Given these trends and heightened threats, do the companies I do business with know where the data they need to protect resides? How do they actually protect sensitive data when using it to support new IT projects, whether in-house or with off-shore development partners? You'd be amazed at the truth.
According to the recent Ponemon Institute study "State of Data Centric Security," which surveyed 1,587 global IT and IT security practitioners in 16 countries:
- Only 16 percent of the respondents believe they know where all sensitive structured data is located and a very small percentage (7 percent) know where unstructured data resides.
- Fifty-seven percent of respondents say not knowing where the organization’s sensitive or confidential data is located keeps them up at night.
- Only 19 percent say their organizations use centralized access control management and entitlements and 14 percent use file system and access audits.
Even worse, there is a gap between how seriously respondents view the threat and how high a priority it is in their organizations: seventy-nine percent agree that not knowing where sensitive and confidential information resides is a significant security risk facing their organizations, but a much smaller percentage (51 percent) believe that securing and/or protecting data is a high priority.
I don't know about you, but this is alarming and worrisome to me. I am ready to reach out to my banker and my local retailer, let them know about my concerns, and make sure they communicate those concerns to the top of their organizations. In today's globally and socially connected world, news travels fast, and given how hard it is to build trusted customer relationships, every business from the local mall to Wall Street should be asking whether it is doing what it needs to do to identify and protect its number one digital asset: its data.
It has never been a more challenging time to be a Chief Risk Officer at a financial services firm. New regulations (CCAR, Basel III, BCBS 239, Solvency II, EMIR) have increased the complexity of the role. Today, risk management organizations must use precise data to measure risk, allocate capital and cover exposure. In addition, they must equip compliance groups to explain these decisions to industry regulators.
The challenges facing a Chief Risk Officer are even more daunting when the data at the heart of each decision is incomplete, inaccessible or inaccurate. Unless the data is complete, trustworthy, timely, authoritative and auditable, success will be hard to come by.
When data issues arise, most CROs lay blame at the feet of their risk applications. Next, they blame IT and the people and processes responsible for providing data to the risk modeling and analysis applications. However, in most situations, the issue with the data is the fault of neither the applications nor the IT groups. The reality is that most business users are unfamiliar with the root causes of data issues. More importantly, they are unfamiliar with the data and information management solutions available to resolve and prevent them. The root causes of existing data issues stem from the processes and tools IT development teams use to deliver data to risk management groups. Regrettably, ongoing budget constraints, a lack of capable tools, and fear of becoming obsolete have resulted in CIO leaders "throwing bodies at the problem." This approach consumes IT worker cycles as they manually access, transform, cleanse, validate, and deliver data into risk and compliance applications.
So what are the data issues impacting risk management organizations today, and what should your organization consider and invest in if these situations exist? Here is a list of issues I have heard in my own conversations with risk and technology executives in the global financial markets, along with solutions to help Chief Risk Officers, VPs of Risk Management, and risk analysts address them.
Challenge #1: I don't have the right data in my risk applications, models, and analytic applications. It is either incomplete, out of date, or plain incorrect.
- The data required to manage and monitor risk across the business originates from hundreds of systems and applications, and data volumes continue to grow every day with new systems and new types of data in today's digital landscape
- Due to the lack of proper tools to integrate the required data, IT developers manually extract data from internal and external systems, which can number in the hundreds and deliver data in various formats
- Raw data from source systems is transformed and validated using custom-coded methods written in COBOL, PL/SQL, Java, Perl, etc.
Solutions that can help:
Consider investing in industry-proven data integration and data quality software designed to reduce manual extraction, transformation, and validation, and to streamline the process of identifying and fixing upstream data quality errors. Data integration tools not only reduce the risk of errors; they help IT professionals streamline these complex steps and reuse transformation and data quality rules across the risk data management process, enabling repeatability, consistency, and efficiencies that require fewer resources to support the current and future data needs of risk and other parts of the business.
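As a rough illustration of the rule reuse such tools enable, here is a sketch of declarative validation rules defined once and applied to every record feeding a risk application. The rule names and record fields are hypothetical, and real tools express such rules graphically rather than in code:

```python
# Hypothetical validation rules, defined once and reused across every
# feed into the risk applications, instead of re-coding checks per feed.
RULES = [
    ("notional_present", lambda r: r.get("notional") is not None),
    ("notional_positive", lambda r: (r.get("notional") or 0) > 0),
    ("currency_iso", lambda r: isinstance(r.get("currency"), str)
                               and len(r["currency"]) == 3),
]

def validate(record: dict) -> list[str]:
    """Return the names of every rule the record fails."""
    return [name for name, check in RULES if not check(record)]
```

Because the rules live in one shared list, adding a feed does not mean re-implementing the checks, which is the repeatability the paragraph above describes.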
Challenge #2: We do not have a comprehensive view of risk to satisfy systemic risk requirements
- Too many silos or standalone data marts or data warehouses containing segmented views of risk information
- Creating a single enterprise risk data warehouse takes too long to build, is too complex and expensive, and involves too much data to process in one system
Solutions that can help:
- Data virtualization solutions can tie existing data together to deliver a consolidated view of risk for business users without having to bring that data into an existing data warehouse.
- Long term, look at consolidating and simplifying existing data warehouses into an enterprise data warehouse leveraging high performing data processing technologies like Hadoop.
Challenge #3: I don’t trust the data being delivered into my risk applications and modeling solutions
- Data quality checks and validations are performed after the fact or often not at all.
- The business believes IT is performing the required data quality checks and corrections; however, the business lacks visibility into how IT is fixing data errors, and into whether these errors are being addressed at all.
Solutions that can help:
- Data quality solutions that allow business and IT to enforce data policies and standards to ensure business applications have accurate data for modeling and reporting purposes.
- Data quality scorecards accessible by business users to showcase the performance of ongoing data quality rules used to validate and fix data quality errors before they go into downstream risk systems.
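A data quality scorecard of the kind described boils down to a pass rate per rule. This toy sketch assumes each record arrives with per-rule pass/fail results already attached; the rule names are invented:

```python
# Toy scorecard: for each data quality rule, the share of records that
# passed, so business users can see data health before the data reaches
# downstream risk systems. Rule names are hypothetical.
def scorecard(results: list[dict]) -> dict:
    """results: one dict per record, mapping rule name -> pass/fail."""
    totals: dict = {}
    for record in results:
        for rule, passed in record.items():
            hit = totals.setdefault(rule, [0, 0])
            hit[0] += int(passed)   # count of passes
            hit[1] += 1             # count of records checked
    return {rule: passed / seen for rule, (passed, seen) in totals.items()}
```

Presenting the output as a trend over time, rather than a single snapshot, is what lets the business see whether IT's fixes are actually landing.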
Challenge #4: Unable to explain how risk is measured and reported to external regulators
- Because IT manually manages data integration processes, organizations lack up-to-date, detailed, and accurate documentation of these processes from beginning to end. As a result, IT cannot produce data lineage reports, leading to audit failures, regulatory penalties, and higher capital allocations than required.
- Lack of agreed-upon documentation of business terms and definitions explaining what data is available, how it is used, and who has the domain knowledge to answer questions
Solutions that can help:
- Metadata management solutions that can capture upstream and downstream details of what data is collected, how it is processed, where it is used, and who uses it can help solve this requirement.
- A business glossary for data stewards and owners to manage definitions of your data and provide seamless access for business users from their desktops
Challenge #5: Unable to identify and measure risk exposures between counterparties and securities instruments
- No single source of the truth – Existing counterparty/legal entity master data resides in systems across traditional business silos.
- External identifiers, including the proposed global Legal Entity Identifier, will never replace the identifiers in existing systems
- Lack of insight into how legal entities are related to each other, both from a legal hierarchy standpoint and in their exposure to existing securities instruments.
Solutions that can help:
- Master Data Management for counterparty and securities master data can provide a single, connected, and authoritative source of counterparty information, including legal hierarchy relationships and rules to identify the role and relationship between counterparties and existing securities instruments. It also eliminates the business confusion of having different identifiers for the same legal entity by creating a "master" record and cross-referencing the existing records and identifiers for that entity.
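The cross-referencing idea can be illustrated with a toy lookup. All identifiers below, including the LEI value, are invented for the example:

```python
# Toy master record: one counterparty linked to the different local
# identifiers each source system uses for the same legal entity.
master_record = {
    "master_id": "CP-0001",
    "legal_name": "Acme Holdings Ltd",      # hypothetical entity
    "xref": {
        "trading_system": "ACME-TRD-77",
        "loan_system": "000912",
        "lei": "5493001KJTIIGC8Y1R12",      # invented LEI for illustration
    },
}

def resolve(source_system, local_id, masters):
    """Find the master record behind a source system's local identifier."""
    for record in masters:
        if record["xref"].get(source_system) == local_id:
            return record
    return None
```

This is why an external identifier alone cannot solve the problem: the value of the master record is precisely the cross-reference back to every legacy identifier already in use.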
In summary, Chief Risk Officers and their organizations are investing to improve existing business processes, people, and business applications to satisfy industry regulations and gain better visibility into their risk conditions. Though these are important investments, it is also critical that you invest in the technologies to ensure IT has what it needs to access and deliver comprehensive, timely, trusted, and authoritative data.
At the same time, CIOs can no longer afford to waste precious resources supporting manual works of art. As you take this opportunity to invest in your data strategies and requirements, it is important that both business and IT realize the importance of investing in a scalable and proven information and data architecture, not only to satisfy upcoming regulatory requirements but to have in place a solution that meets future needs across all lines of business. Click here to learn more about Informatica's solutions for banking and capital markets.
Financial services is one of the most data-centric industries in the world. Clean, connected, and secure data is critical to satisfy regulatory requirements, improve customer experience, grow revenue, avoid fines, and ultimately change the world of banking and insurance. Data management improvements have been made and several of the leading companies are empowered by Informatica.
Who are these companies and what are they doing with Informatica?
Fifteen of the top financial services companies will share their stories and successes leveraging Informatica for their most critical business needs. These include:
- Capital One
- Bank of New Zealand
- Fannie Mae
- Fidelity Investments
- Morgan Stanley
- Thomson Reuters
- YAPI KREDI BANKASI A.S.
- Navy Federal Credit Union
- Wells Fargo Bank
- Westpac Banking Corporation
- Great American Insurance Group, Property & Casualty Group
- Liberty Mutual
Informatica World 2014 will have over 100 breakout sessions covering a wide range of topics for line-of-business executives, IT decision makers, architects, developers, and data administrators. Our great keynote lineup includes Informatica executives Sohaib Abbasi (Chief Executive Officer), Ivan Chong (Chief Strategy Officer), Marge Breya (Chief Marketing Officer), and Anil Chakravarthy (Chief Product Officer). Our speakers will share Informatica's vision for this new data-centric world and explain innovations that will propel the concept of a data platform to an entirely new level.
Register today so you don’t miss out.
We look forward to seeing you in May!
The business of financial services is transforming before our eyes. Traditional banking and insurance products have become commoditized. As each day passes, consumers demand increasingly personalized products and services. Social and mobile channels continue to overthrow traditional communication methods. To survive and grow in this complex environment, financial institutions must do three things:
- Attract and retain the best customers
- Grow wallet share
- Deliver top-notch customer experience across all channels and touch points
The finance industry is traditionally either product-centric or account-centric. However, to succeed in the future, financial institutions must become customer-centric. Becoming customer-centric requires changes to your people, processes, technology, and culture. You must offer the right product or service to the right customer, at the right time, via the right channel. To achieve this, you must ensure alignment between business and technology leaders. It will require targeted investments to grow the business, particularly to modernize legacy systems.
To become customer-centric, business executives are investing in Big Data and in legacy modernization initiatives. These investments are helping Marketing, Sales and Support organizations to:
- Improve conversion rates on new marketing campaigns on cross-sell and up-sell activities
- Measure customer sentiment on particular marketing and sales promotions or on the financial institution as a whole
- Improve sales productivity ratios by targeting the right customers with the right product at the right time
- Identify key indicators that determine and predict profitable and unprofitable customers
- Deliver an omni-channel experience across all lines of business, devices, and locations
At Informatica, we want to help you succeed. We want you to maximize the value in these investments. For this reason, we’ve written a new eBook titled: “Potential Unlocked – Improving revenue and customer experience in financial services”. In the eBook, you will learn:
- The role customer information plays in taking customer experience to the next level
- Best practices for shifting account-centric operations to customer-centric operations
- Common barriers and pitfalls to avoid
- Key considerations and best practices for success
- Strategies and experiences from best-in-class companies
Take a giant step toward Customer-Centricity: Download the eBook now.
Last week at Informatica World 2013, Informatica introduced Vibe, the industry's first and only embeddable virtual data machine (VDM), designed to embed data management into the next generation of applications for the integrated information age. This unique capability offers banks and insurance companies technology to scale and improve their data management, integration, and governance processes to manage risk and ensure ongoing compliance with a host of industry regulations, from Basel III and Dodd-Frank to Solvency II. Why is Vibe unique, and how does it help with risk management and regulatory compliance?
The data required for risk and compliance originates from tens if not hundreds of systems across all lines of business, including loan origination, loan servicing, credit card processing, deposit servicing, securities trading, brokerage, call center, online banking, and more, not to mention external data providers for market, pricing, positions, and corporate actions information. The volumes are greater than ever; the systems range from legacy mainframe trading systems to mobile banking applications; the formats vary from structured to semi-structured to unstructured; and a wide range of data standards must be dealt with, from MISMO®, FpML®, FIX®, and ACORD® to SWIFT, to name a few. Take all that into consideration, and the data administration, management, governance, and integration work required is massive, multifaceted, and fraught with risk and hidden costs, often caused by custom-coded processes or the use of standalone tools.
The Informatica Platform and Vibe help our customers take advantage of ever-evolving data technologies and innovations without having to recode, turning unique works of art into reusable artifacts across the information supply chain. In other words, Vibe powers the unique "Map Once. Deploy Anywhere." capabilities of the Informatica Platform, which accelerate project delivery by up to 5x, make the entire data lifecycle easier to manage, and eliminate the risks, costs, and short-lived value associated with hand coding or standalone tools. Here are some examples of Vibe for risk and compliance:
- Build data quality rules to standardize address information, remove or consolidate duplicates, and translate or standardize reference data and other information critical to calculating risk, within your ETL process or as a "data quality validation" service in upstream systems
- Build rules to standardize wire transfer data to the latest SWIFT formats within your payment hubs as well as leverage the same rules in facilitating payment transactions with your counterparties.
- Build and execute complex parsing and transformation processes, leveraging the power of Hadoop to handle large volumes of structured and unstructured data for analytics, and utilize the same rules in downstream credit, operational, and market risk data warehouses.
- Define standard data masking rules once, and leverage them when using data containing sensitive information for testing and development, as well as for enforcing data access rights for ongoing data privacy compliance.
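The "define once, reuse everywhere" masking idea in the last bullet can be sketched as a single rule shared between test-data provisioning and access enforcement. The field names and masking policy below are assumptions for illustration, not Informatica's actual rule syntax:

```python
# Toy masking rule, defined once and applied wherever sensitive data is
# used. Field names and policy (keep last 4 characters) are hypothetical.
def mask_value(value: str, keep_last: int = 4) -> str:
    """Replace all but the last few characters with '*'."""
    if len(value) <= keep_last:
        return "*" * len(value)
    return "*" * (len(value) - keep_last) + value[-keep_last:]

MASK_FIELDS = {"ssn", "account_number"}

def mask_record(record: dict) -> dict:
    """Apply the shared masking rule to every sensitive field."""
    return {k: mask_value(v) if k in MASK_FIELDS else v
            for k, v in record.items()}
```

The point of sharing one rule is consistency: a test database and a restricted production view mask the same field the same way, so neither becomes an accidental leak.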
The “Map Once. Deploy Anywhere.” capabilities inherent to Vibe drive:
- Faster adoption of new technologies and data – Banks and insurance companies can take rapid advantage of new data and technologies without having to know the details of the underlying platform, or having to hire highly specialized and costly programming resources.
- Reduced complexity through insulation from change – When data type, volume, source, platform or users change, financial institutions can simply redeploy their existing data integration instructions without re-specification, redesign or redevelopment on a new integration technology – like Hadoop.
Vibe is NOT a new product offering. It is a unique capability that Informatica supports through our existing platform comprised of our Data Integration, Data Quality, Master Data Management, and Informatica Life Cycle Management products. Whether it is Dodd Frank, Basel III, FATCA, or Solvency II, with Vibe, banks and insurance companies can ensure they have the right data and increase the potential to improve how they measure risk and ensure regulatory compliance. Visit Informatica’s Banking/Capital Markets and Insurance industry solutions section of our website for more information on how we help today’s global financial services industry.
Data is one of the most important and valuable assets for banks and insurance companies across the globe, helping them comply with industry regulations, improve customer experience, find new revenue opportunities, and reduce the cost of doing business. These needs and challenges are universal, and Informatica's industry-leading solutions have helped over 780 financial services institutions increase their potential for business success.
At Informatica World 2013, June 4-7 at the Aria Resort and Casino in Las Vegas, Nevada, we will be showcasing a wealth of valuable information to maximize value from your data assets and technology investments. The event includes over 100 interactive and informative breakout sessions across 6 dedicated tracks (Platform & Products, Architecture, Best Practices, Big Data, Hybrid IT, and Tech Talk).
There will also be a financial services path, featuring guest speakers from the banking and insurance industry as well as our financial services experts, including:
- Morgan Stanley Wealth Management: Accelerating Business Growth While Protecting Sensitive Data: Find out how Morgan Stanley built one of the largest Informatica platforms to mask and process over 150,000 objects used by more than 1,000 applications globally and comply with today's data privacy regulations.
- Wells Fargo Bank’s Data Governance Journey with Informatica: Hear and learn about Wells Fargo’s data governance strategy, program, and how Informatica is used to deliver actionable, transparent, and trusted data to the business.
- Liberty Mutual Insurance: Architecture and Best Practices with Informatica Data Integration: Learn how Informatica Data Integration’s metadata-driven architecture helps scale and support large data volumes and meet Liberty Mutual’s enterprise demands for performance and compliance.
- Addressing Top Business Priorities in Banking and Insurance with MDM: Peter Ku, Senior Director of Financial Services Industry Solutions, shares how Master Data Management is being used in banking and insurance to help address top business imperatives, from regulatory compliance to finding new revenue opportunities.
Register today at www.informaticaworld.com and I look forward to seeing you there!
The need to be more customer-centric in financial services is more important than ever as banks and insurance companies look for ways to reduce churn; those in the industry know that loyal customers spend more on higher-margin products and are likely to refer additional customers. Bankers and insurers who understand this, and get this right, are in a better position to maintain profitable and lasting customer loyalty and reap significant financial rewards. Current market conditions remain challenging and will be difficult to overcome without the right information management architecture to help companies become truly customer-centric. Here’s why:
- Customer satisfaction with retail banks has decreased for four consecutive years, with particularly low scores in customer service. In an industry survey, thirty-seven percent of customers who switched primary banking relationships cited poor customer service as the main reason.
- The commoditization of traditional banking and insurance products has rapidly increased client attrition and decreased acquisition rates. Industry reports estimate that banks are losing customers at an average rate of 12.5% per year, while average acquisition rates are at 13.5%, making acquisition nearly a zero-sum game. Further, the cost of acquiring new customers is estimated at five times the cost of retaining existing ones.
- Switching is easier than ever before. Customer churn is at an all-time high in most European countries. According to an industry survey, 42 percent of German banking customers had been with their main bank for less than a year. With customer acquisition costs running between €200 and €400, bankers and insurers need to keep their clients at least 5 to 7 years simply to break even.
- Mergers and acquisitions further compound the complexity and risk of maintaining customer relationships. According to a recent study, 17 percent of respondents who had gone through a merger or acquisition had switched at least one of their accounts to another institution after their bank was acquired, while an additional 31 percent said they were at least somewhat likely to switch over the next year.
Financial services professionals have long recognized the need to manage customer relationships vs. account relationships by shifting away from a product-centric culture toward a customer-centric model to maintain client loyalty and grow their bottom lines organically. Here are some reasons why:
- A 5% increase in customer retention can increase profitability by 35% in banking, 50% in brokerage, and 125% in the consumer credit card market.
- Banks can add more than $1 million to the profitability of their commercial banking business line simply by extending 16 large corporate relationships by one year, or by saving two such clients from defecting. In the insurance sector, a one percent increase in customer retention results in $1M in revenue.
- The average company has between a 60% and 70% probability of success selling more services to a current customer, a 20% to 40% probability of selling to a former customer, and a 5% to 20% probability of making a sale to a prospect.
- Up to 66% of current users of financial institutions’ social media sites use them to receive information about financial services, 32% to retrieve information about offers or promotions, and 30% to conduct customer-service-related activities.
So what does it take to become more Customer-centric?
Companies that have successful customer-centric business models share similar traits: cultures that place the customer first, people who are willing to go the extra mile, business processes designed with the customer’s needs in mind, product and marketing strategies designed to meet customers’ needs, and technology solutions that help access and deliver trusted, timely, and comprehensive information and intelligence across the business. These technologies include data integration, data quality, and master data management.
Why is data integration important? Customer centricity begins with the ability to access and integrate your data regardless of format, source system, structure, volume, or latency, from any location, including the cloud and social media sites. The data the business needs originates from many different systems inside and outside the organization, including new Software as a Service solutions and cloud-based technologies. Traditional hand-coded methods, one-off tools, and open source data integration tools cannot scale and perform well enough to access, manage, and deliver the right data to the systems and applications on the front lines. At the same time, we live in the Big Data era: increasing transaction volumes and new channels, including mobile devices and social media, together generate petabytes of data. Supporting a capable and sustainable customer-centric business model requires technology that can handle this complexity and scale with the business while reducing costs and improving productivity.
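To make the integration idea above concrete, here is a minimal, hypothetical sketch in Python of mapping two differently formatted sources (a CSV CRM extract and a JSON core-banking feed) onto one unified customer record. The field names and sample data are illustrative assumptions, not any real schema or Informatica API.

```python
import csv
import io
import json

# Illustrative sample data: the same customers described in two formats.
crm_csv = "id,name,segment\n101,Ana Silva,retail\n102,Ben Osei,private"
core_json = '[{"customerId": 101, "balance": 2500.0}, {"customerId": 102, "balance": 91000.0}]'

def load_crm(text):
    # Map CRM columns onto a unified schema keyed by customer id.
    return {int(r["id"]): {"cust_id": int(r["id"]), "full_name": r["name"], "segment": r["segment"]}
            for r in csv.DictReader(io.StringIO(text))}

def merge_core(customers, text):
    # Enrich the unified records with balances from the core banking feed.
    for rec in json.loads(text):
        cid = rec["customerId"]
        if cid in customers:
            customers[cid]["balance"] = rec["balance"]
    return customers

unified = merge_core(load_crm(crm_csv), core_json)
print(unified[101])  # one record combining CRM and core-banking attributes
```

In a real deployment this mapping logic would be defined once and, per the Vibe idea, redeployed as sources and platforms change rather than rewritten by hand.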
Data quality issues must be dealt with proactively and managed by both business and technology stakeholders. Though technology alone cannot prevent every data quality error, it is a critical part of your customer information management process, ensuring that any issues that exist are identified and resolved expeditiously. Specifically, a data quality solution should detect data quality errors in any source, allow business users to define data quality rules, let developers seamlessly consume and execute those rules, provide dashboards and reports for business stakeholders, and support ongoing quality monitoring to handle time- and business-sensitive exceptions. Data quality management can only scale and deliver value if an organization treats and manages data as an asset. It also helps to have a data governance framework consisting of processes, policies, standards, and people from business and IT working together.
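The pattern of business-defined rules executed by a generic engine can be sketched as follows. This is a hypothetical illustration: the rule definitions, field names, and threshold checks are assumptions for demonstration, not a real data quality product's rule syntax.

```python
import re

# Rules expressed as (field, predicate, message) tuples, the kind of thing a
# business user might define and a developer's engine would execute.
rules = [
    ("email", lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")), "invalid email"),
    ("balance", lambda v: isinstance(v, (int, float)) and v >= 0, "negative balance"),
]

def check(record):
    # Apply every rule; return the list of violations for this record.
    return [f"{field}: {msg}" for field, ok, msg in rules if not ok(record.get(field))]

customers = [
    {"id": 1, "email": "ana@example.com", "balance": 120.0},
    {"id": 2, "email": "not-an-email", "balance": -5.0},
]
# Exceptions feed dashboards and ongoing monitoring.
exceptions = {c["id"]: check(c) for c in customers if check(c)}
print(exceptions)  # {2: ['email: invalid email', 'balance: negative balance']}
```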
Lastly, growing your business, improving wallet share, retaining profitable relationships, and lowering the cost of managing customer relationships require a single, trusted, holistic, and authoritative source of customer information. Customer information has historically lived in applications across traditional business silos that lacked common processes to reconcile duplicate and conflicting records across business systems. Master Data Management solutions are purpose-built to break down those application and business silos and deliver a single view of the truth that all systems can benefit from. Master Data Management allows banks and insurance companies to access data, identify unique customer entities, relate accounts to each customer, and extend that relationship view across other customers and employees, from relationship bankers and financial advisors to agents and brokers.
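At the heart of identifying unique customer entities is record matching across silos. A minimal, hypothetical sketch of that step: two systems hold the same person under slightly different names, and a similarity threshold links their accounts into one "golden" record. The 0.85 threshold, field names, and sample data are assumptions, not a real MDM matching engine.

```python
from difflib import SequenceMatcher

# Illustrative records from two silos describing overlapping customers.
crm = [{"src": "CRM", "name": "Jonathan Smith", "acct": "C-17"}]
policy = [{"src": "POLICY", "name": "Jonathon Smith", "acct": "P-90"},
          {"src": "POLICY", "name": "Maria Lopez", "acct": "P-91"}]

def similar(a, b):
    # Simple fuzzy name comparison; real MDM uses far richer match rules.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

golden = []
for c in crm:
    accounts = [c["acct"]]
    for p in policy:
        if similar(c["name"], p["name"]) > 0.85:
            accounts.append(p["acct"])
    # One consolidated entity relating all matched accounts.
    golden.append({"name": c["name"], "accounts": accounts})
print(golden)
```

Production matching would also weigh addresses, identifiers, and household relationships, but the consolidation principle is the same.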
The need to attract and retain customers is a continuous journey for the financial industry, and that need is greater than ever before. The foundation for successful customer centricity is technology that helps access and deliver trusted, timely, consistent, and comprehensive customer information and insight across all channels, allowing you to avoid the mistakes of the past, stay ahead of your competition, and maximize value for your shareholders.
Sources:
- 2010 UK Retail Banking Satisfaction Study, J.D. Power and Associates, October 2010
- “Customer Winback”
- Mortgage Servicing News
According to the IDC Financial Insights 2013 Predictions report, financial institutions across most regions are getting serious about updating their legacy systems to reduce operating costs, automate labor-intensive processes, improve customer experiences, and avoid costly disruptions. Transforming a bank’s core systems or an insurance provider’s main business systems is a strategic decision with far-reaching implications for the firm’s future business strategies and success. When done right, the capabilities offered in today’s modern banking and insurance platforms can propel a company ahead of its competition — or be the nail in the coffin if your data is not migrated correctly, safeguards are not in place to protect against unwanted data breaches, and you are unable to decommission those old systems as planned.
One of the most important and critical phases of any legacy modernization project is the process of migrating data from old to new. Migrating data involves:
- Accessing existing data in the legacy systems
- Understanding the data structures that need to be migrated
- Transforming the data and executing one-to-one mappings to the relevant fields in the new system
- Identifying data quality errors and other gaps in the data
- Validating what is entered into the new system by catching transformation or mapping errors
- Seamlessly connecting to the target tables and fields in the new system
Sounds easy enough, right? Not so fast!
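To see why the migration steps above are harder than they look, consider just the mapping-and-validation portion. This is a hypothetical sketch: the legacy field names, target schema, and date format are illustrative assumptions, and real migrations involve thousands of fields and far richer rules.

```python
from datetime import datetime

# Declarative map from legacy column names to the new system's fields.
field_map = {"CUST_NM": "customer_name", "OPEN_DT": "opened_on", "BAL_AMT": "balance"}

def migrate(row):
    # Rename fields per the map, then validate the transformed values.
    errors = []
    out = {new: row.get(old) for old, new in field_map.items()}
    try:
        out["opened_on"] = datetime.strptime(out["opened_on"], "%Y%m%d").date().isoformat()
    except (TypeError, ValueError):
        errors.append("opened_on: bad date")
    try:
        out["balance"] = float(out["balance"])
    except (TypeError, ValueError):
        errors.append("balance: not numeric")
    return out, errors

good, errs = migrate({"CUST_NM": "Ana Silva", "OPEN_DT": "20091231", "BAL_AMT": "1500.25"})
bad, bad_errs = migrate({"CUST_NM": "Ben Osei", "OPEN_DT": "12/31/09", "BAL_AMT": "n/a"})
print(errs, bad_errs)  # [] ['opened_on: bad date', 'balance: not numeric']
```

Even this toy example shows how legacy format quirks surface as validation exceptions that must be triaged before cutover — multiply that across every table and the scope of the challenge becomes clear.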