Tag Archives: banking

What’s Driving Core Banking Modernization?


When’s the last time you visited your local bank branch and spoke to a human being? How about talking to your banker over the phone? Can’t remember? Well, you’re not alone, and don’t worry, it’s not a bad thing. The days of operating physical branches with expensive staff to greet and service customers are giving way to more modern, customer-friendly mobile banking applications that let consumers deposit checks from their phones, apply for a mortgage and sign closing documents electronically, and skip the trip to an ATM for physical cash by using mobile payment solutions like Apple Pay. In fact, a new report titled ‘Bricks + Clicks: Building the Digital Branch,’ from Jeanne Capachin and Jim Marous takes an in-depth look at how banks and credit unions are changing their branch and customer channel strategies to meet the demands of today’s digital banking customer.

Why am I talking about this? These market trends are dominating the CEO and CIO agenda in today’s banking industry. I just returned from the 2015 IDC Asian Financial Congress in Singapore, where the digital journey for the next-generation bank was a major agenda item. According to IDC Financial Insights, global banks will invest $31.5B USD in core banking modernization to enable these services, improve operational efficiency, and position themselves to better compete on technology and convenience across markets. Core banking modernization initiatives are complex, costly, and fraught with risk. Let’s take a closer look. (more…)


How to Improve Cross Sell and Customer Experience in Banking


I recently refinanced an existing mortgage on an investment property with my bank. Like most folks these days, I went to their website from my iPad, filled out an online application form, and received a pre-approval decision. Like any mortgage application, we stated our liabilities and assets, including credit cards, auto loans, and investment accounts, some of which were with this bank. During the process I also entered a new contact email address after my email service was hacked over the summer. The whole process took quite a bit of time, and being an impatient person I ended up logging off and coming back to the application over the weekend.

I walked into my local branch the following week to make a withdrawal and asked the teller how my mortgage application was going. She had no clue what I was talking about, as though I were a complete stranger. When I asked whether they had the updated email address I entered online, she was equally puzzled, stating that any update would require me to contact each of the other groups that held my brokerage, credit card, and mortgage services to make the change. That experience was extremely frustrating; I felt like my bank had no idea who I was as a customer, despite the fact that my ATM card has “Customer Since 1989” printed on it! Even worse, after entering my entire financial history on my mortgage application, I expected someone to reach out about moving my investment accounts to their bank, yet no one contacted me about any new offers or services. (Wondering if they really wanted my business??)

2015 will continue to be a challenging year for banks large and small to grow revenue, thanks to low interest rates, increasing competition from non-traditional segments, and lower customer loyalty toward existing institutions. The biggest opportunity for banks to grow revenue is to expand wallet share with existing customers. Tough times are ahead, as many bank customers continue to do business with a multitude of different financial institutions.

The average U.S. consumer owns between 8 and 12 financial products, ranging from basic checking, credit cards, and mortgages to a wider range of products, from IRAs to 401(k)s, as they get closer to retirement. On the flip side, the average institution has between 2 and 3 products per customer relationship. So why do banks continue to struggle to gain more wallet share from existing customers? Based on my experience and research, it comes down to two key reasons:

  • Traditional product-centric business silos and systems
  • Lack of a single trusted source of customer, account, household, and other shared data syndicated and governed across the enterprise

The first reason is the way banks are set up to do business. Back in the day, you would walk into your local branch office. As you entered the doors, bank tellers behind the counter stood ready to handle your deposits, withdrawals, and payments. If you needed to open a new account, you would talk to the new accounts manager sitting at their desk, waiting to offer you a cookie. For mortgages and auto loans, that was someone else on the far side of the building, equally eager to sign new customers. As banks diversified their businesses with new products, including investments, credit cards, insurance, etc., each product had its own operating unit. The advent of the internet did not really change the traditional “brick and mortar” business model. Instead, one would go to the bank’s website to transact or sign up for a new product; on the back end, however, the systems, people, and incentives to sell one product did not change, creating the same disconnected customer experience. Fast forward to today: these product-centric silos continue to exist in big and small banks across the globe, despite CEOs saying they are focused on delivering a better customer experience.

Why is that the case? The other reason is the systems within these product silos, including core banking, loan origination, loan servicing, and brokerage systems, which were never designed to share common information with each other. Traditional retail or consumer banks maintained customer, account, and household information within the Customer Information File (CIF), often part of the core banking system. Primary and secondary account holders would be grouped into a household based on the same last name and mailing address. Unfortunately, CIF systems were mainly used within retail banking. The problem grows exponentially as more systems are adopted to run the business across core business functions and traditional product silos. Each group and its systems managed their own versions of the truth, and these environments were never set up to share common data between them.

This is where Master Data Management technology can help.  “Master Data” is defined as a single source of basic business data used across multiple systems, applications, and/or processes.  In banking that traditionally includes information such as:

  • Customer name
  • Address
  • Email
  • Phone
  • Account numbers
  • Employer
  • Household members
  • Employees of the bank
  • Etc.

Master Data Management technology has evolved over the years, starting with Customer Data Integration (CDI) solutions providing merge and match capabilities between systems, to more modern platforms that govern consistent records and leverage inference analytics to determine relationships between entities across systems within an enterprise. Depending on your business need, there are core capabilities one should consider when investing in an MDM platform. They include:

Key functions to look for in an MDM solution:

  • Capturing existing master data from two or more systems, regardless of source, and creating a single source of truth for all systems to share. To do this right, you need seamless, real-time access to data regardless of source, format, or system.
  • Defining relationships between entities based on “business rules.” For example: “Household = same last name, address, and account number.” These relationship definitions can be complex and can change over time, so the ability for business users to create and modify those rules helps grow adoption and scalability across the enterprise (see the sketch after this list).
  • Governing consistency across systems by identifying changes to this common business information, determining whether each change is a unique, duplicate, or updated record, and updating the other systems that use and rely on that information. As with the first capability, you need the ability to easily deliver updates to dependent systems across the enterprise in real time, along with a flexible, user-friendly way of managing master record rules that avoids heavy IT development.
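To make the household rule concrete, here is a minimal sketch in plain Java of how a match-and-merge step might apply it. The record shape, field names, and survivorship choice are my own illustrative assumptions, not Informatica MDM’s actual API:

```java
import java.util.List;
import java.util.Objects;

// Illustrative source record; real MDM platforms work from configurable schemas.
record CustomerRecord(String sourceSystem, String lastName, String address,
                      String email, String accountNumber) {}

public class HouseholdMatcher {

    // The business rule from above: same last name, address, and account number.
    static boolean sameHousehold(CustomerRecord a, CustomerRecord b) {
        return a.lastName().equalsIgnoreCase(b.lastName())
            && a.address().equalsIgnoreCase(b.address())
            && Objects.equals(a.accountNumber(), b.accountNumber());
    }

    // Naive survivorship: merge matching records into one golden record,
    // preferring the first non-blank email seen across the sources.
    static CustomerRecord merge(List<CustomerRecord> matches) {
        String email = matches.stream().map(CustomerRecord::email)
                              .filter(e -> e != null && !e.isBlank())
                              .findFirst().orElse(null);
        CustomerRecord first = matches.get(0);
        return new CustomerRecord("golden", first.lastName(), first.address(),
                                  email, first.accountNumber());
    }

    public static void main(String[] args) {
        var core = new CustomerRecord("core-banking", "Smith", "12 Elm St", null, "4711");
        var online = new CustomerRecord("online-banking", "SMITH", "12 elm st",
                                        "new@mail.com", "4711");
        if (sameHousehold(core, online)) {
            System.out.println(merge(List.of(core, online)));
        }
    }
}
```

A real platform would add fuzzy matching, configurable survivorship, and stewardship workflows on top of this basic shape.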

Now, what would my experience have been if my bank had a capable Master Data Management solution in place? Let’s take a look:

Scenario 1: Start a new mortgage application online

  • Without MDM: The customer is required to fill out the usual information (name, address, employer, email, phone, existing accounts, etc.).
  • With MDM: The online banking system references the MDM solution, which delivers the most recent master record of this customer based on existing data from the bank’s core banking and brokerage systems, and pre-populates the form with those details, including information on their existing savings and credit card accounts.
  • Benefits: Accelerates new customer on-boarding and mitigates the risk of a competitor grabbing my attention to do business with them.

Scenario 2: Customer provides a new email address

  • Without MDM: The customer enters it on the mortgage application, and it is captured only in the bank’s loan origination system.
  • With MDM: MDM recognizes that the email address differs from what exists in other systems and asks the customer to confirm the change. The master record is then updated and shared across the bank’s other systems in real time, including the downstream data warehouse used by Marketing to drive cross-sell campaigns.
  • Benefits: Ensures every part of the bank shares the latest information about the customer, and avoids disruptions to future communication of new products and offers to grow wallet share.

The banking industry continues to face headwinds from a revenue, risk, and regulatory standpoint. Traditional product-centric silos will not go away anytime soon, and new CRM and client onboarding solutions may help improve customer engagement and productivity within a firm; however, front-office business applications are not designed to manage and share critical master data across your enterprise. Anyhow, I decided to bank with another institution, one that I know has Master Data Management. Are you ready for a new bank too?

For more information on Informatica’s Master Data Management:


Keeping Information Governance Relevant

Gartner’s official definition of Information Governance is “…the specification of decision rights and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archival and deletion of information. It includes the processes, roles, standards, and metrics that ensure the effective and efficient use of information in enabling a business to achieve its goals.” It therefore looks to address important considerations that key stakeholders within an enterprise face.

A CIO of a large European bank once asked me – “How long do we need to keep information?”


This bank had to govern, index, search, and provide content to auditors to show it was managing data appropriately to meet Dodd-Frank regulation. In the past, this information was retrieved from a database or email. Now, however, the bank was required to produce voice recordings from phone conversations with customers, show the relevant Reuters feeds coming in, and document all appropriate IMs and social media interactions between employees.

All of these were systems the business had never considered before. These environments continued to capture and create data, and with it complex challenges. These islands of information seemingly have nothing to do with each other, yet they impact how the bank governs itself and how it saves the records associated with trading or financial information.

Coping with the sheer growth is one issue; what to keep and what to delete is another. There is also the issue of what to do with all the data once you have it. The data is potentially a gold mine for the business, but most businesses just store it and forget about it.

Legislation, in tandem, is becoming more rigorous and there are potentially thousands of pieces of regulation relevant to multinational companies. Businesses operating in the EU, in particular, are affected by increasing regulation. There are a number of different regulations, including Solvency II, Dodd-Frank, HIPAA, Gramm-Leach-Bliley Act (GLBA), Basel III and new tax laws. In addition, companies face the expansion of state-regulated privacy initiatives and new rules relating to disaster recovery, transportation security, value chain transparency, consumer privacy, money laundering, and information security.

Whatever your size or type of business, there are several key processes you must undertake in order to create an effective information governance program, and an enterprise should consider them before developing and implementing a policy framework. As a Business Transformation Architect, I see three foundation stones of an effective Information Governance Program:

Assess Your Business Maturity

Understanding the full scope of requirements on your business is a heavy task. Assess whether your business is mature enough to embrace information governance. Many businesses in EMEA do not have an information governance team already in place, but instead have key stakeholders with responsibility for information assets spread across their legal, security, and IT teams.

Undertake a Regulatory Compliance Review

Understanding the legal obligations of your business is critical in shaping an information governance program. Every business is subject to numerous compliance regimes managed by multiple regulatory agencies, which can differ across markets. Many compliance requirements depend upon the number of employees and/or turnover reaching certain limits. For example, certain records may need to be stored for 6 years in Poland, yet the same records may need to be stored for only 3 years in France.
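As a minimal sketch of what such jurisdiction-specific retention rules might look like once codified (the periods below simply restate the Poland/France example above and are illustrative, not legal guidance):

```java
import java.time.LocalDate;
import java.time.Period;
import java.util.Map;

// Toy retention schedule keyed by country code; a real program would source
// these periods, per record type, from the regulatory compliance review.
public class RetentionSchedule {
    private static final Map<String, Period> RETENTION = Map.of(
        "PL", Period.ofYears(6),   // example from the text: 6 years in Poland
        "FR", Period.ofYears(3));  // 3 years in France

    // Retain conservatively (10 years) when no rule is known for a country.
    static boolean eligibleForDeletion(String country, LocalDate created) {
        Period keep = RETENTION.getOrDefault(country, Period.ofYears(10));
        return created.plus(keep).isBefore(LocalDate.now());
    }

    public static void main(String[] args) {
        System.out.println(eligibleForDeletion("FR", LocalDate.of(2010, 1, 15))); // true
    }
}
```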

Establish an Information Governance Team

It is important that a core team be assigned responsibility for the implementation and success of the information governance program. This steering group and a nominated information governance lead can then drive forward operational and practical issues, including agreeing and developing a work program, developing policy and strategy, and planning communication and awareness.


The Ones Not Screwing Up with Compliance Will Win

A few weeks ago, a regional US bank asked me to perform some compliance and use case analysis around fixing their data management situation. This bank prides itself on customer service and SMB focus while using large-bank product offerings. However, they were about a decade behind most banks in modernizing their IT infrastructure to stay operationally on top of things.

Bank Efficiency Ratio per AUM (Assets under Management), bankregdata.com

This included technologies like ESB, BPM, CRM, etc. They were also sub-optimal users of EDW and analytics capabilities. Having said all this, there was a commitment to change things up, which is always a needed first step in any recovery program.

THE STAKEHOLDERS

As I conducted my interviews across various departments (list below) it became very apparent that they were not suffering from data poverty (see prior post) but from lack of accessibility and use of data.

  • Compliance
  • Vendor Management & Risk
  • Commercial and Consumer Depository products
  • Credit Risk
  • HR & Compensation
  • Retail
  • Private Banking
  • Finance
  • Customer Solutions

FRESH BREEZE

This lack of use occurred across the board. The natural reaction was to throw more bodies and more Band-Aid marts at the problem. Users also started to operate under the assumption that it would never get better. They just resigned themselves to mediocrity. When some new players came into the organization from various systemically critical banks, they shook things up.

Here is a list of use cases they want to tackle:

  • The proposition of real-time offers based on customer events, as simple as offering investment banking products on an unusually high inflow of cash into a deposit account (see the sketch after this list).
  • The use of all mortgage application information to understand debt/equity ratio to make relevant offers.
  • The capture of true product and customer profitability across all lines of commercial and consumer products including trust, treasury management, deposits, private banking, loans, etc.
  • The agile evaluation, creation, testing, and deployment of new terms on existing products and products under development, shortening the product development life cycle.
  • The reduction of wealth management advisors’ time to research clients and prospects.
  • The reduction of unclaimed use tax, insurance premiums, and leases being paid on consumables, real estate, and requisitions due to the incorrect status and location of equipment. This originated from assets no longer owned, scrapped, or moved to a different department, etc.
  • The more efficient reconciliation between transactional systems and finance, which often uses multiple party IDs per contract change in accounts receivable, while the operating division uses one based on a contract and its addendums. An example would be vendor payment consolidation to create a true supplier spend and thus take advantage of volume discounts.
  • The proactive creation of a central compliance footprint (AML, 314, Suspicious Activity, CTR, etc.), allowing for quicker turnaround and fewer audit instances from MRAs (matters requiring attention).
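As a minimal sketch of the first use case, assuming an event-driven feed of deposit events (the record shape and thresholds are invented for illustration):

```java
import java.util.List;

// Hypothetical deposit event carrying a trailing average for context.
record DepositEvent(String accountId, double amount, double trailing90DayAvg) {}

public class OfferTrigger {

    // Toy rule: flag inflows well above the account's recent average.
    static boolean unusuallyHighInflow(DepositEvent e) {
        return e.amount() > 10 * e.trailing90DayAvg() && e.amount() > 50_000;
    }

    static List<String> offersFor(DepositEvent e) {
        return unusuallyHighInflow(e)
            ? List.of("money-market sweep", "wealth advisor consultation")
            : List.of();
    }

    public static void main(String[] args) {
        var e = new DepositEvent("acct-42", 250_000, 8_000);
        System.out.println(offersFor(e)); // [money-market sweep, wealth advisor consultation]
    }
}
```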

MONEY TO BE MADE – PEOPLE TO SEE

Adding these up came to about $31 to $49 million annually in cost savings, new revenue or increased productivity for this bank with $24 billion total assets.

So now that we know there is money to be made by fixing the data of this organization, how can we realistically roll this out in an organization with many competing IT needs?

The best way to go about this is to attach any kind of data management project to a larger, business-oriented project, like CRM or EDW.  Rather than wait for these to go live without good seed data, why not feed them with better data as a key work stream within their respective project plans?

To summarize my findings, I want to share three quotes from people I interviewed. A lady who recently had to struggle through an OCC audit told me she believes that the banks that can remain compliant at the lowest cost will ultimately win the end game; here she meant particularly tier 2 and 3 size organizations. A gentleman from commercial banking left this statement with me: “Knowing what I know now, I would not bank with us.” The same lady also said, “We engage in spreadsheet Kung Fu” to bring data together.

Given all this, what would you suggest?  Have you worked with an organization like this? Did you encounter any similar or different use cases in financial services institutions?


Major Financial Services Institution Uses Technology to Improve Your Teller Experience


Like many American men, I judge my banking experience by the efficiency of my transaction time. However, my wife often still likes to go into the bank and see her favorite teller.

For her, banking is a bit more of a social experience. And every once in a while, my wife even drags me into her bank as well. But like many of my male counterparts, I still judge the quality of the experience by the operational efficiency of her teller. And the thing that I hate the most is when our visit is lengthened because the teller can’t do something and has to get the bank manager’s approval.

Now, a major financial institution has decided to make my life, and even my wife’s life, better. Using Informatica RulePoint, they have come up with a way to improve teller operational efficiency and customer experience while actually decreasing operational business risk. Amazing!

How has this bank done this magic? They make use of the data that they already have to create a better banking experience. They already capture historical transaction data and team member performance against each transaction in multiple databases. What they are doing now is using this information to make better decisions. With it, the bank can create and update a risk assessment score for each team member at a branch location. Then, using Informatica RulePoint, they have created approximately 100 rules that can change a teller’s authority based upon the new transaction, the teller’s transaction history, and the teller’s risk assessment score. This means that if my wife carefully picks the right teller, she can speed through the line without waiting for management approval.
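The post doesn’t show the actual rules, so here is a minimal sketch of what one such authority rule might look like, written in plain Java rather than RulePoint’s own rule definitions, with thresholds and field names invented for illustration:

```java
// Illustrative inputs; the real system draws these from transaction history
// and a per-teller risk assessment score, as described above.
record TellerProfile(String tellerId, double riskScore, int monthsOnJob) {}
record TellerTransaction(String type, double amount) {}

public class TellerAuthority {

    // Toy rule: low-risk, experienced tellers get a higher approval limit,
    // so fewer transactions require a manager override.
    static boolean needsManagerApproval(TellerProfile t, TellerTransaction tx) {
        double limit;
        if (t.riskScore() < 0.2 && t.monthsOnJob() >= 24) {
            limit = 10_000;
        } else if (t.riskScore() < 0.5) {
            limit = 5_000;
        } else {
            limit = 2_000;
        }
        return tx.amount() > limit;
    }

    public static void main(String[] args) {
        var teller = new TellerProfile("t-17", 0.1, 36);
        var tx = new TellerTransaction("withdrawal", 7_500);
        System.out.println(needsManagerApproval(teller, tx)); // false
    }
}
```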

So the message at this bank is that the fastest teller is the best teller. To me, this is really using data to improve customer experience and allow for less time in line. Maybe I should get this bank to talk to my auto mechanic next!


Customer Centric Financial Services

The business of financial services is transforming before our eyes. Traditional banking and insurance products have become commoditized. As each day passes, consumers demand increasingly personalized products and services. Social and mobile channels continue to overthrow traditional communication methods. To survive and grow in this complex environment, financial institutions must do three things:

  1. Attract and retain the best customers
  2. Grow wallet share
  3. Deliver top-notch customer experience across all channels and touch points

The finance industry is traditionally either product-centric or account-centric. However, to succeed in the future, financial institutions must become customer-centric. Becoming customer-centric requires changes to your people, processes, technology, and culture. You must offer the right product or service to the right customer, at the right time, via the right channel. To achieve this, you must ensure alignment between business and technology leaders. It will require targeted investments to grow the business, particularly to modernize legacy systems.

To become customer-centric, business executives are investing in Big Data and in legacy modernization initiatives. These investments are helping Marketing, Sales and Support organizations to:

  • Improve conversion rates on new marketing campaigns for cross-sell and up-sell activities
  • Measure customer sentiment on particular marketing and sales promotions or on the financial institution as a whole
  • Improve sales productivity ratios by targeting the right customers with the right product at the right time
  • Identify key indicators that determine and predict profitable and unprofitable customers
  • Deliver an omni-channel experience across all lines of business, devices, and locations

At Informatica, we want to help you succeed. We want you to maximize the value in these investments. For this reason, we’ve written a new eBook titled: “Potential Unlocked – Improving revenue and customer experience in financial services”. In the eBook, you will learn:

  • The role customer information plays in taking customer experience to the next level
  • Best practices for shifting account-centric operations to customer-centric operations
  • Common barriers and pitfalls to avoid
  • Key considerations and best practices for success
  • Strategies and experiences from best-in-class companies

Take a giant step toward Customer-Centricity: Download the eBook now.


Open Source, Next Generation Data Encoding

Today is an exciting day for technology in high performance electronic trading. By the time you read this, the CME Group, Real Logic Ltd., and Informatica will have announced a new open source initiative. I’ve been collaborating on this work for a few months and I feel it is some great technology. I hope you will agree.

Simple Binary Encoding (SBE) is an encoding for FIX that is being developed by the FIX protocol community as part of its High Performance Working Group. The goal is to produce a binary encoding representation suitable for low-latency financial trading. The CME Group, Real Logic, and Informatica have sponsored the development of an open source implementation of an early version of the SBE specification, undertaken by Martin Thompson (of Real Logic, formerly of LMAX) and myself, Todd Montgomery (of Informatica). The result is a very high performance encoding/decoding mechanism for data layout that is tailored not just to the demands of high performance, low-latency trading applications, but has implications for all manner of serialization and marshaling, in use cases from Big Data analytics to device data capture.

Financial institutions, and other businesses, need to serialize data structures for transmission over networks as well as for storage. SBE is a developing standard for how to encode/decode FIX data structures over a binary medium at high speed with low latency. The SBE project is most similar to Google Protocol Buffers. However, looks are quite deceiving: SBE is an order of magnitude faster and immensely more efficient for encoding and decoding. This focus on performance means application developers can turn their attention to the application logic instead of the details of serialization. There are a number of advantages to SBE beyond speed, although speed is of primary concern.

  • SBE provides a strong typing mechanism in the form of schemas for data objects
  • SBE only generates the overhead of versioning if the schema needs to handle versioning and if so, only on decode
  • SBE uses an Intermediate Representation (IR) for decoupling schema specification, optimization, and code generation
  • SBEs use of IR will allow it to provide various data layout optimizations in the near future
  • SBE initially provides Java, C++98, and C# code generators with more on the way

What breakthrough has led to SBE being so fast?

It isn’t new, nor a breakthrough. SBE has been designed and implemented with the concepts and tenets of Mechanical Sympathy. Most software is developed with abstractions to mask away the details of CPU architecture, disk access, OS concepts, etc. Not so for SBE. Martin and I designed it utilizing everything we know about how CPUs, memory, compilers, managed runtimes, etc. work, making it very fast by working _with_ the hardware instead of against it.
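To give a feel for the approach, here is a hand-written sketch in that spirit, not SBE’s actual generated code, with field names and offsets invented: encoding and decoding reduce to absolute reads and writes at fixed offsets in a buffer, with no intermediate objects to allocate.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Hand-rolled fixed-layout codec: every field lives at a known offset, so
// encode/decode is just direct puts/gets against the buffer.
public final class OrderCodec {
    private static final int ORDER_ID_OFFSET = 0;   // int64
    private static final int PRICE_OFFSET    = 8;   // int64, price in ticks
    private static final int QTY_OFFSET      = 16;  // int32
    public  static final int ENCODED_LENGTH  = 20;

    public static void encode(ByteBuffer buf, long orderId, long priceTicks, int qty) {
        buf.order(ByteOrder.LITTLE_ENDIAN);
        buf.putLong(ORDER_ID_OFFSET, orderId);   // absolute puts: no copies,
        buf.putLong(PRICE_OFFSET, priceTicks);   // no temporary objects
        buf.putInt(QTY_OFFSET, qty);
    }

    public static long orderId(ByteBuffer buf)    { return buf.getLong(ORDER_ID_OFFSET); }
    public static long priceTicks(ByteBuffer buf) { return buf.getLong(PRICE_OFFSET); }
    public static int  qty(ByteBuffer buf)        { return buf.getInt(QTY_OFFSET); }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocateDirect(ENCODED_LENGTH);
        encode(buf, 42L, 101_250L, 500);
        System.out.println(orderId(buf) + " @ " + priceTicks(buf) + " x " + qty(buf));
    }
}
```

Real SBE generates this kind of flyweight from an XML schema, which is where the strong typing and versioning listed above come from.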

Martin’s blog will have a more detail-oriented, technical discussion of SBE sometime later. But I encourage you to look at it and try it out. The work is open to the public under an Apache Public License.

Find out more on the FIX/SBE specification and SBE on github.

———————————————–

Todd Montgomery

Todd L. Montgomery is a Vice President of Architecture for Informatica and the chief designer and implementer of the 29West low latency messaging products. The Ultra Messaging product family (formerly known as LBM) has over 190 production deployments within electronic trading across many asset classes and pioneered the broker-less messaging paradigm. In the past, Todd has held architecture positions at TIBCO and Talarian as well as lecture positions at West Virginia University, contributed to the IETF, and performed research for NASA in various software fields. With a deep background in messaging systems, high performance systems, reliable multicast, network security, congestion control, and software assurance, Todd brings a unique perspective tempered by over 20 years of practical development experience.


Sub-100 Nanosecond Pub/Sub Messaging: What Does It Matter?

Our announcement last week marked an exciting milestone for those of us who started at 29West supporting the early high-frequency traders from 2004 to 2006. Last week, we announced the next step in a 10-year effort that has now seen us lower the bar for low-latency messaging by six orders of magnitude, in Version 6.1 of Informatica Ultra Messaging with Shared Memory Acceleration (SMX). The really cool thing is that we have helped early customers like Intercontinental Exchange and Credit Suisse take advantage of the reduction from 2.5 million nanoseconds (ns) of latency to now as low as 37 ns, on commodity hardware and networks, without having to switch products or do major rewrites of their code.

But as I said in the title, what does it matter? Does being able to send messages to multiple receivers within a single box trading system or order matching engine in 90 ns as opposed to one microsecond really make a difference?

Well, according to a recent article by Scott Appleby on the TabbFORUM, “The Death of Alpha on Wall Street,”* the only way for investment banks to find alpha, or excess returns, is “to find valuation correlations among markets to extract microstructure alpha.” He states, “Getco, Tradebot and Renaissance use technology to find valuation correlations among markets to extract microstructure alpha; this still works, but requires significant capital.” What the extra hundreds of nanoseconds that SMX frees up allow a company to do is make their matching algorithms or order routers that much smarter, by doing dozens of additional complex calculations before the computer makes a decision. Furthermore, by letting the messaging layer take over integrating software components that may be less critical to producing alpha (but very important for operational risk control, like guaranteeing that messages can be captured off the single-box trading system for compliance and disaster recovery), busy software developers can focus on changes in the microstructure of the markets.

The key SMX innovation is another “less is more” style engineering feat from our team. Basically, SMX eliminates any copying of messages from the message delivery path. And if the processes in your trading system happen to be running within the same CPU, on the same or different cores, this means messages are being sent within the memory cache of the core or CPU. The other reason this matters is that this product uniquely (as far as I know) allows zero-copy shared memory communication between Java, C, and Microsoft .NET applications, so developers can fully leverage the best features and the knowledge of their teams to deploy complex high-performance applications. For example, this allows third-party feed handlers built in C to communicate at extremely low latencies with algo engines written in Java.
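For intuition only, here is a minimal sketch of the zero-copy idea behind shared-memory transports, not the SMX implementation itself: two processes map the same region, and “sending” is just writing the payload in place and then publishing a flag. A real transport would add proper memory ordering, sequence numbers, and a ring buffer.

```java
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;

// Writer side of a toy shared-memory channel: map a file both processes can
// see, write the payload directly into the mapped region, then set a flag.
// No bytes are copied onto a socket; the reader sees the same physical pages.
public class ShmWriter {
    public static void main(String[] args) throws Exception {
        try (RandomAccessFile f = new RandomAccessFile("/tmp/demo-shm", "rw");
             FileChannel ch = f.getChannel()) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, 4096);
            byte[] payload = "NEW ORDER 42".getBytes(StandardCharsets.US_ASCII);
            buf.putInt(4, payload.length);   // length field
            buf.position(8);
            buf.put(payload);                // message body, written in place
            buf.putInt(0, 1);                // publish flag, written last
        }
    }
}
```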

So congrats to the UM development team for achieving this important milestone, and “thanks” to our customers for continuing to push us to provide you with that “lagniappe” of extra time that can make all the difference in the success of your trading strategies and your businesses.


*- http://tabbforum.com/opinions/the-death-of-alpha-on-wall-street?utm_source=TabbFORUM+Alerts&utm_campaign=1c01537e42-UA-12160392-1&utm_medium=email&utm_term=0_29f4b8f8f1-1c01537e42-270859141


Bankers, Insurers – How Customer Centric Are You?

The need to be more customer-centric in financial services is more important than ever, as banks and insurance companies look for ways to reduce churn; those in the industry know that loyal customers spend more on higher-margin products and are likely to refer additional customers. Bankers and insurers who understand this, and get it right, are in a better position to maintain profitable and lasting customer loyalty and reap significant financial rewards. The current market conditions remain challenging and will be difficult to overcome without the right information management architecture to help companies be truly customer-centric. Here’s why:

  • Customer satisfaction with retail banks has decreased for four consecutive years, with particularly low scores in customer service.[1] In an industry survey, thirty-seven percent of customers who switched primary relationships cited poor customer service as the main reason.
  • The commoditization of traditional banking and insurance products has rapidly increased client attrition and decreased acquisition rates. Industry reports estimate that banks are losing customers at an average rate of 12.5% per year, while average acquisition rates are at 13.5%, making acquisitions nearly a zero-sum game. Further, the cost of acquiring new customers is estimated at five times the rate of retaining existing ones.
  • Switching is easier than ever before. Customer churn is at an all-time high in most European countries. According to an industry survey, 42 percent of German banking customers had been with their main bank for less than a year. With customer acquisition costs running between €200 and €400, bankers and insurers need to keep their clients at least 5 to 7 years simply to break even.
  • Mergers and acquisitions further increase the complexity and risks of maintaining customer relationships. According to a recent study, 17 percent of respondents who had gone through a merger or acquisition had switched at least one of their accounts to another institution after their bank was acquired, while an additional 31 percent said they were at least somewhat likely to switch over the next year.[2]

Financial services professionals have long recognized the need to manage customer relationships vs. account relationships by shifting away from a product-centric culture toward a customer-centric model to maintain client loyalty and grow their bottom lines organically. Here are some reasons why:

  • A 5% increase in customer retention can increase profitability by 35% in banking, 50% in brokerage, and 125% in the consumer credit card market.[3]
  • Banks can add more than $1 million to the profitability of their commercial banking business line by simply extending 16 of these large corporate relationships by one year, or by saving two such clients from defecting. In the insurance sector, a one percent increase in customer retention results in $1M in revenue.
  • The average company has between a 60% and 70% probability of success selling more services to a current customer, a 20% to 40% probability of selling to a former customer, and a 5% to 20% probability of making a sale to a prospect.[4]
  • Up to 66% of current users of financial institutions’ social media sites engage in receiving information about financial services, 32% use it to retrieve information about offers or promotions and 30% to conduct customer service related activities.[5]

So what does it take to become more Customer-centric?

Companies with successful customer-centric business models share similar traits: cultures that place the customer first, people who are willing to go the extra mile, business processes designed with the customer’s needs in mind, product and marketing strategies designed to meet a customer’s needs, and technology solutions that help access and deliver trusted, timely, and comprehensive information and intelligence across the business. These technologies include data integration, data quality, and master data management, each discussed below.

Why is data integration important? Customer centricity begins with the ability to access and integrate your data regardless of format, source system, structure, volume, or latency, from any location, including the cloud and social media sites. The data the business needs originates from many different systems across and outside the organization, including new Software-as-a-Service solutions and cloud-based technologies. Traditional hand-coded methods, one-off tools, and open source data integration tools are not able to scale and perform well enough to effectively and efficiently access, manage, and deliver the right data to the systems and applications on the front lines. At the same time, we live in the Big Data era: increasing transaction volumes and new channel adoption, including mobile devices and social media, combine to generate petabytes of data. Supporting a capable and sustainable customer-centric business model requires technology that can handle this complexity and scale with the business, while reducing costs and improving productivity.

Data quality issues must be dealt with proactively and managed by both business and technology stakeholders. Though technology itself cannot prevent all data quality errors from happening, it is a critical part of your customer information management process, ensuring that any issues that exist are identified and dealt with in an expeditious manner. Specifically, look for a Data Quality solution that can detect data quality errors in any source, allow business users to define data quality rules, support seamless consumption of those rules by developers, provide dashboards and reports for business stakeholders, and offer ongoing quality monitoring to deal with time- and business-sensitive exceptions. Data quality management can only scale and deliver value if an organization believes in and manages data as an asset. It also helps to have a data governance framework consisting of processes, policies, standards, and people from business and IT working together.
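As a minimal sketch of what business-defined data quality rules can look like once expressed declaratively (rule names and checks below are invented for illustration):

```java
import java.util.Map;
import java.util.function.Predicate;

// Toy rule registry: analysts name the rules; each rule is a simple predicate
// that can be executed in bulk against incoming customer records.
public class DataQualityRules {
    static final Map<String, Predicate<String>> RULES = Map.of(
        "email_has_at_sign",  v -> v != null && v.contains("@"),
        "account_is_numeric", v -> v != null && v.matches("\\d{6,12}"),
        "name_not_blank",     v -> v != null && !v.isBlank());

    static boolean check(String ruleName, String value) {
        return RULES.getOrDefault(ruleName, v -> true).test(value);
    }

    public static void main(String[] args) {
        System.out.println(check("email_has_at_sign", "not-an-email")); // false
        System.out.println(check("account_is_numeric", "00471123"));   // true
    }
}
```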

Lastly, growing your business, improving wallet share, retaining profitable relationships, and lowering the cost of managing customer relationships require a single, trusted, holistic, and authoritative source of customer information. Customer information has historically lived in applications across traditional business silos that lacked any common processes to reconcile duplicate and conflicting information across business systems. Master Data Management solutions are purposely designed to help break down the traditional application and business silos and deliver that single view of the truth for all systems to benefit from. Master Data Management allows banks and insurance companies to access data, identify unique customer entities, relate accounts to each customer, and extend that relationship view across other customers and employees, from relationship bankers and financial advisors to existing agents and brokers.

The need to attract and retain customers is a continuous journey for the financial industry, and that need is greater than ever before. The foundation for successful customer centricity is technology that can help access and deliver trusted, timely, consistent, and comprehensive customer information and insight across all channels, letting you avoid the mistakes of the past, stay ahead of your competition, and maximize value for your shareholders.

[1] 2010 UK Retail Banking Satisfaction Study, J.D. Power and Associates, October 2010.


Maximize the Potential Business Value from New Core Banking/Insurance Application Investments


According to the IDC Financial Insights 2013 Predictions report, financial institutions across most regions are getting serious about updating their legacy systems to reduce operating costs, automate labor-intensive processes, improve customer experiences, and avoid costly disruptions. Transforming a bank’s core systems or an insurance provider’s main business systems is a strategic decision with far-reaching implications for the firm’s future business strategies and success. When done right, the capabilities offered in today’s modern banking and insurance platforms can propel a company in front of its competition; done wrong, they can be the nail in the coffin if your data is not migrated correctly, safeguards are not in place to protect against unwanted data breaches, or you are not able to decommission the old systems as planned.

One of the most critical phases of any legacy modernization project is the process of migrating data from old to new. Migrating data involves:

  • Accessing existing data in the legacy systems
  • Understanding the data structures that need to be migrated
  • Transforming the data and executing one-to-one mappings to the relevant fields in the new system (sketched below)
  • Identifying data quality errors and other gaps in the data
  • Validating what is entered into the new system by identifying transformation or mapping errors
  • Seamlessly connecting to the target tables and fields in the new system
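Here is a minimal sketch of the mapping-and-validation step, assuming invented legacy and target field names; a real migration would drive this from metadata rather than a hard-coded map:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy one-to-one field mapping from a legacy record to the new system's
// schema, flagging simple gaps (missing values) before the load step.
public class MigrationMapper {
    static final Map<String, String> FIELD_MAP = Map.of(
        "CUST_NM", "customerName",
        "ADDR_1",  "addressLine1",
        "ACCT_NO", "accountNumber");

    static Map<String, String> mapRecord(Map<String, String> legacy, List<String> errors) {
        Map<String, String> target = new HashMap<>();
        FIELD_MAP.forEach((src, dst) -> {
            String value = legacy.get(src);
            if (value == null || value.isBlank()) {
                errors.add("Missing or blank value: " + src + " -> " + dst);
            } else {
                target.put(dst, value.trim());
            }
        });
        return target;
    }

    public static void main(String[] args) {
        List<String> errors = new ArrayList<>();
        Map<String, String> migrated = mapRecord(
            Map.of("CUST_NM", "Jane Smith", "ADDR_1", "12 Elm St", "ACCT_NO", ""), errors);
        System.out.println(migrated);
        System.out.println(errors); // [Missing or blank value: ACCT_NO -> accountNumber]
    }
}
```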

Sounds easy enough, right? Not so fast! (more…)
