
How to Improve Cross Sell and Customer Experience in Banking

I recently refinanced an existing mortgage on an investment property with my bank. Like most folks these days, I went to the bank's website from my iPad, filled out an online application form, and received a pre-approval decision. Like any mortgage application, we stated our liabilities and assets, including credit cards, auto loans, and investment accounts, some of which were with this bank. During the process I also entered a new contact email address, since my email service had been hacked over the summer. The whole process took quite a bit of time, and being an impatient person, I ended up logging off and coming back to the application over the weekend.

I walked into my local branch the following week to make a withdrawal and asked the teller how my mortgage application was going. She had no clue what I was talking about, as though I were a complete stranger. When I asked whether they had the updated email address I entered online, she was equally puzzled, explaining that I would have to contact each of the groups that held my brokerage, credit card, and mortgage services to make the change. The experience was extremely frustrating; I felt like my bank had no idea who I was as a customer, despite the fact that my ATM card has "Customer Since 1989" printed on it! Even worse, after entering my entire financial history on the mortgage application, I expected someone to reach out about moving my investment accounts to this bank, yet no one contacted me with any new offers or services. (Did they really want my business?)

2015 will continue to be a challenging year for banks large and small to grow revenue, driven by low interest rates, increasing competition from non-traditional players, and lower customer loyalty to existing institutions. The biggest opportunity for banks to grow revenue is to expand wallet share with existing customers. Tough times are ahead, as many bank customers continue to do business with a multitude of different financial institutions.

The average U.S. consumer owns between 8 and 12 financial products, ranging from basics such as checking accounts, credit cards, and mortgages to a wider range of products, from IRAs to 401(k)s, as they get closer to retirement. On the flip side, the average institution holds only two to three products per customer relationship. So why do banks continue to struggle to gain more wallet share from existing customers? Based on my experience and research, it comes down to two key reasons:

  • Traditional product-centric business silos and systems
  • Lack of a single trusted source of customer, account, household, and other shared data syndicated and governed across the enterprise

The first reason is the way banks are set up to do business. Back in the day, you would walk into your local branch office. As you entered the doors, bank tellers behind the counter stood ready to handle your deposits, withdrawals, and payments. If you needed to open a new account, you would talk to the new accounts manager sitting at a desk, waiting to offer you a cookie. For mortgages and auto loans, it would be someone else on the far side of the building, equally eager to sign new customers. As banks diversified into new products, including investments, credit cards, and insurance, each product got its own operating unit. The advent of the internet did not really change this traditional "brick and mortar" business model. Instead, one would go to the bank's website to transact or sign up for a new product, but on the back end the systems, people, and incentives to sell each product did not change, creating the same disconnected customer experience. Fast forward to today: these product-centric silos continue to exist in big and small banks across the globe, despite CEOs saying they are focused on delivering a better customer experience.

Why is that the case? The other cause is the systems within these product silos, including core banking, loan origination, loan servicing, and brokerage systems, which were never designed to share common information with each other. Traditional retail and consumer banks maintained customer, account, and household information within the Customer Information File (CIF), often part of the core banking system. Primary and secondary account holders would be grouped into a household based on the same last name and mailing address. Unfortunately, CIF systems were mainly used within retail banking. The problem grows exponentially as more systems are adopted to run the business across core business functions and traditional product silos. Each group and its systems managed their own version of the truth, and these environments were never set up to share common data between them.

This is where Master Data Management technology can help.  “Master Data” is defined as a single source of basic business data used across multiple systems, applications, and/or processes.  In banking that traditionally includes information such as:

  • Customer name
  • Address
  • Email
  • Phone
  • Account numbers
  • Employer
  • Household members
  • Employees of the bank
  • Etc.

Master Data Management technology has evolved over the years, starting with Customer Data Integration (CDI) solutions that provided merge and match capabilities between systems, to more modern platforms that govern consistent records and leverage inference analytics to determine relationships between entities across systems within an enterprise. Depending on your business need, there are core capabilities to consider when investing in an MDM platform. They include:

Key functions: what to look for in an MDM solution

  • Capturing existing master data from two or more systems, regardless of source, and creating a single source of the truth for all systems to share. To do this right, you need seamless, real-time access to data regardless of source, format, or system.
  • Defining relationships between entities based on business rules, for example: "Household = same last name, address, and account number" (see the sketch below). These relationship definitions can be complex and change over time, so the ability for business users to create and modify the rules helps grow adoption and scalability across the enterprise.
  • Governing consistency across systems by identifying changes to this common business information, determining whether each change is a unique record, a duplicate, or an update to an existing record, and updating the other systems that rely on that information. As with the first capability, you need the ability to easily deliver updates to dependent systems across the enterprise in real time. A flexible, user-friendly way of managing the master record rules, avoiding heavy IT development, is also important.
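To make the household rule concrete, here is a minimal Python sketch of rule-based grouping, assuming customer records from two silos have already been extracted into plain dicts. The field names and exact-match key are illustrative assumptions; a real MDM platform layers fuzzy matching, survivorship, and stewardship workflows on top of rules like this.

    # Minimal sketch of a rule-based match step. Records and field names
    # are illustrative; a real platform would use fuzzy matching as well.
    from collections import defaultdict

    def household_key(record):
        """Business rule from the example above:
        household = same last name + same address (exact match here)."""
        return (
            record["last_name"].strip().lower(),
            record["address"].strip().lower(),
        )

    def group_households(records):
        """Group candidate records into households using the business rule."""
        households = defaultdict(list)
        for record in records:
            households[household_key(record)].append(record)
        return households

    # Records from two product silos that refer to the same household.
    core_banking = {"last_name": "Smith", "address": "12 Main St", "system": "core"}
    brokerage    = {"last_name": "Smith", "address": "12 Main St", "system": "brokerage"}

    for key, members in group_households([core_banking, brokerage]).items():
        print(key, "->", [m["system"] for m in members])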

Now, what would my experience have been if my bank had a capable Master Data Management solution in place? Let's take a look:

Scenario 1: Start a new mortgage application online

Without MDM: The customer is required to fill out the usual information (name, address, employer, email, phone, existing accounts, etc.).

With MDM: The online banking system references the MDM solution, which delivers the most recent master record for this customer based on existing data from the bank's core banking and brokerage systems, and pre-populates the form with those details, including information for the customer's existing savings and credit card accounts with the bank.

Benefits with MDM:
  • Accelerates new customer on-boarding
  • Mitigates the risk of a competitor grabbing my attention to do business with them

Scenario 2: Customer provides a new email address

Without MDM: The customer enters it on the mortgage application, and it lands only in the bank's loan origination system.

With MDM: MDM recognizes that the email address differs from what exists in other systems and asks the customer to confirm the change. The master record is then updated and shared across the bank's other systems in real time, including the downstream data warehouse used by Marketing to drive cross-sell campaigns.

Benefits with MDM:
  • Ensures every part of the bank shares the latest information about the customer
  • Avoids disruptions to future communication of new products and offers to grow wallet share
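A minimal sketch of the second scenario, assuming a hub that holds the master record and a list of subscribing systems. The system names and update flow are illustrative assumptions, not a real MDM API.

    # Illustrative sketch: the hub detects that an inbound email differs from
    # the master record and fans the update out to subscribing systems.
    master_record = {"customer_id": "C-1989", "email": "old@example.com"}
    subscribers = ["core_banking", "loan_origination", "brokerage", "marketing_dw"]

    def apply_update(master, inbound_email, confirmed_by_customer):
        """Update the master and propagate only real, confirmed changes."""
        if inbound_email == master["email"]:
            return []  # duplicate of the existing value: nothing to do
        if not confirmed_by_customer:
            return []  # hold the change until the customer confirms it
        master["email"] = inbound_email
        # In a real hub this would be a message per system, ideally real time.
        return [(system, master["customer_id"], inbound_email) for system in subscribers]

    for system, cid, email in apply_update(master_record, "new@example.com", True):
        print(f"push to {system}: {cid} -> {email}")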

The banking industry continues to face headwinds from a revenue, risk, and regulatory standpoint. Traditional product-centric silos will not go away anytime soon, and while new CRM and client onboarding solutions may help improve customer engagement and productivity within a firm, front office business applications are not designed to manage and share critical master data across the enterprise. As for me, I decided to bank with another institution that I know has Master Data Management. Are you ready for a new bank too?

For more information on Informatica’s Master Data Management:


The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

Once upon a time, there were three Information Architects working in the financial services industry, each at a different firm and from a different background, but all responsible for recommending the right technology solutions to help their firms comply with industry regulations, including ongoing bank stress testing across the globe. Since 2008, bank regulators have focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew up through the ranks, starting as a Database Administrator, a black belt in SQL and COBOL programming. Hand coding was in his DNA for many years and was thought of as the best approach, given how customized his business and systems were compared with other organizations. As such, Architect #1 and his team went down the path of building their data management capabilities through custom hand-coded scripts and manual data extractions and transformations, dealing with data quality issues through the business organizations after the data was delivered. Though this approach delivered on their short-term needs, the firm soon realized the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a "gadget guy" at heart who grew up using off-the-shelf tools rather than hand coding to manage data. He and his team decided not to hand code their data management processes, instead building their solution from best-of-breed tools, some open source, others from existing solutions the company had acquired in previous projects for data integration, data quality, and metadata management. Though the tools automated much of the "heavy lifting," he and his IT team were still responsible for integrating these point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers but understood the value of building something once to use across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools, he decided to adopt an integrated platform designed to handle the complexity, sources, and volumes of data required by the business. The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a "no brainer."

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn't to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the results, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported. Unfortunately, because of the first architect's decision to hand code the data management processes, IT failed to provide explanations and documentation of what had been done; the developers who created the systems were no longer with the firm. As a result, the bank failed miserably, incurring stiff penalties and higher audit costs.

Architect #2's bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools they had adopted were never designed, nor guaranteed by their vendors, to work with each other, resulting in invalid data mappings, broken data quality rules, and gaps in their technical metadata documentation. As a result, Bank #2 also failed its audit, finding itself with a pile of one-off tools that automated its data management processes but lacked the integration and sharing of rules and metadata needed to satisfy the regulator's demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3's firm. Having seen the results at the first two banks, he was leery of its ability to pass the stress test audit. He presented similar demands, but this time Bank #3 provided detailed and comprehensive metadata documentation of its risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse the data, and detailed information on each counterparty and legal entity used to calculate VaR. Unable to find gaps in the audit, Mr. Wolf, who had expected to "blow" the house down, delivered a passing grade to Bank #3 and its management team, thanks to the right investments made to support its enterprise risk data management needs.

The moral of this story, like that of the familiar tale of the three little pigs, is the importance of a solid foundation to weather market and regulatory storms, or the violent bellow of a big bad wolf: a foundation that covers the required data integration, data quality, master data management, and metadata management needs, but also supports collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today's financial services industry requires firms to have a solid data management platform, one that is intelligent and comprehensive, and that allows Information Architects to mitigate the risks and costs of hand coding or using point tools that only get by in the short term.

Are you prepared to meet Mr. Wolf?


Data Security – A Major Concern in 2015

2014 ended with a ton of hype, expectations, and some drama if you are a data security professional or a business executive responsible for shareholder value. The December attacks on Sony Pictures by North Korea caught everyone's attention, not so much over whether Sony would release "The Interview" but over how vulnerable we as a society are to these criminal acts.

I have to admit, I was one of those who saw the movie; I found the film humorous, to say the least, and can see why a desperate regime like North Korea would not want its leader shown admitting he loves margaritas and Katy Perry. What concerned me about the whole event was whether these unwanted security breaches are now simply a fact of life. As a disclaimer, I have no stake in the downfall of the North Korean government; what was fascinating, and alarming, is that companies like Sony, one of the largest companies in the world, continue to struggle to protect sensitive data.

According to the Identity Theft Resource Center, there were 761 reported data security breaches in 2014, impacting over 83 million records across industries and geographies, with B2B and B2C retailers leading the pack at 79.2% of all breaches. Most of these breaches originated over the internet via malicious worms and viruses purposely designed to identify and relay back sensitive information, including credit card numbers, bank account numbers, and social security numbers, used by criminals to wreak havoc and inflict significant financial losses on merchants and financial institutions. According to the 2014 Ponemon Institute research study:

  • The average cost of cyber-crime per company in the US was $12.7 million this year, and US companies on average are hit with 122 successful attacks per year.
  • Globally, the average annualized cost for the surveyed organizations was $7.6 million per year, ranging from $0.5 million to $61 million per company. Interestingly, small organizations have a higher per-capita cost than large ones ($1,601 versus $437).
  • Some industries incur higher breach costs than others. Energy and utility organizations incur the priciest attacks ($13.18 million), followed closely by financial services ($12.97 million). Healthcare incurs the lowest ($1.38 million).

Despite all the media attention around these awful events last year, 2015 does not look like it will be any better. Just this morning, CNBC reported that Morgan Stanley had disclosed a data security breach: the firm fired an employee it claims stole account data for hundreds of thousands of its wealth management clients, and stolen information for approximately 900 of those clients was posted online for a brief period. With so much to gain from this rich data, businesses across industries have a tough battle ahead as criminals get more creative and desperate to steal sensitive information for financial gain. According to Forrester Research, the top three breach activities were:

  • Inadvertent misuse by insider (36%)
  • Loss/theft of corporate asset (32%)
  • Phishing (30%)

Given the growth in data volumes fueled by mobile, social, cloud, and electronic payments, the war against data breaches will only grow bigger and uglier for firms large and small. Gartner predicts that investments in information security solutions will grow a further 8.2 percent in 2015 over 2014, reaching more than $76.9 billion globally. Furthermore, by 2018, more than half of organizations will use security services firms that specialize in data protection, security risk management, and security infrastructure management to enhance their security postures.

Like any war, you have to know your enemy and what you are defending. In the war against data breaches, that starts with knowing where your sensitive data is before you can effectively defend against any attack. According to the Ponemon Institute, only 18% of surveyed firms said they knew where their structured sensitive data was located, whereas the rest were not sure. 66% revealed that they would not be able to effectively detect whether they had been attacked. Even worse, 47% were not confident they had visibility into which users were accessing sensitive or confidential information, and 48% of those surveyed admitted to a data breach of some kind in the last 12 months.

In closing, the responsibilities of today's information security professionals, from Chief Information Security Officers to security analysts, grow more challenging each day as criminals become more sophisticated and desperate to get their hands on one of your most important assets: your data. As your organization looks to invest in new information security solutions, start with solutions that allow you to identify where your sensitive data is, so you can plan an effective data security strategy that defends both your perimeter and the sensitive data at its source. How prepared are you?
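As a hint of what "knowing where your sensitive data is" can look like in practice, here is a minimal Python sketch of pattern-based discovery over sampled column values. The patterns, threshold, and sample data are simplified illustrations, not how any particular product (including Informatica's) implements discovery.

    # Flag columns whose sampled values look like US Social Security or
    # 16-digit card numbers. Simplified patterns, not production classifiers.
    import re

    PATTERNS = {
        "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "card_number": re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"),
    }

    def classify_column(sample_values, threshold=0.5):
        """Flag a column as sensitive if enough sampled values match a pattern."""
        hits = {}
        for label, pattern in PATTERNS.items():
            matches = sum(1 for v in sample_values if pattern.search(str(v)))
            if sample_values and matches / len(sample_values) >= threshold:
                hits[label] = matches
        return hits

    samples = {"cust_tax_id": ["123-45-6789", "987-65-4321"],
               "notes":       ["called customer", "left voicemail"]}

    for column, values in samples.items():
        print(column, "->", classify_column(values) or "no sensitive pattern")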

For more information about Informatica Data Security Solutions:

  • Download the Gartner Data Masking Magic Quadrant Report
  • Click here to learn more about Informatica’s Data Masking Solutions
  • Click here to access Informatica Dynamic Data Masking: Preventing Data Breaches with Benchmark-Proven Performance whitepaper

Happy Holidays, Happy HoliData.

In case you missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices that are unleashing information potential. Simply scroll and click on the case study that is relevant for you and your business. The series touches on different industries and use cases, but all have one thing in common: all treat information quality as a key value for their business, enabling them to deliver the right services or products to the right customer.

[Gallery: #HappyHoliData case studies 1 through 24]

Thanks a lot to all my great teammates who made this series happen.

Happy Holidays, Happy HoliData.


Are The Banks Going to Make Retailers Pay for Their Poor Governance?

A couple of months ago, I reached out to a set of CIOs on the importance of good governance and security. All of them agreed that both were incredibly important. However, one CIO made a pointed remark: "the IT leadership at these breached companies wasn't stupid." He continued that when selling to the rest of the C-suite, the discussion needs to be about business outcomes and business benefits. For this reason, he said, CIOs have struggled to sell the value of investments in governance and security. Now, I have suggested previously that security pays because of its impact on "brand promise," and I still believe this.

However, this week the ante was raised even higher. A district judge ruled that a group of banks can proceed to sue a retailer for negligence in their data governance and security. The decision could clearly lead to significant changes in the way the cost of fraud is distributed among parties within the credit card ecosystem. Where once banks and merchant acquirers would have shouldered the burden of fraud, this decision paves the way for more card-issuing banks to sue merchants for not adequately protecting their POS systems.

The judge's ruling said that "although the third-party hackers' activities caused harm, [the] merchant played a key role in allowing the harm to occur." The judge also determined that the banks' suit against the merchant was valid because the plaintiffs adequately showed that the retailer failed "to disclose that its data security systems were deficient." This is interesting because it says that security systems should be sufficient and, if they are not, retailers need to inform potentially affected stakeholders of the deficiency. And while taking this step could avoid a lawsuit, it would likely increase the cost of interchange for riskier merchants. This would effectively create a risk premium for retailers that do not adequately govern and protect their IT environments.

There are broad implications for all companies that end up harming customers, partners, or other stakeholders by not keeping their security systems up to snuff. The question is: will this give good governance enough of a business outcome and benefit that businesses will actually want to pay it forward, i.e. invest in good governance and security? What do you think? I would love to hear from you.

Related links

Solutions:

Enterprise Level Data Security

Hacking: How Ready Is Your Enterprise?

Gambling With Your Customer’s Financial Data

The State of Data Centric Security

Twitter: @MylesSuer

If Data Projects Weather, Why Not Corporate Revenue?

Every fall, Informatica sales leadership puts together its strategy for the following year. The revenue target is typically a function of the number of sellers, the addressable market size and key accounts in a given territory, average spend and conversion rate given prior years' experience, and so on. This straightforward math has not changed in probably decades, but it assumes that the underlying data are 100% correct. This data includes:

  • Number of accounts with a decision-making location in a territory
  • Related IT spend and prioritization
  • Organizational characteristics like legal ownership, industry code, credit score, annual report figures, etc.
  • Key contacts, roles and sentiment
  • Prior interaction (campaign response, etc.) and transaction (quotes, orders, payments, products, etc.) history with the firm

Every organization, whether a life insurer, a pharmaceutical manufacturer, a fashion retailer, or a construction company, knows this math and plans on achieving somewhere above 85% of the resulting target. Office locations, support infrastructure spend, compensation, and hiring plans are based on this and communicated accordingly.
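To show how thin the margin for data error is, here is a toy Python version of that planning math. All figures are made up; the point is that each input, from account counts to conversion rates, is itself a piece of data that can be wrong, and any error flows straight into the plan.

    # A toy version of the classic territory planning formula.
    def territory_target(accounts, avg_spend, conversion_rate):
        """Expected revenue for one territory under the planning formula."""
        return accounts * avg_spend * conversion_rate

    territories = {
        "east": territory_target(accounts=120, avg_spend=250_000, conversion_rate=0.08),
        "west": territory_target(accounts=90,  avg_spend=300_000, conversion_rate=0.10),
    }

    plan = sum(territories.values())
    print(f"plan: ${plan:,.0f}, at 85% attainment: ${plan * 0.85:,.0f}")
    # If the account counts or spend figures rest on bad master data,
    # every number downstream of this line is off as well.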

We Are Not Modeling the Global Climate Here

So why is it that, when it is an open secret that the underlying data is far from perfect (accurate, current, and useful) and corrupts outcomes, so few believe that fixing it has any revenue impact? After all, we are not projecting the climate for the next hundred years with a thousand-plus variables.

If corporate hierarchies are incorrect, your spend projections, based on incorrect territory targets, credit terms, and discount strategy, will be off. If every client touch point lacks a complete picture of cross-departmental purchases and campaign responses, your customer acquisition cost will be too high, because you will contact the wrong prospects with irrelevant offers. If billing, tax, or product codes are incorrect, your billing will be off; this is a classic telecommunications example worth millions every month. If your equipment location and configuration data is wrong, maintenance schedules will be incorrect, and every hour of production interruption will cost an industrial manufacturer of wood pellets or oil millions.

Also, if industry leaders enjoy an upsell ratio of 17% and you experience 3%, data will have a lot to do with it (assuming you have no formal upsell policy because it would violate your independent middleman relationships).

The challenge is not whether data can create revenue improvements, but how much, given the other factors: people and process.

Every industry laggard can identify a few FTEs who spend 25% of their time putting one-off data repositories together for some compliance, M&A, customer, or marketing analytics effort. Organic revenue growth from net-new or previously unrealized revenue is what the focus of any data management initiative should be. Don't get me wrong; purposeful recruitment (people), comp plans, and training (process) are important as well. Few people doubt that people and process drive revenue growth. However, few believe the data being fed into these processes has an impact.

This is a head scratcher for me. An IT manager at a US upstream oil firm once told me that it would be ludicrous to think data has a revenue impact. They just fixed data because it was important, so his consumers would know where all the wells are and which ones made a good profit. Isn't that assuming data drives production revenue? (Rhetorical question.)

A CFO at a smaller retail bank said during a call that his account managers know their clients' needs and history, and that there is nothing more good data can add in terms of value. And this happened after twenty other folks at his bank, including his own team, delivered more than ten use cases, of which three were based on revenue.

Hard cost reduction (materials and FTEs) is easy to accept, and cost avoidance is a leap of faith to a degree, but revenue is no less concrete. Otherwise, why not just throw the dice and see how revenue will look next year without a central customer database? Let every account executive in every department gather their own data, structure it the way they want, put it on paper, and make hard copies for distribution to HQ. This is not about paper versus electronic, but about the inability to reconcile data from many sources, which is even harder on paper than electronically.

Have you ever heard of an organization moving back to the Fifties and competing today? That would be a fun exercise. Thoughts, suggestions? I would be glad to hear them.


Has Hadoop Crossed The Chasm? Thoughts About Strata 2014

Well, it's been a little over a week since the Strata conference, so I thought I should give some perspective on what I learned. I think it was summed up at my first meeting, on the first morning of the conference, with a financial services company that has significant experience with Hadoop. The first words out of their mouths were, "Hadoop is hard."

Later in the conference, after a Western Union representative spoke about the company's Hadoop deployment, the speaker was mobbed with end-user questions and comments. The audience was thrilled to hear about an actual operational deployment: not just a sandbox, but an operational Hadoop deployment at a company that is over 160 years old.

The market is crossing the chasm from early adopters who love to hand code (and the macho culture of proving they can do the hard stuff) to more mainstream companies that want to use technology to solve real problems. These mainstream companies aren’t afraid to admit that it is still hard. For the early adopters, nothing is ever hard. They love hard. But the mainstream market doesn’t view it that way.  They don’t want to mess around in the bowels of enabling technology.  They want to use the technology to solve real problems.  The comment from the financial services company represents the perspective of the vast majority of organizations. It is a sign Hadoop is hitting the mainstream market.

More proof we have moved to a new phase? Cloudera announced they were going from shipping six versions a year down to just three. I have been saying for a while that we will know Hadoop is real when the distribution vendors stop shipping every two months and move to a more typical enterprise software release schedule. It isn't that Hadoop engineering efforts have slowed down; the technology is still evolving rapidly. It is just that real customers are telling the Hadoop suppliers that they won't upgrade that fast, because they have real business projects running and can't absorb the churn. So for those of you who are disappointed by the "slow down," don't be. To me, this is news that Hadoop is reaching critical mass.

Technology is closing the gap to allow organizations to use Hadoop as a platform without needing an army of Hadoop experts. That is what Informatica does for data parsing, data integration, data quality, and data lineage (see the recent product announcement). In fact, the number one demo at the Informatica booth at Strata was "end to end" data lineage: following data from the original source all the way through how it was loaded and then transformed within Hadoop. This is purely an enterprise-class capability that becomes more interesting and important when you go into true production.

Informatica’s goal is to hide the complexity of Hadoop so companies can get on with the work of using the platform with the skills they already have in house.  And from what I saw from all of the start-up companies that were doing similar things for data exploration and analytics and all the talk around the need for governance, we are finally hitting the early majority of the market.  So, for those of you who still drop down to the underlying UNIX OS that powers a Mac, the rest of us will keep using the GUI.   To the extent that there are “fit for purpose” GUIs on top of Hadoop, the technology will get used by a much larger market.

So congratulations Hadoop, you have officially crossed the chasm!

P.S. See me on theCUBE talking about a similar topic at: youtu.be/oC0_5u_0h2Q


BCBS 239 – What Are Banks Talking About?

I recently participated in EDM Council panels on BCBS 239 earlier this month in London and New York. The panels consisted of Chief Risk Officers, Chief Data Officers, and information management experts from the financial industry. BCBS 239 sets out 14 key principles requiring banks to aggregate their risk data in ways that allow banking regulators to avoid another 2008 crisis, with a deadline of January 1, 2016. Earlier this year, the Basel Committee on Banking Supervision released findings from a self-assessment by the Globally Systemically Important Banks (G-SIBs) of their readiness against 11 of the 14 BCBS 239 principles.

Given all the investments the banking industry has made in data management and governance practices to improve ongoing risk measurement and management, I was expecting to hear signs of significant progress. Unfortunately, based on what I heard, there is still much work to be done to satisfy BCBS 239. Here is what we discussed in London and New York:

  • It was clear that the "data agenda" has shifted considerably from IT to the business, as evidenced by the number of risk, compliance, and data governance executives in the room. Though it is a good sign that the business is taking more ownership of data requirements, there was limited discussion of the importance of capable data management technology, infrastructure, and architecture to support a successful data governance practice: specifically, capable data integration, data quality and validation, master and reference data management, metadata to support data lineage and transparency, and business glossary and data ontology solutions to govern the terms and definitions of required data across the enterprise.
  • Accessing, aggregating, and streamlining the delivery of risk data from disparate systems remains difficult. Today's point-to-point integrations access the same data from the same systems over and over again, creating points of failure and increasing the cost of maintaining the current state. The idea of replacing those point-to-point integrations with a centralized, scalable, and flexible data hub was clearly recognized as a need, however difficult to envision given the enormous work required to modernize the current state.
  • Data accuracy and integrity continue to be a concern for generating accurate and reliable risk data that meets both normal and stress/crisis reporting requirements. Many in the room acknowledged heavy reliance on manual methods implemented over the years. Automating the integration and onboarding of risk data from disparate systems across the enterprise is important for Principle 3; however, much of what is in place today was built as one-off projects against the same systems, accessing the same data and delivering it to hundreds if not thousands of downstream applications in an inconsistent and costly way.
  • Data transparency and auditability were popular conversation points: the need to provide comprehensive data lineage reports explaining how data is captured, from where, how it is transformed, and how it is used remains a concern, despite advancements in technical metadata solutions, which are often not integrated with existing risk management data infrastructure.
  • Lastly, there were big concerns about the ability to capture and aggregate all material risk data across the banking group and deliver it by business line, legal entity, asset type, industry, region, and other groupings, to support identifying and reporting risk exposures, concentrations, and emerging risks (see the sketch after this list). This master and reference data challenge unfortunately cannot be solved by external data utility providers, because the banks hold legal entity, client, counterparty, and securities instrument data in existing systems, and need the ability to cross-reference any external identifier for consistent reporting and risk measurement.
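As a rough illustration of the aggregation challenge in that last bullet, here is a small pandas sketch that cross-references local counterparty IDs to a master legal entity and rolls exposures up by entity and asset type. The records, identifiers, and cross-reference table are invented for illustration; they stand in for the master data a bank must maintain itself.

    # Roll up exposures from two source systems by legal entity and asset type.
    import pandas as pd

    exposures = pd.DataFrame([
        {"source": "trading", "counterparty": "CPTY-1", "asset_type": "equity", "exposure": 120.0},
        {"source": "lending", "counterparty": "CP_001", "asset_type": "loan",   "exposure": 300.0},
    ])

    # Cross-reference of local counterparty IDs to one master legal entity,
    # the piece external data utilities cannot supply on their own.
    xref = pd.DataFrame([
        {"counterparty": "CPTY-1", "legal_entity": "ACME Holdings"},
        {"counterparty": "CP_001", "legal_entity": "ACME Holdings"},
    ])

    rollup = (exposures.merge(xref, on="counterparty")
              .groupby(["legal_entity", "asset_type"])["exposure"].sum())
    print(rollup)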

To sum it up, most banks admit they have a lot of work to do, specifically to address gaps across their data governance and technology infrastructure. BCBS 239 is the latest and biggest data challenge facing the banking industry, and not just for the G-SIBs: mid-size firms will also be required to provide similar transparency to regional regulators who are adopting BCBS 239 as a framework for their local markets. BCBS 239 is not just a deadline; the principles set forth are a key requirement for banks to ensure they have the right data to manage risk and provide the transparency industry regulators need to monitor systemic risk across the global markets. How ready are you?


The Ones Not Screwing Up with Compliance Will Win

A few weeks ago, a regional US bank asked me to perform some compliance and use case analysis around fixing its data management situation. This bank prides itself on customer service and SMB focus while offering large-bank products. However, it was about a decade behind most banks in modernizing its IT infrastructure to stay operationally on top of things.

[Chart: Bank Efficiency Ratio per AUM (Assets under Management), bankregdata.com]

This included technologies like ESB, BPM, and CRM. The bank was also a sub-optimal user of its EDW and analytics capabilities. Having said all this, there was a commitment to change things up, which is always the needed first step in any recovery program.

THE STAKEHOLDERS

As I conducted my interviews across various departments (listed below), it became very apparent that they were not suffering from data poverty (see prior post) but from a lack of accessibility and use of data.

  • Compliance
  • Vendor Management & Risk
  • Commercial and Consumer Depository products
  • Credit Risk
  • HR & Compensation
  • Retail
  • Private Banking
  • Finance
  • Customer Solutions

FRESH BREEZE

This lack of use occurred across the board. The natural reaction was to throw more bodies and more Band-Aid marts at the problem. Users also started to operate under the assumption that it would never get better; they had resigned themselves to mediocrity. Then some new players came into the organization from various systemically critical banks, and they shook things up.

Here is a list of use cases they want to tackle:

  • The proposition of real-time offers based on customer events, as simple as offering investment products on an unusually high inflow of cash into a deposit account (see the sketch after this list).
  • The use of all mortgage application information to understand debt/equity ratio to make relevant offers.
  • The capture of true product and customer profitability across all lines of commercial and consumer products including trust, treasury management, deposits, private banking, loans, etc.
  • The agile evaluation, creation, testing, and deployment of new terms on existing products and products under development, shortening the product development life cycle.
  • The reduction of wealth management advisors’ time to research clients and prospects.
  • The reduction of unclaimed use tax, insurance premiums, and leases being paid on consumables, real estate, and requisitions due to incorrect status and location of equipment, originating from assets no longer owned, scrapped, or moved to a different department.
  • The more efficient reconciliation between transactional systems and finance, which often uses multiple party IDs per contract change in accounts receivable, while the operating division uses one based on a contract and its addendums.  An example would be vendor payment consolidation, to create a true supplier-spend; and thus, taking advantage of volume discounts.
  • The proactive creation of a central compliance footprint (AML, 314, suspicious activity, CTR, etc.), allowing for quicker turnaround and fewer audit findings from MRAs (matters requiring attention).
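As an illustration of the first use case in this list, here is a minimal Python sketch of an event rule that flags an unusually high cash inflow and routes the customer to an advisor. The threshold (a multiple of the trailing average inflow) and the routing message are assumptions for illustration, not the bank's actual logic.

    # Flag a deposit that dwarfs the account's recent inflow pattern.
    from statistics import mean

    def offer_on_inflow(deposit_history, new_deposit, multiple=5.0):
        """Trigger an offer when a deposit far exceeds the trailing average."""
        if not deposit_history:
            return None
        if new_deposit >= multiple * mean(deposit_history):
            return "route to advisor: discuss investment products"
        return None

    history = [1_200.0, 950.0, 1_100.0]          # typical monthly inflows
    print(offer_on_inflow(history, 250_000.0))   # -> route to advisor: ...
    print(offer_on_inflow(history, 1_000.0))     # -> None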

MONEY TO BE MADE – PEOPLE TO SEE

Adding these up came to about $31 to $49 million annually in cost savings, new revenue, or increased productivity for this bank with $24 billion in total assets.

So now that we know there is money to be made by fixing the data of this organization, how can we realistically roll this out in an organization with many competing IT needs?

The best way to go about this is to attach any kind of data management project to a larger, business-oriented project, like CRM or EDW.  Rather than wait for these to go live without good seed data, why not feed them with better data as a key work stream within their respective project plans?

To summarize my findings, I want to quote three people I interviewed. A lady who recently had to struggle through an OCC audit told me she believes that the banks that can remain compliant at the lowest cost will ultimately win the end game; here she meant particularly tier 2 and 3 size organizations. A gentleman from commercial banking left me with this statement: "Knowing what I know now, I would not bank with us." And the lady from earlier also said, "We engage in spreadsheet kung fu" to bring data together.

Given all this, what would you suggest?  Have you worked with an organization like this? Did you encounter any similar or different use cases in financial services institutions?


Customer Centric Financial Services

The business of financial services is transforming before our eyes. Traditional banking and insurance products have become commoditized. As each day passes, consumers demand increasingly personalized products and services. Social and mobile channels continue to overthrow traditional communication methods. To survive and grow in this complex environment, financial institutions must do three things:

  1. Attract and retain the best customers
  2. Grow wallet share
  3. Deliver top-notch customer experience across all channels and touch points

The finance industry has traditionally been either product-centric or account-centric. However, to succeed in the future, financial institutions must become customer-centric. Becoming customer-centric requires changes to your people, processes, technology, and culture. You must offer the right product or service to the right customer, at the right time, via the right channel. To achieve this, you must ensure alignment between business and technology leaders, and make targeted investments to grow the business, particularly to modernize legacy systems.

To become customer-centric, business executives are investing in Big Data and in legacy modernization initiatives. These investments are helping Marketing, Sales and Support organizations to:

  • Improve conversion rates on new marketing campaigns on cross-sell and up-sell activities
  • Measure customer sentiment on particular marketing and sales promotions or on the financial institution as a whole
  • Improve sales productivity ratios by targeting the right customers with the right product at the right time
  • Identify key indicators that determine and predict profitable and unprofitable customers
  • Deliver an omni-channel experience across all lines of business, devices, and locations

At Informatica, we want to help you succeed. We want you to maximize the value in these investments. For this reason, we’ve written a new eBook titled: “Potential Unlocked – Improving revenue and customer experience in financial services”. In the eBook, you will learn:

  • The role customer information plays in taking customer experience to the next level
  • Best practices for shifting account-centric operations to customer-centric operations
  • Common barriers and pitfalls to avoid
  • Key considerations and best practices for success
  • Strategies and experiences from best-in-class companies

Take a giant step toward Customer-Centricity: Download the eBook now.
