These days, most of the conversation on Wall Street is about the recent passage of the financial reform bill, which some say represents the most ambitious and thorough overhaul of the laws governing the financial industry since the Great Depression. According to Financial Reform Watch, the new legislation adds several corporate governance and disclosure requirements applicable to companies listed on U.S. stock exchanges and, in some instances, other publicly traded companies, including:
- A requirement for a non-binding shareholder vote on the compensation of specified executive officers and, in certain instances, on golden parachute provisions;
- More stringent rules and disclosure requirements applicable to compensation committees;
- Additional disclosure requirements related to executive compensation;
- The elimination of discretionary voting by brokers in connection with the election of directors, executive compensation issues, and other significant matters;
- Authorization for the SEC to adopt rules related to proxy access; and
- A requirement to adopt clawback policies with respect to the employment arrangements of executives of companies seeking to list on a U.S. stock exchange.
While many of the details of what companies must do to comply will be sorted out over time, one thing is certain: the need for timely, trusted, and relevant data will be more critical than ever for effectively managing risk and ensuring regulatory compliance.
CIOs and enterprise architects should use this opportunity to evaluate how they integrate and manage data, not only to address the recent changes but to prepare their organizations for future ripples in the financial system, which are sure to come. When it comes to integrating and making the right data available to the business, it is striking how many companies continue to rely on traditional hand-coded data integration processes to access and deliver the data needed for risk and compliance. Over the past several months, I have been privy to these conversations as companies focus on identifying the existing data gaps caused by these traditional practices. Many are quantifying the cost and risk of staying with the status quo versus the benefits of a proven data integration infrastructure that can meet the current and future data needs of the business.
Enterprise architects and senior IT decision makers must prepare for the long term by considering a flexible, adaptable data integration architecture that can quickly access and deliver data from any source, in any format, and at any latency, without resorting to traditional hand coding or to tools that require lengthy development processes. The next generation of data integration is the adoption of data services, which provide a higher level of abstraction, a "virtual" data integration layer, to help businesses quickly integrate and access the data required by the systems and applications they depend on, all without the lengthy development cycles that go with traditional, physical data integration. Such flexibility and scalability are critical in cases where time is the real enemy.
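As a rough illustration of the idea, the sketch below shows a minimal "virtual" data integration layer in Python. All names here (`SourceAdapter`, `DataService`, the sample sources) are hypothetical, not any vendor's API: each adapter wraps one source format behind a common interface, so consumers query a single service without knowing where or how the underlying data is stored.

```python
from abc import ABC, abstractmethod
from typing import Dict, Iterator, List

class SourceAdapter(ABC):
    """Common interface every underlying source is wrapped in."""
    @abstractmethod
    def fetch(self) -> List[Dict[str, str]]:
        ...

class InMemoryAdapter(SourceAdapter):
    """Wraps data already held as Python dicts (e.g. from an application)."""
    def __init__(self, rows: List[Dict[str, str]]):
        self.rows = rows
    def fetch(self) -> List[Dict[str, str]]:
        return list(self.rows)

class CsvAdapter(SourceAdapter):
    """Wraps delimited text (e.g. a flat-file extract) behind the same interface."""
    def __init__(self, text: str, delimiter: str = ","):
        self.text = text
        self.delimiter = delimiter
    def fetch(self) -> List[Dict[str, str]]:
        lines = self.text.strip().splitlines()
        header = lines[0].split(self.delimiter)
        return [dict(zip(header, line.split(self.delimiter))) for line in lines[1:]]

class DataService:
    """Single virtual access point that federates all registered sources."""
    def __init__(self):
        self.adapters: List[SourceAdapter] = []
    def register(self, adapter: SourceAdapter) -> None:
        self.adapters.append(adapter)
    def query(self, **filters: str) -> Iterator[Dict[str, str]]:
        # Yield every row, from every source, matching all given filters.
        for adapter in self.adapters:
            for row in adapter.fetch():
                if all(row.get(k) == v for k, v in filters.items()):
                    yield row
```

Adding a new source then means writing one adapter and registering it, rather than re-coding every consumer, which is the flexibility argument in a nutshell.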
Managing data is no longer just an IT responsibility, and those who still believe it is face a greater risk of non-compliance. Data is recognized in many industries as a strategic asset, as important as an organization's people and buildings. Data must be owned and managed by the business, and business people need to be directly involved and in constant collaboration with IT to define the data they need to drive business success. Over the last three years, I have had many conversations about data governance as the "next big thing." Most organizations have struggled to define a solid business case to fund the formation of a data governance practice and the tools required to manage consistent, accurate data.
With the recent passage of the financial reform bill, the consequences of non-compliance should easily justify the need for, and investment in, a data governance program. While a successful data governance program requires well-defined roles, processes, and policies, it is also important to have the right tools for managing data quality and business metadata, tools that are easy for business analysts and data stewards to use and that promote seamless collaboration with IT throughout the data governance lifecycle.
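To make the tooling point concrete, here is a hypothetical sketch of what steward-defined data quality rules might look like: the rules themselves are small, declarative, and owned by the business, while IT owns only the tiny engine that applies them. The field names and the country reference list are illustrative assumptions, not real standards mappings.

```python
import re

# Assumed rules a business steward might define for counterparty records.
# "lei" here assumes the 20-character alphanumeric Legal Entity Identifier shape;
# the country list is a stand-in for a managed reference table.
RULES = {
    "lei": lambda v: bool(re.fullmatch(r"[A-Z0-9]{20}", v or "")),
    "country": lambda v: v in {"US", "GB", "DE", "JP"},
    "exposure": lambda v: v is not None and float(v) >= 0,
}

def validate(record):
    """Return the list of fields that fail their rule (empty means clean)."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]
```

Because the rules are data rather than code, a steward can tighten or extend them without a development cycle, which is exactly the kind of business/IT collaboration the governance lifecycle depends on.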
Lastly, achieving that "single view" of the business will become even more critical, as traditional business silos and islands of applications cause the information fragmentation that undermines risk and compliance management. Investing in a centralized reference data architecture to manage consistent, holistic reference data, including counterparty and client data, will be essential to ensure that existing risk and compliance systems across the enterprise share the same information for ongoing credit risk management and regulatory compliance reporting.
This is where master data management (MDM) plays a pivotal role: it ensures consistent business information across the existing systems, databases, and applications that reside in heterogeneous operating environments, giving organizations reference data that is consistent, accurate, and managed by the business. Not all MDM solutions are the same; the key to MDM adoption is the ability of data stewards to manage often complex hierarchies without going through a full development cycle with IT.
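One core MDM mechanic can be sketched in a few lines: merging duplicate counterparty records from several systems into a single "golden" record using survivorship rules. The source names and trust ordering below are hypothetical assumptions for illustration; real MDM products offer far richer, steward-configurable survivorship.

```python
# Assumed trust order: attributes from earlier sources win over later ones.
SOURCE_PRECEDENCE = ["crm", "trading", "legacy"]

def golden_record(records):
    """Merge duplicate records for one counterparty into a golden record.

    records: list of (source_name, attribute_dict) pairs.
    Each attribute is taken from the most trusted source that supplies
    a non-empty value for it.
    """
    ranked = sorted(records, key=lambda r: SOURCE_PRECEDENCE.index(r[0]))
    merged = {}
    for _, attrs in ranked:
        for key, value in attrs.items():
            if value not in (None, "") and key not in merged:
                merged[key] = value
    return merged
```

The point of the sketch is the governance angle: the precedence list is a business decision a steward should be able to change, not something buried in application code.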
Are you ready for the new financial reform bill?