Category Archives: Governance, Risk and Compliance
Financial Stability Board Pushes Legal Entity Identifier to the G20 – Vote Expected This Month – What’s Next?
Hot off the press! The Financial Stability Board (FSB) today (June 8, 2012) published a report entitled “A Global Legal Entity Identifier for Financial Markets” for consideration by the G20, responding to the mandate issued at the Cannes Summit ahead of a final vote at the end of the month in Mexico. It sets out 35 recommendations for the development and implementation of the global LEI system. These recommendations are guided by a set of “High Level Principles” which outline the objectives that a global LEI system should meet.
The proposed global Legal Entity Identifier (LEI) is expected to help regulators uniquely identify counterparties across the financial system and monitor the impact of risky counterparties holding positions with banks. Assuming the LEI is approved by the G20 this month, it will be the first such infrastructure standard to be implemented globally, requiring firms to integrate, reconcile and cross-reference the new LEI with existing counterparty identifiers and information, as well as manage accurate and current legal hierarchies. (more…)
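To make that integration work concrete, here is a minimal Python sketch of cross-referencing newly assigned LEIs against existing internal counterparty identifiers. All identifiers, names and the matching rule are invented for illustration; this is not a prescribed approach.

```python
# Minimal sketch: cross-referencing new LEIs with existing internal
# counterparty identifiers. All IDs, names and the matching rule are
# invented for illustration.

internal_counterparties = [
    {"internal_id": "CPTY-0001", "legal_name": "Acme Capital Partners LLC"},
    {"internal_id": "CPTY-0002", "legal_name": "Globex Derivatives Ltd"},
]

lei_records = [
    {"lei": "5493001EXAMPLE000001", "legal_name": "ACME CAPITAL PARTNERS LLC"},
    {"lei": "5493006EXAMPLE000002", "legal_name": "GLOBEX DERIVATIVES LTD"},
]

def normalize(name: str) -> str:
    """Very naive name normalization; real matching needs fuzzier rules."""
    return " ".join(name.upper().split())

# Build a cross-reference from internal IDs to LEIs by legal name.
lei_by_name = {normalize(r["legal_name"]): r["lei"] for r in lei_records}
xref = {
    c["internal_id"]: lei_by_name.get(normalize(c["legal_name"]))
    for c in internal_counterparties
}

for internal_id, lei in xref.items():
    print(internal_id, "->", lei or "UNMATCHED: needs manual reconciliation")
```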
In a May 2012 survey by the Ponemon Institute, 66 percent of respondents said they are not confident their organization would be able to detect the loss or theft of sensitive personal information contained in systems operated by third parties, including cloud providers. In addition, the majority are not confident that their organization would be able to detect the loss or theft of sensitive personal information in their company’s production environment.
Which aspect of data security for your cloud solution is most important?
1. Is it to protect the data in copies of production/cloud applications used for test or training purposes? For example, do you need to secure data in your Salesforce.com Sandbox?
2. Is it to protect the data so that a user will see data based on her/his role, privileges, location and data privacy rules?
3. Is it to protect the data before it gets to the cloud?
Compliance continues to drive people to action, and compliance with contractual agreements, especially for cloud infrastructure, continues to drive investment. In addition, many organizations are supporting Salesforce.com as well as packaged solutions such as Oracle E-Business Suite, PeopleSoft, SAP, and Siebel.
Of the available data protection solutions, tokenization is well established for protecting PCI data while preserving the format and width of a table column. But because many tokenization solutions today require creating database views or changing application source code, it has been difficult for organizations to apply them to packaged applications that don’t allow these changes. In addition, databases and applications take a measurable performance hit to process tokens.
What might work better is to tokenize data dynamically before it gets to the cloud: a transparent layer between the cloud and on-premise data integration would replace the sensitive data with tokens, so no additional application code would be required.
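As a rough illustration of that idea (a toy sketch only, not a description of any vendor’s implementation), the following Python snippet tokenizes a sensitive value in a format- and width-preserving way before the record leaves the on-premise side; the token vault stays on premise, and the cloud only ever sees tokens:

```python
import hmac
import hashlib
import string

SECRET_KEY = b"replace-with-a-managed-secret"   # illustrative only
UPPER = string.ascii_uppercase
_vault = {}  # on-premise token vault: token -> original value

def tokenize(value: str) -> str:
    """Return a token with the same length and character classes as the input."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).digest()
    chars = []
    for i, ch in enumerate(value):
        b = digest[i % len(digest)]
        if ch.isdigit():
            chars.append(str(b % 10))
        elif ch.isalpha():
            chars.append(UPPER[b % 26])
        else:
            chars.append(ch)  # keep separators such as dashes intact
    token = "".join(chars)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    return _vault[token]

record = {"customer": "Jane Doe", "ssn": "123-45-6789"}
record["ssn"] = tokenize(record["ssn"])   # only the token travels to the cloud
print(record)
```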
In the Ponemon survey, most respondents said the best control is to dynamically mask sensitive information based on the user’s privilege level. Encrypting all sensitive information contained in the record ranked as the next best option.
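A minimal sketch of role-based dynamic masking, assuming a simplistic role model (the role names, fields and masking rules here are hypothetical):

```python
def mask_ssn(value: str) -> str:
    """Show only the last four digits, e.g. ***-**-6789."""
    return "***-**-" + value[-4:]

def view_record(record: dict, role: str) -> dict:
    """Return a copy of the record, masking sensitive fields for non-privileged roles."""
    privileged = {"compliance_officer", "dba_admin"}
    if role in privileged:
        return dict(record)
    masked = dict(record)
    masked["ssn"] = mask_ssn(record["ssn"])
    masked["salary"] = "RESTRICTED"
    return masked

record = {"name": "Jane Doe", "ssn": "123-45-6789", "salary": 95000}
print(view_record(record, "support_agent"))       # masked view
print(view_record(record, "compliance_officer"))  # full view
```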
The strange thing is that people recognize there is a problem but are not spending accordingly. In the same survey from Ponemon, 69% of organizations find it difficult to restrict user access to sensitive information in IT and business environments. However, only 33% say they have adequate budgets to invest in the necessary solutions to reduce the insider threat.
Is this an opportunity for you?
Hear Larry Ponemon discuss the survey results in more detail during a CSOonline.com/Computerworld webinar, Data Privacy Challenges and Solutions: Research Findings with Ponemon Institute, on Wednesday, June 13.
Today, agility and timely visibility are critical to the business. No wonder CIO.com states that business intelligence (BI) will be the top technology priority for CIOs in 2012. However, is your data architecture agile enough to handle these exacting demands?
In his blog Top 10 Business Intelligence Predictions For 2012, Boris Evelson of Forrester Research, Inc., states that traditional BI approaches often fall short for the two following reasons (among many others):
- BI hasn’t fully empowered information workers, who still largely depend on IT
- BI platforms, tools and applications aren’t agile enough (more…)
If you haven’t already, I think you should read The Forrester Wave™: Data Virtualization, Q1 2012. For several reasons – one, to truly understand the space, and two, to understand the critical capabilities required of a solution that solves real data integration problems.
At the very outset, let’s clearly define Data Virtualization. Simply put, Data Virtualization is foundational to Data Integration. It enables fast and direct access to the critical data and reports that the business needs and trusts. It is not to be confused with simple, traditional Data Federation. Instead, think of it as a superset which must complement existing data architectures to support BI agility, MDM and SOA. (more…)
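To make the distinction concrete, here is a toy sketch (invented sources and fields, not a description of any product): a virtualized view assembles the logical record at query time from the underlying sources, so consumers see one dataset without the data being physically copied first.

```python
# Toy sketch of a "virtual view": the join across two sources happens at
# query time, and no data is replicated into a new physical store.

crm_source = [
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex Ltd"},
]

billing_source = [
    {"customer_id": 1, "outstanding_balance": 1200.00},
    {"customer_id": 2, "outstanding_balance": 0.00},
]

def virtual_customer_view(customer_id: int) -> dict:
    """Assemble the logical record on demand from both sources."""
    crm = next(r for r in crm_source if r["customer_id"] == customer_id)
    billing = next(r for r in billing_source if r["customer_id"] == customer_id)
    return {**crm, **billing}

print(virtual_customer_view(2))
```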
Data warehouses are applications, so why not manage them like applications? In fact, data grows at a much faster rate in data warehouses, since they integrate data from multiple applications and cater to many different groups of users who need different types of analysis. Data warehouses also keep historical data for a long time, so data grows exponentially in these systems. The infrastructure costs of data warehouses also escalate quickly, since analytical processing on large amounts of data requires big, beefy boxes. Not to mention the software license and maintenance costs of such a large amount of data. Imagine how much backup media is required to back up tens to hundreds of terabytes of data warehouse on a regular basis. But do you really need to keep all that historical data in production?
One of the challenges of managing data growth in data warehouses is that it’s hard to determine which data is actually used, which data is no longer being used, or even if the data was ever used at all. Unlike transactional systems where the application logic determines when records are no longer being transacted upon, the usage of analytical data in data warehouses has no definite business rules. Age or seasonality may determine data usage in data warehouses, but business users are usually loath to let go of the availability of all that data at their fingertips. The only clear cut way to prove that some data is no longer being used in data warehouses is to monitor its usage.
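One way to approach that monitoring is sketched below, assuming the warehouse exposes a query log recording which tables or partitions each query touched; the log format, partition names and lookback window are invented for illustration.

```python
from collections import Counter
from datetime import date, timedelta

# Hypothetical query-log entries: (query_date, partition_touched)
query_log = [
    (date(2012, 5, 2), "sales_2012_q1"),
    (date(2012, 5, 9), "sales_2012_q1"),
    (date(2012, 5, 9), "sales_2011_q4"),
]

def unused_partitions(all_partitions, log, lookback_days=180, today=date(2012, 6, 1)):
    """Return partitions not touched by any query within the lookback window."""
    cutoff = today - timedelta(days=lookback_days)
    recent_hits = Counter(p for d, p in log if d >= cutoff)
    return [p for p in all_partitions if recent_hits[p] == 0]

partitions = ["sales_2012_q1", "sales_2011_q4", "sales_2010_q3"]
print(unused_partitions(partitions, query_log))
# Partitions that surface here are candidates to archive out of production.
```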
The recent Commodity Futures Trading Commission (CFTC) ruling requiring real-time reporting for over-the-counter (OTC) swap trading was decided over the holidays to increase transparency and provide a comprehensive view of the entire swaps market, helping regulators monitor and govern market activities and hedge against increased systemic risk. This ruling is a major change for many companies that had little to no regulatory reporting requirements prior to it. The deadlines for real-time swap reporting are right around the corner, with the first of three deadlines, July 16, 2012, marking the date to commence real-time swap reporting.
Meeting these new reporting requirements poses significant challenges for those impacted by the new ruling that cannot be ignored. Let’s take a look at what they are and how Informatica’s solutions can help overcome these obstacles. (more…)
Recently, one Sunday afternoon, my wife told me her purchased airline reservation for Monday had disappeared. A few days ago it was there, then suddenly it was gone. Apparently she was not alone. Complaints were coming in online. Support was overwhelmed. In fact, Twitter seemed to be the only way to get a quick response, so people were sending their messages and details that way!
Now, this is a great brand that I’m talking about, what I would call the chic brand in the airline industry. They had just experienced a brand-killing event. No one wants to fly an airline where reservations disappear. But the next part stunned me. When my wife finally talked to a manager at the airline, it turned out they expected this would happen! They had just migrated to a new reservation system and another airline that had done this same migration had experienced issues for a few weeks. So the airline not only expected to have similar issues, they scheduled fewer planes to fly in the upcoming weeks! (more…)
With just a few days remaining in what has been an eventful year, I thought I’d take some time to reflect on the world of data quality as I’ve observed it over the past twelve months. While the idea of data quality improvement in general didn’t change much, the way that companies are viewing and approaching it most certainly has. Here are three areas that seemed to come up quite frequently:
Data governance awareness grew
In thinking about all the customer interactions that I was involved in throughout the year, it’s hard to come up with one where the topic of data governance didn’t surface. Whereas before, the topic of data governance only seemed to come up for companies with more mature data management organizations, now it seems everyone is looking to build a governance framework in conjunction with their data quality efforts. Furthermore, while previously the conversation was largely driven by IT, now it’s both IT and business stakeholders that are looking for answers to how data governance can help them drive better business outcomes. In increasingly competitive market conditions, we can only expect this trend to continue. Whether it’s focused on increasing revenue, driving out cost or managing risk and compliance, data quality with data governance is where companies of all sizes are turning to create and sustain a differentiated edge. Trends like big data will only make this need more acute. (more…)
In the first of two videos, Peter Ku, Director of Financial Services Solutions Marketing, Informatica, talks about counterparty data and its challenges.
Specifically, he answers the following questions:
- What is counterparty data and why is it important?
- What are some of the challenges that the global banking industry is facing with their counterparty information?
Similar to the way that a carburetor restrictor plate prevents NASCAR race cars from going as fast as possible by restricting maximum airflow, inefficient messaging middleware prevents IT organizations from processing vital business data as fast as possible.