Tag Archives: Enterprise Data Management

And now for the rest of the data…

In the first two issues I looked at the need for states to pay attention to the digital health and safety of their citizens, followed by the oft-forgotten need to understand and protect non-production data. This is data that has often proliferated and been ignored or forgotten.

In many ways, non-production data is simpler to protect. Development and test systems can usually work effectively with realistic but not real PII and realistic but not real volumes of data. Production systems, on the other hand, need the real production data, complete with the wealth of information that enables individuals to be identified – and that presents a huge risk. If and when that data is compromised, deliberately or accidentally, the consequences can be enormous, both in the impact on individual citizens and in the cost of remediation to the state. Many will remember the massive South Carolina data breach of late 2012: over the course of two days a 74 GB database was downloaded and stolen, around 3.8 million taxpayers and 1.9 million dependents had their Social Security information stolen, and 3.3 million bank account details were “lost”. The citizens’ pain didn’t end there, as the company South Carolina picked to help its citizens seems to have tried to exploit the situation.

Encryption protects against theft – unless the key is stolen too

The biggest problem with securing production data is that there are numerous legitimate users and uses of that data, and usually only a small number of potentially malicious or accidental attempts at inappropriate or dangerous access. So the question is: how does a state agency protect its citizens’ sensitive data while ensuring that legitimate uses and users continue – without performance impacts or any disruption of access? Obviously each state needs to make its own determination as to what approach works best for it.

This video does a good job of explaining the scope of the overall data privacy/security problem and also reviews a number of successful approaches to protecting sensitive data in both production and non-production environments. What you’ll find is that database encryption is just the start: it is fine if the database is “stolen” (unless, of course, the key is stolen along with the data). Encryption locks the data away in the same way that a safe protects physical assets, and the same problem exists – if the key is stolen with the safe, all bets are off. And legitimate users can usually open the safe anyway, so it is deliberate or accidental misuse by those users that we need to understand and protect against. Given that the majority of data breaches are “inside jobs”, we need to ensure that authorized users (end-users, DBAs, system administrators and so on) who have legitimate access only see the data they absolutely need, no more and no less.
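To make the “only the data they absolutely need” idea concrete, here is a minimal sketch of role-based dynamic masking at the query layer. The role names, column names and masking rules are invented for illustration – this is not any particular vendor’s product, just the shape of the technique:

```python
# Minimal sketch of dynamic masking: redact sensitive columns on the fly
# depending on who is asking. Roles, columns and rules are illustrative only.

SENSITIVE_COLUMNS = {
    "ssn":          lambda v: "***-**-" + v[-4:],            # keep last 4 digits
    "bank_account": lambda v: "*" * (len(v) - 4) + v[-4:],
    "dob":          lambda v: v[:4] + "-**-**",               # keep year only
}

PRIVILEGED_ROLES = {"fraud_investigator"}  # roles allowed to see real values

def mask_row(row: dict, role: str) -> dict:
    """Return a copy of the row with sensitive fields redacted for non-privileged roles."""
    if role in PRIVILEGED_ROLES:
        return dict(row)
    return {
        col: SENSITIVE_COLUMNS[col](val) if col in SENSITIVE_COLUMNS and val else val
        for col, val in row.items()
    }

record = {"name": "Jane Doe", "ssn": "123456789",
          "bank_account": "000123456789", "dob": "1975-06-01"}
print(mask_row(record, "caseworker"))          # masked view for an ordinary user
print(mask_row(record, "fraud_investigator"))  # full view for a privileged role
```

The point is that the real data never leaves the production store unmasked unless the requester genuinely needs it.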

So we have reached the end of the first series. In the first blog we looked at the need for states to place the same emphasis on the digital health and welfare of their citizens as they do on their physical and mental health. In the second we looked at the oft-forgotten area of non-production (development, testing, QA etc.) data. In this third and final piece we looked at the need for, and some options for, complete protection of production data.


What types of data need protecting?

In my first article on the topic of citizens’ digital health and safety, we looked at the states’ desire to keep their citizens healthy and safe, and at the various laws and regulations they have in place around data breaches and losses. The size and scale of the problem, together with some ideas for effective risk mitigation, are covered in this whitepaper.

The cost and frequency of data breaches continue to rise

Let’s now start delving a little deeper into the situation states are faced with. It’s pretty obvious that citizen data that enables an individual to be identified (PII) needs to be protected. We immediately think of the production data: data that is used in integrated eligibility systems; in health insurance exchanges; in data warehouses and so on. In some ways the production data is the least of our problems; our research shows that the average state has around 10 to 12 full copies of data for non-production (development, test, user acceptance and so on) purposes. This data tends to be much more vulnerable because it is widespread and used by a wide variety of people – often subcontractors or outsourcers, and often the content of the data is not well understood.

Obviously production systems need access to real production data (I’ll cover how best to protect that in the next issue); non-production systems of every sort, on the other hand, do not. Non-production systems most often need realistic, but not real, data and realistic, but not real, data volumes (except perhaps the performance/stress/throughput testing system). What needs to be done? A three-point risk remediation plan is a good place to start.

1. Understand the non-production data: sophisticated data and schema profiling, combined with NLP (Natural Language Processing) techniques, helps identify previously unrecognized PII that needs protecting.
2. Permanently mask the PII so that it is no longer the real data but is realistic enough for non-production use, and make sure the same masking is applied to the attribute values wherever they appear across tables and files (see the sketch after this list).
3. Subset the data to reduce volumes; this limits the size of the risk and also has positive effects on performance, run-times, backups and so on.
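As a rough illustration of points 2 and 3, here is a hypothetical sketch of consistent (persistent) masking plus subsetting. The masking key, field names and subset rule are assumptions for the example; real data-masking tools do far more (format-preserving masking, referential integrity across whole schemas, and so on):

```python
import hashlib
import hmac

MASKING_KEY = b"replace-with-a-secret-key"  # illustrative only

def mask_value(value: str, length: int = 9) -> str:
    """Deterministically pseudonymize a value so the same input always yields
    the same masked output - preserving joins wherever the value appears."""
    digest = hmac.new(MASKING_KEY, value.encode(), hashlib.sha256).hexdigest()
    return str(int(digest, 16))[-length:].zfill(length)  # digits only, fixed length

def mask_table(rows, pii_columns):
    """Apply the same masking rule to every PII column in every row."""
    return [
        {col: mask_value(val) if col in pii_columns else val for col, val in row.items()}
        for row in rows
    ]

def subset_table(rows, keep_ratio=0.1):
    """Keep a deterministic fraction of rows to shrink non-production volumes."""
    step = round(1 / keep_ratio)
    return [row for i, row in enumerate(rows) if i % step == 0]

# The same SSN masks to the same value in both tables, so test joins still work.
payers     = [{"ssn": "123456789", "name": "Jane Doe"}]
dependents = [{"ssn": "123456789", "relation": "self"}]
print(mask_table(payers, {"ssn"}))
print(mask_table(dependents, {"ssn"}))
```

The key property is that masking is applied consistently and permanently: once the non-production copy is created, the real values are simply not there to steal.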

Gartner has just published its 2013 Magic Quadrant for Data Masking, which covers both what it calls static (i.e. permanent or persistent) masking and dynamic masking (more on this in the next issue). As usual, the MQ gives a good overview of the issues behind the technology as well as a review of the position, strengths and weaknesses of the leading vendors.

It is (or at least should be) an imperative that, from the top down, state governments realize the importance and vulnerability of their citizens’ data and put in place a non-partisan plan to prevent future breaches. As the reader might imagine, for any such plan to succeed it needs a combination of cultural and organizational change (getting people to care) and the right technology in place – together these will greatly reduce the risk. In the next and final issue on this topic we will look at the vulnerabilities of production data, and what can be done to dramatically increase its privacy and security.


Data Management Is No Longer A Back Office Function

A front office as defined by Wikipedia is “a business term that refers to a company’s departments that come in contact with clients, including the marketing, sales, and service departments” while a back office is “tasks dedicated to running the company….without being seen by customers.”  Wikipedia goes on to say that “Back office functions can be outsourced to consultants and contractors, including ones in other countries.” Data Management was once a back office activity but in recent years it has moved to the front office.  What changed? (more…)


Building A Better Data Warehouse

The devil, as they say, is in the detail. Your organization might have invested years of effort and millions of dollars in an enterprise data warehouse, but unless the data in it is accurate and free of contradiction, it can lead to misinformed business decisions and wasted IT resources.

We’re seeing an increasing number of organizations confront the issue of data quality in their data warehousing environments in efforts to sharpen business insights in a challenging economic climate. Many are turning to master data management (MDM) to address the devilish data details that can undermine the value of a data warehousing investment.

Consider this: Just 24 percent of data warehouses deliver “high value” to their organizations, according to a survey by The Data Warehousing Institute (TDWI).[1] Twelve percent are low value and 64 percent are moderate value “but could deliver more,” TDWI’s report states. For many organizations, questionable data quality is the reason why data warehouses fall short of their potential. (more…)


Salesforce.com Integration – Inside Out or Outside In?

People have begun to realize that integration is a key requirement for a successful SaaS application deployment. Since a SaaS application like Salesforce CRM resides in the cloud (it is hosted in a data center outside your company’s firewalls), customers need to move their critical corporate data, such as customer, product or pricing information, into Salesforce before they can use the application to provide maximum value to their end users.

They also need to keep the data in Salesforce synchronized with the rest of their on-premise applications in order to maintain operational efficiency and provide timely and accurate information flow throughout the enterprise.

So how difficult is it to integrate Salesforce with on-premise applications, and how should it be done? Well, for starters, salesforce.com provides a complete set of well-documented APIs that make it straightforward for a good software programmer to accomplish that task.
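For a rough sense of what “programming against the APIs” looks like, here is a minimal sketch using the Salesforce REST API to upsert a record from an on-premise system. The instance URL, OAuth access token, API version and the External_Id__c field are placeholders, not values from this post; consult salesforce.com’s API documentation for the specifics of your org:

```python
import requests

# Placeholders - supply your own instance URL, OAuth access token and field names.
INSTANCE_URL = "https://yourInstance.salesforce.com"
ACCESS_TOKEN = "00D...your-oauth-token"
API_VERSION = "v52.0"

def upsert_account(external_id: str, fields: dict) -> dict:
    """Upsert an Account by an external ID field so on-premise records stay
    synchronized with Salesforce without creating duplicates."""
    url = (f"{INSTANCE_URL}/services/data/{API_VERSION}"
           f"/sobjects/Account/External_Id__c/{external_id}")
    response = requests.patch(
        url,
        json=fields,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    response.raise_for_status()
    # Salesforce returns a JSON body on create and an empty body on update.
    return response.json() if response.text else {"status": response.status_code}

# Example: push a customer record mastered in an on-premise ERP into Salesforce.
upsert_account("ERP-000123", {"Name": "Acme Corp", "Phone": "555-0100"})
```

Hand-coding one such call is easy; hand-coding, scheduling, monitoring and error-handling hundreds of them is where the effort really goes.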

Those same APIs are also used by integration vendors to provide more flexible, powerful and manageable solutions that accomplish the same task without requiring the same level of programming skill.   (more…)


IT Does Matter

Just about everyone has heard about Nicholas Carr’s Harvard Business Review article titled IT Doesn’t Matter. I’ve heard many reactions to the paper presented at conferences and in the blog-o-sphere, so I figured it was time to go to the source and purchased IT Doesn’t Matter (HBR OnPoint Enhanced Edition) from Amazon for $6.50 as a downloadable pdf.

The paper is excellent in that it presents not only the original article, first published in 2003, but also no fewer than 14 letters written in response, plus a final reply from Carr. Interestingly, not a single one of the 14 letters agreed with Carr’s fundamental premise (although most of them were polite enough to acknowledge some value in the debate). I don’t know whether it was the editors’ choice to select only negative letters out of the hundreds that I’m sure were submitted, but I suspect these 14 are representative of the masses. So by way of this commentary let me add the 15th letter. (more…)


Is IT Wagging the Organization?

You may need to reorganize your company to take advantage of modern IT architectures such as SOA or MDM. Some people would say this is heresy – that this is a case of the tail wagging the dog and that the IT strategy should echo the business strategy and operating model. True enough – but IT is also “part” of the business and has an obligation to play a role in leading the enterprise into the future.

The fact is that much of the current IT “mess” exists because IT has directly mimicked the business. Any organization that builds systems will naturally build them in a way that supports the organizational structure and the lines of communication that are inherent in its functional silos. The legacy problem that most organizations face (high cost of maintenance, lack of agility to change, poor data quality, unpredictable operations) is a direct result of building systems piecemeal in support of silos.

There is no shortage of technology silver bullets to solve the problem. The current crop of solutions includes SOA, MDM and ERP to name just a few.

There may be another reason why the Gartner hype cycle exists – you know the one: peak of inflated expectations, trough of disillusionment, plateau of productivity. The reality is that most of these technologies are not technologies at all – they are enterprise strategies. And strategies require a C-level commitment that literally starts at the top of the organization – the CEO.

Let’s take a look at an SOA example. A common business service for a bank is a “Credit Decision” service. Credit decisions should be made consistently across the bank in a way that reflects the bank’s policies, market strategy, risk management controls and regulatory constraints in all the legal jurisdictions in which it operates. But rather than implement one credit decisioning engine, the Credit Card business implements one, as does the Mortgage business, the Deposit business, the Line of Credit business, the Personal Loan business, and so on. And if that isn’t enough, since the internet bank is run by a different executive from the retail branch bank, we also have separate decisioning engines for different customer channels. As a result, each of the business groups needs to hire staff to constantly maintain and update the business rules in each of the redundant systems to comply with corporate directives.
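To illustrate the alternative (a purely hypothetical sketch – the request fields and policy rules are invented for the example), the idea is a single decisioning service with one contract that every line of business and every channel calls, so the rules live in one place:

```python
from dataclasses import dataclass

@dataclass
class CreditRequest:
    customer_id: str
    product: str        # "credit_card", "mortgage", "personal_loan", ...
    channel: str        # "branch", "internet", ...
    amount: float
    credit_score: int
    jurisdiction: str

def decide_credit(req: CreditRequest) -> str:
    """One enterprise-wide credit decision service: bank policy, risk controls
    and jurisdictional rules are maintained once, here, not in ten LOB copies."""
    # Illustrative policy only - real rules come from the bank's risk function.
    if req.credit_score < 580:
        return "decline"
    if req.product == "mortgage" and req.amount > 1_000_000:
        return "refer"  # large exposures go to manual underwriting
    return "approve"

# Every LOB and channel calls the same service with the same contract.
print(decide_credit(CreditRequest("C-42", "credit_card", "internet", 5000, 700, "US-SC")))
```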

If you go and ask each of the LOB managers if they need their own decisioning engine, along with all the overhead that goes along with maintaining it, they will likely say no. It doesn’t make sense. But if you try to implement an SOA strategy in a large bank with 10 different decisioning engines, you will have an uphill battle. The first challenge is one of timing. While all 10 groups may in fact be totally on-board with the concept, they all have different priorities in any given year. You may be waiting forever for all the stars to align such that a critical mass of them are prepared to all focus on implementing a generic solution at the same time.

It’s clear from this example that reorganizing the business to create a “credit decisioning function” that serves the enterprise would dramatically simplify the IT solution. The typical pattern, however, is to avoid the hard work of addressing the organizational, people and political issues and instead try to solve the problem “bottom up”, incrementally iterating towards an enterprise solution. The bottom-up approach can in fact work – but don’t make the mistake of thinking it is necessarily easier or more successful. The reason so many organizations are in the SOA trough of disillusionment is exactly because they are building bottom-up and have not addressed the organizational issues.

Let’s take another example – Master Data Management. MDM promises to deliver quality information and one version of the truth for business data. But to achieve this, the organization needs an information owner (or steward) who is accountable for data quality, consistency, currency, security and so on. For most companies that are organized around LOBs or products, who owns customer information? In the absence of a business owner, a common pattern is for IT to step in and presume a stewardship role. Unfortunately (as evidenced by a long list of less-than-ideal CRM implementations) the IT group often does so without an official sanction from the business executives, so the results should not be a big surprise.

This is not to say that any of the technologies inherent in SOA, MDM or ERP are bad or flawed. In fact, they are the result of the smartest people in the industry creating the technologies, and of market forces testing all the innovative ideas and letting the best ones bubble to the surface.

At the risk of repeating myself, the point is that successful implementation is much more than a technology. You can’t “buy” SOA or MDM – these are strategies that demand organizational top-down commitment. While a bottom-up strategy can make meaningful progress and deliver significant results, IT organizations should not be afraid to tackle the “big” issues of organizational design and politics. IT is part of the business. And as IT has become a larger and larger player in most organizations due to the natural shift to an information economy, it would be wrong to think of this as the tail wagging the dog.


The Real Silver Bullet

It’s been almost 5 years since Nicholas Carr started a lively (and sometimes hostile) debate with his Harvard Business Review article IT Doesn’t Matter. And now, just this past fall, Cynthia Rettig added fuel to the fire with her MIT Sloan Management Review paper The Trouble with Enterprise Software. There are indeed serious concerns with the perception of IT, and many business leaders are fed up. You don’t need to look very hard to find evidence of trouble. The Standish Group has been tracking IT project success rates for years and the findings are dismal. The 2004 Standish CHAOS report found that only 29% of all IT projects succeeded, while the remainder were either challenged or failed. Cynthia Rettig writes in her paper:

“As a percentage of total investment, IT rose from 2.6% to 3.5% between 1970 and 1980. By 1990 IT consumed 9%, and by 1999 a whopping 22% of total investment went to IT. Growth in IT spending has fallen off, but it is nonetheless surprising to hear that today’s IT departments spend 70% to 80% of their budgets just trying to keep existing systems running.”

So while IT is gobbling up larger and larger amounts of investment spending, it seems to be doing so in a way that is expensive to maintain and difficult to change.

Consider this: why is it that the Gartner Hype Cycle is such a well-accepted pattern – that technologies go through a cycle that includes the “peak of inflated expectations”, the “trough of disillusionment” and finally the “plateau of productivity”? It seems that the industry has a habit of over-promising and under-delivering, so it’s hard to blame business users for being cynical about any new technology.

By the way, Gartner missed a key phase in their technology hype cycle; the “pit of obscurity”. Some technologies never reach the plateau of productivity and instead fade away until they are just a distant memory of the IT old-timers. Whatever happened to AI (Artificial Intelligence) or 4GL (fourth generation language)? They are now in the pit along with a host of other silver bullets.

I respect both Carr and Rettig for helping to shine a light on the “Moose on the table” (the big ugly thing in the room that no one wants to talk about). If the industry is going to solve the problem, we need to make it visible and tackle it head-on. By the “industry”, I don’t just mean technology suppliers; we need corporate IT groups, academic organizations, standards bodies and government agencies all working together.

Both Carr and Rettig stop short of offering any solutions to the trouble. Carr takes the “necessary evil” point of view and simply suggests spending less on IT and outsourcing as much as possible. This doesn’t really solve the root cause issues and is simply avoiding the problem. Rettig suggests closer communication between business and IT and that executives should educate themselves more about technology. Good advice, but it’s a bit like saying if you want your food cooked right, you need to go to culinary school so that you can cook it yourself.

So what is the solution? At the risk of sounding pretentious, I’d like to suggest that there is a silver bullet solution. The answer is not more technology – we have plenty of technology and, quite frankly, most of the core technical challenges were addressed by the 1980s. Wave after wave of new technologies over the past 20 years has mostly delivered incremental improvements over long-standing practices. Don’t get me wrong, there have been amazing developments in the past 20 years, such as the low cost and tremendous capacity of the modern PC and the World Wide Web, which is arguably the largest application in the world.

What I am talking about are the fundamental disciplines of managing change in an IT environment. Specifically, Information Architecture, Program Management, Systems Integration and Configuration Management. The silver bullet solution to today’s IT complexity is to simply perform these four practice areas in a disciplined fashion. And the interesting thing is you don’t even need to buy the latest IT Management Best Seller to learn the techniques – if you follow the advice of The Mythical Man-Month, first written by Fred Brooks in 1975, you will have 80% of the Silver Bullet solution. In his book, Brooks talks about his insights for how to effectively build large-scale integrated systems that require many teams consisting of hundreds, or even thousands, of staff working in a cohesive manner. His advice is as applicable today for a large-scale enterprise IT department consisting of hundreds or thousands of staff working to build an integrated enterprise system that meets the business needs.

Note that I didn’t list Software Engineering as one of the four pillars of the real Silver Bullet. The reason is that the industry was doing a good job of software engineering 30 years ago and is still doing a good job today. The development languages have changed and the tools have evolved, but fundamentally the discipline of writing software is not a root cause of the trouble that Cynthia Rettig speaks of, or the frustration that Nicholas Carr writes about. The root cause is lack of discipline (read: leadership) in the practice of architecture, integration, program management, and change management. If an organization were to focus on these four practice areas, put its strongest leaders in charge of them, and invest in training and tools for the staff, the hype cycle would disappear.
