Category Archives: Architects

“It’s not you, it’s me!” – says Data Quality to Big Data

I couldn’t help but start this blog with George Costanza’s “You are giving me the – It’s not you, it’s me! – routine? I invented – It’s not you, it’s me …”

The thing that resonates today, in the odd context of big data, is that we may all need to look in the mirror, hold a thumb drive full of information in our hands, and concede once and for all: it’s not the data… it’s us.

Many organizations have a hard time making something useful from the ever-expanding universe of big-data, but the problem doesn’t lie with the data: It’s a people problem.

The contention is that big-data is falling short of the hype because people are:

  1. too unwilling to create cultures that value standardized, efficient, and repeatable information, and
  2. too complex to be reduced to “thin data” created from digital traces.

Evan Stubbs describes poor data quality as the data analyst’s single greatest problem.


About the only satisfying thing about having bad data is the schadenfreude that goes along with it. There’s cold solace in knowing that regardless of how poor your data is, everyone else’s is equally as bad. The thing is poor quality data doesn’t just appear from the ether. It’s created. Leave the dirty dishes for long enough and you’ll end up with cockroaches and cholera. Ignore data quality and eventually you’ll have black holes of untrustworthy information. Here’s the hard truth: we’re the reason bad data exists.


I will tell you that most data teams make “large efforts” to scrub their data. Those “infrequent” big cleanups, however, only treat the symptom, not the cause – and ultimately lead to inefficiency, cost, and even more frustration.

It’s intuitive and natural to think that data quality is a technological problem. It’s not; it’s a cultural problem. The real answer is that you need to create a culture that values standardized, efficient, and repeatable information.

If you do that, then you’ll be able to create data that is re-usable, efficient, and high quality. Rather than trying to manage a shanty of half-baked source tables, effective teams put the effort into designing, maintaining, and documenting their data. Instead of being a one-off activity, it becomes part of business as usual, something that’s simply part of daily life.
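To make “part of daily life” concrete, here is a minimal sketch of a routine, automated quality check, assuming pandas and entirely made-up column names. The point is not these specific checks but that they run with every load, not once a quarter.

```python
# A minimal sketch of a routine data quality check; the DataFrame and
# column names are illustrative assumptions, not a real schema.
import pandas as pd

def daily_quality_report(df: pd.DataFrame) -> dict:
    """Cheap checks that run with every load, not as a one-off cleanup."""
    return {
        "row_count": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "null_rate_by_column": df.isna().mean().round(3).to_dict(),
    }

customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", None, None, "d@example.com"],
})
print(daily_quality_report(customers))
# {'row_count': 4, 'duplicate_rows': 1, 'null_rate_by_column': {'customer_id': 0.0, 'email': 0.5}}
```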

However, even if that data is the best it can possibly be, is it even capable of delivering on the big-data promise of greater insights about things like the habits, needs, and desires of customers?

Despite the enormous growth of data and the success of a few companies like Amazon and Netflix, “the reality is that deeper insights for most organizations remain elusive,” write Mikkel Rasmussen and Christian Madsbjerg in a Bloomberg Businessweek blog post that argues “big-data gets people wrong.”


Big-data delivers thin data. In the social sciences, we distinguish between two types of human behavior data. The first – thin data – is from digital traces: He wears a size 8, has blue eyes, and drinks pinot noir. The second – rich data – delivers an understanding of how people actually experience the world: He could smell the grass after the rain, he looked at her in that special way, and the new running shoes made him look faster. Big-data focuses solely on correlation, paying no attention to causality. What good is thin “information” when there is no insight into what your consumers actually think and feel?


Accenture reported only 20 percent of the companies it profiled had found a proven causal link between “what they measure and the outcomes they are intending to drive.”

Now, I contend the keys to transforming big data into strategic value are critical thinking skills.

Where do we get such skills? People, it seems, are both the problem and the solution. Are we failing on two fronts: failing to create the right data-driven cultures, and failing to interpret the data we collect?

Twitter @bigdatabeat


Rising DW Architecture Complexity

I was talking to an architect-customer last week at a company event, and he was describing how his enterprise data warehouse architecture was getting much more complex after many years of relative calm and stability.  In days of yore, you had some data sources, a data warehouse (with a single database), and some related edge systems.

The current trend is that new types of data and new types of physical storage are changing all of that.

When I got back from my trip, I found a TDWI white paper by Philip Russom that describes the situation very well, detailing his research on the subject: Evolving Data Warehouse Architectures in the Age of Big Data.

From an enterprise data architecture and management point of view, this is a very interesting paper.

  • First, DW architectures are getting more complex because of all the new physical storage options available:
    • Hadoop – very large scale and inexpensive
    • NoSQL DBMS – beyond tabular data
    • Columnar DBMS – very fast seek time
    • DW Appliances – very fast / very expensive
  • What is driving these changes is the rapidly increasing complexity of data. Data volume has captured the imagination of the press, but it is really the rising complexity of data types that is going to challenge architects.
  • But here is what really jumped out at me. When the survey asked people what the important components of their data warehouse architecture are, the answer came back: standards and rules.  Specifically, they meant how data is modeled, how data quality metrics are created, metadata requirements, interfaces for data integration, and so on.

The conclusion for me, from this part of the survey, was that business strategy is requiring more complex data for better analyses (example: real-time response or proactive recommendations) and business processes (example: advanced customer service).  This, in turn, is driving IT to look into more advanced technology to deal with different data types and different use cases for the data.  And finally, the way they are dealing with the exploding complexity is through standards, particularly data standards.  If you are dealing with increasing complexity and have to do it better, faster, and cheaper, the only way you are going to survive is by standardizing as much as reasonably makes sense.  But not a bit more.
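To illustrate what a data standard can look like in practice, here is a minimal sketch, with hypothetical field names and types, of one small standard expressed as an explicit contract so conformance can be checked automatically rather than enforced by convention.

```python
# A minimal sketch of one small data standard expressed as code.
# The required fields and types are hypothetical examples.
REQUIRED_FIELDS = {"customer_id": int, "email": str, "created_at": str}

def violations(record: dict) -> list:
    """Return a list of violations of the standard (empty means it conforms)."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append("missing field: " + field)
        elif not isinstance(record[field], expected_type):
            problems.append("wrong type for " + field)
    return problems

print(violations({"customer_id": "42", "email": "a@example.com"}))
# ['wrong type for customer_id', 'missing field: created_at']
```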

If you think about it, it is good advice.  Get your data standards in place first.  It is the best way to manage the data and technology complexity.  …And a chance to be the driver rather than the driven.

I highly recommend reading this white paper.  There is far more in it than I can cover here. There is also a Philip Russom webinar on DW Architecture that I recommend.


Solution Patterns for IoT Now

Analysing Internet of Things (IoT)

I have worked with several clients in the Internet of Things space over the last year and really enjoyed all of the engagements.

First, I am not a fan of the term IoT/Internet of Things. It just seems a bit too pie in the sky and marketing-driven. It reminds me a lot of the people who put an “e” or “i” in front of everything in the late 90s and early 00s. To me, from an enterprise perspective, this is about expanded data integration use cases (e.g., more end points you can choose to access), data filtering and processing (e.g., identifying the data you actually care about), and workflow/BPM (can you automate tasks and actions based on data, or on analysis of data?).

There are definitely advancements in technology that are making for some very interesting solutions. My Nest thermostat is really cool, but it’s not really changing the world as one might think from some of the IoT frenzy of the last few years. From what I have seen, I think there are three main real-world solutions that fit under the concept of IoT.

1) Passive monitoring. This amounts to data collection and filtering. Lots of the consumer-facing solutions fall into this category: wearables (which we are told are super hot), or just the huge amount of big data that gets collected and then churned and analyzed, or simply sits as it builds up. A big issue here is that there is a lot of data to collect from an ever-growing set of end points, but more data is not always useful if a company has not set up a process to filter and identify the data that actually matters. I think the impact on individuals is more real than on companies in this segment. I know people who swear they live better because of the data from their CPAP, for example.

2) Active monitoring. Most of these use cases fall into alerting or rule-based workflow. There are examples of companies taking existing solutions, or evolving them, to use real-time or near real-time data to drive workflow or alerts that make sure someone actually does something. My next write-up will focus on an example in this space, where a company has created some really great technology to track usage of a product so it can provide a real-time view of inventory and then drive either automated replacement orders or workflow for people to do something, like order more. (A minimal sketch of this pattern appears after this list.)

3) Automated response. To me, a lot of the so-called IoT use cases here are a re-branding of solutions that have been around for years, except now there is a mobile client. This is where all the security, energy (e.g., smart meters), and home automation offerings fit.
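Here is the sketch promised above: the “active monitoring” pattern reduced to its essence, a reading that crosses a threshold and triggers a workflow action. The device name, threshold, and reorder function are all hypothetical.

```python
# A minimal sketch of rule-based active monitoring; names, thresholds,
# and the reorder action are hypothetical.
REORDER_THRESHOLD = 10

def trigger_reorder(device_id: str, quantity: int) -> None:
    # In a real system this would call an ordering API or open a task
    # in a workflow tool; here we just print.
    print("reorder %d units for %s" % (quantity, device_id))

def on_reading(device_id: str, units_remaining: int) -> None:
    # Passive monitoring would only store/filter the reading;
    # active monitoring adds a rule that drives an action.
    if units_remaining < REORDER_THRESHOLD:
        trigger_reorder(device_id, quantity=50)

on_reading("dispenser-7", units_remaining=3)  # -> reorder 50 units for dispenser-7
```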

Over the next 10 years I could see additional patterns becoming real, but a lot of the landscape is more hope than reality when it comes to IoT, whether from an enterprise point of view or in terms of how it really impacts a person’s life. Of course, I would expect other people to break down the use cases differently, and I would love to hear your point of view.

(Note: IoT Landscape Chart is re-posted from work by First Mark Capital’s Matt Turck)


Garbage In, Garbage Out? Don’t Take Data for Granted in Analytics Initiatives!

The verdict is in. Data is now broadly perceived as a source of competitive advantage. We all feel the heat to deliver good data. It is no wonder organizations view Analytics initiatives as highly strategic. But the big question is, can you really trust your data? Or are you just creating pretty visualizations on top of bad data?

We also know there is a shift towards self-service Analytics. But did you know that according to Gartner, “through 2016, less than 10% of self-service BI initiatives will be governed sufficiently to prevent inconsistencies that adversely affect the business”?[1] This means that you may actually show up at your next big meeting with data that contradicts your colleague’s data.  Perhaps you are not working off the same version of the truth. Maybe you have siloed data on different systems that are not working in concert? Or is your definition of ‘revenue’ or ‘leads’ different from your colleague’s?

So are we taking our data for granted? Are we just assuming that it’s all available, clean, complete, integrated and consistent?  As we work with organizations to support their Analytics journey, we often find that the harsh realities of data are quite different from perceptions. Let’s further investigate this perception gap.

For one, people may assume they can easily access all data. In reality, if data connectivity is not managed effectively, we often need to beg, borrow, and steal to get the right data from the right person – if we are lucky. In less fortunate scenarios, we may need to settle for partial data or a cheap substitute for the data we really wanted. And you know what they say: the only thing worse than no data is bad data. Right?

Another common misperception is: “Our data is clean. We have no data quality issues.”  Wrong again.  When we work with organizations to profile their data, they are often quite surprised to learn that their data is full of errors and gaps.  One company recently discovered, within one minute of starting their data profiling exercise, that millions of their customer records contained the company’s own address instead of the customers’ addresses… Oops.
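That kind of discovery takes surprisingly little tooling. A minimal sketch, assuming pandas and fabricated data: flag any single address that accounts for an implausibly large share of customer records.

```python
# A minimal profiling sketch; the data and the 10% threshold are made up.
import pandas as pd

customers = pd.DataFrame({"address": [
    "100 Vendor HQ Blvd",   # the vendor's own address, over and over
    "100 Vendor HQ Blvd",
    "12 Elm St",
    "100 Vendor HQ Blvd",
]})

share = customers["address"].value_counts(normalize=True)
print(share[share > 0.10])  # 100 Vendor HQ Blvd    0.75
```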

Another myth is that all data is integrated.  In reality, your data may reside in multiple locations: in the cloud, on premises, in Hadoop, on mainframes, and anywhere in between. Integrating data from all these disparate and heterogeneous data sources is not a trivial task, unless you have the right tools.

And here is one more consideration to mull over. Do you find yourself manually hunting down and combining data to reproduce the same ad hoc report over and over again? Perhaps you often find yourself doing this in the wee hours of the night? Why reinvent the wheel? It would be more productive to automate the process of data ingestion and integration for reusable and shareable reports and Analytics.
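The fix for the recurring late-night report is to define it once as a parameterized, reusable step. A minimal sketch, where the file path and column names are assumptions for illustration:

```python
# A minimal sketch of a reusable report definition; the path and
# column names are assumptions for illustration.
import pandas as pd

def monthly_revenue_by_region(sales_path: str, month: str) -> pd.DataFrame:
    """Same logic every run -- schedule it instead of rebuilding it by hand."""
    sales = pd.read_csv(sales_path, parse_dates=["order_date"])
    in_month = sales[sales["order_date"].dt.strftime("%Y-%m") == month]
    return in_month.groupby("region", as_index=False)["revenue"].sum()

# monthly_revenue_by_region("sales.csv", "2015-03")  # same logic, any month
```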

Simply put, you need great data for great Analytics. We are excited to host Philip Russom of TDWI in a webinar to discuss how data management best practices can enable successful Analytics initiatives. 

And how about you?  Can you trust your data?  Please join us for this webinar to learn more about building a trust-relationship with your data!

[1] Gartner Report, “Predicts 2015: Power Shift in Business Intelligence and Analytics Will Fuel Disruption”; authors: Josh Parenteau, Neil Chandler, Rita L. Sallam, Douglas Laney, Alan D. Duncan; November 21, 2014.

Is Your Data Ready to Maximize Value from Your CRM Investments?

A friend of mine recently reached out to me for advice on CRM solutions in the market.  Though I have never worked for a CRM vendor, I have experience ranging from working for companies that implemented such solutions to my current role interacting with large and small organizations about the data requirements that support ongoing application investments across industries. As we spoke, memories started to surface from when he and I worked on implementing Salesforce.com (SFDC) many years ago – memories we wanted to forget, but important to call out given his new situation.

We worked together at a large mortgage lending software vendor selling loan origination solutions to brokers and small lenders, mainly through email and snail mail marketing.  He was responsible for Marketing Operations, and I ran Product Marketing. The company looked at Salesforce.com to help streamline our sales operations and improve how we marketed to and serviced our customers.  The existing CRM system was from the early 90s, and though it did what the company needed it to do, it was heavily customized, costly to operate, and had served its useful life. It was time to upgrade, to help grow the business, improve business productivity, and enhance customer relationships.

Ninety days after rolling out SFDC, we ran into some old familiar problems across the business.  Sales reps continued to struggle to know who was a current customer using our software, marketing managers could not create quality mailing lists for prospecting purposes, and call center reps could not tell whether the person on the other end was a customer or a prospect. Everyone wondered why this was happening, given that we had adopted the best CRM solution in the market.  You can imagine the heartburn and ulcers we all had after making such a huge investment in our new CRM solution.  C-level executives were questioning our decisions and blaming the applications. The truth was, the issues were not related to SFDC but to the data we had migrated into the system, and to the lack of proper governance and a capable information architecture to support the required data management integration between systems.

During the implementation phase, IT imported our entire customer database of 200K+ unique customer entities from the old system into SFDC. Unfortunately, the mortgage industry was very transient: on average there were only roughly 55K licensed mortgage brokers and lenders in the market, and because no one had ever validated who was really a customer vs. someone who had ever bought our product, we had serious data quality issues, including:

  • Trial users who purchased evaluation copies of our products that had expired were tagged as current customers
  • Duplicate records, caused by manual data entry errors – companies with similar names entered slightly differently but sharing the same business address – were tagged as unique customers (see the sketch after this list)
  • Subsidiaries of parent companies in different parts of the country were again tagged as unique customers
  • Lastly, we imported the marketing contact database of prospects, which were incorrectly accounted for as customers in the new system
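As promised in the list above, here is a minimal sketch of how the duplicate problem can be caught: normalize name and address into a match key so trivially different entries collapse together. The records are fabricated, and real matching (fuzzy, phonetic, MDM-grade) goes much further than this.

```python
# A minimal duplicate-detection sketch using a normalized match key;
# the sample records are fabricated.
import re

def match_key(name: str, address: str) -> str:
    clean = lambda s: re.sub(r"[^a-z0-9]", "", s.lower())
    return clean(name) + "|" + clean(address)

records = [
    ("Acme Mortgage Inc.", "55 Main St"),
    ("ACME Mortgage, Inc", "55 Main St."),  # same firm, typed differently
    ("Bay Lending LLC", "9 Shore Rd"),
]

seen = {}
for name, addr in records:
    seen.setdefault(match_key(name, addr), []).append(name)

print({k: v for k, v in seen.items() if len(v) > 1})
# {'acmemortgageinc|55mainst': ['Acme Mortgage Inc.', 'ACME Mortgage, Inc']}
```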

We also failed to integrate real-time purchasing data and information from our procurement systems for sales and support to handle customer requests. Instead of integrating that data in real time with proper technology, IT manually loaded these records at the end of each week via FTP, resulting in incorrect billing information, statement processing errors, and a ton of complaints from customers through our call center. The price we paid for not paying attention to our data quality and integration requirements before we rolled out Salesforce.com was significant for a company of our size. For example:

  • Marketing got hit pretty hard. Each quarter we mailed evaluation copies of new products to our customer database of 200K, each costing the company $12 to produce and mail. Total cost = $2.4M annually.  Because we had such bad data, 60% of our mailings came back due to invalid addresses or wrong contact information. The cost of bad data to marketing = $1.44M annually.
  • Next, Sales struggled miserably when trying to upgrade customers by running cold call campaigns using the names in the database. As a result, sales productivity dropped by 40%, and we experienced over 35% sales turnover that year. Within a year of using SFDC, our head of sales was let go. Not good!
  • Customer support used SFDC to service customers, and our average call times were 40 minutes per service ticket. We believed that was “business as usual” until we surveyed how reps were spending their time each day, and over 50% said it was dealing with billing issues caused by bad contact information in the CRM system.
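The marketing math above is worth spelling out, since it is how you justify fixing data quality in the first place (figures taken directly from the story):

```python
# Reproducing the story's marketing arithmetic as a quick sanity check.
customers_mailed = 200_000
cost_per_piece = 12                                   # dollars to produce and mail
annual_mail_cost = customers_mailed * cost_per_piece  # $2,400,000
returned_rate = 0.60                                  # invalid address / wrong contact
cost_of_bad_data = annual_mail_cost * returned_rate   # $1,440,000
print(f"total ${annual_mail_cost:,}, wasted ${cost_of_bad_data:,.0f}")
```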

At the end of our conversation, this was my advice to my friend:

  • Conduct a data quality audit of the systems that will interact with the CRM system. Audit how complete your critical master and reference data is, including names, addresses, customer IDs, etc. (a minimal sketch of such a check follows this list).
  • Do this before you invest in a new CRM system. You may find that many of the challenges with your existing applications are caused by data gaps rather than by the legacy application.
  • If there is a data governance program, involve it in the CRM initiative to ensure the team understands your requirements and can help.
  • However, if you do decide to modernize, collaborate with and involve your IT teams, especially your Application Development teams and your Enterprise Architects, to ensure all of the best options are considered for your data sharing and migration needs.
  • Lastly, consult with your technology partners, including your new CRM vendor; they may be working with solution providers to help address these data issues, as you are probably not the only one in this situation.
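For the audit suggested in the first bullet, a completeness check can be as simple as measuring the populated share of each critical field. A minimal sketch with fabricated fields and records:

```python
# A minimal completeness audit; fields and records are fabricated.
import pandas as pd

CRITICAL_FIELDS = ["name", "address", "customer_id"]

masters = pd.DataFrame({
    "name": ["Acme", None, "Bay Lending"],
    "address": ["55 Main St", "9 Shore Rd", None],
    "customer_id": ["C-1", "C-2", "C-3"],
})

completeness = (1 - masters[CRITICAL_FIELDS].isna().mean()).round(2)
print(completeness)
# name           0.67
# address        0.67
# customer_id    1.00
```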

Looking Ahead!

CRM systems have come a long way in today’s Big Data and Cloud era. Many firms are adopting more flexible solutions offered through the cloud, like Salesforce.com, Microsoft Dynamics, and others. Regardless of how old or new, on premise or in the cloud, companies invest in CRM not just to serve their sales teams or increase marketing conversion rates, but to improve their business relationships with their customers. Period! It’s about ensuring the data in these systems is trustworthy, complete, up to date, and actionable, to improve customer service and help drive sales of new products and services that increase wallet share. So how do you maximize the business potential of these critical business applications?

Whether you are adopting your first CRM solution or upgrading an existing one, keep in mind that Customer Relationship Management is a business strategy, not just a software purchase. It’s also about having a sound and capable data management and governance strategy supported by people, processes, and technology to ensure you can:

  • Access and migrate data from old systems to new, avoiding development cost overruns and project delays.
  • Identify, detect, and distribute transactional and reference data from existing systems into your front-line business applications in real time!
  • Manage data quality errors, including duplicate records and invalid names and contact information, through proper data governance and proactive data quality monitoring and measurement during and after deployment.
  • Govern and share authoritative master records of customer, contact, product, and other master data between systems in a trusted manner.

Will your data be ready for your new CRM investments?  To learn more:

Follow me on Twitter @DataisGR8


How a Business-led Approach Displaces an IT-led Project

In my previous blog, I talked about how a business-led approach can displace technology-led projects, which historically have invested significant capital while returning minimal business value. That post also discussed why transformation roadmap execution is sustainable when the business drives the effort, with initiative investments directly traceable to priority business goals.

For example, suppose an insurance company wants to improve the overall customer experience. Mature business architecture will perform an assessment to highlight all customer touch points. It requires a detailed capability map; fully formed, customer-triggered value streams; value stream/capability cross-mappings; and stakeholder/value stream cross-mappings. These business blueprints allow architects and analysts to pinpoint customer trigger points, customer interaction points, and the participating stakeholders engaged in value delivery.

One must understand that value streams and capabilities are not tied to business units or other structural boundaries. This means that while the analysis performed in our customer experience example may have been initiated by a given business unit, the analysis may be applied universally to all business units, product lines, and customer segments. Using the business architecture to provide a representative cross-business perspective requires incorporating organization mapping into the mix. A minimal illustration of these cross-mappings follows.
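Here is a minimal, illustrative model of the cross-mappings described above: value streams mapped to capabilities and stakeholders, queried to pinpoint customer touch points. All the names are hypothetical; a real business architecture is far richer.

```python
# A toy model of value stream / capability / stakeholder cross-mappings.
# All names are hypothetical examples for an insurance scenario.
value_streams = {
    "Settle Claim": {
        "capabilities": ["Claim Intake", "Policy Verification", "Payment"],
        "stakeholders": ["Policyholder", "Claims Adjuster"],
    },
    "Onboard Customer": {
        "capabilities": ["Identity Verification", "Policy Issuance"],
        "stakeholders": ["Applicant", "Underwriter"],
    },
}

# Pinpoint customer touch points: value streams where a customer-type
# stakeholder participates in value delivery.
customer_roles = {"Policyholder", "Applicant"}
touch_points = [vs for vs, m in value_streams.items()
                if customer_roles & set(m["stakeholders"])]
print(touch_points)  # ['Settle Claim', 'Onboard Customer']
```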

Incorporating the application architecture into the analysis and proposed solution is simply an extension of business architecture mapping that incorporates the IT architecture. Robust business architecture is readily mapped to the application architecture, highlighting enterprise software solutions that automate various capabilities, which in turn enable value delivery. Bear in mind, however, that many of the issues highlighted through a business architecture assessment may not have corresponding software deployments since significant interactions across the business tend to be manual or desktop-enabled. This opens the door to new automation opportunities and new ways to think about business design solutions.

Building and prioritizing the transformation strategy and roadmap is dramatically simplified once all business perspectives needed to enhance customer experience are fully exposed. For example, if customer service is a top priority, then that value stream becomes the number one target, with each stage prioritized based on business value and return on investment. Stakeholder mapping further refines design approaches for optimizing stakeholder engagement, particularly where work is sub-optimized and lacks automation.

Capability mapping to underlying application systems and services provides the basis for establishing a corresponding IT deployment program, where the creation and reuse of standardized services becomes a focal point. In certain cases, a comprehensive application and data architecture transformation becomes a consideration, but in all cases, any action taken will be business and not technology driven.

Once this occurs, everyone will focus on achieving the same goals, tied to the same business perspectives, regardless of the technology involved.

Twitter @bigdatabeat


The Similarities between Mixed Martial Arts Fighters and Information Architects


I have to admit I am a huge and longtime Mixed Martial Arts (MMA) fan.  Even before MMA became mainstream, I followed and studied traditional martial arts – Tae Kwon Do, Judo, boxing, wrestling, and jujitsu – from the time I was a young lad.  Given that I hate pain, I am glad to call myself a fan and only a spectator.  Modern MMA fighters are extremely talented athletes, but their success in the cage depends on having a strong base across different disciplines.  They must be good on their feet with effective punches and kicks; have world-class wrestling, jujitsu, and judo on the ground; have strong defensive techniques all around; and have a strong will not to quit.

My other passion, a less painful one, is helping organizations understand and harness technology and best practices to leverage great data for business success.  Believe it or not, there are close similarities between being an effective MMA fighter and a successful information architect.  Information architects in today’s Big Data/Internet of Things world have to have a mix of knowledge and skills to recommend, design, and implement the right information management solutions for their businesses.  This involves having a strong background in database development, data engineering, computer programming, web, security, networking, system administration, and other technology competencies to formulate and categorize information into a coherent structure, preferably one that the intended audience can understand quickly, if not inherently, and then easily retrieve the information for which they are searching.


Like successful MMA fighters, information architects require training and development of basic building blocks, regardless of their “intellectual” and “technical” prowess.  Having a strong base allows architects to recommend the right solutions, avoiding ineffective and inefficient methods such as hand coding critical data integration, data governance, and data quality processes – methods that often result in bad data, higher costs, and increased risk of not meeting what the business needs. An MMA fighter with a strong base leverages those skills to avoid techniques or moves that place him at risk of getting knocked out, choked out, or having an arm broken by his opponent.  In the same way, well-developed architects leverage that base and knowledge to adopt proven technologies for their information architecture and management needs.

The technologies for managing data in the enterprise vary in performance, functionality, and value.  Like MMA fighters competing for a living, it’s important that architects learn from skilled masters rather than from the local martial arts school at the neighborhood strip mall or free YouTube videos recorded by jokers claiming to be masters, hoping that knowledge will help them survive in combat. Similarly, architects must make careful investments and decisions when designing systems to deliver great data. Shortcuts and “good enough” tools won’t cut it in today’s data-driven world. Great data only comes from great design powered by an intelligent and capable data platform.    “Are you Ready to Get it On!”


CES, Digital Strategy and Architecture: Are You Ready?

CES, the International Consumer Electronics Show, is wrapping up this week, and the array of new connected products and technologies was truly impressive. “The Internet of Things” is moving from buzzword to reality.  Some of the major trends seen this week included:

  • Home Hubs from Google, Samsung, and Apple (who did not attend the show but still had a significant impact).
  • Home Hub Ecosystems providing interoperability with cars, door locks, and household appliances.
  • Autonomous and intelligent cars.
  • Wearable devices such as smart watches and jewelry.
  • Drones that take pictures and intelligently avoid obstacles.  …Including people trying to block them.  There is a bit of a creepy factor here!
  • The next generation of 3D printers.
  • And the intelligent baby pacifier.  The idea is that it takes the baby’s temperature, but I think the sleeper hit feature on this product is the ability to locate it using GPS and a smart phone. How much money would you pay to get your kid to go to sleep when it is time to do so?

Digital Strategies Are Gaining Momentum

There is no escaping the fact that the vast majority of companies out there have active digital strategies, and not just in the consumer space. The question is: Are you going to be the disruptor or the disruptee?  Gartner offered an interesting prediction here:

“By 2017, 60% of global enterprise organizations will execute on at least one revolutionary and currently unimaginable business transformation effort.”

It is clear from looking at CES that a lot of these products are “experiments” that will ultimately fail.  But focusing too much on that fact is to risk overlooking the profound changes taking place that will shake out industries and allow competitors to jump previously impassable barriers to entry.

IDC predicted that the Internet of Things market would be over $7 Trillion by the year 2020.  We can all argue about the exact number, but something major is clearly happening here.  …And it’s big.

Is Your Organization Ready?

A study by Gartner found that 52% of CEOs and executives say they have a digital strategy.  The problem is that 80% of them say they will “need adaptation and learning to be effective in the new world.”  Supporting a new “Internet of Things” or connected device product may require new business models, new business processes, new business partners, new software applications, and the collection and management of entirely new types of data.  Simply standing up a new ERP system or moving to a cloud application will not help your organization deal with the new business models and data complexity.

Architect’s Call to Action

Now is the time (good New Year’s resolution!) to get proactive on your digital strategy.  Your CIO is most likely deeply engaged with her business counterparts to define a digital strategy for the organization. This is the moment to be proactive in recommending the IT architecture that will enable them to deliver on that strategy – and a roadmap to get to the future-state architecture.

Key Requirements for a Digital-ready Architecture

Digital strategy and products are all about data, so I am going to be very data-focused here.  Here are some of the key requirements:

  • First, it must be designed for speed.  How fast? Your architecture has to enable IT to move at the speed of the business, whatever that requires.  Consider the speed at which companies like Google, Amazon, and Facebook make IT changes.
  • It has to explicitly link the business strategy to the underlying business models, processes, systems, and technology.
  • Data from any new source, inside or outside your organization, has to be on-boarded quickly and in a way that makes it immediately discoverable and available to all IT and business users.
  • Ongoing data quality management and data governance must be built into the architecture.  Point product solutions cannot solve these problems; the capability has to be pervasive.
  • Data security also has to be pervasive, for the same reasons.
  • It must include business self-service.  That is the only way IT is going to be able to meet the needs of business users and scale to the demands of the changes required by digital strategy.

Resources:

For a webinar on connecting business strategy to the architecture of business transformation, see Next-Gen Architecture: A “Business First” Approach for Agile Architecture, with John Schmidt of Informatica and Art Caston, founder of Proact.

For next-generation thinking on enterprise data architectures, see Think “Data First” to Drive Business Value.

There is also more available on business self-service for data preparation, including a free software download.


The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

Once upon a time, there were three information architects working in the financial services industry, each at a different firm and from a different background, but all responsible for recommending the right technology solutions to help their firms comply with industry regulations, including ongoing bank stress testing across the globe.  Since 2008, bank regulators have focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew up through the ranks, starting as a Database Administrator, a black belt in SQL and COBOL programming. Hand coding was in his DNA for many years and was thought of as the best approach, given how customized his firm’s business and systems were compared with other organizations. As such, Architect #1 and his team went down the path of building their data management capabilities through custom hand-coded scripts, manual data extractions and transformations, and dealing with data quality issues through the business organizations after the data was delivered.   Though this approach delivered on their short-term needs, the firm came to realize the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a “gadget guy” at heart who grew up using off-the-shelf tools rather than hand coding to manage data. He and his team decided not to hand code their data management processes, instead adopting and building their solution from best-of-breed tools – some open source, others from existing solutions the company had acquired in previous projects – for data integration, data quality, and metadata management.  Though their tools helped automate much of the “heavy lifting,” he and his IT team were still responsible for integrating these point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers, but he understood the value of building something once to use across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools to do the work, he decided to adopt an integrated platform designed to handle the complexity, sources, and volumes of data required by the business.  The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a “no brainer.”

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn’t to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how that risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the results, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported as part of the audit. Unfortunately, thanks to the first architect’s hand-coded data management processes, IT failed to provide explanations and documentation of what had been done; the developers who created the systems were no longer with the firm. As a result, the bank failed miserably, resulting in stiff penalties and higher audit costs.

Architect #2’s bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools they had adopted were never designed or guaranteed by their vendors to work with each other, resulting in invalid data mappings, broken data quality rules, and gaps in their technical metadata documentation. As a result, bank #2 also failed its audit, finding itself with a pile of one-off tools that automated its data management processes but lacked the integration and shared rules and metadata needed to satisfy the regulator’s demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3’s firm. Having seen the results at the first two banks, Mr. Wolf was leery of this firm’s ability to pass its stress test audit. He made similar demands, but this time Bank #3 provided detailed and comprehensive metadata documentation of its risk data measurements, descriptions of the data used in each report, a comprehensive account of each data quality rule used to cleanse the data, and detailed information on each counterparty and legal entity used to calculate VaR.  Unable to find gaps in the audit, Mr. Wolf, who had expected to “blow” the house down, delivered a passing grade for Bank #3 and its management team, thanks to the right investments they had made to support their enterprise risk data management needs.
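For readers who have not met the measure Mr. Wolf keeps probing, here is a minimal, illustrative historical-simulation VaR calculation. The daily P&L figures are invented and real regulatory models are vastly more involved; the point is only that every input should be traceable, which is exactly what Bank #3 could demonstrate.

```python
# A toy historical-simulation Value at Risk (VaR); P&L figures are invented.
import numpy as np

daily_pnl = np.array([-1.2, 0.8, -0.3, 2.1, -2.7, 0.5, -0.9, 1.4, -1.8, 0.2])  # $M

confidence = 0.95
# VaR is the loss at the (1 - confidence) tail of the P&L distribution.
var_95 = -np.percentile(daily_pnl, (1 - confidence) * 100)
print(f"1-day VaR at {confidence:.0%} confidence: ~${var_95:.2f}M")
```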

The moral of this story, like the familiar one involving the three little pigs, is about the importance of having a solid foundation to weather market and regulatory storms – or the violent bellow of a big bad wolf.  A foundation that covers the required data integration, data quality, master data management, and metadata management needs, but also supports collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today’s financial services industry requires firms to have a solid data management platform – one that is intelligent and comprehensive, and that allows information architects to mitigate the risks and costs of hand coding or using point tools that get by only in the short term.

Are you prepared to meet Mr. Wolf?


Business-Led Transformation Is Value-Centric

Transformation roadmaps in many businesses tend to have a heavy technology focus, to the point where organizations invest millions of dollars in initiatives with no clear business value. In addition, the numerous tactical projects funded each year come with little understanding of how, or even if, they align from a business perspective. Management often falls victim to the latest technology buzzwords, while stakeholder value, business issues, and strategic considerations take a backseat. When this happens, executives who should be focused on business scenarios that improve stakeholder value fall victim to technology’s promise of the next big thing.

I recently participated in writing and reviewing a series of whitepapers on business-led transformation at Informatica’s Strategic Services Group. These whitepapers discuss how executives can leverage business architecture to reclaim their ability to drive a comprehensive transformation strategy and roadmap. I will try to summarize them in this blog.

Consider the nature of most initiatives found within a corporate program office. They generally focus on enhancing one system or another, or in more extreme cases on a complete rebuild. The scope of work is bounded by a given system, not by a business focal point, whether that is a particular business capability, stakeholder, or value delivery perspective. These initiatives generally originate within the IT organization, not the business, and are launched in response to a specific business need quickly translated into a software enhancement, rewrite, or database project. Too often, however, these projects are myopic and lack an understanding of cross-impacts on other projects, business units, stakeholders, or products. Their scope is constrained not by a given customer or business focus, but by technology.

Business-led transformation delivers a value-centric perspective and provides the underlying framework for envisioning and crafting a more comprehensive solution. In some cases, this may begin with a quick fix if that is essential, but it must be accompanied by a roadmap for a more transformative solution. It provides a more comprehensive issue analysis and planning perspective because it offers business-specific, business-first viewpoints that enable issue analysis and resolution through business transparency.

Twitter @bigdatabeat
