Category Archives: Data Integration Platform

Stop Trying to Manage Data Growth!(?)

Data Downpour

Talking to architects about analytics at a recent event, I kept hearing the familiar theme: data scientists are spending 80% of their time on “data wrangling,” leaving only 20% for delivering the business insights that will drive the company’s innovation.  It was clear to everybody I spoke to that the situation will only worsen.  The growth everybody sees coming in data volume and complexity will only lengthen the time to value.

Gartner recently predicted that:

“by 2015, 50% of organizations will give up on managing growth and will redirect funds to improve classification and analytics.”

Some of the details of this study are interesting.  In the end, many organizations are coming to two conclusions:

  • It’s risky to delete data, so they keep it around as insurance.
  • All data has potential business value, so more organizations are keeping it around for potential analytical purposes.

The other mega-trend here is that more and more organizations are looking to compete on analytics – and they need data to do it, both internal data and external data.

From an architect’s perspective, here are several observations:

  • The floodgates are open and analytics is a top priority. Given that, the emphasis should be on architecting to manage the dramatic increases in both data quantity and data complexity rather than on trying to stop them.
  • The immediate architectural priority has to be on simplifying and streamlining your current enterprise data architecture. Break down those data silos and standardize your enterprise data management tools and processes as much as possible.  As discussed in other blogs, data integration is becoming the biggest bottleneck to business value delivery in your environment. Gartner has projected that “by 2018, more than half the cost of implementing new large systems will be spent on integration.”  The more standardized your enterprise data management architecture is, the more efficient it will be.
  • With each new data type, new data tool (Hive, Pig, etc.), and new data storage technology (Hadoop, NoSQL, etc.) ask first if your existing enterprise data management tools can handle the task before people go out and create a new “data silo” based on the cool, new technologies. Sometimes it will be necessary, but not always.
  • The focus needs to be on speeding value delivery for the business. And the key bottleneck is highly likely to be your enterprise data architecture.

Rather than focusing on stopping data growth, the priority should be on managing it in the most standardized and efficient way possible.  It is time to think about enterprise data management as a function with standard processes, skills and tools (just like Finance, Marketing or Procurement).

Several of our leading customers have built or are building a central “Data as a Service” platform within their organizations.  This is a single, central place where all developers and analysts can go to get trustworthy data that is managed by IT through a standard architecture and served up for use by all.

For more information, see “The Big Big Data Workbook.”

*Gartner Predicts 2015: Managing ‘Data Lakes’ of Unprecedented Enormity, December 2014  http://www.gartner.com/document/2934417#


Big Data Is Neither – Part II

You Say Big Dayta, I Say Big Dahta

Some say Big Data is a great challenge while others say Big Data creates new opportunities. Where do you stand?  For most companies concerned with their Big Data challenges, it shouldn’t be so difficult – at least on paper. Computing costs (both hardware and software) have vastly shrunk. Databases and storage techniques have become more sophisticated and scale massively, and companies such as Informatica have made connecting and integrating all the “big” and disparate data sources much easier, helping companies achieve a sort of “big data synchronicity.”

In the process of creating solutions to Big Data problems, humans (and the supra-species known as IT Sapiens) have a tendency to use theories based on linear thinking and the scientific method. There is data as our systems know it and data as our systems don’t. The reality, in my opinion, is that “Really Big Data” problems now and in the future will have complex correlations and unintuitive relationships that require mathematical disciplines, data models and algorithms that haven’t even been discovered or invented yet – and that, when eventually discovered, will make current database science look positively primordial.

At some point in the future, machines will be able to predict, based on big and perhaps unknown data types, when someone is having a bad day or a good day, or, more importantly, whether a person may behave in a good or bad way. Many people do this now when they take a glance at someone across a room and infer how that person is feeling or what they will do next. They see eyes that are shiny or dull, crinkles around the eyes or the sides of the mouth, then hear the “tone” in a voice, and then their neurons put it all together: this is a person who is having a bad day and needs a hug. Quickly. No one knows exactly how the human brain does this, but it does what it does, we go with it, and we are usually right.

And some day, Big Data will be able to derive this and it will be an evolution point and it will also be a big business opportunity. Through bigger and better data ingestion and integration techniques and more sophisticated math and data models, a machine will do this fast and relatively speaking, cheaply. The vast majority won’t understand why or how it’s done, but it will work and it will be fairly accurate.

And my question to you all is this.

Do you see any other alternate scenarios regarding the future of big data? Is contextual computing an important evolution, and will big data integration be more or less of a problem in the future?

PS. Oh yeah, one last thing to chew on concerning Big Data… If Big Data becomes big enough, does that spell the end of modelling as we know it?


Federal Migration to Cloud Computing, Hindered by Data Issues

Moving towards Cloud Computing

As reviewed by Loraine Lawson, a MeriTalk survey about cloud adoption found that “In the latest survey of 150 federal executives, nearly one in five say one-quarter of their IT services are fully or partially delivered via the cloud.”

For the most part, the shifts are more tactical in nature.  These federal managers are shifting email (50 percent), web hosting (45 percent) and servers/storage (43 percent).  Most interesting is that they’re not moving traditional business applications, custom business apps, or middleware. Why? Data, and data integration issues.

“Federal agencies are worried about what happens to data in the cloud, assuming they can get it there in the first place:

  • 58 percent of executives fret about cloud-to-legacy system integration as a barrier.
  • 57 percent are worried about migration challenges, suggesting they’re not sure the data can be moved at all.
  • 54 percent are concerned about data portability once the data is in the cloud.
  • 53 percent are worried about ‘contract lock-in.’ ”

The reality is that the government does not get much out of the movement to cloud without committing core business applications and thus core data.  While e-mail, Web hosting, and some storage are a good start, the real cloud computing money is made when moving away from expensive hardware and software.  Failing to do that, you fail to find the value and, in this case, spend more taxpayer dollars than you should.

Data issues are not just a concern in the government.  Most larger enterprises have the same issues as well.  However, a few are able to get around these issues with good planning approaches and the right data management and data integration technology.  It’s just a matter of making the initial leap, which most Federal IT executives are unwilling to do.

In working with CIOs of Federal agencies in the last few years, the larger issue is that of funding.  While everyone understands that moving to cloud-based systems will save money, getting there means hiring government integrators and living with redundant systems for a time.  That involves some major money.  If most of the existing budget goes to existing IT operations, then the move may not be practical.  Thus, there should be funds made available to work on the cloud projects with the greatest potential to reduce spending and increase efficiencies.

The shame of this situation is that the government was pretty much on the leading edge with cloud computing back in 2008 and 2009.  The CIO of the US Government, Vivek Kundra, promoted the use of cloud computing, and NIST drove the initial definitions of “The Cloud,” including IaaS, SaaS, and PaaS.  But, when it came down to making the leap, most agencies balked at the opportunity, citing issues with data.

Now that the technology has evolved even more, there is really no excuse for the government to delay migration to cloud-based platforms.  The clouds are ready, and the data integration tools have cloud integration capabilities baked in.  It’s time to see some more progress.


The New Marketing Technology Landscape Is Here… And It’s Music to Our Ears!

How Do You Like It? How Do You Like It? More, More, More!
Chiefmartec came out with their 2015 Marketing Technology Landscape, and if there’s one word that comes to mind, it’s MORE. 1,876 corporate logos dot the page, up from 947 in 2014. That’s definitely more, more, more – just about double to be exact. I’m honestly not sure it’s possible to squeeze any more into a single image.

But it’s strangely fitting, because this is the reality that we marketers live in.  There are an infinite number of new technologies, approaches, social media platforms, operations tools, and vendors that we have to figure out. New, critical categories of technology roll out constantly. New vendors enter and exit the landscape. As Chiefmartec says, “at least on the immediate horizon, I don’t think we’re going to see a dramatic shrinking of this landscape. The landscape will change, for sure. What qualifies as “marketing” and “technology” under the umbrella of marketing technology will undoubtedly morph. But if mere quantity is the metric we’re measuring, I think it’s going to be a world of 1,000+ marketing technology companies — perhaps even a world of 2,000+ of them — for some time to come.”

Middleware: I’m Coming Up So You’d Better Get This Party Started!
One thing you’ll notice if you look carefully between last year’s and this year’s versions is the arrival of the middleware layer. Chiefmartec spends quite a bit of time talking about middleware, pointing out that great tools in the category are making the marketing technology landscape easier to manage – particularly those that handle a hybrid of on-premise and cloud.

Marketers have long cared about the things on top – the red “Marketing Experiences” and the orange “Marketing Operations”. They’ve also put a lot of focus on the dark gray/black/blue “Backbone Platforms” layer, like marketing automation & e-commerce. But only recently has that yellow middleware layer become front and center for marketers. Data integration, data management platforms, connectivity, data quality, and APIs are definitely not new to the technology landscape, and have been a critical domain of IT for decades. But as marketers become more skilled at and reliant on analytics and focused customer experience management, data is moving to the forefront.

Marketers cannot focus exclusively on their Salesforce CRM, their Marketo automation, or their Adobe Experience Manager web management. Data Ready marketers realize that these applications can no longer be run in silos; they need to be looked at collectively as a powerful set of tools designed to engage the customer and move them through the buying cycle, as critical pieces of the same puzzle. And to do that, marketers need to connect their data sources, power them with great data, analyze and measure their results, and then decide what to do.

If you squint, you can see Informatica in the yellow Middleware layer. (I could argue that it belongs in several of these yellow boxes, not just Cloud integration, but I’ll save that for another blog!) Some might say that’s not very exciting, but I would argue that Informatica is in a tremendous place to help marketers succeed with great data. And it all comes down to two words… complexity and change.

Why You Have to Go and Make Things So Complicated?
Ok, admittedly terrible grammar, but you get the picture. Marketers live in a tremendously complex world. Sure, you don’t have all 1,876 of the logos on the Technology Landscape in house. You probably don’t even have one from each of the 43 categories. But you definitely have a lot of different technology solutions that you rely upon on a day-to-day basis. According to a September article by ChiefMarTech, most marketers already regularly rely on more than 100 software programs.

Data ready marketers realize that their environments are complicated, and that they need a foundation. They need a platform of great data that all of their various applications and tools can leverage, and that can actually connect all of those applications and tools together. They need to be able to connect to just about anything from just about anything. They need a complete view of all of their interactions with their customers. In short, they need to make their extremely complicated world simpler, more streamlined, and complete.

Ch-Ch-Ch-Ch-Changes. Turn and Face the Strange!
I have a tendency to misunderstand lyrics, so I have to confess that until I looked up this song today, I thought the lyric was “time to face the pain” (Bowie fans, I hang my head in shame!).  But quite honestly, “turn and face the strange” illustrates my point just as well!

There is no question that marketing has changed dramatically in the past few years.  Your most critical marketing tools and processes two years ago are almost certainly different than those this year, and will almost certainly be different from what you see two years from now.  Marketers realize this.  The Marketing Technology Landscape illustrates this every year!

The data ready marketer understands that their toolbox will change, but that their data will be the foundation for whatever new piece of the technology puzzle they embrace or get rid of.  Building a foundation of great data will power any technology solution or new approach.

Data ready marketers also work with their IT counterparts to engineer for change.  They make sure that no matter what technology or data source they want to add – no matter how strange or unthinkable it is today – they never have to start from scratch.  They can connect to what they want, when they want, leveraging great data, and ultimately making great decisions.

Get Ready ‘Cause Here I Come. The Era of the Data Ready Marketer is Here
Now that you have a few catchy tunes stuck in your head, it’s time to ask yourself, are you data ready? Are you ready to embrace the complexity of the marketing technology landscape? Are you ready to think about change as a competitive weapon?

I encourage you to take our survey about data ready marketing. The results are coming out soon so don’t miss your chance to be a part. You can find the link here.

Also, follow me on twitter – The Data Ready Marketer (@StephanieABest) for some of the latest & greatest news and insights on the world of data ready marketing.

And stay tuned because we have several new Data Ready Marketing pieces coming out soon – InfoGraphics, eBooks, SlideShares, and more!


Ready for Internet of Things?

Data has always played a key role in informing decisions – machine generated and intuitive.  In the past, much of this data came from transactional databases as well as unstructured sources, such as emails and flat files.  Mobile devices appeared next on the map.  We have found applications of such devices not just to make calls but also to send messages, take pictures, and update status on social media sites.  As a result, new sets of data got created from user engagements and interactions.  Such data started to tell a story by connecting dots at different location points and stages of user connection.  “Internet of Things” or IoT is the latest technology to enter the scene that could transform how we view and use data on a massive scale.

Another buzzword? 

Does IoT present a significant opportunity for companies to transform their business processes?  The Internet of Things probably adds an important awareness veneer when it comes to data.  It could bring data into focus early by connecting every stage of data creation in any business process.  It could remove the lag between consuming data and making decisions based on it.  Data generated at every stage in a business process could show an interesting trend or pattern and, better yet, tell a connected story.  The result could be predictive maintenance of the equipment involved in any process, which would further reduce cost.  New product innovations would happen by leveraging the connectedness in the data generated by each step in a business process.  We would soon begin to understand not only where the data is being used and how, but also the intent and context behind this usage.  Organizations could then connect with their customers in a one-on-one fashion like never before, whether to promote a product or offer a promotion that could be both time and place sensitive.  New opportunities to tailor product and service offerings for customers on an individual basis would create new growth areas for businesses.  The Internet of Things could make this possible by bringing together previously isolated sets of data.

Proof-points

A recent Economist report, “The Virtuous Circle of Data: Engaging Employees in Data and Transforming Your Business,” suggests that 68% of data-driven businesses outperform their competitors when it comes to profitability, and 78% of those businesses foster a better culture of creativity and innovation.  The report goes on to suggest that 3 areas are critical for an organization to build a data-driven business, including data supported by devices: 1) Technology & Tools, 2) Talent & Expertise, and 3) Culture & Leadership.  By 2020, it’s projected that there’ll be 50B connected devices, 7x more than human beings on the planet.  It is imperative for an organization to have a support structure in place for device-generated data and a strategy to connect it with broader enterprise-wide data initiatives.

A comprehensive Internet of Things strategy would leverage the speed and context of data to the advantage of business process owners.  Timely access to device-generated data can open up personalized channels of communication to end-customers at the moment of their readiness.  It’s not enough anymore to know what customers may want or what they asked for in the past; organizations must anticipate what customers might want by connecting the dots across different stages.  IoT-generated data can help bridge this gap.

How to Manage IoT Generated Data

More data places more pressure on both quality and security – key building blocks for trust in one’s data.  Trust is, ideally, truth over time.  Consistency in data quality and availability is going to be a key requirement for all organizations that want to introduce new products or service differentiated areas in a speedy fashion.  Informatica’s Intelligent Data Platform, or IDP, brings together the industry’s most comprehensive data management capabilities to help organizations manage all data, including device-generated data, both in the cloud and on premise.  Informatica’s IDP also enables automated sensitive data discovery, such that data discovers users in the context where it’s needed.

Cool IoT Applications

There are a number of companies around the world working on interesting applications of Internet of Things technology.  Smappee from Belgium has launched an energy monitor that can itemize electricity usage and control a household full of devices by clamping a sensor around the main power cable. This single device can recognize the individual signatures produced by each of the household devices and can let consumers switch off any device, such as an oven, remotely via smartphone.  JIBO is an IoT device that’s touted as the world’s first family robot; it automatically uploads data on all interactions to the cloud.  Start-ups such as Roost and Range OI can retrofit older devices with Internet of Things capabilities.  One of the really useful IoT applications can be found in Jins Meme glasses and sunglasses from Japan.  They embed wearable sensors, shaped much like Bluetooth headsets, that detect drowsiness in the wearer by observing eye movement and blinking frequency to identify tiredness or bad posture, and communicate via iOS and Android smartphone apps.  Finally, Mellow is a new kind of kitchen robot that makes cooking easier by bringing ingredients to perfection while someone is away from home. Mellow is a sous-vide machine that takes orders through your smartphone and keeps food cold until it’s exactly the right time to start cooking.

Closing Comments

Each of the applications mentioned above deals with data, volumes of data, both in real time and in stored form.  Such data needs to be properly validated, cleansed, and made available at the moment of user engagement.  In addition to Informatica’s Intelligent Data Platform, the newly introduced Informatica Rev product can truly connect data coming from all sources, including IoT devices, and make it available to everyone.  What opportunity does IoT present to your organization?  Where are the biggest opportunities to disrupt the status quo?


Magical Data from the Internet of Things? Think again…

I recently read an opinion piece written in an insurance publication online. The author postulated, among other things, that the Internet of Things would magically deliver great data to an insurer. Yes, it was a statement just that glib. Almost as if there is some fantastic device that you just plug into the wall and out streams a flow of unicorns and rainbows. And furthermore that those unicorns and rainbows will subsequently give a magical boost to your business. But hey, you plugged in that fantastic device, so bring on the magic.

Now, let’s come back from the land of fairytales and ground ourselves in reality. Data is important, no doubt about that. Today, financial services firms are able to access data from so many new data sources. One of those new and fancy data sources is the myriad of devices in this thing we call the Internet of Things.

You ever have one of those frustrating days with your smart phone? Dropped calls, slow Internet, Facebook won’t locate you? Well, other devices experience the same wonkiness. Even the most robust of devices found on commercial aircraft or military equipment are not lossless in data transmission. And that’s where we are with the Internet of Things. All great devices, they serve a number of purposes, but are still fallible in communicating with the “mother ship”.

A telematics device in a consumer vehicle can transmit VIN, speed, latitude/longitude, time, and other vehicle statuses for use in auto insurance. As with other devices on a network, some of these data elements will not come through reliably. That means that in order to reconstruct or smooth the dataset, interpolations need to be made and/or entire entries deleted as useless. That is the first issue. Second, simply receiving this isolated dataset does not make sense of it. The data needs to be moved, cleansed, and then correlated to other pieces of the puzzle, which eventually resolve into a policyholder, an account holder, a client or a risk. And finally, that enhanced data can be used for further analytics. It can be archived, aggregated, warehoused, and secured for additional analysis. None of these activities happens magically. And the sheer volume of integration points and data requires a robust and standardized data management infrastructure.
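
To make that concrete, here is a minimal sketch of the kind of cleansing and correlation described above. The field names, the sample records, the one-reading interpolation rule, and the `policyholders` lookup are illustrative assumptions, not a description of any particular telematics feed or product.

```python
import pandas as pd

# Illustrative raw telematics feed: some readings arrive incomplete or unusable.
raw = pd.DataFrame([
    {"vin": "1HGCM82633A004352", "ts": "2015-03-01T08:00:00", "speed_mph": 31.0, "lat": 41.88, "lon": -87.63},
    {"vin": "1HGCM82633A004352", "ts": "2015-03-01T08:00:05", "speed_mph": None, "lat": 41.88, "lon": -87.64},  # dropped reading
    {"vin": None,                "ts": "2015-03-01T08:00:10", "speed_mph": 33.0, "lat": None,  "lon": None},    # unusable entry
    {"vin": "1HGCM82633A004352", "ts": "2015-03-01T08:00:10", "speed_mph": 35.0, "lat": 41.89, "lon": -87.64},
])

# 1. Delete entries that cannot be attributed to a vehicle at all.
clean = raw.dropna(subset=["vin"]).copy()

# 2. Smooth the series: interpolate short gaps in the numeric signals per vehicle.
clean["ts"] = pd.to_datetime(clean["ts"])
clean = clean.sort_values(["vin", "ts"])
clean[["speed_mph", "lat", "lon"]] = (
    clean.groupby("vin")[["speed_mph", "lat", "lon"]]
         .transform(lambda s: s.interpolate(limit=1))
)

# 3. Correlate the readings with the rest of the puzzle -- here a hypothetical
#    policyholder lookup keyed by VIN, standing in for a policy or MDM system.
policyholders = pd.DataFrame([
    {"vin": "1HGCM82633A004352", "policy_id": "AUTO-998877", "holder": "J. Driver"},
])
enriched = clean.merge(policyholders, on="vin", how="left")

print(enriched)
```

Even this toy example shows why none of it happens magically: the cleansing rules, the match keys, and the downstream archiving and security all have to be designed and maintained.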

So no, just having an open channel to the stream of noise from your local Internet of Things will not magically deliver you great data. Great data comes from market leading data management solutions from Informatica. So whether you are an insurance company, financial services firm or data provider, being “Insurance Ready” means having great data; ready to use; everywhere…from Informatica.


The Synergies of SaaS and Data Integration

Loraine Lawson provides some great insights into the world of data integration, and this article is no exception.  The topic struck me because we’ve wrestled with SaaS integration issues for about 10 years now.

Back in 2004, we saw the rapid growth of SaaS providers such as Salesforce.com.  However, there was typically no consistent data integration strategy to go along with the use of SaaS.  In many instances, SaaS-delivered applications became the new data silos in the enterprise, silos that lacked a sound integration plan and integration technology.

Ten years later, we’ve gotten to a point where we have the ability to solve problems using SaaS, as well as the data integration problems that come with the use of SaaS.  However, we typically lack the knowledge and understanding of how to effectively use data integration technology within an enterprise to integrate SaaS problem domains.

Lawson looks at both sides of the SaaS integration argument.  “Surveys certainly show that integration is less of a concern for SaaS than in the early days, when nearly 88 percent of SaaS companies said integration concerns would slow down adoption and more than 88 percent said it’s an important or extremely important factor in winning new customers.”

Again, while we’ve certainly gotten better at integration, we’re nowhere near being out of the woods.  “A Dimensional Research survey of 350 IT executives showed that 67 percent cited data integration problems as a challenge with SaaS business applications. And as with traditional systems, integration can add hidden costs to your project if you ignore it.”

As I’ve stated many times in this blog, integration requires a bit of planning and the use of solid technology.  While this does require some extra effort and money, the return on the value of this work is huge.

SaaS integration requires a bit of a different approach from traditional enterprise integration.  SaaS systems typically place your data behind well-defined APIs that can be accessed directly or through a data integration technology.  While the information can be consumed by anything that can invoke an API, enterprises still have to deal with structure and content differences, and that’s typically best handled with the right data integration tooling.
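
As a rough illustration of what “behind well-defined APIs” means in practice, here is a small sketch of pulling records from a SaaS-style REST API and reconciling structural differences before loading them elsewhere. The endpoint, token, response shape, and field mapping are hypothetical stand-ins, not the API of any specific vendor; in a real deployment this mapping would live in your data integration tool rather than in ad hoc code.

```python
import requests

# Hypothetical SaaS endpoint and credentials -- stand-ins, not a real vendor API.
API_URL = "https://api.example-crm.com/v1/accounts"
API_TOKEN = "REPLACE_ME"

# Structure and content differ between the SaaS app and the target system,
# so a simple field mapping is applied as records come through.
FIELD_MAP = {
    "Id": "account_id",
    "Name": "account_name",
    "BillingCountry": "country_code",
}

def extract_accounts(page_size=100):
    """Page through the SaaS API and yield records in the target structure."""
    page = 1
    while True:
        resp = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        records = resp.json().get("records", [])
        if not records:
            break
        for rec in records:
            # Rename fields and normalize content differences (e.g., country casing).
            row = {target: rec.get(source) for source, target in FIELD_MAP.items()}
            if row.get("country_code"):
                row["country_code"] = row["country_code"].strip().upper()
            yield row
        page += 1

# Downstream, these rows would pass through data quality and governance checks
# before being loaded into a warehouse or another application.
```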

Other things to consider, and again often overlooked, are data governance and data security around your SaaS integration solution.  There should be a centralized control mechanism to support the proper management and security of the data, as well as a mechanism to deal with the data quality issues that often emerge when consuming data from any cloud computing service.

The reality is that SaaS is here to stay.  Even enterprise software players that put off the move to SaaS-delivered systems are now standing up SaaS offerings.  The economics around the use of SaaS are just way too compelling.  However, as SaaS-delivered systems become more commonplace, so will new silos.  This will not be an issue if you leverage the right SaaS integration approach and technology.  What will your approach be?


What Do Your Insured Members Look Like?

As I was changing my flight this morning, I needed to make sure that my checked bag made the change as well (because who wants to get to a meeting with nothing to wear but yoga pants and t-shirts?). During the conversation with the gate agent, I was told that the reservation system accessed through a phone call is separate from the flight system that the desk agent had access to. As a result, the airport baggage folks had no idea that my flight had changed. The entire time I was working with the gate agent, I kept thinking that they needed a complete view of me as a customer. *I* don’t care that the systems aren’t integrated and sharing information; I only want my bag to make it where I’m going.

The same applies to insurers. Your members don’t care that you don’t have access to their enrollment information when they call about a claim. In order to provide better service, you need a complete 360-degree view of your members. If you can get that complete view of an insured member while they are talking to you on the phone, you can give them better customer service. You want to focus on your members’ experiences. This includes strengthening member relationships and fostering high levels of satisfaction to gain members’ trust and ease their concerns.

In many insurance companies, getting a complete picture of what each insured member looks like is cumbersome, with one system for enrolling members, another system for member benefit administration, and a third for claims processing. These may be unwieldy legacy systems designed for an employer-focused market, modified over the years to accommodate changing market needs and government regulations. You may be able to access information from each of these systems over time through batch file transfer, reporting against the various systems, or having a customer service representative interact with each system separately.
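
For illustration only, here is a minimal sketch of how a 360-degree member view might be assembled from the three kinds of systems mentioned above. The record layouts, the sample data, and the shared member ID used as the match key are assumptions made for the example.

```python
from collections import defaultdict

# Hypothetical extracts from three separate systems, keyed by a shared member ID.
enrollment = [
    {"member_id": "M-1001", "name": "Pat Member", "plan": "Gold PPO", "effective": "2015-01-01"},
]
benefits = [
    {"member_id": "M-1001", "deductible_remaining": 450.00, "oop_max_remaining": 2100.00},
]
claims = [
    {"member_id": "M-1001", "claim_id": "C-77812", "status": "PAID", "amount": 182.50},
    {"member_id": "M-1001", "claim_id": "C-78003", "status": "PENDING", "amount": 96.00},
]

def build_member_360(enrollment, benefits, claims):
    """Merge the three extracts into one view per member for the service rep."""
    view = defaultdict(lambda: {"enrollment": None, "benefits": None, "claims": []})
    for rec in enrollment:
        view[rec["member_id"]]["enrollment"] = rec
    for rec in benefits:
        view[rec["member_id"]]["benefits"] = rec
    for rec in claims:
        view[rec["member_id"]]["claims"].append(rec)
    return dict(view)

member_view = build_member_360(enrollment, benefits, claims)
print(member_view["M-1001"]["enrollment"]["plan"], len(member_view["M-1001"]["claims"]))
```

In practice the three systems rarely share a clean key, which is where data integration and master data management matching do the heavy lifting before a view like this can be trusted.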

In order to be competitive in today’s marketplace with the focus changing from employers providing the insurance to the individual, you need to provide your members with the best possible service.

Imagine the confidence I would have in the airline that could easily change my flight and re-route my baggage and interact with me exactly the same whether I am speaking to someone on the phone or standing in front of a gate agent. Imagine how much better my customer satisfaction ratings would be as a result.

What do your insured members look like?


CES, Digital Strategy and Architecture: Are You Ready?

CES, the International Consumer Electronics Show, is wrapping up this week, and the array of new connected products and technologies was truly impressive. “The Internet of Things” is moving from buzzword to reality.  Some of the major trends seen this week included:

  • Home Hubs from Google, Samsung, and Apple (who did not attend the show but still had a significant impact).
  • Home Hub Ecosystems providing interoperability with cars, door locks, and household appliances.
  • Autonomous and intelligent cars.
  • Wearable devices such as smart watches and jewelry.
  • Drones that take pictures and intelligently avoid obstacles.  …Including people trying to block them.  There is a bit of a creepy factor here!
  • The next generation of 3D printers.
  • And the intelligent baby pacifier.  The idea is that it takes the baby’s temperature, but I think the sleeper hit feature on this product is the ability to locate it using GPS and a smart phone. How much money would you pay to get your kid to go to sleep when it is time to do so?

Digital Strategies Are Gaining Momentum

There is no escaping the fact that the vast majority of companies out there have active digital strategies, and not just in the consumer space. The question is: Are you going to be the disruptor or the disruptee?  Gartner offered an interesting prediction here:

“By 2017, 60% of global enterprise organizations will execute on at least one revolutionary and currently unimaginable business transformation effort.”

It is clear from looking at CES, that a lot of these products are “experiments” that will ultimately fail.  But focusing too much on that fact is to risk overlooking the profound changes taking place that will shake out industries and allow competitors to jump previously impassible barriers to entry.

IDC predicted that the Internet of Things market would be over $7 Trillion by the year 2020.  We can all argue about the exact number, but something major is clearly happening here.  …And it’s big.

Is Your Organization Ready?

A study by Gartner found that 52% of CEOs and executives say they have a digital strategy.  The problem is that 80% of them say that they will “need adaptation and learning to be effective in the new world.”  Supporting a new “Internet of Things” or connected device product may require new business models, new business processes, new business partners, new software applications, and require the collection and management of entirely new types of data.  Simply standing up a new ERP system or moving to a cloud application will not help your organization to deal with the new business models and data complexity.

Architect’s Call to Action

Now is the time (good New Year’s resolution!) to get proactive on your digital strategy.  Your CIO is most likely deeply engaged with her business counterparts to define a digital strategy for the organization. Be proactive in recommending the IT architecture that will enable them to deliver on that strategy – and a roadmap to get to the future-state architecture.

Key Requirements for a Digital-ready Architecture

Digital strategy and products are all about data, so I am going to be very data-focused here.  Here are some of the key requirements:

  • First, it must be designed for speed.  How fast? Your architecture has to enable IT to move at the speed of business, whatever that requires.  Consider the speed at which companies like Google, Amazon and Facebook are making IT changes.
  • It has to explicitly link the business strategy to the underlying business models, processes, systems, and technology.
  • Data from any new source, inside or outside your organization, has to be on-boarded quickly and in a way that it is immediately discoverable and available to all IT and business users.
  • Ongoing data quality management and Data Governance must be built into the architecture.  Point product solutions cannot solve these problems.  It has to be pervasive.
  • Data security also has to be pervasive for the same reasons.
  • It must include business self-service.  That is the only way that IT is going to be able to meet the needs of business users and scale to the demands of the changes required by digital strategy.

Resources:

For a webinar on connecting business strategy to the architecture of business transformation, see Next-Gen Architecture: A “Business First” Approach for Agile Architecture, with John Schmidt of Informatica and Art Caston, founder of Proact.

For next-generation thinking on enterprise data architectures, see Think “Data First” to Drive Business Value.

For more on business self-service for data preparation, a free software download is also available.


The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

Once upon a time, there were 3 Information Architects working in the financial services industry, each at a different firm and with a different background, but all responsible for recommending the right technology solutions to help their firms comply with industry regulations, including the ongoing bank stress testing taking place across the globe.  Since 2008, bank regulators have been focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew up through the ranks, starting as a Database Administrator with a black belt in SQL and COBOL programming. Hand coding was in their DNA for many years and was thought of as the best approach, given how customized their business and systems were compared with other organizations. As such, Architect #1 and their team went down the path of building their data management capabilities through custom hand-coded scripts, manual data extractions and transformations, and dealing with data quality issues through the business organizations after the data was delivered.   Though their approach and decisions delivered on their short-term needs, the firm soon realized the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a “gadget guy” at heart who grew up using off-the-shelf tools rather than hand coding for managing data. He and his team decided not to hand code their data management processes and instead built their solution by adopting best-of-breed tools, some of which were open source, others from existing solutions the company had from previous projects for data integration, data quality, and metadata management.  Though their tools helped automate much of the “heavy lifting,” he and his IT team were still responsible for integrating these point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers; however, he understood the value of building something once to use across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools to do the work, he decided to adopt an integrated platform designed to handle the complexities, sources, and volumes of data required by the business.  The platform also incorporated shared metadata, reusable data transformation rules and mappings, and a single source of required master and reference data, and provided agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a “no brainer.”

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; rather, he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn’t to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the result of their stress test, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported as part of the audit. Unfortunately, due to the first architect’s recommendation to hand code their data management processes, IT failed to provide explanations and documentation of what they did; the developers who created their systems were no longer with the firm. As a result, the bank failed miserably, earning stiff penalties and higher audit costs.

Architect #2’s bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident that they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools that were adopted were never designed nor guaranteed by the vendors to work with each other, resulting in invalid data mapping and data quality rules and gaps within their technical metadata documentation. As a result, Bank #2 also failed their audit and found themselves with a ton of one-off tools that helped automate their data management processes but lacked the integration and sharing of rules and metadata needed to satisfy the regulator’s demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3’s firm. Having seen the results at the first two banks, Mr. Wolf was leery of their ability to pass the stress test audit. He presented similar demands; however, this time Bank #3 provided detailed and comprehensive metadata documentation of their risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse their data, and detailed information on each counterparty and legal entity used to calculate VaR.  Unable to find gaps in their audit, Mr. Wolf, expecting to “blow” the house down, instead delivered a passing grade for Bank #3 and their management team, thanks to the right investments they had made to support their enterprise risk data management needs.

The moral of this story, similar to the familiar one involving the three little pigs, is the importance of having a solid foundation to weather market and regulatory storms or the violent bellow of a big bad wolf: a foundation that covers the required data integration, data quality, master data management, and metadata management needs but also supports collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today’s financial services industry requires firms to have a solid data management platform, one that is intelligent, comprehensive, and allows Information Architects to help mitigate the risks and costs of hand coding or using point tools that only get by in the short term.

Are you prepared to meet Mr. Wolf?
