Category Archives: Data Integration Platform

Ready for Internet of Things?

Data has always played a key role in informing decisions, both machine-generated and intuitive.  In the past, much of this data came from transactional databases as well as unstructured sources, such as emails and flat files.  Mobile devices appeared next on the map.  We found ourselves using such devices not just to make calls but also to send messages, take pictures, and update status on social media sites.  As a result, new sets of data were created from user engagements and interactions.  Such data started to tell a story by connecting the dots across different locations and stages of user connection.  The “Internet of Things,” or IoT, is the latest technology to enter the scene, and it could transform how we view and use data on a massive scale.

Another buzzword? 

Does IoT present a significant opportunity for companies to transform their business processes?  The Internet of Things adds an important awareness veneer when it comes to data.  It could bring data into focus early by connecting every stage of data creation in a business process, and it could remove the lag between consuming data and making decisions based on it.  Data generated at every stage in a business process could reveal an interesting trend or pattern and, better yet, tell a connected story.  The result could be predictive maintenance of the equipment involved in a process, further reducing cost.  New product innovations would come from leveraging the connectedness of data generated at each step in a business process.  We would soon begin to understand not only where data is being used and how, but also the intent and context behind that usage.  Organizations could then connect with their customers one on one like never before, whether to promote a product or offer a promotion that is both time and place sensitive.  New opportunities to tailor products and services for customers on an individual basis would create new growth areas for businesses.  The Internet of Things could make this possible by bringing together previously isolated sets of data.

Proof-points

A recent Economist report, “The Virtuous Circle of Data: Engaging Employees in Data and Transforming Your Business,” suggests that 68% of data-driven businesses outperform their competitors when it comes to profitability, and that 78% of those businesses foster a better culture of creativity and innovation.  The report goes on to suggest that three areas are critical for building a data-driven business, including data supported by devices: 1) Technology & Tools, 2) Talent & Expertise, and 3) Culture & Leadership.  By 2020, it’s projected that there will be 50 billion connected devices, seven times more than human beings on the planet.  It is imperative for an organization to have a support structure in place for device-generated data and a strategy to connect it with broader enterprise-wide data initiatives.

A comprehensive Internet of Things strategy would leverage the speed and context of data to the advantage of business process owners.  Timely access to device-generated data can open up channels of communication to end customers in a personalized way at the moment of their readiness.  It’s not enough anymore to know what customers may want or what they asked for in the past; the goal is anticipating what they might want by connecting the dots across different stages.  IoT-generated data can help bridge this gap.

How to Manage IoT Generated Data

More data places more pressure on both quality and security, the key building blocks of trust in one’s data.  Trust, ideally, is truth over time.  Consistency in data quality and availability is going to be a key requirement for any organization that wants to introduce new products or differentiated services quickly.  Informatica’s Intelligent Data Platform, or IDP, brings together the industry’s most comprehensive data management capabilities to help organizations manage all data, including device-generated data, both in the cloud and on premise.  The IDP also enables automated discovery of sensitive data, so that data reaches users in the context where it’s needed.

Cool IoT Applications

A number of companies around the world are working on interesting applications of Internet of Things technology.  Smappee from Belgium has launched an energy monitor that can itemize electricity usage and control a household full of devices by clamping a sensor around the main power cable. This single device can recognize the individual signature produced by each household device and lets consumers switch off any device, such as an oven, remotely via smartphone.  JIBO is an IoT device touted as the world’s first family robot; it automatically uploads data on all interactions to the cloud.  Start-ups such as Roost and Range OI can retrofit older devices with Internet of Things capabilities.  One of the most useful IoT applications can be found in Jins Meme glasses and sunglasses from Japan.  They embed wearable sensors, shaped much like Bluetooth headsets, that detect drowsiness in the wearer: they observe eye movement and blinking frequency to identify tiredness or bad posture, and communicate via iOS and Android smartphone apps.  Finally, Mellow is a new kind of kitchen robot that makes life easier by cooking ingredients to perfection while someone is away from home. Mellow is a sous-vide machine that takes orders through your smartphone and keeps food cold until it’s exactly time to start cooking.

Closing Comments

Each of the applications mentioned above deals with volumes of data, both in real time and in stored form.  Such data needs to be properly validated, cleansed, and made available at the moment of user engagement.  In addition to Informatica’s Intelligent Data Platform, the newly introduced Informatica Rev product can truly connect data coming from all sources, including IoT devices, and make it available to everyone.  What opportunity does IoT present to your organization?  Where are the biggest opportunities to disrupt the status quo?

Magical Data from the Internet of Things? Think again…

I recently read an opinion piece written in an insurance publication online. The author postulated, among other things, that the Internet of Things would magically deliver great data to an insurer. Yes, it was a statement just that glib. Almost as if there is some fantastic device that you just plug into the wall and out streams a flow of unicorns and rainbows. And furthermore that those unicorns and rainbows will subsequently give a magical boost to your business. But hey, you plugged in that fantastic device, so bring on the magic.

Now, let’s come back from the land of fairytales and ground ourselves in reality. Data is important, no doubt about that. Today, financial services firms are able to access data from so many new data sources. One of those new and fancy data sources is the myriad of devices in this thing we call the Internet of Things.

Ever have one of those frustrating days with your smartphone? Dropped calls, slow Internet, Facebook won’t locate you? Well, other devices experience the same wonkiness. Even the most robust devices, the kind found on commercial aircraft or military equipment, are not lossless in data transmission. And that’s where we are with the Internet of Things: all great devices that serve a number of purposes, but still fallible in communicating with the “mother ship”.

A telematics device in a consumer vehicle can transmit VIN, speed, latitude/longitude, time, and other vehicle statuses for use in auto insurance. As with other devices on a network, some of these data elements will not come through reliably. That means that in order to reconstruct or smooth the dataset, interpolations need to be made and/or entire entries deleted as useless. That is the first issue. Second, simply receiving this isolated dataset does not make sense of it. The data needs to be moved, cleansed, and then correlated to other pieces of the puzzle, which eventually turn into a policyholder, an account holder, a client or a risk. Finally, that enhanced data can be used for further analytics: it can be archived, aggregated, warehoused, and secured for additional analysis. None of these activities happen magically, and the sheer volume of integration points and data requires a robust, standardized data management infrastructure.
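To make that smoothing step concrete, here is a minimal Python sketch of interpolating dropped readings and deleting useless entries. The field names (`vin`, `ts`, `speed`) are purely illustrative, not from any real telematics feed, and a production pipeline would do far more than this.

```python
# Hypothetical sketch: smoothing a stream of telematics readings.
# Field names (vin, ts, speed) are illustrative only.

def clean_readings(readings):
    """Drop unusable entries and interpolate missing speeds."""
    # Keep only entries that can be tied to a vehicle and a timestamp.
    usable = [r for r in readings if r.get("vin") and r.get("ts") is not None]
    usable.sort(key=lambda r: r["ts"])

    # Linearly interpolate a missing speed from its neighbors.
    for i, r in enumerate(usable):
        if r.get("speed") is None and 0 < i < len(usable) - 1:
            prev, nxt = usable[i - 1], usable[i + 1]
            if prev.get("speed") is not None and nxt.get("speed") is not None:
                r["speed"] = (prev["speed"] + nxt["speed"]) / 2

    # Entries that still lack a speed are deleted as useless.
    return [r for r in usable if r.get("speed") is not None]

raw = [
    {"vin": "V1", "ts": 0, "speed": 30.0},
    {"vin": "V1", "ts": 1, "speed": None},   # dropped packet
    {"vin": "V1", "ts": 2, "speed": 50.0},
    {"vin": None, "ts": 3, "speed": 40.0},   # unusable: no VIN
]
print(clean_readings(raw))
```

Only after this kind of cleansing can the readings be correlated to a policyholder and used for analytics.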

So no, just having an open channel to the stream of noise from your local Internet of Things will not magically deliver you great data. Great data comes from market leading data management solutions from Informatica. So whether you are an insurance company, financial services firm or data provider, being “Insurance Ready” means having great data; ready to use; everywhere…from Informatica.

The Synergies of SaaS and Data Integration

Loraine Lawson provides some great insights into the world of data integration, and this article is no exception.  The topic struck me because we’ve wrestled with SaaS integration issues for about 10 years now.

Back in 2004, we saw the rapid growth of SaaS providers such as Salesforce.com.  However, there was typically no consistent data integration strategy to go along with the use of SaaS.  In many instances, SaaS-delivered applications became the new data silos in the enterprise, silos that lacked a sound integration plan and integration technology.

Ten years later, we’ve gotten to a point where we have the ability to solve business problems using SaaS, as well as the data integration problems that arise around its use.  However, we typically lack the knowledge and understanding of how to effectively apply data integration technology within an enterprise to SaaS problem domains.

Lawson looks at both sides of the SaaS integration argument.  “Surveys certainly show that integration is less of a concern for SaaS than in the early days, when nearly 88 percent of SaaS companies said integration concerns would slow down adoption and more than 88 percent said it’s an important or extremely important factor in winning new customers.”

Again, while we’ve certainly gotten better at integration, we’re nowhere near being out of the woods.  “A Dimensional Research survey of 350 IT executives showed that 67 percent cited data integration problems as a challenge with SaaS business applications. And as with traditional systems, integration can add hidden costs to your project if you ignore it.”

As I’ve stated many times in this blog, integration requires a bit of planning and the use of solid technology.  While this does require some extra effort and money, the return on the value of this work is huge.

SaaS integration requires that you take a bit of a different approach than traditional enterprise integration.  SaaS systems typically place your data behind well-defined APIs that can be accessed directly or through a data integration technology.  While the information can be consumed by anything that can invoke an API, enterprises still have to deal with structure and content differences, and that’s typically best handled using the right data integration technology.
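As a rough illustration of those structure and content differences, here is a minimal Python sketch of mapping a SaaS API record into an internal schema. The field names on both sides (`AccountName`, `customer_name`, and so on) are hypothetical, and a real project would lean on data integration technology rather than hand-rolled code like this.

```python
# Hypothetical sketch: mapping a SaaS API record into an internal schema.
# Field names on both sides are illustrative only.

FIELD_MAP = {
    "AccountName": "customer_name",
    "BillingCity": "city",
    "AnnualRevenue": "revenue",
}

def to_internal(saas_record):
    """Rename fields (structure) and normalize values (content)."""
    internal = {dst: saas_record.get(src) for src, dst in FIELD_MAP.items()}
    # Content-level fix-ups: trim whitespace, coerce revenue to a number.
    if isinstance(internal["customer_name"], str):
        internal["customer_name"] = internal["customer_name"].strip()
    if internal["revenue"] is not None:
        internal["revenue"] = float(internal["revenue"])
    return internal

record = {"AccountName": "  Acme Corp ", "BillingCity": "Boston",
          "AnnualRevenue": "1200000"}
print(to_internal(record))
```

Multiply this by dozens of SaaS applications and hundreds of fields, and the case for a proper integration platform over one-off scripts becomes clear.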

Other things to consider, and again often overlooked, are the need for both data governance and data security around your SaaS integration solution.  There should be a centralized control mechanism to support the proper management and security of the data, as well as a mechanism to deal with the data quality issues that often emerge when consuming data from any cloud computing service.

The reality is that SaaS is here to stay.  Even enterprise software players that put off the move to SaaS-delivered systems are now standing up SaaS offerings.  The economics around the use of SaaS are just way too compelling.  However, as SaaS-delivered systems become more commonplace, so will the emergence of new silos.  This will not be an issue if you leverage the right SaaS integration approach and technology.  What will your approach be?

What Do Your Insured Members Look Like?

As I was changing my flight this morning, I needed to make sure that my checked bag made the change as well (because who wants to get to a meeting with nothing to wear but yoga pants and t-shirts?). During the conversation with the gate agent, I was told that the reservation system accessed through a phone call is separate from the flight system that the desk agent had access to. As a result, the airport baggage folks had no idea that my flight had changed. The entire time I was working with the gate agent, I kept thinking that they needed a complete view of me as a customer. *I* don’t care that the systems aren’t integrated and sharing information; I only want my bag to make it to where I’m going.

The same applies to insurers. Your members don’t care that you don’t have access to their enrollment information when they call about a claim. In order to provide better service, you need a complete, 360-degree view of your members. If you can get that complete view while a member is talking to you on the phone, you can give them better customer service. You want to focus on your members’ experience, which includes strengthening member relationships and fostering high levels of satisfaction to gain members’ trust and ease their concerns.

In many insurance companies, getting a complete picture of each insured member is cumbersome, with one system for enrolling members, another for member benefit administration, and a third for claims processing. These may be cumbersome legacy systems designed for an employer-focused market, modified over the years to accommodate changing market needs and government regulations. You may be able to access information from each of these systems through batch file transfer, reporting against the various systems, or having a customer service representative interact with each system separately.
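As a toy illustration of what stitching those systems together looks like, here is a minimal Python sketch that assembles a single member view keyed on a shared member ID. The system names, fields, and IDs are all hypothetical; in practice this joining, along with the matching of IDs across systems, is exactly what a data integration platform does at scale.

```python
# Hypothetical sketch: building a 360-degree member view from three
# separate systems, keyed on a shared member ID. All names are illustrative.

enrollment = {"M100": {"name": "Pat Jones", "plan": "Gold"}}
benefits   = {"M100": {"deductible": 500}}
claims     = {"M100": [{"claim_id": "C1", "amount": 120.0}]}

def member_360(member_id):
    """Combine per-system records into one view for a service rep."""
    view = {"member_id": member_id}
    view.update(enrollment.get(member_id, {}))   # who they are, what plan
    view.update(benefits.get(member_id, {}))     # their benefit details
    view["claims"] = claims.get(member_id, [])   # their claim history
    return view

print(member_360("M100"))
```

A representative looking at this one combined record can answer enrollment, benefit, and claims questions in a single call.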

In order to be competitive in today’s marketplace, with the focus shifting from employer-provided insurance to the individual, you need to provide your members with the best possible service.

Imagine the confidence I would have in the airline that could easily change my flight and re-route my baggage and interact with me exactly the same whether I am speaking to someone on the phone or standing in front of a gate agent. Imagine how much better my customer satisfaction ratings would be as a result.

What do your insured members look like?

CES, Digital Strategy and Architecture: Are You Ready?

CES, the International Consumer Electronics Show, is wrapping up this week, and the array of new connected products and technologies was truly impressive. “The Internet of Things” is moving from buzzword to reality.  Some of the major trends seen this week included:

  • Home Hubs from Google, Samsung, and Apple (who did not attend the show but still had a significant impact).
  • Home Hub Ecosystems providing interoperability with cars, door locks, and household appliances.
  • Autonomous and intelligent cars.
  • Wearable devices such as smart watches and jewelry.
  • Drones that take pictures and intelligently avoid obstacles.  …Including people trying to block them.  There is a bit of a creepy factor here!
  • The next generation of 3D printers.
  • And the intelligent baby pacifier.  The idea is that it takes the baby’s temperature, but I think the sleeper hit feature on this product is the ability to locate it using GPS and a smart phone. How much money would you pay to get your kid to go to sleep when it is time to do so?

Digital Strategies Are Gaining Momentum

There is no escaping the fact that the vast majority of companies out there have active digital strategies, and not just in the consumer space. The question is: Are you going to be the disruptor or the disruptee?  Gartner offered an interesting prediction here:

“By 2017, 60% of global enterprise organizations will execute on at least one revolutionary and currently unimaginable business transformation effort.”

It is clear from looking at CES that a lot of these products are “experiments” that will ultimately fail.  But focusing too much on that fact risks overlooking the profound changes taking place that will shake out industries and allow competitors to jump previously impassable barriers to entry.

IDC predicted that the Internet of Things market would be worth over $7 trillion by the year 2020.  We can all argue about the exact number, but something major is clearly happening here.  …And it’s big.

Is Your Organization Ready?

A study by Gartner found that 52% of CEOs and executives say they have a digital strategy.  The problem is that 80% of them also say they will “need adaptation and learning to be effective in the new world.”  Supporting a new Internet of Things or connected-device product may require new business models, new business processes, new business partners, and new software applications, along with the collection and management of entirely new types of data.  Simply standing up a new ERP system or moving to a cloud application will not help your organization deal with the new business models and data complexity.

Architect’s Call to Action

Now is the time (a good New Year’s resolution!) to get proactive on your digital strategy.  Your CIO is most likely deeply engaged with her business counterparts to define a digital strategy for the organization, so be proactive in recommending the IT architecture that will enable them to deliver on that strategy, along with a roadmap to get to that future-state architecture.

Key Requirements for a Digital-ready Architecture

Digital strategy and products are all about data, so I am going to be very data-focused here.  Here are some of the key requirements:

  • First, it must be designed for speed.  How fast? Your architecture has to enable IT to move at the speed of business, whatever that requires.  Consider the speed at which companies like Google, Amazon and Facebook are making IT changes.
  • It has to explicitly link the business strategy to the underlying business models, processes, systems, and technology.
  • Data from any new source, inside or outside your organization, has to be on-boarded quickly and in a way that it is immediately discoverable and available to all IT and business users.
  • Ongoing data quality management and data governance must be built into the architecture.  Point solutions cannot solve these problems; the capability has to be pervasive.
  • Data security also has to be pervasive for the same reasons.
  • It must include business self-service.  That is the only way that IT is going to be able to meet the needs of business users and scale to the demands of the changes required by digital strategy.

Resources:

For a webinar on connecting business strategy to the architecture of business transformation, see Next-Gen Architecture: A “Business First” Approach for Agile Architecture, with John Schmidt of Informatica and Art Caston, founder of Proact.

For next-generation thinking on enterprise data architectures, see Think “Data First” to Drive Business Value.

For more on business self-service for data preparation, a free software download is available.

The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

Once upon a time, there were three Information Architects working in the financial services industry, each with a different firm and background but all responsible for recommending the right technology solutions to help their firms comply with industry regulations, including ongoing bank stress testing across the globe.  Since 2008, bank regulators have been focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew up through the ranks, starting as a Database Administrator, a black belt in SQL and COBOL programming. Hand coding was in his DNA for many years and was thought of as the best approach, given how customized his firm’s business and systems were compared to other organizations. As such, Architect #1 and his team went down the path of building their data management capabilities through custom hand-coded scripts and manual data extractions and transformations, leaving data quality issues to the business organizations after the data was delivered.  Though this approach delivered on their short-term needs, the firm came to realize the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a “gadget guy” at heart who grew up using off-the-shelf tools rather than hand coding to manage data. He and his team decided not to hand code their data management processes, instead adopting and building their solution from best-of-breed tools, some of them open source, others from existing solutions the company had from previous projects for data integration, data quality, and metadata management.  Though the tools helped automate much of the “heavy lifting,” he and his IT team were still responsible for integrating these point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers; however, he understood the value of building something once and using it across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools, he decided to adopt an integrated platform designed to handle the complexities, sources, and volumes of data required by the business.  The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a “no brainer.”

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn’t to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the result of their stress test, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements reported in the audit. Unfortunately, due to the first architect’s decision to hand code their data management processes, IT failed to provide explanations and documentation of what had been done; the developers who created the systems were no longer with the firm. As a result, the bank failed miserably, suffering stiff penalties and higher audit costs.

Architect #2’s bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident that they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools that had been adopted were never designed or guaranteed by their vendors to work with each other, resulting in invalid data mappings, broken data quality rules, and gaps in the technical metadata documentation. As a result, bank #2 also failed its audit and found itself with a ton of one-off tools that helped automate its data management processes but lacked the integration and shared rules and metadata needed to satisfy the regulator’s demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3’s firm. Having seen the results at the first two banks, Mr. Wolf was leery of their ability to pass their stress test audit. He presented similar demands; this time, however, Bank #3 provided detailed and comprehensive metadata documentation of their risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse the data, and detailed information on each counterparty and legal entity used to calculate VaR.  Unable to find gaps in the audit, Mr. Wolf, expecting to “blow” the house down, instead delivered a passing grade to Bank #3 and its management team, thanks to the right investments made to support their enterprise risk data management needs.

The moral of this story, like that of the familiar tale of the three little pigs, is the importance of having a solid foundation to weather market and regulatory storms, or the violent bellow of a big bad wolf: a foundation that covers the required data integration, data quality, master data management, and metadata management needs, but that also supports collaboration and visibility into how data is produced, used, and performing across the business.  Ensuring current and future compliance in today’s financial services industry requires firms to have a solid data management platform, one that is intelligent and comprehensive and that allows Information Architects to mitigate the risks and costs of hand coding or using point tools that get by only in the short term.

Are you prepared to meet Mr. Wolf?

2015 – The Year of Data Integration?

I love the data integration coverage by Loraine Lawson in IT Business Edge, especially her December 12th posting that focuses on the trends that will emerge in 2015.  The best quote from the post is: “Oddly, organizations still tended to focus on point-solutions in the cloud. As more infrastructure and data moves to the cloud, they’re experiencing similar pain points and relearning old lessons.”

The article cites some research from Ovum predicting that many enterprises will begin moving toward data integration, driven largely by the rise of cloud computing and big data.  However, enterprises need to invest both in modernizing the existing data management infrastructure and in data integration technology.  “All of these new investments will push the middleware software market up 9 percent to a $16.3 billion industry, Information Management reports.”  This projection is for 2015.

I suspect that’s a bit conservative.  In my travels, I see much more interest in data integration strategies, approaches, and technology as cloud computing continues to grow and as enterprises better understand the strategic use of data.  So, I would put the growth at 15 percent for 2015.

There are many factors driving this growth, beyond mere interest in cloud computing and big data.

The first consideration is that data is more strategic than initially understood.  While businesses have always considered data a huge asset, only in the last few years have they seen the true value of understanding what’s going on inside, and outside, their business.

Manufacturing companies want to see the current state of production, as well as production history.  Management can now use that data to predict trends to address, such as future issues around employee productivity, or even a piece of equipment that is likely to fail and the impact of that failure on revenue.  Healthcare companies are learning how to better monitor patient health, such as spotting likely health problems before they are diagnosed, or leveraging large data to understand when patterns emerge around health issues, such as areas of the country that are more prone to asthma, based upon air quality.

Second, there is the need to deal with compliance issues.  The new health care regulations, and even the new regulations around managing a publicly traded company, raise a great number of data management issues, including data integration.

As these laws emerge, and are altered over time, the reporting requirements are always more complex and far reaching than they were before.  Those who want to avoid fines, or even avoid stock drops around mistakes, are paying close attention to this area.

Finally, there is an expectation from customers and employees that you will have a good handle on your data.  10 years ago you could tell a customer on the phone that you needed to check different systems to answer their question.  Those days are over.  Today’s customers and employees want immediate access to the data they need, and there is no good excuse for not being able to produce that data.  If you can’t, your competition will.

The interest in data integration will experience solid growth in 2015, around cloud and big data, for sure.  However, other factors will drive this growth, and enterprises will finally understand that data integration is core to an IT strategy, and should never be an afterthought.

Great Data Puts You In Driver Seat: The Next Step in The Digital Revolution

The industrial revolution began in the mid-to-late eighteenth century, introducing machines to cut costs and speed up manufacturing processes. Steam engines forever changed efficiency in iron making, textiles, and chemicals production, among many others. Transportation improved significantly, and the standard of living for the masses saw significant, sustained growth.

In the last 50-60 years, we have witnessed another revolution, through the invention of computing machines and the Internet: a digital revolution.  It has transformed every industry and allowed us to operate at far greater scale, processing more transactions in more locations than ever before.  New cities emerged on the map, migrations of knowledge workers throughout the world followed, and the standard of living increased again.  And digitally available information transformed how we run businesses, cities, and countries.

Forces Shaping Digital Revolution

Over the last 5-6 years, we’ve witnessed a massive increase in the volume and variety of this information.  The leading forces that contributed to this increase are:

  • A next generation of software technology that connects data faster from any source
  • Minimal hardware cost to process and store huge amounts of data (Moore’s Law)
  • A sharp increase in the number of machines and devices generating data online
  • Massive worldwide growth in the number of people connecting online and sharing information
  • High-speed Internet connectivity that is now free in many public places

As a result, our engagement with the digital world is rising – for both personal and business purposes.  Increasingly, we play games, shop, sign digital contracts, make product recommendations, respond to customer complaints, share patient data, and make real-time pricing changes to in-store products – all from a mobile device or laptop.  We do so increasingly in a collaborative way, in real time, and in a very personalized fashion.  Big Data, Social, Cloud, and the Internet of Things are the key topics dominating our conversations and thoughts around data these days.  They are altering how we engage with, and what we expect from, each other.

Whether this is the emergence of a new revolution or the next phase of the digital one, it rests on the democratization and ubiquity of information, creating new ways of interacting with customers and dramatically speeding up time to market.  Businesses will build new products and services and create new business models by exploiting this vast new resource of information.

The Quest for Great Data

But there is work to do before one can unleash the true potential captured in data.  Data is no longer a mere by-product or transaction record, nor does it have an expiration date.  Data now flows like a river, fueling applications, business processes, and human or machine activities.  New data gets created along the way and augments our understanding of what the data means.  It is no longer good enough to have good data in isolated projects; rather, great data needs to become accessible to everyone and everything at a moment’s notice.  This rich set of data needs to connect efficiently to information that is already present, and learn from it.  Such data needs to automatically rid itself of inaccurate and incomplete information.

Clean, safe, and connected – this data is now ready to find us even before we discover it.  It understands the context in which we will use it and the key decisions that will follow.  In the process, this data learns about our usage, preferences, and results – what works versus what doesn’t.  New data is created that captures this inherent understanding, or intelligence, and flows back to the appropriate business applications or machines for future use after fine-tuning.  Such data can then tell a story about human or machine actions and results.  It can become a coach, a mentor, a friend of sorts to guide us through critical decision points.  This is what we would like to call great data.  To truly capitalize on the next step of the digital revolution, we will need this great data pervasively, powering our decisions and thinking.

Impacting Every Industry

By 2020, there will be an estimated 50 billion connected devices – seven times more than human beings on the planet.  This explosion of devices will generate really big data that will be processed and stored, increasingly in the cloud.  More than sheer size, this complexity will require a new way of addressing business process efficiency, one that delivers agility, simplicity, and capacity.  The impact of such a transformation will spread across many industries.  A McKinsey article, “The Future of Global Payments”, focuses on the digital transformation of payment systems in the banking industry and the resulting ubiquity of data.  One of the key challenges for banks will be to shift from their traditional heavy reliance on siloed and proprietary data to a more open approach that encompasses a broader view of customers.

Industry executives, front-line managers, and back-office workers are all struggling to make sense of the data that’s available.

Closing Thoughts on Great Data

The 2014 PwC Global CEO Survey showed that 81% of CEOs ranked technology advances as the #1 factor that will transform their businesses over the next 5 years.  More data, by itself, isn’t enough for this transformation.  A robust data management approach integrating machine and human data, from all sources and updated in real time, across on-premise and cloud-based systems must be put in place to accomplish this mission.  Such an approach will nurture great data.  This end-to-end data management platform will provide data guidance and curate one of an organization’s most valuable assets: its information.  Only by making sense of what we have at our disposal will we unleash the true potential of the information that we possess.  The next step in the digital revolution will be about organizations of all sizes being fueled by great data to unleash their untapped potential.

When Data Integration Saves Lives

In an article published in Health Informatics, author Gabriel Perna claims that data integration could save lives as we learn more about illnesses and their causal relationships.

According to the article, in Hamilton County, Ohio, it’s not unusual to see kids from the same neighborhoods coming to the hospital for asthma attacks.  Researchers wanted to know whether it was fact or mistaken perception that an unusually high number of children in the same neighborhood were experiencing asthma attacks.  The next step was to review existing data to determine the extent of the issue, and perhaps how to solve the problem altogether.

“The researchers studied 4,355 children between the ages of 1 and 16 who visited the emergency department or were hospitalized for asthma at Cincinnati Children’s between January 2009 and December 2012. They tracked those kids for 12 months to see if they returned to the ED or were readmitted for asthma.”

Not only were the researchers able to determine a sound correlation between the two data sets, they were also able to advance the research to predict which kids were at high risk based on where they lived.  Thus, some of the causes and effects have been determined.

This came about when researchers began thinking outside the box about traditional and non-traditional medical data.  In this case, they integrated housing and census data with the data from the diagnosis and treatment of the patients.  These are data sets unlikely to find their way to each other, but together they carry a meaning that is much more valuable than if they had stayed in their respective silos.
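To make the integration step concrete, here is a minimal sketch of joining patient visit records to census and housing data on a shared geographic key, then looking at readmission rates by area.  All names, columns, and values below are invented for illustration; they are not the actual Cincinnati Children’s data.

```python
import pandas as pd

# Hypothetical patient visit records (illustrative only)
visits = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "diagnosis": ["asthma"] * 4,
    "readmitted_within_12mo": [True, False, True, False],
    "census_tract": ["A", "A", "B", "C"],
})

# Hypothetical housing/census data keyed by the same geographic unit
tracts = pd.DataFrame({
    "census_tract": ["A", "B", "C"],
    "median_income": [28000, 41000, 67000],
    "pct_substandard_housing": [0.31, 0.12, 0.04],
})

# Integrate the two previously siloed data sets on a shared key
combined = visits.merge(tracts, on="census_tract", how="left")

# Readmission rate by tract: the kind of correlation the researchers explored
rate_by_tract = combined.groupby("census_tract")["readmitted_within_12mo"].mean()
print(rate_by_tract)
```

The point is not the two dozen lines of code, but the shared key: once both data sets carry a common geographic identifier, previously unrelated silos can be analyzed together.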

“Non-traditional medical data integration has begun to take place in some medical collaborative environments already. The New York-Presbyterian Regional Health Collaborative created a medical village, which ‘goes beyond the established patient-centered medical home mode.’ It not only connects an academic medical center with a large ambulatory network, medical homes, and other providers with each other, but community resources such as school-based clinics and specialty-care centers (the ones that are a part of NYP’s network).”

The fact of the matter is that data is the key to understanding what is going on when clusters of sick people begin to emerge.  While researchers and doctors can treat the individual patients, there is often no good understanding of the larger issues that may be at play – in this case, poor air quality in poor neighborhoods.  With that understanding, they know what problem needs to be corrected.

The universal sharing of data is really the larger solution here, but one that won’t happen without a common understanding of its value, and without funding.  As we pass laws around the administration of health care, as well as around how data is to be handled, perhaps it’s time we look at what the data actually means.  This requires a massive deployment of data integration technology, and a fundamental push to share data with a central data repository as well as with health care providers.

Getting to Your Future State Enterprise Data Architecture

Just exactly how do you move from a “Just a Bunch of Data” (JBOD) architecture to a coherent enterprise data architecture?

The white paper “The Great Rethink: Building a Highly Responsive and Evolving Data Integration Architecture”, by Claudia Imhoff and Joe McKendrick, provides an interesting view of what such an architecture might look like.  It describes how to move from ad hoc data integration to an enterprise data architecture, and lays out an approach for building architectural maturity toward a next-generation enterprise data architecture that helps organizations be more competitive.

Organizations that look to compete based on their data are searching for ways to design an architecture that:

  • On-boards new data quickly
  • Delivers clean and trustworthy data
  • Delivers data at the speed the business requires
  • Ensures that data is handled in a secure way
  • Is flexible enough to incorporate new data types and new technologies
  • Enables end-user self-service
  • Accelerates the delivery of business value for the organization
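As a toy illustration of the “clean and trustworthy data” requirement, the sketch below shows a minimal validation gate that incoming records might pass before being on-boarded.  The record type, fields, and rules are all invented for illustration; they are not from the white paper or any particular product.

```python
from dataclasses import dataclass

@dataclass
class CustomerRecord:
    customer_id: str
    email: str
    country: str

def validate(record: CustomerRecord) -> list:
    """Return a list of data-quality issues; an empty list means the record is clean."""
    issues = []
    if not record.customer_id:
        issues.append("missing customer_id")
    if "@" not in record.email:
        issues.append("malformed email")
    if len(record.country) != 2:
        issues.append("country must be an ISO 3166-1 alpha-2 code")
    return issues

records = [
    CustomerRecord("C001", "ana@example.com", "US"),
    CustomerRecord("", "not-an-email", "USA"),
]

# Clean records flow on; problem records are quarantined with their issues attached
clean = [r for r in records if not validate(r)]
quarantined = [(r, validate(r)) for r in records if validate(r)]
print(len(clean), "clean,", len(quarantined), "quarantined")
```

The design point is that validation happens at on-boarding time, so downstream consumers only ever see records that passed the gate, rather than each consumer re-checking the same data.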

In my previous blog, Digital Strategy and Architecture, we discussed the demands that digital strategies are putting on enterprise data architecture in particular.  Add to that the additional stress from business initiatives such as:

  • Supporting new mobile applications
  • Moving IT applications to the cloud – which significantly increases data management complexity
  • Dealing with external data.  One recent study estimates that a full 25% of the data being managed by the average organization is external data.
  • Next-generation analytics and predictive analytics with Hadoop and NoSQL
  • Integrating analytics with applications
  • Event-driven architectures and projects
  • The list goes on…

The point here is that most people are unlikely to be funded to build an enterprise data architecture from scratch that can meet all these needs.  A pragmatic approach would be to build out your future state architecture in each new strategic business initiative that is implemented.  The real challenge of being an enterprise architect is ensuring that all of the new work does indeed add up to a coherent architecture as it gets implemented.

The “Great Rethink” white paper describes a practical approach to achieving an agile and responsive future state enterprise data architecture that will support your strategic business initiatives.  It also describes a high level data integration architecture and the building blocks to achieving that architecture.  This is highly recommended reading.

Also, you might recall that Informatica sponsored the Informatica Architect’s Challenge this year to design an enterprise-wide data architecture of the future.  The contest has closed and we have a winner; see the Informatica Architect Challenge site for details.
