Category Archives: Data Integration Platform

Data Integration with Devices is Easier than You Think

The concept of the "Internet of Things" (IoT) is about getting the devices we use in our daily lives, or devices used in industrial applications, to communicate with other devices or systems. This is not a new notion, but the bandwidth and connectivity mechanisms that make the IoT practical are recent developments.

My first job out of college was to figure out how to get devices that monitored and controlled an advanced cooling and heating system to communicate with a centralized, automated control center. We ended up building custom PCs for the application, running a version of Unix (DOS would not cut it). Mounted in industrial cases, the PCs communicated with the temperature and humidity sensors and turned fans and dampers on and off.

At the end of the day, we were solving a data integration problem, not an engineering problem. The devices had to talk to the PCs, and the PCs had to talk to a centralized system (a mainframe) that could receive the data and use it to determine what actions to take: for instance, determining that 78 degrees was too warm for a clean room, opening a damper and turning on a fan to reduce the temperature, and then turning them off when the temperature returned to normal.

Back in the day, we had to create and deploy custom drivers and software. These days, most devices have well-defined interfaces, or APIs, that developers and data integration tools can use to gather information from the device. We also have high-performing networks. Much like any source or target system, these devices produce data that is typically bound to a structure, and that data can be consumed and restructured to meet the needs of the target system.

For instance, data coming off a smart thermostat in your home may be in the following structure:

Device (char 10)
Date (char 8)
Temp (num 3)

You're able to access this device using an API (typically a REST-based web service), which returns a single chunk of data bound to that structure, such as:

Device (“9999999999”)
Date (“09162014”)
Temp (076)

Then you can transform the structure into something that's native to the target system that receives this data, as well as translate the data (e.g., converting the date from characters to numbers). This is where data integration technology makes money for you, given its ability to deal with the complexity of translating and transforming the information that comes off the device so it can be placed in a system or data store that's able to monitor, analyze and react to that data.
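
For instance, here is a minimal sketch in Python of that round trip, assuming a hypothetical REST endpoint and the field layout above (the URL and the clean-room threshold are illustrative, not a real vendor API):

import json
from datetime import datetime
from urllib.request import urlopen

# Hypothetical endpoint; a real thermostat vendor documents its own API.
DEVICE_URL = "https://api.example-thermostat.com/v1/readings/latest"

def fetch_reading():
    # Returns one record bound to the device's native structure, e.g.
    # {"Device": "9999999999", "Date": "09162014", "Temp": "076"}
    with urlopen(DEVICE_URL) as response:
        return json.load(response)

def transform(reading):
    # Restructure for the target system and translate the data: the
    # MMDDYYYY string becomes a real date, and the character-based
    # temperature becomes a number.
    return {
        "device_id": reading["Device"],
        "read_date": datetime.strptime(reading["Date"], "%m%d%Y").date(),
        "temp_f": int(reading["Temp"]),
    }

record = transform(fetch_reading())
if record["temp_f"] > 78:  # illustrative clean-room threshold
    print(record["device_id"], "too warm: open damper, start fan")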

This is really what the IoT is all about: the ability to have devices spin out data that is leveraged to make better use of those devices. The possibilities for what can be done with that data, and how we can better manage these devices, are endless. Data integration is key. Trust me, it's much easier to integrate with devices these days than it was back in the day.

Thank you for reading about Data Integration with Devices! Editor's note: For more information on Data Integration, consider downloading "Data Integration for Dummies."


Enterprise Architects as Strategists

The conversation at the Gartner Enterprise Architecture Summit last week was very interesting. The central theme for years had been the idea of closely linking enterprise architecture with business goals and strategy. This year, Gartner added another layer to that conversation: the firm is now actively promoting the idea of enterprise architects as strategists.

The reason why is simple.  The next wave of change is coming and it will significantly disrupt everybody.  Even worse, your new competitors may be coming from other industries.

Enterprise architects are in a position to take a leading role within the strategy process. This is because they are the people who best understand both business strategy and technology trends.

Some of the key ideas discussed included:

  • The boundaries between physical and digital products will blur
  • Every organization will need a technology strategy to survive
  • Gartner predicts that by 2017, 60% of the Global 1000 will execute on at least one revolutionary, and currently unimaginable, business transformation effort.
  • The change is being driven by trends such as mobile, social, the connectedness of everything, cloud/hybrid, software-defined everything, smart machines, and 3D printing.

Observations

I agree with all of this.  My view is that it is time for enterprise architects to think very differently about architecture. Enterprise applications will come and go; they are rapidly being commoditized in any case. Architects need to think like strategists, in terms of market differentiation, and nothing will differentiate an organization more than its data. Example: Google's autonomous cars. Google is jumping across industry boundaries to compete in a new market with data as its primary differentiator. There will be many others.

Thinking data-first

Years of thinking of architecture from an application-first or business-process-first perspective have left us with silos of data and the classic "spaghetti diagram" of data architecture. This is slowing down business initiative delivery precisely at the time organizations need to accelerate and make data their strategic weapon. It is time to think data-first when it comes to enterprise architecture.

You will be seeing more from Informatica on this subject over the coming weeks and months.

Take a minute to comment on this article. Your thoughts on how we should go about changing to a data-first perspective, both pro and con, are welcome.

Also, remember that Informatica is running a contest to design the data architecture of the year 2020. Full details are here:

http://www.informatica.com/us/architects-challenge/


A Data Integration Love-Fest in Vegas

Question: What do American Airlines, Liberty Mutual, Discount Tire and MD Anderson all have in common?

Is it?

a) They are all top in their field.

b) They all view data as critical to their business success.

c) They are all using Agile Data Integration to drive business agility.

d) They have spoken about their Data Integration strategy at Informatica World in Vegas.

Did you reply "all of the above"? If so, give yourself a Ding Ding Ding. Or shall we say Ka-Ching, in honor of our host city?

Indeed, data experts from these companies and many more flocked to Las Vegas for Informatica World. They shared their enthusiasm for the important role of data in their businesses, and these industry leaders discussed best practices that facilitate an Agile Data Integration process.

American Airlines recently completed a merger with US Airways, making it the largest airline in the world. To meet critical reporting requirements for the merged airlines, the enterprise data team undertook a huge Data Integration effort involving large-scale data migration from many legacy data sources, including the transfer of over 4TB of current history data for Day 1 reporting. A major task still remains: integrating multiple combined subject areas to give a full picture of combined reporting.

American Airlines architects recommend the use of Data Integration design patterns to improve agility. The architects shared the success factors for merger Data Integration: ownership by leadership from IT and the business, open and honest communication between teams, clearly identified integration teams and priorities, and an understanding of cultural differences, along with celebrating success. The team summed up the merger Data Integration lessons learned: metadata is key, IT and business collaboration is critical, and profiling and access to the data are helpful.

Liberty Mutual, the third-largest property and casualty insurer in the US, has grown through acquisitions, and its Data Integration team needs to support that growth. The team has been busy integrating five claims systems into one, a large-scale Data Integration challenge. To add to the complexity, the business requires that each phase be completed in one weekend, that no data be lost in the process, and that all finances balance out at the end of each merge. Integrating all claims in a single location was critical for smooth processing of insurance claims; a single system also reduces the cost and complexity of support and maintenance.

Liberty Mutual experts recommend a methodology of work preparation, profiling, delivery and validation.  Rinse and repeat. Additionally, the company chose to utilize a visual Data Integration tool. This tool was quick and easy for the team to learn and greatly enhanced development agility.
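
To give a flavor of what the profiling step produces (a generic sketch in Python, not Liberty Mutual's actual tooling), a first pass often just measures the fill rate and cardinality of every column, since those two numbers expose problem fields fastest:

def profile(rows):
    # rows: records from one claim system, as a list of dicts.
    columns = {key for row in rows for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        filled = [v for v in values if v not in (None, "")]
        report[col] = {
            "fill_rate": round(len(filled) / len(rows), 2),
            "distinct": len(set(filled)),
        }
    return report

claims = [
    {"claim_id": "C1", "state": "NY", "paid": "100.00"},
    {"claim_id": "C2", "state": "", "paid": "250.00"},
    {"claim_id": "C3", "state": "NY", "paid": None},
]
print(profile(claims))
# "state" fills only 67% of the time: a field to fix before the merge weekend.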

Discount Tire, the largest independent tire dealer in the USA, shared tips and tricks from migrating legacy data into a new SAP system. This complex project included data conversion from 50 legacy systems. The company needed to combine and aggregate data from many of them, including customer, sales, financial and supply chain systems. The integrated system helps Discount Tire make key business decisions and stay ahead in a highly competitive space.

Discount Tire has automated its data validation process in both development and production (see the sketch below). This reduces testing time, minimizes data defects and increases the agility of development and operations. The company has also implemented proactive monitoring for early detection and correction of data problems in production.
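
A hedged sketch of what such automated validation can look like, comparing a row count and a financial checksum between a legacy source and the new target (the table, columns and in-memory databases here are purely illustrative):

import sqlite3  # in-memory stand-in for the real legacy and SAP connections

def validate_table(src, tgt, table, amount_col):
    # Any source/target mismatch is returned as a readable failure.
    checks = {
        "row_count": f"SELECT COUNT(*) FROM {table}",
        "amount_sum": f"SELECT ROUND(SUM({amount_col}), 2) FROM {table}",
    }
    failures = []
    for name, sql in checks.items():
        src_val = src.execute(sql).fetchone()[0]
        tgt_val = tgt.execute(sql).fetchone()[0]
        if src_val != tgt_val:
            failures.append(f"{name}: source={src_val}, target={tgt_val}")
    return failures

src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn, rows in [(src, [(1, 19.99), (2, 45.50)]), (tgt, [(1, 19.99)])]:
    conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

print(validate_table(src, tgt, "sales", "amount"))
# ['row_count: source=2, target=1', 'amount_sum: source=65.49, target=19.99']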

MD Anderson Cancer Center is the No. 1 hospital for cancer care in the US according to U.S. News and World Report.  They are pursuing the lofty goal of erasing cancer from existence. Data Integration is playing an important role in this fight against cancer. In order to accomplish their goal, MD Anderson researchers rely on integration of vast amounts of genomic, clinical and pharmaceutical data to facilitate leading-edge cancer research.

MD Anderson experts pursue Agile Data Integration through close collaboration between IT and business stakeholders, which enables them to meet the data requirements of the business faster and better. They shared that data insights, delivered through metadata management, offer significant value to the organization. Finally, the experts at MD Anderson believe in "Map Once, Deploy Anywhere" as the way to accomplish Agile Data Integration.

So let's recap. Data Integration is helping:

- An airline continue to serve its customers and run its business smoothly post-merger;

- A tire retailer procure and provide tires to its customers and maintain its market leadership;

- An insurance company process claims accurately and in a timely manner, while minimizing costs; and

- A cancer research center cure cancer.

Not too shabby, right? Data Integration is clearly essential to business success!

So OK, I know, I know… what happens in Vegas, stays in Vegas. Still, this was one love-fest I was compelled to share! Wish you were there. Hopefully you will be next year!

To learn more about Agile Data Integration, check out this webinar: Great Data by Design II: How to Get Started with Next-Gen Data Integration

 


Intelligent Data for a Smarter World

I've been thinking a lot lately about next-generation products: products that will have the smarts to respond on their own to rapidly changing conditions.

We can all imagine self-driving cars that distinguish between a life-threatening situation (like a swerving car ahead) and a thing-threatening occurrence (like a scurrying raccoon) and brake and steer accordingly. And we expect automated picking systems will soon know, by a SKU's size, shape, weight and temperature, which assembly line or packing area gets which products. And it won't be long before enterprise systems will see and plug security holes across hundreds of systems, whether the data is hosted internally or held by partners and suppliers.

The underpinning for such smarts is data that’s clean, safe and connected — the hallmarks of everything we do and believe in at Informatica. But we also recognize that next-generation products need something more. They also need to know when and where data changes, along with how to get the right data to the right person, place or thing, in the right way. That’s why Informatica is unveiling our vision for an Intelligent Data Platform, fueled by new technology innovations in data intelligence.

Data intelligence is built on two new capabilities – live data map and inference engine. Live data map continuously updates all the metadata—structural, semantic, usage and otherwise— on all of the data flowing through an enterprise, while the inference engine can deduce user intentions, help humans search for what they need in their own natural language, and provide recommendations on the best way to consume data depending on the use case. The combination ensures that clean, safe and connected data gets to whomever or whatever needs it, as it’s needed—fast.
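
Purely as a toy illustration of the idea (and not Informatica's actual implementation), you can picture the pairing as a continuously updated metadata registry plus a recommender that ranks data sets by how well their metadata matches a request, breaking ties by observed usage:

from collections import defaultdict

class LiveDataMapSketch:
    # Toy registry: structural tags plus usage counts per data set.
    def __init__(self):
        self.tags = {}
        self.usage = defaultdict(int)

    def register(self, name, tags):
        self.tags[name] = set(tags)

    def record_access(self, name):
        self.usage[name] += 1

    def recommend(self, query_terms):
        # "Inference" step: rank by tag overlap with the request,
        # then by how often colleagues actually use the data set.
        terms = set(query_terms)
        scored = sorted(
            ((len(t & terms), self.usage[n], n) for n, t in self.tags.items()),
            reverse=True,
        )
        return [name for overlap, _, name in scored if overlap]

dmap = LiveDataMapSketch()
dmap.register("claims_2014", ["claims", "insurance", "payments"])
dmap.register("web_clicks", ["clickstream", "web"])
dmap.record_access("claims_2014")
print(dmap.recommend(["insurance", "claims"]))  # ['claims_2014']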

We at Informatica believe these capabilities are so incredibly vital for the enterprise that the Intelligent Data Platform now serves as the foundation of many of our future products — beginning with Project Springbok and Project Secure@Source™. These two new offerings simplify some of the toughest challenges facing people in the enterprise: letting business users find and use the data they need, and seeing where their most-sensitive data is hiding amidst all the nooks and crannies.

Project Springbok's Excel-like interface lets everyday business folks, mere mortals, find the data sets they're interested in, fix formatting and quality issues, and perform tasks that are a pain today, such as combining data sets or publishing the results for colleagues to reuse and enhance. Project Springbok is also a guide, with recommendations derived by the inference engine. It tells users which sources they could or should have access to, and then provisions only what they should have. It lets users see which data sets colleagues access most frequently and find most valuable. It also alerts users to inconsistent or incomplete data, suggests ways to sort new combinations of data sets and recommends the best data for the task.

While we designed Project Springbok for the average business user, Project Secure@Source is intended for people responsible for protecting the enterprise, including chief risk officers, chief information security officers (CISOs) and even board members of public companies. That’s because Project Secure@Source’s graphical interface displays all the systems holding sensitive data, such as social security numbers, medical records or payment card information.

But it’s not enough just to know where that data is. To safeguard all the sensitive information about their products, their customers, and their employees, users also need to understand how that data got into these systems, how it moves around, and who is using it. Project Secure@Source does that, too — showing, for example, that an engineer used payment card data to test a Hadoop cluster, and left it there. With Project Secure@Source, users can selectively remove or mask that data from any system in the enterprise.
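
As a generic sketch of the masking step (a simplistic stand-in, not the Secure@Source product itself), detection can be as crude as a regular expression, with everything but the last four digits masked; real discovery tools use far more robust checks:

import re

# Naive pattern for 16-digit payment card numbers; real tools add
# Luhn validation, context awareness and many more formats.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){12}(\d{4})\b")

def mask_cards(text):
    # Keep the last four digits so the value stays recognizable
    # to support staff but useless to anyone else.
    return CARD_PATTERN.sub(lambda m: "****-****-****-" + m.group(1), text)

row = "test payment 4111 1111 1111 1111 loaded into Hadoop cluster"
print(mask_cards(row))
# test payment ****-****-****-1111 loaded into Hadoop cluster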

You’ll hear us talk about and showcase the Intelligent Data Platform, Project Springbok and Project Secure@Source at Informatica World on May 13 and 14. I hope you’ll join us to learn how our vision and our product roadmap will enable a smarter world for all of us, today.


Announcing the Informatica Architect’s Challenge

What would the ideal data architecture of the year 2020 look like?

Informatica wants to know how YOU would answer that question. For this reason, we've created the Informatica Architect's Challenge, a chance for YOU to share how you would approach enterprise data architecture differently. Send us your proposal and you could win 100 iPad Minis for the school of your choice.

There are a lot of challenges to think about here, but let’s start with these:

  • Organizations are demanding dramatically faster delivery of business initiatives and are unhappy with the current performance of IT. Think this is marketing hyperbole? See the McKinsey survey.
  • Data in most organizations is highly fragmented and scattered across dozens or hundreds of different systems.  Simply finding and prepping data is becoming the majority of the work in any IT project.
  • The problem is only going to get worse as cloud, 3rd party data, social, mobile, big data, and the Internet of Things dramatically increase the complexity of enterprise data environments.

Data is the one thing that uniquely differentiates your organization from its competitors. The question is: How are you going to architect to deliver the data that fuels your future business success? How will you manage the challenges of increasing complexity while delivering with the speed your organization requires?

It's a chance to make a positive contribution to education, while at the same time gaining some professional visibility for yourself as a thought leader. We can't wait to see what you'll create!

For additional details, please visit the Informatica Architect’s Challenge official page.


Comparative Costs and Uses for Data Integration Platforms – A Study from Bloor Research

For years, companies have wrestled with the promise of data integration platforms. Along the way, businesses have asked many questions, including:

  • Does Data Integration technology truly provide a clear path toward unified data?
  • Can businesses truly harness the potential of their information?
  • Can companies take powerful action as a result?

Recently, Bloor Research set out to evaluate how things were playing out on the ground. In particular, they wanted to determine which data integration projects were actually taking place, at what scale and with what results. The study, "Comparative Costs and Uses for Data Integration Platforms," was authored by Philip Howard, research director at Bloor. It examined data integration tool suitability across a range of scenarios, including:

  1. Data migration and consolidation projects
  2. Master data management (MDM) and associated solutions
  3. Application-to-application integration
  4. Data warehousing and business intelligence implementations
  5. Syncing data with SaaS applications
  6. B2B data exchange

To draw conclusions, Bloor examined 292 responses from a range of companies. The respondents used a variety of data integration approaches, from commercial data integration tools to "hand-coding."

Informatica is pleased to be able to offer you a copy of this research for your review. The research covers areas like:

  • Suitability
  • Productivity
  • Reusability
  • Total Cost of Ownership (TCO)

We welcome you to download a copy of “Comparative Costs and Uses for Data Integration Platforms” today. We hope these findings offer you insights as you implement and evaluate your data integration projects and options.


Keeping it “Real” at Informatica World 2014

Most have heard by now that Informatica World 2014 will be a unique opportunity for attendees to get new ideas, expert advice, and hands-on demonstrations of real-time data integration products and capabilities at the premier data conference of the year.

However, it is Las Vegas after all, so for those of you wagering on the best sessions, here’s my quick guide (or morning line) for your viewing, research or attending pleasure.

I hope to see you there.

Tuesday May 13, 2014

BREAKOUT SESSIONS

1:00 PM (GRACIA 8):
ARCH101 – Enterprise Data Architecture for the Data-Centric Enterprise 
How do you build an architecture that increases development speed, protects quality, and reduces cost and complexity, all while accelerating current project delivery?
Srinivas Kolluru – Chief Architect, Southwest Power Pool, Inc.
Tom Kato – Architect, US Airways/American Airlines
John Schmidt – Vice President, Informatica

 4:45 PM (GRACIA 8):
ARCH113 – Western Union: Implementing a Hadoop-based Enterprise Data Hub with Cloudera and Informatica 
To expand its business and delight customers with proactive, personalized web and mobile marketing, Western Union needs to process massive amounts of data from multiple sources.
Pravin Darbare – Senior Manager Integration and Transformation, Western Union
Clarke Patterson – Sr. Director of Product Marketing, Cloudera, Inc.

 4:45 PM (GRACIA 3):
IPaW132 – NaviNet, Inc and Informatica: Delivering Network Intelligence… The Value to the Payer, Provider and Patient 
Healthcare payers and providers today must share information in unprecedented ways to achieve their goals of reducing redundancy, cutting costs, coordinating care and driving positive outcomes.
Frank Ingari – CEO, NaviNet, Inc.

HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B):
Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32):
Informatica Data Replication 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica

 BOOTHS 5:30-8:00 PM

2032- PowerExchange Change Data Capture 9.6
2065- Informatica Data Replication
2136- Informatica Real-time Data Integration
2137- Vibe Data Stream for Machine Data

Wednesday, May 14, 2014

BREAKOUT SESSIONS

11:30 AM (CASTELLANA 1)
ARCH109 – Proactive Analytics: The Next-Generation of Business Intelligence 
Business users demand self-service tools that give them faster access to better insights. Exploding data volumes and variety make finding relevant, trusted information a challenge.
John Poonen – Director, Infosario Data Services Group, Quintiles
Senthil Kanakarajan – VP, Technology Manager, Wells Fargo
Nelson Petracek – Senior Director Emerging Technology Architecture, Informatica

3:45 PM (GRACIA 8)
ARCH102 – Best Practices and Architecture for Agile Data Integration
Business can’t wait for IT to deliver data and reports that may not meet business needs by the time they’re delivered. In the age of self-service and lean integration, business analysts need more control even as IT continues to govern development.
Jared Hillam – EIM Practice Director, Intricity, LLC
John Poonen – Director, Infosario Data Services Group, Quintiles
Robert Myers – Tech Delivery Manager, Informatica

3:45 PM (GRACIA 6)
ARCH119 – HIPAA Validation for Eligibility and Claims Status in Real Time
Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how Health Net met regulatory requirements and limited both costs and expensive rework by architecting a real-time data integration architecture that lets it respond to eligibility and claims status requests within six seconds, near error-free.
Jerry Allen – IT Architect, Health Net, Inc.

3:45 PM (GRACIA 1)
ARCH114 – Bi-Directional, Real-Time Hadoop Streaming 
As organizations seek faster access to data insights, Hadoop is becoming the architectural foundation for real-time data processing environments. With the increase of Hadoop deployments in operational workloads, the importance of real-time and bi-directional data integration grows. MapR senior product management director Anoop Dawar will describe a streaming architecture that combines Informatica technologies with Hadoop. You’ll learn how this architecture can augment capabilities around 360-degree customer views, data warehouse optimization, and other big data business initiatives.
Anoop Dawar – Senior Director, Product Management, MapR Technologies

 HANDS-ON LABS

11:30 AM (BRERA BALLROOM/TABLE 17B)
Table 17b – Proactive Monitoring of PowerCenter Environments 
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Indu Thomas – Director QA Engineering, Informatica
Elizabeth Duke – Principal Sales Consultant, Informatica

 11:30 AM (BRERA BALLROOM/TABLE 32)
Table 32 – Informatica Data Replication
Expert led sessions by Informatica Product Management 45 min Hands-On Labs
Glenn Goodrich – Sr. Manager Technical Enablement, Informatica
Alex Belov – Senior Development Manager, Informatica
Phil Line – Principal Product Manager, Informatica
Andy Bristow – Product Specialist, Informatica
 

Thursday, May 15, 2014

BREAKOUT SESSIONS

 DEV108 – Informatica and the Information Potential of the “Internet of Things” 
The “Internet of Things” and its torrents of data from multiple sources — clickstreams from web servers, application and infrastructure log data, real-time systems, social media, sensor data, and more — offers an unprecedented opportunity for insight and business transformation. Learn how Informatica can help you access and integrate these massive amounts of real-time data with your enterprise data and achieve your information potential.
Amrish Thakkar – Senior Product Manager, Informatica
Boris Bulanov – Senior Director Solutions Product Management, Informatica

 


Oh the Data I’ve Seen…

Eighteen months ago, I was sitting in a conference room, nothing remarkable except for the great view down 6th Avenue toward the Empire State Building. The pre-sales consultant sitting across from me had just given a visually appealing demonstration to the CIO of a multinational insurance corporation: fancy graphics and colorful charts, sharply displayed on an iPad and refreshing every few seconds. The CIO asked how long it had taken to put the presentation together. The consultant excitedly shared that it had taken him only four to five hours, to which the CIO responded, "Well, if that took you less than five hours, we should be able to get a production version in about two to three weeks, right?"

The facts of the matter, however, were completely different. The demo, while running with the firm's own data, had been running from a spreadsheet housed on the consultant's laptop, procured after several weeks of scrubbing, formatting and aggregating data from the CIO's team; and that is to say nothing of the preceding data procurement process. And so, as the expert in the room, the voice of reason, the CIO turned to me wanting to know how long it would take to implement the solution. At least six months, was my assessment. I had seen their data, and it was a mess. I had seen the flow, and it was not a model architecture, and the sheer volume of data was daunting. If it was not architected correctly, the pretty colors and graphs would take much longer to refresh. This was not the answer he wanted to hear.

The advancement of social media, new web experiences and cutting-edge mobile technology has driven users to expect more of their applications. As enterprises push to drive value and unlock more potential in their data, insurers of all sizes have attempted to implement analytical and business intelligence systems. But here's the truth: by and large, most insurance enterprises are not in a place with their data to make effective use of the new technologies in BI, mobile or social. The reality is that data cleansing, fitting for purpose, movement and aggregation are being done in the BI layer when they should be done lower down in the stack, so that all applications can take advantage of them.

Let’s face it – quality data is important. Movement and shaping of data in the enterprise is important.  Identification of master data and metadata in the enterprise is important and data governance is important.  It brings to mind episode 165, “The Apology”, of the mega-hit show Seinfeld.  Therein George Costanza accuses erstwhile friend Jason Hanky of being a “step skipper”.  What I have seen in enterprise data is “step skipping” as users clamor for new and better experiences, but the underlying infrastructure and data is less than ready for consumption.  So the enterprise bootstraps, duct tapes and otherwise creates customizations where it doesn’t architecturally belong.

Clearly, this calls for a better solution: a more robust and architecturally sustainable data ecosystem that shepherds the data from acquisition through to consumption and all points in between. It must also be attainable by even modestly sized insurance firms.

First, you need to bring the data under your control. That may mean external data integration, or just moving it from transactional, web, or client-server systems into warehouses, marts or other large data stores and back again. But remember, the data arrives in various stages of readiness. Through out-of-the-box or custom cleansing steps, the data needs to be processed, enhanced and stored in a way that is in line with corporate goals for governing its quality, to say nothing of the need to normalize data between source and target. When implemented as a "factory" approach, bringing new data streams online, integrating them quickly and maintaining high standards becomes a series of small incremental changes rather than a monumental ground-up task (see the sketch below). Move your data shaping, cleansing, standardization and aggregation further down in the stack, and many applications will benefit from the architecture.
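
Here is a minimal sketch of that factory idea in Python: each new data stream composes the same small, reusable cleansing steps rather than rebuilding the pipeline from scratch (the field names and rules are illustrative):

def strip_whitespace(record):
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def standardize_state(record):
    # Governed reference data would live in a shared service, not inline.
    states = {"new york": "NY", "california": "CA"}
    key = (record.get("state") or "").lower()
    record["state"] = states.get(key, record.get("state"))
    return record

def normalize_premium(record):
    # Store premiums as integer cents to avoid float drift across systems.
    record["premium_cents"] = int(round(float(record.pop("premium")) * 100))
    return record

def run_factory(records, steps):
    for record in records:
        for step in steps:
            record = step(record)
        yield record

raw = [{"name": " Jane Roe ", "state": "new york", "premium": "1203.50"}]
for clean in run_factory(raw, [strip_whitespace, standardize_state, normalize_premium]):
    print(clean)
# {'name': 'Jane Roe', 'state': 'NY', 'premium_cents': 120350}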

Critical to this process is that insurance enterprises need to ensure the data remains secure, private and is managed in accordance with rules and regulations. They must also govern the archival, retention and other portions of the data lifecycle.

At any point in the life of your information, you are likely sending or receiving data from an agent, broker, MGA or service provider, and that data needs to be processed using the robust ecosystem described above. Once an effective data exchange infrastructure is in place, the steps to process the data can nicely complement your setup as information flows to and from your trading partners.

Finally, as your enterprise determines "how" to implement these solutions, you may look to a cloud-based system for speed to market and cost-effectiveness compared to on-premises solutions.

And don’t forget to register for Informatica World 2014 in Las Vegas, where you can take part in sessions and networking tailored specifically for insurers.


Fire your Data Scientists – They Don’t Add Value

Years ago, I was on a project to improve production and product quality through data analysis. During the project, I heard one man say:

“If I had my way, I’d fire the statisticians – all of them – they don’t add value”. 

Surely not? Why would you fire the very people employed to make sense of the vast volumes of manufacturing data and to guide future production? But he was right. The problem was that, at the time, data management was so poor that the data was simply not available for the statisticians to analyze.

So, perhaps the title should be rewritten as:

Fire your Data Scientists – They Aren’t Able to Add Value.

Although this statement is a bit extreme, the same situation may still exist. Data scientists frequently share frustrations such as:

  • “I’m told our data is 60% accurate, which means I can’t trust any of it.”
  • “We achieved our goal of an answer within a week by working 24 hours a day.”
  • “Each quarter we manually prepare 300 slides to anticipate all questions the CFO may ask.”
  • “Fred manually audits 10% of the invoices.  When he is on holiday, we just don’t do the audit.”

This is why I think the original quote is so insightful.  Value from data is not automatically delivered by hiring a statistician, analyst or data scientist. Even with the latest data mining technology, one person cannot positively influence a business without the proper data to support them.

Most organizations are unfamiliar with the structure required to deliver value from their data. New storage technologies will be introduced, and a variety of analytics tools will be tried and tested. This change is crucial to success. For statisticians to add value to a company, they must have access to high-quality data that is easily sourced and integrated. That data must be available through the latest analytics technology, and the resulting ecosystem should provide insights that can shape future production. Staff will need to be trained as this new data is incorporated into daily decision making.

With a rich 20-year history, Informatica understands data ecosystems. Employees become wasted investments when they do not have access to the trusted data they need in order to deliver their true value.

Who wants to spend their time recreating data sets to find a nugget of value only to discover it can’t be implemented?

Build an analytical ecosystem with a balanced focus on all aspects of data management. Then value delivery will be limited only by the imagination of your employees. Rather than questioning the value of an analytics team, you will attract some of the best and the brightest, and you will finally be able to deliver on the promised value of your data.


Architects: 8 Great Reasons To Be At Informatica World 2014

For years, corporations have tried to solve business problems by "throwing new enterprise applications at them." This strategy has created 'silos of data' that are attached to these applications and databases. As a result, it is now increasingly difficult to find, access and use the correct data for new projects.

Data fragmentation leads to slow project delivery and "shadow IT." Without a change in enterprise data strategy, it is unlikely this problem will improve. On the contrary, the growth of cloud applications and platforms, mobile applications, NoSQL, and the "Internet of Things" creates increasing urgency. Unless a new approach to enterprise data management is taken, the house of cards is going to crumble.

I seriously doubt that this is a surprise to most of you. The question is, “What should we do about it?” The Informatica World 2014 event is a perfect place to find answers. Here are eight benefits architects will enjoy at Informatica World 2014:

  1. A dedicated track of breakout sessions for architects. This track explores reference architectures, design patterns, best practices and real-world examples for building and sustaining your next-generation information architecture. The track will begin with a keynote on the broader issues of enterprise data architecture and will include a panel of architects from leading companies.
  2. Inspiration to participate in defining the enterprise data architecture of the future. (I can’t spoil it by divulging the details here, but I promise that it will be interesting and worth your while!)
  3. New insights on how to manage information for the data-centric enterprise. These will expand your thinking about data architecture and platforms.
  4. Chances to network with architect peers.
  5. A Hands-on Lab, where you can talk to Informatica experts about their products and solutions.
  6. Updates on what Informatica is doing to bring business subject matter experts in as full partners in the co-development of data-related projects.
  7. A chance to win a one-hour one-on-one session with an Informatica architect at Informatica World.
  8. A chance to learn to control your biggest enterprise system: The collection of all data-moving resources across your company.

We believe that architecture is the key to unleashing information potential. To compete in today's global 24x7x365 economy, businesses require well-designed information architectures that can continuously evolve to support new heights of flexibility, efficiency, responsiveness and trust. I hope you will join us in Las Vegas, May 12-15!
