Tag Archives: Data Integration

The Swiss Army Knife of Data Integration

Back in 1884, a man had a revolutionary idea; he envisioned a compact knife that was lightweight and would combine the functions of many stand-alone tools into a single tool. This idea became what the world has known for over a century as the Swiss Army Knife.

This creative problem-solving came from a request by the Swiss Army to build a soldier's knife. In the end, the solution was all about getting the right tool for the right job in the right place. In many cases, soldiers didn't need industrial-strength tools; all they really needed was a compact, lightweight tool to get the job at hand done quickly.

Putting this into perspective with today's world of data integration, using enterprise-class data integration tools for the smaller data integration project is overkill and typically out of reach for the smaller organization. However, these smaller data integration projects are just as important as the larger enterprise projects, and they are often the innovation behind a new way of business thinking. The traditional hand-coding approach to the smaller data integration project is not scalable, not repeatable, and prone to human error; what's needed is a compact, flexible, and powerful off-the-shelf tool.

Thankfully, over a century after the world embraced the Swiss Army Knife, someone at Informatica was paying attention to revolutionary ideas. If you've not yet heard the news, a version of the Informatica platform called PowerCenter Express has been released, free of charge, so you can use it to handle an assortment of what I'd characterize as high-complexity / low-volume data integration challenges and experience a subset of the Informatica platform for yourself. I'd emphasize that PowerCenter Express doesn't replace the need for Informatica's enterprise-grade products, but it is ideal for rapid prototyping, profiling data, and developing quick proofs of concept.

PowerCenter Express provides a glimpse of the evolving Informatica platform by integrating four Informatica products into a single, compact tool. There are no database dependencies, and the product installs in just under 10 minutes. Much to my own surprise, I use PowerCenter Express quite often in the various aspects of my job at Informatica. I have it installed on my laptop, so it travels with me wherever I go. It starts up quickly, so it's ideal for getting a little work done on an airplane.

For example, I recently wanted to explore building some rules for an upcoming proof of concept on a plane ride home so I could claw back some personal time for my weekend. I used PowerCenter Express to profile some data and create a mapping. And this mapping wasn't something I needed to throw away and recreate in an enterprise version after my flight landed. Vibe, Informatica's build-once/run-anywhere, metadata-driven architecture, allows me to export a mapping I create in PowerCenter Express to one of the enterprise versions of Informatica's products, such as PowerCenter, Data Quality, or Informatica Cloud.

As I alluded to earlier in this article, because it is a free offering, I honestly didn't expect too much from PowerCenter Express when I first started exploring it. However, based on my own positive experiences, I now like to think of PowerCenter Express as the Swiss Army Knife of data integration.

To start claiming back some of your personal time, get started with the free version of PowerCenter Express, found on the Informatica Marketplace at:  https://community.informatica.com/solutions/pcexpress


How Informatica Tackles the Top 5 Big Data Challenges

In my last blog post I discussed the top 5 Big Data challenges listed below:

  • It’s difficult to find and retain resource skills to staff big data projects
  • It takes too long to deploy Big Data projects from ‘proof-of-concept’ to production
  • Big data technologies are evolving too quickly to adapt
  • Big Data projects fail to deliver the expected value
  • It’s difficult to make Big Data fit-for-purpose, assess trust, and ensure security

Informatica has extended its leadership in data integration and data quality to Hadoop with the Big Data Edition, which addresses all of these Big Data challenges.

The biggest challenge companies face is finding and retaining Big Data resource skills to staff their Big Data projects.  One large global bank started its first Big Data project with 5 Java developers, but as its Big Data initiative gained momentum, it needed to hire 25 more Java developers that year.  The bank quickly realized that while it had scaled its infrastructure to store and process massive volumes of data, it could not scale the resource skills needed to implement its Big Data projects.  The research mentioned in my last post indicates that 80% of the work in a Big Data project relates to data integration and data quality.  With Informatica, you can staff Big Data projects with readily available Informatica developers instead of an army of developers hand-coding in Java and other Hadoop programming languages.  In addition, we've proven to our customers that Informatica developers are up to 5 times more productive on Hadoop than hand-coders, and they don't need to know how to program on Hadoop.  A large Fortune 100 global manufacturer needed to hire 40 data scientists for its Big Data initiative.  Do you really want these hard-to-find and expensive resources spending 80% of their time integrating and preparing data?

Another key challenge is that it takes too long to deploy Big Data projects to production.  Before purchasing the Informatica Big Data Edition, one of our Big Data media and entertainment customers told me that most of his Big Data projects had failed.  Naturally, I asked him why.  His response was, “We have these hot-shot Java developers with a good idea, which they prove out in our sandbox environment.  But when it comes time to deploy it to production, they have to re-work a lot of code to make it perform and scale, make it highly available 24×7, have robust error handling, and integrate with the rest of our production infrastructure.  In addition, it is very difficult to maintain as things change.  This results in project delays and cost overruns.”  With Informatica, you can automate the entire data integration and data quality pipeline; everything you build in the development sandbox environment can be immediately and automatically deployed and scheduled for production as enterprise-ready.  Performance, scalability, and reliability are handled through configuration parameters, without the re-building and re-work that is typical of hand-coding.  Informatica also makes it easier to reuse existing work and to maintain Big Data projects as things change.  The Big Data Edition is built on Vibe, our virtual data machine, and provides near-universal connectivity so that you can quickly onboard new types of data of any volume and at any speed.

Big Data technologies are emerging and evolving extremely fast. This in turn becomes a barrier to innovation, since these technologies evolve much too quickly for most organizations to adopt before the next big thing comes along.  What if you place the wrong technology bet and find that it is obsolete before you barely get started?  Hadoop is gaining tremendous adoption, but it is evolving alongside other big data technologies in a landscape of literally hundreds of open source projects and commercial vendors.  Informatica is built on the Vibe virtual data machine, which means that everything you built yesterday and build today can be deployed on the major big data technologies of tomorrow.  Today it is five flavors of Hadoop, but tomorrow it could be Hadoop and other technology platforms.  One of our Big Data Edition customers stated after purchasing the product that the Big Data Edition with Vibe is “our insurance policy to insulate our Big Data projects from changing technologies.”  In fact, existing Informatica customers can take PowerCenter mappings they built years ago, import them into the Big Data Edition, and in many cases run them on Hadoop with minimal changes and effort.

Another complaint of the business is that Big Data projects fail to deliver the expected value.  In a recent survey (1), 86% of marketers said they could generate more revenue if they had a more complete picture of customers.  We all know that the cost of selling a product to an existing customer is only about 10 percent of the cost of selling the same product to a new customer. But it's not easy to cross-sell and up-sell to existing customers.  Customer Relationship Management (CRM) initiatives help to address these challenges, but they too often fail to deliver the expected business value.  The impact is low marketing ROI, poor customer experience, customer churn, and missed sales opportunities.  By using Informatica's Big Data Edition with Master Data Management (MDM) to enrich customer master data with Big Data insights, you can create a single, complete view of customers that yields tremendous results.  We call this real-time customer analytics, and Informatica's solution improves the total customer experience by turning Big Data into actionable information so you can proactively engage with customers in real time.  For example, this solution enables customer service to know which customers are likely to churn in the next two weeks so they can take the next best action, or, in the case of sales and marketing, to determine the next best offers based on customer online behavior to increase cross-sell and up-sell conversions.

Chief Data Officers and their analytics teams find it difficult to make Big Data fit-for-purpose, assess trust, and ensure security.  According to the business consulting firm Booz Allen Hamilton, “At some organizations, analysts may spend as much as 80 percent of their time preparing the data, leaving just 20 percent for conducting actual analysis” (2).  This is not an efficient or effective way to use highly skilled and expensive data science and data management resources.  They should be spending most of their time analyzing data and discovering valuable insights.  The result is project delays, cost overruns, and missed opportunities.  The Informatica Intelligent Data Platform supports a managed data lake as a single place to manage the supply and demand of data, and it converts raw big data into fit-for-purpose, trusted, and secure information.  Think of this as a Big Data supply chain to collect, refine, govern, deliver, and manage your data assets so your analytics team can easily find, access, integrate, and trust your data in a secure and automated fashion.

If you are embarking on a Big Data journey I encourage you to contact Informatica for a Big Data readiness assessment to ensure your success and avoid the pitfalls of the top 5 Big Data challenges.

  1. Gleanster survey of 100 senior-level marketers, “Lifecycle Engagement: Imperatives for Midsize and Large Companies,” sponsored by YesMail.
  2. “The Data Lake: Take Big Data Beyond the Cloud,” Booz Allen Hamilton, 2013.

4 Steps to Bring Big Data to the Business

By now, the business benefits of effectively leveraging big data have become well known. Enhanced analytical capabilities, greater understanding of customers, and ability to predict trends before they happen are just some of the advantages. But big data doesn’t just appear and present itself. It needs to be made tangible to the business. All too often, executives are intimidated by the concept of big data, thinking the only way to work with it is to have an advanced degree in statistics.

There are ways to make big data more than an abstract concept that can only be loved by data scientists. Four of these ways were recently covered in a report by David Stodder, director of business intelligence research for TDWI, as part of TDWI’s special report on What Works in Big Data.

Go real-time

The time is ripe for experimentation with real-time, interactive analytics technologies, Stodder says. The next major step in the movement toward big data is enabling real-time or near-real-time delivery of information. Real-time delivery has been a challenge in BI for years, with limited success, Stodder says. The good news is that the Hadoop framework, originally built for batch processing, now includes interactive querying and streaming applications, he reports. This opens the way for real-time processing of big data.

Design for self-service

Interest in self-service access to analytical data continues to grow. “Increasing users’ self-reliance and reducing their dependence on IT are broadly shared goals,” Stodder says. “Nontechnical users—those not well versed in writing queries or navigating data schemas—are requesting to do more on their own.” There is an impressive array of self-service tools and platforms now appearing on the market. “Many tools automate steps for underlying data access and integration, enabling users to do more source selection and transformation on their own, including for data from Hadoop files,” he says. “In addition, new tools are hitting the market that put greater emphasis on exploratory analytics over traditional BI reporting; these are aimed at the needs of users who want to access raw big data files, perform ad-hoc requests routinely, and invoke transformations after data extraction and loading (that is, ELT) rather than before.”

Encourage visualization

Nothing gets a point across faster than having data points visually displayed – decision-makers can draw inferences within seconds. “Data visualization has been an important component of BI and analytics for a long time, but it takes on added significance in the era of big data,” Stodder says. “As expressions of meaning, visualizations are becoming a critical way for users to collaborate on data; users can share visualizations linked to text annotations as well as other types of content, such as pictures, audio files, and maps to put together comprehensive, shared views.”

Unify views of data

Users are working with many different data types these days, and are looking to bring this information into a single view – “rather than having to move from one interface to another to view data in disparate silos,” says Stodder. Unstructured data – graphics and video files – can also provide a fuller context to reports, he adds.


Data Integration with Devices is Easier than You Think

The concept of the “Internet of Things” (IoT) is about getting the devices we leverage in our daily lives, or devices used in industrial applications, to communicate with other devices or systems. This is not a new notion, but the bandwidth and connectivity mechanisms that make the IoT practical are recent developments.

My first job out of college was to figure out how to get the devices that monitored and controlled an advanced cooling and heating system to communicate with a centralized, automated control center. We ended up building custom PCs for the application, running a version of Unix (DOS would not cut it). The PCs, mounted in industrial cases, communicated with the temperature and humidity sensors, and turned fans and dampers on and off.

At the end of the day, this was a data integration problem we were attempting to solve, not an engineering problem. The devices had to talk to the PCs, and the PCs had to talk to a centralized system (a mainframe) that was able to receive the data, as well as use that data to determine what actions to take. For instance, it had to determine that 78 degrees was too warm for a clean room, open a damper and turn on a fan to reduce the temperature, and then turn them off when the temperature returned to normal.

Back in the day, we had to create and deploy custom drivers and software. These days, most devices have well-defined interfaces, or APIs, that developers and data integration tools can access to gather information from that device. We also have high performing networks. Much like any source or target system, these devices produce data which is typically bound to a structure, and that data can be consumed and restructured to meet the needs of the target system.

For instance, data coming off a smart thermostat in your home may be in the following structure:

Device (char 10)
Date (char 8)
Temp (num 3)

You’re able to access this device using an API (typically a REST-based Web service), which returns a single chunk of data bound to that structure, such as:

Device (“9999999999”)
Date (“09162014”)
Temp (076)

Then you can transform the structure into something that’s native to the target system that receives this data, as well as translate the data itself (e.g., converting the data from characters to numbers). This is where data integration technology makes money for you, given its ability to deal with the complexity of translating and transforming the information that comes off the device so it can be placed in a system or data store that’s able to monitor, analyze, and react to that data.
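To make that concrete, here is a minimal sketch of the translate-and-transform step in Python. The endpoint URL, the fixed-width layout, and the target field names are illustrative assumptions, not any particular device's or vendor's API:

```python
# A minimal sketch of the device-to-target transform described above.
# The URL, field layout, and target schema are illustrative assumptions.
from datetime import datetime

import requests  # third-party HTTP client (pip install requests)


def read_thermostat(url: str) -> dict:
    """Fetch one fixed-width record and restructure it for a target system."""
    raw = requests.get(url, timeout=10).text.strip()  # e.g. "999999999909162014076"

    # Bind the raw chunk to the source structure.
    device = raw[0:10]   # Device (char 10)
    date = raw[10:18]    # Date   (char 8), MMDDYYYY
    temp = raw[18:21]    # Temp   (num 3), still characters at this point

    # Transform and translate into the (hypothetical) target structure.
    return {
        "device_id": device,
        "reading_date": datetime.strptime(date, "%m%d%Y").date().isoformat(),
        "temp_f": int(temp),  # "076" -> 76
    }


print(read_thermostat("https://example.com/api/thermostat"))  # hypothetical endpoint
```

A data integration tool expresses these same steps declaratively, but the shape of the work is identical: bind to the source structure, convert types, and emit the structure the target expects.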

This is really what the IoT is all about: the ability to have devices spin out data that can be leveraged to make better use of those devices. The possibilities are endless as to what can be done with that data and how we can better manage these devices. Data integration is key. Trust me, it’s much easier to integrate with devices today than it was back in the day.

Thank you for reading about Data Integration with Devices! Editor’s note: For more information on Data Integration, consider downloading “Data Integration for Dummies”.


How can CFOs change the conversation with their CIO?

Recently, I had the opportunity to interview a half dozen CIOs and a half dozen CFOs. Like a marriage therapist, I got to hear each party’s story about the relationship. CFOs, in particular, felt that the quality of the relationship could impact their business’s success. Armed with this knowledge, I wanted to see if I could help each leader build a better working relationship. Previously, I let CIOs know about the emergence and significance of the strategic CFO.  In today’s post, I will start by sharing the CIOs’ perspective on the CFO relationship, and then I will discuss how CFOs can build better CIO relationships.

CIOs feel under the gun these days!

If you don’t know, CIOs feel under the gun these days. CIOs see their enterprises demanding ubiquitous computing. Users expect corporate apps to look like their personal apps, such as Facebook. They want to bring their own preferred devices. Most of all, they want all their data on any device, whenever they need it. This means CIOs are trying to manage a changing technical landscape of mobile, cloud, social, and big data, all vying for both dollars and attention. As a result, CIOs see their role in a sea change. Today, they need to focus less on building things and more on managing vendors. CIOs say that they need to 1) better connect what IT is doing to the business strategy; 2) improve technical orchestration; and 3) improve process excellence. This is a big and growing charter.

CIOs see the CFO conversation as being just about the numbers

CIOs worry that you don’t understand how many things are now run by IT, and that historical percentages of revenue may no longer be appropriate. Think about healthcare, which used to be a complete laggard in technology but today is digitizing everything; even a digital thermometer plugs into an iPad so it communicates directly with a patient record. The world has clearly changed. CIOs also worry that you view IT as merely a cost center, and that you see neither the value generated through IT investment nor the asset that information provides to business decision makers. The good news is that I believe a different type of discussion is possible, and that CFOs have the opportunity to play an important role in shaping the value that CIOs deliver to the business.

CFOs should share their experience and business knowledge

The CFOs I talked to said that they believe the CFO/CIO relationship needs to be complementary, and that the two roles have the most concentric rings. These CFOs believe that the stronger the relationship, the better it is for their business. One area where you can help the CIO is in sharing your knowledge of the business and its needs. CIOs are trying to get closer to the business, and you can help build this linkage and support the requests that come out of this process. Clearly, an aligned CFO can be “one of the biggest advocates of the CIO.” Given this, make sure that you are on your CIO’s investment committee.

Tell your CIO about your data pains

CFOs need to be good customers too. The CFOs I talked to told me that they know their business has “a data issue.” They worry about the integrity of data from the source. CFOs see their role as relying increasingly on timely, accurate data. They also know they have disparate systems and too much manual work going on in the back office. For them, integration needs to exist from the front end to the back end. Their teams personally feel the large number of manual steps.

For these reasons, the CFOs we talked to believe that the integration of data is a big issue, whether they are in a small or a large business. Have you talked to your CIO about data integration or data quality projects to change the ugliness that you have to live with day in and day out? It will make you and the business more efficient. One CFO was blunt here, saying, “Making life easier is all about the systems. If the systems suck, then you cannot trust the numbers when you get them. You want to access the numbers easily, timely, and accurately. You want to make it easier to forecast so you can set expectations with the business and externally.”

At the same time, the CFOs I talked to worried about the quality of financial and business data analysis. Even once they had the data, they worried about being able to analyze the information effectively. Increasingly, CFOs say that they need to help drive synergies across their businesses. They also increasingly need to manage upward with information.  They want information for decision makers so those decision makers can make better decisions.

Changing the CIO Dialog

So it is clear that CFOs like you see data, in particular financial data, as a competitive advantage. The question, as your unofficial therapist, is this: why aren’t you having a discussion with your CIO that goes beyond the numbers or the financial justification for this or that system, and instead asks about the integration investment that can make your integration problems go away?


New type of CFO represents a potent CIO ally

The strategic CFO is different from the “1975 Controller CFO”

Traditionally, CIOs have tended to work with what one CIO called a “1975 Controller CFO”. For this reason, the relationship between CIOs and CFOs was often expressed in a single word: “contentious”. But a new type of CFO is emerging that offers the potential for a different type of relationship. These so-called “strategic CFOs” can be an effective ally for CIOs. The question is: which type of CFO do you have? In this post, I will provide a bit of a litmus test so you can determine which type of CFO you have, but more importantly, I will share how you can take maximum advantage of a relationship with a strategically oriented CFO. But first, let’s hear a bit more of the CIOs’ reactions to CFOs.

What CIOs said in interviews

Clearly, “the relationship…with these CFOs is filled with friction”. Controller CFOs “do not get why so many things require IT these days. They think that things must be out of whack.” One CIO said that these CFOs think technology should cost only 2-3% of revenue, while it can easily reach 8-9% of revenue these days. Another CIO complained that discussions with a Controller CFO are only about IT productivity and effectiveness; in their eyes, this has limited the topics of discussion to IT cost reduction, IT-produced business savings, and the soundness of the current IT organization. Unfortunately, this CIO believes that Controller CFOs are not concerned with creating business value and do not see information as an asset; instead, they view IT as a cost center. Another CIO says Controller CFOs are just about the numbers and see the CIO role as being about signing checks. It is a classic “demand versus supply” issue. At the same time, CIOs say that they see reporting to a Controller CFO as a narrowing function, and they believe it signals to the rest of the organization “that IT is not strategic and less important than other business functions”.

What then is this strategic CFO?

In contrast to their controller peers, strategic CFOs often have a broader business background than their accounting and CPA peers. Many have also pursued an MBA. Some have public accounting experience. Still others come from professions like law, business development, or investment banking.

More important than where they came from, strategic CFOs see a world that is about more than just numbers. They want to be more externally facing and to understand their company’s businesses. They tend to focus as much on what is going to happen as on what has happened; remember, financial accounting is backward-looking. Given this, strategic CFOs spend a lot of time trying to understand what is going on in their firm’s businesses. One strategic CFO said they do this so they can contribute and add value: “I want to be a true business leader.” Taking this posture often puts them among the top three decision makers for their business. There may be lessons in this posture for technology-focused CIOs.

Why is a strategic CFO such a game changer for the CIO?

One CIO put it this way: “If you have a modern-day CFO, then they are an enabler of IT”. Strategic CFOs agree. Strategic CFOs see themselves as having “the most concentric circles with the CIO”. They believe that they need “CIOs more than ever to extract data to do their jobs better and to provide the management information business leadership needs to make better business decisions”. At the same time, the perspective of a strategic CFO can be valuable to the CIO, because they have good working knowledge of what the business wants. They also tend to be close to the management information systems and computer systems. CFOs typically understand the needs of the business better than most staff functions. The CFO, therefore, can be the biggest advocate of the CIO. This is why strategic CFOs should be on the CIO’s investment committee. Finally, a strategic CFO can help a CIO ensure their technology selections meet affordability targets and comply with the corporate strategy.

Are the priorities of a strategic CFO different?

Strategic CFOs still care about P&L, expense management, budgetary control, compliance, and risk management. But they are also concerned about performance management for the enterprise as a whole, and about senior management reporting. They also want to do these tasks faster, so finance and other functions can do in-period management by exception. For this reason, they see data and data analysis as a big issue.

Strategic CFOs care about data integration

In my interviews with strategic CFOs, I saw a group of people who truly understand the data holes in the current IT landscape, and who intuit firsthand the value proposition of investing to fix things here. These CFOs say that they worry “about the integrity of data from the source and about being able to analyze information”. They want the integration to be good enough that, at the push of a button, they can get an accurate report. Otherwise, they have to “massage the data and then send it through another system to get what you need”.

These CFOs say that they really feel the pain of systems not talking to each other. They understand this means making disparate systems, from the front end to the back end, talk to one another. They also believe that making things less manual will have important consequences, including their own ability to inspect the books more frequently. Given this, they see data as a competitive advantage. One CFO even said that data is the last competitive advantage.

Strategic CFOs are also worried about data security. They believe their auditors are going after this with a vengeance, and they are genuinely worried about getting hacked. One said, “Target scared a lot of folks and was in many respects a watershed event”. At the same time, strategic CFOs want to be able to drive synergies across the business. One CFO even extolled the value of a holistic view of the customer. When I asked why this was a finance objective rather than a marketing objective, they said finance is responsible for business metrics, and “we have gaps in our business metrics around the customer, including the percentage of cross-sell taking place between our business units”. Another CFO amplified this theme by saying that “increasingly we need to manage upward with information. For this reason, we need information for decision makers so they can make better decisions”. Another strategic CFO summed this up: “The integration of the right systems to provide the right information needs to be done so we and the business have the right information to manage and make decisions at the right time”.

So what are you waiting for?

If you are lucky enough to have a strategic CFO, start building your relationship, beginning with a discussion of their data integration and data quality problems. So I have a question for you: how many of you think you have a Controller CFO versus a Strategic CFO? Please share here.


The Changing ROI of Data Integration

Over the years we’ve always tried to better define the ROI of data integration.  It seems pretty simple.  There is an increasing value to core enterprise systems and data stores once they communicate effectively with other enterprise systems and data stores.  There is unrealized value when systems and stores do not exchange data.

However, the nature of data integration has evolved, and so has the way we define the value.  The operational benefits are still there, but there are more strategic benefits to consider as well.

Data integration patterns have progressed from simple patterns that replicated data amongst systems and data stores, to more service-based use of core business data that is able to provide better time-to-market advantages and much better agility.  These are the strategic concepts that, when measured, add up to much more value than the simple operational advantages we first defined as the ROI of data integration.

The new ROI of data integration can be defined in a few ways, including:

The use of data services to combine core data assets with composite applications and critical business processes.  This allows those who leverage data services, which is a form of data integration, to mix and match data services to provide access to core applications or business processes.  The applications leverage the data services (typically REST-based Web services) as ways to access back-end data stores, and can even redefine the metadata for the application or process (a.k.a., Data Virtualization).

This provides for a compressed time-to-market for critical business solutions, thus returning much in the way of investment.  What’s more important is the enterprise’s new ability to change to adapt to new business opportunities, and thus get to the value of agility.  This is clearly where the majority of ROI resides.

The use of integrated data to make better automated operational decisions.  This means taking integrated data, either as services or through simple replication, and using that data to make automated decisions.  Examples would be the ability to determine whether inventory levels will support an increase in sales, or whether the risk levels for financial trades are too high.  (A quick sketch of this pattern follows these examples.)

The use of big data analytics to define advanced uses of data, including predicting the future.  This refers to the process of leveraging big data, and big data analytics, to make critical calls around the business, typically calls that are more strategic in nature.  An example would be the use of predictive analytics that leverages petabytes of data to determine whether a product line is likely to be successful, or whether production levels will likely decline or increase.  This differs from the operational use of data discussed previously, in that we’re making strategic rather than tactical use of the information derived from the data.  The ROI here, as you would guess, is huge.
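As promised above, here is a minimal Python sketch of the automated operational decision pattern, using the inventory example. The integrated view, field names, and figures are assumptions made up for illustration:

```python
# A sketch of an automated operational decision over integrated data.
# The integrated view, field names, and numbers are illustrative only.
def can_support_sales_increase(inventory: dict, forecast_units: int) -> bool:
    """Decide whether current stock plus inbound supply covers a sales forecast."""
    available = inventory["on_hand"] + inventory["inbound"] - inventory["reserved"]
    return available >= forecast_units


# One SKU's view, e.g. assembled from ERP and warehouse management systems.
sku_view = {"on_hand": 1200, "inbound": 300, "reserved": 450}

if not can_support_sales_increase(sku_view, forecast_units=1500):
    print("Flag: inventory cannot support the projected sales increase.")
```

The decision itself is trivial; the value comes from the integration that makes the on-hand, inbound, and reserved figures current and trustworthy in a single place.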

A general pattern is that the ROI is much greater around data integration than it was just 5 years ago.  This is due largely to the fact that enterprises understand that data is everything, when it comes to driving a business.  The more effective the use of data, the better you can drive the business, and that means more ROI.  It’s just that simple.

Editor’s note: For more information on Data Integration, consider downloading “Data Integration for Dummies”.


Does Your IT Organization have a Common Data Strategy?

You have to talk.

No, really, you do.

When it comes to data strategy, there is a growing divide between IT leaders and IT practitioners. New research from Informatica reveals a deep disconnect between IT executives and rank-and-file workers.

Consider this:

  • 61% of IT executives believe that improving data management strategy will help them be more responsive to customers. In contrast, only 42% of IT workers feel the same way.
  • 63% of IT executives say that effective data use enhances business agility. Only 41% of their IT workers agree.
  • 55% of IT executives regularly consult with business leaders on data management strategies. Only 17% of their IT staff do the same.

Why does this matter? Because if you can’t come together, it will hurt your business. Organizations that are smarter about data perform better financially. According to recent research by the Economist, organizations that use data strategically have a significantly higher EBITA (earnings before interest, taxes, and amortization) than those who do not.

Better data strategy? Better bottom line.

To drive revenue, IT executives and workers need to align around their data strategy. The Informatica ebook “Research: Data Drives Profit” outlines all the evidence. In this eBook, we share four practices that fuel the strategic use of data.

Right Data, Right Time, Right Way

We now live in a data-centric world. When the right data is available and used at the right time, every application and every process becomes smarter than it was before. This, in turn, makes every person “smarter” as they make their daily business decisions.

In order to unleash your organization’s full potential, it is critical to think differently about your data:

  1. Data can no longer be defined by its source or application. Instead, data needs to be managed as an interconnected ecosystem spanning all applications, processes, computing platforms, devices, users and use cases.
  2. The data technology landscape will never again be a static, standardized architecture. Instead, it will be constantly changing and adapting to incorporate new technologies and applications.
  3. With the consumerization of IT, companies are sitting on an ever-growing pool of data and technology skills, in both IT as well as the business. It is vital to harness all of this for the combined good of the company.

We want to know what you think. Reply in the comments and let us know whether you agree or disagree with the statements above, and whether you think there’s alignment around them within your IT organization. (If you care to share your title, or to categorize yourself as an IT executive or IT staff member, that’d be helpful, too.) I’ll review the results and report back in a future blog post.


Telecommunications and Data: What If Your Fiancée Flunked Finance?

About 15 or so years ago, some friends of mine called me to share great news.  Their dating relationship had become serious and they were headed toward marriage.  After a romantic proposal and a beautiful ring, it was time to plan the wedding and invite the guests.

Lack of a Steady Income Stream is Not Romantic

This exciting time was confounded by a significant challenge. Though they were very much in love, one of them had an incredibly tough time making wise financial choices. During the wedding planning process, the financially astute fiancée grew concerned about the problems the challenged partner could bring. Even though the financially illiterate fiancée had every other admirable quality, the finance issue nearly created enough doubt to end the engagement.  Fortunately, my friends moved forward with the ceremony, were married and immediately went to work on learning new healthy financial habits as a couple.

Is financial folly a relationship red flag?

Let’s segue into how this relates to telecommunications and data, specifically to your average communications operator. Just like a concerned fiancée, you’d think twice about making a commitment to an organization that didn’t have a strong foundation.

Like the financially challenged fiancée, the average operator has a number of excellent qualities: a functioning business model, great branding, international roaming, creative ads, long-term prospects, smart people at the helm, and all the data and IT assets you can imagine.  Unfortunately, despite the externally visible bells and whistles, over time they tend to lose operational soundness around the basics. Specifically, their lack of data quality causes them to forfeit an ever-increasing amount of billing revenue. Their poor data costs them millions each year.

A recent set of engagements highlighted this phenomenon. A small carrier (3-6 million subscribers) that implements a more consistent, unique way to manage core subscriber profile and product data could recover $6.9 million in underbilling annually. A larger carrier (10-20 million subscribers) could recover $28.1 million every year by fixing billing errors. (This doesn’t even cover the large Indian and Chinese carriers, who have over 100 million customers!)

Typically, a billing error starts with an incorrect setup of a service line item’s base price and its 30+ related discount line variances.  Next, the wrong service discount item is applied at contract start.  If that doesn’t happen (or on top of it), the error occurs when the customer calls in during, or right before the end of, the first contract period (12-24 months) to complain about service quality, bill shock, etc.  Here, the call center rep will break an existing triple-play bundle by deleting an item and setting up a separate non-bundle service line item at a lower price (higher discount).  One head of billing actually told us, “our reps just give a residential subscriber a discount of $2 for calling us”.  It’s even higher for commercial clients.

To make matters worse, this change will trigger misaligned (incorrect) activation dates or even bill duplication, all of which will have to be fixed later by multiple staff on the BSS and OSS sides, or may even trigger an investigation project by the revenue assurance department.  Worst case, the deletion of the item from the bundle (especially for B2B clients) will not terminate the wholesale cost the carrier still owes a national carrier for the broadband line, which is often one-third of the retail price for a business customer.
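To see how a $2 “goodwill” discount compounds into the recovery figures above, here is a back-of-envelope calculation. The subscriber count and error rate are my own assumptions, chosen to land in the small-carrier range, not figures from the engagements:

```python
# Back-of-envelope revenue leakage from small recurring billing discounts.
# All inputs below are assumptions for illustration, not engagement data.
subscribers = 4_000_000        # a "small" carrier, per the 3-6M range above
share_with_error = 0.07        # assume ~7% of accounts carry a bad discount
monthly_leak_per_sub = 2.00    # the $2 call-center discount from the example

annual_leakage = subscribers * share_with_error * monthly_leak_per_sub * 12
print(f"${annual_leakage:,.0f} per year")  # prints $6,720,000, near the $6.9M above
```

The point is not the exact inputs but the mechanics: a tiny recurring concession, applied to a single-digit percentage of subscribers, quietly compounds into millions every year.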

To come full circle to my initial “accounting challenged” example: would you marry (invest in) this organization?  Do you think this can or should be solved in a big-bang approach, or incrementally?  Where would you start: product management, the service center, residential customers, or commercial customers?

Observations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


Harness the Flow of Valuable Data Files Throughout Your System

Managing the recovery and flow of data files throughout your enterprise is much like managing the flow of oil from well to refinery – a wide range of tasks must be carefully completed to ensure optimal resource recovery. If these tasks are not handled properly, or are not addressed in the correct order, valuable resources may be lost. When the process involves multiple pipelines, systems, and variables, managing the flow of data can be difficult.

Organizations have many options to automate the processes of gathering data, transferring files, and executing key IT jobs. These options include home-built scheduling solutions, system integrated schedulers, and enterprise schedulers. Enterprise schedulers, such as Skybot Scheduler, often offer the most control over the organization’s workflow, as they offer the ability to create schedules connecting various applications, systems, and platforms.

In this way, the enterprise scheduler facilitates the transfer of data into and out of Informatica PowerCenter and Informatica Cloud, and ensures that raw materials are refined into valuable resources.

Enterprise Scheduling Automates Your Workflow

Think of an enterprise scheduler as the pipeline bearing data from its source to the refinery. Rather than allowing jobs or processes to execute randomly or to sit idle, the enterprise scheduler automates your organization’s workflow, ensuring that tasks are executed under the appropriate conditions without the need for manual monitoring or the risk of data loss.

Skybot Scheduler addresses the most common pain points associated with data recovery, including:

  • Scheduling dependencies: In order for PowerCenter or Cloud to complete the data gathering processes, other dependencies must be addressed. Information must be swept and updated, and files may need to be reformatted. Skybot Scheduler automates these tasks, keeping the data recovery process consistently moving forward.
  • Reacting to key events: As with oil recovery, small details can derail the successful mining of data. Key events, such as directory changes, file arrivals, and evaluation requirements, can lead to a clog in the pipeline. Skybot Scheduler maintains the flow of data by recognizing these key events and reacting to them automatically, as illustrated in the generic sketch after this list.
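For readers who want to picture what this event-driven pattern replaces, here is a generic Python sketch of a hand-rolled file-arrival trigger. This illustrates the pattern only and is not Skybot's API; the directory, file pattern, and downstream command are placeholders:

```python
# A generic, hand-rolled file-arrival trigger: the pattern an enterprise
# scheduler formalizes. Paths, glob, and the job command are placeholders.
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path("/data/incoming")  # hypothetical landing directory
seen = set()                        # file names already processed


def on_file_arrival(path: Path) -> None:
    """React to a new file by kicking off the downstream load job."""
    print(f"Detected {path.name}; starting load job")
    subprocess.run(["./run_load_job.sh", str(path)], check=True)  # placeholder job


while True:  # poll until interrupted
    for f in WATCH_DIR.glob("*.csv"):
        if f.name not in seen:
            seen.add(f.name)
            on_file_arrival(f)
    time.sleep(30)  # check for new arrivals every 30 seconds
```

An enterprise scheduler does the same watching and reacting, but adds cross-platform dependencies, error handling, and an audit trail; that is exactly the glue that scripts like this tend to lack.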

Choose the Best Pipeline Available

Skybot Scheduler is one of the most powerful enterprise scheduling solutions available today, and is the only enterprise scheduler integrated with PowerCenter and Cloud.

Capable of creating comprehensive cross-platform automation schedules, Skybot Scheduler manages the many steps in the process of extracting, transforming, and loading data. Skybot maintains the flow of data by recognizing directory changes and other key events, and reacting to them automatically.

In short, by managing your workflow, Skybot Scheduler increases the efficiency of ETL processes and reduces the potential of a costly error.

To learn more about the power of enterprise scheduling and Skybot Scheduler, check out the webinar “Improving Informatica ETL Processing with Enterprise Job Scheduling” or download the free trial.
