Category Archives: Business Impact / Benefits

Guest interview with Jorij Abraham: author of the first book about PIM and founder of E-commerce Foundation

Jorij Abraham is the founder of the E-commerce Foundation, a non-profit organization dedicated to helping organizations and industries improve their e-commerce activities. He advises companies on e-commerce strategy, omnichannel development and product information management. He also works as Director of Research & Advice for Ecommerce Europe.

He’s written a fine book about PIM, but don’t expect a technical book at all! This is what marketing teams, merchandisers, product category teams, and digital strategists should be reading.

Like him or not, when he talks you’d better listen!

Michele: Let’s start with a view on the PIM market. Where are we globally?

Jorij: We are just starting. Most retailers still do not realize how important product information is to selling digitally. In some countries the expectation is that by 2020, 30–50% of all consumer goods will be bought online. PIM is no longer an option. It is essential to be successful now and in the upcoming years.

M.: What was the main inspiration behind the book? It is definitely the first book about PIM but I am sure the motivation runs a bit deeper than that.

J.: The fact that there is very little in-depth information available about the subject triggered me to write the book. However, I got a lot of help from experts from Unic and the different software vendors, and was very happy with all the great research Heiler had already done in the area.

M.: Who should be reading your book?

J.: I wrote the book for a broad audience: managers, employees responsible for product information management, marketers, merchandisers, and even students! There are chapters covering the basics and how a PIM can help a company on a strategic, tactical and operational level. A few later chapters are devoted to helping product information officers implement a PIM system and choose the right PIM solution.

M.: I see that in your book you cover a good number of big PIM vendors. What is the future for those who target mid-market businesses?

J.: If you look at the overall market, I think we will see a large shake-out in the industry. We will have very big players like Amazon, eBay and Walmart, and lots of niche players. Medium-sized businesses will have a difficult time surviving. All of them will need a PIM system, however.

M.: What’s your take on the different PIM vendors out there? I personally see different flavours of PIM, such as those that are more commerce-friendly, those that are more ERP-friendly, or just minimalist PIM solutions.

J.: In the book I discuss several solutions. Some are for companies starting with PIM, others are top of the line. Especially for larger firms with lots of product information to manage, I recommend making a larger investment. Low-end PIM solutions are a good choice if you expect your needs will remain simple. However, if you know that within two or three years you will have 100,000 products in multiple languages with lots of attributes, do not start with a simple solution. Within 1.5 years you will have to migrate again, and the costs of migration are not worth the licence costs saved.

M.: In your view, what are the major inhibitors to PIM adoption?

J.: There are many strategic, tactical and operational benefits, but managers have difficulty understanding the ROI because it is indirect. PIM can improve traffic to your site, increase conversion ratios, and reduce returns.

M.: Would it be easier to promote PIM in combination with a WCMS platform? More generally, is there a case to promote PIM as part of a greater strategic thrust?

J.: I personally prefer systems which are great at doing what they are meant to do. However, it very much depends on the needs of the company. Combining a PIM with a WCMS means mixing two solutions with very different goals. Hybris is an example of a complete solution. If you want to buy everything at once, it is a good choice. However, what I like very much about the Heiler/Informatica solution is that it is great at doing what it says it does. The user-friendliness of the system in particular is a big plus. Why? Because if a PIM fails, it is usually because of low user adoption.

M.: What would you suggest to Australian retailers who are clearly reluctant to adopt PIM primarily because of limited local references (at least on large scale)?

J.: Retail in Australia is going the same way as everywhere else. Digital commerce will be a fact of life, and a PIM is essential to be successful online. Look at the proof in Asia, Europe and the USA. PIM is here to stay.

M.: Is PIM now what ERP was in the 90s and CRM at the beginning of the millennium? In other words, will it ever become a commodity?

J.: I think so. But we are really at the start of PIM. Even in 2014, CRM is not yet really a part of most IT architectures. So we have a long way to go…

M.: Let’s talk about the influence exerted by analyst firms such as Gartner, Forrester, and Ventana. What’s your view on this? Are they moving the market? They put a lot of effort into trying to differentiate themselves. For example, see how the Gartner MDM Quadrant for Products combines MDM and PIM players.

J.: I think the research agencies in general do not yet get PIM to the full extent. It is still a niche market, and they are combining solutions, which in my view is not helping the business and IT user. I have seen companies buy an MDM solution expecting it to support their PIM processes. MDM is very different from PIM, although their goals overlap. I often see that PIM has many more end users and requires faster publication processes. There are only a few solutions in the market which really combine MDM and PIM in a sensible way.

M.: Looking at your book, I noticed that you spend a great deal of effort in unearthing what I’d call “PIM core concepts”. However, while the core concepts are stable, PIM is a technology-enabled discipline and will undergo ongoing enhancements. What is your view on this?

J.: This is a tough question. In fact, a few chapters in my book may go out of date soon. For example, PIM providers keep popping up and it’s hard to keep up. On a more important note, I also see the following trends:

a) The cloud is going to have a fundamental impact on PIM solutions. It will be hard to sell an on-premise solution to companies that are very much focused on their core business and outsource everything else (e.g. retailers).

b) I see companies working much more intensively to collect and disseminate accurate product information. This is costly and operationally inefficient if it is undertaken in isolation. In fact, there’s room to improve the overall supply chain by integrating product information across different parties, e.g. suppliers, manufacturers, and retailers.

c) Finally, I see the emergence of social as another key development in the PIM space. Just think about the contribution that consumers provide when they shop online and share their experience on social platforms or provide a product recommendation and/or rating. This is product information, and PIMs need to incorporate it into the overall product enrichment.

M.: Thank you Jorij. This has been a fantastic opportunity for me and my readers to learn more about you and the great work you are doing.

J.: It is great that you are putting so much effort into sharing information about product information management. Only in this way can companies start to understand the value of PIM, increase sales and reduce costs.

The book is available on the Springer website, Amazon, Book Depository, and many other book stores.


Who Has the Heart to Adopt this Orphan Oil Well?

As I browsed my BBC app a few weeks ago, I ran into this article about environmental contamination from oil wells in the UK that were left to their own devices. The article explains that a lack of data and proper data management is causing major issues for gas and oil companies. In fact, researchers found no data for more than 2,000 inactive wells, many of which have been abandoned or “orphaned” (sealed and covered up). I started to scratch my head imagining what this problem looks like in places like Brazil, Nigeria, Malaysia, Angola and the Middle East. In these countries and regions, regulatory oversight is, on average, a bit less rigorous.

Like Oliver, this well needs a home!

On top of that, please excuse my cynicism here, but an “Orphan” well is just as ridiculous a concept as a “Dry” well.  A hole without liquid inside is not a well but – you guessed it – a hole.  Also, every well has a “Parent”, meaning

  • The person or company who drilled it
  • A landowner who will get paid from its production and allowed the operation (otherwise it would be illegal)
  • A financier who fronted the equipment and research cost
  • A regulator, who is charged with overseeing the reservoir’s exploration

Let the “hydrocarbon family court judge” decide whose problem this orphan is, armed with well-founded information – no pun intended.  After all, this “domestic disturbance” is typically just as well documented as any police “house call” when you hear screams from next door. Similarly, one would expect that when (exploratory) wells are abandoned and improperly capped or completed, there is a long track record of financial or operational troubles at the parties involved.  Apparently I was wrong.  Nobody seems to have a record of where the well actually was on the surface, let alone subsurface, to determine perforation risks from the well itself or from an actively managed bore nearby.

This reminds me of a meeting with an Asian NOC’s PMU IT staff, who vigorously disagreed with every other department on the reality on the ground versus at group level. The PMU folks insisted they had all wells’ key attributes fixed:

  1. Knowing how many wells and bores they had across the globe and all types of commercial models including joint ventures
  2. Where they were and are today
  3. What their technical characteristics were and currently are

The other departments, from finance to strategy, clearly indicated that the 10,000 wells across the globe currently being “mastered” with (at least initially) cheap internal band-aid fixes have a margin of error of up to 10%.  So much for long-term TCO.  After reading this BBC article, this internal disagreement made even more sense.

If this chasm does not make a case for proper mastering of key operational entities, like wells, I don’t know what does. It also begs the question of how any operation with potentially very negative long-term effects can have no legally culpable party captured in some sort of, dare I say, master register.  Isn’t this the sign of the “rule of law” governing an advanced nation, e.g. having a land register, building permits, wills, etc.?

I rest my case, your honor.  May the garden fairies forgive us for spoiling their perfectly manicured lawn.  With more fracking and public scrutiny on the horizon, maybe regulators need to establish their own “trusted” well master file, rather than rely on oil firms’ data dumps.  After all, the next downhole location may be just a foot away from perforating one of these “orphans”, setting your kitchen sink faucet on fire.

Do you think another push for local governments to establish “well registries”, like they did ten years ago for national IDs, is in order?

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


A Data-Driven Healthcare Culture is Foundational to Delivering Personalized Medicine in Healthcare

According to a recent article in the LA Times, healthcare costs in the United States far exceed costs in other countries. For example, heart bypass surgery costs an average of $75,345 in the U.S., compared to $15,742 in the Netherlands and $16,492 in Argentina. In the U.S., healthcare accounts for 18% of GDP and is increasing.


Michelle Blackmer is a healthcare industry expert at Informatica. In this interview, she explains why business as usual isn’t good enough anymore. Healthcare organizations are rethinking how they do business in an effort to improve outcomes, reduce costs, and comply with regulatory pressures such as the Affordable Care Act (ACA). Michelle believes a data-driven healthcare culture is foundational to personalized medicine and discusses the importance of clean, safe and connected data in executing a successful transformation.

Q. How is the healthcare industry responding to the rising costs of healthcare?
In response to the rising costs of healthcare, regulatory pressures (i.e. the Affordable Care Act (ACA)), and the need to deliver better patient outcomes at lower costs, the U.S. healthcare industry is transforming from a volume-based to a value-based model. In this new model, healthcare organizations need to invest in delivering personalized medicine.

To appreciate the potential of personalized medicine, think about your own healthcare experience. It’s typically reactive. You get sick, you go to the doctor, the doctor issues a prescription and you wait a couple of days to see if that drug works. If it doesn’t, you call the doctor and she tries another drug. This process is tedious, painful and costly.

Now imagine if you had a chronic disease like depression or cancer. On average, any given prescription drug only works for half of those who take it. Among cancer patients, the rate of ineffectiveness jumps to 75 percent. Anti-depressants are effective in only 62 percent of those who take them.

Organizations like MD Anderson and UPMC aim to put an end to cancer. They are combining scientific research with access to clean, safe and connected data (data of all types, including genomic data). The insights revealed will empower personalized chemotherapies. Personalized medicine offers customized treatments based on patient history and best practices. Personalized medicine will transform healthcare delivery. Click on the links to watch videos about their transformational work.

Q. What role does data play in enabling personalized medicine?
Data is foundational to value-based care and personalized medicine. Not just any data will do. It needs to be clean, safe and connected data. It needs to be delivered rapidly across hallways and across networks.

As an industry, healthcare is at a stage where meaningful electronic data is being generated. Now you need to ensure that the data is accessible and trustworthy so that it can be rapidly analyzed. As data is aggregated across the ecosystem and married with financial and genomic data, data quality issues become more obvious. It’s vital that you can identify the data issues so that people can spend their time analyzing the data to gain insights instead of wading through and manually resolving data quality issues.

The ability to trust data will differentiate leaders from the followers. Leaders will advance personalized medicine because they rely on clean, safe and connected data to:

1)      Practice analytics as a core competency
2)      Define evidence, deliver best practice care and personalize medicine
3)      Engage patients and collaborate to foster strong, actionable relationships

Take a look at this Healthcare eBook for more on this topic: Potential Unlocked: Transforming Healthcare by Putting Information to Work.

Q. What is holding healthcare organizations back from managing their healthcare data like other mission-critical assets?
When you say other mission-critical assets, I think of facilities, equipment, etc. Each of these assets has people and money assigned to manage and maintain them. The healthcare organizations I talk to who are highly invested in personalized medicine recognize that data is mission-critical. They are investing in the people, processes and technology needed to ensure data is clean, safe and connected. The technology includes data integration, data quality and master data management (MDM).

What’s holding other healthcare organizations back is that while they realize they need data governance, they wrongly believe they need to hire big teams of “data stewards” to be successful. In reality, you don’t need to hire a big team. Use the people you already have doing data governance. You may not have made this a formal part of their job description and they might not have data governance technologies yet, but they do have the skillset and they are already doing the work of a data steward.

So while a technology investment is required and you need people who can use the technology, start by formalizing the data stewardship work people are already doing as part of their current job. This way you have people who understand the data taking an active role in managing it, and they even get excited about it because their work is being recognized. IT takes on the role of enabling these people instead of having responsibility for all things data.

Q. Can you share examples of how immature information governance is a serious impediment to healthcare payers and providers?
Sure. Without information governance, data is not harmonized across sources, so it is hard to make sense of it. This isn’t a problem when you are one business unit or one department, but when you want to get a comprehensive view, or a view that incorporates external sources of information, this approach falls apart.

For example, let’s say the cardiology department in a healthcare organization implements a dashboard. The dashboard looks impressive. Then a group of physicians sees the dashboard, points out errors and asks where the information (i.e. diagnosis or attending physician) came from. If you can’t answer these questions, trace the data back to its sources, or if you have data inconsistencies, the dashboard loses credibility. This is an example of how analytics fail to gain adoption and fail to foster innovation.

Q. Can you share examples of what data-driven healthcare organizations are doing differently?
Certainly, while many are just getting started on their journey to becoming data-driven, I’m seeing some inspiring  examples, including:

  • Implementing data governance for healthcare analytics. The program and data are owned by the business, enabled by IT and supported by technology such as data integration, data quality and MDM.
  • Connecting information from across the entire healthcare ecosystem including 3rd party sources like payers, state agencies, and reference data like credit information from Equifax, firmographics from Dun & Bradstreet or NPI numbers from the national provider registry.
  • Establishing consistent data definitions and parameters
  • Thinking about the internet of things (IoT) and how to incorporate device data into analysis
  • Engaging patients through non-traditional channels including loyalty programs and social media; tracking this information in a customer relationship management (CRM) system
  • Fostering collaboration by understanding the relationships between patients, providers and the rest of the ecosystem
  • Analyzing data to understand what is working and what is not working so  that they can drive out unwanted variations in care

Q. What advice can you give healthcare provider and payer employees who want access to high quality healthcare data?
As with other organizational assets that deliver value—like buildings and equipment—data requires a foundational investment in people and systems to maximize return. In other words, institutions and individuals must start managing their mission-critical data with the same rigor they manage other mission-critical enterprise assets.

Q. Anything else you want to add?
Yes, I wanted to thank our 14 visionary customer executives at data-driven healthcare organizations such as MD Anderson, UPMC, Quest Diagnostics, Sutter Health, St. Joseph Health, Dallas Children’s Medical Center and Navinet for taking time out of their busy schedules to share their journeys toward becoming data-driven at Informatica World 2014.  In our next post, I’ll share some highlights about how they are using data, how they are ensuring it is clean, safe and connected and a few data management best practices. InformaticaWorld attendees will be able to download presentations starting today! If you missed InformaticaWorld 2014, stay tuned for our upcoming webinars featuring many of these examples.

 

 


Agile Data Integration in Action: PowerCenter 9.6 Demo

A Data Integration Developer, a Data Analyst and a Business Analyst go into a bar… Heard that one? You probably didn’t. They never made it to the bar. They are still back at the office, going back and forth for the umpteenth time on the data requirements for the latest report…

Sound familiar? If so, you are not alone. Many IT departments are struggling to meet the data needs of their business counterparts. Spreadsheets, emails and cocktail napkins have not proven to be effective tools for relaying data requirements from the business. The process takes too long and leaves both sides frustrated and dissatisfied with the outcome. IT does not have the bandwidth to meet the ever-increasing and rapidly changing data needs of the business.

The old-fashioned “waterfall” approach to data integration simply won’t cut it anymore in the fast-paced data-driven world. There has to be a better way. Here at Informatica, we believe that an end-to-end Agile Data Integration process can greatly increase business agility.

We start with a highly collaborative process, whereby IT and the Analyst work closely together through an iterative process to define data integration requirements. IT empowers the analyst with self-service tools that enable rapid prototyping and data profiling. Once the analyst is happy with the data they access and combine, they can use their tool to seamlessly share the output with IT for final deployment. This approach greatly reduces the time-to-data, and not just any data, the right data!

The ability to rapidly generate reports and deliver new critical data for decision-making is foundational to business agility. Another important aspect of business agility is the ability to scale your system as your needs grow to support more data, data types, users and projects. We accomplish that through advanced scaling capabilities, such as grid support and high availability, leading to zero downtime, as well as improved data insights through metadata management, lineage, impact analysis and business glossary.

Finally, we need to continue to ensure agility when our system is in production. Data validation should be performed to eliminate data defects. Trying to manually validate data is like looking for a needle in a haystack, very slowly… Automating your data validation process is fast and reliable, ensuring that the business gets accurate data all the time.
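
To make that concrete, below is a minimal, purely illustrative sketch in Python/pandas of the kind of check that can be automated. The file names, column names and rules are hypothetical, and this is not a depiction of any specific Informatica validation product – just the general pattern of comparing a source extract against the loaded target.

```python
# Purely illustrative sketch of automated data validation.
# File names, column names and rules below are hypothetical.
import pandas as pd


def validate_load(source_csv: str, target_csv: str, key: str) -> list:
    """Compare a source extract with the loaded target and list any defects."""
    src = pd.read_csv(source_csv)
    tgt = pd.read_csv(target_csv)
    issues = []

    # 1. Row counts should match after the load.
    if len(src) != len(tgt):
        issues.append(f"row count mismatch: source={len(src)}, target={len(tgt)}")

    # 2. The business key should never be null or duplicated in the target.
    if tgt[key].isna().any():
        issues.append(f"null values in key column '{key}'")
    if tgt[key].duplicated().any():
        issues.append(f"duplicate values in key column '{key}'")

    # 3. Every source key should have arrived in the target.
    missing = set(src[key]) - set(tgt[key])
    if missing:
        issues.append(f"{len(missing)} source keys missing from target")

    return issues


if __name__ == "__main__":
    for defect in validate_load("orders_source.csv", "orders_target.csv", "order_id"):
        print("DATA DEFECT:", defect)
```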

It is just as important to become more proactive and less reactive when it comes to your data in production. Early detection of data process and workflow problems through proactive monitoring is key to prevention.

Would you like to see a 5X increase in the speed of delivering data integration projects?

Would you like to provide the system reliability you need as your business grows, and ensure that your business continues to get the critical data it requires without defects and without interruption?

To learn more about how Agile Data Integration can enable business agility, please check out the demonstration of the newly-released PowerCenter 9.6, featuring David Lyle, VP Product Strategy at Informatica and the Informatica Product Desk experts. This demo webinar is available on demand.

Deep Dive Demo: Informatica PowerCenter 9.6.


Business Beware! Corporate IT Is “Fixing” YOUR Data

It is troublesome to me to repeatedly get into conversations with IT managers who want to fix data “for the sake of fixing it”.  While this is presumably increasingly rare, due to my department’s role we probably see a higher occurrence than the average software vendor employee does.  Given that, please excuse the inflammatory title of this post.

Nevertheless, once the deal is done, we find increasingly fewer of these instances, yet still enough, as the average implementation consultant or developer cares about this aspect even less.  A few months ago, a petrochemical firm’s G&G IT team lead told me that he does not believe that data quality improvements can or should be measured.  He also said, “If we need another application, we buy it.  End of story.”  Good for software vendors, I thought, but in most organizations $1M here or there does not lie around leisurely, plus decision makers want to see the – dare I say it – ROI.

This is not what a business – IT relationship should feel like

However, IT and business leaders should take note that misalignment caused by a lack, or disregard, of communication puts a critical success factor at risk.  If the business does not get what it needs and wants, AND that differs from what Corporate IT is envisioning and working on – and this is what I am talking about here – it makes any IT investment a risky proposition.

Let me illustrate this with 4 recent examples I ran into:

1. Potential for flawed prioritization

A retail customer’s IT department apparently knew that fixing and enriching a customer loyalty record across the enterprise is a good and financially rewarding idea.  They only wanted to understand what the less-risky functional implementation choices were. They indicated that if they wanted to learn what the factual financial impact of “fixing” certain records or attributes was, they would just have to look into their enterprise data warehouse.  This is where the logic falls apart, as the warehouse would be just as unreliable as the “compromised” applications (POS, mktg, ERP) feeding it.

Even if they massaged the data before it hit the next EDW load, there is nothing inherently real-time about this, as all OLTP systems keep running their processes on incorrect (no bidirectional linkage) and stale (since the last load) data.

I would question whether the business is now completely aligned with what IT is continuously correcting. After all, IT may go for the “easy or obvious” fixes via a weekly or monthly recurring data scrub exercise without truly knowing which fix is the “biggest bang for the buck” or what the other affected business use cases are – some of which they may not even be aware of yet.  Imagine the productivity impact of all the round-tripping and delay in reporting this creates.  This example also reminds me of a telco client I encountered during my tenure at another tech firm, which fed their customer master from their EDW and has just found out that this pattern is doomed to fail due to data staleness and performance.

2. Fix IT issues and business benefits will trickle down

Client number two is a large North American construction company.  An architect built a business case for fixing a variety of data buckets in the organization (CRM, Brand Management, Partner Onboarding, Mobility Services, Quotation & Requisitions, BI & EPM).

Grand vision documents existed and were linked to the case, stating how data would get better (like a sick patient), but there was no mention of hard facts about how each of the use cases would deliver on this.  After I gave him some major counseling on what to look out for and how to flesh it out – radio silence. Someone got scared of the math, I guess.

3. Now that we bought it, where do we start

The third culprit was a large petrochemical firm, which apparently sat on some excess funds and thought (rightfully so) it was a good idea to fix their well attributes. More power to them.  However, the IT team is now in a dreadful position having to justify to their boss and ultimately the E&P division head why they prioritized this effort so highly and spent the money.  Well, they had their heart in the right place but are a tad late.   Still, I consider this better late than never.

4. A senior moment

The last example comes from a South American communications provider. They seemingly did everything right, given the results they achieved to date.  This goes to show that misalignment of IT and business does not necessarily wreak havoc – at least initially.

However, they are now in phase 3 of their roll-out and reality has caught up with them.  A senior moment or a lapse in judgment, maybe? Whatever it was, once they fixed their CRM, network and billing application data, they had to start talking to the business and financial analysts as complaints and questions started to trickle in. Once again, better late than never.

So what is the take-away from these stories? Why wait until phase 3? Why be forced to cram in some justification after the purchase?  You pick which one works best for you to fix this age-old issue.  But please heed Sohaib’s words of wisdom, recently broadcast on CNN Money: “IT is a mature sector post-bubble… now it needs to deliver the goods.”  And here is an action item for you – check out the new way for business users to prepare their own data (30 minutes into the video!).  Agreed?


The Future of Data for Everyone

Within every corporation there are lines of business, like Finance, Sales, Logistics and Marketing. And within those lines of business are business users who are either non-technical or choose to be non-technical.

These business users are increasingly using Next-Generation Business Intelligence Tools like Tableau, Qliktech, MicroStrategy Visual Insight, Spotfire or even Excel. A unique capability of these Next-Generation Business Intelligence Tools is that they allow a non-technical Business User to prepare data, themselves, prior to the ingestion of the prepared data into these tools for subsequent analysis.

Initially, the types of activities involved in preparing this data are quite simple. It involves, perhaps, putting together two Excel files via a join on a common field. However, over time, the types of operations a non-technical user wishes to perform on the data become more complex. They wish to do things like join two files of differing grain, or validate/complete addresses, or even enrich company or customer profile data. And when a non-technical user reaches this point, they require either coding or advanced tooling, neither of which they have access to. Therefore, at this point, they will pick up the phone, call their brethren in IT and ask nicely for help with combining the data, enhancing its quality and enriching it. Oftentimes they require the resulting dataset back in a tight timeframe, perhaps a couple of hours. IT will initially be very happy to oblige. They will get the dataset back to the business user in the timeframe requested and at the quality levels expected. No issues.
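
For illustration, the two stages of preparation described above, from the simple join on a common field to the trickier differing-grain case, might look roughly like this once code is involved. This is a minimal pandas sketch; the file and column names are hypothetical and not tied to any particular tool.

```python
# Illustrative only: file and column names are hypothetical.
import pandas as pd

# Simple case: two spreadsheets joined on a common field.
customers = pd.read_excel("customers.xlsx")   # one row per customer
orders = pd.read_excel("orders.xlsx")         # one row per order line
orders_with_customers = orders.merge(customers, on="customer_id", how="left")

# Harder case: the files have differing grain, so order lines must be
# rolled up to one row per customer before the join makes sense.
order_totals = (
    orders.groupby("customer_id", as_index=False)
          .agg(total_revenue=("amount", "sum"),
               order_lines=("order_id", "count"))
)
customer_summary = customers.merge(order_totals, on="customer_id", how="left")

print(customer_summary.head())
```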

However, as the number of non-technical Business Users using Next-Generation Business Intelligence tools increases, the number of requests to IT for datasets also increases. And so, while initially IT was able to meet the “quick hit dataset” requests from the Business, over time, and to the best of their abilities, IT increasingly becomes unable to do so.

The reality is that, over time, the business will see a gradual decrease in the quality of the datasets returned, as well as an increase in the timeframe required for IT to provide the data. And at some point the business will reach a decision point. This is where they determine that, for them to meet their business commitments, they will have to find other means by which to put together their “quick hit datasets.” It is precisely at this point that the business may do things like hire an IT contractor to sit next to them to do nothing but put together these “quick hit” datasets. It is also when IT begins to feel marginalized and will likely begin to see a drop in funding.

This dynamic is one that has been around for decades and has continued to worsen due to the increase in the pace of data-driven business decision-making. I feel that we at Informatica have a truly unique opportunity to innovate a technology solution that focuses on two related constituents: the Non-Technical Business User and the IT Data Provisioner.

The specific point of value that this technology will provide to the Non-Technical Business User is that it will enable them to rapidly put together datasets for subsequent analysis in their Next-Generation BI tool of choice. Without this tool they might spend a week or two putting together a dataset, or wait for someone else to put it together. I feel we can improve this division of labor: business users spend 15 minutes putting the dataset together themselves and then 1-2 weeks performing meaningful analysis. In doing so, we allow non-technical business users to dramatically decrease their decision-making time.

The specific point of value that this technology will provide the IT data provisioner is that they will now be able to effectively scale data provisioning as the number of requests for “quick hit datasets” rapidly increase. Most importantly, they will be able to scale, proactively.

Because of this, the business and IT relationship can become a match made in heaven.


How Can CEOs Protect Customer Data And Their Own Jobs?


Recently, a number of high-profile data breaches have drawn attention to the impact that compromised data can have on a business. When customer data is breached, the consequences can include:

  • A loss of customer trust
  • Revenue shortfalls
  • A plummeting stock price
  • C-level executives losing their jobs

As a result, data security and privacy have become key topics of discussion, not just in IT meetings, but in the media and the boardroom.

Preventing access to sensitive data has become more complex than ever before. There are new potential entry points that IT never previously considered. These new options go beyond typical BYOD user devices like smartphones and tablets. Today’s entry points can be much smaller: things like HVAC controllers, office Polycom units and temperature control systems.

So what can organizations do to combat this increasing complexity? Traditional data security practices focus on securing both the perimeter and the endpoints. However, these practices are clearly no longer working and no longer manageable. Not only are the number and type of devices expanding, but the perimeter itself is no longer present. As companies increasingly outsource, off-shore and move operations to the cloud, it is no longer possible to fence the perimeter and keep intruders out. Because 3rd parties often require some form of access, even trusted user credentials may fall into the hands of malicious intruders.

Data security requires a new approach. It must use policies to follow the data and to protect it, regardless of where it is located and where it moves. Informatica is responding to this need. We are leveraging our market leadership and domain expertise in data management and security. We are defining a new data security offering and category.  This week, we unveiled our entry into the Data Security market at our Informatica World conference. Our new security offering, Secure@Source™ will allow enterprises to discover, detect and protect sensitive data.

The first step towards protecting sensitive data is to locate and identify it. So Secure@Source™ first allows you to discover where all the sensitive data is located in the enterprise and classify it.  As part of the discovery, Secure@Source also analyzes where sensitive data is being proliferated, who has access to the data, who is actually accessing it and whether the data is protected or unprotected when accessed.  Secure@Source™ leverages Informatica’s PowerCenter repository and lineage technology to perform a first-pass, quick discovery with a more in-depth analysis and profiling over time.  The solution allows you to determine the privacy risk index of your enterprise and slice and dice the analysis based on region, department, organization hierarchy, as well as data classifications.


The longer-term vision of Secure@Source™ will allow you to detect suspicious usage patterns and orchestrate the appropriate data protection method, such as: alerting, blocking, archiving and purging, dynamically masking, persistently masking, encrypting, and/or tokenizing the data. The data protection method will depend on whether the data store is a production or non-production system, and whether you would like to de-identify sensitive data across all users or only for some users.  All can be deployed based on policies. Secure@Source™ is intended to be an open framework for aggregating data security analytics and will integrate with key partners to provide comprehensive visibility into, and assessment of, an enterprise’s data privacy risk.

Secure@Source™ is targeted for beta at the end of 2014 and general availability in early 2015.  Informatica is recruiting a select group of charter customers to drive and provide feedback for the first release. Customers who are interested in being a charter customer should register and send email to SecureCustomers@informatica.com.


MDM Day Advice: Connect MDM to a Tangible Business Outcome or You Will Fail

“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advises Michael Delgado, Information Management Director at Citrix, during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.

A record 500 people attended MDM Day in Las Vegas

In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing line-up of master data management (MDM) and product information management (PIM) customer speakers and Informatica experts, as well as our talented partner sponsors.

Here are my MDM Day fun facts and key takeaways:

  • Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?

    Ginny Walker, Chief Enterprise Architect at GE Aviation

    GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services.  Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity results in productivity improvements such as reducing up to 2% of the annual fuel bill and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.

  •  Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?
    Barbara Latulippe, Senior Director, Enterprise Information Management at EMC

    EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warns that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”

  • Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?

    Michael Delgado, Information Management Director at Citrix

    Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started their MDM journey to better understand their total channel partner relationship to make it easier to do business with Citrix and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.

  •  Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?

    John Poonnen, Director Infosario Data Factory at Quintiles

    Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality, improve productivity and to meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational — combining it with other data, capabilities and expertise makes it transformational.

While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:

  • Crestline/ Geiger: Dale Denham, CIO presented, “How Product Information in eCommerce improved Geiger’s Ability to Promote and Sell Promotional Products.”
  • Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”

I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick off and implement an MDM program.

There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.


Why Some Companies are So Good With Analytics

There’s a reason why big data analytics are so successful at some companies, yet fall flat at others. As MIT’s Michael Schrage put it in a recent Harvard Business Review article, it all depends on how deeply the data and tools are employed in the business. “Companies with mediocre to moderate outcomes use big data and analytics for decision support,” he says. “Successful ROA—Return on Analytics—firms use them to effect and support behavior change.”


In other words, analytics really need to drill down deep into the psyche of organizations to make a difference. The more big data analytics get baked into business processes and outcomes, the more likely they are to deliver transformative results to the organization. As he puts it, “better data-driven analyses aren’t simply ‘plugged-in’ to existing processes and reviews, they’re used to invent and encourage different kinds of conversations and interactions.”

You may have heard some of these success stories in recent years – the casino and resort company that tracks customer engagements in real-time and extends targeted offers that will enrich their stay; the logistics company that knows where its trucks are, and can reroute them to speed up delivery and save fuel; the utility that can regulate customers’ energy consumption at critical moments to avoid brownouts.

Schrage’s observations come from interviews and discussions with hundreds of organizations in recent years. His conclusions point to the need to develop an “analytical culture” – in which the behaviors, practices, rituals and shared vision of the organization are based on data versus guesswork. This is not to say gut feel and passion don’t have a place in successful ventures – because they do. But having the data to back up passionate leadership is a powerful combination in today’s business climate.

Most executives instinctively understand the advantages big data can bring to their operations, especially with predictive analytics and customer analytics. The ability to employ analytics means better understanding customers and markets, as well as spotting trends as they are starting to happen, or have yet to happen. Performance analytics, predictive analytics, and prescriptive analytics all are available to decision makers.

Here are some considerations for “baking” data analytics deeper into the business:

Identify the business behaviors or processes to be changed by analytics. In his article, Schrage quotes a financial services CIO, who points out that standard BI and analytical tools often don’t go deeply enough into an organization’s psyche: “Improving compliance and financial reporting is the low-hanging fruit. But that just means we’re using analytics to do what we are already doing better.” The key is to get the business to open up and talk about what they would like to see changed as a result of analytics.

Focus on increasing analytic skills – for everyone. While many organizations go out searching for individuals who can fill data scientist roles (or something similar), there’s likely an abundance of talent and insightfulness that can be brought out of current staff, both inside and outside of IT. Business users, for example, can be trained to work with the latest front-end tools that bring data forward into compelling visualizations. IT and data professionals can sharpen their skills with emerging tools and platforms such as Hadoop and MapReduce, as well as working with analytical languages such as R.

Schrage cites one company that recognized that a great deal of education and training was required before it could re-orient its analytics capabilities around “most profitable customers” and “most profitable products.”  Even clients and partners required some level of training. The bottom line: “The company realized that these analytics shouldn’t simply be used to support existing sales and services practices but treated as an opportunity to facilitate a new kind of facilitative and consultative sales and support organization.”

Automate, and what you can’t automate, make as friendly and accessible as possible. Automated decision management can improve the quality of analytics and the analytics experience for decision makers. That’s because automating low-level decisions – such as whether to grant a credit line increase or extend a special offer to a customer – removes these more mundane tasks from decision makers’ plates. As a result, they are freed up to concentrate on higher-level, more strategic decisions. For those decisions that can’t be automated, information should be as easily accessible as possible to all levels of decision makers – through mobile apps, dashboards, and self-service portals.

 


Data Integration Eight Years Later

I recently came across an article from 2006, which is clearly out of date but still a good read about the state of data integration eight years ago. “Data integration was hot in 2005, and the intense interest in this topic continues in 2006 as companies struggle to integrate their ever-growing mountain of data.

A TDWI study on data integration last November found that 69% of companies considered data integration issues to be a high or very high barrier to new application development. To solve this problem, companies are increasing their spending on data integration products.”

Business intelligence (BI) and data warehousing were the way to go at the time, and companies were spending millions to stand up these systems. Data integration was all massive data movements and manipulations, typically driven by tactical tools rather than true data integration solutions.

The issue I had at the time was the inability to deal with real-time operational data, and the cost of the technology and deployments. While these issues were never resolved with traditional BI and data warehousing technology, we now have access to databases that can manage over a petabyte of data, and the ability to cull through the data in seconds.

The ability to support massive amounts of data has reignited interest in data integration. Up-to-the-minute operational data in these massive data stores is now actually possible. We can now understand the state of the business as it happens, and thus make incremental adjustments based upon almost perfect information.

What this situation leads to is true value. We have delivery of the right information to the right people, at the right time, and the ability to place automated processes and policies around this data. Business becomes self-correcting and self-optimizing. The outcome is a business that is data-driven, and thus more responsive to the markets as well as to the business world itself.

However, big data is an impossible dream without a focus on how the data moves from place to place, using data integration best practices and technology. I guess we can call this big data integration, but it’s really the path to provide these massive data stores with the operational data required to determine the proper metrics for the business.

Data integration is not a new term. However, the application of new ways to leverage and value data brings unprecedented value to enterprises. Millions of dollars an hour in value are being delivered to Global 2000 organizations that leverage these emerging data integration approaches and technologies. What’s more, data integration is moving from the tactical to the strategic budgets of IT.

So, what’s changed in eight years? We finally figured out how to get the value from our data, using big data and data integration. It took us long enough, but I’m glad it’s finally become a priority.
