Category Archives: Vibe
Building a Data Competence for a Decision Ready Organization
There has been a lot of talk about “competing on analytics.” And this year, for the third year in a row, BI/Analytics is the top spending priority for CIOs, according to Gartner. Yet the fact is that about half of all analytics projects do not deliver the expected results on time and on budget. That doesn’t mean the projects don’t show value eventually, but it’s harder and takes longer than most people think.
To compete on analytics is to establish a company goal to deliver actionable business insights faster and better than anybody in your industry – and possibly competitors who may be looking to jump industry boundaries as Google and Apple have already done several times.
This requires a competence in analytics and a competence in data management, which is the focus of this blog. As an analytics manager at a healthcare company told me this week, “We suffer from beautiful reports built on crap data.” Most companies have not yet established standard people, processes, and technology for data management. This is one of the last functional areas in most organizations where this is still true. Sales, Marketing, and Finance standardized years ago. It is only in the area of data management, which is shared by business and IT, that there is no real standardization. The result is unconnected silos of data, long IT backlogs for data-related requests, and a process that is literally getting slower by the day as it is overwhelmed by data volume and data complexity.
Analytics Use Cases and Data Requirements
It is worthwhile to think of the different broad use cases for analytics within an organization and what that means for data requirements.
- Strategic Insights are the high level decisions a company must make. Better performing organizations are moving from “gut feel” to data-driven decision making. The data for these large decisions needs to be as perfect as possible since the business costs of getting it wrong can be enormous.
- Operational Insights require quick decisions to react to on-the-ground conditions. Here, the organization might be willing to sacrifice some data quality in order to deliver quick results. There is a speed versus expected benefit tradeoff to consider.
- Analytics Innovation is the process of asking questions that were never possible or economical to ask before. Often, the first step is to see if there is any value in the question or hypothesis. Here the data does not have to be perfect; approximated data is often “good enough” to test whether a question is worth pursuing further. Some data scientists refer to this as “fail fast and move on quickly.”
The point here is that there is a tradeoff between speed of data delivery and the quality of the data that it is based on. Managers do not want to be making decisions based on bad data, and analysts do not want to spend a high percentage of their time just defending the data.
The Need for Speed in Business Insight Delivery
We are moving from historical to predictive and prescriptive analytics. Practically everybody has historical analysis, so while useful, it is not a market differentiator. The biggest competitive payoff will come from the more advanced forms of analytics. The need for speed as a market differentiator is built on the need to provide service to customers in real time and to make decisions faster than competitors. The “half-life” of an analytics insight drops rapidly once competitors gain the same insight.
Here are a couple of quick examples of predictive and proactive analytics:
- Many retailers are looking to identify a customer coming in the door and put a dashboard in front of the customer service representative that gives them a full profile of the customer’s history, products owned, and positive/negative ratings about those products on social media.
- In Sales, predictive analytics is being used today to recommend the “next best step” with a customer or what to upsell to that customer next and how to position it.
- Beyond that, we are seeing an emerging class of applications and smart devices that will proactively recommend an action to users, without being asked, based on real-time conditions.
The data problems
The big problem is that the data internal to an organization was never designed to be discovered, accessed, and shared across the organization. It is typically locked into a specific application and that application’s format requirements. The new opportunity is the explosion of data external to the organization, which can potentially enable questions that have never been possible to ask before. The best and most differentiating insights will come from combining multiple disparate sources, often a mix of internal and external data.
Common data challenges for analytics:
- The 2015 Analytics and BI survey by InformationWeek found that the #1 barrier to analytics is data quality. And this does not just mean that the data is in the right format. It must be complete, it must have business meaning and context, it must be fit for purpose, and if joined with another data set, it must be joined correctly.
- The explosion of data volume and complexity.
- More than 50% of the data organizations use is coming from external sources (Gartner). This data is often less structured, of unknown structure, and may have limited business context as to what the data means exactly.
- The time value of data. As mentioned earlier, the value of data and insights erodes at an increasing pace.
- Data Discovery: Gartner estimates that the BI tool market is growing at 8% but says that the market could be growing much faster if issues around data discovery and data management were addressed.
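The data quality requirements above, completeness checks and correct joins in particular, can be sketched in a few lines of code. This is only an illustrative sketch; the field names and rules are hypothetical examples, not any specific product's API:

```python
# A minimal sketch of pre-join data quality checks: validate required fields,
# then join two record sets without silently dropping unmatched rows.
# All field names here are hypothetical examples.

def validate_record(record, required_fields):
    """Return a list of problems: missing fields or empty values."""
    problems = []
    for field in required_fields:
        if field not in record or record[field] in (None, ""):
            problems.append(f"missing or empty: {field}")
    return problems

def safe_join(left_rows, right_rows, key):
    """Join two record sets on a key, flagging unmatched rows for review
    instead of silently losing them."""
    right_index = {r[key]: r for r in right_rows if key in r}
    joined, unmatched = [], []
    for row in left_rows:
        match = right_index.get(row.get(key))
        if match is None:
            unmatched.append(row)
        else:
            joined.append({**row, **match})
    return joined, unmatched

customers = [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": ""}]
orders = [{"cust_id": 1, "total": 250.0}]

print([validate_record(c, ["cust_id", "name"]) for c in customers])
joined, unmatched = safe_join(customers, orders, "cust_id")
print(len(joined), len(unmatched))
```

The point of routing unmatched rows to a separate list rather than discarding them is exactly the "joined correctly" requirement: an analyst can see how much data fell out of the join and why.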
Recommendations for the Decision Ready Organization
If you truly want to compete on analytics, you need to first create a competency center around data management. Analytics is a great place to start. First:
- Break down the data & technology silos
- Standardize on data management tools, processes, skills to the extent possible
- Design so that all of your data is immediately discoverable, understandable, and shareable with any application or analytics project that might need it
Pick industry-leading data management tools, or even better, tools that are integrated into a comprehensive data management platform. Make sure that the platform:
- Works with any data
- Works with any BI tool
- Works with any analytics storage technology
- Supports all the analytics use cases: Strategic Decisions, Operational Decisions, and Innovation
- Supports multiple delivery modes: business analyst self-service as well as the more traditional IT delivery of data managed by a formal data governance body.
The past focus on applications has resulted in hard-to-access data silos. New technologies for analytics are causing some organizations to create new data silos in the search for speed for that particular project. If your organization is serious about being a leader in analytics, it is time to put the focus required into leading-edge data management tools and practices to fuel insight delivery.
We are working with organizations such as EMC and Fidelity that have done this. You don’t have to do it all at once. Start with your next important analytics project. Build it out the right way. Then expand your competence to the next project.
Enabling ISVs to Connect to More Data
Data is critical to application growth. Bringing additional data into your application is costly, and time spent on point-to-point integration takes time away from introducing new features.
Today, Informatica is releasing the Informatica Technology Partner Network (TPN) – an online developer portal designed to make it easy for Independent Software Vendors (ISVs) to build connectors and access more application data. The Technology Partner Network provides ISVs with everything they need to fast-track cloud and hybrid connectivity with Informatica, including access to the following:
• Informatica development environment and connector toolkit
• Interactive REST API, instant API mock server and automated testing
• Technical resources, samples and adapter tester
• Developer community forum
The TPN provides developers with a development environment to load a connector toolkit (SDK) and immediately begin building their connector. The open and interactive REST API provides a space to learn, share and experience functionality without writing any code. A debugging proxy provides more detail on the request and response of the API call and can point to a mock server. These tools enable ISVs to build a prototype in a day and complete their connector development in just a couple of weeks.
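To make the prototyping workflow concrete, here is a rough sketch of the shape a read-path connector typically exposes: connect, discover objects, read records. To be clear, the real Informatica connector SDK defines its own interfaces (in its own languages); every name below is a hypothetical illustration, with canned data standing in for the mock server described above:

```python
# Illustrative only: the real Informatica connector toolkit defines its own
# interfaces. This sketch shows the typical read-path shape of a connector:
# connect, list objects, fetch records. All names here are hypothetical.

class DemoConnector:
    def __init__(self, endpoint, api_key):
        self.endpoint = endpoint   # e.g. a mock-server URL during development
        self.api_key = api_key
        self.connected = False

    def connect(self):
        # Real code would authenticate against the application's API here.
        self.connected = True
        return self.connected

    def list_objects(self):
        # Metadata discovery: which entities can be read from this source?
        return ["accounts", "contacts"]

    def read(self, obj):
        if not self.connected:
            raise RuntimeError("connect() must be called first")
        # Real code would page through the application's REST API; here we
        # return canned rows, as a mock server would during prototyping.
        return [{"object": obj, "id": i} for i in range(3)]

conn = DemoConnector("https://mock.example.test/api", api_key="dummy")
conn.connect()
print(conn.list_objects())
print(len(conn.read("accounts")))
```

Developing against a mock endpoint like this is what lets a team exercise request/response handling and metadata discovery before the real application API is wired in.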
The Informatica Vibe™ platform – a virtual data machine (VDM) – provides the underlying data management engine that allows ISVs to transform application data. Informatica is also introducing the Vibe Ready Partner Program, created exclusively for Independent Software Vendors (ISVs).
ISVs who join the Informatica Vibe Ready Partner Program gain access – at no cost – to the following:
• Pre-built Informatica connectors, mappings and end-user starter kit
• Informatica cloud sandbox multi-user instance
• Informatica software development not-for-resale (NFR) license Pack
• Connector developer support and certification
• Vibe Ready partner and certification logos
• 1-click activation of the connector on the Informatica Marketplace
ISVs that complete the Vibe Ready Certification can provide their customers with a 1-click trial or paid edition on the Informatica Marketplace. The Informatica Marketplace enables customers to search by application, connector or bundle. The Vibe Ready logo provides a simple way for customers to identify solutions that Informatica has certified.
Enabling a successful ISV ecosystem around the Vibe platform is a cornerstone of our business strategy. The Technology Partner Network and Informatica Vibe Ready Partner Program will enable our ISVs to make their data clean, connected and safe.
Here are some additional resources to get started developing with Informatica:
• Explore the Technology Partner Network
• Register for the Informatica Vibe Ready Partner Program
• Technology Partner Network questions? Email us
There is a new bandwagon out there, and it’s not Big Data. If you were at this year’s CES show this past week, it would have been impossible, even with a “Las Vegas-size” hangover, not to have heard the hype around the Internet of Things (IoT). The Internet of Things includes anything and everything that is connected to the Internet and able to communicate and share information with other “smart” devices. This year, as well as last, it was about everything from home appliances, fitness and health monitors, home security systems, Bluetooth-enabled toothbrushes, sensors in shoes that monitor weight and mileage, and thermostats that monitor humidity and sound, to kitchen utensils that can track and monitor the type of food you cook and eat.
If you ask me, all these devices and the IoT movement are both cool and creepy. Cool in the sense that networking technology has matured and become affordable enough for devices to transmit data that companies can turn into actionable intelligence. Creepy in the sense of: do I really want someone monitoring what I cook or how many times I wake up at night? Like other hype cycles or bandwagons, there are different opinions as to the size of the IoT market. Gartner expects it to include nearly 26 billion devices, with a “global economic value-add” of $1.9 trillion by 2020. The question is whether the Internet of Things is truly transformational to our daily lives. The answer really depends on being able to harness all that data into information. Just because my new IoT toothbrush can monitor and send data on how many times I brush my teeth, it doesn’t provide any color on whether that makes me healthier or gives me a prettier smile :).
To help answer these questions, here are examples and potential use cases of leveraging all that Big Data from Small devices of the IoT world:
- Aimed at helping to prevent SIDS, Mimo’s Smart Baby Monitor is a new kind of infant monitor that provides parents with real-time information about their baby’s breathing, skin temperature, body position, and activity level on their smartphones.
- GlowCaps fit prescription bottles and, via a wireless chip, provide services that help people stick with their prescription regimen – from reminder messages all the way to refill and doctor coordination.
- BeClose offers a wearable alarm button and other discreet wireless sensors placed around the home. The BeClose system can track your loved one’s daily routine and give you peace of mind for their safety by alerting you to any serious disruptions detected in their normal schedule.
- Postscapes provides technology in which a suite of sensors and web connectivity help save you time and resources by keeping plants fed based on their actual growing needs and conditions while automating much of the labor.
- The OnFarm solution combines real-time sensor data on soil moisture levels, weather forecasts, and pesticide usage from farming sites into a consolidated web dashboard. Farmers can use this data with advanced imaging and mapping information to spot crop issues and remotely monitor all of the farm’s assets and resource usage levels.
- Banks and auto lenders are using cellular GPS units that report location and usage of financed cars in addition to locking the ignitions to prevent further movement in the case of default.
- Sensors on farm equipment now provide real-time intelligence on how many hours tractors are used, capture the weather conditions to predict mechanical problems, and measure the productivity of the farmer to predict trends in the commodity market.
I can see a number of other potential use cases for IoT including:
- Health devices not only sending data but receiving data from other IoT devices to provide real-time recommendations on workout routines, based on weather data from real-time weather sensors, food intake from kitchen devices, and nutritional information on vitamins and medications consumed by the wearer.
- Credit card banks leveraging their GPS tracking device data from auto loan customers to combine it with credit card data to deliver real-time offers on merchant promotions while on the road.
- GPS tracking devices on hotel card keys to track where you go, eat, entertain to deliver more customized services and offers while one is on a business trip or vacation.
- Boxing gloves transmitting the impact and force of a punch to monitor for athlete concussions.
What does this all mean?
The Internet of Things has changed the way we live and do business and will continue to shape the future, hopefully in a positive way. Harnessing all of that Big Data from small devices does not come easily. Every device that generates data sends it to some central system through a WiFi or cellular network. Once in that central system, it needs to be accessed, translated, transformed, cleansed, and standardized for business use alongside data from the other systems that run the business. For example:
- Access, transform, and validate data from IoT alongside data generated by other business applications. Formats and values will often differ and change over time, and they need to be rationalized and standardized for downstream business use. Otherwise, you end up with a bunch of alphas and numerics that make no sense.
- Data quality and validation: Just because a sensor can send data does not mean it will send the right data, or data that is right for a business user trying to make sense of it. GPS data requires accurate coordinates. If any value is transmitted incorrectly, it is important to identify those errors and, more importantly, correct them so the business can take action. This is especially important when combining like values (e.g., weather status = Cold, Wet, Hot while the device sends A, B, C).
- Shared with other systems: Once your data is ready to be consumed by new and existing analytic applications, marketing systems, CRM, or your fraud surveillance systems, it needs to be available in real time if required, in the right format and structure as required by those applications, and delivered in a way that is seamless, automated, and does not require heavy IT lifting.
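The standardization step above can be sketched in a few lines: map device-specific codes to business-meaningful values and route anything unmappable to error handling. The A/B/C code table below is just the hypothetical example from the text:

```python
# A minimal sketch of the standardization step: translate device codes into
# business-meaningful values and flag records that fail validation.
# The code table below is the hypothetical A/B/C example from the text.

WEATHER_CODES = {"A": "Cold", "B": "Wet", "C": "Hot"}

def standardize(readings):
    """Split raw sensor readings into clean, standardized records and
    rejected records that need correction before business use."""
    clean, rejected = [], []
    for r in readings:
        status = WEATHER_CODES.get(r.get("weather_code"))
        if status is None:
            rejected.append(r)   # route to error handling for correction
        else:
            clean.append({**r, "weather_status": status})
    return clean, rejected

raw = [
    {"device": "d1", "weather_code": "A"},
    {"device": "d2", "weather_code": "Z"},   # bad value from the sensor
]
clean, rejected = standardize(raw)
print(clean[0]["weather_status"], len(rejected))
```

Keeping the rejected records, rather than dropping them, is what makes it possible to correct the data so the business can still take action on it.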
In closing, IoT’s future is bright, along with the additional insights gained from all that data. Consider it cool or creepy, one thing is for sure: the IoT bandwagon is in full swing!
In 2012, Forbes published an article predicting an upcoming problem.
The Need for Scalable Enterprise Analytics
Specifically, increased exploration of Big Data opportunities would place pressure on the typical corporate infrastructure. The generic hardware used to run most tech industry enterprise applications was not designed to handle real-time data processing. As a result, the explosion of mobile usage and the proliferation of social networks were increasing the strain on the system. Most companies now faced real-time processing requirements beyond what the traditional model was designed to handle.
In the past two years, the volume of data and the speed of data growth have increased significantly. As a result, the problem has become more severe. It is now clear that these challenges can’t be overcome by simply doubling or tripling IT spending on infrastructure sprawl. Today, enterprises seek consolidated solutions that offer scalability, performance, and ease of administration. The present need is for scalable enterprise analytics.
A Clear Solution Is Available
Informatica PowerCenter and Data Quality is the market-leading data integration and data quality platform. This platform has now been certified by Oracle as an optimal solution for both the Oracle Exadata Database Machine and the Oracle SuperCluster.
As the high-speed on-ramp for data into Oracle Exadata, PowerCenter and Data Quality deliver up to five times faster performance on data load, query, profiling, and cleansing tasks. Informatica’s data integration customers can now easily reuse data integration code, skills, and resources to access and transform any data from any data source and load it into Exadata, with the highest throughput and scalability.
Customers adopting Oracle Exadata for high-volume, high-speed analytics can now be confident with Informatica PowerCenter and Data Quality. With these products, they can ingest, cleanse and transform all types of data into Exadata with the highest performance and scale required to maximize the value of their Exadata investment.
Proving the Value of Scalable Enterprise Analytics
In order to demonstrate the efficacy of their partnership, the two companies worked together on a Proof of Value (POV) project. The goal was to prove that using PowerCenter with Exadata would improve both performance and scalability. The project involved PowerCenter and Data Quality 9.6.1 and an Exadata X4-2 machine. Oracle 11g was used for both the standard Oracle and Exadata configurations.
The first test was a 1TB load to Exadata and to standard Oracle in a typical PowerCenter use case. The second test queried a 1TB profiling warehouse database in a Data Quality use-case scenario. Performance data was collected for both tests, and the scalability factor was also captured. A variant of the TPC-H dataset was used to generate the test data. The results were significantly better than the prior Exadata 1TB test. In particular:
- The data query tests achieved 5x performance.
- The data load tests achieved a 3x-5x speed increase.
- Linear scalability was achieved with read/write tests on Exadata.
What Business Benefits Could You Expect?
Informatica PowerCenter and Data Quality, along with Oracle Exadata, now provide a best-of-breed combination of software and hardware, optimized to deliver the highest possible total system performance. These comprehensive tools drive agile reporting and analytics, while empowering IT organizations to meet SLAs and quality goals like never before.
- Extend Oracle Exadata’s access to even more business critical data sources. Utilize optimized out-of-the-box Informatica connectivity to easily access hundreds of data sources, including all the major databases, on-premise and cloud applications, mainframe, social data and Hadoop.
- Get more data, more quickly into Oracle Exadata. Move higher volumes of trusted data quickly into Exadata to support timely reporting with up-to-date information (i.e. up to 5x performance improvement compared to Oracle database).
- Centralize management and improve insight into large scale data warehouses. Deliver the necessary insights to stakeholders with intuitive data lineage and a collaborative business glossary. Contribute to high quality business analytics, in a timely manner across the enterprise.
- Instantly re-direct workloads and resources to Oracle Exadata without compromising performance. Leverage existing code and programming skills to execute high-performance data integration directly on Exadata by performing push down optimization.
- Roll-out data integration projects faster and more cost-effectively. Customers can now leverage thousands of Informatica certified developers to execute existing data integration and quality transformations directly on Oracle Exadata, without any additional coding.
- Efficiently scale-up and scale-out. Customers can now maximize performance and lower the costs of data integration and quality operations of any scale by performing Informatica workload and push down optimization on Oracle Exadata.
- Save significant costs involved in administration and expansion. Customers can now easily and economically manage large-scale analytics data warehousing environments with a single point of administration and control, and consolidate a multitude of servers on one rack.
- Reduce risk. Customers can now leverage Informatica’s data integration and quality platform to overcome the typical performance and scalability limitations seen in databases and data storage systems. This will help reduce quality-of-service risks as data volumes rise.
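The pushdown optimization mentioned above can be illustrated conceptually: rather than extracting rows and transforming them in the integration engine, the engine generates SQL so the database does the work in place. The sketch below is not Informatica's implementation; the function, table, and column names are hypothetical, and it only shows the idea of translating a simple mapping spec into a single in-database statement:

```python
# Conceptual sketch of pushdown optimization: translate a simple mapping
# spec into one INSERT..SELECT that the database executes itself, avoiding
# row-by-row data movement. All names here are hypothetical illustrations.

def build_pushdown_sql(source_table, target_table, filters, derived):
    """Build a single in-database statement from a mapping spec.
    `derived` maps target column names to source SQL expressions."""
    select_cols = ", ".join(
        f"{expr} AS {alias}" for alias, expr in derived.items())
    where = " AND ".join(filters) if filters else "1=1"
    return (f"INSERT INTO {target_table} "
            f"SELECT {select_cols} FROM {source_table} WHERE {where}")

sql = build_pushdown_sql(
    source_table="stage_orders",
    target_table="dw_orders",
    filters=["order_date >= DATE '2014-01-01'"],
    derived={"order_id": "order_id", "total_usd": "amount * fx_rate"},
)
print(sql)
```

The design point is that the transformation logic travels to the data instead of the data traveling to the engine, which is why pushdown can exploit Exadata's own scalability.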
Oracle Exadata is a well-engineered system that offers customers out-of-box scalability and performance on demand. Informatica PowerCenter and Data Quality are optimized to run on Exadata, offering customers business benefits that speed up data integration and data quality tasks like never before. Informatica’s certified, optimized, and purpose-built solutions for Oracle can help you enable more timely and trustworthy reporting. You can now benefit from Informatica’s optimized solutions for Oracle Exadata to make better business decisions by unlocking the full potential of the most current and complete enterprise data available. As shown in our test results, you can attain up to 5x performance by scaling Exadata. Informatica Data Quality customers can now profile 1TB datasets, which was unheard of before. We urge you to deploy the combined solution to solve your data integration and quality problems today, achieving high-speed business analytics in these days of Big Data exploration and the Internet of Things.
Listen to what Ash Kulkarni, SVP, had to say at OOW14 about how @InformaticaCORP PowerCenter and Data Quality, certified by Oracle as optimized for Exadata, can deliver up to five times faster performance on data load, query, profiling, cleansing, and mastering tasks for Exadata.
The future of lighting may first be peeking through at Newark Liberty Airport in New Jersey. The airport has installed 171 new LED-based light fixtures that include a variety of sensors to detect and record what’s going on in the airport, as reported by Diane Cardwell in The New York Times. Together they make a network of devices that communicates wirelessly and allows authorities to scan license plates of passing cars, watch out for lines and delays, and check travelers for suspicious activities.
I get the feeling that Newark’s new gear will not be the last of lighting-based digital networks. Over the last few years, LED street lights have gone from something cities would love to have to the sector standard. That the market has shifted so swiftly is thanks to the efforts of early movers such as the City of Los Angeles, which last year completed the world’s largest LED street light replacement project, with LED fixtures installed on 150,000 streetlights.
Los Angeles is certainly not alone in making the switch to LED street lighting. In March 2013, Las Vegas outfitted 50,000 streetlights with LED fixtures. One month later, Austin, TX announced plans to install 35,000 LED street lights. Not to be outdone, New York City is planning to go all-LED by 2017, which would save $14 million and many tons of carbon emissions each year.
The impending switch to LEDs is an excellent opportunity for LED light fixture makers and Big Data software vendors like Informatica. These fixtures are made with a wide variety of sensors that can be tailored to whatever the user wants to detect, including temperature, humidity, seismic activity, radiation, audio, and video, among other things. The sensors could even detect and triangulate the source of a gunshot.
This steady stream of real-time data collected from these fixtures can be transformed into torrents of small messages and events with unprecedented agility using Informatica Vibe Data Stream. Analyzed data can then be distributed to various governmental and non-governmental agencies, such as law enforcement, environmental monitors, and retailers.
If I were to guess the number of streetlights in the world, I would say 4 billion. Upgrading these is a “once-in-a-generation opportunity” to harness lots of data – sensory big data.
I recently had the opportunity to have a very interesting discussion with Glenn Gow, the CEO of Crimson Marketing. I was impressed at what an interesting and smart guy he was, and with the tremendous insight he has into the marketing discipline. He consults with over 150 CMOs every year, and has a pretty solid understanding about the pains they are facing, the opportunities in front of them, and the approaches that the best-of-the-best are taking that are leading them towards new levels of success.
I asked Glenn if he would be willing to do a Q&A in order to share some of his insight. I hope you find his perspective as interesting as I did!
Q: What do you believe is the single biggest advantage that marketers have today?
A: Being able to use data in marketing is absolutely your single biggest competitive advantage as a marketer. And therefore your biggest challenge is capturing, leveraging and rationalizing that data. The marketers we speak with tend to fall into two buckets.
- Those who understand that the way they manage data is critical to their marketing success. These marketers use data to inform their decisions, and then rely on it to measure their effectiveness.
- Those who haven’t yet discovered that data is the key to their success. Often these people start with systems in mind – marketing automation, CRM, etc. But after implementing and beginning to use these systems, they almost always come to the realization that they have a data problem.
Q: How has this world of unprecedented data sources and volumes changed the marketing discipline?
A: In short… dramatically. The shift has really happened in the last two years. The big impetus for this change has really been the availability of data. You’ve probably heard this figure, but Google’s Eric Schmidt likes to say that every two days now, we create as much information as we did from the dawn of civilization until 2003.
We believe this is a massive opportunity for marketers. The question is, how do we leverage this data? How do we pull out the golden nuggets that will help us do our jobs better? Marketers now have access to information they’ve never had access to, or even contemplated, before. This gives them the ability to become more effective marketers. And by the way… they have to! Customers expect them to!
For example, ad re-targeting. Customers expect to be shown ads that are relevant to them, and if marketers don’t successfully do this, they can actually damage their brand.
In addition, competitors are taking full advantage of data, and are getting better every day at winning the hearts and minds of their customers – so marketers need to act before their competitors do.
Marketers have a tremendous opportunity – rich data is available, and the technology to harness it is now available too – so they can win a war they never could before.
Q: Where are the barriers they are up against in harnessing this data?
A: I’d say that barriers can really be broken down into 4 main buckets: existing architecture, skill sets, relationships, and governance.
- Existing Architecture: The way that data has historically been collected and stored doesn’t have the CMO’s needs in mind. The CMO has an abundance of data theoretically at their fingertips, but they cannot do what they want with it. The CMO needs to insist on, and work together with the CIO to build, an overarching data strategy that meets their needs – both today and tomorrow, because the marketing profession and tool sets are rapidly changing. That means the CMO and their team need to step into a conversation they’ve never had before with the CIO and his/her team. And it’s not about systems integration – it’s about data integration.
- Existing Skill Sets: The average marketer today is a right-brained individual. They entered the profession because they are naturally gifted at branding, communications, and outbound perspectives. And that requirement doesn’t go away – it’s still important. But today’s marketer now needs to grow their left-brained skills, so they can take advantage of inbound information, marketing technologies, data, etc. It’s hard to ask a right-brained person to suddenly be effective at managing this data. The CMO needs to fill this skillset gap primarily by bringing in people that understand it, but they cannot ignore it themselves. The CMO needs to understand how to manage a team of data scientists and operations people to dig through and analyze this data. Some CMOs have actually learned to love data analysis themselves (in fact your CMO at Informatica Marge Breya is one of them).
- Existing Relationships: In a data-driven marketing world, relationships with the CIO become paramount. They have historically determined what data is collected, where it is stored, what it is connected to, and how it is managed. Today’s CMO isn’t just going to the CIO with a simple task, as in asking them to build a new dashboard. They have to collectively work together to build a data strategy that will work for the organization as a whole. And marketing is the “new kid on the block” in this discussion – the CIO has been working with finance, manufacturing, etc. for years, so it takes some time (and great data points!) to build that kind of cohesive relationship. But most CIOs understand that it’s important, if for no other reason than that they see budgets increasingly shifting to marketing and the rest of the Lines of Business.
- Governance: Who is ultimately responsible for the data that lives within an organization? It’s not an easy question to answer. And since marketing is a relatively new entrant into the data discussion, there are often a lot of questions left to answer. If marketing wants access to the customer data, what are we going to let them do with it? Read it? Append to it? How quickly does this happen? Who needs to author or approve changes to a data flow? Who manages opt ins/outs and regulatory black lists? And how does that impact our responsibility as an organization? This is a new set of conversations for the CMO – but they’re absolutely critical.
Q: Are the CMOs you speak with concerned with measuring marketing success?
A: Absolutely. CMOs are feeling tremendous pressure from the CEO to quantify their results. A recent Duke University study of CMOs asked whether they were feeling pressure from the CEO or board to justify what they’re doing. 64% of respondents said they do feel this pressure, and 63% said the pressure is increasing.
CMOs cannot ignore this. They need to have access to the right data that they can trust to track the effectiveness of their organizations. They need to quantitatively demonstrate the impact that their activities have had on corporate revenue – not just ROI or Marketing Qualified Leads. They need to track data points all the way through the sales cycle to close and revenue, and to show their actual impact on what the CEO really cares about.
Q: Do you think marketers who undertake marketing automation products without a solid handle on their data first are getting solid results?
A: That is a tricky one. Ideally, yes, they’d have their data in great shape before undertaking a marketing automation process. The vast majority of companies that have implemented the various marketing technology tools have encountered dramatic data quality issues, often coming to light during the process of implementing their systems. So data quality and data integration are the ideal first steps.
But the truth is, solving a company’s data problem isn’t a simple, straight-forward challenge. It takes time and it’s not always obvious how to solve the problem. Marketers need to be part of this conversation. They need to drive how they’re going to be managing data moving forward. And they need to involve people who understand data well, whether they be internal (typically in IT), or external (consulting companies like Crimson, and technology providers like Informatica).
So the reality for a CMO is that it has to be a parallel path. CMOs need to get involved in ensuring that data is managed in a way they can use effectively as marketers, but in the meantime, they cannot stop doing their day-to-day job. So, sure, they may not be getting the most out of their investment in marketing automation, but it’s the beginning of a process that will see tremendous returns over the long term.
Q: Is anybody really getting it “right” yet?
A: This is the best part… yes! We are starting to see more and more forward-thinking organizations really harnessing their data for competitive advantage, and using technology in very smart ways to tie it all together and make sense of it. In fact, we are in the process of writing a book entitled “Moneyball for Marketing” that features eleven different companies who have marketing strategies and execution plans that we feel are leading their industries.
So readers, what do you think? Who do you think is getting it “right” by leveraging their data with smart technology and truly getting meaningful and impactful results?
This creative thinking to solve a problem came from a request from the Swiss Army to build a soldier’s knife. In the end, the solution was all about getting the right tool for the right job in the right place. In many cases soldiers didn’t need industrial-strength tools; all they really needed was a compact, lightweight tool to get the job at hand done quickly.
Putting this into perspective in today’s world of data integration: using enterprise-class data integration tools for the smaller data integration project is overkill and typically out of reach for the smaller organization. However, these smaller data integration projects are just as important as the larger enterprise projects, and they are often the innovation behind a new way of business thinking. The traditional hand-coding approach to the smaller data integration project is not scalable, not repeatable, and prone to human error. What’s needed is a compact, flexible, and powerful off-the-shelf tool.
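To make the point about hand-coding concrete, here is a minimal sketch (in Python, with hypothetical file and column names that are not from this article) of the kind of one-off join script these smaller integration projects often start with. It works for a demo, but it bakes in silent assumptions – unique keys, clean input, no error handling – which is exactly why hand-coded integration tends not to scale or repeat well:

```python
# Hypothetical example: a "hand-coded" integration joining two datasets on a
# shared key, the sort of one-off script an off-the-shelf tool would replace.
import csv
from io import StringIO

def join_on_key(left_rows, right_rows, key):
    """Naive inner join of two lists of dicts on a shared key column."""
    index = {}
    for row in right_rows:
        # Fragile by design: assumes the key is always present and unique,
        # and silently overwrites duplicates -- the kind of hidden assumption
        # that makes hand-coded integration error-prone.
        index[row[key]] = row
    joined = []
    for row in left_rows:
        match = index.get(row[key])
        if match is not None:
            joined.append({**row, **match})
    return joined

# Inline sample data standing in for two source files.
customers = list(csv.DictReader(StringIO("id,name\n1,Acme\n2,Globex\n")))
orders = list(csv.DictReader(StringIO("id,total\n1,100\n3,50\n")))
print(join_on_key(customers, orders, "id"))
# -> [{'id': '1', 'name': 'Acme', 'total': '100'}]
```

Every new source, key mismatch, or data quality rule means more bespoke code like this, which is the scalability and repeatability problem the paragraph above describes.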
Thankfully, over a century after the world embraced the Swiss Army Knife, someone at Informatica was paying attention to revolutionary ideas. If you haven’t yet heard the news, Informatica has released a free version of its platform called PowerCenter Express, which you can use to handle an assortment of what I’d characterize as high-complexity / low-volume data integration challenges and to experience a subset of the Informatica platform for yourself. I’d emphasize that PowerCenter Express doesn’t replace Informatica’s enterprise-grade products, but it is ideal for rapid prototyping, data profiling, and developing quick proofs of concept.
PowerCenter Express provides a glimpse of the evolving Informatica platform by integrating four Informatica products into a single, compact tool. There are no database dependencies, and the product installs in just under 10 minutes. Much to my own surprise, I use PowerCenter Express quite often in the various aspects of my job with Informatica. I have it installed on my laptop so it travels with me wherever I go. It starts up quickly, so it’s ideal for getting a little work done on an airplane.
For example, I recently wanted to explore building some rules for an upcoming proof of concept on a plane ride home, so I could claw back some personal time for my weekend. I used PowerCenter Express to profile some data and create a mapping. And this mapping wasn’t something I needed to throw away and recreate in an enterprise version after my flight landed. Vibe, Informatica’s build-once / run-anywhere metadata-driven architecture, allows me to export a mapping I create in PowerCenter Express to one of the enterprise versions of Informatica’s products such as PowerCenter, Data Quality, or Informatica Cloud.
As I alluded to earlier in this article, because it is a free offering I honestly didn’t expect too much from PowerCenter Express when I first started exploring it. However, based on my own positive experiences, I now like to think of PowerCenter Express as the Swiss Army Knife of data integration.
To start claiming back some of your personal time, get started with the free version of PowerCenter Express, found on the Informatica Marketplace at: https://community.informatica.com/solutions/pcexpress
Master data management (MDM) has come a long way in the past decade or so. When I was supporting my company’s customer master implementation back in 2001, my management was thrilled simply to have a customer master that brought a bit of order to the chaos of sharing customer data between our CRM and ERP applications and downstream into our marketing data warehouse.
Fast forward to 2014, and mastering customer data alone is often table stakes for leadership trying to transform their business from a product- or account-centric organization to a customer-centric one.
Here at Informatica, we’ve seen over 75% of our MDM customers in the past year purchase for multidomain use cases – meaning the scope of their initiative often spans mastering data such as customers, suppliers, and products as part of a coordinated effort. These organizations have built compelling business cases demonstrating that mastering multiple domains – and the relationships among those domains – is necessary. Only a true 360-degree view of the relationships across this data can provide the insights needed to deliver on the desired operational efficiencies, optimized customer experiences, and growth objectives for their companies.
The progress we’ve all made in multidomain MDM is impressive, but it’s just scratching the surface of what’s possible. What happens when MDM meets Cloud, Social, the Internet of Things, and other master data enrichment sources such as D&B and Acxiom? Dennis Moore, Informatica’s GM and SVP for MDM, envisions that a new “Internet of Master Data” will form, drawing on a massive new set of sensor and social data to infer and recommend a new class of relationship insights. For example, in addition to sentiment and relationships from social networks, location data from mobile devices and sensors can now inform customer – and product – behaviors that span beyond direct transactions and interactions within your traditional business applications.
Those of you who have invested in building a foundation of clean, consistent, and connected data have a huge advantage, as the value of MDM grows with the exponential growth of data. You are well-positioned to take advantage of the deeper insights and potential innovations now possible by adding Cloud, Social, and Machine data to optimize analytics and operations.
This week at Informatica World 2014 in Las Vegas, we kicked off with our fantastic MDM Day pre-conference event with over 500 attendees. During the event, we shared some early insights into our MDM 10 release, planned for later this year, which integrates the Informatica Vibe engine and incorporates other elements of the just-unveiled Informatica Intelligent Data Platform vision to make it easier for customers to gain a 360-degree view of their most critical business entities, including customers, suppliers, products, and assets.
We continue to be inspired by our awesome MDM customers and partners, and we’re excited to see what they can do to harness the power of the Internet of Master Data!