Category Archives: B2B Data Exchange

Analytics-Casual Versus Analytics-Driven

What does it take to be an analytics-driven business? That’s a question that requires a long answer. Recently, Gartner research director Lisa Kart took on this question, noting that the key to becoming an analytics-driven business is breaking down the silos between data sources.

So, the secret of becoming an analytics-driven business is to bust down the silos — easier said than done, of course. The good news, as Kart tells it, is that one doesn’t need to cast a wide net across the world in search of the right data for the right occasion. The biggest opportunities are in connecting the data you already have, she says.

Taking Kart’s differentiation of just-using-analytics versus analytics-driven culture a step further, here is a brief rundown of how businesses just using analytics approach the challenge, versus their more enlightened counterparts:

Business just using analytics: Lots of data, but no one really understands how much is around, or what to do with it.

Analytics-driven business: The enterprise has a vision and strategy, supported from the top down, closely tied to the business strategy. Management also recognizes that existing data has great value to the business.

Business just using analytics: Every department does its own thing, with varying degrees of success.

Analytics-driven business: Makes connections between all the data – of all types – floating around the organization. For example, gets a cross-channel view of a customer by digging deeper and connecting the silos together to transform the data into something consumable.

Business just using analytics: Some people in marketing have been collecting customer data and making recommendations to their managers.

Analytics-driven business: Marketing departments, through analytics, engage and interact with customers, Kart says. An example would be creating high-end, in-store customer experiences that give customers greater intimacy and interaction.

Business just using analytics: The CFO’s staff crunches numbers within their BI tools and arrives at what-if scenarios.

Analytics-driven business: Operations and finance departments share online data to improve performance using analytics. For example, a company may tap into a variety of data, including satellite images, weather patterns, and other factors that may shape business conditions, Kart says.

Business just using analytics: Some quants in the organization pore over the data and crank out reports.

Analytics-driven business: Encourages maximum opportunities for innovation by putting analytics in the hands of all employees. Analytics-driven businesses recognize that more innovation comes from front-line decision-makers than the executive suite.

Business just using analytics: Decision makers put in report requests to IT for analysis.

Analytics-driven business: Decision makers can go to an online interface that enables them to build and display reports with a click (or two).

Business just using analytics: Analytics spits out standard bar charts, perhaps a scattergram.

Analytics-driven business: Decision makers can quickly visualize insights through 3D graphics, also reflecting real-time shifts.


Big Data is Nice to Have, But Big Culture is What Delivers Success

Despite more than $30 billion in annual spending on Big Data, successful big data implementations elude most organizations. That’s the sobering assessment of a recent Capgemini study of 226 senior executives, which found that only 13 percent feel they have truly made any headway with their big data efforts.

The reasons for Big Data’s lackluster performance include the following:

  • Data is in silos or legacy systems, scattered across the enterprise
  • No convincing business case
  • Ineffective alignment of Big Data and analytics teams across the organization
  • Most data locked up in petrified, difficult-to-access legacy systems
  • Lack of Big Data and analytics skills

Actually, there is nothing new about any of these issues – in fact, the perceived issues with Big Data initiatives so far map closely to the failed expectations of many other technology-driven initiatives. First, there’s the hype that tends to get way ahead of any actual well-functioning case studies. Second, there’s the notion that managers can simply take a solution of impressive magnitude and drop it on top of their organizations, expecting overnight delivery of profits and enhanced competitiveness.

Technology, and Big Data itself, is but a tool that supports the vision, well-designed plans and hard work of forward-looking organizations. Those managers seeking transformative effects need to look deep inside their organizations, at how deeply innovation is allowed to flourish, and in turn, how their employees are allowed to flourish. Think about it: if line employees suddenly have access to alternative ways of doing things, would they be allowed to run with it? If someone discovers through Big Data that customers are using a product differently than intended, do they have the latitude to promote that new use? Or do they have to go through chains of approval?

Big Data may be what everybody is after, but Big Culture is the ultimate key to success.

For its part, Capgemini provides some high-level recommendations for baking transformative values into Big Data initiatives, based on its observations of best-in-class enterprises:

The vision thing: “It all starts with vision,” says Capgemini’s Ron Tolido. “If the company executive leadership does not actively, demonstrably embrace the power of technology and data as the driver of change and future performance, nothing digitally convincing will happen. We have not even found one single exception to this rule. The CIO may live and breathe Big Data and there may even be a separate Chief Data Officer appointed – expect more of these soon – if they fail to commit their board of executives to data as the engine of success, there will be a dark void beyond the proof of concept.”

Establish a well-defined organizational structure: “Big Data initiatives are rarely, if ever, division-centric,” the Capgemini report states. “They often cut across various departments in an organization. Organizations that have clear organizational structures for managing rollout can minimize the problems of having to engage multiple stakeholders.”

Adopt a systematic implementation approach: Surprisingly, even the largest and most sophisticated organizations, which run everything else on process, don’t necessarily approach Big Data this way, the report states. “Intuitively, it would seem that a systematic and structured approach should be the way to go in large-scale implementations. However, our survey shows that this philosophy and approach are rare. Seventy-four percent of organizations did not have well-defined criteria to identify, qualify and select Big Data use-cases. Sixty-seven percent of companies did not have clearly defined KPIs to assess initiatives. The lack of a systematic approach affects success rates.”

Adopt a “venture capitalist” approach to securing buy-in and funding: “The returns from investments in emerging digital technologies such as Big Data are often highly speculative, given the lack of historical benchmarks,” the Capgemini report points out. “Consequently, in many organizations, Big Data initiatives get stuck due to the lack of a clear and attributable business case.” To address this challenge, the report urges that Big Data leaders manage investments “by using a similar approach to venture capitalists. This involves making multiple small investments in a variety of proofs of concept, allowing rapid iteration, and then identifying PoCs that have potential and discarding those that do not.”

Leverage multiple channels to secure skills and capabilities: “The Big Data talent gap is something that organizations are increasingly coming face-to-face with. Closing this gap is a larger societal challenge. However, smart organizations realize that they need to adopt a multi-pronged strategy. They not only invest more on hiring and training, but also explore unconventional channels to source talent.” Capgemini advises reaching out to partner organizations for the skills needed to develop Big Data initiatives. These can be employee exchanges, or “setting up innovation labs in high-tech hubs such as Silicon Valley.” Startups may also be another source of Big Data talent.


Informatica Doubled Big Data Business in 2014 As Hadoop Crossed the Chasm

2014 was a pivotal year for Informatica, as our investments in Hadoop and efforts to innovate in big data gathered momentum and became a core part of Informatica’s business. Our Hadoop-related big data revenue growth was in the ballpark of the leading Hadoop startups – more than doubling over 2013.

In 2014, Informatica reached about 100 enterprise customers of our big data products, with an increasing number going into production with Informatica together with Hadoop and other big data technologies. Informatica’s big data Hadoop customers include companies in financial services, insurance, telecommunications, technology, energy, life sciences, healthcare and business services. These innovative companies are leveraging Informatica to accelerate their time to production and drive greater value from their big data investments.

These customers are in production with, or implementing, a wide range of use cases leveraging Informatica’s data pipeline capabilities to better put the scale, efficiency and flexibility of Hadoop to work. Many Hadoop customers start by optimizing their data warehouse environments, moving data storage, profiling, integration and cleansing to Hadoop in order to free up capacity in their traditional analytics data warehousing systems. Customers that are further along in their big data journeys have expanded to use Informatica on Hadoop for exploratory analytics of new data types, 360-degree customer analytics, fraud detection, predictive maintenance, and analysis of massive amounts of Internet of Things machine data for optimization of energy exploration, manufacturing processes, network data, security and other large-scale systems initiatives.

2014 was not just a year of market momentum for Informatica, but also one of new product development innovations.  We shipped enhanced functionality for entity matching and relationship building at Hadoop scale (a key part of Master Data Management), end-to-end data lineage through Hadoop, as well as high performance real-time streaming of data into Hadoop. We also launched connectors to NoSQL and analytics databases including Datastax Cassandra, MongoDB and Amazon Redshift. Informatica advanced our capabilities to curate great data for self-serve analytics with a connector to output Tableau’s data format and launched our self-service data preparation solution, Informatica Rev.

Customers can now quickly try out Informatica on Hadoop by downloading the free trials for the Big Data Edition and Vibe Data Stream that we launched in 2014.  Now that Informatica supports all five of the leading Hadoop distributions, customers can build their data pipelines on Informatica with confidence that no matter how the underlying Hadoop technologies evolve, their Informatica mappings will run.  Informatica provides highly scalable data processing engines that run natively in Hadoop and leverage the best of open source innovations such as YARN, MapReduce, and more.   Abstracting data pipeline mappings from the underlying Hadoop technologies combined with visual tools enabling team collaboration empowers large organizations to put Hadoop into production with confidence.
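
To illustrate what such an abstraction buys you, here is a conceptual Python sketch of a pipeline mapping declared once and dispatched to interchangeable engines. This is only our illustration of the general idea, not Informatica’s actual API; every name in it is invented.

```python
# Conceptual sketch only: a logical data pipeline declared independently
# of the engine that executes it. All names are invented for illustration.

pipeline = [
    ("read",      {"source": "machine_logs"}),
    ("cleanse",   {"drop_nulls": True}),
    ("aggregate", {"group_by": "customer_id"}),
    ("write",     {"target": "warehouse"}),
]

class MapReduceEngine:
    def execute(self, step, params):
        print(f"[MapReduce] {step} {params}")

class SparkEngine:
    def execute(self, step, params):
        print(f"[Spark] {step} {params}")

def run(pipeline, engine):
    """Dispatch the same logical mapping to whichever engine is current."""
    for step, params in pipeline:
        engine.execute(step, params)

# The mapping itself is untouched when the underlying engine changes.
run(pipeline, MapReduceEngine())
run(pipeline, SparkEngine())
```

The point of the sketch is the separation: as the paragraph above notes, the mapping survives changes in the underlying Hadoop technologies because it never names them.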

As we look ahead into 2015, we have ambitious plans to continue to expand and evolve our product capabilities with enhanced productivity to help customers rapidly get more value from their data in Hadoop. Stay tuned for announcements throughout the year.

Try some of Informatica’s products for Hadoop on the Informatica Marketplace here.


The Supply Chain Impact of Adding an Allergen to a Chocolate Bar

Throughout the lifecycle of a consumable product, many parties are constantly challenged with updating and managing product information, like ingredients and allergens. Since December 13th, 2014, this task has become even more complex for companies producing or selling food and beverage products in the European Union, due to the new EU 1169/2011 rules. As a result, changing the basic formula of a chocolate bar by adding one allergen, like nuts, should be carefully considered, as it can have a tremendous impact on the complete supply chain if the manufacturer and retailer(s) want to remain compliant with EU regulation 1169/2011 and to inform the consumer about the changes to the product.

Let’s say the chocolate bar is available in three (3) varieties: dark, whole milk and white chocolate. Each of the varieties is available in two (2) sizes: normal and mini. The chocolate bar is distributed in twelve (12) countries within the European Union using different packaging. This would require the manufacturer to introduce 72 (3 varieties * 2 sizes * 12 countries) new GTINs at an item level. If the chocolate producer decides to run any package or seasonal promotions, multipacks, or introduce a new variety, this number would be even higher. The new attributes, including updated information on allergens, have to be modified for each product, and 72 new GTINs have to be generated for the chocolate bars by the chocolate manufacturer’s data maintenance department. Trading partners have to be notified about the modifications, too. Assuming the manufacturer uses the Global Data Synchronization Network (GDSN), the new GTINs will have to be registered along with their new attributes eight weeks before the modified product becomes available. In addition, packaging hierarchies have to be taken into consideration as well. If each item has an associated case and pallet, the number of updates sums up to 216 (72 updates * 3 product hierarchies).
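
To make the arithmetic easy to check, here is a minimal Python sketch of the GTIN count in this example; the figures come from the scenario above, and the variable names are ours.

```python
# The GTIN arithmetic from the chocolate bar example above.

varieties = 3          # dark, whole milk, white chocolate
sizes = 2              # normal and mini
countries = 12         # EU markets, each with distinct packaging
hierarchy_levels = 3   # item, case, pallet

new_item_gtins = varieties * sizes * countries          # 72
total_updates = new_item_gtins * hierarchy_levels       # 216

print(f"New item-level GTINs: {new_item_gtins}")
print(f"Updates across packaging hierarchies: {total_updates}")
```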

Managing the updates related to such product modifications results in high administrative and other costs. Trading partners across the supply chain report significant impact to their annual sales and costs. According to GS1 Europe, one retailer reported 6,000-8,000 GTIN changes per year, leading to 2-3% of additional administrative and support costs. If the GTIN had to change for every new minor variant of a product, they forecast the number of changes per year could rise to 20,000-25,000, leading to a significant further increase in administrative and support costs.

The change to the chocolate bar’s recipe also means that a corresponding change to the mandatory product data displayed on the label is required. For the online retailer(s) selling the chocolate bar, this means updating the information displayed in their online shops. Considering that a retailer has to deliver exactly the product that is displayed in its online shop, there will be a period of time when the old version of the product and the new version coexist in the supply chain. During this period it is not possible for the retailer to know whether the version of the product ordered on a website will be available at the time and place the order is picked.

GS1 Europe suggests handling this issue as follows: Retailers working to GS1 standards use GTINs to pick on-line orders. If the modified chocolate bar with nuts is given a new GTIN it increases the possibility that the correct variant can be made available for picking and, even if it is not available at the pick point, the retailer can recognize automatically if the version being picked is different from the version that was ordered. In this latter case the product can be offered as a substitute when the goods are delivered and the consumer can choose whether to accept it or not. On their websites, GS1 provides comprehensive information on how to comply with the new European Food Information Regulation.
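
The picking rule GS1 describes boils down to a GTIN comparison at the pick point. The sketch below is our own Python illustration of that logic, not GS1 or retailer code; the function name and the GTIN values are invented.

```python
def check_picked_item(ordered_gtin: str, picked_gtin: str) -> str:
    """Compare the GTIN ordered online with the GTIN scanned at the pick point.

    Because the reformulated bar carries a new GTIN, a mismatch is detected
    automatically and the item can be offered as a substitute on delivery,
    for the consumer to accept or reject.
    """
    if picked_gtin == ordered_gtin:
        return "exact product - deliver as ordered"
    return "different version - offer as substitute"

# Invented GTIN values for the pre-nuts and with-nuts versions.
print(check_picked_item("04012345000011", "04012345000028"))
```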

Using the Global Data Synchronization Network (GDSN), suppliers and retailers are able to share standardized product data, cut down the cost of building point-to-point integrations, and speed up new product introductions by getting access to the most accurate and most current product information. The Informatica GDSN Accelerator is an add-on to the Informatica Product Information Management (PIM) system that provides an interface to access a GDSN-certified data pool. It is designed to help organizations securely and continuously exchange, update and synchronize product data with trading partners according to the standards defined by Global Standards One (GS1). GDSN ensures that data exchanged between trading partners is accurate and compliant with globally supported standards for maintaining uniqueness, classification and identification of source and recipients. Integrated into your PIM system, the GDSN Accelerator allows product data of the highest standard to be exchanged with your trading partners via the GDSN.

Thanks to automated product data exchange, the efforts and costs related to the modification of a product, as demonstrated in the chocolate bar example, can be significantly reduced for both manufacturers and retailers. The product data can be easily transferred to the data pool, and you can fully control whether the information is shared with a specific trading partner or with all recipients of a target market.

Related blogs:

How GS1 and PIM Help to Fulfill Legal Regulations and Feed Distribution Channels

5 Ways to Comply with the New European Food Information Regulation


Product Intelligence: How To Make Your Product Information Smarter

As we discussed at length in our #HappyHoliData series, no matter what the customer industry or use case, information quality is a key value component to deliver the right services or products to the right customer.

In my blog on 2015 omnichannel trends impacting customer experience, I commented on product trust as a key expectation in the eyes of customers.

For product managers, merchandisers or category managers this means asking: Which products shall we offer, and at what price? How is the competition pricing this item? With which content is the competition promoting this SKU? Are my retailers and distributors sticking to my price policy? Companies need quicker insights for taking decisions on their assortment, prices and compelling content, and for better customer-facing service.

Recently, we’ve been spending time discussing this challenge with the folks at Indix, an innovator in the product intelligence space, to find ways to help businesses improve their product information quality. For background, Indix is building the world’s largest database of product information and currently tracks over 600 million products, over 600,000 sellers, over 40,000 brands, and over 10,000 attributes across over 6,000 categories. (source: Indix.com)

Indix takes all of that data, then cleanses and normalizes it and breaks it down into two types of product information — offers data and catalog data.  The offers data includes all the dynamic information related to the sale of a product such as the number of stores at which it is sold, price history, promotions, channels, availability, and shipping. The catalog data comprises relatively unchanging product information, such as brand, images, descriptions, specifications, attributes, tags, and facets.
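
One way to picture the offers/catalog split is as two record types keyed to the same product. The following is a minimal Python sketch of the idea as described above, not Indix’s actual data model; all names and fields are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogData:
    """Relatively unchanging product information."""
    brand: str
    description: str
    attributes: dict = field(default_factory=dict)  # specs, tags, facets

@dataclass
class OffersData:
    """Dynamic information related to the sale of a product."""
    store_count: int
    current_price: float
    price_history: list = field(default_factory=list)
    availability: str = "in stock"

@dataclass
class Product:
    sku: str
    catalog: CatalogData   # stable: brand, images, descriptions
    offers: OffersData     # volatile: prices, promotions, channels

bar = Product(
    sku="CHOC-001",
    catalog=CatalogData(brand="ExampleBrand", description="Dark chocolate bar"),
    offers=OffersData(store_count=42, current_price=2.49, price_history=[2.99, 2.49]),
)
```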


We’ve been talking with the Indix team about how powerful it could be to integrate product intelligence directly into the Informatica PIM. Just imagine if Informatica customers could seamlessly bring relevant offers and catalog content into the PIM through a direct connection to the Indix Product Intelligence Platform and begin using market and competitive data immediately.

What do you think?  

We’re going to be at NRF, meeting with selected people to discuss this further. If you like the idea, or have feedback on the concept, let us know. We’d love to see you while we’re there and talk about this idea with you.


What Should Come First: Business Processes or Analytics?

As more and more businesses become fully digitized, the instantiation of their business processes and business capabilities becomes based in software. And when businesses implement software, there are choices to be made that can impact whether these processes and capabilities become locked in time or establish themselves as a continuing basis for business differentiation.

Make sure you focus upon the business goals

I want to suggest that whether the software instantiations of business processes and business capabilities deliver business differentiation depends upon whether business goals and analytics are successfully embedded in a software implementation from the start. I learned this firsthand several years ago, when I was involved in helping a significant insurance company with their implementation of analytics software. Everyone in the management team was in favor of the analytics software purchase. However, the project lead wanted the analytics completed after an upgrade had occurred to their transactional processing software. Fortunately, the firm’s CIO had a very different perspective. This CIO understood that decisions regarding the transaction processing software implementation could determine whether critical metrics and KPIs could be measured. So instead of doing analytics as an afterthought, this CIO had the analytics done as a forethought. In other words, he slowed down the transactional software implementation. He got his team to think first about the goals for the software implementation and the business goals for the enterprise. With these in hand, his team determined what metrics and KPIs were needed to measure success and improvement. They then required the transaction software development team to ensure that the software implemented the fields needed to measure the metrics and KPIs. In some cases, this was as simple as turning on a field or training users to enter a field as the transaction software went live.
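
The CIO’s “analytics as a forethought” step amounts to verifying, before go-live, that the transaction schema captures every field each KPI needs. Here is a minimal Python sketch of that check, using invented insurance KPIs and field names:

```python
# Pre-go-live check: does the transactional schema capture every field
# the planned KPIs depend on? KPI and field names are invented.

kpi_required_fields = {
    "claim_cycle_time": {"claim_opened_at", "claim_closed_at"},
    "policy_renewal_rate": {"policy_id", "renewal_flag"},
}

transaction_schema = {"claim_opened_at", "claim_closed_at", "policy_id"}

for kpi, needed in kpi_required_fields.items():
    missing = needed - transaction_schema
    if missing:
        print(f"KPI '{kpi}' not measurable; add fields: {sorted(missing)}")
    else:
        print(f"KPI '{kpi}' is measurable with the current schema.")
```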

Make the analytics part of everyday business decisions and business processes

The question is how common this perspective is, because it really matters. Tom Davenport says that “if you really want to put analytics to work in an enterprise, you need to make them an integral part of everyday business decisions and business processes—the methods by which work gets done” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). For many, this means turning their application development on its head, like our insurance CIO did. It means, in particular, that IT implementation teams should no longer be about just slamming in applications. They need to be more deliberate. They need to start by identifying the business problems that they want solved through the software instantiation of a business process. They also need to start with how they want the software to improve the process, rather than treating the analytics and data as an afterthought.

Why does this matter so much? Davenport suggests that “embedding analytics into processes improves the ability of the organization to implement new insights. It eliminates gaps between insights, decisions, and actions” (Analytics at Work, Thomas Davenport, Harvard Business Review Press, page 121). Tom gives the example of a car rental company that embedded analytics into its reservation system and, with the data provided, was able to expunge long-held shared beliefs. This change resulted in a 2% increase in fleet utilization and returned $19m to the company from just one location.

Look beyond the immediate decision to the business capability

Davenport also suggests that enterprises need to look beyond their immediate task or decision and appreciate the whole business process, including what happens upstream and downstream. This argues that analytics be focused on the enterprise capability system. Clearly, maximizing performance of the enterprise capability system requires an enterprise perspective on analytics. A systems perspective also allows business leadership to appreciate how different parts of the business work together as a whole. Analytics, therefore, allow the business to determine how to drive better business outcomes for the entire enterprise.

At the same time, focusing upon the enterprise capability system will in many cases, over time, lead to a reengineering of overarching business processes and a revamping of their supporting information systems. This in turn allows the business to capitalize on the potential of business capability and analytics improvement. From my experience, most organizations need some time to see what a change in analytics performance means. This is why it can make sense to start by measuring baseline process performance before determining enhancements to the business process. Once those enhancements are in place, refinements to the enhanced process can be determined by continuously measuring process performance data.

Related links

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer


2015 – The Year of Data Integration?

I love the data integration coverage by Loraine Lawson in IT Business Edge, especially her December 12th posting that focuses on the trends that will emerge in 2015. The best quote from the post is: “Oddly, organizations still tended to focus on point-solutions in the cloud. As more infrastructure and data moves to the cloud, they’re experiencing similar pain points and relearning old lessons.”

The article cites some research from Ovum, which predicts that many enterprises will begin moving toward data integration, driven largely by the rise of cloud computing and big data. However, enterprises need to invest both in modernizing the existing data management infrastructure and in data integration technology. “All of these new investments will push the middleware software market up 9 percent to a $16.3 billion industry, Information Management reports.” This projection is for 2015.

I suspect that’s a bit conservative. In my travels, I see much more interest in data integration strategies, approaches, and technology as cloud computing continues to grow and as enterprises better understand the strategic use of data. So, I would put the growth at 15 percent for 2015.

There are many factors driving this growth, beyond mere interest in cloud computing and big data.

The first consideration is that data is more strategic than initially understood. While businesses have always considered data a huge asset, it has not been until the last few years that businesses have seen the true value of understanding what’s going on inside and outside of their business.

Manufacturing companies want to see the current state of production, as well as production history.  Management can now use that data to predict trends to address, such as future issues around employee productivity, or even a piece of equipment that is likely to fail and the impact of that failure on revenue.  Healthcare companies are learning how to better monitor patient health, such as spotting likely health problems before they are diagnosed, or leveraging large data to understand when patterns emerge around health issues, such as areas of the country that are more prone to asthma, based upon air quality.

Second, there is the need to deal with compliance issues. The new health care regulations, and even the new regulations around managing a publicly traded company, raise a great many data management issues, including data integration.

As these laws emerge, and are altered over time, the reporting requirements become more complex and far-reaching than they were before. Those who want to avoid fines, or even stock drops caused by mistakes, are paying close attention to this area.

Finally, there is an expectation from customers and employees that you will have a good handle on your data. Ten years ago, you could tell a customer on the phone that you needed to check different systems to answer their question. Those days are over. Today’s customers and employees want immediate access to the data they need, and there is no good excuse for not being able to produce that data. If you can’t, your competition will.

The interest in data integration will experience solid growth in 2015, around cloud and big data, for sure.  However, other factors will drive this growth, and enterprises will finally understand that data integration is core to an IT strategy, and should never be an afterthought.


Happy Holidays, Happy HoliData.

In case you have missed our #HappyHoliData series on Twitter and LinkedIn, I decided to provide a short summary of best practices that are unleashing information potential. Simply scroll and click on the case study that is relevant for you and your business. The series touches on different industries and use cases, but all have one thing in common: all consider information quality a key value for their business in delivering the right services or products to the right customer.

[Image gallery: the 24 #HappyHoliData case studies]

Thanks a lot to all my great teammates, who made this series happen.

Happy Holidays, Happy HoliData.


Informatica Rev: Data Democracy At Last – Part 2

This is a continuation of Part 1 of this blog, which you can read here.

Now, if you are in IT, reading about how Informatica Rev enables the everyday business users in your company to participate in the Data Democracy might feel like treachery. You are likely thinking that Informatica is letting the bull loose in your own fine china shop. You likely feel, first of all, that Informatica is supporting the systemic bypass of all the data governance that IT has worked hard to put in place, and, second of all, that Informatica is supporting the alienation of the very IT people who have approved of and invested in Informatica for decades.

While I can understand this thought process, I am here to, proudly, inform you that your thoughts could not be further from the truth! In fact, in the not-too-distant future, Informatica is in a very strong position to create a unique technology solution to ensure you can better govern all the data in your enterprise, and do it in a way that will allow you to proactively deliver the right data to the business, yes, before the masses of everyday business users have even started to knock your door down to ask for it. Informatica’s unique solution will ensure the IT-and-business divide that has existed in your company for decades actually becomes a match made in heaven. And you in IT get the credit for leading this transformation of your company to a Data Democracy. Listen to this webinar to hear Justin Glatz, Executive Director of Information Technology at Conde Nast, speak about how he will be leading Conde Nast’s transformation to Data Democracy.


“How?” you might ask. Well, first let’s face it: today you do not have any visibility into how the business is procuring and using most data, and therefore you are not governing most of it. Without a change in your tooling, your ability to gain this visibility is diminishing greatly, especially since the business does not have to come to you to procure and use their cloud-based applications. By having all of your everyday business users use Informatica Rev, you will, for the first time, have the potential to gain a truly complete picture of how data is being used in your company, even the data they do not come to you to procure.

In the not-too-distant future, you will gain this visibility through an IT companion application to Informatica Rev. You will then gain the ability to easily operationalize your business users’ exact transformation logic, or Recipe as we call it in Informatica Rev, into your existing repositories, be they your enterprise data warehouse, datamart or master data management repository, for example. And, by the way, you are likely already using Informatica PowerCenter, Informatica Cloud or Informatica MDM to manage these repositories, so you already have the infrastructure we will be integrating Informatica Rev with. And if you are not using Informatica to manage these repositories, the draw of becoming proactive with your business and leading the transformation of your company to a Data Democracy will be enough to make you want to go get Informatica.

Just as these professionals have found success by participating in the Data Democracy, with Informatica Rev you can finally do so, too. You can try Informatica Rev for free by clicking here.


Building an Enterprise Data Hub: Choosing the Data Integration Solution

Data flows into the enterprise from many sources, in many formats, sizes, and levels of complexity. As enterprise architectures have evolved over the years, traditional data warehouses have become less a final staging center for data and more one component of the enterprise that interfaces with significant data flows. But since data warehouses should focus on being powerful engines for high-value analytics, they should not be the central hub for data movement and data preparation (e.g., ETL/ELT), especially for the newer data types in use today, such as social media, clickstream data, sensor data, and Internet-of-Things data.

When you start seeing data warehouse capacity consumed too quickly, performance degrading to the point that end users complain about slower response times, and a risk of not meeting your service-level agreements, it might be time to consider an enterprise data hub (EDH). With an EDH, especially one built on Apache™ Hadoop®, you can plan a strategy around data warehouse optimization to get better use out of your entire enterprise architecture.

Of course, whenever you add another new technology to your data center, you care about interoperability. And since many systems in today’s architectures interoperate via data flows, it’s clear that sophisticated data integration technologies will be an important part of your EDH strategy. Today’s big data presents new challenges related to a wide variety of data types and formats, and the right technologies are needed to glue all the pieces together, whether those pieces are data warehouses, relational databases, Hadoop, or NoSQL databases.

Choosing a Data Integration Solution

Data integration software, at a high level, has one broad responsibility: to help you process and prepare your data with the right technology. This means it has to get your data to the right place in the right format in a timely manner. So it actually includes many tasks, but the end result is that timely, trusted data can be used for decision-making and risk management throughout the enterprise. You end up with a complete, ready-for-analysis picture of your business, as opposed to segmented snapshots based on a limited data set.

When evaluating a data integration solution for the enterprise, look for:

  • Ease of use to boost developer productivity
  • A proven track record in the industry
  • Widely available technology expertise
  • Experience with production deployments with newer technologies like Hadoop
  • Ability to reuse data pipelines across different technologies (e.g. data warehouse, RDBMS, Hadoop, and other NoSQL databases)

Trustworthy data

Data integration is only part of the story. When you’re depending on data to drive business decisions and risk management, you clearly want to ensure the data is reliable. Data governance, data lineage, data quality, and data auditing remain important topics in an EDH. Oftentimes, data privacy regulatory demands must be met, and the enterprise’s own intellectual property must be protected from accidental exposure.

To help ensure that data is sound and secure, look for a solution that provides:

  • Centralized management and control
  • Data certification prior to publication, transparent data and integration processes, and the ability to track data lineage
  • Granular security, access controls, and data masking to protect data both in transit and at the source to prevent unauthorized access to specific data sets
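
As one small illustration of the last bullet, here is a minimal Python sketch of role-based, field-level masking. The record layout, roles, and masking rule are all assumptions for the example, standing in for the much richer controls a real solution provides.

```python
# Minimal field-level masking sketch: sensitive fields are masked per
# role before data leaves the source. Record layout and roles invented.

SENSITIVE_FIELDS = {"ssn", "account_number"}

def mask(value: str) -> str:
    """Keep the last four characters, mask the rest."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def apply_masking(record: dict, role: str) -> dict:
    """Auditors see originals; every other role sees masked values."""
    if role == "auditor":
        return dict(record)
    return {k: mask(v) if k in SENSITIVE_FIELDS else v for k, v in record.items()}

record = {"name": "J. Smith", "ssn": "123-45-6789", "account_number": "9876543210"}
print(apply_masking(record, role="analyst"))
# {'name': 'J. Smith', 'ssn': '*******6789', 'account_number': '******3210'}
```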

Informatica is the data integration solution selected by many enterprises. Informatica’s family of enterprise data integration, data quality, and other data management products can manage data of any format, complexity level, or size, from any business system, and then deliver that data across the enterprise at the desired speed.

Watch the latest Gartner video to see Todd Goldman, Vice President and General Manager for Enterprise Data Integration at Informatica, as well as executives from Cisco and MapR, give their perspective on how businesses today can gain even more value from big data.
