Category Archives: Informatica 9.5
For once, hype could be a good thing. At least it is if you're reading the latest Gartner Hype Cycle for Application Infrastructure, published in July, because in it you will see how two important technology trends, B2B Gateway Software (BGS, sorry, another TLA for you to learn, and a nested one at that) and Managed File Transfer (MFT), have now made it out of the Trough of Disillusionment and onto the Slope of Enlightenment. Why do I suddenly feel like John Bunyan's Pilgrim?
Anyway, the key points that Gartner identifies are that centrally managing B2B interactions provides:
- Economies of scale and deeper insight into the technical aspects of data integration, transaction delivery, process integration and SOA interoperability, such as consolidating, tracking, storing and auditing files, messages, process events, acknowledgments, receipts, and errors and exceptions.
The B2B communications process with your external business partners, suppliers and other parties is not static, and you need visibility of these communications, not only for regulatory compliance and auditability but also to manage a process that is constantly changing.
- A single point through which to troubleshoot B2B integration issues.
B2B gateways are now a mature technology and, as with standards, most organisations use several of them. Integrating them provides significant benefit: the organisation gains visibility of its business relationships and transactions, and knows where to go when things go wrong and need managing, as they inevitably will.
- A central, reusable repository for external business partner profiles and Web services APIs. This is particularly valuable when dealing with a large number of external business partners and cloud APIs, and when multiple business units interact with the same partners or cloud services.
The number of business partners we all have to deal with is increasing rapidly as we outsource, subcontract, farm out and generally rely more on external specialist organisations. Having visibility of these relationships, and making the most of new integration methodologies and processes, can generate great savings while also giving visibility of our business exposure to these suppliers.
- Support for the myriad data formats, transport and communication protocols, and security standards.
As the old saying goes, "I love standards; there are so many to choose from." Our business processes are not getting any simpler: data standards are under constant change and revision, and data formats are becoming increasingly complex. Being able to handle not just a few but all key formats, to reuse previous transformation experience, to utilise already developed libraries and to leverage new complex hierarchical data structures makes the difference between a stove-piped, soon-to-be-redundant system and one that is flexible and supports new and ever-changing business requirements.
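To make the hierarchical-data point concrete, here is a minimal, illustrative sketch (all field names are invented, and this is not any vendor's transformation engine) of flattening a nested B2B message into the flat rows a legacy, stove-piped system expects:

```python
# Minimal sketch: flatten a hypothetical nested B2B order message into
# dotted flat keys so a row-oriented legacy system can consume it.

def flatten(record, parent_key="", sep="."):
    """Recursively flatten nested dicts into a single flat dict."""
    items = {}
    for key, value in record.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, full_key, sep))
        else:
            items[full_key] = value
    return items

order = {
    "id": "PO-1001",
    "buyer": {"name": "Acme Corp", "address": {"city": "Telluride"}},
    "total": 249.99,
}

flat = flatten(order)
# flat == {"id": "PO-1001", "buyer.name": "Acme Corp",
#          "buyer.address.city": "Telluride", "total": 249.99}
```

Real transformation libraries do far more (arrays, schemas, type coercion), but the sketch shows why reusable transformation logic beats rebuilding this per project.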
As one of the vendors Gartner identifies as able to compete actively with offerings positioned to address the broader set of usage scenarios, Informatica's B2B Data Exchange solution not only supports the B2B functional requirements an organisation will have but also integrates this process and data into the wider internal data integration platform and management process. (Look at this presentation for the key new features in 9.5.)
So now the reality, not the hype, of B2B solutions can be delivered.
Sources: Gartner Hype Cycle for Application Infrastructure, 2012. Published: 24 July 2012
Analyst: Jess Thompson
Gartner Disclaimer re the Hype Cycle
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
It’s that time of year again: Gartner has published its annual Magic Quadrant for Data Quality Tools, and Informatica has been positioned as a leader. Informatica continues to build on a strong heritage of success in delivering enterprise data quality solutions, with a continued focus on industry-leading data management that addresses the end-to-end data quality lifecycle. In large part, our progress can be attributed to our product roadmap, specifically our recent release of Informatica Data Quality 9.5, aimed at helping organizations maximize their return on data. Let’s look at a few of the more significant additions in this release:
Data Governance and Stewardship
Data governance as a corporate initiative is certainly here to stay, and increasingly, organizations are looking for ways to leverage technology more effectively in support of their governance efforts. In Informatica Data Quality 9.5 we’ve added new capabilities so data stewards can use technology to their advantage. In particular, new workflow and task management capabilities help automate otherwise manual, error-prone data reconciliation steps. Armed with such capabilities, data stewards can effectively align the activities of both business and IT in addressing data quality issues without excessive overhead.
With the rise of social computing platforms, customer analytics has a new wealth of information from which to produce unforeseen insight into customer and product patterns. The challenge, however, is how to make sense of what is otherwise unstructured information. One solution is natural language processing (NLP), which applies probabilistic techniques to extract structure and meaning from text. With NLP available in Informatica Data Quality 9.5, organizations can now make sense of free-form text from feeds like Facebook or LinkedIn and use it to sharpen their decision-making processes while harnessing the benefits of big data.
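To illustrate the general idea of turning free-form text into a usable signal (this is a deliberately naive keyword sketch, not Informatica's NLP, which uses far richer probabilistic models; the word lists and posts are invented):

```python
import re
from collections import Counter

# Toy lexicons; a real NLP pipeline would use trained models instead.
POSITIVE = {"love", "great", "awesome", "recommend"}
NEGATIVE = {"hate", "broken", "slow", "refund"}

def crude_sentiment(post):
    """Very crude keyword-based sentiment score for one social post."""
    words = re.findall(r"[a-z']+", post.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "Love the new phone, would recommend it!",
    "Screen arrived broken and support is slow.",
]
scores = [crude_sentiment(p) for p in posts]
# scores == [2, -2]

# The same tokenized words also feed simple pattern analytics:
mentions = Counter(w for p in posts for w in re.findall(r"[a-z']+", p.lower()))
```

Even this toy version shows the shape of the problem: unstructured posts in, structured scores and counts out, ready to join against customer records.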
Data discovery, quite simply, is the process of uncovering specific types of data when their meaning is not otherwise known. With new data discovery capabilities now available in Informatica Data Quality 9.5, organizations can easily and effectively identify key pieces of information without the need for manual intervention. This is of particular value in areas such as data masking, where the identification of sensitive information such as personally identifiable information (PII) is of utmost importance.
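As an illustration of the principle rather than the product's implementation, a minimal pattern-based discovery pass might classify a column by how many of its values match known PII shapes. The patterns and the 80% threshold below are hypothetical; real discovery tools also use dictionaries, checksums, and column metadata:

```python
import re

# Hypothetical PII shapes for illustration only.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def discover_pii(column_values, threshold=0.8):
    """Label a column with each PII type that most of its values match."""
    hits = {}
    for label, pattern in PII_PATTERNS.items():
        matched = sum(bool(pattern.search(str(v))) for v in column_values)
        if column_values and matched / len(column_values) >= threshold:
            hits[label] = matched
    return hits

emails = ["a.user@example.com", "b@example.org", "c@example.net"]
# discover_pii(emails) == {"email": 3}
```

Once a column is flagged this way, a masking job knows exactly where to apply its rules without anyone eyeballing the schema.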
These are just a few of the capabilities introduced in Informatica Data Quality 9.5 last month. As evidenced in the 2012 Magic Quadrant, we believe the innovation and roadmap for our products are clearly going in the right direction. We’re excited by what lies ahead and plan to continue to drive new and innovative capabilities for the data quality market.
A supermarket in Telluride, Colorado struck me as a funny place to find an analogy for data quality, but on a recent trip there it was. You see, supermarkets here require you to bring your own bags to cart your groceries home. Disposable plastic bags are banned here; the town has made a firm commitment to the philosophy of Reduce, Reuse and Recycle. By adhering to this same philosophy, data integration teams can develop and deploy successful data quality strategies across the enterprise despite the constraints of today’s “do more with less” IT budgets.
In the decade that I’ve been in the Information Management space, I’ve noticed that success in data integration usually comes in small increments, typically on a project-by-project basis. However, by leveraging those small incremental successes and deploying them in a repeatable, consistent fashion, either as standardized rule sets or as data services in a SOA, development teams can maximize their impact at the enterprise level.
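The "reuse" idea can be sketched in a few lines: package small, project-proven checks as a named rule set that later projects consume rather than rewrite. This is only an illustrative sketch under assumed conventions; the rule names, fields, and postcode format below are invented:

```python
import re

# Hypothetical shared rule set: each rule is a small, reusable predicate.
RULES = {
    "not_empty": lambda v: bool(str(v).strip()),
    "valid_postcode": lambda v: bool(re.fullmatch(r"\d{5}", str(v))),
}

def validate(record, rule_map):
    """Return the list of (field, rule) pairs that failed."""
    failures = []
    for field, rule_names in rule_map.items():
        for name in rule_names:
            if not RULES[name](record.get(field, "")):
                failures.append((field, name))
    return failures

customer = {"name": "Acme", "zip": "8143"}
print(validate(customer, {"name": ["not_empty"], "zip": ["valid_postcode"]}))
# [('zip', 'valid_postcode')]
```

The point is the packaging, not the rules themselves: a second project reuses `RULES` with its own `rule_map` instead of reimplementing the checks, which is the incremental, Reduce-Reuse-Recycle path to enterprise-wide coverage.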
In a May 2012 survey by the Ponemon Institute, 66 percent of respondents said they are not confident their organization would be able to detect the loss or theft of sensitive personal information contained in systems operated by third parties, including cloud providers. In addition, the majority are not confident that their organization would be able to detect the loss or theft of sensitive personal information in their company’s production environment.
Which aspect of data security for your cloud solution is most important?
1. Is it to protect the data in copies of production/cloud applications used for test or training purposes? For example, do you need to secure data in your Salesforce.com Sandbox?
2. Is it to protect the data so that a user will see data based on her/his role, privileges, location and data privacy rules?
3. Is it to protect the data before it gets to the cloud?
Compliance continues to drive people to action, and compliance with contractual agreements, especially for cloud infrastructure, continues to drive investment. In addition, many organizations are supporting Salesforce.com as well as packaged solutions such as Oracle E-Business Suite, PeopleSoft, SAP, and Siebel.
Of the available data protection solutions, tokenization has been used and is well known for supporting PCI data and preserving the format and width of a table column. But because many tokenization solutions today require creating database views or changing application source code, it has been difficult for organizations to support packaged applications that don’t allow these changes. In addition, databases and applications take a measurable performance hit to process tokens.
What might work better is to tokenize data dynamically before it gets to the cloud: a transparent layer between the cloud and on-premise data integration would replace the sensitive data with tokens. This way, no additional application code would be required.
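A toy sketch of that idea, with an on-premise mapping vault and format-preserving tokens, might look like the following. This is an assumption-laden illustration, not any vendor's implementation; a real vault is a hardened, persistent store, not a dict, and collisions would be handled:

```python
import secrets
import string

class Tokenizer:
    """Sketch of a tokenization layer: sensitive values are swapped for
    format-preserving tokens before leaving the premises; the mapping
    stays in an on-premise vault (here, just a dict)."""

    def __init__(self):
        self._vault = {}      # token -> original value (never leaves premises)
        self._by_value = {}   # original value -> token (reuse the same token)

    def tokenize(self, value):
        if value in self._by_value:
            return self._by_value[value]
        # Preserve length and digit/letter positions so downstream
        # column widths and format checks still pass.
        token = "".join(
            secrets.choice(string.digits) if ch.isdigit()
            else secrets.choice(string.ascii_uppercase) if ch.isalpha()
            else ch
            for ch in value
        )
        self._vault[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token):
        return self._vault[token]

t = Tokenizer()
card = "4111-1111-1111-1111"
tok = t.tokenize(card)   # e.g. "8305-2241-9957-0463"; same shape, no real PAN
```

Because the token has the same width and format as the original, the cloud application stores and displays it untouched, which is exactly why no view changes or source-code changes are needed.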
In the Ponemon survey, most respondents said the best control is to dynamically mask sensitive information based on the user’s privilege level. After dynamic masking, they rated encrypting all sensitive information contained in the record as the next best option.
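Privilege-based dynamic masking can likewise be sketched in miniature. The roles, policy table, and masking rule below are all hypothetical, chosen only to show the mechanism the survey respondents favored:

```python
def mask_ssn(value):
    """Show only the last four digits, e.g. ***-**-6789."""
    return "***-**-" + value[-4:]

def read_record(record, role, policies):
    """Apply masking to fields the caller's role may not see in clear."""
    masked = dict(record)
    for field, allowed_roles in policies.items():
        if role not in allowed_roles and field in masked:
            masked[field] = mask_ssn(masked[field])
    return masked

POLICIES = {"ssn": {"hr_admin"}}  # only hr_admin sees SSNs in clear
rec = {"name": "Pat", "ssn": "123-45-6789"}

support_view = read_record(rec, "support", POLICIES)
# support_view == {"name": "Pat", "ssn": "***-**-6789"}
admin_view = read_record(rec, "hr_admin", POLICIES)
# admin_view == rec (unmasked)
```

The key property is that masking happens at read time, per request: the stored data is unchanged, so the same record serves both the privileged and the restricted user.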
The strange thing is that people recognize there is a problem but are not spending accordingly. In the same survey from Ponemon, 69% of organizations find it difficult to restrict user access to sensitive information in IT and business environments. However, only 33% say they have adequate budgets to invest in the necessary solutions to reduce the insider threat.
Is this an opportunity for you?
Hear Larry Ponemon discuss the survey results in more detail during a CSOonline.com/Computerworld webinar, Data Privacy Challenges and Solutions: Research Findings with Ponemon Institute, on Wednesday, June 13.
What an amazing week we had last week in Vegas with the Informatica community. I hope you all enjoyed the conference as much as I did. Having spent the last year pulling together the various components, it is wonderful when it all comes together! There were many memorable highlights, ranging from the Product Councils on Monday to the pool party that evening, the keynotes on Tuesday (including the launch of the Informatica 9.5 Platform), the breakouts, hands-on labs, the Executive Summit, the Advisory Boards, the party at Haze nightclub and the closing keynotes.
Here are a few of my favorites. What are yours?
The world has gone mobile. Among consumers, the younger generation has grown up connecting via mobile devices rather than computers. In some developing countries, entire societies have skipped over computers directly to their smartphones to connect and interact. More and more enterprises are providing their workforce with mobile devices, or enabling “bring your own device”.
And mobile devices are just the tip of the machine-generated data iceberg. We are experiencing geometric growth in data being generated by machines and devices, ranging from mobile phones and tablets to smart meters to RFID tags to equipment sensors. There are billions of machines creating deluges of real-time data. And most traditional IT systems are simply not equipped to handle this type of big data.
Social networking is becoming inescapable. It has become mainstream faster than almost anyone could have predicted (other than perhaps Mark Zuckerberg).
Full disclosure: I hardly ever use Facebook. Perhaps as a working mom with two young children, keeping up with former high school classmates is a luxury I can’t afford (or don’t want). But I use other types of social media extensively. I use LinkedIn for professional networking, recruiting, knowledge sharing and development. I use Twitter to communicate with customers, analysts and industry peers. I use several local online moms’ communities for advice on toddler tantrums, teething and preschools.
Just five years ago, there was a perception held by many in our industry that the world of data for enterprises was simplifying. This was in large part due to the wave of consolidation among application vendors. With SAP and Oracle gobbling up the competition to build massive, monolithic application stacks, the story was that this consolidation would simplify data integration and data management.
Quite a bit has happened on the topic of big data since my last post on Informatica Perspectives almost one and a half years ago. I have spent a career working with organizations on how to get control over their uncontrolled data growth, and industry visionaries are promoting this brave new world of big data.