Tag Archives: PowerCenter
“We do nightly updates to our data warehouse, but we have no way to validate that the data was moved and transformed correctly in the time available. We only have time to test a subset and hope that it is the right subset.”
“We have tools to tell us about performance after an issue occurs, but nothing that helps us prevent the issue in the first place. So, we find out about failures from the end-users looking at a bad report. This causes delayed or poor business decision making, and also impacts our departments’ reputation.”
These are just a couple of quotes I have heard from data integration end users. We have been collecting similar feedback from thousands of our data integration customers about their challenges across the data integration lifecycle. What we are hearing is that the growth of data within organizations, and the ever-increasing demand for more timely data, has introduced a number of threats. In turn, these new demands have created massive variability in the way customers approach their projects, which introduces a host of data integration challenges, especially in production.
So naturally, organizations have taken a variety of approaches to combat these threats, ranging from adding full-time employee or contractor teams to test and monitor workflows, to developing customized scripts, to a combination of both. Yes, there are monitoring tools out there. But the fact remains that generic monitoring tools don't uncover deep data integration issues. This was discussed at length here. Additionally, typical testing efforts such as "stare and compare" and hand-coding are manual, unrepeatable, and unauditable, as discussed here.
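To make the contrast with "stare and compare" concrete, here is a minimal sketch of what an automated, repeatable load check might look like. This is a hypothetical illustration only, not Informatica's Data Validation product: it uses an in-memory SQLite database as a stand-in for a warehouse, and the table and column names are made up.

```python
import sqlite3

def validate_load(conn, source_table, target_table, checksum_column):
    """Compare row counts and a simple column checksum between
    a source and a target table after a nightly load."""
    cur = conn.cursor()
    results = {}
    for name, table in (("source", source_table), ("target", target_table)):
        # TOTAL() is SQLite's null-safe SUM; one query per table.
        cur.execute(f"SELECT COUNT(*), TOTAL({checksum_column}) FROM {table}")
        results[name] = cur.fetchone()
    return {
        "rows_match": results["source"][0] == results["target"][0],
        "checksum_match": results["source"][1] == results["target"][1],
    }

# Demo: the target is missing one row, so the check fails loudly
# instead of relying on someone eyeballing two result sets.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
report = validate_load(conn, "src", "tgt", "amount")
print(report)  # {'rows_match': False, 'checksum_match': False}
```

Because the check is a script rather than a manual diff, it can run after every nightly load and its results can be logged, which is what makes it repeatable and auditable.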
Now recall the added pressure of explosive data growth and increasing demands for timely delivery, and add to it the wide variability in how organizations approach projects: it's scary. But what's more concerning is that this variability riddles production environments with errors, inefficiencies, and security threats. Manual and reactive approaches to remedy the problem only exacerbate issues by increasing complexity. The result is delays in identifying, monitoring, and fixing issues, with teams resorting to fire drills, manual workarounds, and reactive measures that never address the underlying problems.
If any of this sounds familiar to you, you may want to take advantage of what we are doing at Informatica World 2013 to help you address these challenges. To make things convenient for you, I have prepared a customized guide to relevant sessions, hands-on labs and booths to help you understand how automated, repeatable and auditable testing, along with a pre-emptive approach to defusing threats before they erupt into full-blown issues, can help you. Please feel free to sign up here for some of the breakout sessions we are hosting on this topic, or swing by one of the labs or booths:
Tuesday, June 4
- 9:00AM – 10:00AM – Platform & Products Track – Best Practices for Doing Data Replication the Right Way
- 2:00PM – 3:00PM – Architecture Track – PowerCenter Architecture
Wednesday, June 5
- 2:30PM – 3:30PM – Platform & Products Track – New Approaches to Reducing PowerCenter Testing and Monitoring Time
Thursday, June 6
- 9:00AM – 10:00AM – Tech Talk Track – Proactive Monitoring: Greater IT Productivity with Streamlined Data Integration
- 10:15AM – 11:15AM – Platform & Products Track – What’s New from Informatica to Improve Data Warehouse Performance and Lower Costs
BIRDS OF A FEATHER ROUNDTABLES (Check Agenda for Daily Timings):
- Testing Strategies and Tools
- Automating Administrative Maintenance Tasks
HANDS-ON LABS (Check Agenda for Daily Timings):
- Table 42 – Informatica Data Validation
- Table 43 – Informatica Proactive Monitoring
BOOTHS (Check Agenda for Daily Timings):
- PowerCenter Developer Productivity and Production Manageability
I look forward to seeing you at Informatica World 2013.
The hype around big data is certainly top of mind with executives at most companies today, but what I am really seeing is companies finally making the connection between innovation and data. Data as a corporate asset is now getting the respect it deserves as part of a business strategy to introduce innovative new products and services and improve business operations. The most advanced companies have C-level executives responsible for delivering top- and bottom-line results by managing their data assets to their maximum potential. The Chief Data Officer and Chief Analytics Officer own this responsibility and report directly to the CEO.
In Ashwin Viswanath’s previous video blog, he spoke about why it is important to have a cloud integration solution that has purpose-built integration applications. In this video, he delves deeper into the security aspects of cloud integration and how to rapidly provision integration environments for distributed business units, subsidiaries and departments in a quick and efficient manner.
This year marks the 20th anniversary for Informatica. Twenty years of solving the problem of getting data from point A to point B, improving its quality, establishing a single view and managing it over its lifecycle. Yet after 20 years of innovation and leadership in the data integration market, when one would think the problem had been solved, all data had been extracted, transformed, cleansed and managed, it actually hasn’t — companies still need data integration. Why? Data is a complicated business. And with data increasingly becoming central to business survival, organizations are constantly looking for ways to unlock new sources of it, use it as an unforeseen source of insight, and do it all with greater agility and at lower cost.
Search Advertising, also known as Search Engine Marketing (SEM), is a unique medium for attracting new customers. It is a method of placing online advertisements on Web pages that show results from search engine queries. Through the same search engine advertising services, advertisements can also be placed on Web pages with other published content. It is a multi-billion dollar industry. Advertising on Google, Yahoo and MSN gives you a total reach of roughly 86.4% of all Internet users. With such a broad reach, Search Advertising is one of the most far-reaching forms of advertising available, with benefits that other forms of advertising lack.
In Search Advertising, advertisers pay the website owner for clicks on their ads. There are two major types of Search Advertising: Sponsored Search and Content Placement Targeting.
In a recent webinar, Mark Smith, CEO at Ventana Research and David Lyle, vice president, Product Strategy at Informatica discussed: “Building the Business Case and Establishing the Fundamentals for Big Data Projects.” Mark pointed out that the second biggest barrier that impedes improving big data initiatives is that the “business case is not strong enough.” The first and third barriers respectively, were “lack of resources” and “no budget” which are also related to having a strong business case. In this context, Dave provided a simple formula from which to build the business case:
Return on Big Data = Value of Big Data / Cost of Big Data
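Dave's formula is simple enough to express directly in code. A minimal sketch follows; the dollar figures are illustrative assumptions for a worked example, not numbers from the webinar:

```python
def return_on_big_data(value, cost):
    """Return on Big Data = Value of Big Data / Cost of Big Data."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return value / cost

# Hypothetical example: a project expected to deliver $3M of
# business value at a total cost of $1.2M returns 2.5x its cost.
roi = return_on_big_data(3_000_000, 1_200_000)
print(roi)  # 2.5
```

A ratio above 1.0 means the projected value exceeds the cost, which is the threshold a business case built on this formula would need to clear.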
The PowerCenter Big Data Edition’s here! The PowerCenter Big Data Edition’s here!
“The new phone book’s here! The new phone book’s here!”
“…Millions of people look at this book every day! This is the kind of spontaneous publicity – your name in print – that makes people. I’m in print! Things are going to start happening to me now.”
– Navin R. Johnson
Thousands of Oracle OpenWorld 2012 attendees visited the Informatica booth to learn how to leverage their combined investments in Oracle and Informatica technology. Informatica delivered over 40 presentations on topics that ranged from cloud, to data security, to smart partitioning. Key Informatica executives and experts from product engineering and product management spoke with hundreds of users and answered questions on how Informatica can help them improve Oracle application performance, lower risk and costs, and reduce project timelines.