Category Archives: Marketplace
I have a little fable to tell you…
This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.
And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed two large, crowded tables. After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he asked. Since the tables were partially occupied and there were no virtual tables available, the host looked to the restaurant’s patio, where there were two open tables. “Shall I do an outside join instead?” asked the programmer. The host considered their schema and assigned two seats to the space.
The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon,” he said. With that contextual information, they thought about forgoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available, and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories. They both realized they had no idea what important elements were in the food, but came to the conclusion that this restaurant had a “Big Food” problem.
They scoped it out for a moment, and then the writer did an about-face, reversal, change in direction, and the SQL programmer did a commit and a quick pivot toward the buffer line, where they did a batch insert of all of the food, even the BLOBs of spaghetti, mashed potatoes, and Jell-O. There was far too much, and it was far too rich for their tastes and needs, but they binged and consumed it all. You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “Big Food” ingestion – and it nearly caused a core dump, in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.
It was clear they needed relief. The programmer did an ad hoc query to JSON, their Server (who they thought was Active), asking why they were having such “Big Food” indigestion and whether any packets of relief were available. No response. Then they asked again. There was still no response. So the programmer said to the writer, “Gee, the Quality of Service here is terrible!”
Just then, the programmer remembered a remedy he had heard about previously, and so he spoke up. “Oh, it’s very easy: just SELECT Vibe.Data.Stream FROM INFORMATICA WHERE REAL_TIME IS NOT NULL.”
Informatica’s Vibe Data Stream enables streaming food collection for real-time Big Food analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources, at high scale and low latency. It delivers the right food, ingested at the right time, when nutrition is needed – with no need for binge or batch ingestion.
And so they all lived happily ever after and all was good in the IT Corporation once again.
Download Now and take your first steps to rapidly developing applications that sense and respond to streaming food (or data) in real-time.
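For the technically curious, the difference between batch and streaming ingestion that the fable plays on can be sketched in a couple of Python functions (a toy illustration of the two patterns, not Vibe Data Stream’s actual API):

```python
def batch_ingest(source):
    """The all-you-can-eat approach: load the entire dataset at once,
    holding everything in memory before any of it is processed."""
    return list(source)

def stream_ingest(source):
    """The streaming approach: yield one record at a time, so each record
    can be processed as it arrives and nothing piles up."""
    for record in source:
        yield record
```

With `stream_ingest`, a consumer can begin digesting the first record before the source has produced the last one; with `batch_ingest`, nothing happens until the whole meal has been loaded.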
Managing the recovery and flow of data files throughout your enterprise is much like managing the flow of oil from well to refinery – a wide range of tasks must be carefully completed to ensure optimal resource recovery. If these tasks are not handled properly, or are not addressed in the correct order, valuable resources may be lost. When the process involves multiple pipelines, systems, and variables, managing the flow of data can be difficult.
Organizations have many options to automate the processes of gathering data, transferring files, and executing key IT jobs. These options include home-built scheduling solutions, system integrated schedulers, and enterprise schedulers. Enterprise schedulers, such as Skybot Scheduler, often offer the most control over the organization’s workflow, as they offer the ability to create schedules connecting various applications, systems, and platforms.
In this way, the enterprise scheduler facilitates the transfer of data into and out of Informatica PowerCenter and Informatica Cloud, and ensures that raw materials are refined into valuable resources.
Enterprise Scheduling Automates Your Workflow
Think of an enterprise scheduler as the pipeline bearing data from its source to the refinery. Rather than allowing jobs or processes to execute randomly or to sit idle, the enterprise scheduler automates your organization’s workflow, ensuring that tasks are executed under the appropriate conditions without the need for manual monitoring or the risk of data loss.
Skybot Scheduler addresses the most common pain points associated with data recovery, including:
- Scheduling dependencies: In order for PowerCenter or Cloud to complete the data gathering processes, other dependencies must be addressed. Information must be swept and updated, and files may need to be reformatted. Skybot Scheduler automates these tasks, keeping the data recovery process consistently moving forward.
- Reacting to key events: As with oil recovery, small details can derail the successful mining of data. Key events, such as directory changes, file arrivals, and evaluation requirements can lead to a clog in the pipeline. Skybot Scheduler maintains the flow of data by recognizing these key events and reacting to them automatically.
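The event-driven pattern described above – wait for a file-arrival event, then run the dependent job – can be sketched in a few lines of Python. This is a generic illustration of the concept; the function names are hypothetical and not Skybot’s or PowerCenter’s actual API:

```python
import time
from pathlib import Path

def wait_for_file(path, poll_interval=1.0, timeout=60.0):
    """Poll until `path` exists (a file-arrival event) or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if Path(path).exists():
            return True
        time.sleep(poll_interval)
    return False

def run_job(name, dependencies_met):
    """Execute a job only once its upstream dependencies have completed."""
    if not dependencies_met:
        return f"{name}: held (upstream dependency not met)"
    return f"{name}: executed"

def trigger_etl(source_file, job_name):
    """Chain the two: the ETL session runs only after the source file arrives."""
    arrived = wait_for_file(source_file, poll_interval=0.5, timeout=5.0)
    return run_job(job_name, dependencies_met=arrived)
```

An enterprise scheduler generalizes this idea across platforms, replacing ad hoc polling scripts with centrally managed event monitors and dependency chains.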
Choose the Best Pipeline Available
Skybot Scheduler is one of the most powerful enterprise scheduling solutions available today, and is the only enterprise scheduler integrated with PowerCenter and Cloud.
Capable of creating comprehensive cross-platform automation schedules, Skybot Scheduler manages the many steps in the process of extracting, transforming, and loading data. Skybot maintains the flow of data by recognizing directory changes and other key events, and reacting to them automatically.
In short, by managing your workflow, Skybot Scheduler increases the efficiency of ETL processes and reduces the potential of a costly error.
To learn more about the power of enterprise scheduling and Skybot Scheduler, check out this webinar: Improving Informatica ETL Processing with Enterprise Job Scheduling, or download the free trial.
When the average person hears of cloning, my bet is that they think of the controversy and ethical issues surrounding cloning, such as the cloning of Dolly the sheep, or the possible cloning of humans by a mad geneticist in a rogue nation state. I would also put money down that when an Informatica blog reader thinks of cloning, they think of “The Matrix” or “Star Wars” (that dreadful Episode II: Attack of the Clones). I did. Unfortunately.
But my pragmatic expectation is that when Informatica customers think of cloning, they also think of Data Cloning software. Data Cloning software clones terabytes of database data into a host of other databases, data warehouses, analytical appliances, and Big Data stores such as Hadoop. And just for hoots and hollers, you should know that almost half of all Data Integration efforts involve replication, be it snapshot or real-time, according to TDWI survey data. Survey also says… replication is the second most popular — or second most used — data integration tool, behind ETL.
Do your company’s cloning tools work with non-standard types? Know that Informatica’s cloning tools can reproduce Oracle data to just about anything on two tuples (or more). We do non-discriminatory duplication, so it’s no wonder we especially fancy cloning the Oracle! (a thousand apologies for the bad “Matrix” pun)
Just remember that data clones are an important and natural component of business continuity, and the use cases span both operational and analytic applications. So if you’re not cloning your Oracle data safely and securely with the quality results that you need and deserve, it’s high time that you get some better tools.
Send in the Clones
With that in mind, if you haven’t tried cloning before, for a limited time Informatica is making its Fast Clone database cloning software available as a free trial download. Click here to get it now.
Happy birthday to Informatica Marketplace!
In a brief three years, Informatica’s Marketplace has reached several major milestones, including the posting of its 1,000th Block (application or service) and surpassing 10,000 unique downloads per month. Over that time, the Blocks posted on the Marketplace have marked the onward progress and evolution of the data management industry, from multi-domain MDM to in-memory processing to big data and beyond. In fact, if one were to go back and take snapshots of which Blocks were posted, and when, it would be a sort of “data” time capsule. To me, that’s one of the most compelling features of the Marketplace. In addition to the plentiful and diverse set of offerings the Marketplace Blocks represent, they are continuously evolving to address the ever-changing “data” landscape and reflect the latest in technological innovation and best practices.
So what can we expect next from the Marketplace? Let me make a prediction. Over the next 3 years, I believe you will see a profound recognition and appreciation for data as a key enabler of significant and real improvements to business performance and operational effectiveness – and this shift will be driven by the business, not IT. The shift is already underway, but to speed it along will require those of us in the industry to continue to move the conversation beyond “it’s foundational”, and more appropriately position “data” as another lever for the achievement of business goals. In order to do that, we will need to speak in terms that resonate with the customer and frame the data discussion in the context of business performance. At the operational level, how are poor data quality and bad data management practices impacting DSO, inventory turns, on-time manufacturing and delivery, revenue recognition and assurance, leverage spend, etc.? From a financial perspective, how do these data related impacts manifest themselves on the balance sheet, income statement, or cash flow statement – and to what degree? These are the issues that our customers care about, and our solutions and recommendations should be made with these in mind. And since all solution providers are competing for finite corporate wallet share, those of us in the “data” industry will be at a competitive disadvantage if we don’t adapt. I strongly believe we will get there, and just like the past 3 years this shift will be marked by the continuing evolution of Informatica’s Marketplace.
At InformaticaWorld, we made a very exciting announcement: the introduction of PowerCenter Express, our entry-level data integration and profiling tool. What is PowerCenter Express, exactly? Well, in a nutshell, it gives the power of PowerCenter to everyone – “to the people,” if you like. We made PowerCenter Express available to all attendees at InformaticaWorld, and they’ll be able to install it and be up and running in less than ten minutes. Since it’s PowerCenter, they’ll be able to scale up to enterprise-class capabilities whenever they need to, using Vibe, our “Map Once, Deploy Anywhere” technology. Starting in July, PowerCenter Express will be generally available to everyone as a free download from Informatica’s Marketplace.
What we are doing with PowerCenter Express is making sure that everyone, including departments and growing businesses, has access to PowerCenter’s high-quality data integration and profiling tools. Until now, the options for these groups have been limited: hand coding or open source products. Neither option can scale to handle enterprise-class data integration requirements. That meant that, before the advent of PowerCenter Express, when these smaller organizations reached the point where they needed enterprise-class capabilities and had to migrate to an enterprise data integration tool, they had no choice but to scrap all of their prior work. We don’t want that to happen anymore. We don’t want anyone to have to re-write mappings, to re-do work – ever. We want people to be able to map once and deploy anywhere. And that’s what PowerCenter Express makes possible: any organization, no matter how small, can start with PowerCenter – the gold standard for data integration – and stay with PowerCenter, re-using those same mappings when they transition to enterprise class, or when they want to deploy those mappings to Hadoop.
The reality is, as organizations’ data integration complexity reaches a certain point, they end up coming to Informatica – for the best products, the best support, and the biggest ecosystem of developers. But in the past, for smaller organizations, starting with the fully functional PowerCenter wasn’t always the best option. With PowerCenter Express, organizations can start small, start now, and scale fast. PowerCenter Express offers a real choice and future protection for entry-level data integration.
If you’d like to learn more about PowerCenter Express before the public launch, shoot me an email at EBurns@Informatica.com. And start following me here, I’ll be posting a lot about this exciting new product over the coming weeks and months.
Emily V. Burns
Sr. Product Marketing Manager, PowerCenter Express
Recently, the Informatica Marketplace reached a major milestone: we exceeded 1,000 Blocks (Apps). Looking back to three years ago when we started with 70 Blocks from a handful of partners, it’s an amazing achievement to have reached this volume of solutions in such a short time. For me, it speaks to the tremendous value that the Marketplace brings not only to our customers who download more than 10,000 Blocks per month, but also to our partners who have found in the Marketplace a viable route to market and a great awareness and monetization vehicle for their solutions.
There has been a lot of discussion around the explosion of data and what it means to companies trying to leverage this extremely valuable resource. Informatica has a huge part to play in helping customers solve those problems, not only through the technologies we provide directly, but through the tremendous ecosystem that we have built with our partners. The Marketplace has grown to more than 165 unique partner companies, and we’re adding more every day. Blocks such as BI & Analytics using Social Media Data from Deloitte and Interstage XWand – XBRL Processor from Fujitsu represent offerings from large, established software companies, while Blocks such as Skybot Enterprise Job Scheduler and Undraleu Code Review Tool from Coeurdata are solutions contributed by earlier-stage companies that have experienced significant success and growth. It has been a pleasure helping these companies grow and reach new customers through the Marketplace.
One of the most exciting things about reaching the 1K Block milestone is not just the number of companies on the Marketplace, but the number of solutions contributed by our developer community. Blocks such as Autotype Excel Macro, Execute Workflow, and iExportNormalizer are all solutions that Informatica developers built because they help in their daily activities, and through the Marketplace they have found a way to share these valuable assets with the community. In fact, over half of our solutions are free to use, which is a ringing endorsement of the power of the community and a great way to try out any number of useful solutions at no risk. By leveraging enabling technologies such as Informatica’s Cloud Platform as a Service, developers can create and share solutions more quickly and easily than ever before.
Overall, it has been an exciting ride as the Marketplace has rocketed to 1,000 Blocks in under three years, and I look forward to what the next three years have in store!
This week we got the news that, for the fifth year in a row, Informatica Cloud has won the 2012 Salesforce.com AppExchange Customer Choice Award. Informatica Cloud Integration for Salesforce was recognized as the winner in the very crowded IT and Administration category, which includes administration and IT, data cleansing, integration, IT management, and other applications. These awards are based on the number and quality of customer reviews on the AppExchange.
Informatica Cloud Winter 2013 has arrived. This is the fourteenth release of the company’s award-winning family of cloud integration applications and integration platform as a service (iPaaS), which has now expanded to include Informatica Cloud Master Data Management (MDM). In this post I’ll provide an overview of the new cloud integration and cloud data quality capabilities. Be sure to register for a 30-day trial and/or attend the release webinar on Thursday to see Informatica Cloud in action.
In 2006, Informatica announced a strategic roadmap for cloud data integration, which outlined three phases: