Tag Archives: SAP
Adopting SAP HANA can offer significant new business value, but it can also be an expensive proposition. If you are contemplating or already in the process of moving to HANA, it’s worth your time to understand your options for Nearlining your SAP data. The latest version of Informatica ILM Nearline, released in February, has been certified by SAP and runs with SAP BW systems on HANA or on any relational database supported by SAP.
Nearlining your company’s production SAP BW before migrating to a HANA-based BW can yield huge savings. Even if your HANA project has already started, Nearlining the production data will help keep database growth flat. We have customers who have actually been able to shrink InfoProviders by enforcing strict data retention rules on the data stored in the live database.
Informatica World is around the corner, and I will be there with my peers to demo and talk about the latest version of Informatica ILM Nearline. Click here to learn more about Informatica World 2013, and make sure you sign up for one of my Hands-On Lab sessions on this topic. See you at the Aria in Las Vegas in June.
SAP’s data warehouse solution (SAP BW) provides enterprises the ability to easily build a warehouse over their existing operational systems with pre-defined extraction and reporting objects and methods. Data that is loaded into SAP BW is stored in a layered architecture which encourages reusability of data throughout the system in a standardized way. SAP’s implementation also enables easy audits of data delivery mechanisms that are used to produce various reports within the system.
To allow enterprises to achieve this level of standardization and auditability, SAP BW must persistently store large amounts of data within the different layers of its architecture. Managing the size of the objects within these layers becomes increasingly important as the system grows, to ensure high levels of performance for end-user queries and data delivery. (more…)
Just five years ago, there was a perception held by many in our industry that the world of data for enterprises was simplifying. This was in large part due to the wave of consolidation among application vendors. With SAP and Oracle gobbling up the competition to build massive, monolithic application stacks, the story was that this consolidation would simplify data integration and data management. (more…)
I’ve been asked numerous times how Dynamic Data Masking works, so here it is – The Dynamic Data Masking process. Believe me it’s simple …
The use case: IT personnel, developers, consultants and outsourced support teams have access to production business applications (SAP, PeopleSoft, Oracle) or to clones/backups that contain sensitive customer information and credit card information.
We cannot block their access, as they are required to ensure application performance, but we need to secure the data they are accessing.
These are the initial installation steps required:
- Install Informatica Dynamic Data Masking on the database server or on a dedicated server, since it acts as a proxy.
- Import one of our predefined rule sets prepared for the application/data, or create your own custom rules.
- Define the roles/responsibilities that need to be anonymized, using predefined hooks to Active Directory/LDAP and application responsibilities.
Now how does Dynamic Data Masking work?
- User requests are intercepted in real time by the Dynamic Data Masking server software.
- User roles and responsibilities are evaluated, and if the rules specify that a request requires masking, Dynamic Data Masking rewrites it to return masked/scrambled personal information. No application changes, no database changes – completely transparent.
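To make the rewrite step concrete, here is a rough Python sketch of rule-based query rewriting as a masking proxy might perform it. The rule format, role names and masking expressions below are illustrative assumptions, not Informatica’s actual rule syntax:

```python
# Hypothetical sketch of role-based query rewriting in a masking proxy.
# Rule names and masking expressions are illustrative only.
import re

# Rules: for users in these roles, replace sensitive column references
# in the SELECT list with masking expressions.
MASKING_RULES = {
    "OUTSOURCE_SUPPORT": {
        "credit_card": "CONCAT('XXXX-XXXX-XXXX-', SUBSTR(credit_card, -4))",
        "ssn": "'***-**-****'",
    }
}

def rewrite_query(sql: str, role: str) -> str:
    """Pass the query through unchanged for trusted roles; otherwise
    substitute masking expressions for sensitive column references."""
    rules = MASKING_RULES.get(role)
    if not rules:
        return sql  # this role sees the real data
    for column, masked_expr in rules.items():
        sql = re.sub(rf"\b{column}\b", masked_expr, sql)
    return sql

# A support user's query is rewritten; a trusted role's query is untouched.
print(rewrite_query("SELECT name, credit_card FROM customers", "OUTSOURCE_SUPPORT"))
print(rewrite_query("SELECT name, credit_card FROM customers", "DBA"))
```

A real product would parse the SQL rather than pattern-match it, but the principle is the same: the application and database stay unchanged, and only the intercepted statement is rewritten.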
Sounds simple – yes it is.
Other common use cases include protecting development and reporting-tool access to production databases, anonymizing data warehouse reports and design tools, and securing production clones and training environments.
I spent last weekend reading Geoffrey Moore’s new book, Escape Velocity: Free Your Company’s Future from the Pull of the Past. Then on Sunday, the New York Times published this article about salesforce.com: A Leader in the Cloud Gains Rivals. Clearly “The Big Switch” is on. With this as a backdrop, the need for a comprehensive cloud data management strategy has surfaced as a top IT imperative heading into the New Year – How and when do you plan to move data to the cloud? How will you prevent SaaS silos? How will you ensure your cloud data is trustworthy, relevant and complete? What is your plan for longer-term cloud governance and control?
These are just a few of the questions you need to think through as you develop your short, medium and long-term cloud strategy. Here are my predictions for what else should be on your 2012 cloud integration radar. (more…)
I recently returned from China and Hong Kong after having met with several CIOs, media and analysts, as well as delivering keynotes focused on customer centricity. When I return to the US after traveling, I’m often asked about the state of IT in the geography I was just in. I’ve been to both China and Hong Kong several times over the past few years, and from my perspective, IT is maturing at a very rapid pace in that region.
During prior trips to Asia, it felt like the old days of data processing. I would speak with senior IT leaders and they were more concerned with the “blocking and tackling” of IT, and not looking at how IT can provide a strategic competitive advantage. Specifically in China, IT leadership was comfortable scaling by applying people to the problem rather than using commercial software. (more…)
The Harry Potter books and movies were a particularly popular inspiration for project names. For example, at LinkedIn, to empower features such as “People You May Know” and “Jobs You May Be Interested In”, LinkedIn uses Hadoop together with an Azkaban batch workflow scheduler and Voldemort key-value store. We’ll see if the Twilight series has a similar impact on project names.
As companies increasingly explore master data management (MDM), we often hear inquiries about the usability of master data by business users.
Common questions include: Do business users need to learn and use a separate MDM application? Do they need support from IT to access master data? Can master data fit into the everyday business applications they use for CRM, SFA, ERP, supply chain management, and so forth?
In my previous blog, we talked about the benefit of making subsets of test data from live production applications and masking them to address cost and security issues. When an application has a simple data model where subsets can be made with a simple query on a few tables, implementing or purchasing a solution may not be warranted. When dealing with complex custom or packaged applications such as Oracle E-Business Suite or SAP, functional test cases are typically organized by business process, organization, time, or a combination of these.
Complex custom applications or packaged applications contain data for multiple business processes, such as Accounts Payable or Receivable, Sales & Distribution, or Payroll. Developing a SQL query that selects a complete subset for each of these processes for a particular business unit or geography, and then masks the data while ensuring the test application will continue to function, is NOT a trivial task. It requires detailed knowledge of the data model – including all database constraints, primary and foreign key relationships, data dependencies on programs running in the application tier, and any inter-application dependencies via database links or other interfaces. It is possible to develop such a solution internally, but the time and effort required to build and test it can erase the cost benefit of subsetting. (more…)
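To illustrate why referential integrity is the crux of subsetting, here is a deliberately tiny sketch using Python’s built-in sqlite3: a parent table is subset on a business criterion, child rows are pulled by following the foreign key, and sensitive columns are masked on the way out. The schema, column names and masking format are invented for the example; a real packaged application would have thousands of interrelated tables:

```python
# Toy sketch of FK-aware subsetting plus masking. Schema and column
# names are hypothetical; sqlite3 stands in for the production database.
import sqlite3

src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, ssn TEXT, region TEXT);
    CREATE TABLE invoices  (id INTEGER PRIMARY KEY,
                            customer_id INTEGER REFERENCES customers(id),
                            amount REAL);
    INSERT INTO customers VALUES (1, 'Acme',   '123-45-6789', 'EMEA'),
                                 (2, 'Globex', '987-65-4321', 'APAC');
    INSERT INTO invoices  VALUES (10, 1, 500.0), (11, 2, 750.0);
""")

# 1. Drive the subset from a business criterion (here: one region) ...
subset_customers = src.execute(
    "SELECT id, name, ssn, region FROM customers WHERE region = 'EMEA'").fetchall()
ids = [row[0] for row in subset_customers]

# 2. ... then follow the foreign key so child rows stay consistent.
placeholders = ",".join("?" * len(ids))
subset_invoices = src.execute(
    f"SELECT id, customer_id, amount FROM invoices WHERE customer_id IN ({placeholders})",
    ids).fetchall()

# 3. Mask sensitive columns as the rows are extracted (keep last four SSN digits).
masked = [(cid, name, '***-**-' + ssn[-4:], region)
          for (cid, name, ssn, region) in subset_customers]

print(masked)           # customer rows with SSNs scrambled
print(subset_invoices)  # only invoices whose parent customer survived the subset
```

With two tables and one foreign key this is trivial; with hundreds of constraints, application-tier dependencies and cross-database links, hand-writing and maintaining the equivalent logic is where the internal-build approach breaks down.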
Business modernization programs typically focus on process standardization to gain the benefits of efficient, repeatable, measurable processes. Enterprise resource planning (ERP) technologies fulfill the process standardization requirements and have now become a central point for the management of business processes. However, ERP systems do not prevent low-quality data from entering the system, nor do they measure its impact on the efficiency of a business process. Most organizations today are using the same ERP systems (SAP or Oracle), configured by the same consultancies. Therefore, an organization’s uniqueness and scope for competitive advantage are defined by its people and its data. (more…)