Tag Archives: Archiving
Right before Christmas, I was delighted to read about the proposed merger between the New York Stock Exchange and Intercontinental Exchange. ICE and NYSE are customers that we on the Informatica Ultra Messaging team have been working with for several years. NYSE Technologies leveraged our high-performance messaging as part of its direct feeds market data solution, which lowered latencies for dozens of Wall Street firms around the globe.
In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:
- Vision and Business Case to deliver business value
- Tools and Architecture to support architectural scope of data governance
- Policies that make up the data governance function (security, archiving, etc.)
- Measurement: gauging both the level of influence of a data governance initiative and its effectiveness, using business value and ROI metrics such as increased revenue, improved operational efficiency, reduced risk, lower cost, or improved customer satisfaction
- Change Management: incentives for the workforce, partners, and customers to get better-quality data in, and potential repercussions when data quality falls short
- Organizational Alignment: how the organization will work together across silos
- Dependent Processes: identifying data lifecycles (capturing, reporting, purchasing, and updating data in your environment), all processes that consume the data, and the processes that store and manage it
- Program Management: effective program management skills to build out communication strategy, measurement strategy and a focal point to escalate issues to senior management when necessary
- Defined Processes that make up the data governance function (discovery, definition, application, and measurement and monitoring).
For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.
Alternative Methods of Managing Data Growth and Best Practices for Using Them as Part of an Enterprise Information Lifecycle Management Strategy
Data, whether manually created or machine generated, tends to live on forever, because people hold on to it for fear of losing information by destroying it.
There is a saying in Bhagavad Gita:
jaathasya hi dhruvo mr.thyur dhr.uvam janma mr.thasya cha |
thasmaad aparihaarye’rthe’ na thvam sochithum-arhasi ||
“For death is certain to one who is born; to one who is dead, birth is certain; therefore, thou shalt not grieve for what is unavoidable.”
Both partitioning and archiving are methods of improving database and application performance. Depending on a database administrator’s comfort with one technology or method over another, either partitioning or archiving could be implemented to address performance issues caused by data growth in production applications. But what are the best practices for using one or the other method, and how can they be used better together?
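To make the distinction concrete, here is a minimal, hypothetical sketch of the two ideas working together: rows are routed into date-range “partitions,” and an archive step moves whole partitions older than a retention cutoff out of the active store. The bucket structure, function names, and cutoff are illustrative assumptions for this sketch; real databases implement range partitioning natively, and archiving products move the data to a separate, compressed store.

```python
from datetime import date

def partition_key(row):
    # Route a row to a partition by the year of its transaction date.
    return row["txn_date"].year

def insert(partitions, row):
    # Append the row to its date-range partition, creating it on first use.
    partitions.setdefault(partition_key(row), []).append(row)

def archive_old_partitions(partitions, archive, cutoff_year):
    # Move entire partitions older than the retention cutoff to the
    # archive store, shrinking the active data set the database scans.
    for year in [y for y in partitions if y < cutoff_year]:
        archive[year] = partitions.pop(year)

partitions, archive = {}, {}
insert(partitions, {"id": 1, "txn_date": date(2009, 3, 1)})
insert(partitions, {"id": 2, "txn_date": date(2012, 6, 15)})
archive_old_partitions(partitions, archive, cutoff_year=2011)
```

The point of the sketch is the complementarity: partitioning keeps queries fast by limiting what the active store scans, while archiving caps the active store’s growth by relocating partitions that are past their retention window.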
Eliminating Up To 95% Of Legacy Costs As Part Of Your Journey To The Cloud, With Application Decommissioning And Archiving
We’re all familiar with those legacy applications that no longer add value but still absorb significant costs. These redundant applications may be left behind after mergers and acquisitions, IT consolidation, business modernization, application migration, or a move to a cloud-based or software-as-a-service environment. If you are an EMC customer, you may be undertaking projects to consolidate your IT stack to increase efficiency and moving gradually toward a private or hybrid cloud environment. As you virtualize, re-platform, and migrate your hardware and software, what do you do with the old applications that are left behind?
Following a merger and acquisition (M&A), there is usually a focus on consolidating the two companies’ IT systems, leaving behind many redundant legacy applications. Until those legacy applications are shut down, you haven’t realized the cost savings of the consolidation. However, those old applications may contain data that is no longer used for daily operations but needs to be retained for regulatory compliance. Keeping those applications up and running just to retain the data within them introduces operational, business, and legal risks. It is likely that the IT staff with expertise in those applications are no longer with the company, and without them it may be difficult or impossible to access the data in a meaningful way, in the time required, for an audit or eDiscovery request.
Backup and archiving software for databases are often confused with each other. Customers often use backup for the purposes of archiving and vice versa. A recent survey conducted by Symantec indicates that 70% of enterprises are misusing backup, recovery, and archiving practices. The survey shows that 70% of enterprises use their backup software to implement legal holds, and 25% preserve the entire backup set indefinitely. Respondents also said 45% of their backup storage is due to legal holds. Additionally, nearly half of the enterprises surveyed are improperly using their backup and recovery software for archiving.
So what are the differences between the two types of solutions? What should each be used for and how are they complementary?
I recently attended HP’s Software Universe, and a big theme of the conference was “winning the war of managing application performance.” Having spent time walking the solutions showcase floor and speaking with attendees and SI partners, I can say this is still a really big deal. As production applications at the core of the business continue to grow, organizations face a significant and costly challenge that will only get worse.
Another audience in attendance was the QA and testing teams responsible for ensuring the quality of production applications; for them, building and protecting realistic test environments for internal applications is another huge challenge. These team members need to sub-set data and create test environments without impacting production system performance or requiring a duplicate hardware footprint. Masking and protecting the data once it’s pulled from production is also a necessary step in keeping control of the information housed within these critical systems.
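The masking step described above can be sketched in a few lines. This is a hypothetical illustration, not Informatica’s product logic: sensitive fields in a sub-set row are replaced with a keyed hash, so the same production value always masks to the same token and referential integrity across sub-set tables survives. The field names and salt are assumptions made up for the example.

```python
import hashlib

# Illustrative only: in practice the salt would be a managed secret,
# not a literal in source code.
SALT = b"test-env-salt"

def mask(value: str) -> str:
    # Deterministic, one-way token: same input -> same token,
    # so masked keys still join consistently across tables.
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return digest[:12]

def mask_row(row: dict, sensitive_fields=("name", "ssn")) -> dict:
    # Copy the row, masking only the fields flagged as sensitive.
    return {k: mask(v) if k in sensitive_fields else v
            for k, v in row.items()}

prod_row = {"id": 42, "name": "Jane Doe", "ssn": "123-45-6789"}
masked = mask_row(prod_row)
```

Determinism is the design choice worth noting: a random replacement would also hide the original value, but it would break joins between masked tables in the test environment.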
It was refreshing to hear these challenges from real practitioners trying to solve problems for some of the largest organizations in the world. Their pain validated the need for Application ILM solutions. These are real production-impacting issues that, if not addressed, will have huge cost and productivity impacts. If you’re an Informatica partner or practitioner, expanding your knowledge of these new offerings might make you a hero! I urge you to take a look.