Tag Archives: risk
It’s official: “big data” is here to stay, and the solutions, concepts, hardware and services that support these massive implementations will continue to grow at a rapid pace. However, every organization has its own definition of big data and of the role it plays. One area where we are seeing a lot of activity is “big transaction data” for OLTP and relational databases, because relational databases often lack the data management capabilities needed to scale transactional applications into the high terabytes and petabytes. In this post we will explore ways your existing OLTP system can scale without crushing IT and your budget in the process. (more…)
After having lived in China since May 2012, I’ve been fortunate to have met with the leaders of most multinational software companies, leaders of local firms as well as industry analysts. My inspiration for this blog is based on a conversation I had with a senior leader at a leading “cloud” provider. (more…)
Adam Wilson, General Manager of ILM at Informatica, talks about the next frontier of data security. The more data that is passed around internally, the greater your company’s risk of a data breach. Find out why auditors are taking a closer look at the number of internal data copies floating around and what it means for your company’s risk of a data leak.
In a recent InformationWeek blog, “Big Data A Big Backup Challenge”, George Crump aptly pointed out the problems of backing up big data and outlined some best practices that should be applied to address them, including:
- Identifying which data can be re-derived and therefore doesn’t need to be backed up
- Eliminating redundancy through file de-duplication and data compression
- Using storage tiering, combining online disk and tape, to reduce storage costs and optimize performance (more…)
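The de-duplication practice above can be sketched as a content-hash scan over the backup set; this is a minimal illustration under my own assumptions (function name, chunk size), not Informatica's or Crump's actual approach:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicate_files(root):
    """Group files under `root` by SHA-256 content hash; any group with
    more than one path is a candidate for de-duplication before backup."""
    by_hash = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            # Hash in 1 MB chunks so large files don't load into memory.
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)
    return {digest: paths for digest, paths in by_hash.items() if len(paths) > 1}
```

Production backup tools typically de-duplicate at the block level rather than the file level, but the principle of identifying identical content by hash is the same.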
Data is the Answer, Now What’s the Question? Hint: It’s the Key to Optimizing Enterprise Applications
Data quality improvement isn’t really anything new; it’s been around for some time now. Fundamentally the goal of cleansing, standardizing and enriching enterprise data through data quality processes remains the same. What’s different now, however, is that in an increasingly competitive marketplace and in difficult economic times, a complete enterprise data quality management approach can separate the leaders from the laggards. With a sound approach to enterprise data quality management, organizations reap the benefits of turning enterprise data into a key strategic asset. This helps to increase revenue, eliminate costs and reduce risks. Using the right solution, organizations can leverage data in a way never possible before, holistically and proactively, by addressing data quality issues when and where they arise. Doing so ensures key IT initiatives, like business intelligence, master data management, and enterprise applications, deliver on their promises of better business results. (more…)
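As a minimal sketch of what the cleansing and standardization step can look like in practice (the field names and rules here are hypothetical illustrations, not any vendor's actual logic):

```python
import re

def standardize_record(record):
    """Apply simple, illustrative data-quality rules to a customer record:
    collapse whitespace, normalize name casing, and canonicalize a US
    phone number to the (XXX) XXX-XXXX format."""
    out = dict(record)
    out["name"] = " ".join(record.get("name", "").split()).title()
    digits = re.sub(r"\D", "", record.get("phone", ""))
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]  # drop the US country code
    if len(digits) == 10:
        out["phone"] = "({}) {}-{}".format(digits[:3], digits[3:6], digits[6:])
    return out
```

Enterprise data quality platforms apply far richer rule sets (address verification, reference-data enrichment, survivorship), but the pattern of rule-driven normalization at the point of entry is the same.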
Richard Cramer, Chief Healthcare Strategist at Informatica talks about protecting healthcare data in non-production testing environments.
In my last blog post, I commented that organizations that monitor for and react to product/service-related life cycle events can proactively manage customer retention, and suggested that quality data was critical to meet those retention objectives. I can share a concrete example from a discussion I had with a C-level executive at an office supplies company. His organization religiously tracked paper and ink buying patterns for those customers that had purchased a printer. His retention scheme relied on accurate knowledge of the client and the products the client had purchased.
He made a number of assumptions based on knowledge of each printer’s service life cycle. For example, he estimated the size of the business from each printer’s number of pages printed per month. In turn, he was able to anticipate when the client was about to run out of paper or needed new ink cartridges. By proactively contacting the customer about three weeks before he expected them to need more paper or ink, he was able to capture the follow-on sales, increase customer satisfaction, and consequently extend customer lifetime. (more…)
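The executive's timing logic can be sketched as simple arithmetic; the function name, figures, and three-week lead time below are illustrative assumptions, not the company's actual model:

```python
def days_until_reorder_contact(sheets_on_hand, pages_per_month, lead_time_days=21):
    """Estimate how many days from now to contact the customer:
    days until the paper supply runs out, minus a proactive lead time
    (about three weeks in the example above)."""
    pages_per_day = pages_per_month / 30.0
    days_until_empty = sheets_on_hand / pages_per_day
    return max(0, round(days_until_empty - lead_time_days))
```

Note that the estimate is only as good as the underlying customer and product data: a wrong printer model or stale purchase record throws off `pages_per_month`, which is exactly why data quality underpins the retention scheme.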
The Aite Group recently surveyed senior treasury and receivables managers at 80 top US corporations. Only about half of the transactions at these firms were processed straight through, so there remains significant opportunity to reduce costs and mitigate risk by increasing straight-through processing (STP) of transactions. In addition, banks can help their corporate clients get better information about their payments and postings. For example, treasury managers need to know where they can invest excess cash and where funds sit throughout the organization so that they can borrow internal cash.
At SIBOS 2010 (October 25-29 in Amsterdam), Informatica will discuss how the Informatica SWIFT Integration solution enables your customers to achieve true STP and get better information about their payments. This is especially critical given the new regulations and industry changes reflected in the latest SWIFT standards release. With Informatica’s 2010 SWIFT-certified solution, organizations can: (more…)
Suppose AIG had had in place a world-class counterparty and customer risk management program. Would it still have suffered its epic failure and needed a $173 billion bailout to avert a potential global collapse of the financial system?
It’s an intriguing question. AIG’s failure to accurately assess and understand its counterparty exposure clearly contributed to its meltdown and its staggering $62 billion loss in Q4 2008, and was among the culprits behind the recent recession. It also drove home the critical need for financial institutions to have in place a rock-solid counterparty risk management program. (more…)