Tag Archives: webinar
Last month, the CEO of Deloitte said that CFOs are “the logical choice to own analytics and put them to work to serve the organization’s needs”. In my discussions with CFOs, they have expressed similar opinions. Given this, the question becomes: what does a CFO need to do to be an effective leader of their company’s analytics agenda? To answer this, I took a look at what Tom Davenport suggests in his book “Analytics at Work”. In it, Tom suggests that an analytical leader needs to do the following twelve things to be effective:
12 Ways to Be an Effective Analytics Leader
1) Develop their people skills. This is not just about managing analytical people, which has its own challenges. It is also about CFOs establishing “the credibility and trust needed when analytics produce insights that effectively debunk currently accepted wisdom”.
2) Push for fact-based decision making. You need to, as a former boss of mine liked to say, become the lightning rod – in this case, by setting the expectation that people will make decisions based upon data and analysis.
3) Hire and retain smart people. You need to provide a stimulating and supportive work environment for analysts and give them credit when they do something great.
4) Be the analytical example. You need to lead by example. This means you need to use data and analysis in making your own decisions.
5) Sign up for improved results. You need to commit to driving improvements in a select group of business processes by using analytics. Pick something meaningful like reducing the cost of customer acquisition or optimizing your company’s supply chain management.
6) Teach the organization how to use analytic methods. Guide employees and other stakeholders into using more rigorous thinking and decision making.
7) Set strategies and performance expectations. Analytics and fact-based decisions cannot happen in a vacuum. They need strategies and goals that analytics help achieve.
8) Look for leverage points. Look for the business problems where analytics can make a real difference. Look for places where a small improvement in a process driven by analytics can make a big difference.
9) Demonstrate persistence. Work doggedly and persistently to apply analytics to decision making, business processes, culture, and business strategy.
10) Build an analytics ecosystem with your CIO. Build an ecosystem consisting of other business leaders, employees, external analytics suppliers, and business partners. Use them to help you institutionalize analytics at your company.
11) Apply analytics on more than one front. No single initiative will make the company more successful, and no single analytics initiative will do so either.
12) Know the limits to analytics. Know when it is appropriate to use intuition instead of analytics. As a professor of mine once said, not all elements of business strategy can be solved by using statistics or analytics. You should know where and when analytics are appropriate.
Following these twelve items will help strategically oriented CFOs lead the analytics agenda at their companies. As I indicated in “Who Owns the Analytics Agenda?”, CFOs already typically act as data validators at their firms, but taking this next step matters to their enterprises because “if we want to make better decisions and take the right actions, we have to use analytics” (Analytics at Work, Tom Davenport, Harvard Business Review Press, page 1). Given this, CFOs really need to get analytics right. The CFOs that I have talked to say they already “rely on data and analytics and they need them to be timely and accurate”.
One CFO, in fact, said that data is “potentially the only competitive advantage left for his firm”. And while implementing the data side of this depends on the CIO, it is clear from the CFOs that I have talked to that they believe a strong business relationship with their CIO is critical to the success of their business.
So the question remains: are you ready as a financial leader to lead on the analytics agenda? If you are, and you want to learn more about setting the analytics agenda, please consider yourself invited to the webinar that I am doing with the CFO of RoseRyan in January.
If you use production data in test and development environments, or are looking for alternative approaches, register for the first webinar in a three-part series on data security gaps and remediation. On December 9th, Adrian Lane, Security Analyst at Securosis, will join me to discuss security for test environments.
This webinar will focus on how data-centric security can be used to shore up vulnerabilities in one of the key focus areas: test and development environments. It is common practice to create non-production database environments by making copies of production data, potentially exposing sensitive and confidential production data to developers, testers, and contractors alike. Commonly, 6-10 copies of production databases are created for each application environment, and they are regularly provisioned to support development, testing, and training efforts. Since the security controls deployed for the source database are not replicated in the test environments, this is a glaring hole in data security and a target for external or internal exploits.
In this webinar, we will cover:
- Key trends in enterprise data security
- Vulnerabilities in non-production application environments (test and development)
- Alternatives to consider when protecting test and development environments
- Priorities for enterprises in reducing attack surface for their organization
- Compliance and internal audit cost reduction
- Data masking and synthetic data use cases
- Informatica Secure Testing capabilities
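Data-centric protection of test copies usually means transforming sensitive values before they leave production. As a minimal sketch of the masking idea (not Informatica’s API – the function names, the hashing scheme, and the secret are all illustrative), deterministic masking replaces real values with realistic fakes while keeping formats and referential integrity intact:

```python
import hashlib

SECRET = "demo-secret"  # illustrative only; a real deployment would manage this key securely

def mask_ssn(ssn: str, secret: str = SECRET) -> str:
    # Deterministic: the same input always maps to the same fake value,
    # so joins across masked copies still line up (referential integrity).
    digest = hashlib.sha256((secret + ssn).encode()).hexdigest()
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"  # preserve the NNN-NN-NNNN format

def mask_email(email: str, secret: str = SECRET) -> str:
    local, _, _domain = email.partition("@")
    token = hashlib.sha256((secret + local).encode()).hexdigest()[:8]
    return f"user_{token}@example.com"  # realistic shape, no real address

# Hypothetical row standing in for a production record copied into a test environment.
row = {"name": "Jane Doe", "ssn": "123-45-6789", "email": "jane@corp.example"}
masked = {**row, "ssn": mask_ssn(row["ssn"]), "email": mask_email(row["email"])}
```

Because the masking is keyed and one-way, testers see plausible data while the original values never reach the non-production copy.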
Register for the webinar today at http://infa.media/1pohKov. If you cannot attend the live event, be sure to watch the webinar on-demand.
The MDM space is filled by a number of well-known players. In addition to Informatica, you’ll also find IBM, Oracle, SAP, and a host of other vendors. But who’s the leader of this space?
The Information Difference is an analyst firm that specializes in MDM, and this firm has been watching the MDM space for years. Recently, The Information Difference compared 12 different MDM vendors, ranked them using 200 criteria, and published their findings in a report of the MDM landscape for Q2, 2013. The firm compared the vendors’ MDM offerings across 6 categories: data governance, business rules, data quality, data storage, data provision, and data movement.
Informatica recently hosted a webinar with Cognizant who shared how they streamline test data management processes internally with Informatica Test Data Management and pass on the benefits to their customers. Proclaimed as the world’s largest Quality Engineering and Assurance (QE&A) service provider, they have over 400 customers and thousands of testers and are considered a thought leader in the testing practice.
We polled over 100 attendees on what their top challenges were with test data management considering the data and system complexities and the need to protect their client’s sensitive data. Here are the results from that poll:
It was not surprising to see that generating test data sets and securing sensitive data in non-production environments were tied as the top two biggest challenges. Data integrity/synchronization was a very close third.
Cognizant, together with Informatica, has been evolving its test data management offering to focus not only on securing sensitive data but also on improving testing efficiencies by identifying, provisioning, and resetting test data – tasks that consume as much as 40% of testing cycle times. Key components of the next generation test data management platform include:
Sensitive Data Discovery – an integrated and automated process that searches data sets looking for exposed sensitive data. Sensitive data often resides in test copies unbeknownst to auditors. Once located, the data can be masked in non-production copies.
Persistent Data Masking – masks sensitive data in-flight while cloning data from production or in-place on a gold copy. Data formats are preserved while original values are completely protected.
Data Privacy Compliance Validation – auditors want to know that data has in fact been protected, so the ability to validate and report on data privacy compliance becomes critical.
Test Data Management – in addition to creating test data subsets, clients require the ability to synthetically generate test data sets, reducing defects by aligning data sets to each test case. Also, in many cases, multiple testers work on the same environment and may clobber each other’s test data sets, so the ability to reset test data becomes a key requirement for improving efficiencies.
Figure 2 Next Generation Test Data Management
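Two of the components above – synthetic test data generation and test data reset – can be illustrated with a small sketch (the names and data here are invented, not part of Cognizant’s or Informatica’s products): seeded generation produces reproducible data sets containing no production values, and a baseline copy lets a tester reset after a destructive run:

```python
import copy
import random

def generate_customers(n: int, seed: int = 42) -> list:
    # Seeded generation: fully reproducible synthetic rows, no production values involved.
    rng = random.Random(seed)
    names = ["Ana", "Ben", "Chen", "Dita"]  # invented sample values
    return [
        {"id": i, "name": rng.choice(names), "balance": round(rng.uniform(0, 1000), 2)}
        for i in range(1, n + 1)
    ]

class ResettableDataSet:
    """Keeps a pristine baseline so a tester can reset after a destructive test run."""

    def __init__(self, rows):
        self._baseline = copy.deepcopy(rows)
        self.rows = copy.deepcopy(rows)

    def reset(self):
        # Restore the data set to its originally provisioned state.
        self.rows = copy.deepcopy(self._baseline)

dataset = ResettableDataSet(generate_customers(3))
```

Aligning a seed per test case gives each tester a private, repeatable data set, which is the efficiency gain the poll respondents were missing with hand-rolled scripts.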
When asked what tools or services they had deployed, 78% said in-house developed scripts/utilities. This is an incredibly time-consuming approach and one that has limited repeatability. Data masking had been deployed by almost half of the respondents.
Informatica and Cognizant are leading the way in establishing a new standard for Test Data Management by incorporating test data generation, data masking, and the ability to refresh or reset test data sets. For more information, check out Cognizant’s offering based on Informatica, TDMaxim, and the white paper Transforming Test Data Management for Increased Business Value.
A few weeks ago I told you about a few webinars that can help you to dig deeper into MDM. One, entitled How to Integrate On-Premise and Cloud MDM Hubs Using Federated MDM Architecture, which took place last month, had over a hundred attendees, and they asked a lot of great questions.
Here, I’ll take an opportunity to answer a few:
Reposted with permission from Shahid Shah’s healthcare IT, EMR, EHR, PHR, medical content, and document management advisory service. Enjoy.
Join me for a free webinar on “Understanding the Escalating Data Challenges of Meaningful Use” on Thursday, April 7th
I’ve been doing a good deal of coaching and consulting on what Meaningful Use really means to technology professionals lately so I was pleased to accept an invitation by Informatica to lead a webinar on that subject for a data management audience.
Data management professionals and the executives that they report to have now had enough time to learn how difficult meeting the escalating requirements for MU actually is; most are reporting that it’s been more work than they thought. Gone are the days when health systems thought they could just install a certified EHR and they would be able to meet the MU goals. Everyone now understands that even if they’re able to collect the measures required in the first phase of MU, the escalating data challenges of later phases will be more difficult.
I’m looking forward to doing a webinar on data virtualization this Thursday, April 22nd. Why? Because data virtualization is one of the most beneficial concepts in architecture, including SOA, and it’s often overlooked by the rank-and-file developers and architects out there. I’m constantly evangelizing its benefits, including integrating data from many different data sources in real time and enabling query-based applications to get data from multiple systems.
The idea is pretty simple, really. Considering that there are many physical database schemas within most enterprises, and typically no common view of the data, data virtualization allows you to map many physical schemas to virtual schemas that are a better representation of the business – for example, a single view of customer data, sales data, and other data that has the same logical meaning but may be scattered among many different physical database systems, using any number of implementation models.
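A toy illustration of that mapping (using SQLite in-memory databases as stand-ins for separate physical systems; the schemas and names are invented for this sketch): two differently shaped sources are presented as one virtual “customer” view.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Two "physical" sources with different schemas, attached as separate databases.
conn.execute("ATTACH DATABASE ':memory:' AS crm")
conn.execute("ATTACH DATABASE ':memory:' AS billing")
conn.execute("CREATE TABLE crm.contacts (cust_id INTEGER, full_name TEXT)")
conn.execute("CREATE TABLE billing.accounts (acct_no INTEGER, owner TEXT)")
conn.execute("INSERT INTO crm.contacts VALUES (1, 'Jane Doe')")
conn.execute("INSERT INTO billing.accounts VALUES (2, 'John Roe')")
# The virtual schema: one logical 'customer' view mapped over both physical schemas.
# (A TEMP view is used because SQLite only lets temporary views span attached databases.)
conn.execute("""
    CREATE TEMP VIEW customer AS
        SELECT cust_id AS id, full_name AS name, 'crm' AS source FROM crm.contacts
        UNION ALL
        SELECT acct_no AS id, owner AS name, 'billing' AS source FROM billing.accounts
""")
rows = conn.execute("SELECT id, name, source FROM customer ORDER BY id").fetchall()
```

Queries go against the business-shaped view, while the physical column names and locations stay hidden behind the mapping – which is exactly the decoupling a data virtualization layer provides at enterprise scale.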
Last week I participated in ebizQ’s “Cloud QCamp”, which included a podcast and a best-practice cloud integration webinar with Amazon.com and Dave Linthicum. Kurt Messersmith from Amazon Web Services gave an overview of Amazon’s “on-demand infrastructure for hosting web-scale solutions” and provided some enlightening statistics on its bandwidth usage and growth since it was introduced in 2006. He also reviewed the key attributes of cloud computing and some of the diverse enterprise use cases for infrastructure as a service (IaaS).
December 14, 15 or 17, 2009
Next Generation Networks for Messaging
10 Gigabit Ethernet, InfiniBand, and Related Technologies
Prices for technologies like 10 Gigabit Ethernet and InfiniBand are dropping rapidly, and these new technologies are shifting performance bottlenecks. With this in mind, we will be discussing the new networks as well as supporting technologies like kernel bypass.
Did you miss it?
Play the full recorded Webinar here: