Category Archives: B2B
In honor of International Women’s Day 2015, Informatica is celebrating female leadership in a blog series. Every day this week, we will showcase a new female leader at Informatica, each of whom will share her perspective on what it’s like to be a woman in the tech industry.
Area Vice President for US Financial Services
Women bring different perspectives to all situations. Sometimes it’s welcomed and sometimes it’s not embraced because it’s outside of the norm – which actually makes it of even greater value. The technology field is still dominated by men. Many times I am the only woman in the room and most times I am the only woman in a position of leadership; this also presents a unique opportunity to stand out.
It has been a long haul to get to where I am today, with many stories of great successes and some real misses. Those “misses” always lead me to vow “never again” and to excel in new ways. Today I have the confidence that my “different perspective” will change the course of things. I am unapologetic about my ideas or directions, even though they may be contrary to the norm. I am grateful to have been given the chance to work personally with leaders in the tech industry, and to have been given opportunities that stretched me beyond what I believed my own capabilities to be, only to realize I am more than capable and able to move beyond them.
As a leader, the most critical aspect is to build a management team that complements my own skills.
- Diversity in talent is important so that we leverage each other’s strengths.
- Open communication is a critical aspect to my leadership and success.
- Honoring and respecting what each individual brings to the team, while giving just enough guidance, ensures forward momentum, ultimate buy-in to what we do, and consistent contributions – all of which is the foundation for great success.
Advice for women:
Put yourself out there. Be bold. Take your seat at the table in a big way! Have mentors. Know who to trust (and who not). Create a network of colleagues. Always help other women succeed.
Thoughts about Informatica’s culture:
Informatica is a great place to be. I wake up every day, passionate about what I do and excited about the difference that my team and I are making for our company and for our customers. We have a culture of integrity, excellence and innovation. This translates into everything we do, as we are critical to so many transformational initiatives for our customers. Who we are has been the basis for how we have delivered for our customers time and time again. Now with the future being all things ‘data’, Informatica will be instrumental in helping our customers leapfrog to their next level of growth. This is the best time to be part of Informatica.
With Informatica’s Data Mania on Wednesday, I’ve been thinking a lot lately about REST APIs. In particular, I’ve been considering how and why they’ve become so ubiquitous, especially for SaaS companies. Today they are the prerequisite for any company looking to connect with other ecosystems, accelerate adoption and, ultimately, separate themselves from the pack.
Let’s unpack why.
To trace the rise of the REST API, we’ll first need to take a look at the SOAP web services protocol that preceded it. SOAP is still very much in play and remains important to many application integration scenarios. But it doesn’t receive much use or love from the thousands of SaaS applications that just want to get or place data with one another or in one of the large SaaS ecosystems like Salesforce.
Why this is the case has more to do with the needs and demands of a SaaS business than it does with the capabilities of SOAP web services. SOAP, as it turns out, is perfectly fine for making and receiving web service calls, but it does require work on behalf of both the calling application and the producing application. And therein lies the rub.
SOAP web service calls are by their very nature incredibly structured arrangements, with specifications that must be clearly defined by both parties. Only after both the calling and producing application have their frameworks in place can the call be validated. While the contract within SOAP WSDLs makes SOAP more robust, it also makes it too rigid, and less adaptable to change. But today’s apps need a more agile and more loosely defined API framework that requires less work to consume and can adapt to the inevitable and frequent changes demanded by cloud applications.
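To make the contrast concrete, here is a rough sketch of what a SOAP client must assemble for even a trivial call, using only Python’s standard library. The service URL, operation name, and namespaces are all hypothetical; the point is that every element of this structure must match what the producer’s WSDL dictates before a single call can succeed.

```python
import urllib.request

# Illustrative only: a minimal SOAP 1.1 envelope for a hypothetical
# GetCustomer operation. Both parties must agree in advance on the
# namespaces, element names, and types defined in the WSDL contract.
soap_envelope = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetCustomer xmlns="http://example.com/crm">
      <CustomerId>42</CustomerId>
    </GetCustomer>
  </soap:Body>
</soap:Envelope>"""

def build_soap_request(url: str) -> urllib.request.Request:
    """Build the HTTP request a SOAP client must send: always a POST,
    an XML payload, and a SOAPAction header dictated by the WSDL."""
    return urllib.request.Request(
        url,
        data=soap_envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": "http://example.com/crm/GetCustomer",
        },
        method="POST",
    )

request = build_soap_request("http://example.com/crm/service.asmx")
```

If the producer renames an element or changes a namespace, this envelope silently stops validating, which is exactly the rigidity described above.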
Enter REST APIs
REST APIs are the perfect vehicle for today’s SaaS businesses and mash-up applications. Sure, they’re more loosely defined than SOAP, but when all you want to do is send and receive some data, now, in the context you need, nothing is easier or better for the job than a REST API.
With a REST API, the calls are mostly done as HTTP with some loose structure and don’t require a lot of mechanics from the calling application, or effort on behalf of the producing application.
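A minimal sketch of that looseness, using Python’s standard library and a hypothetical endpoint: the request is just a URL with query parameters, and the response is plain JSON the caller can consume directly, with no envelope or contract negotiation.

```python
import json
import urllib.parse

# A REST call is typically just an HTTP GET on a resource URL.
# The endpoint and field names here are hypothetical.
base = "https://api.example.com/v1/customers"
query = urllib.parse.urlencode({"status": "active", "limit": 10})
url = f"{base}?{query}"

# The producing application usually replies with plain JSON, which
# the calling application consumes directly.
sample_response = '{"customers": [{"id": 42, "name": "Acme Corp"}]}'
data = json.loads(sample_response)
first = data["customers"][0]
```

Compared with the SOAP workflow, there is nothing for either side to pre-negotiate beyond the URL shape and the JSON field names.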
SaaS businesses prefer REST APIs because they are easy to consume. They also make it easy to onboard new customers and extend the use of the platform to other applications. The latter is important because it is primarily through integration that SaaS applications get to become part of an enterprise business process and gain the stickiness needed to accelerate adoption and growth.
Without APIs of any sort, integration can only be done through manual data movement, which opens the application and enterprise up to the potential errors caused by fat-finger data movement. That typically will give you the opposite result of stickiness, and is to be avoided at all costs.
While publishing an API as a way to get and receive data from other applications is a great start, it is just a means to an end. If you’re a SaaS business with greater ambitions, you may want to consider taking the next step of building native connectors to other apps using an integration system such as Informatica Cloud. A connector can provide a nice layer of abstraction on the APIs so that the data can be accessed as application data objects within business processes. Clearly, stickiness with any SaaS application improves in direct proportion to the number of business processes or other applications that it is integrated with.
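The abstraction idea can be sketched in a few lines. This is an illustrative pattern only, not Informatica’s actual SDK: the connector maps raw API field names onto a stable application data object, so the business processes built on top never touch the API’s wire format.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    """An application data object, as a business process sees it."""
    customer_id: int
    name: str

class CustomerConnector:
    """Hypothetical connector: hides the raw REST payload behind a
    stable data object. If the API's field names change, only the
    mapping below changes; flows built on Customer keep working."""

    FIELD_MAP = {"customer_id": "cust_id", "name": "display_name"}

    def to_object(self, raw: dict) -> Customer:
        return Customer(
            **{attr: raw[field] for attr, field in self.FIELD_MAP.items()}
        )

connector = CustomerConnector()
customer = connector.to_object({"cust_id": 42, "display_name": "Acme Corp"})
```

When the upstream API changes structurally, the fix is confined to `FIELD_MAP`, which is the portability benefit described below.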
The Informatica Cloud Connector SDK is Java-based and enables you to easily cut and paste the code necessary to create the connectors. Informatica Cloud’s SDKs are also richer and make it possible for you to adapt the REST API into something any business user will want to use – which is a huge advantage.
In addition to making your app stickier, native connectors have the added benefit of increasing your portability. Without this layer of abstraction, direct interaction with a REST API that’s been structurally changed would be impossible without also changing the data flows that depend on it. Building a native connector makes you more agile, and inoculates your custom built integration from breaking.
Building your connectors with Informatica Cloud also provides you with some other advantages. One of the most important is entrance to a community that includes all of the major cloud ecosystems and the thousands of business apps that orbit them. As a participant, you’ll become part of an interconnected web of applications that make up the business processes for the enterprises that use them.
Another ancillary benefit is access to integration templates that you can easily customize to connect with any number of known applications. The templates abstract the complexity from complicated integrations, can be quickly customized with just a few composition screens, and are easily invoked using Informatica Cloud’s APIs.
The best part of all this is that you can use Informatica Cloud’s integration technology to become a part of any business process without stepping outside of your application.
For those interested in continuing the conversation and learning more about how leading SaaS businesses are using REST APIs and native connectors to separate themselves, I invite you to join me at Data Mania, March 4th in San Francisco. Hope to see you there.
What does it take to be an analytics-driven business? That’s a question that requires a long answer. Recently, Gartner research director Lisa Kart took on this question, noting that the key to becoming an analytics-driven business lies in breaking down data silos.
So, the secret of becoming an analytics-driven business is to bust down the silos — easier said than done, of course. The good news, as Kart tells it, is that one doesn’t need to cast a wide net across the world in search of the right data for the right occasion. The biggest opportunities are in connecting the data you already have, she says.
Taking Kart’s differentiation of just-using-analytics versus analytics-driven culture a step further, here is a brief rundown of how businesses just using analytics approach the challenge, versus their more enlightened counterparts:
Business just using analytics: Lots of data, but no one really understands how much is around, or what to do with it.
Analytics-driven business: The enterprise has a vision and strategy, supported from the top down, closely tied to the business strategy. Management also recognizes that existing data has great value to the business.
Business just using analytics: Every department does its own thing, with varying degrees of success.
Analytics-driven business: Makes connections between all the data – of all types — floating around the organization. For example, gets a cross-channel view of a customer by digging deeper and connecting the silos together to transform the data into something consumable.
Business just using analytics: Some people in marketing have been collecting customer data and making recommendations to their managers.
Analytics-driven business: Marketing departments, through analytics, engage and interact with customers, Kart says. An example would be creating high-end, in-store customer experiences that give customers greater intimacy and interaction.
Business just using analytics: The CFO’s staff crunches numbers within their BI tools and arrives at what-if scenarios.
Analytics-driven business: Operations and finance departments share online data to improve performance using analytics. For example, a company may tap into a variety of data, including satellite images, weather patterns, and other factors that may shape business conditions, Kart says.
Business just using analytics: Some quants in the organization pore over the data and crank out reports.
Analytics-driven business: Encourages maximum opportunities for innovation by putting analytics in the hands of all employees. Analytics-driven businesses recognize that more innovation comes from front-line decision-makers than the executive suite.
Business just using analytics: Decision makers put in report requests to IT for analysis.
Analytics-driven business: Decision makers can go to an online interface that enables them to build and display reports with a click (or two).
Business just using analytics: Analytics spits out standard bar charts, perhaps a scattergram.
Analytics-driven business: Decision makers can quickly visualize insights through 3D graphics, also reflecting real-time shifts.
Despite annual spending of more than $30 billion on Big Data, successful big data implementations elude most organizations. That’s the sobering assessment of a recent Capgemini study of 226 senior executives, which found that only 13 percent feel they have truly made any headway with their big data efforts.
The reasons for Big Data’s lackluster performance include the following:
- Data is in silos or legacy systems, scattered across the enterprise
- No convincing business case
- Ineffective alignment of Big Data and analytics teams across the organization
- Most data locked up in petrified, difficult to access legacy systems
- Lack of Big Data and analytics skills
Actually, there is nothing new about any of these issues – in fact, the perceived issues with Big Data initiatives so far map closely to the failed expectations of many other technology-driven initiatives. First, there’s the hype that tends to get way ahead of any actual well-functioning case studies. Second, there’s the notion that managers can simply take a solution of impressive magnitude and drop it on top of their organizations, expecting overnight delivery of profits and enhanced competitiveness.
Technology, and Big Data itself, is but a tool that supports the vision, well-designed plans and hard work of forward-looking organizations. Those managers seeking transformative effects need to look deep inside their organizations, at how deeply innovation is allowed to flourish, and in turn, how their employees are allowed to flourish. Think about it: if line employees suddenly have access to alternative ways of doing things, would they be allowed to run with it? If someone discovers through Big Data that customers are using a product differently than intended, do they have the latitude to promote that new use? Or do they have to go through chains of approval?
Big Data may be what everybody is after, but Big Culture is the ultimate key to success.
For its part, Capgemini provides some high-level recommendations for better baking in transformative values as part of Big Data initiatives, based on their observations of best-in-class enterprises:
The vision thing: “It all starts with vision,” says Capgemini’s Ron Tolido. “If the company executive leadership does not actively, demonstrably embrace the power of technology and data as the driver of change and future performance, nothing digitally convincing will happen. We have not even found one single exception to this rule. The CIO may live and breathe Big Data and there may even be a separate Chief Data Officer appointed – expect more of these soon – if they fail to commit their board of executives to data as the engine of success, there will be a dark void beyond the proof of concept.”
Establish a well-defined organizational structure: “Big Data initiatives are rarely, if ever, division-centric,” the Capgemini report states. “They often cut across various departments in an organization. Organizations that have clear organizational structures for managing rollout can minimize the problems of having to engage multiple stakeholders.”
Adopt a systematic implementation approach: Surprisingly, even the largest and most sophisticated organizations that do everything on process don’t necessarily approach Big Data this way, the report states. “Intuitively, it would seem that a systematic and structured approach should be the way to go in large-scale implementations. However, our survey shows that this philosophy and approach are rare. Seventy-four percent of organizations did not have well-defined criteria to identify, qualify and select Big Data use-cases. Sixty-seven percent of companies did not have clearly defined KPIs to assess initiatives. The lack of a systematic approach affects success rates.”
Adopt a “venture capitalist” approach to securing buy-in and funding: “The returns from investments in emerging digital technologies such as Big Data are often highly speculative, given the lack of historical benchmarks,” the Capgemini report points out. “Consequently, in many organizations, Big Data initiatives get stuck due to the lack of a clear and attributable business case.” To address this challenge, the report urges that Big Data leaders manage investments “by using a similar approach to venture capitalists. This involves making multiple small investments in a variety of proofs of concept, allowing rapid iteration, and then identifying PoCs that have potential and discarding those that do not.”
Leverage multiple channels to secure skills and capabilities: “The Big Data talent gap is something that organizations are increasingly coming face-to-face with. Closing this gap is a larger societal challenge. However, smart organizations realize that they need to adopt a multi-pronged strategy. They not only invest more in hiring and training, but also explore unconventional channels to source talent.” Capgemini advises reaching out to partner organizations for the skills needed to develop Big Data initiatives, whether through employee exchanges or “setting up innovation labs in high-tech hubs such as Silicon Valley.” Startups may also be another source of Big Data talent.
According to Strategy and Business, the “CFO role is expanding to include being the company’s premier champion of strategic discipline.” It is no wonder that financial transformations are so much in vogue these days. According to The Conference Board, 81% of the companies that it surveyed are involved in a major financial transformation initiative. However, only 27% of these firms claim that they achieved the benefits defined in their business case. Of the reasons for failure, the most interesting is thinking the transformation would be some kind of big bang. The problem is that this type of thinking is unrealistic for today’s hyper-competitive business environment. Financial strategy today needs to be an enabler of business strategy. This means it needs to support the increasingly shorter duration of business strategy.
Financial Transformation needs to Enable Business Agility
I have discovered the same thing in my discussions with IT organizations. In other words, enabling business strategies increasingly need to be built with a notion of agility. For financial strategies, this means that they need to first and foremost make organizations more agile and enable more continuous business change. Think about the impact on an organization that has inorganic acquisition as part of its strategy. All of this means that thinking a multi-year ERP implementation will, on its own, deliver financial transformation is unrealistic.
While it is absolutely fair to determine which manual tasks financial teams can eliminate, it does not make sense to think that they are done once an ERP implementation is completed. Recently, I was talking with a large accounting consulting and integration firm, who told me that they really liked doing large ERP implementations and re-implementations, but they also knew that these would soon break under the weight of financial and business change unless flexibility was built in from the start. Financial transformation must start by creating the business flexibility and agility needed to work in today’s business environment.
Does Your Close Process Get in the Way?
But achieving better financial agility and profitability improvement capabilities is often limited by the timeliness and trustworthiness of data. This is why CFOs say they spend so much of their time on the close process. According to the MIT CFO Summit Survey, nearly half of the organizations surveyed are feeling pressure from senior leadership to become more data-driven and analytical. Data clearly limits the finance function’s ability to guide corporate executives, business-unit managers, and sales and marketing functions in ways that ensure profitable business growth.
Financial Transformations Need to Fit Business Strategy
At the same time, it cannot be stressed enough that successful financial transformations need to be designed to fit with the company’s larger business strategy. The Conference Board suggests financial organizations should put real emphasis upon transformations that grow the business. Jonathan Byrnes at the MIT Sloan School has suggested that “the most important issue facing most managers …is making more money from their existing businesses without costly new initiatives”. In Byrnes’ cross-industry research, he found that 30% or more of each company’s businesses are unprofitable. Byrnes claims these losses are offset by what are “islands of high profitability”. The root cause of this issue, he asserts, is the inability of current financial and management control systems to surface profitability problems and opportunities for investment to accelerate growth. For this reason, financial transformations should, as a business goal, make it easier to evaluate business profitability.
In a survey from CFO magazine, nearly all respondents said their companies are striving to improve profitability over the next year. 87% said their companies needed to analyze financial and performance data much more quickly if they were to meet business targets. However, only 12% said their finance organizations can respond to requests for financial reports and analysis from business managers in real or near-real time. At the same time, business managers expect finance staff to be able to tell the story behind the numbers — to integrate financial and operational data in ways that get at the drivers of improvement.
We Are Talking About More than Financial Decision Making
This means not just worrying about financial decision making, but ensuring that the right questions and the right insights are being provided for the business. As Geoffrey Moore has indicated, economies of scale and market clout are no longer the formidable barriers to entry that they once were. The allocation of resources must be focused on a company’s most distinctive business capabilities—those things that provide the enterprise its “right to win”. To be strategic, CFOs need to become critical champions of the capabilities system, making sure it gets the investment and consideration it needs. This accentuates the CFO’s ongoing role as a voice of reason in M&A—favoring acquisitions that fit well with the company’s capabilities system, and recommending divestitures of products and services that don’t.
Today, the CFO role is being transformed to increasingly be a catalyst for change. This involves helping companies focus on the business capabilities that drive value. CFOs are uniquely positioned to take on this challenge. They are the company leaders who combine strategic insight with a line of sight into business execution. Moreover, unlike other change agents, CFOs have the power of the purse. However, to do this, their financial transformations need to ensure business agility and improve both their own and the business’s ability to get and use data.
Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning
Are you ready to answer “Yes” to the questions:
a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”
I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.
The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20 with the theme “Make Data Work”. I want to focus this blog post on two topics related to making data work and business insights:
- How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
- How you can make data work in new and advanced ways for every user in your company.
Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.
“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”
Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects. The fully-managed cloud service for building predictive analytics solutions from Azure and the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud help overcome the challenges most users have in making their data work effectively and efficiently for analytics use cases.
The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.
The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.
Success in using machine learning requires not only great analytics models, but also an end-to-end cloud integration and data management capability that brings in a wide breadth of data sources, ensures that data quality and data views match the requirements for machine learning modeling, and an ease of use that facilitates speed of iteration while providing high-performance and scalable data processing.
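As a toy illustration of the data-quality gate described above (the field names and thresholds are invented, not part of any vendor product), rows can be checked for completeness and plausible ranges before they are handed to a machine learning model:

```python
# Hypothetical customer records; a real pipeline would pull these
# from the integrated data sources described above.
REQUIRED_FIELDS = ("customer_id", "age", "monthly_spend")

def is_model_ready(row: dict) -> bool:
    """Reject rows that are incomplete or outside expected ranges,
    so only trustworthy data reaches the model."""
    if any(row.get(f) is None for f in REQUIRED_FIELDS):
        return False
    return 0 < row["age"] < 120 and row["monthly_spend"] >= 0

raw_rows = [
    {"customer_id": 1, "age": 34, "monthly_spend": 120.0},
    {"customer_id": 2, "age": None, "monthly_spend": 80.0},  # incomplete
    {"customer_id": 3, "age": 230, "monthly_spend": 55.0},   # out of range
]
clean_rows = [r for r in raw_rows if is_model_ready(r)]
```

Checks like these are trivially simple, but scaling them across many sources and iterating on them quickly is exactly where the data management tooling earns its keep.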
For example, the Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach and support advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.
Using the Informatica Cloud solution on Azure connector with Informatica Cloud Data Integration enables optimized read-write capabilities for data to blobs in Azure Storage. Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry leading cloud integration solution.
As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.
My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!
To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.
You might think this year’s Valentine’s Day is no different than any other. But make no mistake – Valentine’s Day has never been more powered by technology or more fueled by data.
It doesn’t get a lot of press coverage, but did you know the online dating industry is now a billion dollar industry globally? As technology empowers us all to live and work in nearly any place, we can no longer rely on geographic colocation to find our friends and soul mates. So, it’s no surprise that online dating and social networking grew in popularity as the smartphone revolution happened over the last eight years. Dating and networking are no longer just about running into people at the local bar on a Friday night. People are connecting with one another on every consumer device they have available, from their computers to their tablets to their phones. This mass consumerization of devices and the online social applications we run on them have fundamentally changed how we all connect with one another in the offline world.
There’s a lot of talk about big data in the tech industry, but there isn’t a lot of understanding of how big data actually affects the real world. Online dating serves as a fantastic example here. Did you know about 1 out of 6 people in the United States is single and that about 1 of 3 of them are members of an online dating site? With tens of millions of members just in the United States, the opportunity to meet new people online has never been more compelling. But the challenge is helping people sift through a “big data” store of millions of profiles. The solution to this has been the use of predictive recommendation engines. We are all familiar with recommendation engines on e-commerce sites that give us suggestions for new items to buy. The exact same analytics are now being applied to people and their preferences to help them find new friends and companions. Big data analytics is not just some fancy technology used by retailers and Wall Street. The proof is in the math: 1 of 18 people in the United States is using big data analytics today to fulfill the most human needs of all in finding companionship.
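A toy version of such a recommendation engine, with invented profiles: each profile is a vector of preference scores, and candidates are ranked by cosine similarity. Real matching engines use far richer features and models, but the underlying math is the same.

```python
import math

def cosine(a, b):
    """Cosine similarity between two preference vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical preference scores for (travel, sports, music, cooking).
profiles = {
    "alex":  [4, 2, 5, 2],
    "blake": [5, 1, 4, 1],
    "casey": [1, 5, 1, 5],
}
seeker = [5, 1, 5, 1]

# Rank candidate profiles by similarity to the seeker's preferences.
best_match = max(profiles, key=lambda name: cosine(seeker, profiles[name]))
```

Scaling this comparison from three profiles to tens of millions is precisely the “big data” problem the recommendation engines above are built to solve.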
However, this everyday revolution of big data analytics is not limited to online dating. US News and World Report estimates that a whopping $19 billion will be spent around Valentine’s Day this year. This spending includes gifts, flowers, candy, jewelry and other forms of celebration. The online sites that sell these products are no strangers to big data either. Organizations with e-commerce sites, many of whom are Informatica customers, are collecting real-time weblog and clickstream information to dynamically offer the best possible customer experiences. Big data is not only helping to build relationships between consumers, but is also building them between enterprises and consumers.
In an increasingly data-driven world, you cannot connect people unless you can connect data. The billion dollar online dating industry and the $19 billion Valentine’s Day industry would not exist if they were not fueled by the ability to quickly derive meaning from data assets and turn them into compelling analytical outcomes. Informatica is powering this data-driven world of connected data and connected people by making all types of data ready for analytics. Our leading technologies have helped customers collect data quickly, re-direct workloads easily, and perfect datasets completely. So on this Valentine’s Day, I invite you to connect with your customers and help them to connect with one another by first connecting with your data. There is no better way to help get ready for love than by getting big data ready for analytics!
Informatica users leveraging HDP are now able to see a complete end-to-end visual data lineage map of everything done through the Informatica platform. In this blog post, Scott Hedrick, director Big Data Partnerships at Informatica, tells us more about end-to-end visual data lineage.
Hadoop adoption continues to accelerate within mainstream enterprise IT and, as always, organizations need the ability to govern their end-to-end data pipelines for compliance and visibility purposes. Working with Hortonworks, Informatica has extended the metadata management capabilities in Informatica Big Data Governance Edition to include data lineage visibility of data movement, transformation and cleansing beyond traditional systems to cover Apache Hadoop.
Informatica users are now able to see a complete end-to-end visual data lineage map of everything done through Informatica, which includes sources outside Hortonworks Data Platform (HDP) being loaded into HDP, all data integration, parsing and data quality transformation running on Hortonworks and then loading of curated data sets onto data warehouses, analytics tools and operational systems outside Hadoop.
Regulated industries such as banking, insurance and healthcare are required to have detailed histories of data management for audit purposes. Without tools to provide data lineage, compliance with regulations and gathering the required information for audits can prove challenging.
With Informatica, the data scientist and analyst can now visualize data lineage and detailed history of data transformations providing unprecedented transparency into their data analysis. They can be more confident in their findings based on this visibility into the origins and quality of the data they are working with to create valuable insights for their organizations. Web-based access to visual data lineage for analysts also facilitates team collaboration on challenging and evolving data analytics and operational system projects.
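Conceptually, the end-to-end lineage map described above is a directed graph: each curated dataset points back to the sources and transformations that produced it. The sketch below illustrates that idea with invented dataset and transformation names; it is a minimal model for illustration only, not Informatica’s actual metadata model.

```python
# Minimal sketch: modeling data lineage as a directed graph.
# Dataset and transformation names are invented for illustration.
from collections import defaultdict

class LineageGraph:
    def __init__(self):
        # dataset -> set of (source dataset, transformation) pairs it was derived from
        self.upstream = defaultdict(set)

    def record(self, source, transformation, target):
        """Record that `target` was produced from `source` via `transformation`."""
        self.upstream[target].add((source, transformation))

    def trace(self, dataset):
        """Walk upstream from `dataset` back to its original sources."""
        lineage, stack, seen = [], [dataset], set()
        while stack:
            node = stack.pop()
            for source, transform in self.upstream.get(node, ()):
                lineage.append((source, transform, node))
                if source not in seen:
                    seen.add(source)
                    stack.append(source)
        return lineage

g = LineageGraph()
g.record("crm_db.customers", "load", "hdfs:/raw/customers")
g.record("hdfs:/raw/customers", "cleanse", "hdfs:/curated/customers")
g.record("hdfs:/curated/customers", "load", "warehouse.dim_customer")

for step in g.trace("warehouse.dim_customer"):
    print(step)
```

Tracing upstream from the warehouse table recovers the full path back through the Hadoop staging and curation steps to the original CRM source, which is exactly the audit trail regulated industries need.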
The Informatica and Hortonworks partnership brings together leading enterprise data governance tools with open source Hadoop leadership to extend governance to this new platform. Deploying Informatica for data integration, parsing, data quality and data lineage on Hortonworks reduces risk to deployment schedules.
A demo of Informatica’s end-to-end metadata management capabilities on Hadoop and beyond is available here:
- A free trial of Informatica Big Data Edition in the Hortonworks Sandbox is available here.
By Philip Russom, TDWI, Research Director for Data Management.
I recently broadcast a really interesting Webinar with David Lyle, a vice president of product strategy at Informatica Corporation. David and I had a “fireside chat” in which we discussed one of the most pressing questions in data management today, namely: How can we prepare great data for great analytics while still leveraging older best practices in data management? Please allow me to summarize our discussion.
Both old and new requirements are driving organizations toward analytics. David and I started the Webinar by talking about prominent trends:
- Wringing value from big data – The consensus today says that advanced analytics is the primary path to business value from big data and other types of new data, such as data from sensors, devices, machinery, logs, and social media.
- Getting more value from traditional enterprise data – Analytics continues to reveal customer segments, sales opportunities, and risk, fraud, and security threats.
- Competing on analytics – The modern business is run by the numbers – not just gut feel – to study markets, refine differentiation, and identify competitive advantages.
The rise of analytics is a bit confusing for some data people. As experienced data professionals do more work with advanced forms of analytics (enabled by data mining, clustering, text mining, statistical analysis, etc.), they can’t help but notice that the requirements for preparing analytic data are similar to, but different from, those of their other projects, such as ETL for a data warehouse that feeds standard reports.
Analytics and reporting are two different practices. In the Webinar, David and I talked about how the two involve pretty much the same data management practices, but in different orders and with different priorities:
- Reporting is mostly about entities and facts you know well, represented by highly polished data that you know well. Squeaky clean report data demands elaborate data processing (for ETL, quality, metadata, master data, and so on). This is especially true of reports that demand numeric precision (about financials or inventory) or will be published outside the organization (regulatory or partner reports).
- Advanced analytics, in general, enables the discovery of new facts you didn’t know, based on the exploration and analysis of data that’s probably new to you. Preparing raw source data for analytics is relatively simple, though often at high scale. With big data and other new data, preparation may be as simple as collocating large datasets on Hadoop or another platform suited to data exploration. When using modern tools, users can further prepare the data as they explore it, by profiling, modeling, aggregating, and standardizing data on the fly.
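The on-the-fly preparation described above (profiling, aggregating, and standardizing while exploring) can be illustrated with a few lines of standard-library Python. The field names and records below are invented for illustration and do not come from any particular tool.

```python
# Sketch: lightweight on-the-fly profiling of raw records during exploration.
# Records and field names are invented for illustration.
from collections import Counter
from statistics import mean

records = [
    {"state": "ca", "amount": "120.5"},
    {"state": "CA", "amount": "99.0"},
    {"state": "ny", "amount": None},
]

# Profile: count missing values per field.
nulls = Counter(k for r in records for k, v in r.items() if v is None)

# Standardize on the fly: normalize case before computing the distribution.
states = Counter((r["state"] or "").upper() for r in records)

# Aggregate only over valid values.
avg_amount = mean(float(r["amount"]) for r in records if r["amount"] is not None)

print(nulls)       # fields with missing values
print(states)      # standardized value distribution
print(avg_amount)  # mean over non-null amounts
```

This is exactly the discovery-style workflow the bullet describes: the data is profiled, cleansed, and summarized as part of exploration rather than through an up-front ETL design.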
Operationalizing analytics brings reporting and analysis together in a unified process. For example, once an insight is discovered through analytics (e.g., the root cause of a new form of customer churn), that discovery should become a repeatable BI deliverable (e.g., metrics and KPIs that enable managers to track the new form of churn in dashboards). In these situations, the best practices of data management apply to a lesser degree (perhaps on the fly) during the early analytic steps of the process, but are then applied fully during the operationalization steps.
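As a hypothetical illustration of that operationalization step, an analytic finding becomes a small, repeatable metric that is recomputed every reporting period and surfaced on a dashboard. The churn definition and figures below are invented for the example.

```python
# Sketch: operationalizing an analytic discovery as a repeatable KPI.
# The churn definition and numbers are hypothetical, for illustration only.

def churn_rate(customers_start, customers_lost):
    """KPI: share of customers at period start who were lost during the period."""
    if customers_start == 0:
        return 0.0
    return customers_lost / customers_start

# Once analysis identifies the churn driver, the same metric is recomputed
# each reporting period and fed to a dashboard.
monthly = {"Jan": (1000, 42), "Feb": (980, 35)}
kpis = {month: churn_rate(start, lost) for month, (start, lost) in monthly.items()}
print(kpis)
```

The exploratory work happens once; the resulting metric then runs under full data management discipline as a standard BI deliverable.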
Architectural ramifications ensue from the growing diversity of data and workloads for analytics, reporting, multi-structured data, real time, and so on. For example, modern data warehouse environments (DWEs) include multiple tools and data platforms, from traditional relational databases to appliances and columnar databases to Hadoop and other NoSQL platforms. Some are on premises and others are on clouds. On the downside, this results in high complexity, with data strewn across multiple platforms. On the upside, users get great data for great analytics by moving data to a platform within the DWE that’s optimized for a particular data type, analytic workload, price point, or data management best practice.
For example, a number of data architecture uses cases have emerged successfully in recent years, largely to assure great data for great analytics:
- Leveraging new data warehouse platform types gives analytics the high performance it needs. Toward this end, TDWI has seen many users successfully adopt new platforms based on appliances, columnar data stores, and a variety of in-memory functions.
- Offloading data and its processing to Hadoop frees up capacity on EDWs. It also gives unstructured and multi-structured data types a platform better suited to their management and processing, all at a favorable cost point.
- Virtualizing data assets yields greater agility and simpler data management. Multi-platform data architectures too often entail a lot of data movement among the platforms. But this can be mitigated by federated and virtual data management practices, as well as by emerging practices for data lakes and enterprise data hubs.
If you’d like to hear more of my discussion with Informatica’s David Lyle, please replay the Webinar from the Informatica archive.
The Ponemon Institute stated that the biggest concern for security professionals is that they do not know where sensitive data resides. Informatica’s Intelligent Data Platform provides data security professionals with the technology required to discover, profile, classify and assess the risk of confidential and sensitive data.
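As a rough illustration of what pattern-based discovery and classification of sensitive data involves (this sketch is not Informatica’s actual classifier), sample column values can be matched against known sensitive-data patterns; the patterns and labels below are simplified examples.

```python
# Sketch: simple pattern-based discovery of sensitive fields in sample data.
# Patterns and labels are simplified examples, not production classifiers.
import re

PATTERNS = {
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

def classify(column_values):
    """Return the sensitive-data labels whose pattern matches every non-empty value."""
    hits = set()
    for label, pattern in PATTERNS.items():
        if all(pattern.match(v) for v in column_values if v):
            hits.add(label)
    return hits

print(classify(["123-45-6789", "987-65-4321"]))  # {'ssn'}
print(classify(["a@b.com", "c@d.org"]))          # {'email'}
```

A real platform would add profiling at scale, proximity and context rules, and risk scoring on top of this kind of matching, but the core discovery step starts with recognizing where sensitive values live.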
Last year, we began significant investments in data security R&D to support this initiative. This year, we continue that commitment by organizing around the vision. I am thrilled to be leading the Informatica Data Security Group, a newly formed business unit with a team dedicated to data security innovation. The business unit includes the former Application ILM business unit, which consists of the data masking, test data management and data archive technologies from previous acquisitions, including Applimation, ActiveBase, and TierData.
By having a dedicated business unit and engineering resources applying Informatica’s Intelligent Data Platform technology to a security problem, we believe we can make a significant difference in addressing a serious challenge for enterprises across the globe. The newly formed Data Security Group will focus on new innovations in the data security intelligence market, while continuing to invest in and enhance our existing data-centric security solutions, such as data masking, data archiving and information lifecycle management.
The world of data is transforming around us, and we are committed to transforming the data security industry to keep our customers’ data clean, safe and connected.
For more details regarding how these changes will be reflected in our products, message and support, please refer to the FAQs listed below:
Q: What is the Data Security Group (DSG)?
A: Informatica has formed a new business unit, the Informatica Data Security Group, as a dedicated team focusing on data security innovation to meet the needs of our customers while leveraging the Informatica Intelligent Data Platform.
Q: Why did Informatica create a dedicated Data Security Group business unit?
A: Reducing risk is among the top three business initiatives for our customers in 2015. Data security is a top IT and business initiative for just about every industry and organization that stores sensitive, private, regulated or confidential data, and it is a boardroom topic. By building upon our success with the Application ILM product portfolio and the Intelligent Data Platform, we can address the pressing, mission-critical challenges that matter most to our customers.
Q: Is this the same as the Application ILM Business Unit?
A: The Informatica Data Security Group is a business unit that includes the former Application ILM business unit’s products (data masking, data archive and test data management, which came from previous acquisitions including Applimation, ActiveBase, and TierData), along with additional resources developing and supporting Informatica’s data security go-to-market, such as Secure@Source.
Q: How big is the Data Security market opportunity?
A: The data security software market is estimated to be a $3B market in 2015, according to Gartner, and total information security spending is expected to grow a further 8.2 percent in 2015 to reach $76.9 billion.
Q: Who would be most interested in this announcement and why?
A: All leaders are impacted when a data breach occurs, and understanding the risk to sensitive data is a boardroom topic. Informatica is investing in and committed to securing and safeguarding sensitive, private and confidential data. If you are an existing customer, you will be able to leverage your existing skills on the Informatica platform to address a challenge facing every team that manages or handles sensitive or confidential data.
Q: How does this announcement impact the Application ILM products – Data Masking, Data Archive and Test Data Management?
A: The existing Application ILM products are foundational to the Data Security Group product portfolio. These products will continue to be invested in, supported and updated. We are building upon our success with the Data Masking, Data Archive and Test Data Management products.
Q: How will this change impact my customer experience?
A: The Informatica product website will reflect this new organization by listing the Data Masking, Data Archive, and Test Data Management products under the Data Security product category. The customer support portal will reference Data Security as the top level product category. Older versions of the product and corresponding documentation will not be updated and will continue to reflect Application ILM nomenclature and messaging.