Setting the Foundation for Enterprise Cloud Data Warehouses—Part 2
How to Find Quick Wins in a Cloud Data Warehouse or Data Lake
In my previous post, I described the benefits of using an enterprise data catalog to navigate more quickly to value in your cloud journey. In this second post in the series, I’ll expand on the benefits of taking a high-value, iterative approach to moving to the cloud. Whether you have an existing solution or are starting fresh, you’re going to want to leverage the cloud to give data consumers access to new analytics tools, as well as faster ingestion, scalability, and the opportunity to establish your data foundation in a scalable and sustainable manner. This can be challenging for customers, as there are numerous products to choose from inside and outside a cloud data warehouse or data lake.
In order to quickly navigate to value, data collaboration is key. Having tools that can easily cross-share data profiles, definitions, ownership, lineage, and impact analysis will help you navigate the landscape for quick wins and:
- Enable quick discovery and prioritization of high-value data assets (i.e., customers, products, etc.).
- Identify critical reports. Doing so today can help you ensure you have the data foundation for tomorrow, and quickly mitigate any gaps for a desired future state.
- Identify the data model that satisfies the majority of the critical reports it takes to run your business.
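Prioritization like this can be reduced to a simple coverage count: migrate first the assets that unlock the most critical reports. A minimal sketch in Python (the report and asset names are hypothetical, not from any real catalog):

```python
from collections import Counter

# Hypothetical mapping of critical reports to the data assets they depend on
report_assets = {
    "executive_sales": ["customers", "orders", "products"],
    "churn_dashboard": ["customers", "subscriptions"],
    "inventory_health": ["products", "warehouses"],
}

# Count how many critical reports each asset unlocks
usage = Counter(asset for assets in report_assets.values() for asset in assets)

# Migrate the most widely used assets first
priority = [asset for asset, _ in usage.most_common()]
# priority starts with "customers" (used by two critical reports)
```

In practice, an enterprise data catalog supplies the report-to-asset mapping automatically; the point is that the prioritization logic itself is simple once that mapping exists.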
You can leverage whiteboarding sessions by reflecting the data model visually using business terms, and displaying sets of attributes and measures. You are identifying the desired (future) state that your business needs in its data foundation.
This future state can then be compared to your data backlog to ensure that you are not just replacing what you have but also extending it.
You can leverage data lineage and impact analysis to help you discover any existing inefficiencies or redundancies so that you don’t persist outdated solutions to the cloud.
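Impact analysis over data lineage is, at its core, a downstream traversal of a dependency graph. A minimal sketch, assuming a lineage map with hypothetical table names (a real catalog builds and maintains this graph for you):

```python
from collections import deque

# Hypothetical lineage edges: each asset maps to its downstream consumers
lineage = {
    "crm.customers_raw": ["dw.dim_customer"],
    "dw.dim_customer": ["rpt.churn_dashboard", "rpt.sales_by_region"],
    "erp.orders_raw": ["dw.fact_orders"],
    "dw.fact_orders": ["rpt.sales_by_region"],
    "legacy.customer_copy": [],  # nothing reads it: a deprecation candidate
}

def impact(asset):
    """Return every downstream asset affected by a change to `asset` (BFS)."""
    seen, queue = set(), deque([asset])
    while queue:
        for child in lineage.get(queue.popleft(), []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

# Assets with no consumers and no producers are redundancy candidates
unused = [a for a in lineage if not lineage[a]
          and all(a not in kids for kids in lineage.values())]

impact("crm.customers_raw")
# -> {"dw.dim_customer", "rpt.churn_dashboard", "rpt.sales_by_region"}
```

The same traversal, run upstream instead of downstream, answers the “where did this report’s data come from?” question that lineage views present graphically.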
The quicker you get your data foundation to the cloud, the quicker you can enrich the data, so that consumers can use the new cloud platform analytics to find new opportunities and drive the market.
Once you have set the data foundation, leveraging an enterprise data catalog, your data architect can ensure your framework and methodology are followed for both onshore and offshore development. It also provides data transparency, building data-consumer trust and driving end-user adoption.
Build on Quick Wins With a High-Value, Iterative Approach
As previously stated, whether you have an existing on-premises solution or are just starting your data journey in the cloud, your best approach is not to tackle everything at once. You also shouldn’t persist legacy solutions that are redundant or do not take advantage of the new functionality the cloud data warehouse or data lake provides. To add to this, you must balance “just” replacing what exists with satisfying future needs (i.e., backlogs).
Quickly prioritizing high-value data assets and getting them to the new cloud data warehouse or data lake allows data consumers to take advantage of the new technology so that the development team can quickly backfill or enhance additional insights without disrupting the end users.
Legacy on-premises solutions can quickly become outdated and redundant due to lack of transparency or siloed integration, with little-to-no insight into what is critical to the business.
By developing an interim solution, you can rapidly discover and prioritize data assets with an enterprise data catalog, enabling you to rapidly set your data foundation and allowing your data consumers to quickly take advantage of your cloud data warehouse or data lake.
Once you have shifted the data consumers’ focus to the new solution, you can then quickly enrich and backfill data assets, while also ensuring data transparency and providing a path for data science opportunities and discovery.
Your future state should show you the efficiencies gained in the cloud data warehouse or data lake and set the data foundation for data science discovery opportunities.
- This should also allow you to deprecate redundant or inefficient legacy solutions.
- Any legacy repository solutions can be then repurposed for archiving or disaster recovery.
Setting the Foundation for a Next-Generation Enterprise Data Architecture
Moving to a cloud data warehouse or data lake provides organizations with a variety of opportunities and advantages, but it’s also the opportunity to ensure you are setting a stable data foundation that is scalable, sustainable, and maintainable. Therefore, data discovery is critical.
Before you start a journey, it is imperative to assess the landscape so you can successfully navigate to an efficient outcome. Many organizations attempt this effort manually, which can lead to inconsistencies and a longer time to value. See example costs below:
Leveraging a data catalog with artificial intelligence and prebuilt reference tables allows you to quickly find data assets and sensitive risks.
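Under the hood, this kind of classification can be approximated by matching reference patterns against sampled column values. A minimal sketch, with hypothetical patterns and a hypothetical match threshold (a catalog’s AI-driven classifiers are far more sophisticated than simple regexes):

```python
import re

# Hypothetical reference patterns, loosely akin to a catalog's prebuilt rules
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\d{3}-\d{2}-\d{4}"),
    "phone": re.compile(r"\d{3}[-.]\d{3}[-.]\d{4}"),
}

def classify_column(values, threshold=0.8):
    """Tag a column as sensitive if most sampled values match a pattern."""
    tags = []
    for name, pattern in PATTERNS.items():
        hits = sum(1 for v in values if pattern.fullmatch(v.strip()))
        if values and hits / len(values) >= threshold:
            tags.append(name)
    return tags

sample = ["ann@example.com", "bob@example.org", "carl@example.net",
          "dee@example.io", "n/a"]
classify_column(sample)  # 4 of 5 values match -> tagged ["email"]
```

Running classifiers like this over every scanned column is what lets a catalog surface sensitive-data risk before the data ever lands in the new platform.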
By leveraging an enterprise data catalog, data stewards and stakeholders can quickly communicate with data architects and developers on a platform that provides enterprise-wide transparency, quickly converting “tribal knowledge” into enterprise knowledge.
Determine Your Cloud Strategy Plan
Assess Current-State Data Architecture
- Your first step in developing your cloud strategy plan is to identify your current state.
- This used to be a manual effort that required process flows, data profile scripts, and manual logging in Excel, email, and SharePoint.
- Now all this is automated in Informatica Enterprise Data Catalog (EDC).
- EDC is the catalog of catalogs, so it can even scan cloud and on-premises applications, as well as cloud catalogs.
Define Future-State Data Architecture
- After you understand your current ecosystem, you can now plan how to deploy it to your cloud data warehouse or data lake.
- Identify high-value data assets.
- Mitigate risk by tagging sensitive data.
- Leverage data profiles and subject matter expert input to ensure your data model is still relevant or determine what needs to be enhanced.
- After you have determined your future-state roadmap, you can design your rapid-value, iterative approach and mitigate any data-consumer disruption, while establishing a lifecycle that evolves with your business.
- EDC then allows the data architect to provide oversight of offshore or onshore development, to ensure the sustainability and scalability of your data management solution.
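The “leverage data profiles” step above amounts to computing per-column statistics such as null rates and distinct counts, then reviewing them with subject matter experts. A minimal profiling sketch over illustrative rows (the column names and values are invented for the example):

```python
def profile(rows, null_values=("", "n/a", None)):
    """Compute null rate and distinct count for each column."""
    columns = {}
    for row in rows:
        for col, val in row.items():
            stats = columns.setdefault(col, {"nulls": 0, "values": set()})
            if val in null_values:
                stats["nulls"] += 1
            else:
                stats["values"].add(val)
    return {
        col: {
            "null_rate": round(s["nulls"] / len(rows), 2),
            "distinct": len(s["values"]),
        }
        for col, s in columns.items()
    }

rows = [
    {"customer_id": 1, "region": "EMEA"},
    {"customer_id": 2, "region": ""},
    {"customer_id": 3, "region": "EMEA"},
    {"customer_id": 4, "region": "APAC"},
]
profile(rows)
# customer_id: null_rate 0.0, distinct 4; region: null_rate 0.25, distinct 2
```

A high null rate or an unexpectedly low distinct count is exactly the kind of signal that tells you a column in the legacy model is no longer relevant or needs to be enhanced before migration.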
Are You Ready to Take Your First Step into the Cloud?
Your journey to the cloud shouldn’t be a hard one, but it can be daunting if you don’t know where to begin. Combining a cloud data warehouse or data lake with Informatica’s integration platform as a service (iPaaS) provides you with an agile infrastructure that can be easily governed and streamlined for your data modernization. With the Informatica platform, you have a clear roadmap to get rapid value out of your cloud data warehouse or data lake, as well as the ability to advance your data and analytics maturity level.
Learn more about Informatica data management for cloud data warehouses and data lakes.