This article originally appeared in CDO magazine.
Data and analytics have long held promise in helping organizations deliver greater value across the entire stakeholder landscape, including customers, associates, and partners. However, since the beginning of the data warehousing and BI movement, achieving business value rapidly — in alignment with windows of opportunity — has proven elusive.
For an organization to be competitive in the era of digital transformation, data must be front and center — and accessible in near real-time. But many organizations are struggling with data that is deeply buried, complex to access, difficult to integrate, and inaccessible to business users. Problems like these diminish the value of your data and its ability to inform decision-making at all levels.
For most organizations, it’s hard to produce value from data quickly
The main challenge has been the distributed and siloed nature of the data subjects that must be integrated to produce business-relevant insights. Data subjects (customers, products, orders, warehouses, and so on) typically reside in different systems and databases, requiring extraction, transformation, and loading into a common database before analytics can be performed.
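To make that pattern concrete, here is a minimal, hypothetical sketch of the traditional extract-transform-load approach in Python: customer records are pulled from one system, order records from another, reshaped, and loaded into a shared warehouse table before any analysis can begin. The systems, tables, and fields are illustrative assumptions, not a description of any specific environment.

```python
import sqlite3

# Hypothetical stand-ins for two separate source systems (illustrative only).
crm = sqlite3.connect(":memory:")        # customer master lives here
orders_db = sqlite3.connect(":memory:")  # order system lives here
warehouse = sqlite3.connect(":memory:")  # the shared analytics database

crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, region TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp', 'EMEA')")
orders_db.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
orders_db.execute("INSERT INTO orders VALUES (100, 1, 2500.0)")

# Extract: copy data out of each source system.
customers = crm.execute("SELECT id, name, region FROM customers").fetchall()
orders = orders_db.execute("SELECT customer_id, amount FROM orders").fetchall()

# Transform: join and aggregate into the shape analysts need.
totals = {}
for customer_id, amount in orders:
    totals[customer_id] = totals.get(customer_id, 0.0) + amount
rows = [(cid, name, region, totals.get(cid, 0.0)) for cid, name, region in customers]

# Load: persist the integrated result into the common warehouse table.
warehouse.execute(
    "CREATE TABLE customer_order_totals (customer_id INTEGER, name TEXT, region TEXT, total REAL)"
)
warehouse.executemany("INSERT INTO customer_order_totals VALUES (?, ?, ?, ?)", rows)

print(warehouse.execute("SELECT * FROM customer_order_totals").fetchall())
```

Every new question that spans these silos means designing, building, scheduling, and maintaining another pipeline like this before anyone sees an answer.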
Often, data delivery solutions like data warehouses, self-service BI, and data lakes are used to try to unlock these data silos; however, each of these solutions has drawbacks in effort, complexity, cost, and time-to-market.
That is where data virtualization comes in, delivering a holistic view of information across all source systems to business users.
So what exactly is data virtualization?
In its simplest form, data virtualization allows an organization to connect to its data subjects where they reside, in real time. It presents disparate data subjects through a semantic layer that lets them be integrated on the fly to support query and analytic use cases.
By eliminating the need to design and build complex routines that move data from multiple source locations into a single integrated data warehouse, products like Denodo let organizations take weeks or even months of data preparation out of the idea-to-execution value stream. As a result, value delivery is significantly accelerated.
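As a rough illustration of the concept (a sketch of the general idea, not how Denodo or any other product actually implements it), the snippet below exposes a single "virtual view" over two hypothetical sources: a relational order database and a SaaS-style customer API. Nothing is copied into a warehouse ahead of time; each query reaches into the live sources and integrates the results on the fly.

```python
import sqlite3

# Hypothetical relational source, queried in place (illustrative assumption only).
orders_db = sqlite3.connect(":memory:")
orders_db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
orders_db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 2500.0), (1, 300.0), (2, 80.0)])

def fetch_customers_from_crm_api():
    # Stand-in for a call to a SaaS CRM endpoint (assumed, illustrative).
    return [
        {"id": 1, "name": "Acme Corp", "region": "EMEA"},
        {"id": 2, "name": "Globex", "region": "APAC"},
    ]

def customer_spend_view(region: str):
    """A 'virtual view': integrates both live sources at query time into one logical result."""
    customers = [c for c in fetch_customers_from_crm_api() if c["region"] == region]
    result = []
    for c in customers:
        (total,) = orders_db.execute(
            "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer_id = ?", (c["id"],)
        ).fetchone()
        result.append({"customer": c["name"], "total_spend": total})
    return result

# Business users query the semantic layer; the data never moves into a warehouse first.
print(customer_spend_view("EMEA"))
```

The point is the pattern: consumers ask one logical layer for an answer, and the build-and-stage pipeline disappears from the critical path.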
Optimization with data fabric
While data virtualization integrates data from different sources into one layer to provide real-time access, data fabric is an end-to-end architectural approach that lets organizations manage massive amounts of data across many locations and automates the integration process.
A data fabric has a huge job to do and needs a robust integration backbone to do it. It must support many data sources, be compatible with several data pipeline workflows, support automated data orchestration, empower various kinds of data consumers, and more. To do this successfully, a data fabric requires powerful technologies and a solid data integration layer to access all data assets.
Many in the data community believe you must choose either data virtualization or data fabric, but that is not the case; that solid data integration layer is an example of why. In reality, a data fabric can be operationalized through data virtualization, which optimizes your modern data architecture and lets you move at the speed of your business.
By building a model that uses both concepts, businesses make finding, interpreting, and using their data nearly seamless.
Technology by itself isn’t the answer.
Even with the proven results of this class of technologies, many organizations continue to struggle with traditional data management and analytic architectures and solutions.
This inability to adopt new approaches to data management and analytics deprives decision-makers of the rapid access to insights they need to stay agile in a pandemic-induced, rapidly transforming digital and global economy.
The solution is not just found in technology. Instead, it is found in the minds of the humans responsible for delivering data management and analytic capabilities. It is a human change management problem we face. Remember the adage, people/process/data/technology?
The next frontier is changing the thinking, and raising the innovation risk tolerance, of the people who steward data management and analytics solutions within organizations.
What do you think? Is your organization facing any of these issues or trying to tackle how to deliver significant value — better, faster, cheaper, smarter?
I’m happy to chat about where you’re at and how to get where you would like to be. If you want to talk, send me a note.