Using a Logical Data Fabric to Deliver Faster Insights in a Hybrid Cloud World
Reading Time: 4 minutes

Organizations are increasingly turning to the cloud to take advantage of scalable, elastic computing and storage resources. The availability and economics of the cloud for flexible, scalable, high-performance environments have radically changed the way information architects envision the future of enterprise information management and application deployment.

Learn how a logical data fabric is a modern and highly effective approach to unifying distributed data in a hybrid cloud environment, and how it can be leveraged to overcome systemic challenges, simplify analysis and reporting, and more. Watch this on-demand keynote session by David Loshin, TDWI analyst, from Denodo DataFest!


Hybrid-Cloud Challenges

At the same time, any migration or modernization effort incurs some degree of risk that must be managed effectively. This will be particularly true over the next few years as organizations seek to migrate their data and applications to what will remain (for the near term) a hybrid cloud/on-premises environment.

Operating within a hybrid environment increases complexity and, consequently, organizational risk. Organizations can address these risks by combining best practices with good technology that empowers data consumers and supports their data analytics needs.

The Solution: Logical Data Fabric

Increased self-service data access streamlines the data analytics lifecycle and speeds time-to-analytics-value. Yet the conventional extract/transform/load (ETL) paradigm is increasingly obsolete; the future lies in streaming data through pipelines that fully encapsulate data integration and preparation to support data democratization. Data architects should consider a modern approach, the logical data fabric, which simplifies data democratization by unifying disparate data in an intelligent and governed way.

Data virtualization, a core component of a logical data fabric, can play an important role in supporting the need to access, manage, and analyze data across disparate platforms for traditional reporting and BI, as well as for modern use cases such as machine learning and artificial intelligence, integrated analytics for automated decision-making and analysis, and combining traditional data-at-rest with real-time streaming data sources. To support this, a logical data fabric must:

  • Integrate data across multiple cloud environments. Modernization projects are motivating cloud migration, but because different cloud service providers offer different benefits, it would be surprising to see any organization limit its options by committing to a single cloud vendor. A logical data fabric should leverage fundamental data virtualization techniques in which separate data virtualization instances are placed within each cloud domain (such as AWS, Azure, or GCP) to access and aggregate the data within that provider. The logical data fabric can then coordinate access and aggregate data from across the different clouds to provide a holistic view of data across the hybrid enterprise (see the first sketch after this list).
  • Automate formerly manual tasks. An enterprise-grade logical data fabric should employ machine learning to continually monitor data sources, track changes to data structures, automatically adapt the semantic/virtualized model, and adjust access methods accordingly. Examples include seamless access despite changes to the source, access routines that adjust to data-consumer usage patterns, recommendations of data assets to analysts, and automated data caching to maintain high-performance data accessibility (a schema-drift sketch follows this list).
  • Speed data delivery to accelerate the production of analytical results. As data architectures become more distributed, application performance is increasingly impacted by data latency. An enterprise-grade logical data fabric overcomes this challenge with optimizations that mask or eliminate delivery delays: dynamic query optimization that pushes part of the computation down to the host system to reduce the volume of data-in-motion, caching of local copies of frequently accessed data, and intelligent query federation. Together, these techniques boost performance and speed time-to-value (see the pushdown-and-caching sketch below).
  • Leverage data discovery to expand data awareness. To support data analytics, a logical data fabric must provide four fundamental data management capabilities: data awareness (documenting what data assets exist and are available), data democratization (enabling data accessibility and self-service), transparent access (providing a uniform method of accessing similar data domains), and semantic model flexibility (reducing the complexity of reconciling different source models and formats).
  • Analyze data at rest for prediction using data in motion. Traditional business intelligence and analytics leverage data-at-rest, but increasingly, organizations seek to develop predictive and prescriptive models that can be embedded within streams of data-in-motion. Inserting analytical models into enterprise data pipelines reduces manual intervention and streamlines decisioning, making it trustworthy, automated, and real-time. A logical data fabric can support model integration that leverages both data-in-motion and data-at-rest (illustrated in the streaming sketch below).
  • Catalog data assets along with their lineage. Data scientists strive for data trustworthiness, and fundamentally, this depends on data awareness and data intelligence: business glossaries, mappings of terms to data elements, and documentation of the relationships between business terms. Because a logical data fabric connects consumers with disparate data from across the enterprise, it becomes the central source for data awareness, documenting the location, type, format, content, and lineage of data assets distributed across the hybrid multicloud environment (see the catalog-and-lineage sketch below).
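
To make these ideas concrete, the sketches below illustrate several of the points above in plain Python. First, the cross-cloud federation pattern: per-cloud virtualization instances aggregate data locally, and a fabric coordinator merges the results into a holistic view. The class names and sqlite3 in-memory sources are hypothetical stand-ins, not any vendor's API.

```python
# A minimal sketch of per-cloud virtualization instances feeding a fabric
# coordinator. sqlite3 in-memory databases stand in for the data sources
# hosted in each cloud; all class and table names here are hypothetical.
import sqlite3


class CloudVirtualizationNode:
    """Aggregates data within a single cloud domain (e.g., AWS, Azure, GCP)."""

    def __init__(self, cloud_name: str, rows: list[tuple[str, float]]):
        self.cloud_name = cloud_name
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
        self.conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

    def aggregate(self) -> list[tuple[str, float]]:
        # Aggregation runs inside the cloud domain, so only summarized
        # rows leave it.
        cur = self.conn.execute(
            "SELECT region, SUM(amount) FROM sales GROUP BY region"
        )
        return cur.fetchall()


class FabricCoordinator:
    """Combines per-cloud aggregates into one holistic view."""

    def __init__(self, nodes: list[CloudVirtualizationNode]):
        self.nodes = nodes

    def holistic_view(self) -> dict[str, float]:
        totals: dict[str, float] = {}
        for node in self.nodes:
            for region, subtotal in node.aggregate():
                totals[region] = totals.get(region, 0.0) + subtotal
        return totals


aws = CloudVirtualizationNode("aws", [("emea", 120.0), ("apac", 80.0)])
gcp = CloudVirtualizationNode("gcp", [("emea", 45.0), ("amer", 200.0)])
print(FabricCoordinator([aws, gcp]).holistic_view())
# {'emea': 165.0, 'apac': 80.0, 'amer': 200.0}
```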
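
Next, a sketch of one piece of the automation point: detecting schema drift in a monitored source so the virtual model can be adapted. This toy version just diffs schema snapshots; a real fabric would layer machine learning over signals like this. The detect_drift function and column names are illustrative assumptions.

```python
# A hedged sketch of automated schema-drift handling: the fabric snapshots a
# source's columns and flags changes so the virtual model can be remapped.
# detect_drift and the column names are illustrative, not a vendor API.

def detect_drift(known: dict[str, str], observed: dict[str, str]) -> dict:
    """Compare a cached schema snapshot against the source's current schema."""
    return {
        "added": sorted(set(observed) - set(known)),
        "removed": sorted(set(known) - set(observed)),
        "retyped": sorted(
            c for c in set(known) & set(observed) if known[c] != observed[c]
        ),
    }


known_schema = {"cust_id": "INT", "full_name": "TEXT", "ltv": "REAL"}
observed_schema = {"cust_id": "BIGINT", "full_name": "TEXT", "segment": "TEXT"}

print(detect_drift(known_schema, observed_schema))
# {'added': ['segment'], 'removed': ['ltv'], 'retyped': ['cust_id']}

# A real fabric would feed this into model adaptation: widen the virtual
# column for 'cust_id', expose 'segment', and flag views that used 'ltv'.
```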
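
The delivery-speed point rests on reducing data-in-motion. The sketch below contrasts a naive pull-everything query with one that pushes the filter and aggregation down to the source, and caches frequently requested results; the sqlite3 source and query shapes are again stand-ins.

```python
# A sketch of two delivery optimizations named above: pushing computation
# down to the source so fewer rows travel, and caching frequent results.
import sqlite3
from functools import lru_cache

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, clicks INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("mon", 10), ("mon", 5), ("tue", 7)],
)


def naive_total(day: str) -> int:
    # No pushdown: every row crosses the wire, then we sum locally.
    rows = conn.execute("SELECT day, clicks FROM events").fetchall()
    return sum(clicks for d, clicks in rows if d == day)


@lru_cache(maxsize=128)  # cache frequently requested aggregates
def pushed_down_total(day: str) -> int:
    # Pushdown: the filter and SUM run at the source; one row returns.
    row = conn.execute(
        "SELECT COALESCE(SUM(clicks), 0) FROM events WHERE day = ?", (day,)
    ).fetchone()
    return row[0]


assert naive_total("mon") == pushed_down_total("mon") == 15
```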
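
For analytics on data-in-motion, the following sketch embeds a "model" trained on data-at-rest into a stream of arriving events. The threshold model is a deliberately trivial placeholder for a real trained model, and the event values are hypothetical.

```python
# A minimal sketch of embedding a model trained on data-at-rest into a
# stream of data-in-motion, scoring each event as it arrives.
from typing import Iterable, Iterator


def train_on_data_at_rest(history: list[float]) -> float:
    """'Train' a toy anomaly model: flag values far above the historical mean."""
    mean = sum(history) / len(history)
    return mean * 2.0  # alert threshold


def score_stream(
    events: Iterable[float], threshold: float
) -> Iterator[tuple[float, bool]]:
    """Score each in-flight event in real time, with no manual intervention."""
    for value in events:
        yield value, value > threshold


threshold = train_on_data_at_rest([10.0, 12.0, 9.0, 11.0])
for value, is_anomaly in score_stream([11.5, 30.0, 8.0], threshold):
    print(value, "ANOMALY" if is_anomaly else "ok")
```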
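
Finally, a sketch of the catalog-and-lineage idea: each entry documents an asset's location, type, format, and content, plus the upstream assets it derives from, so lineage questions can be answered by walking those links. The field names and asset locations are illustrative, not a specific catalog product's schema.

```python
# A sketch of a catalog entry that records location, type, format, content,
# and lineage, with a helper that walks upstream links.
from dataclasses import dataclass, field


@dataclass
class CatalogEntry:
    name: str
    location: str          # e.g., "s3://bucket/path" (hypothetical)
    data_type: str         # e.g., "table", "stream", "file"
    data_format: str       # e.g., "parquet", "relational"
    description: str
    upstream: list[str] = field(default_factory=list)  # lineage by asset name


catalog = {
    e.name: e
    for e in [
        CatalogEntry("raw_orders", "s3://lake/orders", "file", "parquet",
                     "Raw order events"),
        CatalogEntry("orders_by_region", "virtual://fabric/orders_by_region",
                     "table", "relational", "Aggregated regional view",
                     upstream=["raw_orders"]),
    ]
}


def lineage(asset: str) -> list[str]:
    """Walk upstream links to answer 'where did this data come from?'"""
    chain, queue = [], list(catalog[asset].upstream)
    while queue:
        parent = queue.pop(0)
        chain.append(parent)
        queue.extend(catalog[parent].upstream)
    return chain


print(lineage("orders_by_region"))  # ['raw_orders']
```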

A Streamlined Approach

Increased data distribution, the adoption of multiple cloud platforms, application modernization, and the continued growth of advanced analytics all increase the pressure to deliver data seamlessly for data-driven decision-making. Grafting together tools that are lifted and shifted out of the on-premises data center will prove insufficient for meeting the emerging demands for rapid data insights. Instead, consider using a logical data fabric to simplify data access across multicloud environments. Using a logical data fabric will streamline data management and analytics efforts, support access to a wide variety of data types and sources, speed data delivery, and optimize performance.

David Loshin