Would you agree that the data demanded by business users outpaces the data supplied by IT from enterprise systems, to the degree that users are left making sense of yesterday’s data today? Forrester Research analysts Noel Yuhanna and Michele Goetz wrote in their research article titled Create A Road Map For A Real-Time, Agile, Self-Service Data Platform, “Information from the past won’t support the insights of the future – businesses need real-time data.” If businesses are to arm their users with up-to-the-minute data, they need to adopt a fast data strategy.
A fast data strategy is a disciplined approach to using agile, real-time, self-service data technologies, such as data virtualization, to deliver data to business users quickly and drive faster outcomes.
“Enterprise architects must revise their data architecture to meet the demand for fast data,” say Noel Yuhanna and Michele Goetz. But where should they start? There are four significant IT projects in which fast data is a must: agile BI, big data and cloud integration, data services, and single-view applications.
Agile BI projects ensure that business users’ analysis and reporting are timely enough to support rapid, accurate decision making with the latest intelligence. Data virtualization enables fast data architectures such as the logical data warehouse, virtual data marts, self-service BI, and operational analytics. These architectures help IT rapidly deliver data to business users within their BI systems by aggregating the latest data residing in various source systems without physically moving it, since data movement is the main culprit behind data delays.
Big data and cloud adoption are making the IT landscape increasingly distributed, adding new data silos. Data virtualization counters this data deluge with fast data architectures for advanced analytics and data warehouse offloading. These architectures enable IT to leverage the low cost that big data and the cloud afford, while significantly improving time to data delivery through real-time access.
Data services are the backbone of application development. Data virtualization enables rapid application development with a unified data services layer, which is a logical abstraction of all structured and unstructured data in the underlying sources. Using data virtualization, IT can develop a data service in less than half a day, whereas traditional data integration methods such as extract-transform-load (ETL) can take one to two weeks.
Single-view applications, such as a single view of customers, products, or inventory, improve the efficacy of call center agents through faster response times, and of sales and marketing teams through more targeted campaigns. Data virtualization enables these single-view architectures by virtually aggregating different master data in real time, without the data having to be centrally stored and managed.
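To make the idea of virtual aggregation concrete, here is a minimal Python sketch of a single customer view. It is not Denodo’s or any vendor’s actual API; the source systems, field names, and fetch functions are all hypothetical stand-ins. The point it illustrates is that the unified record is assembled at query time from the live sources, with nothing copied into a central store.

```python
# Hypothetical in-memory stand-ins for three live source systems
# (in a real deployment these would be remote queries, not dicts).
CRM = {"C001": {"name": "Ada Lopez", "segment": "enterprise"}}
BILLING = {"C001": {"balance": 1250.0, "currency": "USD"}}
SUPPORT = {"C001": {"open_tickets": 2}}

def fetch_crm(customer_id):
    # Stand-in for a live query against the CRM system.
    return CRM.get(customer_id, {})

def fetch_billing(customer_id):
    # Stand-in for a live query against the billing system.
    return BILLING.get(customer_id, {})

def fetch_support(customer_id):
    # Stand-in for a live query against the support system.
    return SUPPORT.get(customer_id, {})

def single_customer_view(customer_id):
    """Assemble the unified view on demand: each source is queried
    live and merged at request time, so no data is replicated."""
    view = {"customer_id": customer_id}
    for fetch in (fetch_crm, fetch_billing, fetch_support):
        view.update(fetch(customer_id))
    return view

print(single_customer_view("C001"))
```

A call center application would invoke something like `single_customer_view("C001")` per lookup, always seeing the sources’ current state rather than a nightly copy.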
Many customers such as Autodesk, DrillingInfo, and Jazztel are successfully using data virtualization to enable fast data architectures in these four IT projects.
I presented these IT projects and customer case studies in detail at the Fast Data Strategy Virtual Summit. Watch a preview of it here. My presentation is one of more than 20 delivered by customers such as Autodesk, DrillingInfo, Logitech, CIT, Guardian Life, and Zurich Insurance, as well as other data virtualization experts. Best of all, this event is complimentary. Register today and learn how to inject fast data into your IT projects.