As C-level executives managing different lines of business, our duties vary, but we are all driven by one common goal: to generate profit for the company. And there is always pressure, typically from the CEO, to do it faster. Data is the perfect tool to help us accelerate our departments' work. Yet when data is slow to arrive, our decisions are delayed and lose much of their impact. So, how do we fast-track the availability of data? The answer is a fast data strategy. What does a fast data strategy do? It makes data available in real time for operations and decision making. And how does it help? It enables us to stay competitive in our industries and to generate profits faster.
As CIOs, you are tasked with automating business functions through an effective IT strategy, and as CROs, you are focused on exceeding revenue targets quarter after quarter. As a CMO myself, my charter is to increase demand generation and improve brand awareness. Even though our day-to-day duties differ, one common thread weaves our disparate functions together: data. CIOs, you use data to ensure optimal system performance and to onboard new projects on time. CROs, you use data to track the attainment of revenue goals and to take corrective action when needed. We CMOs use data to measure the outcomes of marketing campaigns and to ensure a sufficient pipeline for sales to pursue.
But what happens when that crucial data is not available in time to measure current performance and take corrective action? The answer is simple: we miss our profit targets. It's like jumping from a plane with a parachute. You know where you want to land, but the wind will push you away from your target. The earlier you know you are off course, the better your chances of correcting in time and landing where you intended. A delayed reaction, perhaps caused by a lack of information about whether you are on target, can mean missing the designated landing spot, because there is no longer enough altitude to course-correct, and being swept off to an unwanted, and sometimes dangerous, landing spot. In other words, our careers might be in jeopardy!
So, what makes data arrive slowly and delay our decisions? The answer is always the same: a plethora of systems housing disparate data in different formats, making it difficult to bring it all together into meaningful, timely information. Business users resort to manual data reconciliation, taking time away from client-facing tasks; IT teams expend time and energy on expensive physical integration of data across systems; and nobody is productive.
What is needed is a fast flow of information that is accurate, interconnected, and readily available. A strategy that ensures such information delivery is the fast data strategy: an approach that uses agile, real-time, self-service data technologies, such as data virtualization, to quickly deliver data to business users and effect faster outcomes. Think of it as a virtual single repository of enterprise data, a well you can draw on for instant data. This strategy lets data continue to reside in disparate sources while being virtually, or “logically,” compiled in a single place. Put more broadly, the fast data strategy enables a modernized data management architecture in which data is aggregated in real time and delivered within a single view.
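To make the idea concrete, here is a minimal sketch, not a product implementation, that mimics the principle with SQLite's `ATTACH` feature: two separate databases stand in for disparate source systems (the table names and sample data are invented for illustration), and a view joins them at query time. Nothing is copied or physically consolidated; the view is the single logical access point, resolved against the live sources on every read. Real data virtualization platforms federate heterogeneous sources the same way, just at enterprise scale.

```python
import sqlite3

# One connection acting as the "virtual layer" over two separate
# in-memory databases, which stand in for disparate source systems.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS crm")      # hypothetical CRM system
conn.execute("ATTACH DATABASE ':memory:' AS billing")  # hypothetical billing system

# Each "system" owns its own data, in its own place.
conn.execute("CREATE TABLE crm.customers (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE billing.invoices (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO billing.invoices VALUES (?, ?)",
                 [(1, 100.0), (1, 250.0), (2, 75.0)])

# The "logical" layer: a view that joins both sources at query time.
# No data is moved; consumers see one unified result set.
conn.execute("""
    CREATE TEMP VIEW customer_revenue AS
    SELECT c.name, SUM(i.amount) AS total
    FROM crm.customers c
    JOIN billing.invoices i ON i.customer_id = c.id
    GROUP BY c.name
""")

rows = conn.execute(
    "SELECT name, total FROM customer_revenue ORDER BY name"
).fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 75.0)]
```

Because the view is evaluated on demand, an update to either source is visible in the unified view immediately, which is exactly the "real-time" property the strategy promises.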
Take Autodesk, an American multinational software corporation that makes software for the architecture, engineering, construction, manufacturing, media, and entertainment industries. When Autodesk moved from a conventional perpetual-license model to a subscription-based model, its legacy systems could not be integrated with the new systems supporting the subscriptions. This left Autodesk unable to deliver high-quality data to its business stakeholders in a timely manner, and unable to respond to business demands that required speed and precision. As part of a fast data strategy, Autodesk implemented a logical data warehouse, which abstracted the data so that business users no longer had to touch or transform any physical data. Furthermore, the new and old systems could coexist, allowing business to proceed as it should, with a single, unified enterprise access point for all data used within the company. This unified access point allowed Autodesk to successfully transition to its subscription-based licensing model and further improved business agility and profitability.
Similarly, Zurich, Switzerland's largest insurer, needed to modernize its external and internal processes. Historically, an extensive underwriting process was enough to give an insurance company a competitive advantage; modern times, however, have brought a new focus on creative sourcing and distinct analytical methods. Zurich therefore adopted a fast data strategy, aiming for agile deployment of its existing data assets and the leveraging of new data sources on the way to its larger goal of developing a competitive advantage. This was achieved by creating virtual views over the company's various XML data sources and publishing that data as consumable sets in real time. In just a few months, the fast data strategy delivered cost savings, reduced time-to-market, improved efficiency, and resource optimization, among many other benefits, allowing Zurich to fulfill its goal of developing a competitive advantage.
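A virtual view over XML sources can be sketched in miniature as follows. This is an illustrative toy, not Zurich's actual implementation: the feed names, fields, and figures are all invented. The point is that each source stays in its native XML form and is parsed only when read, while a small join function publishes the combined result as one consumable set.

```python
import xml.etree.ElementTree as ET

# Two hypothetical XML feeds from separate systems (invented data).
policies_xml = "<policies><policy id='P1' premium='1200'/><policy id='P2' premium='800'/></policies>"
claims_xml = "<claims><claim policy='P1' paid='300'/></claims>"

def policies():
    # A "virtual view": the XML is parsed on demand, never materialized.
    for p in ET.fromstring(policies_xml):
        yield {"id": p.get("id"), "premium": float(p.get("premium"))}

def claims():
    for c in ET.fromstring(claims_xml):
        yield {"policy": c.get("policy"), "paid": float(c.get("paid"))}

def policy_exposure():
    # Join the two views at read time into one consumable set.
    paid = {}
    for c in claims():
        paid[c["policy"]] = paid.get(c["policy"], 0.0) + c["paid"]
    return [
        {"id": p["id"], "premium": p["premium"], "paid": paid.get(p["id"], 0.0)}
        for p in policies()
    ]

print(policy_exposure())
# [{'id': 'P1', 'premium': 1200.0, 'paid': 300.0},
#  {'id': 'P2', 'premium': 800.0, 'paid': 0.0}]
```

Each call to `policy_exposure()` re-reads the underlying feeds, so consumers always see the current state of every source.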
While our day-to-day duties may change, our overarching objective to generate profit is here to stay. It's with that in mind that I recommend deploying that parachute here and now, empowering your company with the knowledge manifested in data and the success behind the fast data strategy.
Finally, you can watch customer presentations on data virtualization, and many more from 30+ presenters, at our annual user conference, Denodo DataFest. You can also watch all sessions on-demand.