Fuelling the future of data

With data volumes growing exponentially, organisations increasingly need to take advantage of the latest data management solutions to help them manage their existing architecture.

Historically, data lakes were touted as the answer to the growing data problem, but they failed to live up to expectations, leaving many businesses without the clear overview of their data needed to drive better decision making. However, as demand for real-time insights grows, data teams are trying to navigate the complex makeup of existing lakes, which have become murky and unclear, as well as a multitude of existing data and application silos, in order to enhance their digital transformation capabilities and provide the foundation for new insights and services.

This is where intelligent data fabrics can provide a stepping stone to the next generation of data architecture.  

Synchronising real-time and historical data

With different taxonomies, metadata and structures, data lakes have typically consisted of dissimilar data from various sources, stored in different formats. This makes it difficult to integrate, transform, normalise and harmonise all of this data so that organisations can gain a consistent, comprehensive overview and actually put the data to use.
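
To make the idea concrete, here is a minimal, hypothetical Python sketch of the kind of normalisation step involved: two invented source records, with different field names, formats and coding schemes, are mapped onto one canonical schema. The record shapes and mappings are assumptions for illustration, not any particular product's API.

```python
from datetime import datetime, timezone

# Two hypothetical source systems describing the same customer with
# different field names, formats and taxonomies.
crm_record = {"CustomerName": "Acme Ltd", "Joined": "2018-03-01", "Tier": "GOLD"}
billing_record = {"cust_nm": "Acme Ltd", "signup_ts": 1519862400, "segment": 1}

def from_crm(rec):
    """Map a CRM record onto a single canonical schema."""
    return {
        "name": rec["CustomerName"],
        "joined": datetime.strptime(rec["Joined"], "%Y-%m-%d").date(),
        "tier": rec["Tier"].lower(),
    }

def from_billing(rec):
    """Map a billing record onto the same canonical schema."""
    tiers = {1: "gold", 2: "silver"}  # assumed code-to-label mapping
    return {
        "name": rec["cust_nm"],
        "joined": datetime.fromtimestamp(rec["signup_ts"], tz=timezone.utc).date(),
        "tier": tiers[rec["segment"]],
    }

# Harmonised: both records are now comparable and queryable together.
print(from_crm(crm_record))
print(from_billing(billing_record))
```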

Further complexity has been added by the growing availability of real-time data and the consequent requirement to harmonise it alongside batch data. Additional complications arise when businesses need to use real-time and historical data together to make decisions in the moment, for example in real-time risk or fraud detection.
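
As a rough illustration of that pattern, the sketch below scores an incoming payment event against a customer's historical behaviour at the moment it arrives. The event shape, the history store and the z-score threshold are all invented for illustration; a production system would use far richer features and models.

```python
from statistics import mean, stdev

# Hypothetical historical data: past transaction amounts per customer
# (in a real system this would come from the batch/historical store).
history = {"cust-42": [120.0, 80.0, 95.0, 110.0, 105.0]}

def score_event(event, history):
    """Flag a live event whose amount deviates sharply from past behaviour."""
    past = history.get(event["customer"], [])
    if len(past) < 3:
        return "insufficient-history"
    mu, sigma = mean(past), stdev(past)
    # Simple z-score rule; real systems combine many such signals.
    z = (event["amount"] - mu) / sigma if sigma else 0.0
    return "review" if abs(z) > 3 else "approve"

# A real-time event arrives and is decided on in the moment.
event = {"customer": "cust-42", "amount": 4_250.00}
print(score_event(event, history))  # -> "review"
```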

Ultimately, data lakes have shown themselves to be incompatible with these requirements, and many organisations are now looking for a way to combine both real-time data and batch data in a way that allows them to gain actionable insights in the moment. 

Due to the work involved, many businesses are hesitant to rip out and replace old systems, preferring instead a solution that complements their current technology and enables them to continue extracting value from their existing data architecture and investments. For the many businesses operating in highly siloed, distributed environments with numerous legacy applications and data stores, this need is coupled with the requirement for technology that can aggregate, integrate, transform and normalise data from their existing infrastructure on demand.

With data lakes proving, in effect, to be just another silo, a new approach to data is needed if businesses are to extract the most value from the data at their disposal.

Intelligent data fabrics hold the key

Intelligent data fabrics can transform and harmonise data so that it is actionable, and they incorporate a wide range of analytics capabilities, from analytic SQL to business rules and machine learning, to support the needs of the business.

By allowing existing applications and data to remain in place, intelligent data fabrics enable organisations to get the most from previous investments, while helping them gain business value from the data stored in lakes and other sources quickly and flexibly to meet the needs of a variety of business initiatives. This includes everything from scenario planning and risk modelling to running simulations for wealth management to identify new sources of alpha.


To deliver these capabilities from data lakes using traditional technologies, multiple architectural layers would be needed: distributed data stores, an integration layer, transformation, normalisation and harmonisation capabilities, a metadata layer, and a real-time, distributed caching layer. On top of that sits an intelligence layer, with application logic and analytics capabilities, plus a real-time layer. Building such an architecture has traditionally required a wide range of products, along with the work of integrating and maintaining them, making it complex and costly to build and run.

Advances in technology now make it possible to streamline and simplify this stack, which is much cleaner architecturally and simpler from an implementation, maintenance and application development standpoint. There is no longer a need for different development paradigms to manage the various application layers. It also delivers higher performance: eliminating the connections between the different layers of the architecture reduces latency, allowing organisations to incorporate transaction and event data into analyses and processes in near real time.
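
As a loose illustration of the single-paradigm idea, the sketch below uses SQLite purely as a stand-in for a unified engine: live events land in the same store that holds historical records, so one query in one language spans both, with no separate caching or integration layer in between. The table and column names are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE historical_trades (account TEXT, amount REAL);
    INSERT INTO historical_trades VALUES ('A1', 100.0), ('A1', 140.0), ('A2', 900.0);
    CREATE TABLE live_events (account TEXT, amount REAL);
""")

# A real-time event lands in the same engine that holds the history...
con.execute("INSERT INTO live_events VALUES ('A1', 5000.0)")

# ...so a single query, in a single paradigm, combines both.
rows = con.execute("""
    SELECT e.account, e.amount,
           (SELECT AVG(amount) FROM historical_trades h
             WHERE h.account = e.account) AS hist_avg
      FROM live_events e
""").fetchall()
print(rows)  # [('A1', 5000.0, 120.0)]
```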

The impact of data fabrics on digital transformation

Modern intelligent data fabrics can scale out dynamically to accommodate increases in data volumes and workloads, and can make a wider range of data usable. This is crucial in the finance sector, for example, where markets and volatility levels have spiked during the COVID-19 pandemic and the need to incorporate more data into analytics and services continues to grow.

Furthermore, a data fabric can support a business's long-term digital transformation goals, as it breaks down data and application silos, helping to remove operational inefficiencies and to streamline and automate end-to-end processes, two of the central aims of any digital transformation strategy. Once silos have been broken down, organisations gain an overarching view of enterprise data from internal and external sources, and with that comes the ability to use the data for a wider range of purposes.

With the ability to access information from all corners of the organisation, alongside the all-important metadata, data provenance and lineage are also enabled. This is critical for businesses to understand the source of the data and what actions have been applied to it, so that they can validate and trust the data being used to make significant business decisions.
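
A minimal, hypothetical sketch of the principle: each transformation appends a record of what it did to the data's lineage, so a consumer can see both a value and how it was produced. The structure and field names are invented for illustration.

```python
from datetime import datetime, timezone

def with_lineage(value, source):
    """Wrap a raw value with provenance metadata."""
    return {"value": value, "source": source, "lineage": []}

def apply_step(record, name, fn):
    """Apply a transformation and append it to the record's lineage."""
    record["value"] = fn(record["value"])
    record["lineage"].append({
        "step": name,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return record

rec = with_lineage("  GOLD  ", source="crm-export")
rec = apply_step(rec, "trim", str.strip)
rec = apply_step(rec, "lowercase", str.lower)

# A consumer sees the value, where it came from, and what was done to it.
print(rec["value"])    # "gold"
print(rec["source"], [s["step"] for s in rec["lineage"]])
```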

‘Piloting’ data fabrics to steer business decisions

Incorporating an intelligent data fabric into a data architecture can provide a comprehensive, real-time operational “cockpit” to the business. 

To see the value of this in practice, consider flying a plane, a scenario in which pilots need to synthesise a variety of data to fly safely. Thanks to advances in technology, all the signals a pilot needs are now combined, analysed in real time and presented on a single display, with alerts that can predict the risk of incidents and suggest corrective actions, without requiring the pilot to manually interpret different signals from the various parts of the plane. In times of crisis, such as an imminent stall, these capabilities become critically important.

Similarly, businesses today want this same capability: to filter out the data that isn't important and bring to the surface the information that is, especially information created by combining underlying data points from different sources. These capabilities can steer the business in normal times and become critically important in times of crisis, as we are seeing now.

Data fabrics are fuelling the future of data

With businesses demanding more from the growing volumes of batch and real-time data available to them, to gain greater efficiency and deliver value to customers, the value that modern intelligent data fabrics provide is clear. Not only do they offer businesses the opportunity to get more from their data, they also enable them to do so without replacing their existing technology infrastructure.

This ability to leverage more data in the moment gives organisations the vital capabilities they need to make better, data-driven business decisions and, in turn, to respond faster and better to crises while increasing revenue and reducing risk, especially during periods of disruption and volatility.

Jeff Fried builds strategies and products that make this kind of transformation possible, and has been doing so in partnership with many of the world's leading organisations for more than four decades. He is currently the Director of Product Management for InterSystems.
