Overcoming data insight limitations with a data lakehouse

29 March 2023 Consultancy.com.au

In an increasingly digital business world, being able to gain accurate insights from large volumes of data has never been more important. Yet at the same time, a range of factors can limit the process, writes Rafi Katanasho from Dynatrace.

Leading organisations are making use of increasingly powerful tools to analyse everything from customer and transactional records to supply chains and delivery networks. They realise that such analysis can reveal trends and opportunities that would otherwise have gone unnoticed.

The sources of this data are many and varied. They can include real-time application streams, user session data, customer relationship management (CRM) platforms and call-centre solutions. Combining data from multiple sources can increase its value exponentially.

However, while data analysis tools can deliver powerful benefits, they can also create complexity. As more tools are deployed, demand grows for common tooling, shared data, and democratised access.

Real-time insights

Tool and data complexity can also make it more difficult to achieve the real-time insights that are of most value to a growing business. Many organisations find they cannot unlock as much value from their data as they would like. Five of the most significant factors that limit insights are:

Data silos
Data spread across multiple locations and platforms makes analysis much more difficult. To achieve effective insights, data should be centralised so that the various teams who require it have ready access.

Fragile integrations
Unfortunately, a large proportion of the data on which business analysts rely is stored in proprietary systems. This makes it difficult to access and means it often becomes stale. To make the data more accessible, many organisations find they need to invest in complex and fragile integrations that are difficult to maintain.
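
To illustrate the fragility, a point-to-point integration typically hard-codes assumptions about the source system, so a small upstream change can silently break the pipeline. The following Python sketch is a caricature; the export file and column names are hypothetical:

```python
import csv

# Hypothetical nightly CSV export from a proprietary CRM.
EXPORT_PATH = "crm_export.csv"

def load_customers(path: str) -> list[dict]:
    # Column names are hard-coded: if the vendor renames a field,
    # reorders the layout, or changes the date format, this raises
    # KeyError and the nightly job fails until someone patches it.
    with open(path, newline="") as f:
        return [
            {"id": r["CUST_ID"], "name": r["CUST_NAME"], "region": r["REGION_CD"]}
            for r in csv.DictReader(f)
        ]
```

Multiply this pattern across every source system and report, and the maintenance burden grows quickly.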

A lack of real-time context
To gain business insights without needing to build complex integrations, many organisations turn to business intelligence tools such as SAP Business Objects and Datapine. However, while these tools can provide reporting, analysis, and interactive data visualisation, they analyse data asynchronously rather than in real time.

These tools can also lack awareness of the systems and processes responsible for the business data, leaving out vital details that teams need to understand its relevance and business impact.

Rising storage costs
The rapidly increasing volume of data in cloud environments can quickly become costly to store. And as an organisation starts to track additional key performance indicators, data volume grows further, which can slow down searches.

To keep searches manageable, teams rely on database indexing. While this can improve performance, indexes can be cumbersome to maintain and can limit flexibility for future exploratory projects.
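
As a small illustration of this trade-off, the sketch below uses Python's built-in sqlite3 module with an illustrative table: an index accelerates the query it was built for, while an exploratory query on another column still scans every row, and each additional index adds write and maintenance overhead.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (ts INTEGER, kpi TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(t, "latency_ms", float(t % 500)) for t in range(100_000)],
)

# Index tailored to one known access pattern: lookups by KPI name.
conn.execute("CREATE INDEX idx_events_kpi ON events (kpi)")

# This query can use the index...
fast = conn.execute(
    "SELECT avg(value) FROM events WHERE kpi = ?", ("latency_ms",)
).fetchone()

# ...but an exploratory query on an unindexed column scans the table.
# Adding an index for every new question slows inserts and leaves more
# structures to maintain as the workload evolves.
slow = conn.execute("SELECT count(*) FROM events WHERE value > 450").fetchone()
```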

Frustrating search constraints
To analyse long-term trends using traditional methods, teams must rehydrate data that resides in so-called cold storage. Reinstating large volumes of historical data, however, consumes time and resources and bogs down queries. As data volumes grow, these factors become increasingly challenging and costly.
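
For example, where the cold tier is an archive class of object storage such as Amazon S3 Glacier, objects must be explicitly restored before they can be read at all; a standard restore typically takes hours and is billed per retrieval. A minimal boto3 sketch, with hypothetical bucket and object names:

```python
import boto3

s3 = boto3.client("s3")

# Request rehydration of one archived object. Until the restore
# completes (typically hours at the Standard tier), queries that need
# this data are blocked, and the retrieval itself incurs cost.
s3.restore_object(
    Bucket="example-analytics-archive",       # hypothetical bucket
    Key="logs/2021/06/transactions.json.gz",  # hypothetical object
    RestoreRequest={
        "Days": 2,  # how long the rehydrated copy remains readable
        "GlacierJobParameters": {"Tier": "Standard"},
    },
)
```

Repeat this across months of history for every long-term question and the approach quickly becomes slow and expensive.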

The benefits of a data ‘lakehouse’ strategy

To overcome these challenges and limitations, an increasing number of organisations are opting to create a data ‘lakehouse’ as a way to improve both analytical efficiency and observability. This approach combines the benefits of a data warehouse with those of a data lake.

The strategy delivers rich data management and analytics on top of low-cost cloud storage. A data lakehouse removes the need to decide upfront how much data to ingest, where to keep it, and which types to retain and analyse.
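
One way to picture this: lakehouse engines typically query open file formats in place, so raw data can land in low-cost object storage first and the questions can be decided later. A minimal sketch using the DuckDB Python package over Parquet files, with hypothetical paths and columns:

```python
import duckdb

# Schema-on-read: the raw Parquet files stay in cheap storage and are
# queried directly; nothing had to be modelled or ingested upfront.
result = duckdb.sql(
    """
    SELECT region, count(*) AS orders, sum(amount) AS revenue
    FROM read_parquet('lake/orders/*.parquet')  -- hypothetical path
    WHERE order_date >= DATE '2023-01-01'
    GROUP BY region
    ORDER BY revenue DESC
    """
).fetchall()
```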

Also, a data lakehouse that has been purpose-built for business, observability, and security data provides the most effective way to store, contextualise, and query data from many different channels. This efficiency gives teams immediate, actionable insights and drives automation.

Ultimately, a data lakehouse can unlock the true value of business data by centralising it for contextual analysis. As a result, business and IT teams can enhance collaboration and improve agility by using real-time insights to make informed business decisions.

Although the rising tide of business data can bring significant challenges, it also offers massive opportunities. By taking advantage of strategies such as the creation of a data lakehouse, these challenges can be overcome and real value unlocked.

About the author: Rafi Katanasho is APAC Chief Technology Officer and Solution Sales Vice President at Dynatrace.