Learn about the latest developments in data analytics and how they can transform your business

Yogi Schulz

Data analytics developments, including AI-powered analytics, that we’ll see in 2024 offer significant benefits for expanding our knowledge and making better data-driven decisions.

Data analytics examines, cleans, transforms, and interprets data to uncover valuable insights to advance the organization’s business plan.
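
As a concrete, deliberately simplified illustration, that examine-clean-transform-interpret cycle might look like the following in Python with pandas; the file name and columns are hypothetical:

```python
# A minimal sketch of the examine-clean-transform-interpret cycle
# using pandas; sales.csv and its columns are hypothetical.
import pandas as pd

df = pd.read_csv("sales.csv")                 # examine: load the raw data
df = df.dropna(subset=["amount"])             # clean: drop rows missing the amount
df["month"] = pd.to_datetime(df["date"]).dt.to_period("M")  # transform: derive month
monthly = df.groupby("month")["amount"].sum()               # interpret: monthly totals
print(monthly)
```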

Data analytics is a technology that has grown more capable over many years of development. It has become more prominent in recent years as organizations have embarked on digital transformation, one benefit of which is the creation of more digital data to analyze.

Here’s what’s in store for data analytics in 2024.

  1. Predictive analytics

Predictive analytics will gain prominence in 2024. It has been discussed for many years but has frequently been hampered by data and software limitations, both of which are fortunately declining.

Predictive analytics creates forecasts based on historical data and models to identify future trends and events. Businesses use predictive analytics to optimize operations, anticipate customer demands, and make data-driven investment decisions. It’s a game-changer for staying ahead in competitive markets.
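
As a deliberately simple sketch of the idea, the following fits a trend line to twelve months of hypothetical demand with scikit-learn and projects the next month; real predictive models are considerably more elaborate:

```python
# A minimal predictive-analytics sketch: fit a trend on historical
# monthly demand and forecast the next period. Data is invented.
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 13).reshape(-1, 1)           # 12 months of history
demand = np.array([110, 115, 120, 118, 125, 130,
                   128, 135, 140, 138, 145, 150])  # hypothetical unit demand

model = LinearRegression().fit(months, demand)
forecast = model.predict(np.array([[13]]))         # predict month 13
print(f"Forecast for next month: {forecast[0]:.0f} units")
```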

  2. Generative AI

In 2023, OpenAI and the internet giants generated enormous buzz around the use of generative AI in many business functions.

Generative AI creates new content and data in response to natural language text queries. Generative AI relies on a vast store of information encoded in a large language model (LLM) to respond to queries.

In 2024, more organizations will use generative AI for data analysis to address the thorny problem of extracting value and insights from unstructured data. The problems include:

  • Data analytics tools have always been designed to query only structured data.
  • Using search tools has been hampered by difficulties accessing diverse PDF datastores and uneven metadata.
  • Data analytics tools can’t query images, audio, and video.
  • A significant portion of the unstructured data is still stored as paper.

Because generative AI is trained on structured and unstructured data, it can respond to queries by evaluating an amalgam of both.
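
A minimal sketch of this kind of query, assuming the OpenAI Python client, an API key in the environment, and an illustrative model name and contract file:

```python
# A hedged sketch of using a generative AI API to extract insight from
# unstructured text. Assumes the openai Python client (v1+) and an
# OPENAI_API_KEY in the environment; the model name and file are illustrative.
from openai import OpenAI

client = OpenAI()
contract_text = open("supplier_contract.txt").read()  # hypothetical document

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You extract key terms from contracts."},
        {"role": "user", "content": "Summarize the renewal and termination "
                                    f"clauses in this contract:\n{contract_text}"},
    ],
)
print(response.choices[0].message.content)
```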

  3. Data analytics vs. data science

The line between data analytics and data science will continue to become even murkier in 2024. Data analytics will become more sophisticated, and data science will become easier to understand. Both trends result from more sophisticated software that successfully hides complexity to make developers and data analysts more capable and productive.

  4. Data lakes are a dead end

In 2024, more organizations will realize that data lakes do not deliver value. The data lake is a concept that takes the pressure off IT departments to deliver data analytics applications that are more complex and costly than project sponsors can afford or are willing to pay for. IT departments have grown tired of explaining project difficulties and being perceived as incompetent, lazy or goldbricking.

With a data lake, the IT department can tell management and end-users: “We’ve brought all the data you need together for easy access. Our work is done. Please use the excellent end-user tools we’ve installed to quickly build the data analytics deliverables you need.”

By populating the data lake without any data transformation or integration, the IT department has cleverly skirted responsibility for all the problematic issues that lead to failed data analytics projects. These include:

  • Data quality lapses that result in low-confidence reports and charts.
  • Inability to integrate incompatible data structures.
  • Missing data.
  • Difficulty integrating internal with external data.
  • Insufficient end-user expertise to describe complex requirements.
  • Project cost and schedule overruns.

Organizations stuck with a no-value data lake should upgrade it to a data lakehouse, as discussed below.

  5. Data lakehouses

In 2024, more organizations will recognize that data lakehouses are a practical compromise between low-value data lakes and high-cost data warehouses.

Data lakehouses combine the low operating cost of data lakes with data warehouses’ data management and structure features on one platform. The difference between data lakehouses and data warehouses is how and how much data integration has been developed.

In data lakehouses, data integration is developed only for the small number of data sources from which views, queries, reports and dashboards are generated. In data warehouses, all the data from the various data sources is transformed and loaded into one schema, which encompasses the data integration.

In data lakehouses, the data integration proceeds slowly and partially as needed at a low cost. Business value is quickly and frequently evident. In data warehouses, all the data transformation, loading, and integration are significant upfront costs incurred before any value is delivered.
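
A minimal sketch of that incremental style, assuming DuckDB and hypothetical Parquet paths in the lake: the integration is expressed only in the query a given report needs, not in an upfront schema load:

```python
# A lakehouse-style sketch: DuckDB queries Parquet files in the lake in
# place, so integration is built only for the sources one report needs.
# Paths and columns are hypothetical.
import duckdb

result = duckdb.sql("""
    SELECT c.segment, SUM(o.amount) AS revenue
    FROM 'lake/orders/*.parquet' AS o
    JOIN 'lake/customers/*.parquet' AS c
      ON o.customer_id = c.customer_id
    GROUP BY c.segment
    ORDER BY revenue DESC
""").df()
print(result)
```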

  6. Data warehouses

In 2024, no organization will start building a data warehouse. The age of data warehouses has come and gone. Beautifully structured and curated data warehouses are too expensive to build and operate.

Instead, organizations will:

  • Leverage advanced data integration tools, an approach often called data virtualization, to create the illusion of superior data integration even though the underlying data remains in its original data store.
  • Build and operate data lakehouses.

  7. Ethical considerations

In 2024, as data analytics continues to become more prevalent, privacy concerns and ethical considerations will become more prominent. Data breaches and misuse of personal information will lead to a stronger emphasis on responsible data handling and compliance with regulations and ethical guidelines. The goal is to protect privacy and shield organizations from lawsuits.

  8. Graph analytics

In 2024, the growing utilization of graph databases will lead to more use of graph analytics.

A NoSQL graph database stores data in nodes and relationships instead of tables or documents. Unlike a relational database, a graph database stores relationships explicitly. This feature dramatically accelerates the performance of complex queries.

Graph analytics is superior for uncovering hidden patterns, making predictions, and gaining insights into complex data structures.
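
A small sketch of the idea using the networkx library, with invented people and ties: because relationships are stored explicitly as edges, traversals and centrality measures are direct:

```python
# A minimal graph-analytics sketch with networkx; the people and
# relationships are invented for illustration.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("Alice", "Bob"), ("Bob", "Carol"),
    ("Carol", "Dave"), ("Alice", "Dave"), ("Dave", "Eve"),
])

# Centrality surfaces influential nodes that are hidden in tabular form.
ranks = nx.pagerank(g)
print(sorted(ranks.items(), key=lambda kv: -kv[1]))

# Explicit relationships make multi-hop path queries cheap.
print(nx.shortest_path(g, "Alice", "Eve"))
```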

  9. Data democratization

Data democratization intends to make data and analytics accessible to a broader audience within an organization. It’s not a new concept. It’s been a gleam in the eye of software marketers eager to license more software and of ambitious end-users who want to develop their own analytic queries without help from a developer.

In 2024, more data analytics tools and platforms will offer a user-friendly interface and more sophisticated data manipulation features, allowing non-technical end-users to access and analyze data independently.

While the data analytics tools are impressive, the impediments to broader adoption include:

  • The absence of a sufficient number of pre-defined views of the available data stores to simplify query development.
  • Insufficient end-user understanding of the available data stores’ data structures and column usage.
  • Insufficient end-user skill in using the available data analytics tools.

Sometimes, organizations claim they offer self-service analytics solutions without addressing these impediments.

  10. Data observability

Data observability improves the ability of organizations to monitor, track, and ensure the quality, reliability, and performance of data throughout its lifecycle. Performing these tasks has historically involved much drudgery and produced uneven outcomes.

In 2024, with organizations relying more on data analytics as they push toward data-driven decision-making, access to high-quality data has become essential. More organizations will implement comprehensive data observability that includes (see the sketch after this list):

  • Monitoring data quality and making corrections.
  • Understanding data lineage and ensuring traceability.
  • Performing data governance and metadata management.
  • Sustaining continuous improvement.
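
A bare-bones sketch of such monitoring in pandas, with a hypothetical table, columns, and freshness threshold; dedicated observability platforms automate far more than this:

```python
# Minimal data observability checks in pandas: validate completeness,
# value ranges, and freshness on each pipeline run. The table, columns,
# and thresholds are hypothetical.
import pandas as pd

df = pd.read_parquet("warehouse/daily_sales.parquet")

checks = {
    "no_missing_keys": df["order_id"].notna().all(),
    "amounts_positive": (df["amount"] > 0).all(),
    "fresh_data": pd.to_datetime(df["order_date"]).max()
                  >= pd.Timestamp.today().normalize() - pd.Timedelta(days=1),
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```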

Yogi Schulz has over 40 years of information technology experience in various industries. Yogi works extensively in the petroleum industry. He manages projects that arise from changes in business requirements, the need to leverage technology opportunities, and mergers. His specialties include IT strategy, web strategy and project management.


