
Building The New Narrative For Data ‘Storytelling’

This article is more than 2 years old.

There’s a lot of talk about data. The rapid acceleration of computer microprocessors, ever-cheaper access to data storage, the post-millennial growth of cloud computing and the development of increasingly complex algorithms gave us big data analytics… and the tech industry has been talking about it ever since.

Big data (which is, put simply, at least one tier bigger than what the older traditional relational database engines could handle) dominated much of what technology advocates, evangelists and futurists talked about in the first decade after the millennium.

But now the style of the discussion is evolving.

Big data became table stakes

Like many next-big-thing technologies, big data has become almost standardized: the kinds of workloads it represents are now expected to be shouldered by the industrialized cloud computing IT stacks we tap into every day.

When a technology has reached a certain level of penetration, adoption, augmentation, refinement and extension, it ceases to be regarded as a defined elemental tool or lever in the modern IT stack; instead, it becomes a sort of table-stakes utility component of our digital existence.

Over time, big data has edged towards being that kind of technology fundamental, so what matters now is what we do with data insights and how we combine machine ‘thought’ derived from Artificial Intelligence (AI) and its Machine Learning (ML) abilities with our own human intuition and decision-making processes.

As humans take data insights from our predictive analytics engines and attempt to apply directional vectors to business decisions, how will the big data conversation change?

Settle down for ‘data storytelling’

Historic approaches to data analytics rely on static dashboards and data visualization to identify, communicate and explore insights from complex business data. This is the kind of thing that data analysts and data scientists thrive on, but it obviously requires a certain level of data literacy. 

What comes next is what some in the industry are calling ‘data storytelling’ i.e. the practice of putting data analytics in the hands of less data-literate employees through narrative techniques. The goal is to enable a broader audience to interpret what is being evidenced in any given dataset. This process, where carried out effectively, is argued to enable so-called subject matter experts (typically business specialists who identify as non-techies) to identify and add context not immediately obvious in the data.

This technique is driving the current demand for data storytelling capabilities, as business analytics users specify new solutions.  

Geoff Sheppard, VP for EMEA at data specialist Yellowfin, argues that humans will always play a role in data storytelling, as they have an unmatched ability to add context and emotional intelligence that is not present in any lump of data.

“By automating the parts of the data storytelling process best suited to machine support, we help users become more efficient and make data analytics tools useful to a broader business user base. Augmented automated analytics will solve key challenges to empower business users and subject experts to discover and develop compelling data stories more effectively,” said Sheppard.

Yellowfin calls itself an analytics vendor that combines action-based dashboards, automated [data] discovery technologies and data storytelling practices. The firm’s 2021 white paper entitled ‘The Future of Data Storytelling: how narrative and automation will redefine the next decade of analytics’, concurs with recent analyst statements that suggest ‘data stories’ will be the most widespread way of consuming data analytics by 2025.

The intersection of storytelling & augmented analytics 

Yellowfin’s white paper examines the process through which augmented analytics in modern Business Intelligence (BI) tools are automating the data analysis part of the narrative process, thus making analysis more comprehensive and efficient. It also explores how technologies such as AI, natural language query and machine learning can help users to better understand what their data means.  

Yellowfin identifies three emerging challenges that automated and augmented data storytelling can potentially solve: 

  • Human bias: data storytelling relies on humans to spot anomalies and find them important enough to explore further, but levels of interest and diligence vary from person to person. By adopting AI and ML analysis of datasets and extending this with a storytelling module, helpful data-led narratives could be generated that might have been missed, overlooked or undervalued when created by people. 
  • Low data literacy: levels of data literacy vary, making self-service analytics solutions too complex for less able users. By automating common self-service BI processes, the need for high data literacy is eliminated and insights are presented in a digestible way for a broader user base. 
  • Scaling data storytelling across the business: as a human-led activity, scaling it across the business may be unrealistic. However, by extending automated business monitoring and analytics past alerting capabilities, data stories can be generated at scale. 
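The automation pattern the bullets describe, where a machine spots the anomaly and phrases it as a narrative headline for a person to contextualize, can be sketched in a few lines. This is a hypothetical illustration of the general technique, not Yellowfin’s implementation; the function name, sample data and z-score threshold are all invented for the example.

```python
from statistics import mean, stdev

def data_story_headlines(metric_name, values, threshold=2.0):
    """Flag outliers by z-score and phrase each one as a headline.

    A stand-in for the 'storytelling module' described above:
    the machine surfaces the anomaly, the human adds context later.
    """
    mu, sigma = mean(values), stdev(values)
    headlines = []
    for period, value in enumerate(values, start=1):
        z = (value - mu) / sigma
        if abs(z) >= threshold:
            direction = "spiked" if z > 0 else "dropped"
            headlines.append(
                f"Period {period}: {metric_name} {direction} to {value} "
                f"({z:+.1f} standard deviations from the mean)"
            )
    return headlines

# Invented sample data: eleven ordinary months and one clear spike.
sales = [100, 102, 98, 101, 99, 103, 97, 100, 180, 102, 98, 101]
for line in data_story_headlines("monthly sales", sales):
    print(line)
```

Even a crude statistical test like this removes the dependence on a diligent human happening to look at the right chart, which is the bias problem the first bullet describes.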

As with all enterprise technologies, the question of scaling is often one of the most crucial; we all saw how the world juddered when specific business streams had to be ramped up exponentially at the start of this decade’s pandemic.

Because storytelling with data is largely a manual, human-led technique, realistically scaling it across a business is an open question, especially because it’s still an analytical skill set many organizations are only now recognizing and prioritizing. In the meantime, analysts can’t be expected to always find and extract meaningful stories from big datasets in a timely fashion, and regular professionals may be too busy dealing with normal business operations to spend much time consuming and then communicating insights in narrative form for everyone’s benefit.

Automated continuous monitoring

If there is another key technology theme that needs to be incorporated here it is the need for continuous always-on computing services.

Yellowfin’s Sheppard points to his firm’s Automated Business Monitoring (ABM) product Signals, which is engineered to deliver what he calls, “Automated continuous monitoring that detects patterns or outliers in data, generating headline alerts to help users become aware of important discoveries.”
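The continuous-monitoring idea Sheppard describes, watching a metric as it arrives and raising a headline alert when it strays from its recent behaviour, can be sketched generically with a rolling window. To be clear, this is an assumed, minimal illustration of the pattern, not the Signals product or its API; the class, metric name and data feed are invented.

```python
from collections import deque
from statistics import mean, stdev

class MetricMonitor:
    """A minimal continuous-monitoring sketch: keep a rolling window
    of recent observations and raise a headline alert when a new
    value strays too far from the window's recent behaviour."""

    def __init__(self, name, window=30, threshold=3.0):
        self.name = name
        self.window = deque(maxlen=window)  # bounded recent history
        self.threshold = threshold

    def observe(self, value):
        alert = None
        if len(self.window) >= 5:  # need a little history first
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(value - mu) / sigma >= self.threshold:
                alert = (f"ALERT: {self.name} at {value}, "
                         f"well outside its recent range (mean {mu:.1f})")
        self.window.append(value)
        return alert

# Invented feed: steady latency readings with one obvious outlier.
monitor = MetricMonitor("checkout latency (ms)")
stream = [120, 118, 122, 121, 119, 120, 480, 121]
alerts = [a for a in (monitor.observe(v) for v in stream) if a]
```

The design point is that the machine does the always-on watching at scale, and only the headline reaches a person, who then decides whether there is a story worth telling.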

He concludes by saying that, as of today in 2021, AI’s capability to automatically generate augmented data stories with the level of emotion, relevance, context and narrative expertise that humans can provide is not yet a reality. If we accept the combination of these arguments and ideals, it appears clear that we need humans, but humans helped by computers are more powerful humans.

The overriding sentiment from the Yellowfin team is that humans will always be the drivers of data storytelling.

Algorithms just cannot create the rich, contextual narratives that come naturally to us. What they can do, however, is point the way, guiding and alerting us to points of interest that might be overlooked and prompting us to build more effective, engaging and valuable data stories. Now then, it’s time for bed.
