Not convinced about this whole Big Data thing? You’re not alone.

I’m not convinced. And it seems like I’m not the only one.

We know that data volumes are increasing – and doing so more rapidly than ever before. We also know that this growth applies to existing as well as new data types and sources. And we know that the pressure to unearth fact-based insights, and to act on them in ever shorter timeframes, is rising.

So you’d think that all this Big Data talk was justified. And it probably is in certain circumstances. But broadly speaking, it seems that the hysteria surrounding the Big Data ‘phenomenon’ is inhibiting our ability to take better, faster, smarter actions from our data. In many instances, commentators, vendors and even so-called industry experts have been seduced by Big Data’s allure.

The result? Pointless, misinformed tongue-wagging and idolization, which has led to mass confusion around Big Data’s actual usefulness.

Where’s your Big Data at?

A recent survey by analyst firm the Business Application Research Center (BARC) and IT services provider Steria found that only seven percent of European companies considered Big Data “very relevant” to their business.

Why? Because many organizations simply don’t have the volume, variety or velocity of data to justify framing their data projects as a Big Data challenge.

To emphasize the point, the most recent edition of BARC’s The BI Survey – the world’s largest survey of global Business Intelligence (BI) software users – found that the median input data volume for BI programs was around seven gigabytes.

So who does use Big Data?

So does this mean that the Big Data concept has about as much substance as American television ‘personality’ Joan Rivers? Not quite. And besides, despite all the hoopla, it’s not nearly as annoying.

Perhaps unsurprisingly, evidence indicates that true Big Data issues and opportunities lie predominantly with very large organizations – institutions that actually produce and consume truly large and complex data. And, most importantly, such organizations can also hope to realistically achieve ROI from capturing and analyzing data at a greatly increased volume, variety and velocity.

New Vantage Partners’ Big Data Executive Survey 2013: The State of Big Data in the Large Corporate World asked financial services and healthcare providers within the Fortune 1000 – those organizations known to invest most heavily in data initiatives of scale – about the adoption of Big Data strategies and solutions. The survey suggested that, among this group of businesses, there had been “a surge of progress in Big Data utilization, sponsorship, investment, implementation, and value, fueling a rapid rate of adoption”.

The survey defined the term ‘Big Data’ as new database management and analytical approaches developed for analyzing, storing, and manipulating collections of data so large or complex that they require rapid processing and/or have become too difficult to work with using ‘standard’ database management and analytic solutions.

The executive summary of the report states: “The 2013 survey highlights that a robust 91% of executives responded that their organization has a Big Data initiative planned or in progress. Of these, 60% of executives report that at least one Big Data initiative has been implemented, with 32% of executives reporting that Big Data initiatives are fully operational, in production or operationalized across the corporation.”

In terms of investment, the report found that 68% of respondent organizations plan on spending more than $1MM on Big Data projects in 2013. Over time, that figure is expected to rise, with 88% looking to spend at least $1MM by 2016.

At the top end of the scale, 50% of Fortune 1000 organizations said they intended to invest $10MM or more in Big Data initiatives by 2016, up from 19% in 2013. The percentage investing in “large scale” Big Data projects – those costing in excess of $50MM – is forecast to rise from 6% in 2013 to 14% by 2016.

So it’s clear that very large organizations see Big Data, or rather the ability to leverage it, as a mechanism to streamline business operations and exploit previously unexplored business opportunities.

But the paybacks of Big Data shouldn’t necessarily be limited to the Fortune 1000. Other organizations, too, can benefit from analyzing more data types in greater volumes. It just seems that hype and confusion are holding many back.

Big Data hype and confusion

The aforementioned BARC–Steria report, compiled from a user survey of 668 organizations across 20 European countries between November 2012 and January 2013, also revealed that strategies for successfully applying Big Data to specific business goals have stagnated. These findings align with recent Gartner research, which suggested that extensive confusion among potential purchasers – regarding the definitions of, and differences between, the terms ‘Big Data’, ‘BI’ and ‘analytics’ – was “blunting BI spend”. Why is a reduction in investment in BI technologies detrimental to the Big Data movement? I’ll explain in a moment…

The other aspect of this “stagnation” issue is that many sections of the business analytics industry have overstated Big Data’s “revolutionary” potential.

The problem with insinuating that Big Data is revolutionary is that the premise distracts organizations from its true worth. As Yellowfin CEO Glen Rabie remarked in a soon-to-be-published piece for analyst firm Radiant Advisors, the revolutionary claim leads businesses “on dubious expeditions to uncover revolutionary ways in which to apply Big Data to their strategies and operations. Organizations should, however, focus their attention on how Big Data projects can be applied to current business challenges.”

Besides, using the term “revolution” implies sudden change. ‘Big Data’, or the concept of Big Data, has not suddenly appeared. As David Torres recently wrote in his piece Big Data: Is the Hype Over Yet? on the BeyeNETWORK Expert Channel, “we’ve always been able to create more data than we could store and process”. It’s likely that we’ll always produce more information than we can effectively and efficiently capture and analyze. This is not a new notion.

Writing in Data Communications back in 1986, Hal B. Becker asked: “Can users really absorb data at today’s rates?” Samuel Arbesman – a senior scholar at the Ewing Marion Kauffman Foundation – takes it further, arguing: “Vast linguistic datasets, for example, go back nearly 800 years. Early biblical concordances — alphabetical indexes of words in the Bible, along with their context — allowed for some of the same types of analyses found in modern-day textual data-crunching.”

Big Data is relative to context (time and place) and the available capacity to leverage it. Remember, the digital storage of data only became more cost-effective than paper-based systems in the mid-1990s – 1996, according to Morris and Truskowski’s “The Evolution of Storage Systems”.

Taking action on Big Data: Big Data analytics

But most importantly, hype and confusion have distracted organizations from the role that BI and analytics play in relation to Big Data. Reporting and analytics capabilities enable you to actually analyze and extract value from Big Data. And isn’t that the whole point!?

According to Wikibon, strong Big Data interest will increasingly turn into action over the next few years. Wikibon predicted that, if current growth rates were sustained, the global Big Data market would exceed $47 billion by 2017 – a compound annual growth rate (CAGR) of 31% across 2012–2017. The catch? This forecast rests on one key assumption: a better understanding of Big Data use cases leading to definitive ROI. That is: what are you using Big Data to achieve, and how?
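
For those wanting to put that 31% figure in concrete terms, here’s a minimal Python sketch that works backwards from the two numbers quoted above – the $47 billion 2017 endpoint and the 31% CAGR. The 2012 base figure it prints is derived for illustration only; it is not a number quoted from the Wikibon report.

```python
# Sanity check on the Wikibon projection: given the reported 2017
# endpoint ($47B) and a 31% CAGR over five years, derive the implied
# 2012 base and the year-by-year trajectory. The base is calculated
# here for illustration; it is not a figure quoted by Wikibon.

END_VALUE_B = 47.0  # projected 2017 market size, in $ billions
CAGR = 0.31         # compound annual growth rate
YEARS = 5           # 2012 -> 2017

implied_base = END_VALUE_B / (1 + CAGR) ** YEARS
print(f"Implied 2012 base: ${implied_base:.1f}B")

for year in range(2012, 2018):
    value = implied_base * (1 + CAGR) ** (year - 2012)
    print(f"{year}: ${value:.1f}B")
```

The sketch simply inverts the standard CAGR formula – end value = base × (1 + CAGR) ^ years – which puts the implied 2012 market at roughly $12 billion.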

Simply marveling at piles of information in some sort of transfixed men-who-stare-at-goats gaze-off is valueless.

Crucially, the Big Data Executive Survey 2013 also noted the need for Big Data action through analytics, as opposed to just Big Data management. Respondents to the 2012 version of the study emphasized the ability to integrate larger data volumes, and a greater variety of data, as critical Big Data capabilities. In 2013, respondents said that analytical velocity – the speed with which organizations can obtain answers to critical business questions – was just as important as the ability to integrate more data sources and analyze larger data volumes.

Recent research from Avanade found that 84% of organizations actively leveraging Big Data via analytics claim that, as a direct result of that investment, they can now make better decisions regarding current business processes.

Despite such findings, it seems likely that Big Data stagnation may prevail – for now. Every year, Gartner assesses the globe’s big technology buzzwords via its Hype Cycle report. Its 2013 Hype Cycle for Big Data found that the term ‘Big Data’ is now approaching the “peak of inflated expectations”. However, Gartner suggests that Big Data is still two to five years from reaching what it describes as the “plateau of productivity” – the phase where organizations finally discard embellishments and look to couple technology with exact business goals.

It’s not the size that matters; it’s how you use it…

Above all, it’s the way in which you extract value from your data that matters – not its size.

Too often, people directly associate Big Data’s value with its volume. The amount of data you can potentially collect is irrelevant. What matters is the amount of data you need to collect to support specific reporting and analytics objectives, which in turn support clearly defined business goals. After all, what’s the point of collecting data for the sake of it? Big Data’s value doesn’t reside in the data itself, or in its increasing volume. Value is achieved through the ability to extract and act on insights derived from that data, for direct or indirect (usually financial) gain. Likewise, requirements pertaining to data variety and velocity need to be assessed with specific business outcomes in mind.

As Arbesman put it: “Really big datasets can be a mess. Unless researchers and analysts can reduce the number of variables and make the data more manageable, they get quantity without a whole lot of quality. Give me some quality medium data over bad Big Data any day.”