Aberdeen: Business Intelligence technologies to handle Big Data

The continued and rapid expansion of corporate data assets has been well documented and discussed within the business analytics industry and the analyst community. However, realistic solutions to address this mounting challenge have not been as forthcoming.

A recent benchmark report by the Aberdeen Group – Big Data, Big Moves – offers advice on the technologies and processes best able to address the challenge of Big Data. Crucially, the report found that organizations with larger data volumes are far more likely to require higher data velocity (the speed with which data is collated and delivered in consumable form to decision-makers).

Defining Big Data

For the purpose of this discussion, Big Data has been defined as: “The overall volume of active data an organization stores as well as the size of the data sets it uses for its business intelligence and analysis”.

Common Big Data drivers and challenges

Aberdeen research from June 2011 – Future Integration Needs: Embracing Complex Data – identified the biggest drivers and challenges for companies attempting to derive value from Big Data (respondent organizations held data sets ranging from 500 gigabytes to over 20 terabytes) as:

  • Increasing demand for management information (69%)
  • New analytic needs not well suited to the existing data warehouse (67%)
  • Growing volumes of source data (41%)
  • Rapidly changing business needs require different management information (31%)

Greater velocity demands on organizations dealing with Big Data

The June 2011 survey also revealed that organizations with larger data volumes are required to analyze and report on that data in shorter timeframes. Specifically:

  • Seventy-eight percent of organizations dealing with Big Data are required to deliver reports within 24 hours of a request being made, with over a quarter (27%) expected to deliver reporting and analytics at near real-time speed.
  • In stark contrast, only 50% of companies with small data sets (respondent organizations with data sets of less than 10 gigabytes) are required to deliver reports within a day, and only 10% are expected to deliver real-time analysis.

Methods for analyzing Big Data and delivering reporting from Big Data at speed

Achieving high data velocity on Big Data has traditionally been both hardware- and labor-intensive. Aberdeen’s June report lists the most common methods for achieving high-speed analysis and reporting on Big Data as (a minimal sketch of these methods working together follows the list):

  • Data integration tools (85%)
  • Data warehouse / mart (84%)
  • Data cleansing tools (58%)
  • Fully automated updating of BI data (56%)
  • Massively parallel processing hardware (32%)
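
To make these methods concrete, here is a minimal sketch in Python of the first four working together: a hypothetical orders extract is integrated, cleansed, and loaded into a warehouse table on each run. The table and column names are invented for illustration, and pandas plus SQLite stand in for the dedicated integration and warehouse tooling the report has in mind; none of this code comes from Aberdeen.

    import sqlite3

    import pandas as pd

    # Hypothetical extract from a source system; in practice a data
    # integration tool would pull this from operational databases.
    orders = pd.DataFrame({
        "order_id": [101, 102, 102, 103],
        "customer": ["Acme", "Globex", "Globex", "Initech"],
        "amount": [250.0, 99.5, 99.5, None],
    })

    # Data cleansing: drop duplicate records and rows missing a key measure.
    clean = orders.drop_duplicates().dropna(subset=["amount"])

    # Load into a warehouse table (SQLite standing in for a data
    # warehouse / mart). Rerunning this script on a schedule approximates
    # the "fully automated updating of BI data" the report describes.
    with sqlite3.connect("warehouse.db") as conn:
        clean.to_sql("fact_orders", conn, if_exists="replace", index=False)

Run on a schedule, a pipeline along these lines keeps the reporting layer current without manual effort, which is precisely the velocity problem the surveyed organizations are trying to solve.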

Solution: Delivering reporting and analytics from Big Data faster and more easily

Results from Aberdeen’s June 2011 survey of 190 organizations – Aberdeen Business Review – revealed that companies dealing with Big Data were more likely to have, and to require, self-service BI capabilities that put up-to-date information in the hands of decision-makers.

Where to next?

The Aberdeen Business Review asked survey participants to indicate which analytical technologies they planned to adopt within the next 24 months to give organizational decision-makers anytime, anywhere self-service access to reporting and analytics, and recorded which functionality was most frequently slated for adoption.

Recommendations to achieve high velocity on Big Data

Aberdeen’s Big Data, Big Moves report concludes by suggesting that organizations looking to leverage Big Data sets and deliver reporting and analytics to decision-makers in near real-time should (a toy illustration of the first point follows the list):

  • Invest in BI solutions that empower business users with interactive and ad-hoc analysis
  • Invest in BI solutions that enable real-time mobile access to reporting and analytics
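
As a rough illustration of the first recommendation, the sketch below lets a business user aggregate the hypothetical fact_orders table from the earlier example along any dimension they choose, without waiting for a new report to be built. The function name and query shape are assumptions for illustration, not features of any particular BI product.

    import sqlite3

    import pandas as pd

    def adhoc_report(metric: str, dimension: str) -> pd.DataFrame:
        # Build a simple GROUP BY aggregate from user-chosen column names.
        # (A toy only: real self-service BI tools validate such inputs
        # instead of interpolating them straight into SQL.)
        query = (
            f"SELECT {dimension}, SUM({metric}) AS total "
            f"FROM fact_orders GROUP BY {dimension}"
        )
        with sqlite3.connect("warehouse.db") as conn:
            return pd.read_sql(query, conn)

    # The same warehouse data, sliced on demand:
    print(adhoc_report("amount", "customer"))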