Is there such a thing as too much analytics?

Yellowfin’s Managing Director for EMEA, Peter Baxter, says that too many organizations have leapt at the chance to put the power of reporting and analytics in the hands of the individual, creating chaos as a result. Rather than attempting to turn everyone into an individual data scientist, organizations should aim to share BI-based insights broadly while ensuring BI content and access are governed by experts. The latter approach, Baxter argues, will ensure trustworthy data and analysis on which the enterprise can rely.

This article originally appeared on channelnomics.eu.

Making BI pervasive – but at what cost?

The prevalence of Business Intelligence (BI) and analytics technology has grown rapidly over the past decade, with Gartner predicting a compound annual growth rate (CAGR) of 8.7% through to 2018.

The benefits of an effective BI program include increased competitiveness, more efficient operations and improved decision-making. But, with BI being applied to almost every area of business, is there such a thing as too much analytics?

Well, in certain circumstances the answer is definitely yes – or, more precisely, certain approaches to analytics are far too commonplace.

Reacting to the restrictive nature of traditional BI – the wrong way…

BI deployments can be, and traditionally have been, highly complex endeavors that sap resources and don’t always deliver the benefits companies expect. For example, more traditional BI tools often require businesses to accept that highly technical analysts will spend large amounts of time sifting through and preparing data, as well as controlling access to – and development of – BI content requested by the business. This time-consuming process means insights can take months to reach the people who really need them – the business users and decision-makers. Such an approach to analytics is unjustifiably resource intensive and slow. The BI industry, and the organizations implementing analytics projects, should place greater emphasis on the needs of business users, as well as on the ability to ‘publish’ and share BI content quickly, simply and effectively.

Today, many implementers and technology developers have recognized the need to give business users broader, faster access to the insights generated by BI software. As a result, there is now a trend towards ‘self-service’ Data Discovery – putting data analysis tools in the hands of non-technical analysts. Everyone will be a data scientist in the future, or so the theory goes…

Throwing the baby out with the bathwater: Forgetting the lessons of governance

Well, at Yellowfin, we don’t think that’s likely to be the case. In fact, we think this latest approach to BI can simply create more unwanted analytics: untrustworthy, insecure and disparate mounds of data. Allowing multiple users at department level to work with datasets in isolation on the desktop is a dangerous development. It leads to siloed and ungoverned data, ultimately repeating the mistakes of the Excel era, where different versions of the truth proliferate across an organization.

Sure, you get greater analytics accessibility compared to traditional approaches but, if the output can’t be trusted, then what’s the point? And besides, how many sales managers have you met who yearn to transform themselves into data analysts and spend their time creating BI dashboards? I’d suggest not many. It’s not their job, and it’s certainly not something on which their performance is judged. They, like many other business users, just want to quickly consume BI-generated insights to perform better in their existing role.

Strange bedfellows: Good governance and pervasive BI

If companies want to empower more people with BI insights, the answer isn’t to turn everyone into an isolated data scientist. Rather, the goal should be to make BI content and insights – data visualizations, charts, dashboards, Storyboards and analysis – easy for business users to consume and share. But, to avoid creating more untrustworthy islands of analytics, as seen in Excel-dominated BI environments, BI deployments and the insights they produce also need to be tightly governed by IT and appropriately skilled data experts.

The opportunity in today’s BI market is centered on the need to bring the power of BI insights to ever-greater numbers of people. For example, understanding up-to-date budgets and expenditure enables healthcare workers to allocate resources to deliver effective and efficient patient care. Similarly, in retail, providing access to real-time sales figures allows frontline managers to optimize product promotions to boost sluggish sales, adjust in-store merchandising strategies and keep abreast of stock levels.

There is a chance to empower a more operational tier of decision-maker – to bring the benefits of BI to decision-makers across all job functions, throughout the enterprise, and to organizations of all sizes. However, the approach companies take to this ‘democratization’ is critical. I encourage everyone to scrutinize propositions that forego the data governance best practices enterprises have refined over many years – no matter how impactful the visualizations might be. Beware the seduction of the desktop-based approach. Equally, the speed of modern business means the traditional IT-dominated approach to BI is also unsatisfactory. And SMEs are demanding access to enterprise-quality BI – minus the price tag, lag and intensive IT resource requirements.

So yes – there’s a lot of analytics out there that fails to meet governance requirements and business users’ demand for timely access to information. The answer? Organizations should pursue BI that enables IT to provide adequate information governance (rather than deliberately cutting technical experts out of the loop), while giving business users the ability to directly access, explore, share and act on validated, trustworthy BI insights.