
Integrated data preparation: The missing piece of the pervasive BI puzzle?

What’s the secret to success – in any pursuit?

Do champion footballers become champions because of the brightness of their shoes? Do revered tennis players sweep their opponents off the court with the ferocity of their screeches? Is a vaunted guitarist’s ability to shred those strings directly related to the size of their swagger? Or, is the prose of an acclaimed novelist directly proportional to the size of the leather patches on their tweed jacket?

While watching success in action, it’s easy to be preoccupied by superfluous window dressing and bedazzled by the flamboyant outcomes. But, all that does is distract from what actually matters: The hard – often unglamorous – work that occurred behind the scenes to make it all happen. The practice and planning that went into making the outcome at the end of the journey a success. In short, the preparation.

And that’s the problem. While what’s ultimately delivered at the end of any process – from sport, to music or Business Intelligence (BI) – may be the inspiring and engaging part, actually undertaking the necessary preparation to achieve those results is vitally important. But, all too often, it just doesn’t demand attention – it’s not sexy.

See what (data) preparation can do for you
Attend the Webinar launch of Yellowfin 7.3 to see how introducing a fully integrated Data Preparation Module into Yellowfin’s BI platform empowers users to improve data governance, build better reports, and produce deeper insights faster. REGISTER HERE >

Stupid, sexy Business Intelligence

Shifting priorities
And, like most areas of work or play, the BI industry is not immune to the pursuit of sex appeal.

Since around 2010, BI has been the ‘it’ child of the enterprise software world. The 2016 edition of Gartner’s well-respected CIO Agenda Report listed ‘Business Intelligence and data analytics’ as the top technology priority for CIOs – for the fifth consecutive year. Similarly, TechTarget’s 2016 IT Priorities Survey revealed BI and analytics as the top software category, with more respondents listing BI initiatives as a priority (27.5%) than any other type of software. And, perhaps most tellingly, research firm Ovum’s 2015 survey of 582 retail banking organizations found that this ardently risk-averse sector was finally prioritizing BI and analytics projects (29%) ahead of security software (28%).

The sexification of BI
As the much discussed ‘consumerization of Business Intelligence’ gathers pace (both pushing and reacting to the shifting landscape of more business users and business use cases), our collective focus has been on end-results and end-users. That is, the visible end-point of the data analytics process. The identifiable tipping point for this trend was back in 2011 when, for the first time, respondents to Gartner’s Magic Quadrant for Business Intelligence and Analytics Platforms study rated ‘ease-of-use’ ahead of ‘functionality’ as the most important selection criterion when purchasing BI software. The associated Gartner Research Note – BI Platforms User Survey, 2011: Customers Rate their BI Platform Functionality – stated that: “Strong functionality is clearly no longer enough… Vocal, demanding and influential business users are increasingly driving BI purchasing decisions, most often choosing easier to use data discovery tools over traditional BI platforms — with or without IT’s consent.”

Since then, many other industry studies have revealed that technology providers and purchasers alike were heavily fixated on front-end, end-user-focused BI technologies. TDWI’s best practice report, Business-Driven Business Intelligence and Analytics, revealed dashboards and data visualization as the two chief technology priorities. Now, this trend does make sense, for two reasons: First, BI needed to become more accessible so that decision-makers could more readily benefit from using data analysis. Second, BI project owners also needed to garner executive buy-in to ensure their initiative had sufficient support to survive and prosper.

Joseph Ruben’s The Forgotten
Whether you’re a user or developer of BI technology, enhancing the BI experience for people who use and consume the output of reporting and analytics is one thing – a good thing at that. But, simultaneously neglecting the needs of the two other critical stakeholder groups involved in attaining BI success – data analysts and IT – is counterproductive.

Somewhat ironically, to deliver pervasive business-user-oriented BI deployments and maintain widespread user adoption, empowering data analysts and IT is of equal importance. If you don’t enable the behind-the-scenes processes – think data quality, data preparation and governance – the outputs will suffer. I mean really: who’s going to embrace a BI environment that can’t be trusted to deliver accurate, consistent insights to the right people at the right time?

The big reveal: User adoption
And, there’s a pretty big hint that – on the whole – technology implementers and vendors in the BI industry have forgotten the necessity of proactively supporting what happens behind the BI curtain.

Despite six or so years of feverish BI development since the dawn of the democratization of BI discussed earlier, user adoption rates have remained disappointingly low. According to TDWI research in 2008, only around 20 percent of potential BI users were actually using BI technology within the enterprise. Fast forward to 2016, and research data from The BI Survey 16 – the latest version of the largest annual survey of BI end-users in the world – revealed that the average user adoption rate hasn’t budged (17%, in fact).

Stupid, sexy Flanders
So it appears that, had the BI industry as a collective not followed Homer’s lead – becoming distracted by “stupid, sexy Flanders [BI]” instead of devoting sufficient attention to the behind-the-scenes preparation required for success – we might have seen some better outcomes by now.

Interestingly, it now seems that BI users – of all types – are becoming attuned to the need to remedy this problem.

A turning tide

Ensuring substance behind the sheen
Data gathered from BARC’s The BI Survey 16 found that the forward-facing BI competencies of data discovery and data visualization were rated as most important – but they tied with data quality and data management.

Analysis from the survey indicated that, while it’s been a battle to compete with more glamorous features that help to visualize the outputs of data analysis, even business users are now recognizing the need to ensure substance behind the sheen:

“Data discovery and visualization… are among the typical functions users want to consume in a self-service mode. However, an agreed data and tool governance framework is paramount to avoid losing control over data…

“organizations seem to be aware that the coolest looking dashboard is worth nothing if the data shown is all wrong.”

BI vendors have their heads in the sand
These findings become more intriguing when analyzed by stakeholder type.

Unsurprisingly, IT users rate issues of data management and data quality as more important than any other stakeholder group does (7.4). But, business users now also seem keenly aware that no matter how visually appealing, engaging or intuitive front-facing features might be, BI software needs to support data analysts and IT to deliver trustworthy, insightful results: “End-users recognize the need for data quality and master data management,” stated the report. “The reason for the high relevance of data quality and master data management is simple: people can only make the right decisions based on accurate data.”

Perhaps most telling is that, as a collective, BI vendors placed the least importance on the topic (6.0). Is it because many BI vendors believe that background tasks associated with data quality, management and governance don’t make for a compelling marketing message?

Whatever the case, BI software needs to support, not disown, the needs of IT users such as data analysts and system administrators in order to deliver successful business-user-oriented BI programs. After all, as findings from The BI Survey 16 concluded, outputs and insights achieved from business analytics are only beneficial and trustworthy if the data inputs underpinning them are solid: “Business intelligence does not make a lot of sense without comprehensive data integration and data quality initiatives”.

While BI capabilities such as data visualization, data storytelling and collaboration all play important roles in directly engaging non-technical users, functionality that enables data analysts and IT to efficiently integrate, prepare, profile and govern data is critical for facilitating long-term user adoption. No matter how intuitive, visually appealing or social BI becomes, no company is going to embrace pervasive, enterprise-wide data-driven decision-making if the numbers don’t add up.

The importance of data preparation

Out of all the considerations under the umbrella of data quality, management and governance, data preparation capabilities are vital for ensuring BI success. With a reliable data preparation framework built into a BI platform, data analysts can successfully integrate more data in less time. The flow-on effect is that data analysts can then use that framework to efficiently profile, clean, format, combine and enrich data from more sources, delivering deeper insights upon which business decision-makers can act sooner.
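To make those steps concrete, here is a minimal sketch of a profile–clean–format–combine–enrich flow, written in Python with pandas purely for illustration – the file names, columns and revenue bands are assumptions, not a description of Yellowfin’s Data Preparation Module:

    import pandas as pd

    # Two hypothetical source extracts sharing an account_id business key.
    crm = pd.read_csv("crm_accounts.csv")          # account_id, region, created
    finance = pd.read_csv("finance_revenue.csv")   # account_id, revenue

    # Profile: check completeness and value ranges before trusting anything.
    print(crm.isna().mean())                 # share of missing values per column
    print(finance["revenue"].describe())

    # Clean: remove duplicates and rows missing the join key.
    crm = crm.drop_duplicates(subset="account_id").dropna(subset=["account_id"])

    # Format: normalize types and labels so values are comparable.
    crm["region"] = crm["region"].str.strip().str.title()
    crm["created"] = pd.to_datetime(crm["created"], errors="coerce")

    # Combine: join the sources on the shared key.
    combined = crm.merge(finance, on="account_id", how="left")

    # Enrich: derive an analysis-ready field decision-makers actually ask about.
    combined["revenue_band"] = pd.cut(
        combined["revenue"],
        bins=[0, 10_000, 100_000, float("inf")],
        labels=["small", "mid", "large"],
    )

    # Hand the business-ready data off to reporting.
    combined.to_csv("business_ready_accounts.csv", index=False)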

But, it’s clear that BI technology still needs to step up to the plate.

Inadequate data preparation practices
The 2016 TDWI best practice report, Improving Data Preparation for Business Analytics, found that 40% of BI users were unsatisfied with the accuracy, quality and validity of current processes used to prepare their data for reporting and analysis. Further, 58% of respondents weren’t satisfied with the completeness and depth of data being prepared for BI.

Lost time
But, it’s not just the quality of the data – and therefore the depth of analysis and reliability of the ‘insights’ produced – that suffers from poor, cumbersome and manual data preparation practices. It’s the lost time. Report participants revealed that they were being crippled by inefficient data preparation processes, with a hefty majority of respondents (73%) spending between 41 percent and 100 percent of their time on data preparation instead of data analysis or report building during their last BI project. Almost half (45%) spent between 61 percent and 100 percent of their time on data preparation.

But, the report findings weren’t all doom and gloom.

The benefits of quality data preparation processes
The report underscored the ability of quality data preparation processes to enhance the value of BI implementations, benefiting both data analysts and business decision-makers downstream. How so? The three most important benefits of improving data preparation practices for business analytics were found to be: Shortening the time between preparing data for analysis and deriving business insights; reducing the time between preparing data and delivering business-ready data; and increasing the prevalence of data-driven decision-making throughout the organization.

Barriers to success
However, the same report found some of the biggest barriers to improving data preparation for business analytics to be: Difficulty accessing and integrating data across systems or application silos, as well as poor integration of data preparation capabilities with BI tools themselves.

Enter integrated data preparation.

An integrated approach

Major obstacles to successful data preparation for BI can be overcome by using a BI platform with a Data Preparation Module built in.

Not only does this eliminate the cost of using two separate products, it also removes potential integration challenges between the two tools. Additionally, moving from data source to dashboard will be faster, decreasing the time it takes to deliver actionable insights to the business.

Conducting data preparation processes inside the BI platform also eliminates a second potential barrier to success – difficulty accessing and integrating data across systems and application silos for the purpose of preparing it for exploration, analysis and reporting. By pulling data into one single location, it becomes faster and simpler to ensure consistency in data preparation practices across all data sources used in enterprise reporting. It helps ensure the BI platform can act as a single source of truth.
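As a simple illustration of that single-location idea – with the file names, tables and SQLite engine below chosen purely as assumptions for the sketch – consolidating silos into one queryable store might look like this in Python:

    import sqlite3
    import pandas as pd

    # Hypothetical silos: two departmental extracts that reports currently hit separately.
    sales = pd.read_csv("regional_sales.csv")      # region, amount
    targets = pd.read_csv("sales_targets.csv")     # one target row per region

    # Pull both into one shared store so every report reads the same prepared data.
    conn = sqlite3.connect("reporting_store.db")
    sales.to_sql("sales", conn, if_exists="replace", index=False)
    targets.to_sql("targets", conn, if_exists="replace", index=False)

    # Any report or dashboard now queries the same tables, so a fix applied
    # here is reflected everywhere downstream.
    report = pd.read_sql_query(
        """
        SELECT a.region, a.actual, t.target
        FROM (SELECT region, SUM(amount) AS actual FROM sales GROUP BY region) AS a
        JOIN targets AS t ON t.region = a.region
        """,
        conn,
    )
    print(report)
    conn.close()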

The second important technical capability is a comprehensive metadata layer, with the data preparation process performed at that metadata level. By pulling the data preparation process into the metadata layer of the BI platform, any changes made will be uniformly reflected across all content based on that metadata layer throughout the enterprise – from reports and charts, to dashboards and Storyboards. Going from data preparation to decision-making all in one solution allows organizations to maintain data lineage, visibility and control.
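A toy sketch of that propagation idea – hypothetical, and not Yellowfin’s actual metadata API – is shown below: business fields are defined once in a shared layer and every piece of content derives them from that single definition, so changing a rule changes every report built on it.

    import pandas as pd

    # A tiny, hypothetical metadata layer: governed fields defined once,
    # as rules over the prepared data, rather than re-coded per report.
    METADATA_LAYER = {
        "net_revenue": lambda df: df["gross_revenue"] - df["discounts"],
        "margin_pct": lambda df: (df["gross_revenue"] - df["cost"]) / df["gross_revenue"] * 100,
    }

    def apply_metadata(df: pd.DataFrame) -> pd.DataFrame:
        """Derive every governed field from its single, central definition."""
        out = df.copy()
        for field, rule in METADATA_LAYER.items():
            out[field] = rule(out)
        return out

    # Two different pieces of content consume the same derived fields, so a
    # change to a rule above flows to both without touching either of them.
    def revenue_report(df):
        return apply_metadata(df).groupby("region")["net_revenue"].sum()

    def margin_dashboard_tile(df):
        return apply_metadata(df)["margin_pct"].mean()

    data = pd.DataFrame({
        "region": ["APAC", "EMEA", "APAC"],
        "gross_revenue": [120.0, 200.0, 80.0],
        "discounts": [10.0, 20.0, 5.0],
        "cost": [60.0, 110.0, 40.0],
    })
    print(revenue_report(data))
    print(margin_dashboard_tile(data))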

Conducting data preparation in this manner enables IT to maintain data governance, data analysts to build better reports and produce deeper insights faster, and business users to trust the validity of the data and accuracy of their decision-making. Ultimately, that trustworthiness is what BI projects require to support sustained, pervasive business user adoption and to drive a culture of fact-based decision-making.

What’s next, George?

As Irish playwright George Bernard Shaw once said: ‘Spontaneous’ events require careful preparation.

So, if you want to see how Yellowfin’s new integrated Data Preparation Module helps produce deeper BI insights, and underpin trustworthy in-the-moment business decision-making, register for the launch of Yellowfin 7.3 HERE >

After all, analytics without the trusted data underneath really is, as industry expert Dr Barry Devlin would say, Business unIntelligence.
