We live in a world of data. Our ability to collect data on the digital behaviour of both individuals and organizations now far exceeds anything we have historically been able to collect on their physical activities. For businesses, this is both a huge opportunity and a growing challenge, because collecting troves of data is of little value in and of itself; the value comes from putting that data to work. What increasingly separates the leaders from the laggards is the ability to transform data into insights, and insights into decisions.
This alchemy requires insights that match (or exceed) the speed of business, not lag it. With the plethora of analytics and insights tools out there promising to put ‘data at our fingertips’, one would imagine it must be game time already. In reality, however, picking the right tool, engaging the right vendor to deploy it, standing up the whole insights framework, and then actually acting on the insights it produces is a long way from the real-time need for them. BI adoption levels are patchy and deliver only surface-level KPIs, while tangible analytics ROI is handicapped by both time to insight and range of insight.
What this leads to is a situation where the majority of business leaders and managerial decision-makers know they need to make better use of data; they just don’t always know how best to do so, or they lack the tools to do it themselves. This is the expected outcome of a scenario where advanced analysis tools are practitioner-centric (built for data scientists, citizen data scientists, and analysts), while the self-service tools designed for functional managers are largely limited to pre-determined, surface-level reports.
As a result, Data, BI, & Decision support teams are bombarded with competing ad-hoc requests that are unpredictable and therefore unplanned. The needs of the many rely on the expertise of a few, creating an insights gridlock through which only the ‘emergency’ requests can pass (where ‘emergency’ is defined either by the opinion of the technical practitioner or by the title of the requestor). This is where the ‘democratization of data & insights’ usually meets its demise. Clearly, business leaders need better ways to make decisions.
So how, then, do you empower the business and functional layer to accelerate time-to-insight and broaden the range of insight, while also meeting the existing challenges of relevance and reliability? What should insights tools be able to do to truly democratize decision insights?
Here’s a list of questions that can help you critically assess your insights platforms and whether they are well-suited to support your decision-making process.
- Can it understand my business?
Establishing and understanding the context of data is one of the most challenging aspects of running a business. After all, an insight engine can positively impact business outcomes only when the insights are relevant to the context of the organization and, equally importantly, to the context of the user. Data without context and understanding is not an insight — it’s just…data! Delivering insights that are fine-tuned to the business context and its nuances is a must. Keeping all analytics in context, and exposing both related and unrelated data relative to the problem, is crucial for any insights implementation to succeed.
- Can it help me make decisions in real-time?
While acquiring the data is often the single largest limiting factor in standing up an insights program, it is only the visible part of the time-to-insight iceberg. The big looming part underneath is the time needed to translate that data into insights at the granularity of individual users within individual functions. This starts with simply being able to create surface KPIs and exploratory views at those contextual intersections, but it really balloons when one comes to unearthing the patterns and relationships behind the “why” and “what-if” insights. There is only so much advanced analysis and hypothesis testing one can automate, so any level of insight complexity ends up needing repeated expert intervention.
An embedded real-time analytics capability, one that works without the need for repeated intervention by developers, is another important criterion to consider when selecting a reliable insights engine.
- Can I trust the insights at their face value?
One of the challenges with newer analytics techniques like Artificial Intelligence (AI) and Machine Learning (ML) is that they are, inherently, black boxes. An entire field, explainable AI (XAI), has emerged to help make these algorithms explainable to practitioners.
Funnily (or tragically) though, the functional and business user does not have even that benefit today, including for the algorithms that we DO know about. It is assumed that the end user arrives at any algorithm-based insight by way of additional expert intervention, with those experts in turn expected to make the underlying algorithms transparent to end users. We therefore have to take the considerable questions that exist today about building reliable and valid data (Is it the correct data? Is it complete? Is it clean, secure and governed?) and apply them to insights as well: Are these the right insights? Do they have the right context? How do I trust them?
The ability to predict, address and ensure the integrity of insights is an important criterion to consider while selecting an insights platform.
- Can I understand and interpret the insights on my own?
Or, to look at a specific example: What is the use of predicting customer churn if your business has no insight into how to avert the situation?
In any report, the insight does not come from the charts and the numbers; it comes from understanding what those numbers mean for the business. Today, the decision-maker relies on human intervention to apply context to the analysis and to build a narrative around the resultant insight. In addition, the interpretability of recommendations, and of the impact of changing certain input factors, is particularly relevant where the decision-maker must not only understand the outcome under the current scenario, but also weigh the scope for intervention across multiple scenarios.
Any tool intended to address this has to empower functional and business users to interrogate and investigate the data and insights themselves, while making the process easy and engaging by aiding narrative development.
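To make the churn example above concrete, here is a minimal, hypothetical sketch (the features, coefficients and data are all invented for illustration, not drawn from any real tool or customer base) of what “interpretable churn insight” can mean in practice: a model that not only scores churn risk but also surfaces its drivers in plain language a business user can act on.

```python
# Hypothetical sketch: a simple churn model whose drivers are readable by a
# business user. All data here is synthetic; feature names are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Invented customer features: tenure (months), support tickets, monthly spend
tenure = rng.integers(1, 60, n)
tickets = rng.poisson(2, n)
spend = rng.normal(70, 20, n)

# Synthetic churn label: short tenure and many tickets raise churn odds
logit = 1.5 - 0.05 * tenure + 0.4 * tickets
churn = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([tenure, tickets, spend])
model = LogisticRegression(max_iter=1000).fit(X, churn)

# Translate fitted coefficients into plain-language drivers, so the
# "why" behind a churn score is visible, not just the score itself
for name, coef in zip(["tenure", "support tickets", "spend"], model.coef_[0]):
    direction = "raises" if coef > 0 else "lowers"
    print(f"Higher {name} {direction} churn risk (coefficient {coef:+.3f})")
```

A linear model is used here precisely because its coefficients are directly explainable; for black-box models, the same goal drives XAI techniques such as feature-importance and what-if analysis.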
- How can I ensure adoption and usage of insights?
You can lead the horse to water, as they say, but getting it to drink is a whole other thing. A fundamental issue that plagues insights tools today is that their usage and adoption by functional decision-makers is low to abysmal. The reason lies at two levels. The first is relevance: does the tool factor in the context of the individual persona, their needs, and what creates value for them? The second is engagement: how much burden does it put on the end user to get to a specific insight, and how many hoops do they have to jump through for it? (Read: buttons, and filters, and all manner of thingamajigs.)
Essentially, having data and analytics is not enough if it’s not designed for specific personas and their needs, and in a manner that is easy to embrace. Insights tools should treat data and analytics like a product, one that is constantly refined for segments of users, with the segment of one being the ultimate goal of insights democratization.
As the tectonic plates underlying the business world shift more rapidly than ever before, it is increasingly important for business leaders to rapidly make robust decisions and, as such, ‘Reports’ and pre-built ‘strategies’ are so 2010.
The bottom line is that you need to assess whether your insights tool is truly business- and user-centric. Time to insight, quality of insight, and range of insight are the principal dimensions along which to make that assessment.
Satyakam Mohanty has over 20 years of expertise in the field of Insights and Analytics. Saty is a firm believer that data science, artificial intelligence and human ingenuity create the perfect concoction for building a data-driven enterprise. He is most passionate about simplifying insight generation and democratising data decisions for next-generation organisations. He heads Leni and Mosaic Applied Intelligence at LTI, and believes in building an organization that truly simplifies the lives of everyone it touches, be it customer partners or employees. He has worked with organizations like TNS, Genpact, Aegis School of Business, Data Science, Cyber Security and Telecommunications, IMT Hyderabad and others.