Imagine you have spent hours on data analysis at your boss’s behest. You put in a lot of effort to make your final product accurate, insightful and well packaged. But in the end, your boss decides not to use your presentation, and all your effort goes down the drain. Few things are more frustrating for a data scientist, yet it apparently happens far too often in the analytics industry.
We spoke to some industry experts to understand why this happens and how to avoid it.
Why Does It Happen?
“Data science and analytics by nature is exploration. If you had all the data available and causal relationships established, you don’t need to do analytics or data science. The problem is deterministic,” said Sridhar Turaga, Senior Vice President, Digital Innovation, Data Science and Consulting at CitiusTech.
“In probabilistic problems, you are often determining the right problem to solve, finding the right opportunity to focus, predict an event or discover unknown patterns. Hence you go through a top-down and bottom-up process of establishing reliable, predictive, explainable, actionable and generalizable relationships between inputs and outputs.
“Not all data meets the filters of being reliable, predictive, explainable, actionable and generalizable. That’s why so much analysis and data is dropped along the way.”
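Turaga’s five filters can be read as a simple screening step: an analysis is carried forward only if it clears every one. The sketch below is purely illustrative — the criteria names come from the quote, but the candidate analyses and the boolean-flag representation are invented for the example.

```python
# Hypothetical sketch of Turaga's five filters. The candidate analyses
# and their flag values are invented for illustration.

CRITERIA = ("reliable", "predictive", "explainable", "actionable", "generalizable")

def passes_filters(analysis: dict) -> bool:
    """An analysis survives only if it meets every one of the five criteria."""
    return all(analysis.get(criterion, False) for criterion in CRITERIA)

candidates = [
    {"name": "churn driver model", "reliable": True, "predictive": True,
     "explainable": True, "actionable": True, "generalizable": True},
    {"name": "spurious correlation", "reliable": True, "predictive": True,
     "explainable": False, "actionable": False, "generalizable": False},
]

# Only analyses meeting all five filters are kept; the rest are dropped
# along the way, as the quote describes.
kept = [c["name"] for c in candidates if passes_filters(c)]
print(kept)  # → ['churn driver model']
```

In practice these judgments are qualitative, but treating them as an explicit checklist makes it visible why so much exploratory work never reaches a stakeholder.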
In other words, it’s the nature of the beast.
“The data analysis could remain unused because it doesn’t present an actionable insight to the stakeholders, or there is an expectation mismatch between what the stakeholders were looking for versus the analysis presented to them,” said Sachin Garg, Head of Data Science at PayU.
“In a few cases, the cost of implementing the analysis might exceed the expected benefit from the implementation,” Garg added, pointing to the cost-benefit question. “Or in other cases, there could just be inertia with regards to changing the way of doing business, as previously things were done manually.”
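Garg’s cost-benefit point reduces to simple arithmetic: operationalizing an analysis only makes sense if its expected benefit over some horizon exceeds the cost of building and running it. The figures and the function below are hypothetical, chosen purely to illustrate the check.

```python
# Hypothetical cost-benefit check for operationalizing an analysis.
# All figures are invented for illustration.

def worth_implementing(expected_annual_benefit: float,
                       implementation_cost: float,
                       annual_running_cost: float,
                       horizon_years: int = 3) -> bool:
    """True if the benefit over the horizon exceeds total cost."""
    total_benefit = expected_annual_benefit * horizon_years
    total_cost = implementation_cost + annual_running_cost * horizon_years
    return total_benefit > total_cost

# A model saving 50k a year, but costing 200k to build and 20k a year
# to run, fails a three-year check: 150k benefit vs 260k cost.
print(worth_implementing(50_000, 200_000, 20_000))  # → False
```

Even a back-of-the-envelope version of this calculation, done before the project starts, can prevent analysis from being shelved for exactly the reason Garg describes.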
Striking The Right Balance
It is imperative to strike the right balance between visualizations and words when presenting your analysis.
“Whether a dashboard is visual or verbose, info-graph or detailed depends on the audience and usage. In many cases, you may need both – to allow for user preferences. Generally speaking, the choices of what to show and how to show should be guided by the decisions you intend to take and highlighting the right insights rather than all the information,” said Turaga.
“The flow of a dashboard should allow a natural process of drilling down to make a decision. Also, given different users make decisions in different flows, flexibility to navigate through the information freely is very important. Dashboards should also organise information based on organisation initiatives or levers for managing the business,” he added.
Along with enabling the right decisions, it is also important to deliver the right user experience.
“It is about viewing every dashboard like a digital product, having the discipline to design a human-centric product to ensure high usage, and focus on usability and flow. The fact that a dashboard has interesting content doesn’t give a pass from needing great user experience design,” said Turaga.
Make It Scalable
“Analytics will have a mix of one-time questions and repeated questions. And there is no way to know upfront which one is which. Most innovations or disruptive opportunities start very innocently as one-time analytical questions,” said Turaga.
“If you kill one-time questions, the power of supporting decisions at the edges will be lost. So the way to do this is to design the platforms and processes in a manner that reduces the cost of answering one-time questions while automating repeat questions quickly. Your data architecture, analytical layer, and user access/training will be key to reducing the cost of answering one-time questions.”
When it comes to scaling the analysis beyond one-time use, automation is the key.
“Using the right tools and technologies to do the job is an important first step here. The principle of automation should be adhered to while working on any analysis. An analysis report which is automated is much easier to scale than any analysis which requires manual effort,” said Garg.
“Further, publishing the analysis as a regular report or dashboard (with automated updates using the latest data) can ensure it can be referred to regularly by the relevant stakeholders.”
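Garg’s automation principle amounts to wrapping the report in a function that can be re-run against the latest data, then triggering it on a schedule (via cron or a workflow tool) instead of rebuilding it by hand. The sketch below is a minimal illustration; the metric names, figures and report format are all invented.

```python
# Minimal sketch of an automatable report: a function that regenerates
# the stakeholder summary from whatever the latest data happens to be.
# Metric names and figures are illustrative.

import statistics
from datetime import date

def build_report(daily_revenue: list[float]) -> str:
    """Rebuild the summary report from the latest daily figures."""
    lines = [
        f"Revenue report ({date.today().isoformat()})",
        f"days:  {len(daily_revenue)}",
        f"total: {sum(daily_revenue):.2f}",
        f"mean:  {statistics.mean(daily_revenue):.2f}",
    ]
    return "\n".join(lines)

# Each scheduled run simply calls the function with fresh data:
report = build_report([1200.0, 980.5, 1430.25])
print(report)
```

Because the report is produced by code rather than manual effort, scaling it to new data, new periods or new stakeholders is a matter of scheduling another run, which is exactly the property that makes an analysis easy to keep alive.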
Power Of Stories
It is decision-makers and CxOs who ultimately act on a data scientist’s analysis, so good communication between the two is essential.
“Regular communication and expectation-setting are a necessary part of a data scientist’s job. Before starting a project, detailed discussions on the scope of the project, the data which will be used, the hypothesis to test and setting clear expectations with the decision-makers are essential to avoid this trap,” said Garg.
“Further, if decision-makers’ expectations are implausible, the same should be called out by the data scientist at the beginning of the project.”
However, the onus is on the analytics teams to influence decision-makers with their insights, which can be achieved through effective storytelling.
“People get influenced by storytelling. Analytics teams need to moderate their insight generation. Avoid creating too much noise vs signal. Curate the right stories for the right channel at the right time. Focus on actionable insights and driving experimentation rather than assuming brilliant insights will change minds,” said Turaga.
“Enough research in sociology and behavioural psychology has shown that human beings are influenced by stories and not data.”