The Oxford dictionary defines technology as ‘scientific knowledge used for practical purposes, especially in industry’. Knowledge evolves, metamorphoses or becomes obsolete over time. Technology evolves through agile processes and benefits from keeping tabs on incremental performance.
Trends and forecasts inform strategy and plans at the product, solution and organisation levels. If anything, the last two pandemic years have taught us that uncertainty is the new normal. As per a McKinsey report, the pandemic has accelerated digital transformation by years.
This shift has also unleashed a lot of niche technology – built and nurtured in research and innovation hubs across the globe. Gone are the days when AI conjured up images of Agent Smith, R2D2 or T-800. Now, AI has use cases in both automation and intelligence. The pandemic drove many organisations to adopt remote work. To manage this, AIOps strategies emerged. Gartner coined the term and defines AIOps as ‘platforms that utilise big data, modern machine learning and other advanced analytics technologies to, directly and indirectly, enhance IT operations (monitoring, automation and service desk) functions with proactive, personal and dynamic insight’.
Identifying the gaps
This leaves many between the proverbial rock and a hard place. How do you make an informed decision at a time of rapid transformation, when evolving is synonymous with survival, trends are shifting quickly and the relevance of historical data is eroding? Many organisations faced this dilemma. Navigating the ‘new normal’ requires new intelligence, considerable resilience and new technology in the wake of failing legacy ML forecasts. Many decision-makers had to resist the urge to fall back on gut instinct.
In times of uncertainty, the biggest challenge is to resist giving in to hype. Decluttering entails understanding business priorities and plugging the gaps.
Moving away from traditional methods
The first step is always to look for evidence or data. If internal data is unreliable, alternative data sources should be used. One such source is dark data – not just unused internal data but also data representing market conditions, such as expert opinion.
Gartner analysts have outlined the use of new data and analytics techniques to build models that are resilient and adaptable. The small data approach extracts useful insights from less data; it includes methods like time-series analysis, few-shot learning and self-supervised learning. The wide data approach, on the other hand, ties together and analyses a variety of small, large, structured and unstructured data. Combined, the two approaches make the most of the data that is available.
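To make the small data idea concrete, here is a minimal sketch of one of the time-series methods it covers: simple exponential smoothing, which produces a usable one-step forecast from just a handful of observations. The demand figures and the smoothing factor `alpha` are illustrative assumptions, not data from the article.

```python
def exponential_smoothing(series, alpha=0.5):
    """Return the smoothed series; the last value doubles as a one-step forecast.

    Each smoothed point is a weighted blend of the newest observation and
    the previous smoothed value, so even a very short series yields a forecast.
    """
    smoothed = [series[0]]  # seed with the first observation
    for value in series[1:]:
        smoothed.append(alpha * value + (1 - alpha) * smoothed[-1])
    return smoothed

# Hypothetical weekly demand for a product (seven data points only)
demand = [120, 132, 101, 134, 190, 170, 180]
forecast = exponential_smoothing(demand, alpha=0.4)[-1]
```

Because recent observations get geometrically more weight, this kind of model adapts quickly when conditions shift – one reason small-data methods held up better than long-history forecasts during the pandemic.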
Another approach is data marketplaces. These are platforms where one can access third-party data (public, commercial and even private data, if there are sellers) to gain deeper insights – transactional stores that facilitate the buying and selling of data in domains such as business intelligence, demographics and market data. Businesses are increasingly seeking to augment their internal data sets with external data obtained from such marketplaces. With the growing importance of AI in decision-making, data marketplaces help companies reduce effort and costs.
The goal is not just a siloed view of what’s happening from in-house data, but a 360-degree view drawn from multiple sources.
Ensemble models are one way to handle data from multiple sources. Another is a composite AI solution that combines multiple technologies. It improves solution quality through better generalisation and abstraction, synergising a mix of machine learning, heuristic systems, rule-based constraints, optimisation, natural language processing and graph techniques. Together, these techniques enhance the system’s learning ability and the accuracy of its results where a single technique might have failed.
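A hedged sketch of the composite idea: below, a rule-based check, a heuristic and a stand-in for a trained classifier each vote on the same input, and a simple majority decides. The feature names (`spend`, `visits`), thresholds and the `learned_stub` function are all hypothetical placeholders for whatever components a real system would combine.

```python
def rule_based(x):
    # Business rule: flag any customer spending over a fixed threshold
    return 1 if x["spend"] > 100 else 0

def heuristic(x):
    # Domain heuristic: frequent visitors are likely positives
    return 1 if x["visits"] >= 5 else 0

def learned_stub(x):
    # Placeholder for a trained ML model's prediction
    return 1 if x["spend"] + 10 * x["visits"] > 120 else 0

def ensemble_vote(x, models):
    """Majority vote across heterogeneous components."""
    votes = [m(x) for m in models]
    return 1 if sum(votes) > len(votes) / 2 else 0

customer = {"spend": 150, "visits": 3}  # hypothetical input features
prediction = ensemble_vote(customer, [rule_based, heuristic, learned_stub])
```

The point of the combination is robustness: where the heuristic alone would miss this customer, the rule and the learned component outvote it, so no single weak signal dominates the outcome.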
Composite AI offers a multi-faceted approach to dealing with multi-dimensional aspects of a business problem. Insights can be extracted from multiple data sources effectively.
The final step is, of course, to test, validate and course-correct as often as needed. The bottom line is to be equipped with accurate information to make decisions in real time, striking a balance between the individual technologies and the intelligence that binds them.
This article is written by a member of the AIM Leaders Council. AIM Leaders Council is an invitation-only forum of senior executives in the Data Science and Analytics industry. To check if you are eligible for a membership, please fill out the form here.