What is Propensity Model as a service?

Propensity modelling is a set of approaches to building predictive models that help forecast the behaviour of a target audience.

We speak a lot about AI services, but what do they really refer to? As per Vikas Raturi, senior ML engineer at Intuit, who spoke at MLDS 2022, AI services are the delivery of an AI capability over a communication interface. The main goal of AI services is to reduce development cost by allowing these capabilities to be reused across various use cases.

That said, AI services are not always about putting a serving interface over a model. AI capabilities can be delivered in many forms, such as a trained model object, a batched set of predictions or a serving endpoint. In fact, in enterprise settings, a complete service may not even exist; the capability may simply be a model embedded in the rest of the system.

Propensity modelling is a set of approaches to building predictive models that help forecast the behaviour of a target audience by analysing users’ past behaviour. Common use cases of propensity models include churn estimation and predicting feature adoption.

Most propensity models share similar characteristics:

  • They are trained on a broad range of user demographic and behavioural data.
  • Although the underlying algorithm can be reused, each use case mostly requires a separate model.
  • Propensity model predictions also share a similar format, consisting of a customer identifier and a propensity score (see the sketch after this list).
  • They must be scalable for both training and inference, as they are evaluated over the entire user population.
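As a rough illustration of that shared output format, here is a minimal sketch, assuming a scikit-learn-style binary classifier and pandas DataFrames; the feature names, data and model choice are illustrative, not from the talk:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative behavioural features; in practice these come from the
# shared feature set described later in the article.
features = pd.DataFrame({
    "logins_last_30d":  [12, 1, 30, 0],
    "invoices_created": [4, 0, 11, 2],
    "tenure_days":      [400, 20, 900, 60],
})
labels = pd.Series([1, 0, 1, 0])            # e.g. "adopted the feature"
customer_ids = pd.Series(["c1", "c2", "c3", "c4"])

model = LogisticRegression().fit(features, labels)

# The common prediction format: one row per customer,
# a customer identifier plus a propensity score in [0, 1].
predictions = pd.DataFrame({
    "customer_id": customer_ids,
    "propensity_score": model.predict_proba(features)[:, 1],
})
print(predictions)
```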

“We have huge data coming in, and every time a data scientist tries to featurise for a particular use case, that means doing the whole work of transforming that data, adding the right feature sets, testing it out, and ensuring there is enough quality even as the nature of the data and its values keeps changing. So we thought, can we collate all the work we have done on features and build one common feature set for all the different varieties of propensity models, generic enough to serve the different requirements? What we did was collect, document and create very high-quality feature sets, peer-reviewed by different data scientists, into one giant feature set – it is now more than 1,000 features – and that can power different propensity model use cases. In our understanding, this feature work roughly took up 60% of the whole development lifecycle for any model. So that is the feature addition: all the customer behaviour features put into one single pipeline, and of course, that is a huge cost saver because you are doing it only once,” said Raturi.
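Conceptually, the pattern Raturi describes is to compute the behavioural feature set once and let every propensity use case draw from it. A hedged sketch of that pattern with pandas follows; the raw event columns, the build_feature_set helper and the per-use-case selections are hypothetical, meant only to illustrate computing the feature set once and reusing it:

```python
import pandas as pd

# Toy stand-in for the raw behavioural event stream.
raw_events = pd.DataFrame({
    "customer_id":      ["c1", "c1", "c2", "c2", "c2"],
    "session_id":       ["s1", "s2", "s3", "s4", "s5"],
    "days_since_event": [2, 10, 1, 3, 40],
    "invoice_flag":     [1, 0, 0, 1, 1],
})

def build_feature_set(events: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical shared pipeline: transform raw events once into a wide,
    peer-reviewed feature set keyed by customer_id."""
    return (
        events.groupby("customer_id")
        .agg(
            sessions_30d=("session_id", "nunique"),
            last_active_days=("days_since_event", "min"),
            invoices_created=("invoice_flag", "sum"),
        )
        .reset_index()
    )

# Computed and validated once, then reused by every propensity model.
feature_set = build_feature_set(raw_events)

# Each use case only selects the columns (and labels) it needs.
churn_features  = feature_set[["customer_id", "sessions_30d", "last_active_days"]]
upsell_features = feature_set[["customer_id", "sessions_30d", "invoices_created"]]
```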

Kernel in Propensity models

Each use case requires a separate propensity model, depending on the business needs. That said, the core kernel of these models can be reused, while the hyperparameters and metrics still differ. Citing the example of a churn prevention model, Raturi said that it might be optimised for higher precision or higher recall depending on the mode of customer connect.
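One way to picture that reuse, as a sketch under assumed names (the UseCaseConfig structure and threshold search are illustrative, not Intuit’s implementation): a single training kernel shared across use cases, with a per-use-case config carrying the hyperparameters and the metric to optimise, such as precision for churn outreach versus recall.

```python
from dataclasses import dataclass, field

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_score, recall_score

@dataclass
class UseCaseConfig:
    """Per-use-case knobs; the shared kernel stays the same."""
    hyperparams: dict = field(default_factory=dict)
    optimise_for: str = "precision"   # or "recall"

def train_propensity_kernel(X, y, cfg: UseCaseConfig):
    """Shared kernel: fit the model, then pick the score threshold that
    maximises the metric this particular use case cares about."""
    model = GradientBoostingClassifier(**cfg.hyperparams).fit(X, y)
    scores = model.predict_proba(X)[:, 1]
    metric = precision_score if cfg.optimise_for == "precision" else recall_score
    thresholds = np.linspace(0.1, 0.9, 17)
    best = max(thresholds, key=lambda t: metric(y, scores >= t, zero_division=0))
    return model, best

# Same kernel, different per-use-case configs (illustrative usage):
# churn_model, churn_thr   = train_propensity_kernel(X, y, UseCaseConfig({"max_depth": 3}, "precision"))
# upsell_model, upsell_thr = train_propensity_kernel(X, y, UseCaseConfig({"max_depth": 2}, "recall"))
```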

In the case of the Propensity Model as a Service, each use case has its own machine learning pipeline, powered by a common kernel optimised for propensity problems. The kernel models the propensity problem as a time-to-event problem. Speaking about the same, Raturi said, “For example, telling someone that this user is going to upgrade to the advanced version of your product, but he does not have enough experience or training to be able to use all the features. To solve such a challenge, we built an optimisation module that takes the features the model uses for every customer and generates a set of action items that can be optimised.”
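The talk does not name a library, but a common way to express this time-to-event framing is a survival model. Below is a hedged sketch using lifelines’ Cox proportional-hazards model; the columns, data and the mapping from relative risk to a propensity score are assumptions for illustration:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Illustrative data: how long each customer was observed, whether the event
# (e.g. upgrade or churn) happened, and behavioural covariates.
df = pd.DataFrame({
    "duration_days":    [30, 120, 45, 200, 15, 90, 60, 180],
    "event_observed":   [1,   0,   1,   0,  1,  1,  0,   0],
    "sessions_30d":     [2,  14,  25,  30,  1,  8,  5,  22],
    "invoices_created": [0,   3,   9,  12,  0,  2,  1,   7],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_days", event_col="event_observed")

# Per-customer risk of the event relative to the baseline; higher values can
# be mapped to a higher propensity score for that use case.
relative_risk = cph.predict_partial_hazard(df)
print(relative_risk.head())
```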

The kernel also provides an optimisation algorithm that attributes the propensity score to user behaviours and generates actions that can be taken.
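As a rough illustration of what such an attribution step could look like (an assumption, not the module Raturi describes): with a linear propensity model, each customer’s score decomposes into per-feature contributions, and the weakest contributors become candidate actions.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative behavioural features and an "upgraded" label.
X = pd.DataFrame({
    "tutorials_completed":    [0, 5, 1, 6, 0, 4],
    "advanced_features_used": [1, 8, 2, 9, 0, 7],
})
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

def suggest_actions(row: pd.Series, top_k: int = 1) -> list[str]:
    """Attribute the (log-odds) score to each feature and surface the
    weakest contributors as candidate actions, e.g. 'complete training'."""
    contributions = model.coef_[0] * row.values
    weakest = np.argsort(contributions)[:top_k]
    return [f"improve '{X.columns[i]}'" for i in weakest]

new_user = pd.Series({"tutorials_completed": 0, "advanced_features_used": 1})
print(suggest_actions(new_user))
```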

Wrapping up

“The biggest metric for success so far for AI services, in our understanding, has been how the service is actually delivered to the customer and how it can be delivered to better the user experience. It is based on the simple fact that the population and consumers have to be clearly identified. Figuring out the correct customer base, asking for feedback, and then iterating is important. This is behind the success of our AI service,” said Raturi.

Image credit: flickr/Prachatai
