Since 2014, HomeLane has delivered personalised end-to-end home interior services to customers across 19 cities in the country. “It starts with enabling the discovery of themes, designs and interior options relevant to a customer’s budget, style preferences and property type. We are building products using AI/ML to serve super-personalised design options to our customers, with many variables feeding into our algorithms to determine what’s best suited to a specific case or project,” said Puneet Gupta, Chief Technology Officer, HomeLane.
In an exclusive interview with Analytics India Magazine, Puneet Gupta spoke about how the company embeds ethics into its platform.
AIM: How does HomeLane leverage AI?
Puneet Gupta: At HomeLane, the window of interaction with our customers spans over several months. And this offers us a great vantage point to observe and learn about consumer behaviour, including preferences and expectations. We use these learnings to optimise the entry points of our conversion funnel.
In addition, our systems that implement the multi-vendor and multi-partner ecosystem of the supply chain and delivery offer us curated metrics that help us understand each part of our order fulfilment journey. Such data and metrics are leveraged for smarter business decisions and help us invest in the right initiatives that improve customer satisfaction and operational efficiencies.
AIM: Tell us about HomeLane’s AI governance framework.
Puneet Gupta: At HomeLane, we have a proprietary unified data platform that collates data across various systems and aggregates them in an easily interpretable format. This helps to capture consumer preferences across demographics and personalise the experience for our customers.
Our products and systems maintain secure, real-time data synchronisation pipelines with the data platform at scheduled intervals. The payoff is personalisation: imagine a customer walking into a showroom and being shown interior trends (themes, styles, colour palettes) closely matched to their preferences based on observed consumer behaviour, which helps convert them faster. In all of this work, customer data privacy is paramount, and HomeLane has a zero-tolerance policy on compromising it.
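HomeLane has not published its pipeline code, but the idea of an interval-based sync with a central data platform can be sketched in a few lines. Everything here is illustrative: the table name, columns, and the `push_to_platform` stub are assumptions, not HomeLane's actual schema or API.

```python
import sqlite3
import time

def push_to_platform(rows):
    """Stand-in for a secure API call that ships rows to the data platform."""
    return len(rows)

def sync_since(conn, last_sync_ts):
    """Fetch events created after the last sync, forward them, and advance the watermark."""
    rows = conn.execute(
        "SELECT id, event, created_at FROM customer_events WHERE created_at > ?",
        (last_sync_ts,),
    ).fetchall()
    if rows:
        push_to_platform(rows)
        # Move the watermark to the newest row shipped, so the next run
        # at the scheduled interval only picks up new events.
        last_sync_ts = max(r[2] for r in rows)
    return last_sync_ts

# Demo with an in-memory database standing in for a product system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_events (id INTEGER, event TEXT, created_at REAL)")
now = time.time()
conn.executemany(
    "INSERT INTO customer_events VALUES (?, ?, ?)",
    [(1, "showroom_visit", now - 60), (2, "design_saved", now - 10)],
)
watermark = sync_since(conn, now - 3600)  # first run picks up both events
```

A watermark-based incremental pull like this is one common way to implement scheduled synchronisation without re-copying the full dataset on every run.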
Our constant endeavour is to make our processes and systems ever more secure so that we never compromise this core principle. We conduct regular security audits and take preventive and corrective actions for continuous improvement.
AIM: What explains the growing conversation around AI ethics, responsibility, and fairness? Why is it important?
Puneet Gupta: Making data security and privacy a leadership and organisational KRA can go a long way. We have seen examples of organisations that looked the other way for strategic or monetary gains. But customers today are very sensitive about privacy and data security, and ignoring these for short-term gains carries a high likelihood of long-term loss of credibility and of the client base. “With great power comes great responsibility”: it is imperative that influencing consumer preferences based on broader trends does not shade into manipulation. At HomeLane, transparency is key.
AIM: How do you mitigate biases in your AI algorithms?
Puneet Gupta: Training an algorithm on redundant attributes, or simply feeding it huge volumes of training data, is not always the most effective approach. The key is to identify the biases most pertinent to your problem statement. Our approach is to progressively add decision-making factors, starting from some of the more obvious, heuristic ones and layering in further attributes to improve the model.
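The interview does not detail the implementation, but the idea of progressively adding decision factors while watching for skew can be sketched. In this hypothetical example, the customer data, segment labels, factor names, and scoring rule are all invented for illustration: a simple recommender is re-evaluated after each new attribute is enabled, comparing recommendation rates across customer segments.

```python
from collections import defaultdict

def recommend(customer, factors):
    """Score a customer using only the decision factors enabled so far."""
    score = sum(customer.get(f, 0) for f in factors)
    return score >= 2  # recommend a premium theme once the score clears a bar

def rate_by_segment(customers, factors):
    """Recommendation rate per segment, to flag skew toward any one group."""
    hits, totals = defaultdict(int), defaultdict(int)
    for c in customers:
        totals[c["segment"]] += 1
        if recommend(c, factors):
            hits[c["segment"]] += 1
    return {s: hits[s] / totals[s] for s in totals}

# Toy data: two customer segments with illustrative attributes.
customers = [
    {"segment": "metro", "budget_tier": 2, "style_affinity": 1},
    {"segment": "metro", "budget_tier": 1, "style_affinity": 0},
    {"segment": "tier2", "budget_tier": 1, "style_affinity": 1},
    {"segment": "tier2", "budget_tier": 0, "style_affinity": 1},
]

# Start from an obvious heuristic factor, then add attributes one at a time,
# re-checking the per-segment rates after each addition.
for factors in (["budget_tier"], ["budget_tier", "style_affinity"]):
    print(factors, rate_by_segment(customers, factors))
```

In this toy run, scoring on budget alone recommends only to the metro segment; adding the style-affinity attribute evens out the rates, which is the kind of signal an incremental audit like this is meant to surface.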
AIM: How does HomeLane protect user data?