The Digital Personal Data Protection Bill 2022, now in its fourth iteration since 2017, aims to provide a more “complete legal framework”. The data principal, the data fiduciary, and the grievance resolution mechanism form the basis of the bill’s operation. Although the bill initially appears comparable to the General Data Protection Regulation (GDPR), it introduces several beneficial modifications as well as grey areas.
India is preparing to make history. Taking the lead in Industrial Revolution 4.0 could be a significant turning point in the country’s development. To build artificial intelligence, the internet of things (IoT), and robotics while protecting individual privacy, India must take control of the data and information generated by its consumers and entrust it to professionals equipped to handle it.
But this is only the consumer perspective. We wanted to know what industry experts think of the bill, and their opinions on what has been done well and what could be improved. The roundtable session was moderated by Shirsha Ray Chaudhuri, Director of Engineering, TR Labs, Thomson Reuters, with panellists Sanjay Thawakar, Senior Vice President & Head, AI Works & BPMA at Max Life Insurance; Vijoy Basu, Sr. Director, AI & Analytics at Cognizant; and Siddharth Shah, Head of Data Product at Airtel Digital.
Old or new?
The current bill has simplified things quite a lot. It removes the notion of sensitive personally identifiable information and considers all data as one. I think that simplification is helpful. Having said that, such a wide definition can also feel ambiguous; there is scope to tighten it further.
—Siddharth Shah, Head of Data Product at Airtel Digital.
What’s missing?
The bill is quite comprehensive: it puts the onus on companies to ensure that collected data is processed only for the services explicitly mentioned at the time of collection. Organisations have to present this in a manner the end consumer can understand, so that consent for data usage is given explicitly. One other aspect of concern is that if you put up a complaint against a fiduciary and it is proven frivolous, a certain legal penalty falls on the data principal. This can discourage end consumers from filing complaints because of the penalties involved. These are some of the areas where the bill could have done better.
—Sanjay Thawakar, Senior Vice President & Head, AI Works & BPMA at Max Life Insurance
AI Governance – A dream
People will become much more aware of what their data is being used for, and the data will be much cleaner. People will ensure that the data they give is the right data; they will not add false data, because it is our responsibility not to falsify the data we share with authorities or even private parties. From a governance standpoint, our AI models will work better because they will be using data that is clean, safe and secure. Obviously, biases will still remain; there is inherent bias even in clean data, unless the models account for unconscious bias. But governance of model assets will become much more rigorous, because there are guardrails around how this data is stored, processed and used. These models will work with greater accuracy and, over time, become more intelligent. So the services and data products that get built will become more powerful for all end users.
—Vijoy Basu, Sr. Director, AI & Analytics at Cognizant
Impact on Model Behaviour
The number of people withdrawing consent is extremely small, less than 0.5%, because in today’s world, if you need services, you have to share data. There is no other way, and most products are digital products. So the loss of data we see is not going to be significant enough to impact model behaviour, accuracy or precision. We will have to track which models are using what kind of data and how much data loss has happened. But there is always a data feedback loop, and as new data comes in, models learn. Given past experience, I don’t think people withdrawing consent will have a huge impact on the models per se. If you need a service, you need to give consent.
—Vijoy Basu, Sr. Director, AI & Analytics at Cognizant
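The tracking Basu mentions, knowing which models consumed which customers’ data and how much has been lost to withdrawals, can be sketched minimally. All names here (`ModelDataTracker`, `churn_model`, the customer IDs) are hypothetical illustrations, not any panellist’s actual system:

```python
from collections import defaultdict


class ModelDataTracker:
    """Records which customers' data each model was trained on, so the
    fraction of training data lost to consent withdrawals can be measured."""

    def __init__(self):
        self._training_sets = defaultdict(set)  # model name -> customer IDs used
        self._withdrawn = set()                 # customers who withdrew consent

    def record_training(self, model_name, customer_ids):
        self._training_sets[model_name].update(customer_ids)

    def record_withdrawal(self, customer_id):
        self._withdrawn.add(customer_id)

    def loss_fraction(self, model_name):
        """Share of a model's training population that has since opted out."""
        used = self._training_sets[model_name]
        if not used:
            return 0.0
        return len(used & self._withdrawn) / len(used)


# Hypothetical usage: 1,000 customers trained a churn model; one opts out.
tracker = ModelDataTracker()
tracker.record_training("churn_model", [f"cust-{i}" for i in range(1000)])
tracker.record_withdrawal("cust-7")
print(tracker.loss_fraction("churn_model"))  # 0.001
```

A registry like this would let an organisation verify Basu’s sub-0.5% claim empirically per model, rather than assuming it.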
Technology to forget data
There is one point around models changing significantly: the number of people opting out is not going to be large enough to make any meaningful change to the model.

That is less of a problem. In my opinion, the bigger problem is operational: how will your systems adapt to deleting data and removing customer identifiers? And if I decide to opt out of personalisation, how will my entire ecosystem of 50-odd front ends and 200-odd back-end systems adapt to that customer input? That is a very hard problem to solve, for multiple reasons. One big reason is that legacy systems do not have this level of customisation. Auditability of how data is consumed across the board is generally a harder problem to solve. It is also a very expensive problem for most organisations, because auditing all endpoints at the scale of a large organisation is not trivial. So governance is going to become a painful problem.
—Siddharth Shah, Head of Data Product at Airtel Digital.
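The operational problem Shah describes, propagating one customer’s erasure request across many independent systems and auditing the result, can be sketched as a fan-out with an audit trail. This is a minimal illustration under assumed names (`ErasureBroadcaster`, `crm`, `legacy_billing`); real systems would need retries, queues and verification:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ErasureRequest:
    """A customer's request to have their identifiers deleted."""
    customer_id: str
    requested_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class ErasureBroadcaster:
    """Fans an erasure request out to every registered system and records
    which ones acknowledged it, producing the audit trail Shah calls for."""

    def __init__(self):
        self._handlers = {}  # system name -> callable(customer_id) -> bool

    def register(self, system_name, handler):
        self._handlers[system_name] = handler

    def broadcast(self, request):
        audit_log = {}
        for name, handler in self._handlers.items():
            try:
                audit_log[name] = handler(request.customer_id)
            except Exception:
                audit_log[name] = False  # failed systems must be retried later
        return audit_log


# Hypothetical usage: one modern system succeeds, one legacy system fails.
broadcaster = ErasureBroadcaster()
broadcaster.register("crm", lambda cid: True)
broadcaster.register("legacy_billing", lambda cid: False)
result = broadcaster.broadcast(ErasureRequest("cust-42"))
print(result)  # {'crm': True, 'legacy_billing': False}
```

The expensive part in practice is not the fan-out itself but registering all 250-odd systems and proving, endpoint by endpoint, that each handler really deletes the data.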
Way forward
I think there is more cost involved in implementing data protection officers, doing audits, and having data grievance officers for a very minuscule set of requests, but we learned how to do that. That is now offered as a data protection service to our customers and clients, where we say: this is how we did it. There is also a lot of consulting business around that, and all companies have to adhere to the bill. Otherwise, there are huge penalties to pay, which is a risk to their brand in the market. Losing the trust of customers is something that can’t be quantified, and companies do not expect these projects to have an ROI, since it is purely a regulatory requirement.
—Vijoy Basu, Sr. Director, AI & Analytics at Cognizant
Data is now being consumed beyond simple business operations to personalise experiences for the customer. In such a scenario, we have increased obligation to our customer to explain the data we collect and how we use the data.
—Siddharth Shah, Head of Data Product at Airtel Digital.
I think from an opportunity point of view, because of the bill, the entire data infrastructure and its maturity come under focus. With the bill’s focus on privacy, consent, the right to erasure and audits, the entire data infrastructure needs to be redefined to support these aspects. Hence, there will be a focus on data governance, catalogues, lineage, security and internal policies on data sharing and processing. Simplification happens naturally, and more investment goes towards data processes, documentation and understanding how data flows. The right level of investment and focus will start moving more and more towards data, and that is how I look at it as an opportunity.
—Sanjay Thawakar, Senior Vice President & Head, AI Works & BPMA at Max Life Insurance
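The consent-and-purpose discipline the panellists keep returning to, data processed only for the purposes stated at collection, is the core check a governance layer must enforce. A minimal sketch, with hypothetical names (`ConsentRegistry`, the purpose strings) that stand in for whatever a real catalogue or policy engine would use:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ConsentRecord:
    customer_id: str
    purposes: frozenset  # purposes the customer explicitly agreed to


class ConsentRegistry:
    """Purpose-limitation check: processing is allowed only for purposes
    stated at collection time, and withdrawal revokes everything."""

    def __init__(self):
        self._records = {}

    def grant(self, customer_id, purposes):
        self._records[customer_id] = ConsentRecord(customer_id, frozenset(purposes))

    def withdraw(self, customer_id):
        self._records.pop(customer_id, None)

    def allowed(self, customer_id, purpose):
        record = self._records.get(customer_id)
        return record is not None and purpose in record.purposes


# Hypothetical usage: consent covers service delivery, not marketing.
registry = ConsentRegistry()
registry.grant("cust-42", {"service_delivery", "personalisation"})
print(registry.allowed("cust-42", "marketing"))         # False
registry.withdraw("cust-42")
print(registry.allowed("cust-42", "service_delivery"))  # False
```

Wiring a check like this into every data pipeline is exactly the catalogue, lineage and policy investment Thawakar predicts the bill will drive.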
Conclusion
As consumers of digital services in India, we seek greater transparency, which is not there even with the bill today. At any point of time, I want to be notified what my data is going to be used for; even if I have given my consent, it is not a one-time thing. So we hope to see more of that gap closed. This will also mean a lot of business process simplification, and opportunities for AI companies and IT service providers in the space; the whole audit and AI governance space is definitely in the right spotlight.
—Shirsha Ray Chaudhuri, Director of Engineering – TR Labs, Thomson Reuters
There are many areas that need work, and several opportunities as well. But the outlook remains optimistic about what can be achieved.
“Digital freedom stops where that of users begins. . . Nowadays, digital evolution must no longer be offered to a customer in trade-off between privacy and security. Privacy is not for sale, it’s a valuable asset to protect.” —Stephane Nappo
This article is written by a member of the AIM Leaders Council, an invitation-only forum of senior executives in the Data Science and Analytics industry.