
A CFA Charterholder Who Started As A Head Of Data Science: Interview With Kaggle Master Ilia Larchenko


For this week’s ML practitioner’s series, Analytics India Magazine got in touch with Ilia Larchenko, a Kaggle Master ranked 23rd in the world. In this interview, Ilia shares his unusual journey into the world of machine learning.

The Beginning

Ilia has a Bachelor’s and a Master’s in Applied Physics and Math from one of the best universities in Russia, the Moscow Institute of Physics and Technology (MIPT). A dual degree in Applied Physics and Math doesn’t sound very unusual for a data scientist, but Ilia confesses that his path to data science was not straightforward at all. To begin with, Ilia also holds the much-coveted CFA charter, a gold standard held in high regard by investment organisations across the world. A master’s degree in physics, followed by a career as a Chartered Financial Analyst, and only then data science. So, it does make sense when Ilia says the path hasn’t been straightforward.

Before becoming a CFA charterholder, he began his career working for consulting companies, where he had to evaluate different markets, companies, and projects. Although the job involved a lot of data analysis and modeling, it was, unlike data science and machine learning, mainly about manual analysis of trends and making long-term forecasts of cash flows.

I have decided to quit the financial/investment/consulting sphere and do something more hands-on in startups. 

In 2016, Ilia joined the Moscow-based healthcare startup DOC+ as deputy COO, and several months later he was promoted to Chief Innovation Officer. And this is where his tryst with machine learning began. As CIO, he was responsible for building internal data science expertise from scratch. “… and it may be said, I have started my DS career from the ‘head of DS’ role,” quips Ilia.

Ilia worked as the CIO of DOC+ for close to three years and oversaw the development of several data science projects, from a smart auto-dispatching system to a medical-bot symptom checker and a medical records quality control system.

The math behind the ML was clear for me, as well as the general programming/algorithms part. 

Though he hasn’t been formally educated in machine learning, his fundamental education and his prior hobby experience programming robots and other devices allowed him to pick up the subject quickly.

The math and the underlying workings of the algorithms weren’t a problem for Ilia; to become proficient in the field, all he had to learn were the main tools, frameworks, and the particular approaches used in modern ML.

I got more in-depth knowledge and practical experience from online courses.

Having benefited immensely from online courses, Ilia recommends the following courses for aspirants. For those who speak Russian, he recommends:

  • A classic machine learning course with a deep dive into the math
  • A DL course covering all the main applications in the field

He also suggests that aspirants check out ODS.ai (Open Data Science), a community that helps people create machine learning projects and improve their skills.

Kaggle Journey

Having competed in many competitions (math, physics, programming, and even cryptography) since high school, Ilia found his inclination towards a highly competitive platform like Kaggle almost natural.

It has been only a year and a half since Ilia started participating in Kaggle competitions seriously, and he has already climbed into the top 25 of the global leaderboard.

Initially, I have thought that one should be a very high-level data scientist to compete on Kaggle. It was a mistake!

Although Ilia started with the misconceptions of an amateur about Kaggle, he now asserts that one can start whenever they want, and the sooner, the better.

What really drew Ilia towards Kaggle was his notebook on hyperparameter tuning being featured in Kaggle’s newsletter as the “Technique of the week.” Today, after nearly 50 competitions, he has 8 bronze medals, 12 silver, and a gold in one of the most challenging competitions on Kaggle, the Abstraction and Reasoning Challenge (ARC).

Launched by Keras creator François Chollet, the ARC competition required participants to create an AI that can solve reasoning tasks it has never seen before. Ilia, along with his teammate Vlad Golubev, finished 3rd on the leaderboard, earning him a gold medal.

Talking about his winning approach for ARC, Ilia revealed that his final solution ran to more than 6,000 lines of code, with which he tried to create abstract representations of different colors, images, and binary masks, and then apply various transformations to find a logical path from the input image to the output. The full solution can be found here.
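For readers unfamiliar with the general idea, the snippet below is a minimal, hypothetical sketch of one building block that such approaches commonly rely on: decomposing an ARC grid into per-color binary masks and applying elementary transformations to them. It is not Ilia’s actual solution; the function names and the toy grid are illustrative assumptions.

```python
import numpy as np

def to_color_masks(grid: np.ndarray) -> dict:
    """Decompose an ARC grid (a 2-D array of color indices 0-9)
    into one binary mask per color present in the grid."""
    return {c: (grid == c).astype(np.uint8) for c in np.unique(grid)}

def transform_masks(masks: dict, op) -> dict:
    """Apply the same elementary transformation (e.g. a flip or a
    rotation) to every color mask."""
    return {c: op(m) for c, m in masks.items()}

def masks_to_grid(masks: dict, shape: tuple) -> np.ndarray:
    """Recombine the per-color masks into a single output grid."""
    grid = np.zeros(shape, dtype=np.uint8)
    for c, mask in masks.items():
        grid[mask == 1] = c
    return grid

# Toy example: decompose an input grid, flip every mask vertically,
# and reassemble a candidate output grid.
inp = np.array([[0, 1, 1],
                [0, 2, 0],
                [3, 3, 0]])
masks = to_color_masks(inp)
candidate = masks_to_grid(transform_masks(masks, np.flipud), inp.shape)
print(candidate)
```

A full solution along these lines would enumerate and chain many such transformations, keeping only those that reproduce the training pairs of a given task.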

Ilia’s approach to a typical Kaggle competition can be summarised as follows:

  • Do the preliminary preparation: create a git repository, set up a virtual environment, and so on.
  • Start with a simple EDA (exploratory data analysis) to understand the data better.
  • Make the first simple baseline pipeline.
  • Make sure you have a working cross-validation strategy that correlates with the leaderboard (a minimal sketch follows this list).
  • Generate the list of ideas on how to improve your solution on each step of the pipeline: pre-processing, model, training, post-processing, inference, etc.
  • Sort all ideas from the most promising to the least promising and start implementing them one by one.
  • Periodically look at public notebooks and discussions to find useful ideas and hypotheses. But don’t believe them blindly; just add the ideas to your list and try them alongside your own.
  • At the end of the competition, try to blend or stack some solutions or apply any other common tricks to improve your final score.
  • Most importantly, read all the published solutions after the competition ends.
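To make the cross-validation point concrete, here is a minimal, hypothetical baseline sketch in Python using scikit-learn and LightGBM, two of the tools Ilia lists below. The file name train.csv and the binary target column are placeholder assumptions, not details from any specific competition.

```python
import pandas as pd
import lightgbm as lgb
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score

# Hypothetical competition data: tabular features plus a binary "target" column.
df = pd.read_csv("train.csv")
X, y = df.drop(columns=["target"]), df["target"]

# A fixed, reproducible split is the backbone of every later experiment.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = []
for fold, (tr_idx, va_idx) in enumerate(cv.split(X, y)):
    model = lgb.LGBMClassifier(n_estimators=500, learning_rate=0.05)
    model.fit(X.iloc[tr_idx], y.iloc[tr_idx])
    preds = model.predict_proba(X.iloc[va_idx])[:, 1]
    scores.append(roc_auc_score(y.iloc[va_idx], preds))
    print(f"fold {fold}: AUC = {scores[-1]:.4f}")

# If the mean CV score moves in the same direction as the public leaderboard,
# the validation scheme can be trusted when comparing new ideas.
print(f"mean AUC = {sum(scores) / len(scores):.4f}")
```

Every idea from the list above can then be judged by whether it improves this mean cross-validation score.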

When it comes to tools, Ilia finds himself using the following frequently (a small illustrative example follows the list):

  • Python
  • Pandas, NumPy, scikit-learn, LightGBM for table data
  • PyTorch for neural nets
  • Albumentations, mmdetection, segmentation_models, fast.ai – for different CV tasks
  • Transformers, tokenizers, gensim – for NLP tasks
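As a small illustration of how a couple of these libraries fit together, here is a hedged sketch of a typical Albumentations augmentation pipeline that produces a PyTorch tensor. The specific augmentations and image size are generic assumptions, not recommendations from the interview.

```python
import numpy as np
import albumentations as A
from albumentations.pytorch import ToTensorV2

# A typical image-augmentation pipeline for a computer-vision task.
train_transform = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.RandomBrightnessContrast(p=0.3),
    A.Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)),
    ToTensorV2(),  # converts the HWC image into a CHW torch tensor
])

# Dummy RGB image standing in for a real training sample.
image = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
augmented = train_transform(image=image)
tensor = augmented["image"]  # torch.Tensor of shape (3, 256, 256)
print(tensor.shape)
```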

For processing power, Ilia leveraged cloud services such as GCP and AWS during his first year on Kaggle, but now he mostly uses his devbox (1x 2080 Ti, Threadripper 1920X, and 64 GB RAM), which he considers more than enough for most tasks.

That said, Ilia also warns beginners not to be complacent about success on Kaggle, because competitions cannot give all the skills and knowledge required for a real-life data science job. The perfect recipe for success, he suggests, is a combination of modeling practice on Kaggle, theoretical knowledge from courses, and implementation practice from your job.

Final Thoughts

Talking about the hype around ML, Ilia admits that there is some truth to it, as most of the methods have found their way into real business applications. For instance, he says, a lot of classic ML, such as table-data predictions, ranking, search, and time-series forecasts, has taken its place as part of different businesses. Though still not as widespread as the classic approaches, he believes that computer vision and NLP models have started to gain popularity in some real business applications, and going forward they will be adopted much more widely.

Reinforcement learning seems a little overhyped right now, but I hope it will find its niche in real life and live through this hype sooner or later.

However, one limiting factor still prevails in the machine learning industry: the lack of education, or at least awareness, of data science among people outside the industry. He is glad that this problem is being tackled by teaching DS in schools and universities and by creating DS courses for non-technical specialists. Interdisciplinary collaboration is essential for a domain such as ML that has the potential to touch all aspects of life, and critical fields such as healthcare or self-driving cars cannot be left to engineers working in isolation from domain experts.

In this regard, Ilia thinks that the quality that takes ML engineers from ‘good’ to ‘great’ is not exceptional technical skill but a connection to reality and good business acumen. So, if a certain real-world task can be solved with simple linear regression, there is no point in training a deep learning model. “It is more of a decision-making ability, which separates the best from the rest,” believes Ilia.

Ram Sagar

I have a master's degree in Robotics and I write about machine learning advancements.