Underrated But Interesting ML Concepts #4 – EDL, iDistance, ME & CN2

Some fascinating machine learning concepts aren’t covered nearly as much as they should be. In this instalment of the series, we’ll look at four of them: error-driven learning, iDistance, mixture of experts, and the CN2 algorithm.

Error-Driven Learning

For several decades, error-driven learning models have been widely used in the study of animal and human learning. Error-driven learning processes are at the heart of today’s most popular AI systems based on artificial neural networks, and they have become the dominant strategy in machine learning research. Error-driven learning (also known as discrimination learning) models the learning process over time.

Error-driven learning mechanisms were first introduced into cognitive research to formalise the results of early classical conditioning experiments. In an article on Pavlovian conditioning, Robert A. Rescorla argued that one experimentally grounded insight transformed the earlier understanding of associative learning: learning depends on how well a stimulus predicts a subsequent response or stimulus, not on mere temporal contiguity. Error-driven learning insights have recently been applied to a wide range of linguistic problems with promising results. Error-driven learning can also be viewed as a form of reinforcement learning in which there are no rewards: learning is driven entirely by errors (incorrect actions) and their consequences (costs).
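The idea that prediction error drives learning can be made concrete with a few lines of code. The sketch below is a minimal delta-rule update in the spirit of the Rescorla–Wagner model; the cue names, learning rate, and toy training loop are illustrative choices, not from the article.

```python
# Minimal error-driven (delta-rule) learning sketch: association weights
# change in proportion to the prediction error, so learning slows as the
# prediction improves. All names and values here are illustrative.

def delta_rule_update(weights, cues, outcome, lr=0.1):
    """Nudge the weights of the active cues toward the observed outcome."""
    prediction = sum(weights[c] for c in cues)  # summed associative strength
    error = outcome - prediction                # prediction error drives learning
    for c in cues:
        weights[c] += lr * error
    return error

weights = {"light": 0.0, "tone": 0.0}
# Repeatedly pair the tone with a reward (outcome = 1.0)
for _ in range(50):
    delta_rule_update(weights, ["tone"], 1.0)
print(round(weights["tone"], 2))  # → 0.99, nearly full association
```

Note that the update for each cue is the same rule: only the error term changes across trials, which is why learning curves under this model flatten as the stimulus becomes a good predictor.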



iDistance

iDistance is an indexing and query-processing technique for k-nearest-neighbour (kNN) searches on point data in multi-dimensional metric spaces, used in pattern recognition. The kNN query is one of the hardest problems to solve over multi-dimensional data, especially when the dimensionality is high. iDistance is designed to handle kNN queries in high-dimensional spaces efficiently, and it works particularly well with the skewed data distributions common in real-world data sets. Its efficiency is determined by how the data is partitioned and how the reference points are selected.

As researchers note, iDistance can be seen as a way to speed up the sequential scan. Rather than scanning records from the beginning to the end of the data file, iDistance starts the scan at locations where the nearest neighbours are likely to be found quickly. iDistance has been employed in image retrieval, video indexing, similarity search in peer-to-peer systems, and mobile computing.
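The core of iDistance is a mapping from multi-dimensional points to one-dimensional keys that a standard B+-tree can index: each point is assigned to its nearest reference point, and its key combines the partition id with its distance to that reference. The sketch below illustrates only this mapping; the reference points, the constant C, and the use of Euclidean distance are illustrative assumptions, and a real implementation would also build the index and run the range-expanding kNN search over it.

```python
import math

# Illustrative iDistance mapping: partition points by nearest reference
# point, then index each point by a 1-D key that interleaves the partition
# id with the distance to that reference point.

C = 1000.0  # constant larger than any possible distance; separates partitions

def dist(a, b):
    """Euclidean distance between two points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def idistance_key(point, reference_points):
    """Map a multi-dimensional point to its 1-D iDistance key."""
    # assign the point to its closest reference point (its partition)
    i, ref = min(enumerate(reference_points), key=lambda t: dist(point, t[1]))
    return i * C + dist(point, ref)

refs = [(0.0, 0.0), (10.0, 10.0)]
print(idistance_key((1.0, 1.0), refs))  # partition 0: key ≈ 1.414
print(idistance_key((9.0, 9.0), refs))  # partition 1: key ≈ 1001.414
```

Because all points in partition i get keys in [i·C, (i+1)·C), a kNN query can probe only the key ranges around promising reference points instead of scanning the whole file, which is where the speed-up over a sequential scan comes from.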


Mixture of experts

Mixture of experts (ME) is a popular and intriguing combining strategy with considerable potential for improving machine learning performance. ME is a long-established neural network ensemble learning technique. It is built on the divide-and-conquer approach, which divides the problem space among a few neural network experts supervised by a gating network. Like negative correlation learning (NCL), another combining method that trains negatively correlated networks, ME uses a specialised error function to train its component networks.

Robert A. Jacobs, one of the authors of the study Adaptive Mixtures of Local Experts, states that the ME approach is based on the divide-and-conquer (D&C) strategy: the problem space is stochastically partitioned into several subspaces using a particular error function, and the experts become specialised in their respective subspaces. The process is managed by a gating network that trains alongside the experts. During training, the gating network simultaneously helps partition the problem, based on differences in the experts’ performance across subspaces. Above all, ME is an ensemble learning method that explicitly uses expert models to address a predictive modelling problem in terms of subtasks.
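The combining step can be made concrete with a toy example: a softmax gating function assigns each input a weight per expert, and the mixture output is the gate-weighted sum of the expert outputs. The two experts and the gating scores below are hand-fixed toy functions, purely illustrative; in a real ME model both the experts and the gate are neural networks trained jointly.

```python
import math

def softmax(zs):
    """Turn raw gating scores into weights that sum to 1."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

# Two hand-fixed "experts", each suited to part of the input space
experts = [
    lambda x: 2.0 * x,   # expert 0: intended for small x
    lambda x: x ** 2,    # expert 1: intended for large x
]

def gate(x):
    # toy gating scores: favour expert 0 below x = 2, expert 1 above it
    return softmax([2.0 - x, x - 2.0])

def mixture_predict(x):
    """Gate-weighted combination of the expert outputs."""
    weights = gate(x)
    return sum(w * e(x) for w, e in zip(weights, experts))
```

For small inputs the gate puts nearly all its weight on expert 0 (e.g. `gate(0.0)[0] > 0.98`), so each expert only has to be good on its own subspace — which is exactly the divide-and-conquer behaviour the gating network learns in the real model.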

CN2 algorithm

The field of machine learning known as rule induction is concerned with extracting formal rules from a data set. The CN2 method is a classification technique for efficiently inducing simple, comprehensible rules of the form “if condition, then predict class”, even in noisy environments. It is built to work even when the training data is imperfect. According to experts, it borrows ideas from the AQ algorithm to generate rules and from decision tree learning to deal with noise. As a result, it generates a ruleset similar to AQ’s, but it can handle noisy data the way ID3 does.

CN2 is a member of the sequential covering family of algorithms, modified in a few key areas. First, CN2 learns rules that need not fit the training examples perfectly. Second, CN2 can generate either an ordered or an unordered set of rules. It can therefore tolerate noise, since it accepts rules with a given degree of precision.
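The sequential covering loop behind CN2 can be sketched as follows. This toy version greedily picks the single-condition rule that purely covers the most remaining examples, then removes the examples it covers and repeats. Real CN2 instead beam-searches over conjunctions of conditions and applies a statistical significance test so it can keep imperfect rules; treat the code as an illustration of the covering idea only, with an invented toy data set.

```python
# Toy sequential-covering sketch in the spirit of CN2.

def best_rule(examples):
    """Find the (attribute, value) -> class rule that purely covers the most examples."""
    candidates = {}
    for features, _label in examples:
        for attr, val in features.items():
            covered = [lab for f, lab in examples if f.get(attr) == val]
            # this toy version only accepts rules with no misclassified cover;
            # real CN2 relaxes this to tolerate noise
            if all(lab == covered[0] for lab in covered):
                candidates[(attr, val, covered[0])] = len(covered)
    return max(candidates, key=candidates.get) if candidates else None

def cn2_like(examples):
    """Repeatedly induce the best rule and remove the examples it covers."""
    rules, remaining = [], list(examples)
    while remaining:
        rule = best_rule(remaining)
        if rule is None:
            break
        attr, val, cls = rule
        rules.append(f"if {attr} == {val!r} then predict {cls!r}")
        remaining = [(f, lab) for f, lab in remaining if f.get(attr) != val]
    return rules

data = [
    ({"outlook": "sunny", "windy": "no"}, "play"),
    ({"outlook": "sunny", "windy": "yes"}, "play"),
    ({"outlook": "rain", "windy": "yes"}, "stay"),
]
for r in cn2_like(data):
    print(r)
```

The first rule induced covers both “sunny” examples at once, after which only the “rain” example remains — each pass shrinks the training set, which is what makes the family “sequential covering”.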


Dr. Nivash Jeevanandam
Nivash holds a doctorate in information technology and has been a research associate at a university and a development engineer in the IT industry. Data science and machine learning excite him.


