Developers Corner

Causal Inference

All About IBM Causal Inference 360 Toolkit

The IBM Causal Inference 360 Toolkit is a one-of-its-kind toolkit that offers a comprehensive suite of causal inference methods under a unified API.
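
For a flavour of the unified API, here is a minimal sketch of estimating an average treatment effect with inverse probability weighting via causallib, the open-source library behind the toolkit; the dataset file and column names are hypothetical stand-ins.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from causallib.estimation import IPW

# Hypothetical observational dataset with a binary treatment column
df = pd.read_csv("observational_study.csv")
X = df.drop(columns=["treatment", "outcome"])  # confounders
a = df["treatment"]                            # treatment assignment
y = df["outcome"]                              # observed outcome

# Weight samples by the inverse of their propensity to be treated
ipw = IPW(learner=LogisticRegression(max_iter=1000))
ipw.fit(X, a)

outcomes = ipw.estimate_population_outcome(X, a, y)
effect = ipw.estimate_effect(outcomes[1], outcomes[0])  # treated vs control
print(effect)
```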

A Complete Guide to Categorical Data Encoding

Encoding categorical data is the process of converting categorical values into a numeric format so that the converted data can be fed to different machine learning models.
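
As a quick illustration, the sketch below applies two common schemes, ordinal (integer) encoding and one-hot encoding, using pandas and scikit-learn; the toy column is invented for the example.

```python
import pandas as pd
from sklearn.preprocessing import OrdinalEncoder

df = pd.DataFrame({"colour": ["red", "green", "blue", "green"]})

# Ordinal encoding: each category maps to a single integer
ordinal = OrdinalEncoder()
df["colour_ordinal"] = ordinal.fit_transform(df[["colour"]])

# One-hot encoding: one binary column per category
onehot = pd.get_dummies(df["colour"], prefix="colour")
print(df.join(onehot))
```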

Facebook Researcher’s New Algorithm Ushers In A New Paradigm Of Image Recognition

VICReg combines the variance term with a decorrelation mechanism based on redundancy reduction and covariance regularisation.
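
For intuition, here is a rough PyTorch sketch of the three VICReg loss terms (invariance, variance, covariance) as described in the paper; the loss coefficients and the small epsilon are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z_a, z_b, sim_w=25.0, var_w=25.0, cov_w=1.0):
    n, d = z_a.shape
    # Invariance: embeddings of two views of the same image should match
    sim = F.mse_loss(z_a, z_b)
    # Variance: hinge keeps each embedding dimension's std above 1
    std_a = torch.sqrt(z_a.var(dim=0) + 1e-4)
    std_b = torch.sqrt(z_b.var(dim=0) + 1e-4)
    var = torch.mean(F.relu(1.0 - std_a)) + torch.mean(F.relu(1.0 - std_b))
    # Covariance: decorrelate dimensions by penalising off-diagonal terms
    z_a = z_a - z_a.mean(dim=0)
    z_b = z_b - z_b.mean(dim=0)
    cov_a = (z_a.T @ z_a) / (n - 1)
    cov_b = (z_b.T @ z_b) / (n - 1)
    off_diag = lambda m: m - torch.diag(torch.diag(m))
    cov = off_diag(cov_a).pow(2).sum() / d + off_diag(cov_b).pow(2).sum() / d
    return sim_w * sim + var_w * var + cov_w * cov
```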

Beginners Guide to Self-Organizing Maps

A self-organizing map, also known as a SOM, was proposed by Kohonen. It is a neural network trained with unsupervised learning to produce a low-dimensional, discretised representation of the input space of the training samples, known as a map, and is therefore a method for reducing data dimensions.
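
A minimal NumPy sketch of the SOM training loop follows; the grid size, learning-rate and neighbourhood schedules are illustrative choices rather than canonical settings.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 3))          # toy 3-D inputs
grid_h, grid_w = 10, 10
weights = rng.normal(size=(grid_h, grid_w, 3))
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                              indexing="ij"), axis=-1)

for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)           # decaying learning rate
    sigma = 3.0 * (1 - epoch / 20) + 0.5  # decaying neighbourhood radius
    for x in data:
        # Find the best-matching unit (BMU) for this sample
        dist = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(dist.argmin(), dist.shape)
        # Gaussian neighbourhood around the BMU on the 2-D grid
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
        h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))
        # Pull the BMU and its neighbours toward the sample
        weights += lr * h[..., None] * (x - weights)
```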

Complete Guide to Transposed Convolutions in CNN Models

It is often convenient to keep the dimensions of the input and output the same, or to restore a reduced output to its original size. Transposed convolutions achieve this in an effective way.
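
The short PyTorch check below shows how a transposed convolution grows the spatial dimensions back up; the layer hyperparameters are arbitrary examples.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 16, 8, 8)              # (batch, channels, H, W)
up = nn.ConvTranspose2d(in_channels=16, out_channels=8,
                        kernel_size=2, stride=2)
y = up(x)
print(y.shape)  # torch.Size([1, 8, 16, 16]) -- H and W doubled
```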

Guide To KNIME – A GUI Way of Data Science

KNIME stands for Konstanz Information Miner; it was developed at the University of Konstanz, Germany, in 2004. It is open-source software written in Java. KNIME relies on predefined components called Nodes for building and executing workflows.

How To Address Bias-Variance Tradeoff in Machine Learning

Bias and variance are inversely related, and it is practically almost impossible to build an ML model with both low bias and low variance. When we modify an ML algorithm to better fit a given dataset, the bias drops but the variance rises: the model fits the training data more closely while the chance of inaccurate predictions on new data increases. The same applies when creating a low-variance model with higher bias: although it reduces the risk of erratic predictions, the model will not match the dataset well. Striking a balance between bias and variance is therefore delicate. A higher variance does not by itself indicate a bad ML algorithm; algorithms should be designed to tolerate some variance. Underfitting occurs when a model is unable to capture the underlying pattern of the data; such models usually present with high bias and low variance.
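
The sketch below illustrates the tradeoff with polynomial regression on synthetic data: a low degree underfits (high bias), while a very high degree overfits (high variance), as the cross-validated error shows.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)  # noisy sine

for degree in (1, 4, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    score = cross_val_score(model, X, y, cv=5,
                            scoring="neg_mean_squared_error").mean()
    # Expect degree 1 and degree 15 to score worse than degree 4
    print(f"degree {degree:2d}: CV MSE = {-score:.3f}")
```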

Understanding the AUC-ROC Curve in Machine Learning Classification

AUC-ROC is a widely used metric for evaluating the performance of classification models. It tells us how capable a model is of distinguishing between classes, the judging criterion being: the higher the AUC, the better the model. ROC curves graphically depict the trade-off between sensitivity and specificity at every possible cut-off for a test, or a combination of tests, and the area under the curve indicates how useful the test is for the underlying question. In short, AUC-ROC measures classification performance across all threshold settings.
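
As a minimal example, the scikit-learn snippet below computes the ROC curve and the AUC for a stand-in logistic-regression classifier on synthetic data.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]  # probability of the positive class

fpr, tpr, thresholds = roc_curve(y_te, scores)  # one point per threshold
print("AUC:", roc_auc_score(y_te, scores))      # closer to 1 is better
```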

Tour to All The Useful Processing Steps on Image Data

Raw data is usually messy and comes from different sources and distributions. Before images can be fed into a machine learning model, they need to be standardised and cleaned up.
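
The snippet below sketches two standard steps, resizing to a fixed shape and scaling and standardising pixel values, using Pillow and NumPy; the file name and target size are placeholders.

```python
import numpy as np
from PIL import Image

img = Image.open("sample.jpg").convert("RGB")  # unify channel layout
img = img.resize((224, 224))                   # fixed input size

arr = np.asarray(img, dtype=np.float32) / 255.0  # scale to [0, 1]
arr = (arr - arr.mean()) / (arr.std() + 1e-8)    # zero mean, unit std
print(arr.shape, float(arr.mean()), float(arr.std()))
```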

Guide to Different Padding Methods for CNN Models

Convolutional layers reduce the size of the output, so in cases where we want to preserve the output size and retain the information present at the corners of the input, padding is applied.
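
A quick PyTorch comparison of an unpadded convolution and one with zero padding makes the effect concrete; the tensor shapes here are illustrative.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)

no_pad = nn.Conv2d(3, 8, kernel_size=3)               # output shrinks
same_pad = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # zero padding

print(no_pad(x).shape)    # torch.Size([1, 8, 30, 30])
print(same_pad(x).shape)  # torch.Size([1, 8, 32, 32])
```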

GFP-GAN

Using GANs For Generating High-Quality Faces From Degraded Images

Scientists at Tencent have developed the Generative Facial Prior GAN (GFP-GAN), which leverages the diverse priors encapsulated in a pretrained face GAN for restoration.

Beginners Guide to Boltzmann Machine

A Boltzmann Machine is a kind of recurrent neural network whose nodes make binary decisions and carry certain biases. Several Boltzmann machines can be combined to build even more sophisticated systems, such as a deep belief network.
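
As a small hands-on sketch, scikit-learn's BernoulliRBM, a restricted Boltzmann machine, can be fit on invented binary toy data; the hyperparameters here are arbitrary.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(0)
X = (rng.random((200, 16)) > 0.5).astype(np.float64)  # binary inputs

rbm = BernoulliRBM(n_components=8, learning_rate=0.05,
                   n_iter=20, random_state=0)
H = rbm.fit_transform(X)  # hidden-unit activation probabilities
print(H.shape)            # (200, 8)
```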