Search Results for: neural network – Page 2

AI Mysteries
Yugesh Verma

How TensorFlow Probability is used in Neural Networks?

There are many cases where probabilistic models and techniques are required in neural networks. These requirements can be met by adding the probability layers that TensorFlow Probability provides to the network.
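The article's core idea, adding TensorFlow Probability layers to an otherwise ordinary Keras model, can be sketched roughly as follows (a minimal, illustrative example assuming the tensorflow and tensorflow-probability packages, not the article's exact code):

    # Regression network whose last layer outputs a Normal distribution
    # instead of a point estimate.
    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.Dense(2),  # predicts mean and raw scale
        tfp.layers.DistributionLambda(
            lambda t: tfd.Normal(loc=t[..., :1],
                                 scale=tf.nn.softplus(t[..., 1:]))),
    ])

    # Train by maximising the likelihood of the targets under the
    # predicted distribution (negative log-likelihood loss).
    model.compile(optimizer="adam", loss=lambda y, dist: -dist.log_prob(y))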

AI Mysteries
Vijaysinh Lendave

What is Activity Regularization in Neural Networks?

Deep neural networks are sophisticated learning models that are prone to overfitting because they can memorize individual training-set patterns rather than learning a generalized approach that carries over to unseen data.
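As a rough sketch of the technique (not the article's exact code), Keras exposes activity regularization directly through the activity_regularizer argument, which penalises large layer outputs during training:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(
            64, activation="relu", input_shape=(20,),
            # L1 penalty on this layer's activations discourages memorisation
            activity_regularizer=tf.keras.regularizers.l1(1e-4)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")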

AI Mysteries
Yugesh Verma

Neural Network Hyperparameter Tuning using Bayesian Optimization

Bayesian statistics can be used for parameter tuning and can also make the process faster, especially in the case of neural networks. In this sense, applying Bayesian statistics is an optimization process through which we can perform hyperparameter tuning.
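A minimal sketch of Bayesian hyperparameter tuning, here using the keras-tuner package as one possible implementation (the units and lr search ranges are illustrative assumptions):

    import keras_tuner as kt
    import tensorflow as tf

    def build_model(hp):
        # Hyperparameters to be optimised: layer width and learning rate.
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32),
                                  activation="relu", input_shape=(10,)),
            tf.keras.layers.Dense(1),
        ])
        model.compile(
            optimizer=tf.keras.optimizers.Adam(
                hp.Float("lr", 1e-4, 1e-2, sampling="log")),
            loss="mse")
        return model

    tuner = kt.BayesianOptimization(build_model,
                                    objective="val_loss",
                                    max_trials=10)
    # tuner.search(x_train, y_train, validation_data=(x_val, y_val), epochs=5)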

AI Mysteries
Yugesh Verma

How to Use Lambda Layer in a Neural Network?

The Lambda layer applies its own function to modify the input data. Using a Lambda layer in a neural network, we can transform the data flowing through the model with the expressions and functions defined in that layer.
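A minimal sketch of the idea in Keras, where a Lambda layer wraps an arbitrary expression (here, squaring the activations) as a layer of the model:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
        tf.keras.layers.Lambda(lambda x: tf.square(x)),  # custom transformation
        tf.keras.layers.Dense(1),
    ])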

AI Mysteries
Vijaysinh Lendave

A Tutorial on Spiking Neural Networks for Beginners

While already quite effective in a variety of tasks across industries, deep learning is constantly evolving, with new neural network (NN) architectures such as the Spiking Neural Network (SNN) being proposed.

AI Mysteries
Victor Dey

Exploring Graph Neural Networks

Data scientists at CRED, Ravi Kumar and Samiran Roy, explained the essence of using graph neural networks and how the emerging technology is being utilised by CRED.

AI Mysteries
Vijaysinh Lendave

A Beginner’s Guide to Neural Network Pruning

Neural network pruning, which consists of methodically eliminating parameters from an existing network, is a popular approach for minimizing resource requirements at test time.
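One common way to apply magnitude-based pruning, sketched here with the tensorflow-model-optimization package (the 50% sparsity target and model are illustrative assumptions, not the article's setting):

    import tensorflow as tf
    import tensorflow_model_optimization as tfmot

    base_model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Wrap the model so that 50% of its weights are zeroed out during fine-tuning.
    pruned_model = tfmot.sparsity.keras.prune_low_magnitude(
        base_model,
        pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0))

    pruned_model.compile(optimizer="adam",
                         loss="sparse_categorical_crossentropy",
                         metrics=["accuracy"])
    # Fine-tuning requires the pruning callback:
    # pruned_model.fit(x, y, callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])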

AI Mysteries
Vijaysinh Lendave

LSTM Vs GRU in Recurrent Neural Network: A Comparative Study

Long Short-Term Memory (LSTM) is a special kind of RNN capable of learning long-term dependencies in sequences. It was introduced by Hochreiter and Schmidhuber in 1997 and is explicitly designed to avoid the long-term dependency problem: remembering information over long sequences is its default behaviour.
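A minimal sketch of how such a comparison can be set up in Keras, with the LSTM and GRU layers used interchangeably in the same sequence-classification model (the vocabulary and layer sizes are illustrative assumptions):

    import tensorflow as tf

    def make_model(rnn_layer):
        return tf.keras.Sequential([
            tf.keras.layers.Embedding(10000, 64),
            rnn_layer(32),                      # tf.keras.layers.LSTM or GRU
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])

    lstm_model = make_model(tf.keras.layers.LSTM)
    gru_model = make_model(tf.keras.layers.GRU)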
