Recurrent Neural Networks (RNNs) are neural networks that retain information across time steps, which makes them well suited to sequential data. In the past few years, this family of networks has gained much traction and has been utilised in several applications, including speech recognition, machine translation, video tagging, text summarisation, prediction and more.
Here, we have listed the top 10 open-source projects on Recurrent Neural Networks (RNNs), in no particular order, that one must try their hands on.
LSTM Human Activity Recognition
About: This project is about Human Activity Recognition (HAR) using TensorFlow, an LSTM RNN and a smartphone sensor dataset. Here, you need to classify the type of movement among six activity categories: walking, walking upstairs, walking downstairs, sitting, standing and laying. The input is raw time-series data from a phone worn at the waist, and an LSTM is trained on these sequences to recognise which activity the user is performing.
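The data preparation step for such a model can be sketched as sliding-window segmentation of the sensor stream; the window length, step and sampling rate below are illustrative, not the project's actual settings:

```python
import numpy as np

def segment_windows(signal, window_len=128, step=64):
    """Slice a multi-channel sensor signal of shape (time, channels)
    into fixed-length, overlapping windows for an LSTM classifier."""
    windows = []
    for start in range(0, len(signal) - window_len + 1, step):
        windows.append(signal[start:start + window_len])
    return np.stack(windows)  # shape: (num_windows, window_len, channels)

# Toy example: 10 s of 3-axis accelerometer data at 50 Hz
signal = np.random.randn(500, 3)
X = segment_windows(signal)
print(X.shape)  # (6, 128, 3) -- each window is one input sequence
```

Each window then gets one of the six activity labels, and the LSTM learns to map the window to the label.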
Know more here.
Text Classifier for Hierarchical Attention Networks for Document Classification
About: This project is a text classifier based on Hierarchical Attention Networks for Document Classification. It uses Keras and the popular IMDB dataset, and the Keras functional API makes the hierarchical input layers straightforward to implement. In this project, a hierarchical LSTM network is built as a baseline, and the Keras TimeDistributed wrapper is then used to construct the hierarchical input layers.
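The hierarchical idea is that sentences are first encoded into vectors, and the document is then encoded from those sentence vectors. A minimal NumPy sketch of that composition, with mean-pooling standing in for the project's LSTM encoders:

```python
import numpy as np

def encode_sentences(doc, W):
    """Stand-in sentence encoder: mean of word embeddings per sentence.
    In the actual project, a word-level LSTM wrapped in TimeDistributed
    plays this role."""
    return np.stack([W[sent].mean(axis=0) for sent in doc])

def encode_document(sent_vecs):
    """Stand-in document encoder: mean of sentence vectors
    (the project uses a second, sentence-level LSTM instead)."""
    return sent_vecs.mean(axis=0)

vocab, dim = 50, 8
W = np.random.randn(vocab, dim)                # word embedding table
doc = [np.array([1, 4, 7]), np.array([2, 3])]  # two sentences of word ids
sents = encode_sentences(doc, W)               # (2, 8): one vector per sentence
doc_vec = encode_document(sents)               # (8,): one vector per document
print(sents.shape, doc_vec.shape)
```

The document vector then feeds a final classification layer, exactly as in a flat classifier.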
Know more here.
Handwritten Text Recognition with TensorFlow
About: This project, SimpleHTR, is a Handwritten Text Recognition (HTR) system implemented with TensorFlow (TF) and trained on the IAM offline HTR dataset. The neural network (NN) model recognises the text contained in images of segmented words.
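Word-image models like this are typically trained with a CTC loss; at inference time, the simplest decoder is best-path (greedy) decoding. A minimal sketch, with illustrative class indices rather than the project's real character set:

```python
import numpy as np

def ctc_greedy_decode(logits, blank=0):
    """Best-path CTC decoding: take the argmax label per time step,
    collapse consecutive repeats, then drop the blank symbol."""
    path = np.argmax(logits, axis=1)
    decoded, prev = [], blank
    for label in path:
        if label != prev and label != blank:
            decoded.append(int(label))
        prev = label
    return decoded

# Toy per-time-step class probabilities over 4 classes (0 = blank)
logits = np.array([[0.1, 0.8, 0.05, 0.05],   # 1
                   [0.1, 0.8, 0.05, 0.05],   # 1 (repeat, collapsed)
                   [0.9, 0.03, 0.03, 0.04],  # blank
                   [0.1, 0.1, 0.1, 0.7],     # 3
                   [0.9, 0.03, 0.03, 0.04],  # blank
                   [0.1, 0.1, 0.7, 0.1]])    # 2
print(ctc_greedy_decode(logits))  # [1, 3, 2]
```

The decoded indices would then be mapped back to characters via the model's character list.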
Know more here.
NER-LSTM
About: Named Entity Recognition is the classification problem of identifying the names of people, organisations and similar entities in a text corpus. The project is about Named Entity Recognition using multi-layered bidirectional LSTMs and task-adapted word embeddings. Here, you will implement a two-layer bidirectional LSTM network using TensorFlow to classify the named entities for the CoNLL 2003 NER shared task.
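The core of a bidirectional layer is running one recurrent pass forward and one backward, then concatenating the two hidden states per token. A NumPy sketch, with simple tanh RNN cells standing in for the project's LSTM cells:

```python
import numpy as np

def rnn_pass(X, Wx, Wh, reverse=False):
    """One simple (tanh) RNN layer over a sequence X of shape (T, d_in)."""
    T = len(X)
    h = np.zeros(Wh.shape[0])
    out = np.zeros((T, Wh.shape[0]))
    steps = range(T - 1, -1, -1) if reverse else range(T)
    for t in steps:
        h = np.tanh(X[t] @ Wx + h @ Wh)
        out[t] = h
    return out

rng = np.random.default_rng(0)
T, d_in, d_h = 5, 4, 3
X = rng.normal(size=(T, d_in))
Wx_f, Wh_f = rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h))
Wx_b, Wh_b = rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h))

# Concatenate forward and backward states per token, as a BiLSTM does;
# per-token NER tags are then predicted from these combined states.
H = np.concatenate([rnn_pass(X, Wx_f, Wh_f),
                    rnn_pass(X, Wx_b, Wh_b, reverse=True)], axis=1)
print(H.shape)  # (5, 6)
```

Stacking a second bidirectional layer on top of `H` gives the two-layer network the project describes.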
Know more here.
RNN NLU
About: This project is an attention-based RNN model for spoken language understanding, mainly intent detection and slot filling. It is a TensorFlow implementation of attention-based LSTM models for sequence classification and sequence labelling. As a prerequisite, you will need TensorFlow version r1.2.
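For the sequence-classification side (intent detection), attention scores each encoder state, normalises the scores with a softmax, and pools the states into one context vector. A minimal additive-attention sketch with illustrative weights:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, v, W):
    """Additive attention over RNN states H of shape (T, d): score each
    state, softmax the scores, return the weighted sum as a summary."""
    scores = np.tanh(H @ W) @ v   # one scalar score per time step
    weights = softmax(scores)     # attention distribution over time
    return weights @ H, weights   # context vector, attention weights

rng = np.random.default_rng(1)
T, d = 6, 4
H = rng.normal(size=(T, d))       # encoder hidden states
W, v = rng.normal(size=(d, d)), rng.normal(size=d)

context, weights = attention_pool(H, v, W)
print(context.shape, round(float(weights.sum()), 6))  # (4,) 1.0
```

The intent label is predicted from `context`, while slot filling predicts one label per time step from the individual states.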
Know more here.
RMDL: Random Multimodal Deep Learning for Classification
About: RMDL, or Random Multimodal Deep Learning for Classification, is an ensemble deep learning approach for classification. RMDL addresses the problem of finding the best deep learning structure and architecture while simultaneously improving robustness and accuracy through ensembles of deep learning architectures.
It includes three random models: a DNN classifier, a deep CNN classifier and a deep RNN classifier, where each RNN unit can be an LSTM or a GRU.
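The ensemble combines its members' outputs by voting. A minimal sketch of majority voting across three models' class predictions (the predictions below are hypothetical):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model class predictions by majority vote, the kind of
    ensemble rule RMDL applies across its DNN, CNN and RNN members."""
    combined = []
    for votes in zip(*predictions):  # one tuple of votes per sample
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Hypothetical predictions from three models on four samples
dnn = [0, 1, 1, 2]
cnn = [0, 1, 2, 2]
rnn = [1, 1, 2, 0]
print(majority_vote([dnn, cnn, rnn]))  # [0, 1, 2, 2]
```

Because each member is randomly structured, disagreements between members are expected, and voting smooths them out.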
Know more here.
Stock Price Predictor
About: This project utilises a deep learning model, a Long Short-Term Memory (LSTM) neural network, to predict stock prices. You will use Keras to build an LSTM that predicts stock prices from historical closing prices and trading volumes, and visualise both the predicted prices over time and the optimal parameters for the model.
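Before the LSTM can be trained, the price series has to be framed as a supervised problem: each input is a window of past closes and the target is the next close. A sketch with an illustrative lookback and toy prices:

```python
import numpy as np

def make_windows(prices, lookback=3):
    """Turn a price series into (X, y) pairs: each X row holds `lookback`
    consecutive closes and y is the close that follows."""
    X = np.array([prices[i:i + lookback]
                  for i in range(len(prices) - lookback)])
    y = np.array(prices[lookback:])
    return X[..., None], y  # LSTM input shape: (samples, timesteps, features)

closes = [10.0, 10.5, 10.2, 10.8, 11.0, 10.9]
X, y = make_windows(closes)
print(X.shape, y.tolist())  # (3, 3, 1) [10.8, 11.0, 10.9]
```

Adding trading volume as a second feature per time step would simply extend the last dimension from 1 to 2.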
Know more here.
Stock Price Prediction LSTM
About: This project uses LSTM recurrent neural networks on the open, high, low and closing prices of Apple Inc. stock to predict the OHLC average. The RNN model, built with the Keras deep learning library, stacks two sequential LSTM layers followed by one dense layer. The prerequisites include Python 2.7 and the latest versions of all libraries, including the deep learning libraries Keras and TensorFlow.
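The prediction target here is the per-day OHLC average, i.e. the mean of the four daily prices. A small sketch with illustrative price rows:

```python
import numpy as np

# Illustrative OHLC rows (open, high, low, close) for three trading days;
# the model is trained to predict the series of these per-day averages.
ohlc = np.array([[150.0, 152.0, 149.0, 151.0],
                 [151.0, 153.0, 150.0, 152.0],
                 [152.0, 154.0, 151.0, 153.0]])

ohlc_avg = ohlc.mean(axis=1)  # target series fed to the stacked LSTMs
print(ohlc_avg.tolist())      # [150.5, 151.5, 152.5]
```

This averaged series is then windowed into input sequences and targets, as in any LSTM time-series setup.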
Know more here.
Deep Spying
About: This project is about spying using a smartwatch and deep learning. The goal of this work is to raise awareness about the potential risks related to motion sensors built into wearable devices, and to demonstrate abuse opportunities enabled by advanced neural network architectures.
The LSTM-based implementation presented in this research can perform touch logging and keylogging on 12-keys keypads with above-average accuracy even when confronted with unprocessed raw data.
Know more here.
Attention Mechanisms
About: Attention mechanisms have transformed the landscape of machine translation and are increasingly used in other domains of natural language processing. This project provides custom layer implementations for a whole family of attention mechanisms, compatible with TensorFlow and Keras.
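One well-known member of that family is scaled dot-product attention, where each query attends to all keys and the values are mixed by the resulting weights. A minimal NumPy sketch (the shapes are illustrative):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(Q, K, V):
    """Scaled dot-product attention: score each query against all keys,
    normalise per query, and mix the values by those weights."""
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)  # (T_q, T_k)
    return weights @ V, weights

rng = np.random.default_rng(2)
Q = rng.normal(size=(2, 4))   # 2 queries
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 4))   # 5 values
out, w = dot_product_attention(Q, K, V)
print(out.shape)              # (2, 4)
```

Additive (Bahdanau-style) and multiplicative (Luong-style) variants differ only in how the scores are computed; the softmax-and-mix step is the same.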
Know more here.