Beginner's Guide to Keras Callbacks: ModelCheckpoint, EarlyStopping and LearningRateScheduler in Deep Learning

In this article, we will explore different Keras callback functions. We will build a deep neural network for a regression problem and use different callbacks while training the model.
Keras Callbacks

Callback functions can play a very significant role when training deep learning models. Training such models can take days to complete, so we need a way to monitor and control the process. For example, if the model starts overfitting we can stop the training, or if the loss has reached its minimum and starts increasing in the following epochs, we can stop the training there as well.

Deep learning training runs can also crash partway through. Imagine you have already trained for three days and all of that progress is lost. To handle these situations, Keras provides several callback functions that help avoid such problems while training the model.

For this experiment, we will use the Boston Housing dataset, which is publicly available on Kaggle for downloading. Since the dataset also ships with Keras, we will import it directly from there.


What will we learn from this article? 

  • Building a deep neural network
  • Keras callbacks
  • Visualizing loss and validation loss while training
  • ModelCheckpoint
  • EarlyStopping
  • Learning Rate Scheduler

The Dataset

We are using the Boston Housing dataset, which consists of 506 rows and 14 columns. It was originally part of the UCI Machine Learning Repository and contains information about different houses in Boston. The goal is to build a model capable of predicting house prices from the given features.

Building a Deep Neural Network

First, we will import the required library and the dataset. Since we import the dataset directly from Keras, it comes already split into a training and a testing set, which we store in the corresponding variables. After this, we check the shape of the data and find there are 404 samples in training and 102 in testing. Use the below code to do all this.

import tensorflow as tf

(X_train, y_train), (X_test, y_test) = tf.keras.datasets.boston_housing.load_data()

print(X_train.shape, X_test.shape)

Output: (404, 13) (102, 13)
Now we will define our network by adding different layers. We first define a sequential model, add a batch normalization layer to normalize the inputs, and then add a dense layer that gives us the output (the price of the house). We then compile the model using stochastic gradient descent as the optimizer and mean squared error as the loss. Use the below code to define the network.

model = tf.keras.models.Sequential()

model.add(tf.keras.layers.BatchNormalization(input_shape=(13,)))

model.add(tf.keras.layers.Dense(1))

model.compile(optimizer='sgd', loss='mse')

After this, we fit the model on the training data, with the test set as validation data, and start training the network. We store the result of training in a history object, which records values such as the loss for each epoch while the model trains. We set the number of epochs to 30. Use the below code for training the network.

history = model.fit(X_train, y_train, validation_data=(X_test, y_test), epochs=30)


Once the training is done, we will see what is present in the history object. Use the below code to check that.

print(history.history.keys())

Output: dict_keys(['loss', 'val_loss'])


As we can see, the history object stored the loss and validation loss for each epoch. Now let's visualize them in a graph.

Visualizing loss and validation loss while training

import matplotlib.pyplot as plt

plt.plot(history.history['loss'], label='loss')

plt.plot(history.history['val_loss'], label='val_loss')

plt.legend()

plt.show()

ModelCheckpoint

This Keras callback is used to save the model after every epoch (or only the best model so far). We just need to define a few parameters, such as where we want to store the model and which metric we want to monitor.

Use the below code for saving the model. We first define the path and then set val_loss as the metric to monitor; whenever it decreases, the model is saved. We then train the network again.

filepath = '/content/drive/My Drive/All ss'

from keras.callbacks import ModelCheckpoint

checkpoint = ModelCheckpoint(filepath, monitor='val_loss', mode='min', save_best_only=True, verbose=1)

callbacks_list = [checkpoint]

model.fit(X_train, y_train, validation_data=(X_test, y_test), epochs=15, batch_size=32, callbacks=callbacks_list)

Output: at every epoch where the validation loss decreases, the model is automatically saved to the given path, as the training log shows.
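A saved checkpoint can later be restored with tf.keras.models.load_model. The following is a minimal self-contained sketch of that round trip; it uses a tiny synthetic dataset, a temporary directory, and an illustrative one-layer model rather than the Drive path and network from the article, so it runs standalone.

```python
import os
import tempfile

import numpy as np
import tensorflow as tf
from tensorflow.keras.callbacks import ModelCheckpoint

# Tiny synthetic regression data so the sketch runs standalone.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 13)).astype("float32")
y = X.sum(axis=1, keepdims=True)

model = tf.keras.models.Sequential([
    tf.keras.layers.Input(shape=(13,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# Save only the best model (lowest val_loss) to a temp path.
ckpt_path = os.path.join(tempfile.mkdtemp(), "best_model.keras")
checkpoint = ModelCheckpoint(ckpt_path, monitor="val_loss",
                             mode="min", save_best_only=True, verbose=0)
model.fit(X, y, validation_split=0.25, epochs=3,
          callbacks=[checkpoint], verbose=0)

# Restore the best model from disk and use it for prediction.
restored = tf.keras.models.load_model(ckpt_path)
print(restored.predict(X[:2], verbose=0).shape)  # (2, 1)
```

This is the usual pattern for resuming work after a crash: the checkpoint file on disk survives even if the training process does not.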

EarlyStopping


This Keras callback is used to stop model training early. It is very helpful when the model starts to overfit: training stops as soon as the monitored metric stops improving. As with model checkpoints, we need to define which metric to monitor; here we will monitor the validation loss. Use the below code to use the early stopping function.

from keras.callbacks import EarlyStopping

earlystop = EarlyStopping(monitor='val_loss', min_delta=0, patience=3, verbose=1, restore_best_weights=True)
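The callback takes effect only once it is passed to fit. Here is a minimal self-contained sketch; the synthetic data, the one-layer model, and the deliberately generous epoch budget are illustrative choices, not the article's setup.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.callbacks import EarlyStopping

# Small synthetic regression data so the sketch runs standalone.
rng = np.random.default_rng(0)
X = rng.normal(size=(128, 13)).astype("float32")
y = X.sum(axis=1, keepdims=True)

model = tf.keras.models.Sequential([
    tf.keras.layers.Input(shape=(13,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

earlystop = EarlyStopping(monitor="val_loss", min_delta=0,
                          patience=3, verbose=1,
                          restore_best_weights=True)

# With a generous epoch budget, training ends as soon as val_loss
# fails to improve for `patience` consecutive epochs.
history = model.fit(X, y, validation_split=0.25, epochs=100,
                    callbacks=[earlystop], verbose=0)
print(len(history.history["loss"]))  # epochs actually run
```

Because restore_best_weights=True, the model is rolled back to the weights from the epoch with the lowest validation loss, not the weights from the final (worse) epoch.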


In this run, the model training stopped after 10 epochs instead of running for the full number we set. This is the benefit of using early stopping.

Learning Rate Scheduler 

This is a very simple callback that can be used to adjust the learning rate over the course of training. The schedule is defined before training as a function that takes the epoch index (and, optionally, the current learning rate) and returns the learning rate to use for that epoch. Use the below code to use the learning rate scheduler.

from keras.callbacks import LearningRateScheduler

scheduler = LearningRateScheduler(schedule, verbose=0)
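The schedule argument above is a function we have to supply ourselves. A common pattern, shown here as a sketch with arbitrarily chosen constants, keeps the learning rate constant for the first ten epochs and then decays it exponentially:

```python
import math

def schedule(epoch, lr):
    # Keep the initial learning rate for the first 10 epochs,
    # then shrink it by a constant factor each epoch after that.
    if epoch < 10:
        return lr
    return lr * math.exp(-0.1)

print(schedule(0, 0.01))                 # 0.01
print(round(schedule(10, 0.01), 6))      # slightly below 0.01
```

This function can then be passed to LearningRateScheduler(schedule) and supplied in the callbacks list of fit, just like the other callbacks above.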


I will conclude by stating that Keras callbacks are efficient functions used while training a model to monitor its performance and control the training. We have discussed EarlyStopping, LearningRateScheduler and ModelCheckpoint. You can also explore TensorBoard in the article titled “TensorBoard Tutorial – Visualise the Model Performance During Training”.

Rohit Dwivedi
