
Docker Solution For TensorFlow 2.0: How To Get Started

Containers have a long history that dates back to the 1960s. Over time, the technology has advanced a great deal and has become one of the most useful tools in the software industry. Today, Docker has become synonymous with containers.

In one of our previous articles, we discussed how Docker is helping in the Machine Learning space. Today, we will implement one of the many use cases of Docker in the development of ML applications.


What you will learn

  • Introduction To Docker
  • Installing & Setting Up Docker
  • Getting Started With Docker
  • TensorFlow 2.0 Container
    • Downloading TensorFlow 2.0 Docker Image
    • Firing Up The Container
    • Accessing The Jupyter Notebook
    • Sharing Files
    • Installing Missing Dependencies
    • Committing Changes & Saving The Container Instance
    • Running Container From The New Image

Introduction To Docker

Docker is a very popular and widely-used container technology. Docker has an entire ecosystem for managing containers which includes a repository of images, container registries and command-line interfaces, among others. Docker also comes with cluster management for containers which allows multiple containers to be managed collectively in a distributed environment.

Installing & Setting Up Docker

Head to the Docker website and sign up for a Docker ID. Once you are in, you will see the following page.

Click on the Get started with Docker Desktop button.

Click to download the right version for your operating system.

Once the file is downloaded, open it to install Docker Desktop. Follow the standard procedure for installation based on your operating system and preferences.

On successful installation, you will be able to see Docker on your taskbar. 

You can click on the icon to set your Docker preferences and to update it.

If you see a green dot that says Docker Desktop is running, we are all set to fire up containers.

Also, execute the following command in the terminal or command prompt to ensure that everything is perfect:

docker --version

If everything is fine, it should return the installed version of Docker:

Docker version 19.03.4, build 9013bf5

Getting Started With Docker

Before we begin, there are a few basic things that you need to know.

Images: An image, or Docker image, is a lightweight snapshot of a Linux operating system or environment. Docker Hub, Docker’s official repository, contains thousands of images that can be used to create containers.

Check out the official Docker images on Docker Hub.

Containers: Containers are running instances of a Docker image. A single image can be used to fire up multiple containers.

Some basic docker commands:

Get familiar with the following commands.

docker pull <repository/image_name:version>

  • The above command downloads the specified version of a docker image from the specified repository.

docker images

  • The above command will return a table of the images in your local repository (i.e., on your local machine).

docker run

  • The above command fires up a container from a specified image.

docker ps

  • The above command will return a table of all the running docker containers.

docker ps -a -q

  • The above command lists all containers, both running and stopped; because of the -q flag, only their IDs are printed.

docker rmi <repo/name:version>

  • The above command can be used to delete a docker image from the local repository.

docker stop <container_id/name>

  • The above command stops a running container.

docker rm -f <container_id/name>

  • The above command removes a Docker container. The -f flag force-removes the container even if it is running. Like images, containers have both IDs and names.

We will be using the above commands a lot when dealing with Docker containers. We will also learn some additional commands in the following sections.

TensorFlow 2.0 Container

We will use TensorFlow’s official Docker image with Jupyter, named tensorflow/tensorflow:nightly-py3-jupyter. The image comes with Jupyter Notebook preinstalled and a nightly build of TensorFlow 2.0.

Downloading TensorFlow 2.0 Docker Image

To download the image run the following command.

docker pull tensorflow/tensorflow:nightly-py3-jupyter

Once the download and extraction are complete, run the docker images command to list the Docker images on your machine.

Firing Up The Container

To start the container we will use the Docker run command.

docker run -it -p 1234:8888 -v /Users/aim/Documents/Docker:/tf/ image_id

Let’s break it down:

  • docker run: used to fire up containers from a docker image
  • -it: This flag enables interactive mode. It lets us see what’s going on after the container is created.
  • -p: This parameter is used for port mapping. The above command maps the port 1234 of the local machine with the internal port 8888 of the docker container.
  • -v: This parameter mounts a volume or directory into the running container, enabling data sharing between the container and the local machine. The above command mounts the local directory /Users/aim/Documents/Docker to the container’s /tf directory.
  • image_id or name: The name or ID of the Docker image from which the container is to be created.
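To see how the flags fit together, the command above can be assembled programmatically. A minimal Python sketch (the helper name is illustrative, not part of Docker):

```python
def docker_run_cmd(image, host_port, container_port, host_dir, container_dir,
                   interactive=True):
    """Assemble a `docker run` command string from its parts."""
    flags = "-it " if interactive else ""
    return (f"docker run {flags}-p {host_port}:{container_port} "
            f"-v {host_dir}:{container_dir} {image}")

# Reproduces the command used above
print(docker_run_cmd("image_id", 1234, 8888,
                     "/Users/aim/Documents/Docker", "/tf/"))
```

Swapping the port or directory arguments is all it takes to run the same image for a different project.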

We can now list the running containers in the system using docker ps command.

To stop the container, use docker stop <container_id>. The container ID will be returned by the docker ps command.

Accessing The Jupyter Notebook

On successful execution of the run command, the Jupyter Notebook will be served on port 1234 of the localhost.

Open up your browser and navigate to http://localhost:1234.

Copy the token from the container logs and use it to log in to Jupyter Notebook.

Once logged in, you will see an empty directory, which points to the /tf/ directory in the container. This directory is mapped to the Documents/Docker directory on the local machine.

Sharing Files

While running the container, we mounted a local volume to the container that maps to the /tf/ directory within the container.

To share any files with the container, simply copy the required files into the local folder that was mounted to the container. In this case, copy the files to /Users/aim/Documents/Docker to access them in the Jupyter Notebook.

Once you copy the files and refresh the notebook, you will find them there.

Installing Missing Dependencies


Find an example notebook below. In the following notebook, we will try to predict the cost of used cars from MachineHack’s Predicting The Costs Of Used Cars hackathon. Sign up to download the datasets for free.

Download the above notebook along with the datasets and copy them into your mounted directory (/Users/aim/Documents/Docker in my case).

Now let’s start from where we left off with our Jupyter Notebook running on docker.

Open the notebook and try to import some of the necessary modules.

import tensorflow as tf
import numpy as np
import pandas as pd



You will find that most of the modules are missing. Now let’s fix this.

There are two ways to fix this: we can either pip install the modules from the Jupyter Notebook and commit the changes to the container, or we can go inside the container, install all the missing dependencies, and then commit the changes.

Let’s take the second approach.

Entering The Docker Container


  • Since we used the -it flag, we will not be able to use the existing terminal/command prompt window. Open a new terminal for the following process.

Get the container ID using docker ps and use the following command to enter the running container.

docker exec -it container_id /bin/bash

Since containers are lightweight Linux environments that share the host’s kernel, all you need to know are some basic Linux commands.

So let’s install all those necessary modules that we need. For this example, I will install 4 modules that I found missing. 

Inside the container do pip install for all the missing libraries:

pip install pandas
pip install xlrd
pip install scikit-learn
pip install seaborn

Exit from the container instance by typing exit.


  • The easiest way to do this is to list all the missing modules in a requirements.txt file on your local machine, copy it into the shared directory of the container, and run pip install -r requirements.txt. You can also use the pip freeze > requirements.txt command to export the installed modules from your local environment into a requirements.txt file.
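As a sketch, the modules found missing in this example can be collected into a requirements.txt like so (scikit-learn is the PyPI name for the sklearn package):

```shell
# Create a requirements.txt listing the missing modules
cat > requirements.txt <<'EOF'
pandas
xlrd
scikit-learn
seaborn
EOF

# Copy the file into the shared directory, then, inside the container:
# pip install -r requirements.txt
```

This installs everything in one command instead of four separate pip install calls.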

Now go back to your Jupyter Notebook and try importing all those modules again.

Hooray! No more missing modules error!

Committing Changes & Saving The Container Instance

Now that we have our development environment ready with all the dependencies installed, let’s save it so that we don’t have to install them all over again.

Use the following command to commit the changes made to the container and save it as a new image/version.

docker commit container_id new_name_for_image

Eg: docker commit ae6071309f1f tensorflow/tensorflow2.0:all_deps_installed

Running Container From The New Image

Now that we have a new image with all the dependencies installed, we can remove the downloaded image and use the new one instead.

To delete or remove an image, use the following command:

docker rmi <repository:tagname(image_name)>

Eg: docker rmi tensorflow/tensorflow:nightly-py3-jupyter


  • To remove an image, you must first kill all the running containers of that image. Use the docker stop command to stop a container and the docker rm command to remove it.

To fire up containers from the new image, use the following command:

docker run -p localmachine_port:container_port -v localmachine_directory:container_directory image_name:version

Eg. docker run -it -p 8081:8888 -v /Users/aim/Documents/Docker:/tf/ tensorflow/tensorflow2.0:all_deps_installed


  • You can create as many containers as your machine would permit. In the above example, you can run multiple Jupyter notebooks by mapping different local machine ports to different containers.
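For instance, mapping a different local port per project yields one run command per container. A Python sketch (the ports and image tag follow the examples above):

```python
# Illustrative: one `docker run` command per project, each on its own local port
image = "tensorflow/tensorflow2.0:all_deps_installed"
ports = [8081, 8082, 8083]

commands = [
    f"docker run -it -p {port}:8888 -v /Users/aim/Documents/Docker:/tf/ {image}"
    for port in ports
]

for cmd in commands:
    print(cmd)
```

Running each command in its own terminal serves three independent Jupyter Notebooks on localhost ports 8081, 8082, and 8083.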

Great! You can now set up different development environments for each of your projects!


Copyright Analytics India Magazine Pvt Ltd
