FastAPI vs Flask: Comparison Guide for Data Science Enthusiasts

Data science, being a multidisciplinary area, is not restricted to building problem-specific models. One of the challenges faced by practitioners in this field is deploying ML models. Nowadays, though, it is pretty straightforward to deploy or test your machine learning model at the production level. This is an essential step because not everyone is interested in your code; they just want the final application serving their needs. It is good practice for data scientists to develop models end to end so that they can be handed over for further testing (in our case, to a domain expert). There are several paths for deploying machine learning models: a web interface is the most common, but there are others such as Android/iOS apps and IoT devices.

When it comes to web deployment, there are Python-based frameworks like Django, Flask, and, more recently, FastAPI, which has been gaining popularity. Django is a high-level Python framework used for building secure, large-scale websites. In contrast, Flask and FastAPI are micro-frameworks used to build small-scale websites or ML-based applications. You can check here a comparison between these frameworks.

In this article, our primary focus is to build a web interface for a machine learning application using the Flask and FastAPI frameworks and to compare their functionality against our needs.

Implementation: 

Here we are deploying a gradient-boosting-based machine learning model. A GradientBoostingClassifier is an ensemble method that combines many weak learners to create a strong predictive model; usually, gradient boosting uses decision trees as the weak learners. The model predicts whether a person will suffer from cardiac arrest or not based on ten input parameters. The detailed notebook of the model can be found here.
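
For context, here is a minimal sketch of how such a model might be trained. The CSV file name, the 'cardio' target column, and the train/test split settings are assumptions for illustration; they are not taken from the original notebook.

 import pandas as pd
 from sklearn.ensemble import GradientBoostingClassifier
 from sklearn.model_selection import train_test_split

 # 'cardio_data.csv' and the 'cardio' target column are hypothetical names
 df = pd.read_csv('cardio_data.csv')
 X = df.drop(columns=['cardio'])   # the ten input features
 y = df['cardio']                  # binary target: cardiac arrest or not

 X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

 model = GradientBoostingClassifier()  # uses decision trees as weak learners by default
 model.fit(X_train, y_train)
 print('Test accuracy:', model.score(X_test, y_test))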




Input to be taken from the user is as below (a sketch of how these values map onto a feature vector follows the list):

  1. Person's age: an integer
  2. Gender: male = 1, female = 2
  3. Height in cm: an integer
  4. Weight in kg: an integer
  5. Upper blood pressure: an integer
  6. Lower blood pressure: an integer
  7. Cholesterol level: normal = 1, above normal = 2, well above normal = 3
  8. Glucose level: normal = 1, above normal = 2, well above normal = 3
  9. Smoking status: does not smoke = 0, smokes = 1
  10. Alcohol status: non-alcoholic = 0, alcoholic = 1
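
To make this encoding concrete, here is a small sketch of one user's answers arranged in the order the model expects; the values themselves are purely hypothetical.

 import numpy as np

 # One user's hypothetical answers, encoded as described in the list above:
 # age, gender, height, weight, upper BP, lower BP, cholesterol, glucose, smoking, alcohol
 sample = np.array([52, 1, 172, 81, 140, 90, 2, 1, 0, 0])
 print(sample.shape)  # (10,) -- one row with the ten input parameters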

For saving and loading the model, the pickle module is used to serialize the model in binary format so that it can be moved to any platform directly.
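
As a minimal sketch, saving and reloading the trained classifier could look like this, reusing the model and the sample vector from the sketches above; Healthcare.pkl is the file name used in the deployment code later in the article.

 import pickle

 # serialize the trained classifier ('model' from the training sketch above)
 with open('Healthcare.pkl', 'wb') as f:
     pickle.dump(model, f)

 # later, or on another machine, load it back and predict on one sample
 with open('Healthcare.pkl', 'rb') as f:
     loaded_model = pickle.load(f)
 print(loaded_model.predict([sample]))  # 0 = no cardiac arrest predicted, 1 = cardiac arrest predicted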

Flask:

Flask is a micro-framework written in Python. Micro-frameworks are normally frameworks with few to no dependencies on external libraries. Flask provides modules that make it easier to write applications without worrying about low-level details such as protocol handling and thread management.

Let’s try to build a basic web page using Flask, which returns a simple string.

 from flask import Flask

 app = Flask(__name__)  # create the Flask application

 @app.route("/")  # register the home page route
 def home():
     return "This blog is about Flask and FastAPI"

 if __name__ == "__main__":
     app.run()  # start the built-in development server

The web page is shown below.

As the model requires ten input parameters, we have to write an HTML form with ten input fields and return it with Flask's render_template so that the values can be collected from the user.

After completing the HTML, you can see the interface below.

To run our application, we need to write the Flask code that serves the HTML page and handles the posted form data to return a prediction:

 from flask import Flask, render_template, request
 import numpy as np
 import pickle

 # load the trained model saved earlier with pickle
 model = pickle.load(open('Healthcare.pkl', 'rb'))
 app = Flask(__name__)

 @app.route('/')
 def new():
     return render_template('home.html')

 @app.route('/predict', methods=['POST', 'GET'])
 def predict():
     # read the ten form fields and cast them to float
     data1 = float(request.form['a'])
     data2 = float(request.form['b'])
     data3 = float(request.form['c'])
     data4 = float(request.form['d'])
     data5 = float(request.form['e'])
     data6 = float(request.form['f'])
     data7 = float(request.form['g'])
     data8 = float(request.form['h'])
     data9 = float(request.form['i'])
     data10 = float(request.form['j'])
     features = np.array([data1, data2, data3, data4, data5,
                          data6, data7, data8, data9, data10])
     pred = model.predict([features])

     def statement():
         if pred == 0:
             return 'Result: The model has predicted that you will not suffer from cardiac arrest, but you should take care of yourself.'
         elif pred == 1:
             return 'Result: You should consult a doctor; the model has predicted that you will suffer from cardiac arrest.'

     return render_template('home.html', statement=statement())

 if __name__ == '__main__':
     app.run()

Entering feature values and hitting the Predict button will give you output like this:

So, after spending nearly 30 minutes (provided you know HTML), we have created a very basic and simple web interface for our ML model. The problem with this approach is that there is no data validation, and as you know, feeding an ML model the wrong data types will crash the whole program. If I pass a string value to any of the inputs, it shows an error on the HTML page without any statement explaining the cause. The error page looks like below.

Even if you want to implement data validation, you have to write many 'if' statements (or try/except blocks) to check every possible data type coming in, or use separate libraries, which adds more work; a sketch of such manual validation is shown below.
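
For illustration only, here is one way that manual validation might look. It would replace the '/predict' view defined earlier, and it only guards the numeric conversion, so every additional rule (value ranges, allowed categories, and so on) would still need its own hand-written check.

 # Illustrative replacement for the '/predict' view above: guard each float
 # conversion instead of letting a bad value crash the request
 @app.route('/predict', methods=['POST', 'GET'])
 def predict():
     values = []
     for field in ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']:
         raw = request.form.get(field, '')
         try:
             values.append(float(raw))
         except ValueError:
             # respond with a readable message instead of a bare error page
             return render_template('home.html',
                                    statement=f'Invalid value "{raw}" for field {field}: a number is required.')
     pred = model.predict([values])
     result = ('Result: no cardiac arrest predicted.' if pred == 0
               else 'Result: cardiac arrest predicted, please consult a doctor.')
     return render_template('home.html', statement=result)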

Flask is popular in the ML community, but it has certain disadvantages: slower speed of operation, potential security risks from third-party modules, and, as shown in the example above, no built-in data validation, which matters most in our case.

FastAPI: 

FastAPI is a modern framework that allows us to build APIs seamlessly without much effort or time. As the name suggests, it is much faster than Flask because it is built on ASGI (Asynchronous Server Gateway Interface) instead of WSGI (Web Server Gateway Interface), which Flask is built on. Check here if you want to know more about ASGI and WSGI.

It has a built-in data validation system that can detect invalid data types at runtime and return the reason for the bad input in JSON format. FastAPI uses Pydantic for data validation, something that Flask lacks.

FastAPI was built with three main concerns in mind: speed of operation, developer experience, and open standards.

It automatically generates interactive documentation for the API when we run the application.

Whereas with Flask we created a separate HTML page to take values from the user, with FastAPI there is no such need. If you want to use HTML for design purposes, you still can.

 import uvicorn  # ASGI server used to run the FastAPI app
 from fastapi import FastAPI
 import pickle
 from pydantic import BaseModel

 # Request schema: the ten model inputs, each validated as a float by Pydantic
 class Features(BaseModel):
     Persons_age: float
     Gender: float
     Height_in_cm: float
     Weight_in_kg: float
     Upper_blood_pressure: float
     Lower_blood_pressure: float
     Cholesterol_level: float
     Glucose_level: float
     Smoking_status: float
     Alcohol_status: float

Here, at the beginning, we define a class that inherits from Pydantic's BaseModel and acts as a data validation schema, declaring the data type required for each input of the model.
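
As a quick sketch of what that validation buys us, passing a non-numeric value to the Features model raises a ValidationError that pinpoints the offending field. This is plain Pydantic behaviour; FastAPI wraps the same information into its JSON error responses, as we will see later.

 from pydantic import ValidationError

 bad_input = {'Persons_age': 'fifty',  # not convertible to float
              'Gender': 1, 'Height_in_cm': 172, 'Weight_in_kg': 81,
              'Upper_blood_pressure': 140, 'Lower_blood_pressure': 90,
              'Cholesterol_level': 2, 'Glucose_level': 1,
              'Smoking_status': 0, 'Alcohol_status': 0}

 try:
     Features(**bad_input)
 except ValidationError as e:
     print(e)  # reports that Persons_age is not a valid number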

Now let’s define the endpoint for our model prediction.  

 # load the pickled model and create the FastAPI application
 model = pickle.load(open('Healthcare.pkl', 'rb'))
 app = FastAPI()

 @app.get("/")
 def home():
     return {'message': 'ML model for cardiac arrest prediction'}

 @app.post('/predict')
 def predict(data: Features):
     # Pydantic has already validated the request body against the Features schema
     data = data.dict()
     Persons_age = data['Persons_age']
     Gender = data['Gender']
     Height_in_cm = data['Height_in_cm']
     Weight_in_kg = data['Weight_in_kg']
     Upper_blood_pressure = data['Upper_blood_pressure']
     Lower_blood_pressure = data['Lower_blood_pressure']
     Cholesterol_level = data['Cholesterol_level']
     Glucose_level = data['Glucose_level']
     Smoking_status = data['Smoking_status']
     Alcohol_status = data['Alcohol_status']
     pred = model.predict([[Persons_age, Gender, Height_in_cm, Weight_in_kg, Upper_blood_pressure,
                            Lower_blood_pressure, Cholesterol_level, Glucose_level, Smoking_status, Alcohol_status]])

     def statement():
         if pred == 0:
             return 'Result: The model has predicted that you will not suffer from cardiac arrest, but you should take care of yourself.'
         elif pred == 1:
             return 'Result: You should consult a doctor; the model has predicted that you will suffer from cardiac arrest.'

     return {'prediction': statement()}

 if __name__ == '__main__':
     uvicorn.run(app)  # start the ASGI server programmatically

It is very similar to the Flask version, but we are using the Uvicorn server, an ASGI implementation. That's it; there is no need to render HTML files to serve requests from the user. After running the application, visit http://127.0.0.1:8000/.

Now comes the interesting part of FastAPI, the one that makes it so popular. To see the automatically generated documentation and to test the API, go to the '/docs' endpoint, and you will be presented with a Swagger UI that allows you to try out the API, as shown below.

Go to the POST method for the prediction endpoint and hit 'Try it out' to check the model output.

Enter the ten input feature values mentioned in the request body and hit the Execute button.

Scroll down and check the summary of the execution.

If you feed it input that it cannot process, it returns a detailed error message, as shown below.
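
The same behaviour can be checked outside the browser. Here is a small sketch using the requests library, assuming the app is running locally on port 8000; the input values are the same hypothetical ones used earlier.

 import requests

 url = 'http://127.0.0.1:8000/predict'

 # well-formed request: all ten fields are numbers, so a prediction comes back
 good = {'Persons_age': 52, 'Gender': 1, 'Height_in_cm': 172, 'Weight_in_kg': 81,
         'Upper_blood_pressure': 140, 'Lower_blood_pressure': 90,
         'Cholesterol_level': 2, 'Glucose_level': 1,
         'Smoking_status': 0, 'Alcohol_status': 0}
 print(requests.post(url, json=good).json())  # {'prediction': 'Result: ...'}

 # malformed request: Persons_age is not a number, so FastAPI answers with a
 # 422 status and a JSON body explaining which field failed validation
 bad = dict(good, Persons_age='fifty')
 resp = requests.post(url, json=bad)
 print(resp.status_code)  # 422
 print(resp.json())       # roughly {'detail': [{'loc': ['body', 'Persons_age'], 'msg': '...', ...}]}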

Another documentation generator, ReDoc, also comes with FastAPI and generates clean documentation with all the endpoints listed. It can be accessed at the /redoc endpoint, as shown below.

Conclusion:

This article mainly focused on how Flask and FastAPI differ when deploying models at the production level. After all this discussion, I can say that choosing FastAPI over Flask is a good choice where ML is concerned: the main goal is to test models in a production-like environment, and FastAPI saves a lot of time in building the API, whereas Flask takes longer to build the same. FastAPI also generates user-friendly documentation, which helps you explain your program's usage to your team.

Vijaysinh Lendave
Vijaysinh is an enthusiast in machine learning and deep learning. He is skilled in ML algorithms, data manipulation, handling and visualization, and model building.
