
A Brief Overview of OpenChat: Open Source Chatting Framework for Generative Models

OpenChat

OpenChat is an open-source Python framework for chatting with generative models: it lets you talk to an AI with only one line of code. Currently, it supports the following models:

  • BlenderBot [small, medium, large, xlarge], proposed in Recipes for building an open-domain chatbot by Stephen Roller, Emily Dinan, Naman Goyal, Da Ju, Mary Williamson, Yinhan Liu, Jing Xu, Myle Ott, Kurt Shuster, Eric M. Smith, Y-Lan Boureau and Jason Weston (Facebook AI Research)

Installation 


OpenChat can be installed easily from PyPI:

!pip install openchat

Demo of OpenChat

  1. Chatting via terminal 

You can start chatting with the AI with the two lines of code below. First, it will ask you to enter your user ID; the user ID is used to manage user-specific history.

from openchat import OpenChat
OpenChat(model="blenderbot", size="large")

Use .exit to exit the terminal.

Use .clear to clear all the history.

To use OpenChat with a GPU, specify the device as shown below:

from openchat import OpenChat
OpenChat(model="blenderbot", size="large", device="cuda")
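The "cuda" device only works when a GPU is actually visible to PyTorch, which the BlenderBot weights run on. A small defensive sketch for picking the device (the fallback branch is only a safety net; OpenChat itself requires PyTorch):

```python
# Fall back to the CPU when no GPU (or no PyTorch) is available.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    device = "cpu"

# The chosen string can then be passed straight to OpenChat, e.g.
# OpenChat(model="blenderbot", size="large", device=device)
```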
  2. Chat via your own environment (useful when deployed on Facebook Messenger or WhatsApp). A web demo is also available.

    To start the application:

!git clone https://github.com/hyunwoongko/openchat.git
%cd openchat

from openchat import OpenChat
from demo.web_demo_env import WebDemoEnv
OpenChat(model="blenderbot", size="large", env=WebDemoEnv())
  • For creating your own environment: inherit the base class BaseEnv and implement your methods as shown below:
from typing import Dict
from flask import Flask, render_template
from flask_cors import CORS
from flask_ngrok import run_with_ngrok
from openchat.envs import BaseEnv
from openchat.models import BaseModel

class WebDemoEnv(BaseEnv):
    def __init__(self):
        super().__init__()
        self.app = Flask(__name__)
        run_with_ngrok(self.app)
        CORS(self.app)

    def run(self, model: BaseModel):
        @self.app.route("/")
        def index():
            return render_template("index.html", title=model.name)

        @self.app.route('/send/<user_id>/<text>', methods=['GET'])
        def send(user_id, text: str) -> Dict[str, str]:
            if text in self.keywords:
                # self.keywords maps a keyword to (function, message),
                # e.g. self.keywords['/exit'] = (exit_function, 'good bye.')
                _out = self.keywords[text][1]  # message to print when the keyword triggers
                self.keywords[text][0](user_id, text)  # function to run when the keyword triggers
            else:
                _out = model.predict(user_id, text)
            return {"output": _out}

        self.app.run()
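Because the /send/<user_id>/<text> route carries the message in the URL path, a client has to URL-encode it before calling the endpoint. A minimal sketch using only the standard library (the base URL is an assumption; Flask's development server defaults to port 5000):

```python
from urllib.parse import quote

def send_url(base: str, user_id: str, text: str) -> str:
    # quote() escapes spaces and punctuation so they survive in the URL path.
    return f"{base}/send/{quote(user_id, safe='')}/{quote(text, safe='')}"

print(send_url("http://localhost:5000", "alice", "Hello there!"))
# http://localhost:5000/send/alice/Hello%20there%21
```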

Now start the application with the following code:

from openchat import OpenChat
OpenChat(model="blenderbot", size="small", env=WebDemoEnv())
  3. Additional options:
  • Add custom keywords: you can add your own keywords, such as .exit or .clear. To do so, call the self.add_keyword('.new_keyword', 'message to print', triggered_function) method, where triggered_function is a function of the form function(user_id: str, text: str). The basic structure of the code is shown below:
from openchat.envs import BaseEnv

class YourOwnEnv(BaseEnv):
    def __init__(self):
        super().__init__()
        self.add_keyword(".new_keyword", "message to print", self.function)

    def function(self, user_id: str, text: str):
        """do something!"""
  • Modify generation options 
  1. Modification of max_context_length (the number of input history tokens) for chatting. The default is 128.

OpenChat(model="blenderbot", size="small", device="cuda", max_context_length=256)
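max_context_length caps how much conversation history is fed back into the model. The rough idea can be sketched as follows (a whitespace split stands in for the real subword tokenizer, which is an assumption):

```python
def truncate_history(history: list[str], max_context_length: int) -> list[str]:
    # Keep the most recent utterances whose combined token count fits
    # in the budget; whitespace split stands in for a real tokenizer.
    kept, used = [], 0
    for utterance in reversed(history):
        n = len(utterance.split())
        if used + n > max_context_length:
            break
        kept.append(utterance)
        used += n
    return list(reversed(kept))
```

With a budget of 3 tokens, truncate_history(["a b c", "d e", "f"], 3) keeps only the two most recent utterances.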

2. Modification of generation options for finer control over responses, for example:

model.predict(
    user_id="USER_ID",
    text="Hello.",
    num_beams=5,
    top_k=20,
    top_p=0.8,
)
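Briefly: num_beams sets the beam-search width, top_k keeps only the k most probable next tokens, and top_p (nucleus sampling) keeps the smallest set of tokens whose probabilities add up to p. A minimal sketch of the top-k/top-p filtering idea (not openchat's internal code):

```python
def filter_top_k_top_p(probs: dict[str, float], top_k: int, top_p: float) -> list[str]:
    # Sort candidate tokens by probability, keep at most top_k,
    # then cut the tail once cumulative probability reaches top_p.
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    kept, cumulative = [], 0.0
    for token, p in ranked:
        kept.append(token)
        cumulative += p
        if cumulative >= top_p:
            break
    return kept
```

For probabilities {"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}, top_k=3 and top_p=0.8 keep only "a" and "b"; the model then samples among the survivors.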
  • Check histories: you can check all of your history using self.histories.

from openchat.envs import BaseEnv

class YourOwnEnv(BaseEnv):
    def __init__(self):
        super().__init__()
        print(self.histories)
  • Clear histories: you can clear a user's history by running the code below.

from typing import Dict
from flask import Flask
from openchat.envs import BaseEnv
from openchat.models import BaseModel

class YourOwnEnv(BaseEnv):
    def __init__(self):
        super().__init__()
        self.app = Flask(__name__)

    def run(self, model: BaseModel):
        @self.app.route('/send/<user_id>/<text>', methods=['GET'])
        def send(user_id, text: str) -> Dict[str, str]:
            self.clear(user_id, text)  # clear all histories!
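Conceptually, clearing boils down to dropping a user's entry from a per-user history mapping. A standalone sketch of that pattern (names are illustrative, not openchat's own):

```python
class HistoryStore:
    # Minimal per-user history store mirroring the self.histories idea.
    def __init__(self):
        self.histories = {}

    def append(self, user_id: str, text: str):
        # Record an utterance under this user's ID.
        self.histories.setdefault(user_id, []).append(text)

    def clear(self, user_id: str):
        # Forget everything recorded for this user.
        self.histories.pop(user_id, None)
```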


Aishwarya Verma
A data science enthusiast and a post-graduate in Big Data Analytics. Creative and organized with an analytical bent of mind.
