Oracle Open Sources GraphPipe To Standardise Machine Learning Models


With an aim to help organisations deploy machine learning in their ecosystems, Oracle has open sourced its high-performance standard network protocol GraphPipe. Announced on August 15, GraphPipe provides a standard protocol for transmitting tensor data over the network, along with simple implementations of clients and servers that make deploying and querying ML models from any framework a breeze, said Oracle in a statement.

Vish Abrams, Architect, Cloud Development at Oracle, listed three major problems faced by organisations in implementing ML:

  1. There is no standard for model serving APIs
  2. Complications in building model servers
  3. Existing solutions do not focus on performance and fall short

“We created GraphPipe to solve these three challenges,” said Abrams. He added, “GraphPipe’s efficient servers can serve models built in TensorFlow, PyTorch, mxnet, CNTK, or caffe2. We are pleased to announce that GraphPipe is also available on Oracle’s GitHub.”

As of now, no dominant standard exists for how tensor-like data should be transmitted between components in a deep learning architecture. GraphPipe is designed to bring the efficiency of a binary, memory-mapped format while remaining simple and light on dependencies.

GraphPipe includes:

  • A set of flatbuffer definitions
  • Guidelines for serving models consistently according to the flatbuffer definitions
  • Examples for serving models from TensorFlow, ONNX, and caffe2
  • Client libraries for querying models served via GraphPipe

Flatbuffers are similar to Google protocol buffers, with the added benefit of avoiding a memory copy during the deserialisation step. The flatbuffer definitions provide a request message that includes input tensors, input names and output names. A GraphPipe remote model accepts the request message and returns one tensor per requested output name. The remote model also must provide metadata about the types and shapes of the inputs and outputs that it supports.
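The request/response shape described above can be sketched as follows. This is a minimal illustration in plain Python, not the actual GraphPipe flatbuffer schema; the class and field names (`Tensor`, `InferRequest`, `serve`, and so on) are hypothetical stand-ins for the concepts in the paragraph.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of the GraphPipe-style request/response shape.
# Field names are illustrative only, not the real flatbuffer definitions.

@dataclass
class Tensor:
    shape: List[int]
    dtype: str
    data: bytes  # raw row-major tensor contents

@dataclass
class InferRequest:
    input_names: List[str]       # which model inputs the tensors bind to
    input_tensors: List[Tensor]  # one tensor per input name
    output_names: List[str]      # one result tensor expected per name

@dataclass
class InferResponse:
    # Results come back in the same order as the requested output names.
    output_tensors: List[Tensor]

def serve(req: InferRequest) -> InferResponse:
    # Toy "remote model" that echoes its first input once per requested
    # output name, standing in for real inference behind the protocol.
    outs = [req.input_tensors[0] for _ in req.output_names]
    return InferResponse(output_tensors=outs)

req = InferRequest(
    input_names=["image"],
    input_tensors=[Tensor(shape=[1, 3], dtype="float32", data=b"\x00" * 12)],
    output_names=["probabilities"],
)
resp = serve(req)
assert len(resp.output_tensors) == len(req.output_names)
```

The key contract the sketch captures is that a GraphPipe server returns exactly one tensor for each requested output name, which is what lets any client query any conforming model server the same way.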

As of now, the GraphPipe flatbuffer spec, along with client libraries for Python, Go, and Java, can be found on Oracle’s GitHub. It also has a plugin for TensorFlow that allows the inclusion of a remote model inside a local TensorFlow graph.

Prajakta Hebbar
Prajakta is a Writer/Editor/Social Media diva. Lover of all that is 'quaint', her favourite things include dogs, Starbucks, butter popcorn, Jane Austen novels and neo-noir films. She has previously worked for HuffPost, CNN IBN, The Indian Express and Bose.
