How TensorFlow Lite Fits In The TinyML Ecosystem

TensorFlow Lite has emerged as a popular platform for running machine learning models at the edge. A microcontroller is a small, low-cost device designed to perform specific tasks within embedded systems.

In a workshop held as part of Google I/O, TensorFlow founding member Pete Warden delved deep into the potential use cases of TensorFlow Lite for microcontrollers.

Further, quoting a blog's definition of TinyML, he said: 

“Tiny machine learning is capable of performing on-device sensor data analytics at extremely low power, typically in the mW range and below, and hence enabling a variety of always-on use cases and targeting battery-operated devices.” 

A Venn diagram showing the composition of TinyML (Source: Google I/O) 

How is TinyML different? 

Most machine learning applications are resource-intensive and expensive to deploy and maintain.

According to phData, $65K (INR 47 lakhs) is the bare minimum required to deploy and maintain a model over five years. If you build a scalable framework to support future modelling activities, the cost can escalate to $95K (INR 70 lakhs) over the same period.

On the other hand, TinyML is flexible, simple, and requires far less power.

Each hardware component (mW and below) acts independently, and most TinyML models barely exceed 30 KB in storage. Also, data can be processed locally on the device, which reduces latency and sidesteps data privacy issues. Arm, Arduino, SparkFun, Adafruit and Raspberry Pi are among the major players in TinyML.
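To see why models of this class stay so small, it helps to do the arithmetic. The sketch below uses hypothetical layer sizes (a small keyword-spotting-style network, chosen purely for illustration) to estimate the flash footprint of a quantised model:

```python
# Rough back-of-the-envelope estimate of a TinyML model's flash footprint.
# The layer sizes are hypothetical, chosen to illustrate why quantised
# models often fit in well under 30 KB.

def model_size_bytes(layer_params, bytes_per_weight=1):
    """Total weight storage for a model, given parameter counts per layer.

    bytes_per_weight=1 corresponds to int8 quantisation; use 4 for float32.
    """
    return sum(layer_params) * bytes_per_weight

# Three dense layers: (inputs * outputs) weights + biases per layer.
layers = [8 * 64 + 64, 64 * 64 + 64, 64 * 4 + 4]

int8_size = model_size_bytes(layers, bytes_per_weight=1)
float_size = model_size_bytes(layers, bytes_per_weight=4)

print(f"int8:    {int8_size / 1024:.1f} KB")   # comfortably under 30 KB
print(f"float32: {float_size / 1024:.1f} KB")  # 4x larger
```

The same parameter count costs four times as much flash in float32, which is one reason quantisation is a standard step in the TinyML workflow described below.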

TensorFlow Lite, an open-source library by Google, helps design and run tiny machine learning (TinyML) models across a wide range of low-power hardware devices, and does not require much coding or machine learning expertise, said Warden.

Benefits of TinyML:  

  • Really small form factors enable multiple use cases 
  • Cheaper devices make ML more accessible 
  • Low battery consumption means devices can run for much longer 
  • Data processing can be done on the device (no cloud connection required) 

How does TinyML work? 

The TinyML process works in four simple steps — gather/collect data, design and train the model, quantise the model and deploy to the microcontroller. 
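The quantise step shrinks a trained model's weights from 32-bit floats to 8-bit integers so they fit on a microcontroller. Below is a minimal plain-Python sketch of the affine (scale and zero-point) scheme that post-training quantisation is based on; the real TensorFlow Lite converter performs this automatically, so this is for illustration only:

```python
# Minimal sketch of affine int8 quantisation: map float weights onto the
# signed 8-bit range, then recover approximate floats on the device.
# Illustration only; a real converter handles this per tensor or per channel.

def quantize(weights, num_bits=8):
    """Map float weights onto the signed int range, e.g. [-128, 127]."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    w_min, w_max = min(weights), max(weights)
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = round(qmin - w_min / scale)
    q = [max(qmin, min(qmax, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights from the integer representation."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-0.51, 0.03, 0.27, 0.49, -0.12]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
# Each restored weight is within one quantisation step of the original.
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

Storing one byte per weight instead of four is what lets trained models squeeze into the tens-of-kilobytes budgets typical of microcontrollers, at the cost of a small, bounded rounding error.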

Source: Google I/O

In a blog post, ‘TensorFlow Lite for Microcontrollers,’ Google has explained some of its latest projects that combine Arduino and TensorFlow to create useful tools: 

To initiate the project, you need the TF4Micro Motion Kit pre-installed on an Arduino. Once you have installed the packages and libraries on your laptop or personal computer, look for the red, green and blue flashing LED in the middle of the board. The details of the setup are found here.

Once the setup is complete, connect the device via Bluetooth; the TF4Micro Motion Kit communicates with the experiment website via BLE, giving you a wireless experience. Now, tap the button on your Arduino and wait for the red, green, and blue LED pattern to return. After this, click the ‘connect’ button shown on the website, then select TF4Micro Motion Kit from the dialogue box. You are now good to go. The same steps apply to all three experiments — Air Snare, FUI and Tiny Motion Trainer.

Note: Do not hold the button down as this will clear the board. 

The above experiments will help you get the hang of TensorFlow Lite on microcontrollers. You can also submit your ideas to the TensorFlow Microcontroller Challenge and win exciting cash prizes. 

As part of the TensorFlow Microcontroller Challenge, SparkFun is giving out a free TensorFlow Lite for Microcontrollers kit. Click here to get yours. 

Amit Raja Naik
Amit Raja Naik is a seasoned technology journalist who covers everything from data science to machine learning and artificial intelligence for Analytics India Magazine, where he examines the trends, challenges, ideas, and transformations across the industry.
