Thanks to the internet of things (IoT), networks are extending farther and farther from conventional workstations and centralised data centres. This trend has created a need for computing power closer to the endpoints, and to meet it, edge computing devices such as gateways have been developed.
An alternative, and arguably more advanced, option has emerged: tiny machine learning, which embeds analytics on sensors at the very edge of connected ecosystems.
“The question was whether we could perform analytics on the device itself. It was a Mission Impossible kind of task,” explained Evgeni Gousev, senior director at Qualcomm Technologies as well as co-founder and board chairman of the TinyML Foundation.
With TinyML (a trademark of the TinyML Foundation), machine learning algorithms can run on small, low-power devices at the far end of an IoT ecosystem, even where there is no connectivity. It ensures that devices can process data in real time, exactly where it is created, and can detect and respond to issues regardless of latency or bandwidth constraints. “It solves cost, power and privacy issues. It’s a cheap, democratic way of doing AI,” Gousev said.
AI at the Edge: What’s driving it
IoT offers a number of benefits, including the ability to understand an environment instantly and respond to issues appropriately. To deliver those benefits, access to intelligence must be fast, which is why cloud computing became popular.
“We put AI and machine learning models that kept growing in the cloud where there was massive infrastructure. So, the inference was made there. But if we have a low-power, remote device or poor connectivity, that’s not so good,” said Pablo Micolini, engineering manager at Theorem.
Rather than relying on the cloud, he believes TinyML is the better choice when everything must fit on tiny devices: processing happens on-device, with no latency. As an example, a wearable device designed to detect critical medical problems would need “always-on” capabilities to analyse patient data and respond when an emergency arises.
There are numerous “trigger types of use cases” where smart sensors are programmed to detect and alert to specific events. As an example, retailers could use TinyML to detect empty store shelves and alert them to the need to restock. In an emergency, hoteliers could use TinyML to identify occupied rooms. A manager at an elder care facility could use it to detect falls in each unit.
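The “trigger” pattern described above can be sketched in a few lines. The code below is a hypothetical illustration, not from any cited source: the names (`shelf_is_empty`, `restock_alerts`) and the 50-gram weight threshold are invented stand-ins for a real on-device model, which would typically run as a quantised neural network on a microcontroller. The key idea it shows is that the device streams alerts on state changes rather than raw sensor data.

```python
# Hypothetical sketch of a TinyML "trigger" workflow: an on-device model
# turns a continuous stream of sensor readings into sparse alert events.
# Names and the threshold value are illustrative assumptions.

def shelf_is_empty(weight_grams: float, threshold: float = 50.0) -> bool:
    """Stand-in for an on-device inference: classify a load-cell reading.
    A real deployment would run a small trained model here instead."""
    return weight_grams < threshold

def restock_alerts(readings):
    """Scan (shelf_id, weight) readings and emit an alert only when a shelf
    transitions to 'empty' - mimicking an always-on sensor that reports
    events, not raw data, so little bandwidth is needed."""
    state = {}   # last known empty/full status per shelf
    alerts = []
    for shelf_id, weight in readings:
        empty = shelf_is_empty(weight)
        if empty and not state.get(shelf_id, False):
            alerts.append(shelf_id)  # fire once per transition to empty
        state[shelf_id] = empty
    return alerts

readings = [("A1", 900.0), ("A1", 30.0), ("A1", 20.0), ("B2", 700.0)]
print(restock_alerts(readings))  # prints ['A1'] - one alert despite two low readings
```

The same event-driven structure applies to the other examples in the article: the classifier changes (fall detection, room occupancy), but the device still wakes the wider system only when a trigger condition is met.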
TinyML: benefits, challenges, and the future
TinyML will grow quickly, according to researchers. ABI Research predicts that 2.5 billion devices with TinyML chipsets will ship in 2030, with low latency, advanced automation and highly power-efficient, low-cost chipsets becoming more important.
Supporters attribute this projected growth to TinyML’s advantages: it provides instantaneous analytics and responses, eliminates latency, and works without connectivity. Furthermore, it can keep the data it processes local, protecting it from hackers, who generally target centralised data stores.
Despite its benefits, enterprise IT leaders should expect challenges when implementing and scaling TinyML within their companies. Moreover, the growth of AI capability in edge devices, the expansion of 5G networks, and other advances in IoT technologies may reduce the appeal of, and need for, TinyML. Nelson believes TinyML is most useful for niche needs, such as deployments in remote locations, while others predict explosive growth in TinyML deployments in the future.