Small AI labs in a world of tightening funds

The flexibility, interoperability and cost savings that open source models offer could greatly benefit small research labs.

Big AI labs like DeepMind, the Alan Turing Institute, OpenAI, and Meta AI are pushing the frontiers of artificial intelligence. Meanwhile, smaller labs are catching up: AI21 Labs, for instance, has released Jurassic-1 Jumbo, a language model that could give GPT-3 a run for its money.

AI21 Labs is not alone. Small AI labs such as ElkanIO Research Lab, Cohere for AI, Hidden Brains Infotech, LIVIA, and Borealis AI are doing impressive work in the AI space. ElkanIO Research Lab offers video analytics, robotic process automation, facial recognition and conversational AI solutions; Borealis AI performs fundamental and applied research in reinforcement learning, natural language processing, deep learning, and unsupervised learning; and LIVIA is engaged in the large-scale processing, analysis and interpretation of images and videos using artificial intelligence.



However, these research labs face several challenges. For starters, they struggle to find investors, and getting access to grants is an uphill task because of intense competition.

Solve for small

“First of all, try to do the research that you might be in a unique position to do. That’s a mix of what your colleagues are great in, and what other communities (eg good non-ML labs at your uni, personal hobby/interest, …) you have access to and could collaborate with,” said Lucas Beyer, senior research engineer at Google Brain.

The labs should capitalise on their strengths, in terms of both talent and the problem they are trying to solve, to build a competitive moat. Rather than casting the net wide, small labs should focus on a problem they are in a good position to solve. It is important for small AI labs to be antifragile.


The internet is rife with AI/ML communities like Hugging Face, Kaggle, GitHub, the Spark ML group, Towards AI, DataQuest and EleutherAI, where experts share their insights on AI, ML, computer vision and so on. These forums are a good place to brainstorm and ideate. Beyond knowledge transfer, such communities are a good catchment area for AI, ML and data science talent, and are also ideal for finding collaborators for your projects.

Right talent

Extensive knowledge of machine learning, statistics and probability is critical for doing research in AI, and finding, retaining and nurturing that talent is challenging. A deep understanding of data domains is required to build pathbreaking AI models. AI labs should hire people with the right balance of data intuition and state-of-the-art knowledge. These people are almost all academics, said Foteini Agrafioti, head of Borealis AI.

Today, resource-intensive, corporatised AI labs focus on product-based research, leading to an AI/ML monoculture. As a result, academia's contribution to large-scale AI research is falling drastically. Many good AI/ML researchers are disillusioned with the commoditisation of scientific research; small AI labs can absorb such talent interested in open-ended research for the common good.

Open source 

Open-source tools and models give small labs flexibility, interoperability and cost savings. TensorFlow, PyTorch, MLflow, NumPy, Keras and Pandas are some of the popular open-source tools that researchers can use to build solutions. For example, ElkanIO Research Lab uses TensorFlow, Python and RapidMiner for data analytics.

Small labs like Borealis AI not only use open-source AI/ML tools but also open-source their code and publish their results. Borealis AI's AdverTorch gives researchers tools for conducting adversarial robustness research in different directions, while its Private Synthetic Data Generation toolbox lets machine learning practitioners generate private, synthetic data samples from real-world data.

Pre-trained models

Training ML models is both time-consuming and resource-intensive. Small research labs could instead take different pre-trained models and repurpose them for their needs via transfer learning. Transfer learning is the improvement of learning in a new task (the target task) through the transfer of knowledge from a related task (the source task) that has already been learnt.
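The idea can be sketched without any ML library: keep a "pre-trained" feature extractor frozen and fit only a small head on the target task. Everything here (the toy feature function, the data and the nearest-centroid head) is illustrative, not drawn from any particular framework.

```python
import math

def pretrained_features(x):
    # Stand-in for a frozen, pre-trained feature extractor; in practice
    # this would be, say, a CNN trained on ImageNet whose weights are
    # never updated while adapting to the new task.
    return (x, x * x)

# Toy target task: tell "small" numbers (class 0) from "large" ones (class 1).
train = [(1.0, 0), (2.5, 0), (4.0, 0), (5.5, 0), (7.0, 0),
         (13.0, 1), (14.5, 1), (16.0, 1), (17.5, 1), (19.0, 1)]

def fit_head(data):
    # Fit only a lightweight head on top of the frozen features:
    # here, a nearest-class-centroid classifier.
    sums, counts = {}, {}
    for x, y in data:
        f = pretrained_features(x)
        s = sums.setdefault(y, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    # Classify by whichever class centroid is closest in feature space.
    f = pretrained_features(x)
    return min(centroids, key=lambda y: math.dist(f, centroids[y]))

centroids = fit_head(train)
accuracy = sum(predict(centroids, x) == y for x, y in train) / len(train)
print(accuracy)
```

Only the tiny head is fit on the target task; the expensive part, the feature extractor, is reused as-is, which is where the time and compute savings come from.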

“I feel like taking different pre-trained models and sticking them together in interesting ways seems pretty cool/promising without needing massive compute, see our LiT, UViM, but also flamingo, frozen-LM, CLIP-guided art, … Or find better ways to use/transfer them,” said Beyer.

Pre-trained models help save time and reduce computational costs because they require little additional training. Less computational power also means a smaller carbon footprint, which is an added advantage. InceptionV3 is a CNN built and trained by Google for image classification. T5 is another pre-trained model developed by Google for natural language tasks. YAMNet is a pre-trained deep neural network that can predict audio events from 521 classes.
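As a back-of-the-envelope illustration of the savings, consider fine-tuning only a new classification head on a frozen image backbone. The figures below (a 2,048-dimensional feature vector and roughly 23.9 million frozen parameters, in the ballpark of an InceptionV3-sized model) are assumptions for the sketch, not exact published numbers.

```python
# Assumed, ballpark figures for an InceptionV3-like backbone
# (hypothetical values for this sketch).
backbone_params = 23_900_000   # frozen, reused as-is
feature_dim = 2048             # size of the pooled feature vector
num_classes = 10               # classes in the new target task

# A new dense classification head: one weight per (feature, class)
# pair plus one bias per class.
head_params = feature_dim * num_classes + num_classes

trainable_fraction = head_params / (backbone_params + head_params)
print(head_params)                   # 20490 trainable parameters
print(f"{trainable_fraction:.4%}")   # well under 0.1% of the total
```

Training tens of thousands of parameters instead of tens of millions is what makes this approach feasible on the modest hardware budgets typical of small labs.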

Cloud computing

Cloud computing helps small labs lower their operating costs and scale up with easy access to flexible resources. For example, Google's TPU Research Cloud (TRC) lets small labs accelerate their machine learning research with free access to Cloud TPUs and frameworks like TensorFlow, PyTorch, Julia and JAX. Cloud TPUs are custom-designed machine learning ASICs built to run cutting-edge machine learning models on Google Cloud.

Zinnia Banerjee
