
Small AI labs in a world of tightening funds

The flexibility, interoperability and cost savings that open source models offer could greatly benefit small research labs.


Big AI labs like DeepMind, the Alan Turing Institute, OpenAI, and Meta AI are pushing the frontiers of artificial intelligence. Meanwhile, smaller labs such as AI21 Labs, whose Jurassic-1 Jumbo language model could give GPT-3 a run for its money, are catching up.

Beyond AI21 Labs, a clutch of small AI labs, including ElkanIO Research Lab, Cohere for AI, Hidden Brains Infotech, LIVIA, and Borealis AI, are doing impressive work in the AI space. ElkanIO Research Lab offers video analytics, robotic process automation, facial recognition and conversational AI solutions; Borealis AI performs fundamental and applied research in reinforcement learning, natural language processing, deep learning, and unsupervised learning; and LIVIA works on large-scale processing, analysis and interpretation of images and videos using artificial intelligence.

However, these research labs face several challenges. For starters, they struggle to find investors, and securing grants is an uphill task because competition for them is fierce.

Solve for small

“First of all, try to do the research that you might be in a unique position to do. That’s a mix of what your colleagues are great in, and what other communities (eg good non-ML labs at your uni, personal hobby/interest, …) you have access to and could collaborate with,” said Lucas Beyer, senior research engineer at Google Brain.

The labs should capitalise on their strengths, both the talent they have and the problem they are trying to solve, to build a competitive moat. Rather than casting the net wide, small labs should focus on a problem they are well positioned to solve. It is also important for small AI labs to be antifragile, set up so that shocks and volatility strengthen them rather than break them.

Networking

The Internet is rife with AI/ML communities such as Hugging Face, Kaggle, GitHub, Spark ML group, Informed.ai, Towards AI, DataQuest and EleutherAI, where experts share their insights on AI, ML, computer vision, and so on. These forums are a good place to brainstorm and ideate. Beyond knowledge transfer, such communities are a good catchment area for AI, ML and data science talent and an ideal place to find collaborators for your projects.

Right talent

Extensive knowledge of machine learning, statistics, and probability is critical for doing research in AI, and finding, retaining, and nurturing talent is challenging. A deep understanding of data domains is required for building pathbreaking AI models. AI labs should hire people with the right balance of data intuition and state-of-the-art knowledge; these people are almost all academics, said Foteini Agrafioti, head of Borealis AI.

Today, resource-intensive and corporatized AI labs focus on product-based research, leading to an AI/ML monoculture, and academia's contribution to large-scale AI research is falling drastically. Many good AI/ML researchers are disillusioned with the commoditization of scientific research, and small AI labs can absorb such talent, which is drawn to open-ended research for the common good.

Open source 

Open source models offer flexibility, interoperability and cost savings that could greatly benefit small research labs. TensorFlow, PyTorch, MLflow, NumPy, Keras, and Pandas are some of the popular open-source tools researchers can use to build solutions. For example, ElkanIO Research Lab uses TensorFlow, Python, and RapidMiner for data analytics.
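As a minimal sketch of how a few of these tools can fit together for a small lab, the snippet below loads a tabular dataset with pandas, trains a tiny Keras model, and logs the run with MLflow. The file name, column names and hyperparameters are hypothetical placeholders, not taken from any of the labs mentioned.

```python
# Sketch: pandas + Keras + MLflow for a small lab's experiment loop.
# "data.csv" and the "label" column are hypothetical placeholders.
import mlflow
import pandas as pd
import tensorflow as tf

df = pd.read_csv("data.csv")                         # assumed tabular dataset
X = df.drop(columns=["label"]).values.astype("float32")
y = df["label"].values

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

with mlflow.start_run():                             # track the run so results stay reproducible
    mlflow.log_param("hidden_units", 64)
    history = model.fit(X, y, epochs=5, validation_split=0.2, verbose=0)
    mlflow.log_metric("val_accuracy", float(history.history["val_accuracy"][-1]))
```

Lightweight experiment tracking like this is one way open-source tooling keeps a small lab's results reproducible without any paid infrastructure.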

Small labs like Borealis AI not only use open-source AI/ML tools but also open-source their code and publish their results. Borealis AI's AdverTorch gives researchers tools for conducting adversarial robustness research in different directions, while its Private Synthetic Data Generation toolbox lets machine learning practitioners generate private, synthetic data samples from real-world data.
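As an illustration, the sketch below follows the usage pattern shown in AdverTorch's documentation: wrap a trained PyTorch classifier in an L-infinity PGD attack and generate adversarial examples. The model and the data batch here are stand-ins, not a real benchmark.

```python
# Sketch: craft L-inf PGD adversarial examples with AdverTorch.
# The classifier and the input batch are placeholders for illustration.
import torch
import torch.nn as nn
from advertorch.attacks import LinfPGDAttack

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))  # placeholder classifier
model.eval()

images = torch.rand(8, 1, 28, 28)            # dummy batch of MNIST-sized inputs
labels = torch.randint(0, 10, (8,))

adversary = LinfPGDAttack(
    model, loss_fn=nn.CrossEntropyLoss(reduction="sum"),
    eps=0.3, nb_iter=40, eps_iter=0.01,
    rand_init=True, clip_min=0.0, clip_max=1.0, targeted=False,
)
adv_images = adversary.perturb(images, labels)  # perturbed inputs for robustness evaluation
```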

Pre-trained models

Training ML models is both time-consuming and resource-intensive. Small research labs can instead take different pre-trained models and repurpose them for their own needs through transfer learning: the improvement of learning in a new (target) task by transferring knowledge from a related (source) task that has already been learnt.
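A minimal transfer-learning sketch, assuming a small image-classification task: freeze an ImageNet-pretrained InceptionV3 base as a feature extractor and train only a new classification head. The number of target classes is a placeholder.

```python
# Transfer learning sketch: reuse an ImageNet-pretrained InceptionV3 as a frozen
# feature extractor and train only a small head on the target task.
import tensorflow as tf

NUM_CLASSES = 5  # placeholder: number of classes in the target task

base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3)
)
base.trainable = False  # keep the source-task knowledge fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # new target-task head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(target_dataset, epochs=...)  # train only the head on the small dataset
```

Because only the small head is trained, the compute bill stays a fraction of what training the full network from scratch would cost.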

“I feel like taking different pre-trained models and sticking them together in interesting ways seems pretty cool/promising without needing massive compute, see our LiT, UViM, but also flamingo, frozen-LM, CLIP-guided art, … Or find better ways to use/transfer them,” said Beyer.

Pre-trained models help save time and cut computational costs because they require less training; less compute also means a smaller carbon footprint, which is an added advantage. InceptionV3 is a CNN built and trained by Google for image classification. T5 is another pre-trained model from Google that frames NLP tasks, including text classification, as text-to-text problems. YAMNet is a pre-trained deep neural network that can predict audio events from 521 classes.
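To show how little code such reuse can take, the sketch below loads the published YAMNet model from TensorFlow Hub and runs it on a waveform; the waveform here is a synthetic second of silence rather than real audio.

```python
# Sketch: run Google's pre-trained YAMNet from TensorFlow Hub on a waveform.
# The hub handle is the published model; the waveform below is synthetic.
import numpy as np
import tensorflow_hub as hub

yamnet = hub.load("https://tfhub.dev/google/yamnet/1")

waveform = np.zeros(16000, dtype=np.float32)        # 1 second of 16 kHz mono audio
scores, embeddings, spectrogram = yamnet(waveform)  # scores: frames x 521 classes
top_class = int(np.argmax(scores.numpy().mean(axis=0)))  # most likely audio event overall
```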

Cloud computing

Cloud computing helps small labs lower their operating costs and scale up with easy access to flexible resources. For example, Google's TPU Research Cloud (TRC) lets small labs accelerate their machine learning research with free access to Cloud TPUs, which work with frameworks like TensorFlow, PyTorch, Julia and JAX. A Cloud TPU is a custom-designed machine learning ASIC for running cutting-edge machine learning models on Google Cloud.
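As a sketch of what that looks like in practice, the snippet below follows TensorFlow's documented pattern for connecting to a Cloud TPU and building a model under TPUStrategy; it assumes a TPU runtime (for instance one provisioned through TRC or Colab) is already attached, and the model itself is a placeholder.

```python
# Sketch: connect TensorFlow to a Cloud TPU and build a model under TPUStrategy.
# Assumes a TPU runtime is attached (e.g. a TRC-provisioned TPU VM or Colab TPU).
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")  # auto-detect the TPU
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():  # variables and training steps are replicated across TPU cores
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```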

Zinnia Banerjee

Zinnia loves writing and it is this love that has brought her to the field of tech journalism.