Without getting into definitions, we can say that the quest to build supercomputing for all is well underway. Far faster and cheaper compute is available at our fingertips. As the old comparison goes, a standard mobile phone is far more powerful than the first supercomputers universities used to support their research. A new wave of thinking holds that giving computation to everyone is not enough; everyone should also have access to high-quality supercomputing, cheaply and readily, like never before.
The implications of supercomputing, or very fast, high-quality computing, being available to ordinary citizens at a low price are hard to overstate. To get a sense of them, consider what happened when we gave computation to billions of people through cheap smartphones and cheap internet: petabytes of content created every day, millions of people expressing their opinions every moment, and revolutions in politics, communication, human connection, and the way people work and collaborate. It is no exaggeration to say that democratising simple computation through the internet changed the way we organise our societies.
Data Scientists As Vanguards For The Mission
The web was built by physicists to ease their communication and share their research more widely, and the physics community is responsible for many future-shaping applications. Data scientists are now rising as a strong community spread across many domains and fields, making sense of data and spotting valuable patterns, served by a growing ecosystem of products and tools.
One of the unique demands felt by individual data scientists is the need for personal supercomputing, with access to GPUs and other specialised accelerators. One way to look at this development is to see individuals with a craving for supercomputation that cannot be fulfilled by today’s personal computers and mobile phones. Cloud computing is spreading, and accessing it is becoming easier by the day. But let us zoom in on a recent phenomenon that was mostly neglected by the wider community yet could have huge implications for the future: both for how it materially affects the state of computation and for what it does to the perception and the story of computation’s spread. The story is about Google Colab, and about the data scientists who are the vanguard of this movement, “Supercomputing For All.”
Google Colab is one of the most useful tools in the toolkit of thousands of data scientists and engineers around the world. It is an easy way of hosting Jupyter Notebooks on the cloud (backed by Google Drive). What’s more, Google Colab became famous for providing free GPU computation for a limited amount of time. It mainly started as an experiment to make it easier for data scientists to spread knowledge of various algorithms in the community.
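A Colab session may or may not have a GPU attached, depending on the runtime type selected. A minimal, library-free way to check, assuming only that a GPU runtime exposes NVIDIA's `nvidia-smi` utility on the PATH (the function name here is illustrative, not a Colab API):

```python
import shutil


def gpu_runtime_attached():
    """Return True if an NVIDIA GPU driver appears visible in this environment.

    In a Colab GPU runtime, the `nvidia-smi` utility is on the PATH;
    in a CPU-only runtime (or on most laptops) it usually is not.
    """
    return shutil.which("nvidia-smi") is not None


print("GPU runtime attached:", gpu_runtime_attached())
```

In practice, users relying on a deep learning framework would call its own device query (for example, `torch.cuda.is_available()`), but the sketch above works in any Python environment.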
Google Colab Pro And How It Affects The Data Scientist
Google has quietly introduced Colab Pro, a paid tier for data scientists who want to do more with Colab, priced at $9.99/month. The Colab project has been successful in attracting new data scientists who are still learning the craft and want easy environments where Python code runs hassle-free. The service has been rolled out in the United States for now and will gradually be rolled out worldwide.
The main advantages of the Pro tier are:
- Much faster GPUs
- Longer-running notebooks
- Much more memory on the VMs
Many data scientists will find the Pro tier very attractive, as they have wanted to use Colab for heavier workloads and to avoid spinning up EC2 instances every time they train a machine learning model. Here are some features data scientists would love to see in Colab Pro or a competing product:
- Ease of deployment for models built in the notebook.
- Ease of building inference engines on top of the notebook.
- Ability to connect to notebooks via SSH.
Google is quick to underline that the resources it offers, even on the Pro tier, are not unlimited. This is just the start.
Supercomputing At Fingertips
All the computing trends and developments point to an eventuality where heavy computation will be democratised and anyone with meagre resources will be able to access supercomputing. Corey J Nolet, a senior engineer at NVIDIA, tweeted,
“Raspberry pi: $35, touch screen: $75, keyboard: $10, Google Colab: $9.99, that powerful feeling of supercomputing at your fingertips for 1/20th the cost of a MacBook Pro: priceless.”
Max Woolf, a data scientist at BuzzFeed, tweeted about how the new Colab Pro tier makes financial sense: “A preemptible P100 + VM on Google Compute Engine is about ~$0.45/hr, so to exceed that value with Colaboratory Pro (ignoring convenience factors) you’d need to train for more than 22 hours in a month. Which, for deep learning, is not too unreasonable.”
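The arithmetic behind that estimate is simple enough to sketch. Using the rates quoted in the tweet (not current pricing), the break-even point is the monthly subscription fee divided by the hourly on-demand rate:

```python
# Back-of-the-envelope break-even check behind Max Woolf's estimate:
# a preemptible P100 VM on Google Compute Engine at ~$0.45/hr versus
# Colab Pro's flat $9.99/month. Rates are the ones quoted in the tweet.
GCE_P100_PER_HOUR = 0.45
COLAB_PRO_PER_MONTH = 9.99

break_even_hours = COLAB_PRO_PER_MONTH / GCE_P100_PER_HOUR
print(f"Break-even: {break_even_hours:.1f} GPU-hours per month")  # ≈ 22.2
```

Anyone training for more than roughly 22 GPU-hours a month comes out ahead on the flat fee, which is why the tier looks attractive for sustained deep learning work.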
The next wave of activity in this movement will bring wide and deep penetration of supercomputing, the building and spread of suitable hardware, and the discovery and creation of use cases for this newfound supercomputing. As always, there will be a lot of serendipity and a lot of intelligent design.
As a thorough data geek, most of Abhijeet's day is spent in building and writing about intelligent systems. He also has deep interests in philosophy, economics and literature.