
Build 2020 Showed That ML Developers Are The Focus For Microsoft

Vishal Chawla

In the past few years, we have seen an explosion of large-scale machine learning models and rapid advances in artificial intelligence, and developers have been behind much of that innovation. With that audience in mind, Microsoft announced a wide range of ML-focused tools and services at Build 2020.

Microsoft is working to democratise these solutions by making them available for everyone to use and build upon on its ML platform. The company has contributed to this progress by advancing the state of the art in areas such as Azure Cognitive Services, speech recognition, computer vision and natural language understanding. Here are some of the announcements from Build 2020 that will matter most to AI/ML developers.

New DeepSpeed Features

Microsoft updated DeepSpeed, its open-source deep learning optimisation library for PyTorch, to let anyone train AI models that are ten times bigger and five times faster on the same infrastructure. According to the announcement, the library's optimiser reduces memory consumption during training, promising DeepSpeed users order-of-magnitude improvements in scale and speed for deep learning.



Microsoft has also brought the optimisations from DeepSpeed together with those in its ONNX Runtime, making it easy for developers to combine these training optimisations with the hardware of their choice and train large-scale models efficiently.
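To give a sense of what this looks like in practice, below is a minimal sketch of training a PyTorch model through DeepSpeed's engine. The toy model, batch size and config values are illustrative assumptions rather than the Build demo; the config keys (fp16, ZeRO stage 1, optimiser settings) follow DeepSpeed's published JSON schema, and the script would normally be launched with the deepspeed launcher on GPU machines.

```python
# Minimal sketch: wrapping a PyTorch model with DeepSpeed (illustrative values).
import torch
import torch.nn as nn
import deepspeed

# Hypothetical toy model standing in for a much larger transformer.
model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 1024))

ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},           # mixed precision cuts memory use
    "zero_optimization": {"stage": 1},   # ZeRO partitions optimiser state across GPUs
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# deepspeed.initialize returns an engine that handles distributed training,
# mixed precision and ZeRO memory partitioning behind a familiar training loop.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

for step in range(10):
    x = torch.randn(32, 1024).to(model_engine.device).half()
    loss = model_engine(x).pow(2).mean()  # dummy loss for illustration
    model_engine.backward(loss)           # engine-managed backward (loss scaling etc.)
    model_engine.step()                   # optimiser step and gradient reset
```

The point of the sketch is that the training loop barely changes: memory savings and scaling come from the engine and the config rather than from rewriting the model.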

Open-Sourcing The Turing Language Model

A year ago, the most prominent AI models in the world had around 1 billion parameters. Microsoft's Turing NLG model now has 17 billion parameters. The model has learned facts, concepts and grammar from colossal amounts of data, including all of Wikipedia, tens of millions of web pages and dozens of books. When you ask the model a question, it does not query a database or an index; instead, it understands the semantic meaning of the question, relates it to the information it has learned and synthesises a grammatically correct answer. At Build 2020, Microsoft announced it would open-source the Turing language model, which developers can leverage for all kinds of innovation.
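To illustrate the generative question-answering style described above, here is a minimal sketch using the Hugging Face transformers API. GPT-2 is used purely as a stand-in checkpoint, since the Turing NLG weights had only been announced for open-sourcing at the time; the prompt format is an assumption for illustration, not Microsoft's interface.

```python
# Minimal sketch: open-ended question answering with a generative language model.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Question: Who founded Microsoft?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# The model does not look anything up in an index; it generates an answer
# token by token from what it learned during pre-training.
output_ids = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

A larger model such as Turing NLG follows the same pattern, only with far more parameters and training data behind the generation step.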



Impact Of AI Supercomputer On Developers

Late last year, Microsoft completed work on its first AI supercomputer and handed it over to scientists and engineers to use in their work. This cloud-hosted machine now ranks among the top supercomputers and will accelerate projects large and small. Ubiquitous access to high-quality language translation, natural language understanding, summarisation and reasoning, for example, will supercharge what developers can build. That is how supercomputing translates into real-world impact for developers and organisations: by powering large-scale AI systems.

With the supercomputer, developers benefit from that computing power in the form of large AI models hosted on supercomputing infrastructure. As models are scaled up on the supercomputer, developers keep discovering new tasks the models are capable of. They can train massive language models or apply them to many different natural language tasks, and the models can go much further still. Sam Altman, CEO of OpenAI, demonstrated this power by showing how an OpenAI language model, trained on the Microsoft supercomputer over thousands of open-source GitHub repositories, can help write code. According to Microsoft, this will make developers more productive, letting them spend less time on repetitive, time-consuming coding tasks and more on the creative aspects of writing software.
