One of the key highlights of the MLDS 2020 summit was the keynote by Viral B. Shah, co-creator of the Julia language and co-founder of Julia Computing, who spoke about Julia and how it could become the language of the future. With more than 1,000 delegates, the second edition of MLDS was India's first applied AI and machine learning conference focused on developers, data scientists and enthusiasts, and a forum to learn, network and discover the latest in applied AI and deep learning tools and frameworks.
In his well-attended keynote, Shah explained why he believes Julia will become the language of the future and how differentiable programming helps accomplish complex computational tasks.
Julia is a high-level, high-performance language whose syntax is similar to Python and MATLAB. Its proponents report speedups of up to 10x over popular languages such as R, Python and MATLAB. Today, organisations including Intel, Amazon, NASA, Microsoft and Google use the language in some capacity.
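To give a flavour of that Python- and MATLAB-like syntax, here is a minimal sketch (the function name is illustrative, not from the talk):

```julia
# Julia reads like maths: a one-line function definition
sumsq(v) = sum(x^2 for x in v)

# The same generic code works for integers and floats alike
println(sumsq([1, 2, 3]))    # prints 14
println(sumsq([1.5, 2.5]))   # prints 8.5
```

Despite looking like a scripting language, each call is compiled to specialised machine code for the argument types, which is where Julia's performance comes from.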
Julia was officially announced in 2012 and has grown rapidly since then. The language has been adopted quickly by the community, which now counts over a thousand contributors and more than 3,000 packages spanning differential equations, graph processing, data science, image processing and much more.
Viral said, “The ease of use of a dynamic language and the performance of deployed languages was the motivation behind creating Julia.” He then outlined the language’s key features: it is fast, easy to use, productive, expressive, scalable and mathematical, and it supports all hardware, including GPUs and TPUs, as well as all major clouds.
Talking about how artificial intelligence and machine learning have moved from hype to reality, Viral emphasised that training deep learning models can be expensive as well as time-consuming. Differentiable programming, he said, delivers high performance while remaining easy to use.
Differentiable Programming With Julia
To understand differentiable programming a little more deeply, let us look at a research paper published last year by Viral and his team, which describes how differentiable programming systems can provide a bridge between machine learning and scientific computing.
Zygote is a differentiable programming (∂P) system that can take gradients of Julia programs, making automatic differentiation a first-class language feature. Such a ∂P system has the potential to be the lingua franca that unites the worlds of scientific computing and machine learning.
The system supports almost all language constructs such as control flow, recursion, mutation, etc. and compiles high-performance code without requiring any user intervention or refactoring to stage computations. This enables an expressive programming model for deep learning as well as allows users to utilise the existing Julia ecosystem of scientific computing packages in deep learning models.
The system can be directly used on existing Julia packages, handling user-defined types, state-based control flow, and plentiful scalar operations through source-to-source automatic differentiation (AD).
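What a first-class gradient looks like in practice can be sketched as follows (this assumes the Zygote package is installed; the function is an illustrative example, not from the paper):

```julia
using Zygote  # source-to-source AD system described in the paper

# An ordinary Julia function — no special tensor types required
f(x) = 3x^2 + 2x + 1

# Zygote differentiates it directly: f'(x) = 6x + 2, so f'(2) = 14
g = gradient(f, 2.0)[1]
println(g)  # prints 14.0
```

Because Zygote differentiates the program source itself, the same mechanism extends to control flow, recursion and user-defined types, as described above.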
Read the paper here.
Coming back to the talk, Viral discussed libraries and packages in various domains that are available only in Julia. He also noted that one can call Python, R, C or Java functions from Julia without changing a single line of the existing code.
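As one hedged illustration of this interoperability, the PyCall package lets Julia import and call Python modules directly (this assumes both PyCall and a working Python installation are available):

```julia
using PyCall  # bridge from Julia to an existing Python installation

# Import Python's math module and call it like any Julia function
pymath = pyimport("math")
println(pymath.sqrt(16.0))  # prints 4.0
```

The Python function is called in place; nothing on the Python side needs to be rewritten or wrapped by hand.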
Viral further talked about a few important tools in the Julia ecosystem, such as Flux. He concluded by discussing how Zygote provides an extremely low-overhead AD interface and how it has been shown to perform on par with TensorFlow for ResNet on a TPU pod.
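To show how little code a Flux model takes, here is a minimal sketch (the layer sizes are illustrative; it assumes Flux ≥ 0.13, which uses the `in => out` layer constructor):

```julia
using Flux  # Julia's native deep learning library, built on Zygote

# A tiny two-layer network; Zygote supplies the gradients under the hood
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))

x = rand(Float32, 4)        # a random 4-element input vector
println(size(model(x)))     # prints (2,)
```

Because Flux models are plain Julia code, the same `gradient` machinery shown earlier trains them, with no separate graph-building step.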