
Wolfram’s New Update Gives Developers Genius-level Generative AI

After being one of the first plugins to ever come to ChatGPT, Wolfram has now gone all in on the LLM wave


After being one of the first plugins to ever come to ChatGPT, Wolfram has now gone all in on the LLM wave. In the latest version 13.3 update, the Wolfram language has added support for LLM technology and integrated an AI model into the Wolfram Cloud. 

This update comes on the heels of Wolfram steadily building the tooling to make the language LLM-ready. It puts LLMs directly into the language through a new LLM subsystem, and builds on the LLM functions technology added in May, which ‘packages’ LLM capabilities into callable functions; with the new subsystem, those capabilities are now user-addressable. 
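As a rough sketch of what such an LLM function looks like in practice (assuming Wolfram Language 13.3 or later, with an LLM service such as an OpenAI API key configured for the session), a prompt template becomes an ordinary callable function:

```wolfram
(* Define a function whose body is an LLM prompt template;  *)
(* `` is the template slot that receives the argument.      *)
summarize = LLMFunction["Summarize the following text in one sentence: ``"];

(* Call it like any other Wolfram Language function. *)
summarize["Wolfram Language 13.3 introduces an LLM subsystem alongside its symbolic core."]
```

The result is ordinary text that can flow straight into downstream symbolic computation, which is what makes the ‘packaged’ LLM usable as a component in a larger program.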

With these new updates, developers have a whole new way of interfacing with their data. This approach combines Stephen Wolfram’s idea of natural language programming with the Wolfram language’s symbolic programming, creating a force to be reckoned with. What’s more, through the Wolfram language API, this can be plugged into larger systems, delivering considerable power through a natural language interface. 

LLM-powered computation 

The Wolfram language’s strengths lie in its symbolic programming capabilities. Using code, the language can perform many complex mathematical operations, such as algebra, matrix manipulations, and differential equations. To bolster the language’s logical problem-solving capabilities, Stephen Wolfram, the creator of the language, decided to add LLM capabilities to it. 

Wolfram’s tryst with LLMs began with the creation of the Wolfram ChatGPT plugin, which empowered the chatbot with the symbolic programming capabilities of the language. Stephen then embarked upon a journey to take symbolic language to new heights, saying to AIM, “We’re about to use the symbolic language [Wolfram] to provide a way of using LLM as a component in a larger software stack. It’s something that you can do in a very beautiful way.”

The 13.3 update seems to be a step in this direction, bringing LLMs directly into the Wolfram language through an LLM subsystem. After the chatbot plugin and LLM function calling, Stephen introduced Chat Notebooks. These allow users to easily interact with LLMs in the Wolfram Notebook through a text box, letting them generate powerful code in the Wolfram language. Stephen called this interface an example of “using an LLM as a linguistic interface with common sense”, as it allows users to interact with the language without needing to know the syntax. 

Stephen thinks this is a natural extension of the program’s existing capabilities, stating, “When you’re doing something you’re familiar with, it’ll almost always be faster and better to think directly in Wolfram Language, and just enter the computational language code you want. But if you’re exploring something new, or just getting started on something, the LLM is likely to be a really valuable way to “get you to first code.”

The in-built LLM has self-correcting capabilities as well, allowing it to fix errors in its own code before running it and outputting snippets. This means that it can also debug existing code in the Wolfram language, and can even look at details like stack traces and error documentation to fix broken code. 

It also comes with a few different personas, each geared towards a different purpose. The code assistant persona writes code and explains it, the code writer persona only generates code, while others like Wolfie and Birdnado respond to the user “with an attitude”. 

To extend the functionality of the LLM even further, Wolfram also launched the prompt repository, which can be used to get additional function prompts and modifier prompts. While he stated that the Wolfram language will continue to become “increasingly integrated as a tool into LLMs”, the prompt repository currently showcases the capabilities of the language’s new AI tools. 

A developer’s playground

The prompt repository is a community-contributed prompts platform that will allow the LLM in Wolfram language to adopt many different personas, each with discrete use-cases. In addition to this, the community has also contributed various functions that can extend the functionality of the language beyond its already-comprehensive list of inbuilt functions. 

The prompts in the repository are split into three main categories, namely personas, functions, and modifiers. Personas define the style of interaction with the user, functions generate output from existing text, and modifiers apply an effect to the output coming from the LLMs. Each of these functions can also be called in code, allowing developers to integrate them easily into existing code or even extend the functionality of a program. 
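A hedged sketch of calling two of these categories in code (the prompt names below are illustrative examples; exact names and availability in the repository may vary):

```wolfram
(* Assumes an LLM service is configured for the session. *)

(* A function prompt, applied directly to input text: *)
LLMResourceFunction["ActiveVoiceRephrase"]["The ball was thrown by the dog."]

(* A modifier prompt, combined with a request to shape the output: *)
LLMSynthesize[{LLMPrompt["Emojify"], "Describe a rainy day."}]
```

Persona prompts work similarly, but are typically attached to a whole chat session (for instance in a Chat Notebook) rather than a single call.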

The repository serves a very important purpose in speeding up workflows by allowing developers to avoid LLM wrangling. In Wolfram’s words, “Sometimes just using the [prompt’s] description will indeed work fine. But often it won’t. Sometimes that’s because one needs to clarify further what one wants. And sometimes there’s just a certain amount of “LLM wrangling” to be done. And this all adds up to the need to do at least some “prompt engineering” on almost any prompt.”

The repository effectively removes the need for this prompt engineering by creating a database of accessible and callable prompts, which are then converted into Wolfram Language code by the LLM. These prompts use the language to carry out a function on the given text, extending its functionality beyond mathematical problems.

Some of the sample prompts that stood out to us were LongerRephrase, which expands a given statement; ScientificJargonize, which makes a plain-text sentence sound like it came out of a research paper; and TweetConvert, which converts data into a tweet. There are also a host of other prompts which can convert sentences into product pitches, dejargonize complex pieces of text, check grammar stringently, and even generate puns. 
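Under the same assumptions as any LLM call in the language (a configured LLM service), the sample prompts named above can be used as one-liners; this is a sketch, not verified output:

```wolfram
(* Expand a terse statement into a longer rephrasing. *)
LLMResourceFunction["LongerRephrase"]["The update is useful."]

(* Turn a piece of information into a tweet-style message. *)
LLMResourceFunction["TweetConvert"]["Wolfram Language 13.3 adds an LLM subsystem and a prompt repository."]
```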

Using the ever-growing repository of prompts, devs and citizen developers alike can use the Wolfram language to easily modify large pieces of text. What’s more, since each prompt can be called as a function, it can be added to any program to give it LLM superpowers. Once the added LLM functionality goes live in the coming days, the Wolfram language will become an indispensable tool in the belt of AI enthusiasts and developers.


Anirudh VK

I am an AI enthusiast and love keeping up with the latest events in the space. I love video games and pizza.