When a disruption on the scale of ChatGPT comes along, it transforms every job it touches. Older tech job titles become redundant as quickly as new ones are created. In the current era of LLMs gone wild, NLP engineers are in the thick of things and will presumably be among the first to feel the repercussions of this shift. Until now, the job description of an NLP engineer was a lengthy list of requirements, including expertise in text representation, semantic extraction techniques, data structures and modelling.
LLMs are more accessible with plugins
But plugins have found a way to make these LLMs useful beyond the confines of their internet training data. Among a host of other tasks, such as scheduling and doing taxes, plugins can help with language translation and sentiment analysis. Plugins can turn a mere chatbot or an LLM into an ecosystem overnight. If ChatGPT is a smartphone, the plugins are the apps on the phone.
And OpenAI's application programming interfaces (APIs) offer a more user-friendly way in, not just for those who have learned command sequences.
But despite the simplicity of these interfaces, many LLM algorithms need to be customised depending on the industry they are being used in. For example, LLMs used in the healthcare industry have to process and interpret electronic health records (EHRs), suggest treatments and summarise patient healthcare reports based on the doctor’s notes and voice recordings.
Similarly, LLMs in the financial sector will have to summarise earnings calls, create meeting transcripts and even perform fraud analysis to protect consumers.
While it has become simpler to interact with an LLM (all the chatbot needs is a prompt), this simplicity has given birth to an entirely new skill called prompt engineering. Almost as soon as the job title came about, a posting by Anthropic for a 'Prompt Engineer and Librarian' with a salary range of USD 175,000 to USD 335,000 turned heads.
A new future for NLP engineers?
In a blog posted in December last year, Miguel Ballesteros, principal applied scientist with Amazon Web Services' AI Labs, discussed how NLP is tilting towards prompt engineering. "In the past, we used to have feature engineering, where you have a statistical model, and you are adding different types of representations or features, and then you need to tune the model along with your features. These days, with large language models, the new thing that is coming is the idea of prompts, in all of its variations, as a way of priming the models," he stated.
All this while, feature engineering has been the backbone of NLP, with very specific techniques for different types of problems. For example, some problems might require extracting grammatical features from the text, while others might only need the most frequently occurring words.
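The frequency-based case above can be sketched in a few lines of Python (a minimal illustration of one classic feature-engineering step, not any specific production pipeline):

```python
from collections import Counter
import re

def top_words(text, n=3):
    """Return the n most frequent lowercase word tokens in text."""
    # A simple regex tokeniser: runs of letters and apostrophes
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(n)

sample = "the cat sat on the mat and the cat slept"
print(top_words(sample))  # [('the', 3), ('cat', 2), ('sat', 1)]
```

Features like these word counts would then be fed to a statistical model; with LLMs, as Ballesteros notes, the prompt itself takes over much of this priming role.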
On the face of it, prompt engineering may seem like an easily learned skill, but experts have called this a misconception. "Suppose we are interacting with ChatGPT and ask it to 'Write about your day.' This prompt is vague and doesn't provide any specific direction or context. It's unclear what the writer should focus on or what kind of response is expected. This is an example of a bad prompt, especially in an enterprise context.
Now, if we are to ask, 'Describe a memorable moment from your day, including what happened, how it made you feel, and why it was significant to you.' This prompt is specific and provides clear direction, outlining what the writer should include in their response. It's also open-ended enough to allow for creativity and personalization in the writer's answer," Parag R., senior manager of data science at Accenture, wrote in a blog posted in March.
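The contrast between the vague and the specific prompt can be captured with a small template helper (a hypothetical function for illustration, not any particular library's API):

```python
def build_prompt(topic, focus=None, details=None):
    """Compose a prompt; adding a focus and details turns a
    vague request into a specific, well-scoped one."""
    prompt = f"Write about {topic}."
    if focus:
        prompt = f"Describe {focus} from {topic}"
        if details:
            prompt += ", including " + ", ".join(details)
        prompt += "."
    return prompt

# Vague prompt -- little direction for the model
print(build_prompt("your day"))
# Specific prompt -- clear scope and expected content
print(build_prompt(
    "your day",
    focus="a memorable moment",
    details=["what happened", "how it made you feel",
             "why it was significant to you"],
))
```

Templating like this is one reason prompt engineering is a skill: the structure that makes a prompt useful can be made repeatable rather than improvised each time.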
Prompt engineering DOES need skills
The heat that prompt engineering jobs are gathering doesn't immediately make the skills of NLP engineers redundant either. After all, even if LLMs are easy to talk to, they aren't telepathic. "If you tell a chatbot to create a complex piece of software, it will shrug its shoulders. But tell it to break down the tasks needed to do so into chunks and then start working on those chunks one by one, and you are more likely to start getting somewhere.
So it’s unlikely that all those years you’ve spent learning about coding and software engineering have gone to waste. You’ll still need that knowledge and experience to help you pick the right prompts and to ensure that ChatGPT’s output is on the right track,” author and tech strategist Bernard Marr noted in a Forbes article.
Startup advisor and tech strategist Ashish Mukherjee explains, "There are parallels to be drawn with the past. At one point the role of system administrators was thought to have become defunct, but as environments grew more complex, they evolved into DevOps engineers, and all their past experience did count. As a technological advancement happens, the market responds to it adequately. That's exactly what I predict will happen with NLP engineers. As things get commodified and complex use cases increase, they will arm themselves with new skills like prompt engineering. Human judgement is still important."
Prompts are a tricky thing. The right sequence of words can give us exactly what we need from the chatbot, while the wrong prompt can elicit gibberish. And who can understand the science behind language better than an NLP engineer?