
Can Artificial Intelligence Learn From A Child’s Language Acquisition Process?

Can language be taught to artificial intelligence? Working from the premise that “machines are capable of doing any work man can do”, researchers are pursuing long-standing projects to train machines so they can talk just like us. But unlike identifying images, words are grounded in sense and representation.

Jonathan Mugan, co-founder and CEO at DeepGrammar, an AI company focused on NLP, believes that to train machines to talk with us, we need to immerse them in an environment similar to ours. In this direction, considerable research is underway on giving computers a better grasp of the real world by letting them spend more time in virtual ones. Case in point: OpenAI trained machines by playing Grand Theft Auto V to help them understand the real world better.

Even though artificial intelligence has made considerable leaps in recent years, especially with deep learning, current chatbots are still an embarrassment, added Mugan. Despite the complexity of language, a child is able to learn a natural language competently in a short span of time. Can machines learn language the way children do?

Can AI Learn from a child’s language acquisition process?

According to Steven Pinker, experimental cognitive psychologist and a leading authority on language and the mind, the “child’s language acquisition process has solved a remarkably difficult computational problem”. Pinker shared in a recent interview with Edge that AI’s problem can be stated as an engineering task: designing an algorithm that ingests a bunch of sentences and their contexts from any of the five thousand languages on the planet and, after crunching through a number of these sentences, comes up with the grammar for that language, regardless of which language it is. Amazon’s Alexa and Google Home, outfitted with voice-enabled interfaces, are excellent examples of present-day automated speech recognition and NLG.

The key question is whether a computational simulation of natural language acquisition (just the way a child acquires natural language) could be used to develop computer systems that recognize, understand and generate natural languages, thereby solving the communication problem between machines and humans, proposed the research paper titled Children as Models for Computers: Natural Language Acquisition for Machine Learning.

Let’s look at some of the research underway in this field at universities across the globe:

Grammatical Inference framework: According to researchers Leonor Becerra-Bonache and Maria Dolores Jiménez-López, Grammatical Inference (GI) is a subfield of machine learning that deals with learning formal languages from a set of data, and it can provide a good theoretical framework for understanding how children process and acquire languages. GI does this by offering alternative learning algorithms, such as the query learning model and the PAC learning model. GI presents a scenario close to real life: there is a teacher and a learner, wherein the teacher provides information about a language to the learner and the learner infers the grammar for that language.
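To give a flavour of the teacher-learner setting GI describes, here is a minimal Python sketch of learning by membership queries over a toy hypothesis space. The toy language, the candidate hypotheses and the function names are illustrative assumptions, not taken from Becerra-Bonache and Jiménez-López’s work.

# A minimal sketch of the Grammatical Inference teacher-learner setting.
# The toy target language and hypothesis space below are invented for
# illustration; real query learning (e.g., Angluin-style) is far richer.

from itertools import product

def teacher(string):
    # The teacher knows the target language: strings over {a, b}
    # containing an even number of 'a's. It answers membership queries.
    return string.count("a") % 2 == 0

# Candidate "grammars" the learner can choose between.
hypotheses = {
    "even number of a's": lambda s: s.count("a") % 2 == 0,
    "odd number of a's":  lambda s: s.count("a") % 2 == 1,
    "ends with b":        lambda s: s.endswith("b"),
    "contains 'ab'":      lambda s: "ab" in s,
}

def learn(max_len=3):
    # The learner queries the teacher about short strings and discards
    # every hypothesis that disagrees with the teacher's answer.
    alive = dict(hypotheses)
    for n in range(max_len + 1):
        for chars in product("ab", repeat=n):
            s = "".join(chars)
            answer = teacher(s)  # membership query
            alive = {name: h for name, h in alive.items() if h(s) == answer}
    return list(alive)

print(learn())  # -> ["even number of a's"]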

MIT Speechome Corpus: MIT researcher Deb Roy went a step further and put together the Human Speechome Project with the goal of making a comprehensive and unbiased record of one child’s development at home. The data provided a basis for further study, and the researchers developed a robot that could learn from “show-and-tell” interactions with a human teacher. Given a number of visual presentations of objects paired with spoken descriptions of the objects, the robot learned to

(a) segment continuous speech in order to discover spoken word units

(b) form visual categories of object shapes and colors

(c) learn semantically appropriate associations between speech labels and visual categories (a toy sketch of this association step follows below).
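As a rough illustration of step (c), the sketch below associates word units with visual attributes through simple co-occurrence counting. The episodes, attribute labels and the counting scheme are invented for illustration and are not the Speechome robot’s actual model.

# A toy sketch of cross-modal association: each "show-and-tell" episode
# pairs segmented word units with the visual attributes of the object
# shown. The data and attribute labels below are illustrative only.

from collections import defaultdict

episodes = [
    (["red", "ball"],  {"color:red",  "shape:sphere"}),
    (["blue", "ball"], {"color:blue", "shape:sphere"}),
    (["red", "cup"],   {"color:red",  "shape:cup"}),
    (["blue", "cup"],  {"color:blue", "shape:cup"}),
]

# Count how often each word co-occurs with each visual attribute.
counts = defaultdict(lambda: defaultdict(int))
for words, attributes in episodes:
    for word in words:
        for attribute in attributes:
            counts[word][attribute] += 1

# Associate each word with its most frequently co-occurring attribute.
for word, attrs in counts.items():
    best = max(attrs, key=attrs.get)
    print(f"{word!r} -> {best} (co-occurred {attrs[best]} times)")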

The research stated that studying a child’s language acquisition process in this way could advance the understanding of language acquisition through cross-disciplinary methods that bring together the human sciences with computational sciences and design.

AI/LT Lab at Flinders University: At Flinders, one of Australia’s leading universities, the AI/LT lab is playing a major role in work on grounded syntax and semantics, in which the computer, system or robot really understands what is being talked about. Children learn grammar and meaning in an unsupervised way by picking up patterns in context, and the lab’s AI systems emulate this. For example, some projects combine additional inputs such as camera and microphone feeds so that lip-reading can be used to improve speech recognition under noisy conditions. This also allows researchers to track expressions and emotional content and to synthesize facial expressions. Biometric signals are also picked up as inputs to validate theories and models of language, learning and emotion.
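As an illustration of how a visual stream can help in noise, here is a minimal sketch of weighted late fusion of per-word scores from an acoustic model and a lip-reading model. The words, scores and weighting scheme are assumptions for illustration, not the Flinders lab’s actual systems.

# A minimal sketch of audio-visual fusion for noisy speech recognition:
# scores from an acoustic model are combined with scores from a
# lip-reading model. All numbers below are illustrative assumptions.

# In noise, the acoustic model confuses /b/, /p/ and /m/.
audio_scores = {"bat": 0.35, "pat": 0.40, "mat": 0.25}

# The lip-reading model separates /p/ from /b/ and /m/ well,
# but cannot tell /b/ and /m/ apart.
visual_scores = {"bat": 0.45, "pat": 0.10, "mat": 0.45}

def fuse(audio, visual, audio_weight=0.3):
    # Weighted late fusion; in heavy noise we trust the visual stream more.
    return {
        word: audio_weight * audio[word] + (1 - audio_weight) * visual[word]
        for word in audio
    }

fused = fuse(audio_scores, visual_scores)
print(max(fused, key=fused.get), fused)  # audio alone picks "pat"; fusion picks "bat"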

Why are tech giants battling over voice?  


With a shift in human-computer interactions, big tech giants like Google, Facebook and Amazon are plowing billions of dollars into advancing natural language understanding in their voice-powered devices. Companies are aggressively expanding their teams working on conversational interfaces. The rise of Alexa-powered speakers and Google Home has taken voice assistants from smartphones to millions of households, where the devices are delivering high levels of accuracy in conversational AI.

Industry research notes that tech behemoths Microsoft, IBM, Facebook, Google and Amazon have all inked NLP deals in the past and have exhibited a heightened interest in independent NLP platform providers. Google was the first to make such an acquisition with API.AI, which provides developers with access to NLP capabilities. Earlier this January, SAP acquired French bot-building platform Recast.ai, while in January last year, Redmond giant Microsoft bought Maluuba, a Montreal-based company that uses deep learning to develop natural-language understanding.



Richa Bhatia

Richa Bhatia is a seasoned journalist with six years’ experience in reportage and news coverage and has had stints at Times of India and The Indian Express. She is an avid reader, mum to a feisty two-year-old and loves writing about the next-gen technology that is shaping our world.
