
When Should Newsrooms Use AI?


MBN, a South Korean cable television channel, has debuted an AI-powered news anchor developed in collaboration with Moneybrain, according to recent news reports.

The AI news anchor, called AI Kim, was developed to resemble Kim Ju-ha, a news anchor at MBN. It has the same looks, nuances and voice, and even copies the small gestures she makes while reporting the news.

“I was created through deep learning 10 hours of video of Kim Ju-ha, learning the details of her voice, the way she talks, facial expressions, the way her lips move, and the way she moves her body,” said AI Kim, adding, “I am able to report news exactly the way that anchor Kim Ju-ha would.”

AI Kim, however, is not the first AI-powered news anchor on television. In 2018, the Chinese news agency Xinhua unveiled a news anchor built using natural language generation (NLG) and generative adversarial networks (GANs).

Automated Journalism Has Helped Newsrooms Improve Processes

Over the years, several AI applications have found a home in major newsrooms across the world.

The New York Times, for instance, introduced an AI project called Editor in 2015. The tool tags words, phrases and sentences as the journalist writes and returns relevant information based on those tags, saving time on research and enabling faster, more accurate fact-checking. The Times also uses AI to moderate comments, encouraging constructive discussion and filtering out trolls and abusive language.
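The Times has not published Editor's internal code, but the core idea of semantic tagging can be sketched with an off-the-shelf NLP library. The snippet below uses spaCy as a stand-in (an assumption, not the Times' actual stack) to tag people, organisations and places in a draft paragraph, the kind of metadata such a tool could attach to copy as the journalist writes.

import spacy

# Small English pipeline; assumes `python -m spacy download en_core_web_sm` has been run.
nlp = spacy.load("en_core_web_sm")

draft = (
    "MBN, a South Korean cable channel, debuted an AI news anchor "
    "developed with Moneybrain, modelled on anchor Kim Ju-ha."
)

doc = nlp(draft)

# Collect the entity tags a newsroom tool could store alongside the draft.
tags = [(ent.text, ent.label_) for ent in doc.ents]
print(tags)  # e.g. [('MBN', 'ORG'), ('Moneybrain', 'ORG'), ('Kim Ju-ha', 'PERSON')]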

Another semantic tagging application, The Juicer, has been developed by BBC News Labs. The Juicer ingests news content, automatically tags it, and exposes a fully-featured API for accessing the tagged content and data.
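The Juicer's real endpoints and parameters are not described here, so the sketch below only illustrates how a client might query a tagged-content API of this kind: the base URL, path and parameter names are placeholders, not the BBC's actual interface.

import requests

# Hypothetical tagged-content API; URL, path and parameters are invented for illustration.
BASE_URL = "https://example-newslabs-api.test"
API_KEY = "YOUR_API_KEY"

params = {
    "q": "artificial intelligence",  # free-text query
    "tag": "organisation:MBN",       # filter by a semantic tag
    "size": 10,                      # number of articles to return
    "apikey": API_KEY,
}

resp = requests.get(f"{BASE_URL}/articles", params=params, timeout=10)
resp.raise_for_status()

for article in resp.json().get("articles", []):
    print(article.get("title"), "-", article.get("tags"))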

The Washington Post and Yahoo News have both used automated text generation.

RADAR AI, an initiative by the leading UK news agency Press Association, uses the NLG tool Arria to generate tailored news for thousands of local news organisations. Working from datasets that contain entries for multiple local bodies, it produces a separate localised story for each area rather than a single national story.
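Arria's templates are proprietary, but the underlying pattern of data-driven localisation can be sketched in a few lines: a single story template is filled from each row of a dataset, yielding one localised article per local body. The council names and figures below are invented for illustration only.

# Minimal, illustrative sketch of template-based local-news generation.
# The dataset and template are made up; a production NLG system is far richer.
dataset = [
    {"council": "Northfield", "recycling_rate": 48.2, "change": 3.1},
    {"council": "Westbrook", "recycling_rate": 39.7, "change": -1.4},
]

TEMPLATE = (
    "{council} council recycled {recycling_rate:.1f}% of household waste last year, "
    "{direction} of {delta:.1f} percentage points on the previous year."
)

def localise(row):
    direction = "an increase" if row["change"] >= 0 else "a decrease"
    return TEMPLATE.format(
        council=row["council"],
        recycling_rate=row["recycling_rate"],
        direction=direction,
        delta=abs(row["change"]),
    )

for row in dataset:
    print(localise(row))  # one tailored story per local body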

AI-based tools have thus helped many newsrooms automate processes, saving time on research, building visualisations and generating news, and, in the case of RADAR AI, helping local news survive, especially during the pandemic.

The Key Is Identifying Which Tasks Can Be Automated And Which Should Remain Inherently Human

A paper published in the journal Journalism Practice examined the implications of automation in newsrooms through field theory and in-depth interviews.

While the debate over automation and the number of jobs AI will take away remains relevant, another major concern is quality.

Automation has a tendency towards easy-to-consume products. This is already evident in journalism’s shift towards data-centric and short, easy-to-digest content that caters to audience preferences. 

As news outlets try to remain financially viable while producing well-researched, evidence-based news, technologists can help them do better, work more efficiently and innovate. At the same time, however, a deeper infiltration of technologists into the field could cause journalism to lose some of its autonomy.

Immersive Automation, a research consortium, has produced a thorough report on news creation and automation.

On the practical side of implementing and deploying automation in newsrooms, the report argues that organisations faced with potentially radical changes such as news automation should adopt strategies that “incorporate a holistic view of these news tools of how they reflect (or deflect) the identity of the organisations.”

The report also highlights the ethical and transparency considerations organisations need to address around data, selection of facts, self-regulation, legal liability, bylines, personalisation and correction policy.

“The future of automation lies in the deconstruction of fundamental principles of journalism,” the report concludes. “That means, breaking down journalistic work into actual information artefacts and micro-processes so as to analyse what can be automated and what are inherently human tasks.”

In Conclusion

The entry of AI-based tools and products into newsrooms looks inevitable. While this brings several advantages across news production processes, it is important that news organisations do not lose the values they stand for.

A clear distinction between what should be automated and what should remain in human hands is a must.

Kashyap Raibagi

Kashyap currently works as a Tech Journalist at Analytics India Magazine (AIM). Reach out at kashyap.raibagi@analyticsindiamag.com