

How New Zealand Massacre Footage Was The Real Test For Google's AI


The world was shaken by the attack that took place on March 15 in New Zealand, a country rarely associated with terrorism. Brenton Tarrant, a 28-year-old Australian, went on a killing spree at two mosques in Christchurch, broadcasting his extremist views by streaming the massacre live on Facebook as he took innocent lives. Even as the New Zealand government and its people tried to come to terms with an event that shattered the country's peace, the rest of the world was watching the footage spread across social media platforms.



Google has long been strict about the spread of violent and hateful videos on its video-sharing platform, YouTube. The company has reportedly hired thousands of moderators and some of its best AI engineers to tackle the problem.

However, last Friday, as the world stood stunned by the bloody massacre, footage of the attack spread rapidly across YouTube, challenging every detection system the company had built to stop the sharing of violent content.

The footage first appeared as a live stream on Facebook, but it did not end there: by the time Facebook managed to remove it from its servers, people had already downloaded copies and begun sharing them across every social media platform.

As reported by a popular news outlet, YouTube took unprecedented steps, including temporarily disabling several search functions and cutting off human review to speed up the removal of videos flagged by its automated systems. Many of the new clips were altered in ways that outsmarted the company's detection systems.

Neal Mohan, YouTube's chief product officer, reportedly said that the uploads came more rapidly and in far greater volume than during previous mass shootings. “Every time a tragedy like this happens we learn something new, and in this case, it was the unprecedented volume” of videos, Mohan said. “Frankly, I would have liked to get a handle on this earlier.”

Hany Farid, a computer science professor at the University of California at Berkeley's School of Information, said, “Once content has been determined to be illegal, extremist or a violation of their terms of service, there is absolutely no reason why, within a relatively short period of time, this content can't be eliminated automatically at the point of upload,” as reported by Gerrit De Vynck and Jeremy Kahn of Bloomberg.

A feature called Content ID has been around for a long time; it gives copyright owners such as film studios the ability to claim content as their own, get paid for it, and have unauthorized copies deleted. Similar technology has been used to blacklist other illegal or inappropriate content, including child pornography and terrorist propaganda videos. The real challenge, the report says, is that people keep coming up with ways to evade these systems.
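To make the idea concrete, here is a minimal, hypothetical sketch of hash-based matching in the spirit of such systems. It is not YouTube's actual Content ID implementation: it computes a simple perceptual "difference hash" (dHash) for a single video frame using Pillow and compares it against an assumed blacklist of hashes taken from known violating footage. The blacklist value and file paths are placeholders.

```python
# Hypothetical sketch of perceptual-hash matching (not YouTube's Content ID).
# Requires Pillow: pip install Pillow
from PIL import Image

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit difference hash (dHash) of one video frame."""
    # Grayscale, then resize to (hash_size + 1) x hash_size pixels.
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits on which two hashes differ."""
    return bin(a ^ b).count("1")

# Assumed blacklist of hashes computed from frames of known violating footage.
BLACKLIST = {0x8F3C6A1E9B2D4F70}

def is_flagged(frame_path: str, threshold: int = 10) -> bool:
    """Flag a frame if its hash is within `threshold` bits of any known hash."""
    h = dhash(frame_path)
    return any(hamming(h, bad) <= threshold for bad in BLACKLIST)
```

Production systems fingerprint audio and many frames per video and run the comparison at upload time, but the principle is the same: a near match to known bad content triggers removal.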


Rasty Turek, chief executive officer of Pex, reportedly said, “There are so many ways to trick computers.” He added that making minor changes to a video, such as putting a frame around it or flipping it on its side, can throw off software that has been trained to identify troubling images.
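A toy demonstration of that point, reusing the hypothetical dhash() and hamming() helpers from the sketch above: simply mirroring a frame is enough to push its hash far away from the original, so a blacklist keyed on the original hash no longer matches. The file names here are placeholders.

```python
# Toy demo: a mirrored copy of the same frame gets a very different dHash.
# Assumes the dhash() and hamming() helpers from the earlier sketch.
from PIL import Image, ImageOps

original = Image.open("frame.jpg")            # placeholder path to one frame
ImageOps.mirror(original).save("frame_flipped.jpg")

distance = hamming(dhash("frame.jpg"), dhash("frame_flipped.jpg"))
print(f"Hamming distance after mirroring: {distance} of 64 bits")
# A distance well above the matching threshold means the altered copy
# would slip past a blacklist built from the original frame's hash.
```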

Another challenge for social media platforms is their own live-streaming feature. Live streams are hard to police because the AI systems can only analyze a video once it has been uploaded completely, which lets users stream violating content for some time before it is detected.

Adding to the complication is the role that popular news organizations play in sharing edited or censored clips and images from the footage. YouTube cannot simply take down content that is part of a news report, as doing so would infringe on press freedom.

Last Word

The sharing of disturbing content is just one of the many major problems social media faces. Is social media alone to blame, or should we also consider the mindset of the people who use it for such purposes? Whatever motive drives them, good or bad, there will always be a way to share violence, and social media may well be just a victim of exploitation. Such incidents are a constant reminder of the real darkness in humans.


