Facebook and Kaggle are facing an online backlash after disqualifying the apparent winners of Facebook's Deepfake Detection Challenge.
By way of context, Facebook launched the competition last year to encourage the development of new technologies for detecting deepfakes and manipulated media. The competition was hosted on the data science and machine learning community site Kaggle, in partnership with Facebook, and received more than 2,000 submissions.
After posting the names of the winning team and announcing the $1 million prize, Facebook stated, “You may notice that the top rankings have changed. Unfortunately, the top two teams in the preliminary standings used external data sources in their winning submissions that were not allowed under the rules of this competition.”
Explaining the rules, Facebook further stated that the competition allowed participating teams to use external data to develop and test models and submissions, subject to certain conditions: it should have been ensured that the “external data is available to use by all participants of the competition for purposes of the competition at no cost to the other participants; also it was necessary to post such access to the external data for the participants to the official competition forum prior to the entry deadline.”
According to news reports, the previously winning team, ‘All Faces Are Real,’ manually created a face-image dataset from YouTube videos released under a CC-BY license, which explicitly permits commercial use, along with the Flickr-Faces-HQ dataset.
The team disagreed with the disqualification and stated: “Facebook felt some of our external data ‘clearly appears to infringe third party rights’ despite being labelled as CC-BY (it’s not clear what data they were referring to explicitly).”
The team further stated that they did not knowingly break any rule, and questioned “why Kaggle never took the opportunity to clarify that external data must additionally follow the more restrictive rules for winning submission documentation.”
Expressing disappointment, the team added: “Specifically, we were asked to provide additional permissions or licenses from individuals appearing in [our] external dataset. Unfortunately, since the data was from public datasets, we didn’t have specific written permission from each individual appearing in them, nor did we have any way of identifying these individuals.”
In response, Facebook clarified its stance: the competition clearly asked participants to submit their code to be tested against a black-box dataset of challenging, unshared real-world examples. Facebook determined the winners by evaluating the submitted models against this black-box dataset, using the log-loss score on a private test set held outside the Kaggle platform, which “contains videos with a similar format and nature as the Training and Public Validation/Test Sets, but are real, organic videos with and without deepfakes.”
To put it plainly, Facebook stated that the challenge is to generalise from known examples to unfamiliar instances. The separate black-box dataset consists of 10,000 videos that were not available to participants; teams “had to design models that could be effective even under unforeseen circumstances.”
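The log-loss metric Facebook cites penalises confident wrong predictions far more heavily than uncertain ones, which is why a model that merely memorised public data tends to score badly on unseen videos. A minimal sketch of how such a score could be computed for binary real/fake labels (the function name and clipping value are illustrative, not the competition's actual evaluation code):

```python
import math

def log_loss(y_true, y_pred, eps=1e-15):
    """Binary cross-entropy (log loss), averaged over predictions.

    y_true: 1 if the video is a deepfake, 0 if it is real.
    y_pred: predicted probability that the video is a deepfake.
    Predictions are clipped away from 0 and 1 so the log stays finite.
    """
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

# Confident and correct predictions yield a low loss...
print(log_loss([1, 0], [0.9, 0.1]))
# ...while confident but wrong predictions are punished heavily.
print(log_loss([1, 0], [0.1, 0.9]))
```

Because the penalty grows without bound as a wrong prediction approaches certainty, the metric rewards models that stay appropriately uncertain on videos unlike anything in their training data.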
Many participants had flagged this ambiguity in the rules regarding external data. In Kaggle’s discussion forum, winner Selim Seferbekov asked, “As external data might be very helpful to obtain better scores on public/private leaderboards how are you going to validate solutions’ compliance with the rules?”
Facebook’s decision has not convinced many machine learning practitioners.
Further, the competition rules posted on Kaggle state that “in any such dispute, under no circumstances will any Competition participant be permitted or entitled to obtain awards for, and hereby waives all rights to claim punitive, incidental or consequential damages, or any other damages, including attorneys’ fees, other than the individual participant’s actual out-of-pocket expenses (if any), not to exceed ten dollars ($10 USD), and each participant further waives all rights to have damages multiplied or increased.”
However, the winning team concluded: “Successful Kaggle competitions rely on trust between competitors and Kaggle that the rules will be fairly explained and applied, and this trust has been damaged.”