Yann André LeCun, a French computer scientist who works on machine learning, computer vision, mobile robotics, and computational neuroscience, recently tweeted that one of his papers was rejected from NeurIPS 2021.
Proudly rejected from #NeurIPS2021 ! https://t.co/wVJDdUYvPK
— Yann LeCun (@ylecun) September 29, 2021
Yann LeCun is a Silver Professor at New York University’s Courant Institute of Mathematical Sciences and Vice President and Chief AI Scientist at Facebook. He is well known for his work on optical character recognition and computer vision using convolutional neural networks (CNNs) and is often regarded as the inventor of convolutional nets. He is also a co-creator of the DjVu image compression technology, and together with Léon Bottou he created the Lush programming language. LeCun has academic and industrial experience spanning artificial intelligence, machine learning, deep learning, computer vision, intelligent data analysis, data mining, data compression, digital library systems, and robotics.
LeCun recently co-authored a paper with Adrien Bardes and Jean Ponce, submitted to NeurIPS, titled “VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning” (arXiv link), which has received 12 citations since May. To everyone’s surprise, however, the paper was rejected.
Overview of the article
VICReg is a simple self-supervised image representation learning approach that decomposes the problem into three principles: learning invariance to different views of an image with an invariance term, avoiding representation collapse with a variance regularisation term, and spreading information across the dimensions of the representation with a covariance regularisation term.
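The three terms above can be sketched as a single loss over two batches of embeddings from two views of the same images. This is a minimal NumPy sketch, not the authors’ implementation; the function name, argument names, and default coefficients are illustrative (the coefficients follow common settings reported for the method, but check the paper before relying on them).

```python
import numpy as np

def vicreg_loss(z_a, z_b, sim_coeff=25.0, std_coeff=25.0, cov_coeff=1.0, eps=1e-4):
    """Sketch of the VICReg objective for embeddings z_a, z_b of shape
    (batch, dim), produced from two augmented views of the same images."""
    n, d = z_a.shape

    # Invariance term: mean-squared distance between the two views' embeddings.
    inv = np.mean((z_a - z_b) ** 2)

    # Variance term: a hinge that keeps the std of each embedding dimension
    # above 1, which prevents the representations from collapsing to a point.
    std_a = np.sqrt(z_a.var(axis=0) + eps)
    std_b = np.sqrt(z_b.var(axis=0) + eps)
    var = np.mean(np.maximum(0.0, 1.0 - std_a)) + np.mean(np.maximum(0.0, 1.0 - std_b))

    # Covariance term: push off-diagonal entries of each batch covariance
    # matrix toward zero, decorrelating dimensions so information spreads
    # across the whole representation.
    za = z_a - z_a.mean(axis=0)
    zb = z_b - z_b.mean(axis=0)
    cov_a = (za.T @ za) / (n - 1)
    cov_b = (zb.T @ zb) / (n - 1)
    off_diag = lambda m: m - np.diag(np.diag(m))
    cov = (off_diag(cov_a) ** 2).sum() / d + (off_diag(cov_b) ** 2).sum() / d

    return sim_coeff * inv + std_coeff * var + cov_coeff * cov
```

Note that, unlike contrastive losses, nothing here compares an image against negative samples: collapse is avoided purely by the variance and covariance regularisers.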

VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning.
— Yann LeCun (@ylecun) May 12, 2021
By Adrien Bardes, Jean Ponce, and yours truly. https://t.co/Ih4nRoMZYv
Insanely simple and effective method for self-supervised training of joint-embedding architectures (e.g. Siamese nets).
1/N
On several downstream tasks, VICReg achieves results on par with the state of the art, pushing the frontier of non-contrastive self-supervised learning.
Image Source: VICReg.
The computational and memory cost of VICReg is dominated by the computation of the covariance matrix for each processed batch, which is quadratic in the dimension of the projection vectors. Experiments show that increasing the projection dimension significantly improves performance, suggesting the need for alternative redundancy-reduction methods that do not require computing the full covariance matrix.
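The quadratic dependence is easy to see: the per-batch covariance matrix of a (batch, dim) projection has shape (dim, dim), so doubling the projection dimension quadruples its size. A quick illustrative check (function name and signature are hypothetical, for demonstration only):

```python
import numpy as np

def covariance_cost(batch, dim, seed=0):
    """Return the shape and element count of the per-batch covariance
    matrix that VICReg regularizes, to illustrate its O(dim^2) footprint."""
    z = np.random.default_rng(seed).normal(size=(batch, dim))
    zc = z - z.mean(axis=0)
    cov = (zc.T @ zc) / (batch - 1)
    return cov.shape, cov.size
```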
Image source: Table 2 results
When combined with methods like SimSiam, the hinge-based variance term prevents collapse and eliminates the need for batch normalisation or a predictor.
Image source: Table 3
The authors stated that future research will explore how approximation techniques and entirely new redundancy-reduction methods based on higher-order statistics might overcome this quadratic barrier.
How legends cope with adversity
In response to a question on LinkedIn, “Is it possible for the NeurIPS 2021 reviewers to reject a paper written by Prof Yann LeCun, given that he is widely regarded as one of the best minds in the field of artificial intelligence?”, he responded: “Over the years, I’ve had a number of papers rejected by NeurIPS. NeurIPS reviews are conducted in a double-blind fashion, which means that the reviewers are unaware of the authors’ identities. That is generally a good thing: a paper should not be accepted just on the basis of the celebrity of one of the authors. Indeed, many of my rejected articles were properly rejected: they were written by students and did not impress me. However, it was beneficial for pupils to receive input from reviewers. A handful of the articles that were rejected were ones that I thought were extremely good and fascinating. One of them is this.”
Rejection is notorious in academic publishing. Many scholars wait anxiously to see whether their papers will be accepted at that year’s NeurIPS conference. Papers are rejected for various reasons, ranging from easily avoidable errors and omissions to simply falling outside a venue’s scope. Even a manuscript that appears to be of no interest to anyone may well be of interest to someone. Nobody wants to see their paper go unpublished after years of research and months of writing and formatting.
LeCun has many feathers in his cap. A member of the US National Academy of Sciences and the National Academy of Engineering, he is also a recipient of the American Academy of Achievement’s Golden Plate Award and, alongside his fellow scholars, the Turing Award. Naturally, the rejection of his paper has stunned many in academia and the AI field. In a recent tweet, he clarified:
It's not a problem with NeurIPS, but a general problem with the reviewing habits of highly-selective conferences in growing fields.
— Yann LeCun (@ylecun) September 30, 2021
Similarly, in a LinkedIn post, he stated, “I’m happy to report that the article has been rejected from NeurIPS 2021.”
Source: Linkedin post
As he explained, and as seems the likely explanation, the rejection was the result of double-blind review. Whether that is a good thing remains open to discussion, although LeCun maintains that it is.