Why OpenAI Needs to be Singled Out in the Troubled Tech Valley 

Tech companies are walking a thin line between ‘making products better’ and theft

AI models that mimic humans are built at the expense of everything you search, read or click on the internet. Whether it is your Instagram photos, your conversations with chatbots like Bing, or your emails, all of it becomes a trove of personal data for big tech. Silicon Valley as a whole has been feeding on the public’s data, but fingers have been pointed particularly at its current celebrity, OpenAI, for its closed-door practices.

The Californian AI research lab released a 98-page technical report on GPT-4 earlier this year, which critics deemed neither transparent nor “open” in any meaningful way. The Sam Altman-led company disclosed no details about the architecture (including model size), the hardware, the size of the training dataset, or the training method, making it the company’s most secretive release so far.

Emily M Bender, a professor of linguistics at the University of Washington, said the secrecy did not come as a surprise to her. “They are wilfully ignoring the most basic risk mitigation strategies, all while proclaiming themselves to be working towards the benefit of humanity,” she tweeted.


This hush-hush approach is unfair to creators, who now find it difficult to know whether their work has been scraped. It also makes it nearly impossible to prove intellectual property and copyright claims in court, giving OpenAI a legal, yet unfair, advantage.

The paper drew enormous backlash, with critics arguing that the company withheld the details to protect its dominant position in the market. OpenAI is at fault, and so is every other company building AI models that mimic the way humans work, play and create. We have simply been taking these companies’ word for it, and as a result, privacy is a mess.

Safe, private, and secure: theoretically 

After some initial struggle, OpenAI wriggled out of its ban in Italy by providing limited privacy controls. Even Google’s writing assistant Smart Compose, trained on the public’s Gmail data, is switched off by default under European Union law.

Tech companies run into minor legal trouble every now and then, but they usually dodge the bullet by paying penalties, finding a legal loophole, or tweaking their policies here and there.

In less than a decade, tech companies including Google, Apple, Meta and Amazon have collectively been fined over $30 billion. Fines are meant to be more than just a ‘cost of doing business’ for tech giants; as Isabelle de Silva, president of the French Competition Authority, declared, “fines are an element of the identification of what is wrong in the conduct”.

A recent exposé by Geoffrey A Fowler of The Washington Post poses a thought-provoking question: “Which data of ours is and isn’t off limits?” The investigative piece takes a deep dive into how Valley companies use your data, and how little you can do about it. Fowler notes that much of the answer is wrapped up in lawsuits, investigations and, hopefully, some new laws. In the meantime, big tech is making up its own rules.

Drawing a line 

The debate around tech companies and their Orwellian mass collection of data has been raging for years. Their refusal to budge from these practices has repeatedly caused outrage, eventually forcing authorities to step in.

Mozilla has launched a campaign calling on Microsoft to come clean. As part of the campaign, four lawyers, three privacy experts and two campaigners examined the software giant’s updated Services Agreement, which goes into effect on September 30. Surprisingly, none of the experts could tell what Microsoft plans to do with your personal data. “If nine experts in privacy can’t understand what Microsoft does with your data, what chance does the average person have?” the announcement asked.

Exactly a year ago, the Federal Trade Commission (FTC) announced an initiative to draft rules cracking down on what it considers “harmful commercial surveillance”, or “the business of collecting, analyzing and profiting from information about people”. There has been no update since.

Tech companies are walking a thin line between ‘making products better’ and theft. To make matters worse, these AI companies have been in a constant tussle with their in-house ethicists and with ethicists globally.

Even though OpenAI started out as a non-profit champion, it has become part of the money-making circus in the Bay Area. The company’s darling, ChatGPT, has given artists and authors plenty of reasons to drag the startup to court, yet its tight-lipped approach has given it leverage over the others. Ironically, when GPT-2 was released in 2019, Jack Clark, OpenAI’s former policy director, said that rather than acting as if the danger isn’t there, “it’s better to talk about AI’s dangers before they arrive”.

Tasmia Ansari
Tasmia is a tech journalist at AIM, looking to bring a fresh perspective to emerging technologies and trends in data science, analytics, and artificial intelligence.
