For years there has been talk of whether software engineers should take a Hippocratic oath to safeguard the world from the adverse impact of the technologies they develop, whether at companies or in research labs. The Hippocratic oath is well known in the medical profession, where doctors pledge to act for the welfare of their patients and uphold the ethics of the medical community.
Many believe that a similar oath for software developers and engineers would keep them from misusing algorithms or writing harmful code. Such an ethical pledge would commit developers to think deeply about the possible use cases and impact of their work, and to take on only those tasks that do not harm society.
The oath is meant to stop software professionals from performing tasks that could harm a company, its users, or society at large. The debate is especially crucial in this era of AI and autonomous systems, where software developers build systems that gather and utilise personal data.
Microsoft President Comes Out In Favor Of The Hippocratic Oath For Software Engineers
Many business leaders have called for a Hippocratic oath for software developers. In a recent interview, Microsoft President Brad Smith argued that the industry should adopt a Hippocratic Oath for software engineers, and he has even laid out six ethical principles within Microsoft.
In the foreword to Microsoft’s recent book, The Future Computed, executives Brad Smith and Harry Shum proposed that Artificial Intelligence (AI) practitioners highlight their ethical commitments by taking an oath analogous to the Hippocratic Oath sworn by doctors for generations.
There is currently no ethical code of conduct governing how autonomous systems should be constructed, despite the impact those systems have on the world, especially if they end up in the hands of malicious entities.
According to Smith, a Hippocratic Oath for AI would uphold ethical standards, spark the broader global conversation about the ethics of AI that is needed, and ultimately lead to a new legal framework.
Software Engineers Say It’s Tech Business Leaders Who Need To Take The Oath
But developers have pointed out that the biggest risk in software ethics lies not with individual engineers but with the software engineering process itself. Businesses often fail to enforce ethical processes and standards because of the pressure to get a product to market, leaving products with technical debt that developers never get the time to patch. The onus, then, may fall more on tech businesses than on software engineers or developers.
Senior software developer Carl Vitullo writes on Twitter, “Engineers aren’t who the buck stops with, so a Hippocratic oath being taken by them is a farce. We need management, sales, and product to take a Hippocratic oath also.”
According to many developers, by the time jobs are handed to engineers, most of the ethical judgments have already been made by product managers, designers, and stakeholders focused on their own ends. Software engineers are accountable to their bosses before their users. So it is business leaders, particularly the leaders of large companies, who should be taking the oath, rather than putting the onus on developers who in reality have little say in whether the software they build is ethical.
While a Hippocratic oath for developers may seem more altruistic than the alternatives, it is not enough on its own: legislation, standards-setting, software audits, and ethical incentives will have a far more significant impact than a software engineer taking an oath.
In software engineering, doing no harm begins with a deep understanding of technology. The question then arises: will developers without a proper engineering degree be asked to take an oath without being fully trained in software, or will a computer science education become mandatory for all developers to acquire a licence to do software work? Either way, software developers would need in-depth knowledge and training in how to write systems and software securely. If this happens, it may lead to a further scarcity of software talent and may also stifle innovation.
Vishal Chawla is a senior tech journalist at Analytics India Magazine and writes about AI, data analytics, cybersecurity, cloud computing, and blockchain. Vishal also hosts AIM's video podcast, Simulated Reality, featuring tech leaders, AI experts, and innovative startups of India.