TOKYO Being personally offended by a robot may seem absurd, but one of these days you could find yourself on the receiving end of a rather nasty android. At least that's what the Japanese Society for Artificial Intelligence thinks. The group recently announced ethics guidelines covering not only people engaged in artificial intelligence, but also the entities they create.
"We have made sure that the guidelines will also apply to [future] risks, such as an AI created by an AI," said Yutaka Matsuo, a project associate professor at the University of Tokyo and one of the guidelines' authors.
The move underscores public concern about inappropriate use of AI. While recognizing AI's undeniable benefits, the group wants people to realize that its misuse could have unfortunate consequences. To prevent this, the guidelines urge researchers to "listen to various opinions in society" and be guided by what they hear.
The futuristic tone of the document can be found in a provision that specifically requires AI entities to adhere to the same ethics as their flesh-and-blood creators. "We included it, considering that in the future an AI [entity] could receive citizenship," said Toyoaki Nishida, a professor at Kyoto University and another author of the guidelines. Nishida believes that no one has ever delved into this issue so deeply.
Some AI entities are self-learning, which is not a bad thing in itself. It's only when they learn the wrong things that dealing with them gets iffy. Tay, an AI chatbot released by Microsoft in 2016, was shut down after only 16 hours due to a series of inflammatory remarks. And when AI entities start creating other entities like themselves, problems could multiply if there isn't a code of ethics governing both researchers and AI.
Ethics guidelines are fairly common for academic societies in the medical field, but rare for engineering groups. "We hope the guidelines will be a starting point for discussion between society and AI researchers," said Hideaki Takeda, a professor at the National Institute of Informatics who also contributed to the JSAI document.