TOKYO -- Japan will make companies responsible for explaining decisions made by artificial intelligence software they use, according to a government draft of legal guidelines shared with Nikkei.
The draft guidelines stipulate that AI should not infringe on basic human rights. Personal information should be handled carefully, they continue, and the security of AI systems must be ensured. The guidelines also call for maintaining a fair competitive playing field, making AI more accessible by improving education, and building an environment that encourages cross-border data sharing.
A top goal is to increase transparency around how AI makes decisions, such as whether to extend a loan or hire someone for a job. A lack of clarity in AI's decision-making standards can leave the person being evaluated dissatisfied or uneasy.
There are also fears that AI could factor gender or ethnicity into a decision on whether to hire someone, for instance, without the knowledge of even the company employing it. Assigning people the ultimate responsibility for clearly explaining such decisions is expected to ease fears surrounding the use of the technology.
The seven guidelines will be officially unveiled next month by a government council on forming principles for a "human-centric" AI society, chaired by University of Tokyo Professor Osamu Sudo. Japan will call on Group of 20 members to adopt the rules at June summit meetings in Osaka.
The basic rules will serve as a basis for crafting legislation, with an eye toward sparing foreign businesses in Japan the turmoil of operating under rules specific to themselves or to their home countries.
"Japan is behind on establishing rules for AI," said Kenji Toyoda at the Mizuho Information & Research Institute. He stressed that it was "critical to play a part in setting rules, not just outside the European Union but also together with all members of the Organization for Economic Cooperation and Development," which includes a number of EU countries. The EU aims to publish its own set of AI guidelines by the end of this year.