There is a Need for Principles to Govern AI: Microsoft Chief

Nadella said Microsoft had to develop software that was safe by design, and the same now applies to artificial intelligence (AI)

As the creators of artificial intelligence (AI), we need principles that can govern AI, said Satya Nadella, Chief Executive Officer of Microsoft, adding that privacy should be treated as a human right going forward.

There is a need for tough discussions on global standards amid the ‘techlash’ facing world leaders, he said while speaking at the World Economic Forum Annual Meeting in Davos, Switzerland.

Nadella said Microsoft had to develop software that was safe by design, and the same now applies to AI.

However, the concern is the ‘black box’. Today, AI is driven mainly by data, but the latest advances will help explain the black box so that it can be governed both ethically and through regulation, he added.

As AI algorithms grow more capable, they have also become harder to interpret, and it is difficult for humans to understand how a particular machine learning (ML) model arrives at its decisions. This opacity is what is known as the ‘black box’.
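As a rough illustration of what “explaining the black box” can mean in practice, the sketch below probes an opaque model with permutation importance. It is a minimal example assuming scikit-learn is installed; the dataset and model choice are illustrative only and are not anything Microsoft or Nadella referenced.

```python
# Minimal sketch: probing a "black box" model with permutation importance.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A random forest is a typical "black box": accurate, but its internal
# decision process is hard for a human to follow directly.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Permutation importance asks: how much does test accuracy drop when each
# feature is shuffled? Large drops flag the features the model relies on,
# giving a partial, post-hoc view into how it makes decisions.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
ranked = sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1])
for name, score in ranked[:5]:
    print(f"{name}: {score:.3f}")
```

Techniques like this do not open the model itself; they only approximate which inputs drive its output, which is one way explainability tools try to make opaque systems auditable.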

‘Privacy is a human right’

Nadella further said, “The world is a tech sector. Every retail company, every health company will have to think of data. We should think of privacy as a human right.”

“We have to begin from the main principle that data is owned by the user and not only the technology sector, but the entire economy must also get to grips with this,” he said.

On facial recognition, the Microsoft chief said he could give examples of around 10 uses that are virtuous and could improve lives, as well as 10 uses that would be worrying.

“That is why Microsoft is putting together principles on fair use of technology. But self-regulation only goes so far. We would welcome regulation that stops it being a race to the bottom,” he said.
