The leaked Apple document that explains the neutral character of Siri

Siri was the first major virtual assistant to reach the masses. Introduced with the iPhone 4s, it has gained more and more capabilities over the years, with artificial intelligence as its defining feature. But how does Apple define the character it gives Siri's artificial intelligence? A document leaked last week reveals some of the keys, along with the possible existence of a new Siri-equipped device.

According to a document obtained by The Guardian (in the wake of the controversy over Siri's listening practices), Apple has designed Siri to be as neutral as possible. The document shows how Siri is written to steer clear of trouble whenever it is asked about controversial issues. To avoid controversy, it sometimes even offers general or unsatisfying answers rather than engaging with the subject.

The Guardian uses feminism as an example. If you ask the Apple assistant whether it is a feminist, it responds with an answer that dodges the question, neither confirming nor denying it, for example: "I strongly believe in equality and respect in the treatment of people." Alexa and Google Assistant, by contrast, state clearly that they are. On other issues, such as homosexuality or racism, Siri simply offers no opinion.

A guide inspired by Asimov

As a curiosity, the leaked document also explains how Siri's ethics should work. For this it draws on the laws of robotics of science fiction author Isaac Asimov. In addition to the three laws of robotics, it includes rules created specifically as a guide for Siri's behavior. For example:

"An artificial being must not represent itself as human, nor through omission allow the user to believe that it is one."

"An artificial being must not violate the human ethical and moral standards that are commonly held in its region of operation."

"An artificial being must not impose its own principles, values or opinions on a human being."

Siri's answers and general character matter especially when you consider that it is an assistant used daily by millions of people. A response with an ideological stance tilted toward one side rather than the other could not only spark controversy but also create problems for the company itself.

Via | The Guardian
