Description: Can security automata (robots and AIs) correctly make moral decisions to apply force to humans? If they can make such decisions, ought they be used to do so? Will security automata increase or decrease aggregate risk to humans? What regulation is appropriate? Addressing these important issues, this book examines the political and technical challenges of the robotic use of force.

The book presents accessible, practical examples of the 'machine ethics' technology likely to be installed in military and police robots, as well as in civilian robots with everyday security functions such as childcare. By examining how machines can pass 'reasonable person' tests to demonstrate measurable levels of moral competence and display the ability to determine the 'spirit' as well as the 'letter' of the law, the author builds on existing research to define the conditions under which robotic force can and ought to be used to enhance human security.

The scope of the book is thus far broader than 'shoot to kill' decisions by autonomous weapons, and it should attract readers from the fields of ethics, politics, and legal, military and international affairs. Researchers in artificial intelligence and robotics will also find it useful.
Pages
—
Format
PDF, EPUB & Kindle Edition
Publisher
—
Release
—
ISBN
1138050229
Ethics and Security Automata: Policy and Technical Challenges of the Robotic Use of Force (Emerging Technologies, Ethics and International Affairs)