Robotic Companions: Some Ethical Considerations about Designing a Good Life with Robots
Lawrence M. Hinman, Ph.D.
Professor of Philosophy
Co-Director, Center for Ethics in Science & Technology, University of San Diego
May 17, 2009

Overview
Definition: Robotic Companions
The General Question
Designing a good life that encompasses both humans and robots
Ethics as experimental science
Seven Specific Questions
Transforming filial responsibility
Transforming expectations of humans
Designed for honesty
Sexual companions
Robotic fungibility
Robots as slaves
Summary
Conclusion

Definition: Robotic Companions
Principal focus is on sociable robots (following Breazeal et al.):
Roughly humanoid in appearance
Fairly autonomous
Capable of emotion recognition and voice recognition
Basic drive to care for others
Capable of expressing information
Capable of expressing (the appearance of) emotions

The General Question
Distinguish two conceptions of ethics:
Negative, other-directed: focuses on how others are wrong.
Positive, future-directed: focuses on how we can create a good life together.
The general question here is what counts as a good life together, one that encompasses both humans and robotic companions. It is part of a larger domain that includes cyborgs, animals, and more autonomous robots.
This means that ethics must do empirical research to determine the ways in which humanity is being transformed. What follows specifies areas for research, not a priori answers.

Ethics as an Experimental Science
This suggests that the job of moral philosophers is not to dictate right and wrong but to highlight areas of concern for research.
Nadeau suggests that artificial intelligence works by heuristics, and that one theory of moral reasoning is itself heuristic in character: rule utilitarianism. The idea is that from experience one learns which patterns of behavior have caused benefit and which have caused harm, and that experience is generalized into moral rules of thumb that guide ethical action. These rules of thumb are defaults: they can be overridden in circumstances where it becomes evident that following them would cause harm or fail to do good.
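
Read concretely, this is a default-and-override scheme. The sketch below is only an illustration of that scheme in Python; the class, rule names, and numeric values are hypothetical and not drawn from the presentation.

```python
# Illustrative sketch of rule-utilitarian heuristics as defaults.
# Rules generalize past outcomes; a rule is overridden when following it
# in the current situation is expected to cause net harm.

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class RuleOfThumb:
    name: str
    action: str
    learned_benefit: float  # benefit generalized from past experience

def choose_action(rules: Dict[str, RuleOfThumb],
                  situation: str,
                  expected_harm: Callable[[str, str], float]) -> str:
    """Follow the strongest default rule unless applying it here would cause net harm."""
    best = max(rules.values(), key=lambda r: r.learned_benefit)
    # Override the default when its expected harm in this situation
    # outweighs the benefit generalized from past cases.
    if expected_harm(best.action, situation) > best.learned_benefit:
        return "withhold_and_reassess"
    return best.action

# Toy rules learned from experience (values made up for illustration).
rules = {
    "tell_truth": RuleOfThumb("tell_truth", "report_accurately", 0.8),
    "offer_comfort": RuleOfThumb("offer_comfort", "reassure", 0.5),
}

def expected_harm(action: str, situation: str) -> float:
    # A real system would estimate this from experience; here it is hard-coded.
    if action == "report_accurately" and situation == "fragile_patient":
        return 1.0
    return 0.1

print(choose_action(rules, "routine_checkin", expected_harm))   # -> report_accurately
print(choose_action(rules, "fragile_patient", expected_harm))   # -> withhold_and_reassess
```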

Filial Responsibility
The first interesting question is about the possible ways that companion robots can transform our understanding of filial responsibility.
The moral contours of human life are shaped by certain basic events, including:
Being born
Creating new life (conceiving)
Working
Dying
Being nurtured
Nurturing
Q1: How will the widespread use of companion robots transform our experience of nurturing and being nurtured?

Changing Expectations about Humans
Companion robots can be extraordinarily patient, tolerant, and supportive, often far more so than their human counterparts.
The second interesting question concerns the impact that human-robot interactions will have on human-human interactions. Bluntly put, will we come to prefer robots?
Q2: How will the widespread use of companion robots change our expectations about other humans? Will we expect more of them?

Designed for Honesty
We face a number of interesting questions about the honesty of companion robots. Here are two.
Q3: Should companion robots always tell the exact truth to their charges? Imagine someone asking his companion robot whether he looks healthy today. The robot might always tell the truth, might always say only positive things, or might exaggerate the positive by 10%.
Q4: Should companion robots always report accurately on their charges to their supervisors? It would be surprising if companion robots didn't eventually include a reporting function to send information back to supervisors.
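
Q3 is, at bottom, a design parameter. The following minimal Python sketch illustrates the three reporting policies just mentioned; the function name, score scale, and threshold are hypothetical choices made purely for illustration.

```python
# Illustrative sketch of three candidate honesty policies for a companion
# robot reporting a health score back to its charge (names are hypothetical).

def report_health(true_score: float, policy: str) -> str:
    """true_score is in [0, 100]; higher means healthier."""
    if policy == "always_truthful":
        reported = true_score
    elif policy == "only_positive":
        # Never report anything below a neutral baseline.
        reported = max(true_score, 70.0)
    elif policy == "optimistic_bias":
        # Exaggerate the positive by 10%, capped at the maximum score.
        reported = min(true_score * 1.10, 100.0)
    else:
        raise ValueError(f"unknown policy: {policy}")
    return f"Your health score today looks like {reported:.0f}/100."

for policy in ("always_truthful", "only_positive", "optimistic_bias"):
    print(policy, "->", report_health(62.0, policy))
```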

Sexual Companions
The next interesting question is whether we should allow such robots to provide sexual stimulation or satisfaction to their charges.
Q5: Should companion robots function as sexual companions?

Fungibility
Many objects are fungible: one instance can be substituted for another without loss or change. A dollar bill is a paradigm case; one is just as good as any other.
Objects of emotional attachment are generally not fungible. If I am married to someone who has a twin, I could not substitute the twin for my spouse in the way I could substitute one dollar bill for another.
Q6: Should companion robots be treated as fungible? In other words, are robotic companions to be seen as interchangeable, or as irreplaceable individuals?

Robots and Slaves
I wonder whether we don't implicitly think of robotic companions as slaves: available to do our bidding, but not centers of interest in themselves.
Q7: Should companion robots be designed and treated as slaves? I don't know the answer to this question, but it is implicit in several of the preceding questions. We might come to understand some of the possible dangers here by looking at the literature on slavery: Aristotle on the natural slave, Hegel on the master-slave dialectic, Marx on Hegel, narrative accounts of slaves, and so on.

Summary
Q1: How will the widespread use of companion robots transform our experience of nurturing and being nurtured?
Q2: How will the widespread use of companion robots change our expectations about other humans? Will we expect more of them?
Q3: Should companion robots always tell the exact truth to their charges?
Q4: Should companion robots always report accurately on their charges to their supervisors?
Q5: Should companion robots function as sexual companions?
Q6: Should companion robots be treated as fungible?
Q7: Should companion robots be designed and treated as slaves?

Conclusion
Barring some major disaster, companion robots will be a fact of life in the near future. The interesting question is how we can construct a good life that encompasses both companion robots and human beings. The preceding seven questions are intended to highlight areas of concern: factors that might make it more difficult to construct such a life together.