IS 376 Ethics and Convergence


IS 376 Ethics and Convergence. Dr. Kapatamoyo, 11/20/14

Defining Terms Society: an association of people organized under a system of rules designed to advance the good of its members over time. Every society has rules of conduct describing what people ought and ought not to do in various situations; these rules are its morality. A person may belong to several societies at once, which can lead to interesting moral dilemmas. Ethics: the study of morality; a rational examination of people's moral beliefs and behavior.

Broad Issues Forming communities allows us to enjoy better lives than we could in isolation: communities facilitate the exchange of goods and services. There is, however, a price associated with being part of a community. Communities impose certain obligations and prohibit some actions. Responsible community members take the needs and desires of other people into account when they make decisions, recognizing that virtually everybody shares the "core values" of life, happiness, and the ability to accomplish goals.

The Law of Unintended Consequences Human actions have unintended or unforeseen effects. These effects can be positive or negative, and in some cases perverse (the opposite of what was originally intended). Interactions with robots will also generate unforeseen consequences. This law therefore applies to the study of information technology just as it applies to any other kind of technology.

Is Technology Neutral? A central issue of contention between technological determinists and social determinists is whether technology is neutral. Technological determinism holds that technology is value-free and therefore neutral: its technical features determine how people may use a particular technology. Social determinism argues that technology is value-laden and cannot be neutral (it does not exist in a vacuum): which features are put there in the first place, and who makes that decision?

Impact of IT IT can have an impact at several levels: individual, group, organizational, societal, national, and global. That impact can be social, economic, political, legal, psychological, historical, or ethical.

Isaac Asimov (1942) Science fiction writer Isaac Asimov introduced the Three Laws of Robotics in his short story "Runaround": (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. (2) A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
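
The Three Laws form a strict priority ordering: the First Law overrides the Second, and the Second overrides the Third. The sketch below is a minimal, hypothetical illustration of that ordering as an action-selection rule; it is not from the lecture, and every name in it (Action, harms_human, obeys_order, and so on) is invented for the example.

# A minimal, hypothetical sketch (not from the lecture) of the Three Laws as a
# strict priority ordering over candidate actions. Every name here (Action,
# harms_human, obeys_order, ...) is invented for illustration.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool              # would this action injure a human?
    allows_harm_by_inaction: bool  # would it let a human come to harm through inaction?
    obeys_order: bool              # does it follow a human's order?
    preserves_self: bool           # does it protect the robot's own existence?

def permitted_by_first_law(a: Action) -> bool:
    # First Law: no injuring humans, and no allowing harm through inaction.
    return not a.harms_human and not a.allows_harm_by_inaction

def choose_action(candidates: list[Action]) -> Action | None:
    # Actions that violate the First Law are excluded outright.
    lawful = [a for a in candidates if permitted_by_first_law(a)]
    if not lawful:
        return None
    # Second Law (obedience) outranks Third Law (self-preservation).
    lawful.sort(key=lambda a: (a.obeys_order, a.preserves_self), reverse=True)
    return lawful[0]

if __name__ == "__main__":
    options = [
        Action("ignore the order and stay safe", False, False, False, True),
        Action("obey the order at some risk", False, False, True, False),
    ]
    print(choose_action(options).name)  # -> "obey the order at some risk"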

Towards an Ethical Robot Alan Winfield (England): if robots are to be trusted, especially when interacting with humans, they will need to be more than just safe. They will need to predict the consequences both of their own actions and of the actions of other dynamic actors in their environment. With an 'ethical' action-selection mechanism, a robot can sometimes choose actions that compromise its own safety in order to prevent a second robot from coming to harm.
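
Below is a simplified sketch of that idea, assuming the robot has some forward model that predicts harm to itself and to the other agent for each candidate action. It is only an illustration of consequence-based action selection, not Winfield's actual implementation; the function names and the toy outcome table are hypothetical.

# A simplified, hypothetical sketch of consequence-based action selection in
# the spirit of the mechanism described above; it is not Winfield's actual
# implementation. The forward model and the toy outcome table are invented.

def simulate(action, predicted_outcomes):
    """Stand-in forward model: returns predicted (harm_to_other, harm_to_self)."""
    return predicted_outcomes[action]

def select_action(candidates, predicted_outcomes):
    def score(action):
        harm_to_other, harm_to_self = simulate(action, predicted_outcomes)
        # Preventing harm to the other agent dominates; harm to self only
        # breaks ties, so the robot may accept risk to protect the other.
        return (harm_to_other, harm_to_self)
    return min(candidates, key=score)

if __name__ == "__main__":
    # Toy scenario: intercepting blocks the other robot from a hazard but
    # exposes this robot to moderate risk; staying put is safe but lets the
    # other robot come to harm.
    predicted_outcomes = {
        "stay put":  (1.0, 0.0),   # other robot harmed, self safe
        "intercept": (0.0, 0.4),   # other robot safe, self at moderate risk
    }
    print(select_action(list(predicted_outcomes), predicted_outcomes))  # -> "intercept"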

Ethics Ethics is the study of what it means to "do the right thing." It is often equated with moral philosophy because it concerns how one arrives at specific moral choices. Ethical theory posits that people are rational, independent moral agents who make free choices. Computer ethics is the branch of ethics that deals specifically with moral issues in computing as a profession (other professions have their own specific ethical standards as well).

The Michael Industrial Complex

Why Study Ethics? Society is changing rapidly as it incorporates the latest advances in information technology, and many of the resulting interactions have to be analyzed from an ethical standpoint. Ethics is the rational, systematic analysis of conduct that can cause benefit or harm to other people. It's important to note that ethics focuses on the voluntary, moral choices people make because they have decided they ought to take one course of action rather than an alternative; it is not concerned with involuntary choices or with choices outside the moral realm.

3 Perspectives on CyberEthics Number 1 Professional Ethics: the main emphasis here is on the design, development, and maintenance of technologies. Most professions have this sort of field-specific ethics.

3 Perspectives on CyberEthics Number 2 Philosophical Ethics: ethical issues here typically involve concerns of responsibility and obligation affecting individuals as members of a particular profession, or broader concerns such as social policies and individual behavior. Applying ethics proceeds in three distinct stages: (1) identify a particular controversial practice as a moral problem; (2) describe and analyze the problem by clarifying concepts and examining the factual data associated with it; (3) apply moral theories and principles in the deliberative process in order to reach a position on that particular moral issue.

3 Perspectives on CyberEthics Number 3 Sociological Ethics: this perspective is non-evaluative; it focuses on particular moral systems and reports how members of various groups and cultures view particular moral issues. It is descriptive ethics, whereas the first two perspectives (Numbers 1 and 2) are normative ethics.

Normative vs. Descriptive Ethics Normative ethics focuses on what we should do, that is, on establishing practical moral standards. Descriptive ethics focuses on what people actually believe to be right or wrong, the moral values (or ideals) they hold, how they behave, and what ethical rules guide their moral reasoning.

Deontological Views: Key Principles The principal philosopher in this tradition is Immanuel Kant (1724-1804, Königsberg, Prussia; now Kaliningrad, Russia). Ethical decisions should be made solely by considering one's duties and the absolute rights of others: "Act only according to that maxim whereby you can at the same time will that it should become a universal law." There are four key principles to Kant's Categorical Imperative: (1) the principle of universality: rules of behavior should apply to everyone, with no exceptions; (2) logic or reason determines the rules of ethical behavior; (3) treat people as ends in themselves, never merely as means to ends; (4) absolutism of ethical rules, e.g., it is wrong to lie, no matter what.

Utilitarianism Founded by Jeremy Bentham (1748-1832) and developed by John Stuart Mill (1806-1873, London, England). An ethical act is one that maximizes the good, or "utility," for the greatest number of people. Consequences are quantifiable and are the main basis of moral decisions. Rule-utilitarianism applies the utility principle to general ethical rules rather than to individual acts: the rule that would yield the most happiness for the greatest number of people should be followed. Act-utilitarianism applies the utility principle to individual acts: we must consider the possible consequences of all our possible actions and then select the one that maximizes happiness for all people involved.
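
Act-utilitarian reasoning is essentially a choice of the act with the greatest total utility. The toy sketch below illustrates that calculation; the acts, the people, and the utility numbers are all hypothetical, chosen only to make the arithmetic visible.

# A toy illustration (not from the lecture) of act-utilitarian choice: sum the
# estimated utility of each candidate act over every affected person and pick
# the act with the greatest total. The acts, people, and numbers are invented.

def act_utilitarian_choice(acts):
    """acts maps an act name to a dict of estimated utility per affected person."""
    def total_utility(act):
        return sum(acts[act].values())
    return max(acts, key=total_utility)

if __name__ == "__main__":
    candidate_acts = {
        "keep the promise":  {"alice": +3, "bob": +1, "carol": 0},   # total +4
        "break the promise": {"alice": -4, "bob": +5, "carol": +1},  # total +2
    }
    print(act_utilitarian_choice(candidate_acts))  # -> "keep the promise"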

Natural Rights A founding figure of this tradition is John Locke (1632-1704, England). Natural rights are universal rights derived from the law of nature, i.e., inherent rights that people are born with. Ethical behavior must respect a set of fundamental rights of others, including the rights to life, liberty, and property.

Situational Ethics Originally developed by Joseph Fletcher (1905-1991, Newark, NJ). There are always 'exceptions to the rule': the morality of an act is a function of the state of the system at the time it is performed. "Each situation is so different from every other situation that it is questionable whether a rule which applies to one situation can be applied to all situations like it, since the others may not really be like it. Only the single law of love (agape) is broad enough to be applied to all circumstances and contexts." Fletcher was also a pioneer in the field of bioethics, involved in debates over abortion, infanticide, euthanasia, eugenics, and cloning.

Negative Rights vs. Positive Rights Negative rights (or liberties) are rights to act without interference, e.g., the rights to life, liberty, and property, or the right to vote; you cannot demand (or expect) that facilities be provided to you in order to exercise them. Positive rights (or claim-rights) are rarer: these are rights that impose an obligation on some people to provide certain things to others, such as free education for children. Controversies often arise over whose rights, or which rights, should take precedence.

Normative Questions Should computers, computer systems, and robots make human-level decisions? If so, what then? Who shoulders the liability?