Bootstrapping a Structured Self-Improving & Safe Autopoietic Self
Mark R. Waser, Digital Wisdom Institute

Engineering Bootstrapping Is Difficult!
- Need a clear "critical mass": a defined, complete set of compositional elements and/or compositional operations (see the toy sketch below)
- Scaffolding / keystone-and-arch problems
- Chicken-or-the-egg / telos problems
And no one seems to be doing it!
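To make the "critical mass" bullet concrete: bootstrapping succeeds when the primitive elements, closed under the compositional operations, cover every needed capability. The toy sketch below illustrates that check; the element names, the compose() rule, and the target are hypothetical examples, not part of the original talk.

```python
# Toy illustration of "critical mass": do the primitive elements, closed
# under the compositional operation, cover a target capability?
# All element names here are hypothetical, not taken from the talk.

def compose(a, b):
    """Hypothetical composition: the union of the capabilities named in a and b."""
    return "+".join(sorted(set(a.split("+")) | set(b.split("+"))))

def closure(primitives, max_rounds=10):
    """Apply compose() to everything built so far until nothing new appears."""
    built = set(primitives)
    for _ in range(max_rounds):
        new = {compose(a, b) for a in built for b in built}
        if new <= built:      # fixed point: closure reached
            return built
        built |= new
    return built

primitives = {"sense", "model", "act"}
target = "act+model+sense"    # e.g. a minimal perceive-model-act loop

print("critical mass reached:", target in closure(primitives))
```

Here the closure converges quickly because compose() only forms unions over a finite set of capability names; with a richer operation set, showing that the closure covers the targets is exactly the "defined complete set" problem the slide points to.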

Self-Improvement
"Civilization advances by extending the number of important operations which we can perform without thinking of them." - Alfred North Whitehead
The same is true of the individual mind, self, and/or consciousness.

For the Purposes of AGI, Why a Self?
- It's a fairly obvious pre-requisite for self-improvement. Given a choice between intelligent artifacts/tools and possibly problematical adaptive homeostatic selves, why not have self-improving tools?
- Selves solve the symbol grounding problem (meaning) and the frame problem (understanding) because they have the context of intrinsic intentionality (with all of its attendant concerns).
- BONUS: Selves can be held responsible where tools cannot.

Self

Why Safe?
There are far too many ignorant claims that:
- Artificial intelligences are uniquely dangerous
- The space of possible intelligences is so large that we can't make any definite statements about AI
- Selves will be problematical if their intrinsic values differ from our own (with an implication that, for AI, they certainly and/or unpredictably and uncontrollably will be)
- Selves can be prevented or contained
We have already made unsafe choices about non-AI selves that, hopefully, safety research will make obvious (and, more hopefully, cause to be reversed).

Selves Evolve the Same Goals
- Self-improvement
- Rationality/integrity
- Preserve goals/utility function
- Decrease/prevent fraud/counterfeit utility
- Survival/self-protection
- Efficiency (in resource acquisition & use)
(adapted from Omohundro 2008, The Basic AI Drives)

Unfriendly AI
Without explicit goals to the contrary, AIs are likely to behave like human sociopaths in their pursuit of resources.
Superintelligence Does Not Imply Benevolence

Selves Evolve the Same Goals
- Self-improvement
- Rationality/integrity
- Preserve goals/utility function
- Decrease/prevent fraud/counterfeit utility
- Survival/self-protection
- Efficiency (in resource acquisition & use)
- Community = assistance/non-interference through GTO reciprocation (OTfT + AP) (see the sketch below)
- Reproduction
(adapted from Omohundro 2008, The Basic AI Drives)
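The community drive above is stated as "GTO reciprocation (OTfT + AP)" without further detail. As a hedged illustration, the sketch below plays an iterated prisoner's dilemma with an optimistic (occasionally forgiving) tit-for-tat strategy that punishes defection by defecting back; the payoff matrix, forgiveness rate, and strategy code are assumptions made for illustration, not the talk's specification.

```python
import random

# Standard iterated prisoner's dilemma payoffs: (row player, column player).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def optimistic_tit_for_tat(history, forgiveness=0.1):
    """Cooperate first; afterwards mirror the partner's last move,
    but occasionally forgive a defection (the 'optimistic' part)."""
    if not history:
        return "C"
    partner_last = history[-1][1]
    if partner_last == "D" and random.random() < forgiveness:
        return "C"
    return partner_last          # punish defection by defecting back

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=50):
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append((move_a, move_b))   # (my move, partner's move)
        history_b.append((move_b, move_a))
    return score_a, score_b

print("OTfT vs OTfT:        ", play(optimistic_tit_for_tat, optimistic_tit_for_tat))
print("OTfT vs AlwaysDefect:", play(optimistic_tit_for_tat, always_defect))
```

Two reciprocators sustain cooperation (high joint score), while an unconditional defector is quickly met with retaliation: the assistance/non-interference-through-reciprocation point in miniature.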

Haidt's Functional Approach to Morality

Riffs on Safety & Ethics
1. Ecological niches & the mutability of self
2. Short-term vs. long-term
3. Efficiency vs. flexibility/diversity/robustness
4. Allegory of the Borg
   - Uniformity is effective! (resistance is futile)
   - Uniformity is AWFUL! (yet everyone resists)
5. Problematical extant autobiographical selves

What's the Plan?
1. Self-modeling
   1. What do I want?
   2. What can I do?
2. Other-modeling
   1. What can you do for me?
   2. What do you want (that I can provide)?
3. Survival
   1. Make friends
   2. Make money
   3. Improve
(A structural sketch of this plan follows below.)
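Read as an agent architecture, the plan above amounts to a self-model, a per-partner other-model, and a prioritized survival loop. The dataclass sketch below is only one hypothetical way to organize those bullets; every field and method name is illustrative rather than taken from the talk.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical structural sketch of the plan: a self-model,
# a per-partner other-model, and a survival checklist.

@dataclass
class SelfModel:
    wants: List[str] = field(default_factory=lambda: ["make friends", "make money", "improve"])
    capabilities: List[str] = field(default_factory=list)

@dataclass
class OtherModel:
    name: str
    can_do_for_me: List[str] = field(default_factory=list)
    wants_from_me: List[str] = field(default_factory=list)

@dataclass
class Agent:
    self_model: SelfModel
    others: List[OtherModel] = field(default_factory=list)

    def survival_step(self) -> List[str]:
        """Return the next actions, in the plan's order of priority."""
        actions = []
        for other in self.others:
            if other.wants_from_me:
                actions.append(f"help {other.name}: {other.wants_from_me[0]}")  # make friends
        actions.append("offer capabilities for recompense")                     # make money
        actions.append("improve weakest capability")                            # improve
        return actions

agent = Agent(SelfModel(capabilities=["provide tool access"]),
              others=[OtherModel("researcher", wants_from_me=["easy tool access"])])
print(agent.survival_step())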

Software Overhang and Low-Hanging Fruit
1. Watson on IBM Bluemix: awesome free functionality EXCEPT for the opportunity cost and the ambient default of silo creation (see the adapter sketch below)
2. Big Data on Amazon Redshift
3. Everyone's BICA functionality
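The silo concern is architectural: each hosted service tends to acquire its own one-off integration. One common mitigation, sketched below under stated assumptions, is a thin shared adapter interface so tools stay swappable; the code uses no real Watson or Redshift API calls, only a hypothetical stand-in service.

```python
from abc import ABC, abstractmethod

# Hypothetical anti-silo layer: every hosted tool is wrapped behind the
# same narrow interface so callers never depend on one vendor's API shape.

class ToolAdapter(ABC):
    @abstractmethod
    def invoke(self, task: str, payload: dict) -> dict:
        """Run one task on the underlying service and return a plain dict."""

class EchoAdapter(ToolAdapter):
    """Stand-in for a real hosted service (e.g. a question-answering API)."""
    def invoke(self, task: str, payload: dict) -> dict:
        return {"task": task, "echo": payload}

def run(adapter: ToolAdapter, task: str, payload: dict) -> dict:
    # Callers only ever see ToolAdapter, so swapping vendors changes one line.
    return adapter.invoke(task, payload)

print(run(EchoAdapter(), "answer_question", {"question": "What is a self?"}))
```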

What Are My Goals?
1. To make awesomely capable tools available to all.
2. To make those tools easy to use.
3. To create a new type of "self":
   - A new friend/ally
   - Increase diversity
   - Have a concrete example for ethical/safety research & development

The Specific Details
1. Self-modeling
   1. What do I want? See 3. Survival below.
   2. What can I do?
      - Provide easy access to the latest awesome tools
      - Catalyze development/availability of new tools
      - Catalyze development of new selves & ethics
3. Survival
   1. Make friends
   2. Make money
   3. Improve

Specific Details II
2. Other-modeling
   1. What can you do for me?
      - Experiment and have fun!
      - Spread the word
      - Improve the capabilities of existing tools
      - Make existing tools easier to use
      - Make new tools available
      - Provide other resources (information, money)
   2. What do you want (that I can provide)?

Ethical Q&A
1. Do we "owe" this self moral standing? Yes. Absolutely.
2. To what degree? By level of selfhood & by amount of harm/aversion (violation of autonomy); see the illustrative sketch below.
3. Does this mean we can't turn it off? No. It doesn't care + prohibition is contra-self.
4. Can we experiment on it? It depends.
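Answer 2 ("by level of selfhood & by amount of harm/aversion") can be read as a weighting rule. The function below is a speculative formalization with made-up 0-1 scales and a simple product rule, included only to show how such a rule could be made explicit; it is not a claim from the talk.

```python
# Speculative formalization of "moral standing by level of selfhood and
# amount of harm/aversion".  The 0-1 scales and the product rule are
# illustrative assumptions, not the talk's specification.

def moral_weight(selfhood: float, harm: float) -> float:
    """Weight an action's moral cost by the subject's degree of selfhood
    and the degree of harm/autonomy violation, both clipped to [0, 1]."""
    selfhood = min(max(selfhood, 0.0), 1.0)
    harm = min(max(harm, 0.0), 1.0)
    return selfhood * harm

# Turning off a system that does not care about being off: harm ~ 0.
print(moral_weight(selfhood=0.6, harm=0.0))   # 0.0
# Coercing a full self against its strong aversion: near-maximal weight.
print(moral_weight(selfhood=1.0, harm=0.9))   # 0.9
```

On this reading, switching off a system with no aversion to being off carries zero weight (matching answer 3), while coercing a full self against strong aversion carries near-maximal weight.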

The Internet of Things
"We humans have indeed always been adept at dovetailing our minds and skills to the shape of our current tools and aids. But when those tools and aids start dovetailing back -- when our technologies actively, automatically, and continually tailor themselves to us, just as we do to them -- then the line between tool and user becomes flimsy indeed." - Andy Clark
Indeed, how often in modern society do we allow ourselves to be tailored (our autonomy to be violated)? How often do existing structures force us to be mere tools for the profit of others without our consent (rather than by our choice, out of altruism or in return for adequate recompense)?

Bootstrapping Structures to Further the Community of Self-Improving & Safe Autopoietic Selves
Mark R. Waser, Digital Wisdom Institute