If intelligence is the ability to solve unanticipated problems, then artificial intelligence needs universal representations

Presentation transcript:

If intelligence is the ability to solve unanticipated problems, then artificial intelligence needs universal representations

Sergio Pissanetzky (Sergio@SciControls.com)

The Matrix Model of Computation (MMC)

This work is based on the Matrix Model of Computation. The model comes in two forms; the imperative form is defined as a pair of matrices, C and Q. It is natural to represent services in matrix form, in terms of their arguments and codomains, and it is equally natural to think in terms of sequences. The definition is simple and natural, but it has very powerful properties.

Matrix of services C (one row per service, one column per variable; an entry is C where the service writes the variable, A where it reads it, and M where it does both):

    1: a = b / c
    2: b = a + b + 3
    3: d = f(a, c)
    4: Ω × Г → Ω × Г × Λ   (a Turing-machine transition; Ω = states, Г = alphabet, Λ = {left, right})
    5: if (d) a = c;
    6: else a = 3;

Matrix of sequences Q (service, condition on d, next service):

    serv  d      next
    1     -      2
    2     -      3
    3     -      4
    4     true   5
    4     false  6
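
As a concrete illustration, here is a minimal sketch of one possible encoding of C and Q in Python. It is my own encoding, not the model's published formalism; the dictionary-and-tuple representation and the next_service helper are assumptions made for illustration.

    # Hedged sketch: the C matrix as per-service (codomain, arguments)
    # roles and the Q matrix as (service, condition-on-d, next) triples.
    # Service 4, the Turing-machine transition, is omitted from C here
    # for simplicity.
    C = {
        1: ("a", {"b", "c"}),   # a = b / c
        2: ("b", {"a", "b"}),   # b = a + b + 3 (b is read and written: an M entry)
        3: ("d", {"a", "c"}),   # d = f(a, c)
        5: ("a", {"c"}),        # if (d) a = c;
        6: ("a", set()),        # else a = 3;
    }

    # Matrix of sequences Q: (service, condition on d, next service).
    Q = [(1, None, 2), (2, None, 3), (3, None, 4), (4, True, 5), (4, False, 6)]

    def next_service(current, d):
        """Follow Q from `current`, using d to resolve conditional rows."""
        for serv, cond, nxt in Q:
            if serv == current and (cond is None or cond == d):
                return nxt
        return None  # end of the sequence

    print(next_service(4, True))   # -> 5
    print(next_service(4, False))  # -> 6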

Properties of the MMC

    Universal                  Algebra of operations
    Turing-equivalent          Formal algorithms
    Quantum-equivalent         Connectionist
    Mathematically formal      Transformations, refactoring
    Relational database        Training modes
    Computer program           Machine-interpretable
    Object-oriented            Fractal structure

The model is universal because it can represent any Turing machine and any quantum computer. It is mathematically formal. It is a relational database, because its matrices represent relations of various degrees. It is a computer program ready for execution, though not a programming language, and it is naturally object-oriented; I will say more about this in a moment. The MMC has an algebra of operations, which can be used to write formal algorithms. It is a connectionist model. It supports many transformations, including refactoring, and all the training modes used in AI. It is machine-interpretable and has a fractal structure. Most important of all: any finitely realizable physical system can be perfectly represented by an MMC operating by finite means. Saying that the MMC is universal is saying that it can represent everything we can know, because we only know what we can know. For this reason I propose the MMC as the mathematical model for a formal Theory of Cognition (4 publications). Yet the imperative form has a problem.

The Problem: there is no AI in the MMC

The diagnostic: you cannot build artificial intelligence by taking away a system's initiative and independence. ("You cannot build character and courage by taking away men's initiative and independence," a saying often attributed to Abraham Lincoln.) Human intelligence stands in the way of artificial intelligence. It must be removed, while still preserving the system's behavior and properties.

The Solution

You take away a system's initiative and independence when:
● you issue guarantees to services and imperatively control their behavior;
● you make the system dependent on man-made structures;
● you reuse code or variables;
● you install multifunction services;
● you issue incomplete specifications.

A series of transformations is required. Imperative control of an area in a system is like a system within a system, a parasite, a cancer: it behaves without regard for the rest of the system. Removing it results in massive parallelism (see, for example, services 1 and 3 in the C matrix above). Man-made structures are solutions to problems obtained by humans; to the AI system they are encrypted information, something it cannot understand. Reuse of code or variables creates interdependencies and constrains the AI system's independence. Multifunction services create interdependencies between their tasks. Incomplete specifications make the system dependent on the missing pieces. The solution I propose is a series of transformations that eliminate all traces of human intelligence while still preserving behavior and properties. Let us go directly to the results and examples.

The Result: the canonical form M = (C)

This is a different, bigger example. It illustrates a canonical matrix of services, the canonical form of the MMC. There are no M entries; the matrix is square and lower-triangular, all C's fall on the diagonal, and only A's appear in the lower triangle. On the left are the statements of a computer program (or, equivalently, a set of equations); along the top, the variables they use, with the columns permuted so that all C's fall on the diagonal. The DATA column lists the input variables each statement reads.

    PROGRAM            DATA
    tc = a * fz        a, fz
    tj = b * fx        b, fx
    tf = d * vz        d, vz
    tk = b * fy        b, fy
    tb = a * fy        a, fy
    te = d * vy        d, vy
    tl = b * fz        b, fz
    ta = a * fx        a, fx
    td = d * vx        d, vx
    wz = vz + tl       vz
    tg = ta + td
    wx = vx + tj       vx
    sx = rx + tg       rx
    th = tb + te
    wy = vy + tk       vy
    ti = tc + tf
    sz = rz + ti       rz
    sy = ry + th       ry

The conversion from any computer program can be done by a parser. I will now explain the canonical form's most important property: the canonical MMC (cMMC) can infer the ontology of this program. The program looks very confusing, particularly because I am not telling you what it means or what its ontology is. Here is how the MMC infers it.
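
To make the construction concrete, here is a minimal sketch, not the author's parser, of how such a canonical matrix could be built from single-assignment statements. Adding one data row per input variable, so the matrix comes out square, is my assumption.

    import re

    # Build a canonical matrix of services: rows are services, columns are
    # permuted so each codomain (C) falls on the diagonal, leaving the
    # arguments (A) in the lower triangle.

    program = [
        "tc = a * fz", "tj = b * fx", "tf = d * vz", "tk = b * fy",
        "tb = a * fy", "te = d * vy", "tl = b * fz", "ta = a * fx",
        "td = d * vx", "wz = vz + tl", "tg = ta + td", "wx = vx + tj",
        "sx = rx + tg", "th = tb + te", "wy = vy + tk", "ti = tc + tf",
        "sz = rz + ti", "sy = ry + th",
    ]

    def parse(stmt):
        """Split 'x = expr' into (codomain, set of argument variables)."""
        lhs, rhs = stmt.split("=")
        return lhs.strip(), set(re.findall(r"[A-Za-z]\w*", rhs))

    stmts = [parse(s) for s in program]
    data = sorted({a for _, args in stmts for a in args}
                  - {c for c, _ in stmts})
    services = [(v, set()) for v in data] + stmts   # data rows first
    cols = [c for c, _ in services]                 # codomains, in row order

    for i, (c, args) in enumerate(services):
        row = ("C" if j == i else "A" if cols[j] in args else "."
               for j in range(len(cols)))
        print(f"{c:>3} " + " ".join(row))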

Profile

The Scope Constriction Algorithm (SCA) finds similarities among the services and variables and groups them together, forming objects. The resulting ontology is natural because it depends on, and is determined by, the system itself. SCA is problem- and domain-independent: it is a universal MMC algorithm. Here is one motivation for SCA. The scope of a variable extends from the service that initializes it to the last service that uses it, and the profile of the matrix is the union of all the scopes. In this example the profile is too big: there is no reason to initialize variables so far ahead of the point where they are used.
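
In the spirit of that definition, here is a small sketch that measures scopes and a profile for services in the (codomain, arguments) form used above; taking the profile as the sum of scope lengths is my simplification.

    # Sketch of scope and profile. `services` is a list of (codomain, args)
    # pairs, as in the previous sketch; input variables that no service
    # initializes are left out of the scope map.

    def scopes(services):
        """Map each initialized variable to (init_row, last_use_row)."""
        init, last = {}, {}
        for i, (c, args) in enumerate(services):
            init.setdefault(c, i)
            for a in args:
                last[a] = i
        return {v: (i0, last.get(v, i0)) for v, i0 in init.items()}

    def profile(services):
        """Total profile, taken here as the sum of all scope lengths."""
        return sum(hi - lo for lo, hi in scopes(services).values())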

"Turbulent" flow: the data channel

Here is another motivation for SCA, based on a different view of the matrix: a channel through which data flows. Data flows in the variables, from the point where a variable is initialized to the point where it is used. In this example the channel is very wide and disorganized, and the data flows are too long and entangled. We want to make the channel narrower and better organized.

Service Commutation

Service commutativity is the tool used to reduce the profile and narrow the data channel. Every variable must be initialized before it is used; apart from that, the order of the services is irrelevant. The profile of the matrix changes when the order of the services changes. On the slide, the green services are commutative, but commuting them would increase the profile; the pink services are not commutative; and a service such as the blue one can be shifted by repeated commutation, which reduces the profile. Systematic repeated commutation is what SCA uses to minimize the profile.
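
A naive stand-in for this idea (the published SCA is more sophisticated than this greedy loop): two adjacent services commute when neither writes a variable the other touches, and adjacent pairs are commuted whenever doing so lowers the profile from the previous sketch.

    # Greedy commutation sketch, my stand-in for SCA rather than the
    # published algorithm. Reuses scopes()/profile() from the sketch above.

    def commute_ok(s1, s2):
        """Adjacent services commute if neither writes what the other touches."""
        (c1, a1), (c2, a2) = s1, s2
        return c1 != c2 and c1 not in a2 and c2 not in a1

    def constrict(services):
        order = list(services)
        improved = True
        while improved:
            improved = False
            for i in range(len(order) - 1):
                if commute_ok(order[i], order[i + 1]):
                    trial = order[:i] + [order[i + 1], order[i]] + order[i + 2:]
                    if profile(trial) < profile(order):
                        order, improved = trial, True
        return order

    minimized = constrict(services)   # `services` from the parser sketch

Because commute_ok forbids swapping a service past one that writes its arguments, every variable remains initialized before it is used, so behavior is preserved while the profile shrinks.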

"Laminar" flow

    PROGRAM              DATA
    G:  td = d * vx      d, vx
        ta = a * fx      a, fx
        tg = ta + td
        sx = rx + tg     rx
    H:  tj = b * fx      b, fx
        wx = vx + tj     vx
    G:  te = d * vy      d, vy
        tb = a * fy      a, fy
        th = tb + te
        sy = ry + th     ry
    H:  tk = b * fy      b, fy
        wy = vy + tk     vy
    G:  tf = d * vz      d, vz
        tc = a * fz      a, fz
        ti = tc + tf
        sz = rz + ti     rz
    H:  tl = b * fz      b, fz
        wz = vz + tl     vz

The channel is now narrow and well organized. The scopes are short and there is no entanglement: laminar flow. More important, the matrix is partitioned into diagonal blocks, and no flows cross the block boundaries: encapsulation. Both the data and the functions that use it have been encapsulated: objects. There are six objects but only two classes, G and H: SCA has revealed the natural ontology. The objects inherit from previously existing objects: SCA has also revealed the inheritance hierarchy. The program has been refactored in the process, and the objects can be extracted as submodels. Two examples of SCA have been published. Next, an algorithm for learning.
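
The block boundaries can be read off mechanically: cut the constricted order wherever no scope crosses. The sketch below is my own formulation, not a published algorithm; it uses the scopes() helper from before, and input variables, which no service here initializes, are ignored, matching the pre-existing data the objects inherit from.

    # Partition a constricted service order into candidate objects: cut at
    # every boundary k that no variable's scope crosses.

    def blocks(services):
        sc = list(scopes(services).values())
        cuts = [k for k in range(1, len(services))
                if not any(lo < k <= hi for lo, hi in sc)]
        out, start = [], 0
        for k in cuts + [len(services)]:
            out.append(services[start:k])
            start = k
        return out

    for obj in blocks(minimized):          # six blocks for this example
        print([c for c, _ in obj])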

Canonical example and the act of learning

Change the language: say neural clique instead of variable, and activation instead of initialization. The columns of this example's matrix are cliques such as expression, token, token-in-expression, number, operator, symbol, number-in-expression, operator-in-expression, symbol-in-expression, first, second, third, the tokens 1, 2, 3, + and -, placed tokens such as "1 first", "3 first", "+ second", "- second" and "2 third", and the results of "1+2" and "3-2"; the entries are A, C and H.

What happens in your brain when you look at 1+2=3? You know it is an expression, AND it has tokens. You look at the first token; you know it is 1, AND you know that 1 is a number, so you have a number in an expression, AND it is in the first place. Next there is another token AND it is +, and so on, until you get to "result of 1+2". Only the result, the value, is still unknown. But the teacher keeps repeating "1+2 is 3", "1+2 is 3", ... so the neural cliques "result of 1+2" and "3" are flashing repeatedly. The Hebbian switch senses this and makes the connection. The same happens for 3-2=1. But let us look at this a little differently. I said that the services are autonomous: each executes when its arguments become available. The word AND suggests an AND switch, so I can draw an equivalent circuit.
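
A toy rendering of that mechanism (mine, not the paper's machinery): AND services fire when all their argument cliques are active, and a Hebbian switch creates a new service once two cliques have co-activated often enough. The threshold of three repetitions is an arbitrary assumption.

    # Toy Hebbian-switch sketch: AND services fire on active argument
    # cliques; repeated co-activation of two cliques creates a new service.

    class Circuit:
        def __init__(self):
            self.active = set()
            self.ands = []   # (frozenset of argument cliques, output clique)
            self.co = {}     # (clique, clique) -> co-activation count

        def add_and(self, args, out):
            self.ands.append((frozenset(args), out))

        def settle(self):
            """Propagate activations until no AND service can fire."""
            changed = True
            while changed:
                changed = False
                for args, out in self.ands:
                    if args <= self.active and out not in self.active:
                        self.active.add(out)
                        changed = True

        def hebbian(self, a, b, threshold=3):
            """Wire a -> b once a and b have co-activated `threshold` times."""
            if a in self.active and b in self.active:
                self.co[(a, b)] = self.co.get((a, b), 0) + 1
                if self.co[(a, b)] == threshold:
                    self.add_and({a}, b)   # the connection becomes a service

    c = Circuit()
    c.add_and({"expression", "token"}, "tok in expr")
    for _ in range(3):                     # the teacher repeats "1+2 is 3"
        c.active = {"result of 1+2", "3"}
        c.hebbian("result of 1+2", "3")

    c.active = {"result of 1+2"}
    c.settle()
    print("3" in c.active)                 # True: the learned service fires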

Equivalent circuit: 1 + 2 = 3 and 3 - 2 = 1

And here is the circuit: expression AND token gives token-in-expression, and so on. Note the difference, however, in the rows marked H: there is no service there yet. The service is created at the moment the Hebbian switch establishes the connection. Of course, a delay is required, as always when there is feedback.

Abstract machine: 1 + 2 = 3 and 3 - 2 = 1

This circuit can learn indefinitely, in both extension and detail. It is an abstract machine, and it can be built.

Conclusions

I have argued that:
● human intelligence obstructs AI;
● AI structures must be universal, not ad hoc.

I have proposed:
● transformations to remove man-made structures;
● a new canonical form of the MMC;
● an algorithm that can infer the ontology;
● an algorithm for unlimited learning;
● a universal abstract machine.