The Matrix Model of Computation (MMC)

Presentation transcript:

The Matrix Model of Computation (MMC)
Sergio Pissanetzky

I present the Matrix Model of Computation (MMC).

The MMC consists of two sparse matrices: M = (C, Q)
- C: the matrix of services
- Q: the matrix of sequences
The model has two forms: imperative and canonical.

The matrix model consists of two sparse matrices: the matrix of services C and the matrix of sequences Q. The model has two forms, imperative and canonical. This talk is restricted to the imperative form and to matrix C.

Matrix C: an example

[Slide: matrix C for the program below; one row per service, one column per variable, with each entry marked C or A.]

The program:

    tc = a * fz
    tj = b * fx
    tf = d * vz
    tk = b * fy
    tb = a * fy
    te = d * vy
    tl = b * fz
    ta = a * fx
    td = d * vx
    wz = vz + tl
    tg = ta + td
    wx = vx + tj
    sx = rx + tg
    th = tb + te
    wy = vy + tk
    ti = tc + tf
    sz = rz + ti
    sy = ry + th

Here is an example of matrix C. Matrix C describes a set of relations. Each row is a tuple in one of the relations, called a service. Each column is a variable used in the services. On the left of the slide I have listed a simple program, or perhaps a set of equations, and on the top I have listed all the variables. For each row, I have declared the role played by each variable: C where the service creates (initializes) it, A where the service uses it as an argument (for example, the service th = tb + te uses tb and te and creates th). The conversion can be done by a parser, and the matrix contains the same information as the program. If I look at a row, I can see what variables are used by that service and how they are used. If I look at a column, I can see the entire lifecycle of a variable. The pink extent on the slide is the scope of the variable.
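The row and column structure of matrix C just described can be sketched in code. This is a minimal Python illustration of my own, not the speaker's implementation: each service is encoded as the variable it creates plus the variables it uses, and a variable's lifecycle (the extent of its scope) is recovered from the row order.

```python
# Hypothetical encoding of matrix C: one tuple per row (service),
# giving the variable it creates ('C' entry) and the variables it
# uses ('A' entries), in the original program order.
services = [
    ("tc", ["a", "fz"]), ("tj", ["b", "fx"]), ("tf", ["d", "vz"]),
    ("tk", ["b", "fy"]), ("tb", ["a", "fy"]), ("te", ["d", "vy"]),
    ("tl", ["b", "fz"]), ("ta", ["a", "fx"]), ("td", ["d", "vx"]),
    ("wz", ["vz", "tl"]), ("tg", ["ta", "td"]), ("wx", ["vx", "tj"]),
    ("sx", ["rx", "tg"]), ("th", ["tb", "te"]), ("wy", ["vy", "tk"]),
    ("ti", ["tc", "tf"]), ("sz", ["rz", "ti"]), ("sy", ["ry", "th"]),
]

def lifecycle(var):
    """Scope of a variable: rows from its creation to its last use."""
    created = next(i for i, (c, _) in enumerate(services) if c == var)
    last_use = max((i for i, (_, a) in enumerate(services) if var in a),
                   default=created)
    return created, last_use

print(lifecycle("tl"))  # → (6, 9): created in row 6, last used in row 9
```

Reading a column of the matrix corresponds to calling `lifecycle` on one variable; reading a row corresponds to one tuple in `services`.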

The MMC is simple, yet very rich in features:
- Universal
- Natural ontology
- Mathematically formal
- Dynamic mode
- Turing-equivalent
- Self-organizing
- Quantum-equivalent
- Connectionist
- Relational database
- Massively parallel
- Computer program
- Data channel
- Algebra of operations
- Transformations, refactorings
- Formal algorithms
- Training modes

The MMC is simple, but very rich in features. In this talk I can address only two or three of them. In particular, I will argue that there are objects, classes of objects, and inheritance hierarchies hidden in matrix C, and I will present an algorithm that can reveal this natural ontology: the SCA algorithm.

The Scope Constriction Algorithm (SCA)

Matrix C contains the natural ontology of the system; the SCA algorithm finds it.

Profile of matrix C

[Slide: the profile of matrix C, the union of all variable scopes, shaded across the C and A entries.]

Here is one motivation for SCA. The profile is the union of all scopes. This profile is too big: there is no reason to initialize variables so far ahead of their use.

Data channel: "turbulent" flow

[Slide: the same matrix viewed as a channel through which data flows.]

Here is another motivation for SCA. A different view of the matrix shows a channel where data flows. Data is contained in variables. It flows from the C entry where the variable is initialized, down the variable's scope, is intercepted by the A entry of a service, and is conveyed horizontally to another C entry. At that point, the data is used to initialize another variable, and then discarded. This channel is very wide and disorganized: data flows are too long and entangled. I call it turbulent flow. The motivation is that I want to make the channel narrower and better organized.
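The quantity the algorithm will minimize can be made concrete. The sketch below is my own illustration, assuming each service row is a pair `(created_var, used_vars)` as in the talk's matrix C: the profile is the total length of all variable scopes, and it depends on the row order.

```python
# Sketch: the profile of matrix C as the total length of all variable
# scopes. A service is (created_var, used_vars); row order matters.
def profile(services):
    scopes = {}
    for i, (c, uses) in enumerate(services):
        scopes.setdefault(c, [i, i])        # creation row
        for v in uses:
            if v in scopes:                 # external inputs have no row
                scopes[v][1] = i            # extend scope to last use
    return sum(last - first for first, last in scopes.values())

# Two legal orderings of the same four services:
entangled = [("t1", []), ("t2", []), ("u1", ["t1"]), ("u2", ["t2"])]
grouped   = [("t1", []), ("u1", ["t1"]), ("t2", []), ("u2", ["t2"])]
print(profile(entangled), profile(grouped))  # → 4 2
```

Grouping each variable's creation next to its use halves the profile here, which is exactly the effect the speaker wants: a narrower, better-organized channel.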

Service Commutation

[Slide: matrix C with green, pink, blue, orange, and yellow services highlighted.]

Here is how SCA works: service commutation. Two adjacent services commute if swapping them does not reverse any initialization/use order. The green services are commutative; the pink services are not. Repeated commutation can be used to reduce the profile. The blue service can be shifted two steps up, first by commuting it with the orange service, then with the yellow service; the profile is reduced in all three columns tl, ta, and td. Some commutations reduce the profile, others increase it, and still others leave it unchanged. The algorithm systematically selects commutations that reduce the profile, and stops when no more such commutations are possible. The result is the minimum profile. But as the profile gets smaller, other things happen: the services coalesce into highly cohesive but weakly coupled clusters, forming objects. Service commutation is a refactoring operation.
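The commutation rule and the greedy profile-reducing loop just described can be sketched as follows. This is my own reading of the talk, not the speaker's implementation; the names `commutes`, `profile`, and `sca` are illustrative.

```python
# Sketch of service commutation and a greedy SCA pass.
# A service is (created_var, used_vars); row order is execution order.
def commutes(s1, s2):
    """Adjacent services commute if swapping them would not reverse any
    initialization/use order: s2 must not use the variable s1 creates."""
    return s1[0] not in s2[1]

def profile(services):
    scopes = {}
    for i, (c, uses) in enumerate(services):
        scopes.setdefault(c, [i, i])
        for v in uses:
            if v in scopes:
                scopes[v][1] = i
    return sum(b - a for a, b in scopes.values())

def sca(services):
    """Keep applying profile-reducing legal swaps until none remain."""
    improved = True
    while improved:
        improved = False
        for i in range(len(services) - 1):
            if commutes(services[i], services[i + 1]):
                cand = (services[:i] + [services[i + 1], services[i]]
                        + services[i + 2:])
                if profile(cand) < profile(services):
                    services, improved = cand, True
    return services

order = [("t1", []), ("t2", []), ("u1", ["t1"]), ("u2", ["t2"])]
print([c for c, _ in sca(order)])  # → ['t1', 'u1', 't2', 'u2']
```

Note how the result regroups each creation next to its use: the clusters that emerge from this purely local rule are the objects the talk describes.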

"Laminar" flow

[Slide: the refactored matrix, now partitioned into diagonal blocks labeled G and H.]

The refactored program:

    td = d * vx
    ta = a * fx
    tg = ta + td
    sx = rx + tg
    tj = b * fx
    wx = vx + tj
    te = d * vy
    tb = a * fy
    th = tb + te
    sy = ry + th
    tk = b * fy
    wy = vy + tk
    tf = d * vz
    tc = a * fz
    ti = tc + tf
    sz = rz + ti
    tl = b * fz
    wz = vz + tl

The channel is now narrow and well organized. The scopes are short, with no entanglement → laminar flow. More important: the matrix is partitioned into diagonal blocks. No flows cross the dotted lines → encapsulation! Both the data and the functions that use it have been encapsulated → objects! There are 6 objects, but only two classes → SCA has revealed the natural ontology! The objects inherit from previously existing objects → SCA has revealed the inheritance hierarchy. The program has also been "refactored". The objects can be extracted as submodels. Call G the big class and H the small class.
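The block partition that reveals the objects can also be sketched: a cut between two rows is an object boundary when no variable scope crosses it. This is my own illustration, again assuming the `(created_var, used_vars)` row encoding; it is not the speaker's algorithm.

```python
# Sketch: find the diagonal blocks (encapsulated objects) of an
# already-reordered matrix C. A cut is legal where nothing initialized
# above it is still used below it.
def blocks(services):
    """Partition an ordered list of services at rows no scope crosses."""
    last = {}                               # last row using each variable
    for i, (c, uses) in enumerate(services):
        last.setdefault(c, i)
        for v in uses:
            if v in last:
                last[v] = i
    out, start, horizon = [], 0, -1
    for i, (c, _) in enumerate(services):
        horizon = max(horizon, last[c])
        if horizon == i:                    # no scope reaches past row i
            out.append(services[start:i + 1])
            start = i + 1
    return out

# First rows of the refactored program: one G-like and one H-like object.
svc = [("td", []), ("ta", []), ("tg", ["ta", "td"]), ("sx", ["tg"]),
       ("tj", []), ("wx", ["tj"])]
print(len(blocks(svc)))  # → 2
```

Each returned block bundles data with the services that use it, which is the encapsulation the dotted lines on the slide express.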

Where did the objects come from?

The refactored code:

    G gx; sx = gx.go(a, d, fx, rx, vx);
    H hx; wx = hx.go(b, fx, vx);
    G gy; sy = gy.go(a, d, fy, ry, vy);
    H hy; wy = hy.go(b, fy, vy);
    G gz; sz = gz.go(a, d, fz, rz, vz);
    H hz; wz = hz.go(b, fz, vz);

The original code, which was not object-oriented, has been refactored into object-oriented code. Class G has a constructor and a method go, and so does class H. The code can be written in C++, Java, or C#. I can now rewrite matrix Q in terms of the new classes and repeat the whole procedure. I am going to leave you with a question: where did the objects come from? I didn't make them myself, and I didn't tell the algorithm to make objects... Where did the objects come from?

CONCLUSIONS AND OUTLOOK

Objects and inheritance occur naturally in systems, and they can be revealed by the SCA algorithm. This has impact on Computer Science, Software Engineering, refactoring, the Semantic Web, Artificial Intelligence, Biology, Neuroscience, and Linguistics. It is an amazing result, and I conjecture that our mind uses the same process to make objects.