Fractal Composition of Meaning: Toward a Collage Theorem for Language Simon D. Levy Department of Computer Science Washington and Lee University Lexington, VA

Part I: Self-Similarity

... But I know, too,
That the blackbird is involved
In what I know.

– Wallace Stevens

Part II: A Little Math

Iterated Function Systems
An IFS is another way to make a fractal:
Start with an arbitrary initial image.
Apply a set of contractive affine transforms.
Repeat until the image no longer changes.
E.g., the Sierpinski Triangle...

Sierpinski Triangle: i.e., three half-size copies, one in each of three corners of [0,1]².
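Below is a minimal sketch, mine rather than the talk's, of the deterministic IFS procedure just described, in Python/NumPy: start from an arbitrary black-and-white image on [0,1]², apply the three half-size corner maps, and repeat.

```python
import numpy as np

# The three affine maps of the Sierpinski IFS on [0,1]^2: each one shrinks
# the image by 1/2 and translates the copy into one of three corners.
OFFSETS = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5)]   # (x, y) corner translations

def apply_ifs(image):
    """One IFS iteration: the union of three half-size copies of `image`."""
    n = image.shape[0]
    half = image[::2, ::2]                 # half-size copy (nearest-neighbor)
    out = np.zeros_like(image)
    for dx, dy in OFFSETS:
        r, c = int(dy * n), int(dx * n)    # grid position of this copy
        out[r:r + n // 2, c:c + n // 2] |= half
    return out

# Start from an ARBITRARY initial image -- here, a solid square.
img = np.ones((256, 256), dtype=bool)
for _ in range(8):                         # cf. the 0..7-iteration slides
    img = apply_ifs(img)
# `img` now approximates the Sierpinski triangle; any nonempty start works.
```

Whatever nonempty image you start from, a handful of iterations lands visibly close to the same attractor, which is what the iteration slides below illustrate.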

Sierpinski Triangle: 0 through 7 Iterations (sequence of image slides).

Sierpinski Triangle: New Initial Image

Sierpinski Triangle

IFS Fractals in Nature

Fractal Image Compression
It doesn't matter what image we start with: all the information needed to represent the final “target” image is contained in the transforms.
So instead of storing millions of pixels, determine the transforms for the target image and store them.
How do we determine the transforms?

The Collage Theorem
Let A be the attractor, or “fixed point”, of a set of contractive transforms W.
Collage Theorem (Barnsley 1988): given an arbitrary target image T, transforms W encoding T are those for which the collage W(T) lies close to T; the attractor A of such a W is then itself close to T.
Use various practical methods to find W.
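For reference, a standard statement of the theorem (following Barnsley 1988); the notation below, with Hausdorff metric h and contractivity factor s, is mine rather than copied from the slides:

```latex
% Collage Theorem (Barnsley 1988), standard form.
% W is an IFS of contraction maps with contractivity factor s < 1 and
% attractor A; h is the Hausdorff distance between compact sets.
\[
  h\bigl(T,\, W(T)\bigr) \le \varepsilon
  \quad\Longrightarrow\quad
  h(T, A) \le \frac{\varepsilon}{1 - s}
\]
% So to encode a target T it suffices to find maps whose collage W(T)
% of shrunken copies covers T closely; iterating W then reproduces T.
```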

Practical Fractal Image Compression
Most real-world images are only partially self-similar.
But arbitrary images can be partitioned into “tiles”, each associated with a transform.
The compression algorithm computes and stores the locations and transforms of the tiles.
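Here is a heavily simplified sketch of that tile-based (partitioned) scheme, assuming a square grayscale NumPy array whose side is a multiple of 16; it skips the rotated/flipped domain blocks, overlapping domain search, and entropy coding that a real fractal codec would use.

```python
import numpy as np

R, D = 8, 16  # range (tile) size and domain-block size

def downsample(block):
    """Average 2x2 pixels: shrink a DxD domain block to RxR."""
    return block.reshape(R, 2, R, 2).mean(axis=(1, 3))

def compress(img):
    """For each RxR tile, find the domain block and affine (contrast s,
    brightness o) map that best reproduces it. Returns the transform list."""
    h, w = img.shape
    domains = [(i, j, downsample(img[i:i+D, j:j+D]))
               for i in range(0, h - D + 1, D)
               for j in range(0, w - D + 1, D)]
    code = []
    for i in range(0, h, R):
        for j in range(0, w, R):
            rng = img[i:i+R, j:j+R]
            best = None
            for di, dj, dom in domains:
                var = dom.var()
                s = 0.0 if var == 0 else ((dom - dom.mean()) * (rng - rng.mean())).mean() / var
                s = np.clip(s, -0.9, 0.9)          # keep the map contractive
                o = rng.mean() - s * dom.mean()
                err = ((s * dom + o - rng) ** 2).sum()
                if best is None or err < best[0]:
                    best = (err, di, dj, s, o)
            code.append((i, j) + best[1:])         # tile position + transform
    return code

def decompress(code, shape, n_iter=10):
    """Iterate the stored transforms from an arbitrary starting image."""
    img = np.zeros(shape)                          # any start image works
    for _ in range(n_iter):
        new = np.empty(shape)
        for i, j, di, dj, s, o in code:
            new[i:i+R, j:j+R] = s * downsample(img[di:di+D, dj:dj+D]) + o
        img = new
    return img
```

The stored code is just (tile position, domain position, contrast, brightness) per 8×8 tile, far smaller than the raw pixels; decompression works from any starting image, exactly as the Collage Theorem promises.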

Practical Fractal Image Compression

Part III: Unification

The “Two Cultures”
Discrete: symbols, semantic relations, grammar rules, graph structures (linguistics, AI, logic).
Continuous: vectors, metric spaces, continuous transforms, images (dynamical systems, chaos, electrical engineering).

Meanings as Vectors
“You shall know a word by the company it keeps” – J. R. Firth
The vector representation of a word encodes its co-occurrence with other words.
Latent Semantic Analysis (Indexing) – Singular Value Decomposition of a co-occurrence matrix built from text; 300-dimensional vectors [Landauer, Dumais, Kintsch]
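A toy sketch of the LSA recipe in Python/NumPy; the four “documents” and the 2-dimensional projection are illustrative stand-ins for the large corpus and 300-dimensional vectors cited on the slide.

```python
import numpy as np

docs = ["fred loves the woman",
        "the woman loves fred",
        "fred says the woman arrived",
        "the woman arrived"]

vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

# Word-by-document count matrix (real LSA usually applies tf-idf/entropy weighting).
M = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        M[index[w], j] += 1

# Truncated SVD: keep only the k largest singular values.
k = 2
U, S, Vt = np.linalg.svd(M, full_matrices=False)
word_vecs = U[:, :k] * S[:k]          # k-dimensional word vectors

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(word_vecs[index["fred"]], word_vecs[index["woman"]]))
```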

Meanings as Vectors
Self-Organizing Maps – collapse high-dimensional descriptions (binary features or real-valued vectors) into 2-D [Kohonen]
Simple Recurrent Networks – hidden-variable temporal model predicting the next word from the current one; 150-D vectors [Elman]
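A compact sketch of a Kohonen self-organizing map, assuming the inputs are real-valued feature vectors in a NumPy array; the grid size, learning-rate, and neighborhood schedules are arbitrary choices of mine, not Kohonen's published settings.

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0):
    """Fit a 2-D self-organizing map to `data` (n_samples x n_features)."""
    rng = np.random.default_rng(0)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # Grid coordinates, used by the neighborhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Best-matching unit: grid cell whose weight vector is closest to x.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(dists.argmin(), dists.shape)
            # Decaying learning rate and neighborhood radius.
            frac = step / n_steps
            lr = lr0 * (1 - frac)
            sigma = sigma0 * (1 - frac) + 1e-3
            # Gaussian neighborhood around the BMU, applied to every cell.
            d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            nb = np.exp(-d2 / (2 * sigma ** 2))[:, :, None]
            weights += lr * nb * (x - weights)
            step += 1
    return weights  # map each input to its BMU to get its 2-D coordinates
```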

Meanings as Vectors
Fred says the woman arrived. The woman says Fred arrived.
Fred loves the woman. The woman loves Fred.
The woman arrived. Fred arrived.
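A small sketch (in NumPy, not Elman's original code) of a simple recurrent network trained to predict the next word on toy sentences like the ones above; following Elman, the hidden state is copied forward as a context input and gradients are propagated through the current time step only. The hidden size and learning rate here are arbitrary, not the 150-D setup cited on the previous slide.

```python
import numpy as np

sentences = ["fred says the woman arrived", "the woman says fred arrived",
             "fred loves the woman", "the woman loves fred",
             "the woman arrived", "fred arrived"]
vocab = sorted({w for s in sentences for w in s.split()})
idx = {w: i for i, w in enumerate(vocab)}
V, H = len(vocab), 20
one_hot = np.eye(V)

rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.1, (H, V))   # input word -> hidden
Whh = rng.normal(0, 0.1, (H, H))   # context (previous hidden) -> hidden
Why = rng.normal(0, 0.1, (V, H))   # hidden -> next-word scores
lr = 0.1

for epoch in range(200):
    for s in sentences:
        words = [idx[w] for w in s.split()]
        h = np.zeros(H)                             # context units start at zero
        for t in range(len(words) - 1):
            x, target = one_hot[words[t]], words[t + 1]
            h_prev = h
            h = np.tanh(Wxh @ x + Whh @ h_prev)     # new hidden state
            p = np.exp(Why @ h)
            p /= p.sum()                            # softmax over next word
            # Backpropagate through this time step only (Elman-style truncation).
            dy = p - one_hot[target]                # d(cross-entropy)/d(scores)
            dh = (Why.T @ dy) * (1 - h ** 2)        # through the tanh
            Why -= lr * np.outer(dy, h)
            Whh -= lr * np.outer(dh, h_prev)
            Wxh -= lr * np.outer(dh, x)

# After training, the hidden vector built up over a sentence serves as its
# distributed representation; similar words and sentences get nearby vectors.
```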

Composing Meaningful Vectors: “the woman” → “loves the woman” → “Fred loves the woman” (sequence of image slides).
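To connect this back to Part II, here is a toy illustration of my own, not the model from the talk: each word is assigned a fixed contractive affine map on a meaning space, and a phrase's vector is built by applying its words' maps in sequence, exactly the way an IFS applies its transforms to a point. The dimensionality and random maps are placeholders.

```python
import numpy as np

DIM = 3
rng = np.random.default_rng(1)
_maps = {}

def word_map(word):
    """Assign each word a fixed contractive affine map v -> A v + b."""
    if word not in _maps:
        A = rng.normal(0, 1, (DIM, DIM))
        A *= 0.5 / np.linalg.norm(A, 2)   # spectral norm 0.5 => contractive
        b = rng.normal(0, 1, DIM)
        _maps[word] = (A, b)
    return _maps[word]

def compose(phrase, v=None):
    """Phrase meaning = result of applying its words' maps in sequence."""
    v = np.zeros(DIM) if v is None else v
    for w in phrase.split():
        A, b = word_map(w)
        v = A @ v + b
    return v

print(compose("the woman"))
print(compose("loves the woman"))
print(compose("fred loves the woman"))
```

Because every map is contractive, longer and longer compositions fall ever closer to the attractor of this word-map IFS, which is the picture behind the “Collage … for Language” slides below.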

Part IV: Conclusions

Advantages of Vector Representations
Meaning becomes a gradient phenomenon (semantic spaces = vector spaces).
All transforms can be represented by a single hidden-variable non-linear equation (the “grammar”).
Gradient-descent methods serve as a learning model.
A principled, biologically plausible alternative to the “Words and Rules” approach [Chomsky, Pinker, Fodor].

A Collage Theorem for Language

A Collage Conjecture for Language

A Collage Hypothesis for Language

A Collage S.W.A.G. for Language
Words/meanings are co-occurrence vectors.
Compositions of meanings are transforms on those vectors.
The “correct” set of transforms is one for which the word vectors form a subset of the attractor.