A Signal Processing Approach to Vibration Control and Analysis with Applications in Financial Modeling By Danny Kovach

An Overview of the Talk
- Special thanks
- General overview of the project
- Mathematical aspects of the theory
- Preliminary results

An Overview of the Project
- About this project – This project was funded by the FSGC grant for the purpose of vibration control in the Aries project.
- The Problem – To identify characteristically harmful signals in an effort to control and dampen them.
- The Solution – An algorithm was developed which scans for harmful signals.
- How it Works – The program compares an incoming signal, by means of a specified distance function, to fragments stored in memory.
- The Memory Structure – The program has an innate memory structure capable of storing signals, which can be predefined or randomly generated.
- Sorting the Memory Structure: Signal Matching – To facilitate signal retrieval, or recognition, the memory structure has the ability to sort itself, imposing an association between items in memory and thus lessening the overhead of an unordered search.
- Learning and Remembering – The program has the ability to assimilate new signals not previously stored in memory. It continuously re-sorts itself with respect to these new elements.
- Adaptation – The tolerance of the distance function and the signal recognition mechanism can be set autonomously by the program.

Mathematical Aspects
- Distance Function
- Topology
- Organization and Convergence

The Notion of a Signal
Let us consider functions of some time or frequency variable, and assume that we can measure them by some means. We can represent n samplings of such a function as an n-dimensional vector; when the context is clear, we call this representation a signal. Signals can also be scanned in real time using a sliding window.
(Fig. 1, [Jones, 2008])
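As a concrete illustration, here is a minimal C++ sketch of the sliding-window representation: a stream of measurements is cut into overlapping length-n vectors, each treated as a signal. The stream values and window length are illustrative, not from the talk.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

using Signal = std::vector<double>;  // n samplings as an n-dimensional vector

// Extract every length-n window from a sampled stream.
std::vector<Signal> slidingWindows(const std::vector<double>& stream, std::size_t n) {
    std::vector<Signal> windows;
    if (stream.size() < n) return windows;
    for (std::size_t i = 0; i + n <= stream.size(); ++i)
        windows.emplace_back(stream.begin() + i, stream.begin() + i + n);
    return windows;
}

int main() {
    std::vector<double> stream{0.0, 0.1, 0.3, 0.2, 0.0, -0.1};
    for (const Signal& s : slidingWindows(stream, 4)) {
        for (double x : s) std::cout << x << ' ';
        std::cout << '\n';
    }
}
```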

The Distance Function
Given two signals, we can calculate their distance, or homology, by means of a distance function h. Given a (finite) subset of functions, we can organize the pairwise values to obtain a memory structure (MS), which takes the form of a matrix A with entries a_ij = h(s_i, s_j). Similar signals have similar distances, and the diagonal is zero. An example MS is plotted graphically in the figure below.
(Fig. 2)
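The formula for h itself was an image on the original slide and is not recoverable from this transcript. The sketch below shows one plausible form, an assumption consistent with the later discussion of the floor function and the tolerance parameter: componentwise differences are quantized by a tolerance u before being summed, so signals closer than the tolerance compare as identical and the diagonal of the MS is zero.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

using Signal = std::vector<double>;

// Hypothetical distance ("homology") between two equal-length signals,
// using the floor function and tolerance parameter u (assumed form).
double h(const Signal& s, const Signal& t, double u) {
    double d = 0.0;
    for (std::size_t i = 0; i < s.size(); ++i)
        d += std::floor(std::fabs(s[i] - t[i]) / u);
    return d;
}

// Build the memory structure A with entries a_ij = h(s_i, s_j).
std::vector<std::vector<double>>
memoryStructure(const std::vector<Signal>& mem, double u) {
    std::vector<std::vector<double>> A(mem.size(), std::vector<double>(mem.size()));
    for (std::size_t i = 0; i < mem.size(); ++i)
        for (std::size_t j = 0; j < mem.size(); ++j)
            A[i][j] = h(mem[i], mem[j], u);
    return A;
}
```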

Inducing a Topology
To form a topology on the set of all signals, let us consider the definition of an open ball about a signal (see below). The set of all such open balls at all points serves as a base for a topology. Observe that the choice of the tolerance parameter adjusts the coarseness or fineness of the topological space. Since the MS is necessarily finite, all such topologies have finite cardinality, and the product of multiple topologies is itself a topology [Dshalalow, 2001]. We use these topological concepts to recover signals from memory.
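The open-ball formula was likewise an image on the slide; written out in the standard form it refers to, with distance h and tolerance ε:

```latex
\[
B_\varepsilon(s) \;=\; \{\, t \;:\; h(s, t) < \varepsilon \,\}
\]
```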

Organizing the MS
A linear search can be very time consuming. We organize the MS to aid signal recognition as follows (a sketch in code follows this list):
1. Choose an element of the MS, called the pivot.
2. Calculate the distance between every element and the pivot using h.
3. Arrange all signals into a vector ordered by their distance to the pivot.
We call this structure the derived memory structure.
(Movie 1)
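A minimal sketch of this derivation step, reusing Signal and the hypothetical h from the distance-function sketch above; the names here are illustrative.

```cpp
#include <algorithm>
#include <vector>

// Signal and h(s, t, u) are as sketched earlier.

struct Entry {
    Signal signal;
    double distToPivot;   // h(signal, pivot)
};

// Derive the ordered memory structure: compute each stored signal's distance
// to the chosen pivot, then sort by that distance.
std::vector<Entry> deriveMS(const std::vector<Signal>& mem,
                            const Signal& pivot, double u) {
    std::vector<Entry> derived;
    derived.reserve(mem.size());
    for (const Signal& s : mem)
        derived.push_back({s, h(s, pivot, u)});
    std::sort(derived.begin(), derived.end(),
              [](const Entry& a, const Entry& b) {
                  return a.distToPivot < b.distToPivot;
              });
    return derived;
}
```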

Convergence
The derived MS can be used to facilitate signal retrieval. The algorithm for signal retrieval, or recollection, is summarized as follows (sketched in code below):
1. Calculate the distance between the test signal and the pivot.
2. Find the set of signals in the derived MS that share this distance to within some tolerance.
3. Search this subset to find the signal in memory most closely related to the input signal.
The results of a linear search and of a search using the above algorithm are compared below.
(Fig. 3)
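The sketch below reuses Signal, Entry, and h from the previous sketches. Because the derived MS is sorted by pivot distance, the candidate band can be located by binary search, which is the source of the speedup over a linear scan; the band-then-scan structure follows the three steps above, while the details are assumptions.

```cpp
#include <algorithm>
#include <vector>

// Returns an iterator to the stored signal closest to `test`, searching only
// the band of entries whose pivot distance lies within `tol` of h(test, pivot).
std::vector<Entry>::const_iterator
recall(const std::vector<Entry>& derived,   // sorted by distToPivot
       const Signal& test, const Signal& pivot, double u, double tol) {
    const double d = h(test, pivot, u);
    auto lo = std::lower_bound(derived.begin(), derived.end(), d - tol,
        [](const Entry& e, double v) { return e.distToPivot < v; });
    auto hi = std::upper_bound(lo, derived.end(), d + tol,
        [](double v, const Entry& e) { return v < e.distToPivot; });
    auto best = derived.end();
    double bestDist = 0.0;
    for (auto it = lo; it != hi; ++it) {    // linear search, but only in the band
        double di = h(it->signal, test, u);
        if (best == derived.end() || di < bestDist) { best = it; bestDist = di; }
    }
    return best;                            // derived.end() if the band is empty
}
```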

The Dynamic Memory Structure (DMS)
We can employ the above theory to create the DMS, which consists of the following substructures:
- An ordered set of all signals (the derived memory structure).
- A list of distances.
- A histogram of the recollections of each element in the DMS.
- The tolerance parameter.
- The signal length.
The histogram is formed initially by assigning every element a constant value. Every time an element s′ is recalled from the DMS, its histogram entry is updated according to a rule governed by a rate parameter u (see the sketch below).
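A sketch of these substructures as a C++ struct, reusing Entry from earlier. The two histogram formulas were images on the original slide and are not recoverable from the transcript; the constant initialization and the multiplicative update with rate u below are assumptions suggested by the histogram.cpp demo on the next slide (entries start at 100, u = 0.1).

```cpp
#include <cstddef>
#include <vector>

struct DMS {
    std::vector<Entry>  derived;     // ordered set of signals (derived MS)
    std::vector<double> distances;   // list of distances to the pivot
    std::vector<double> histogram;   // recollections of each element
    double tolerance = 0.0;          // the tolerance parameter
    std::size_t signalLength = 0;    // the signal length
    double u = 0.1;                  // histogram rate parameter (assumed)

    void initHistogram(double c = 100.0) {   // assumed constant initialization
        histogram.assign(derived.size(), c);
    }
    void onRecall(std::size_t i) {           // element s' recalled from the DMS
        histogram[i] *= (1.0 + u);           // assumed update rule
    }
};
```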

Highlights of the DMS
The DMS can:
- Dynamically allocate memory.
- Constantly update the histogram with respect to recollections.
- Assimilate new elements.
- Delete unused elements.
- Automatically place a new element in its respective position in the ordered MS.
- Set the tolerance parameter with respect to a zero signal (a signal of negligible variation from the zero element).
- Adjust the tolerance parameter with respect to incoming signals.
The next several slides demonstrate the utilities of the DMS.

histogram.cpp
This program illustrates how the histogram works. Three elements are stored in memory, and their distances are calculated. Each element in the histogram is initialized to 100, and u = 0.1. The DMS is conditioned with the third element a total of 10 times; it correctly identifies the input signal each time and adjusts the histogram accordingly.

intuition.cpp
This program uses the histogram to predict the most likely next choice in a series of outcomes. The DMS is shown a series of occurrences of the numbers 1, 2, and 3, all stored in memory, with the outcomes weighted toward the number 3. The DMS is then asked to predict the next most likely occurrence by choosing the largest value in the histogram.
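The source of intuition.cpp is not included in the transcript, but the prediction step it describes amounts to an argmax over the histogram, as in this sketch (reusing the DMS struct from above):

```cpp
#include <algorithm>
#include <cstddef>

// Predict the next most likely outcome: the index of the largest histogram
// value. Assumes a non-empty histogram.
std::size_t predictNext(const DMS& dms) {
    auto it = std::max_element(dms.histogram.begin(), dms.histogram.end());
    return static_cast<std::size_t>(it - dms.histogram.begin());
}
```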

learntolerance.cpp
The DMS is shown a series of signals that are defined to be zero. It then determines the tolerance necessary to identify each of these signals with the pivot (the zero signal). The program can also set the tolerance from real-time input using a sliding-window technique.
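Again the source is not shown; one plausible rule, sketched below as an assumption, is to take as the tolerance the largest distance from any zero training signal to the pivot, so that every such signal falls inside the pivot's open ball.

```cpp
#include <algorithm>
#include <vector>

// Learn the tolerance needed to identify every "zero" training signal with
// the pivot. Signal and h are as sketched earlier; the rule is hypothetical.
double learnTolerance(const std::vector<Signal>& zeroSignals,
                      const Signal& pivot, double u) {
    double tol = 0.0;
    for (const Signal& s : zeroSignals)
        tol = std::max(tol, h(s, pivot, u));
    return tol;
}
```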

recollection.cpp
The first part of this program tests whether the DMS can correctly identify elements that are stored in memory. It is conditioned with a series of test signals known to be in memory, and the histogram indicates that the DMS has correctly identified the elements.

recollection.cpp
The second part shows that the DMS can correctly assimilate new signals into memory. The DMS is shown a new element; the output indicates that the system is unfamiliar with it and adds it to memory.

closedloop.cpp
This program sets the tolerance with respect to the zero signal. The tolerance is adjusted to keep the signal “in focus”:
- If signals are recalled abundantly, the tolerance is increased.
- If the DMS is learning too many new elements, the tolerance is decreased.
Here the zero signal class consists of signals composed of the elements 0.0, 0.1, 0.2, and 0.3. A sketch of this rule follows.
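In this sketch of the feedback rule, the recall/new-element counters and the step size are illustrative assumptions; the slide specifies only the direction of each adjustment.

```cpp
#include <cstddef>

// Closed-loop tolerance adjustment, reusing the DMS struct from above.
void adjustTolerance(DMS& dms, std::size_t recalls, std::size_t newElements) {
    const double step = 0.05;              // assumed adjustment step
    if (recalls > newElements)
        dms.tolerance *= (1.0 + step);     // recalled abundantly: widen focus
    else if (newElements > recalls)
        dms.tolerance *= (1.0 - step);     // learning too many: narrow focus
}
```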

closedloop.cpp
Next, the DMS is conditioned with more signals composed of the elements 0.0, 0.1, 0.2, and 0.3. The DMS identifies these with the pivot and changes the tolerance parameter in response. Consequently, other elements are recalled from memory.

A Discussion of the Method
What makes this method distinct?
- The distance function
  - Use of the floor function and the tolerance parameter.
  - The coarseness or fineness of the induced topology is constantly updated.
- Organization
  - All elements are organized with respect to a pivot.
  - This organization facilitates signal recollection.
- The system is dynamic
  - It can change its internal tolerance parameter.
  - It can assimilate new elements.
  - It can delete elements that are not recalled often.
  - It constantly reorganizes itself with respect to these changes.
- It keeps track of the elements that it recalls
  - This information can be used to make future predictions.

Future Developments
Several areas remain open at this point. Areas for improvement include:
- Testing signal recognition in real time, specifically
  - Using larger signals more pertinent to real-life data.
  - Using a DMS with a large number of signals.
- Investigating transformations. Promising transformations include
  - Fourier transforms.
  - The discrete derivative operation.
- Researching the effectiveness of multiple DMSs.
- Extending the capabilities beyond signal recognition to
  - Terrain mapping.
  - Object recognition.
  - Financial analysis.
  - More advanced forms of AI.
  - Storing actions as memory elements.

Conclusion
We studied the formulation of the DMS for the present purpose of signal recognition. A distance function was formulated, a topology was induced, and signal retrieval was discussed. Initial testing and future developments were also presented.

References