Ongoing software project, not “theory”
Encapsulated internals & interfaces
Today:
– Details of module internals
– Details of architecture & signaling/feedback
– Single, clean, simple inputs
– (26 slides)
Not yet:
– time
– noise
– robustness
– multiple/partial hypotheses
One “compressor”: a generic memory unit
– Learns about low-dim structure in high-dim data
– Converts live data between low-dim and high-dim
Hierarchy of compressors:
– Each learns from the compressed & combined output of those below
– Bi-directional (feedback)
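To make the unit concrete, here is a minimal interface sketch; the class and method names are illustrative assumptions, not the project's actual code.

```python
# Minimal interface sketch of one "compressor" unit (names assumed).
import numpy as np

class Compressor:
    """Generic memory unit: learns low-dim structure in high-dim data."""

    def learn(self, x_high: np.ndarray) -> None:
        """Update the internal point-cloud model from one sample."""

    def compress(self, x_high: np.ndarray) -> np.ndarray:
        """Map live data from high-dim to low-dim."""

    def decompress(self, x_low: np.ndarray) -> np.ndarray:
        """Map low-dim data back to high-dim (used as feedback)."""
```

In the hierarchy, each unit would learn from the combined compress() outputs of the units below it, and pass decompress() results back down as feedback.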
Compressor internals:
– Probability estimation
– Bi-directional mapping
– Matching to previous compression
– Compressing
– Quantizing & representing high-dim input
Quantizing & representing high-dim input
“Point” = position, weight, radius
Two point-clouds: mapping vs. learning (sync occasionally)
Online updates (sketched in code below):
1. Find the 3 closest cloud-points
2. Choose the lightest
3. Move it to absorb the new point, preserving center of mass
4. Increase weight
5. Update radius
6. (prune lightweight points)
Result: the point-cloud approximates the input cloud, with roughly equal weight per point
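A minimal sketch of this update, assuming a new sample carries unit weight and the radius is tracked as a running average (the slide does not give the exact update rules):

```python
# Online quantization sketch: absorb one sample into the point-cloud.
import numpy as np

def absorb(positions, weights, radii, x, alpha=0.1):
    """Mutates the cloud arrays in place to absorb sample x."""
    # 1. Find the 3 closest cloud-points to x.
    d = np.linalg.norm(positions - x, axis=1)
    nearest = np.argsort(d)[:3]
    # 2. Among those, choose the lightest.
    i = nearest[np.argmin(weights[nearest])]
    # 3. Move it to absorb x, preserving the combined center of mass:
    #    (w * p + 1 * x) / (w + 1), treating x as unit weight (assumed).
    positions[i] = (weights[i] * positions[i] + x) / (weights[i] + 1.0)
    # 4. Increase its weight.
    weights[i] += 1.0
    # 5. Update its radius toward the observed distance (rule assumed).
    radii[i] = (1.0 - alpha) * radii[i] + alpha * d[i]
    # 6. (Pruning of lightweight points is omitted in this sketch.)
```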
Compressing high to low (ISOMAP):
1. Find local distances in high-dim space
2. Create long-range distances from the shortest piecewise path (“geodesic”)
3. Link “islands” until all D_ij are defined
4. Diagonalize F(D_ij) to get the low-dim cloud (arbitrary coordinates)
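A compact sketch of these steps, taking F(D_ij) to be the double-centered -0.5 D_ij^2 of classical MDS (the standard ISOMAP choice, assumed here). The island-linking step 3 is omitted, so the neighborhood graph must already be connected:

```python
# ISOMAP sketch: geodesic distances + classical MDS.
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

def isomap(X, n_neighbors=5, out_dim=2):
    # 1. Local distances: keep only nearest-neighbor edges.
    D = cdist(X, X)
    G = np.zeros_like(D)                     # zeros = no edge
    for i in range(len(X)):
        nn = np.argsort(D[i])[1 : n_neighbors + 1]  # skip the point itself
        G[i, nn] = D[i, nn]
    # 2. Long-range distances: shortest piecewise path ("geodesic").
    D_geo = shortest_path(G, method="D", directed=False)
    # 4. Diagonalize the double-centered squared distances (classical MDS).
    n = len(X)
    J = np.eye(n) - 1.0 / n                  # centering matrix
    B = -0.5 * J @ (D_geo ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:out_dim]
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
```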
Keeping new maps consistent with old ones
The low-dim mapping is not always unique, so rotate & stretch the new cloud to minimize its distance from the old one (SVD).
[Figure: old cloud, new cloud, rotated new cloud]
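This is orthogonal Procrustes alignment with scaling; a sketch, assuming the two clouds have corresponding rows:

```python
# Rotate & uniformly stretch `new` to best match `old`, via SVD.
import numpy as np

def align(new, old):
    new_c = new - new.mean(axis=0)
    old_c = old - old.mean(axis=0)
    # SVD of the cross-covariance gives the optimal rotation.
    U, s, Vt = np.linalg.svd(new_c.T @ old_c)
    R = U @ Vt
    # Uniform stretch that minimizes the residual distance.
    scale = s.sum() / (new_c ** 2).sum()
    return scale * new_c @ R + old.mean(axis=0)
```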
Mapping new points using point-clouds
1. Find the new point’s closest 4-5 neighbors
2. Express it as their center of mass (SVD)
3. Construct the low-dim output from the corresponding neighbors & weights
4. Also works mapping low to high
[Figure: the new point written as a weighted sum of neighbors W1…W4 in both spaces]
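A sketch of this mapping: the center-of-mass weights come from a small constrained least-squares problem (solvable via SVD as the slide notes; numpy's lstsq is used here), and swapping the two clouds maps low to high the same way. The neighbor count k=4 is one choice from the slide's 4-5 range:

```python
# Out-of-sample mapping via center-of-mass weights over neighbors.
import numpy as np

def map_point(x, src_pts, dst_pts, k=4):
    # 1. Find the new point's k closest neighbors in the source space.
    d = np.linalg.norm(src_pts - x, axis=1)
    nn = np.argsort(d)[:k]
    # 2. Solve for weights w with sum(w) = 1 and w @ src_pts[nn] ~ x.
    A = np.vstack([src_pts[nn].T, np.ones(k)])
    b = np.append(x, 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    # 3. Apply the same weights to the corresponding destination points.
    return w @ dst_pts[nn]

# map_point(x_low, low_pts, high_pts) maps low -> high the same way.
```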
Probability estimation
Each point is the center of a gaussian of radius R_i, so a test point at distance r_i picks up
P_i = exp(-0.5 r_i^2 / R_i^2) / (R_i^D P_tot)
The “probability” of the test point is the sum over the local gaussians:
P = P_1 + P_2 + …
Probability = “closeness” to the manifold = how much to trust this point; it is used later when mixing estimates.
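A sketch of this estimate, treating P_tot as a fixed normalizing constant (an assumption; the slide does not define it further):

```python
# "Closeness" of a test point x to the manifold: sum of local gaussians.
import numpy as np

def closeness(x, positions, radii, p_tot=1.0):
    """P = sum_i exp(-0.5 r_i^2 / R_i^2) / (R_i^D * P_tot)."""
    D = positions.shape[1]
    r = np.linalg.norm(positions - x, axis=1)
    return np.sum(np.exp(-0.5 * (r / radii) ** 2) / (radii ** D * p_tot))
```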
Compressors interacting:
– Creating forward output
– Feedback mixed back in
– Settling
Creating output:
1. Map from high to low dim
2. Expose the result to all compressors above
3. Re-map the output backwards to high dim
4. Expose as feedback to the compressors below
Mix feedback into output:
1. Average the feedback from above
2. Get probabilities of the feedback and of the unit’s own output
3. Create a weighted mixture of them
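A sketch of these three steps; `closeness` here is the probability estimate sketched under the probability-estimation slide, and weighting purely by those probabilities is an assumption:

```python
# Blend a unit's own output with averaged feedback from above.
import numpy as np

def mix(own, feedbacks, closeness):
    fb = np.mean(feedbacks, axis=0)              # 1. average feedback
    p_own, p_fb = closeness(own), closeness(fb)  # 2. probability of each
    return (p_own * own + p_fb * fb) / (p_own + p_fb)  # 3. weighted mixture
```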
Updating and settling:
1. Expose the mixture as updated output, and map it downward as updated feedback
2. Iterate a few times to settle
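A toy, self-contained illustration of settling: the mixture is re-formed a few times until it stops changing. In the real system the feedback itself would also be re-mapped downward each iteration, which this toy omits; the closeness measure below is a stand-in:

```python
# Toy settling loop: repeatedly mix output with feedback.
import numpy as np

def settle(own, feedback, closeness, n_iters=5):
    x = own.copy()
    for _ in range(n_iters):                      # 2. iterate to settle
        p_x, p_fb = closeness(x), closeness(feedback)
        x = (p_x * x + p_fb * feedback) / (p_x + p_fb)  # 1. updated output
    return x

toy_closeness = lambda v: np.exp(-np.sum(v ** 2))  # stand-in trust measure
print(settle(np.array([1.0, 0.0]), np.array([0.2, 0.1]), toy_closeness))
```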
--- done with description of system ---
General simulation results:
– 3-layer hierarchy with 2-1 convergence
– Input is a 9x6 “pixel” space with random illumination
– Low-dim output displayed in 2-D color
Simple 1-dim illumination: how does each module map the input space?
Toroidal 1-dim illumination: how does each module map the circular input space?
2-dim spot illumination: how does each module map the 2-D input space?
“Hallucinating” spots driven from above:
1. Force activity at a single location in the top module
2. Let feedback move down
3. Look at what the lower modules think the input ought to be
2-dim clustered spots (left & right): how does each module map the 2-D input space?
Next steps
Architecture:
– Time
– Reference problem
– Reference platform
– Integration method
– Separate streams for transforms vs. objects
– Get people involved!
Algorithms:
– Noise
– Multiple hypotheses
– Distributed representation
– “Neurons”
– Better quantization, mapping, robustness