Vision Based Motion Control
Martin Jagersand, University of Alberta
CIRA 2001
Content
1. Vision-based motion control
2. Programming and solving whole human tasks
3. Software systems for vision and control
4. Discussion
1. How do we go from visual sensation to motor action?
Coordinate transforms are needed: Camera -> Robot, Robot -> Object.
Closed-loop (traditional) visual servoing of the end-effector (EE). This talk focuses on estimating the geometric transforms.
Lots of possible coordinate frames
Camera:
– Frame at the projection center
– Many different models
Robot:
– Base frame
– End-effector frame
– Object frame
Traditional modeling composes a chain of parameterized transforms: P = P_1(·) P_2(·) ... P_n(·)
Hand-eye system
Motor-visual function: y = f(x)
Jacobian: J = (∂f_i / ∂x_j)
Recall: visual specifications
Point-to-point task "error": the difference between current and goal feature positions. Why 16 elements? (E.g. eight tracked 2-D image points stacked into one vector.)
Visual servoing
Observed features: y ∈ R^m
Motor variables: x ∈ R^n
Local linear model: Δy ≈ J Δx
Visual servoing steps:
1. Solve J Δx = y* − y for Δx
2. Move: x ← x + Δx
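The two servoing steps above can be sketched with NumPy; `servo_step` is a hypothetical helper name, and the damping gain is an assumed tuning value:

```python
import numpy as np

def servo_step(J, y, y_goal, gain=0.5):
    """One image-based servoing step: solve J dx = (y* - y) in the
    least-squares sense, then return a damped motor correction."""
    e = y_goal - y                              # visual error (m-vector)
    dx, *_ = np.linalg.lstsq(J, e, rcond=None)  # 1. solve
    return gain * dx                            # 2. move (damped)
```

With a square, well-conditioned J this reduces the visual error by roughly the gain factor each step.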
Find J, method 1: test movements along the motor basis
Remember: J is an unknown m-by-n matrix.
Assume small test movements Δx_j = ε e_j along each basis direction.
Finite difference: column j of J ≈ (y(x + ε e_j) − y(x)) / ε
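Method 1 as a sketch (hypothetical helper name; `f` stands in for the camera measurement of features after a motor move):

```python
import numpy as np

def estimate_jacobian_fd(f, x, eps=1e-3):
    """Estimate the m-by-n motor-visual Jacobian with n test movements
    along the motor basis directions (forward finite differences)."""
    x = np.asarray(x, dtype=float)
    y0 = np.asarray(f(x))
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps                  # small test movement along axis j
        J[:, j] = (np.asarray(f(x + dx)) - y0) / eps
    return J
```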
Find J, method 2: secant constraints
A constraint along a line: each move/measure pair (Δx_k, Δy_k) must satisfy J Δx_k = Δy_k, which defines m equations.
Collect n arbitrary, but different, measures y and solve the resulting linear system for J.
Find J, method 3: recursive secant constraints
Based on an initial J and one measure pair (Δx, Δy), adjust J s.t. J_new Δx = Δy.
Rank-1 (Broyden) update: J_new = J + (Δy − J Δx) Δxᵀ / (Δxᵀ Δx)
Consider rotated coordinates:
– the update is the same as the finite difference for n orthogonal moves
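The rank-1 secant (Broyden) update, as a minimal sketch:

```python
import numpy as np

def broyden_update(J, dx, dy):
    """Rank-1 secant update: the smallest change to J (in Frobenius
    norm) such that the new J maps the observed move dx to dy."""
    dx = np.asarray(dx, dtype=float)
    dy = np.asarray(dy, dtype=float)
    return J + np.outer(dy - J @ dx, dx) / (dx @ dx)
```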
Trust region of the J estimate
Let Δ_t be the trust region at time t.
Define a model agreement: compare the predicted visual change J Δx with the actual change Δy.
Update the trust region recursively: grow it by d_upper when the model agrees, shrink it by d_lower otherwise, where d_upper and d_lower are predefined constants.
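A sketch of the recursive trust-region update; the agreement threshold `tol` and the default d_lower/d_upper values are assumptions for illustration, not constants from the talk:

```python
import numpy as np

def update_trust_region(delta, dy_actual, dy_pred,
                        d_lower=0.5, d_upper=2.0, tol=0.2):
    """Grow the trust region by d_upper when the Jacobian model
    predicted the visual move well, shrink it by d_lower otherwise."""
    err = np.linalg.norm(np.asarray(dy_actual) - np.asarray(dy_pred))
    scale = max(np.linalg.norm(dy_actual), 1e-12)
    agreement = err / scale          # relative prediction error
    return delta * (d_upper if agreement < tol else d_lower)
```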
Visual servoing steps
1. Solve J Δx = y* − y
2. Update and move: x ← x + Δx
3. Read the actual visual move Δy
4. Update the Jacobian with the rank-1 secant update
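The four steps combine into an uncalibrated servo loop; this is a self-contained sketch where `f` stands in for the real camera/robot measurement, and the gain and step count are assumed tuning values:

```python
import numpy as np

def visual_servo(f, x, y_goal, J, steps=50, gain=0.5, tol=1e-8):
    """1. solve J dx = error  2. move  3. read the actual visual
    move  4. rank-1 (Broyden) update of J.  Repeat until converged."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(f(x))
    for _ in range(steps):
        e = y_goal - y
        if np.linalg.norm(e) < tol:
            break
        dx, *_ = np.linalg.lstsq(J, gain * e, rcond=None)   # 1. solve
        x = x + dx                                          # 2. move
        y_new = np.asarray(f(x))                            # 3. read
        dy = y_new - y
        J = J + np.outer(dy - J @ dx, dx) / (dx @ dx)       # 4. update
        y = y_new
    return x, J
```

Note that the loop never needs a calibrated model of `f`; a rough initial J suffices.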
Jacobians = spline model of the underlying non-linear function
– Over time the system acquires several Jacobians J
– Each J is a hyperplane (a local linear model)
– The collection of J's forms a (sparse) piecewise-linear spline
Jacobian-based visual model
Assume m visual features with m >> n motor freedoms. All visual change is restricted to the n motor freedoms by Δy = J Δx, so:
1. We can predict visual change
2. We can also parameterize x visually
Related visual model: the affine model
Affine basis: an origin O with basis vectors e1, e2, e3.
The model tracks the image projection of the origin and the image basis.
Find affine coordinates
– Observe (track) a point y through time
– Solve an equation system to find its affine coordinates q
– Reprojection: given q, predict y in a new frame from the tracked origin O and basis e1, e2, e3
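A sketch of solving for q by stacking the affine equations y_t = O_t + B_t q over several tracked frames, where B_t holds the image basis [e1−O, e2−O, e3−O] at frame t (function names are illustrative):

```python
import numpy as np

def affine_coords(O_list, B_list, y_list):
    """Least-squares solve the stacked equations y_t - O_t = B_t q
    for the frame-independent affine coordinates q."""
    A = np.vstack(B_list)
    b = np.concatenate([np.asarray(y) - np.asarray(O)
                        for y, O in zip(y_list, O_list)])
    q, *_ = np.linalg.lstsq(A, b, rcond=None)
    return q

def reproject(O, B, q):
    """Reprojection: have q, want y in a new frame."""
    return np.asarray(O) + np.asarray(B) @ q
```

A single 2-D image gives only two equations for the three unknowns in q, so at least two frames are stacked.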
Relation between the affine and Jacobian image models: rewrite the affine model in Jacobian form.
Composite affine and Jacobian model
Chain the affine and Jacobian models. The composite represents rigid objects in an arbitrary motor frame.
Transforms between the affine and Jacobian models: the measurement matrix and the affine coordinate equation.
Experiment: Affine animation of rigid structure
Affine vs. Visual-Motor
Other sensory modalities: force and contact manipulation
Why vision alone is not enough:
– Accuracy is limited by visual tracking and visual goal specification
– Specifying well-defined visual encodings can be difficult
– Limited to non-occluded settings
– Not all tasks lend themselves to visual specification
Constraint geometry
– Impact force lies along the surface normal: P1
– Sliding motion direction: P2
– 3rd vector completes the frame: P3 = P1 × P2
Constraint frame
With the force frame equal to the tool frame we get the constraint frame [P1 P2 P3].
Assume frictionless contact => the frame can be updated at each time step.
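The constraint frame can be built directly from measurements, a sketch under the frictionless assumption (so the measured contact force gives the surface normal P1):

```python
import numpy as np

def constraint_frame(f_contact, v_slide):
    """P1 along the surface normal (impact force), P2 along the
    sliding motion, P3 = P1 x P2 completing the frame."""
    p1 = np.asarray(f_contact, dtype=float)
    p1 = p1 / np.linalg.norm(p1)
    v = np.asarray(v_slide, dtype=float)
    v = v - (v @ p1) * p1            # project sliding motion into the surface
    p2 = v / np.linalg.norm(v)
    p3 = np.cross(p1, p2)
    return np.column_stack([p1, p2, p3])
```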
Hybrid control law
Let Q be the joint-to-tool Jacobian and S a switching matrix, e.g. diag([0,1,1]).
Velocity control u combines a visual part, (I − S) u_vision, and a force part, S u_force.
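A sketch of the hybrid velocity law in the tool frame; the mapping through the joint-to-tool Jacobian Q is left out, and the function name is illustrative:

```python
import numpy as np

def hybrid_velocity(u_vision, u_force, S):
    """u = (I - S) u_vision + S u_force: S selects force-controlled
    directions, (I - S) the vision-controlled ones."""
    S = np.asarray(S, dtype=float)
    I = np.eye(S.shape[0])
    return (I - S) @ np.asarray(u_vision) + S @ np.asarray(u_force)
```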
Accounting for friction
The friction force is along the motion direction! Subtract that component out to recover the surface normal: f_n = f − (f · v̂) v̂, where v̂ is the unit motion direction.
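A sketch of the friction compensation: project the measured force onto the motion direction and subtract that component:

```python
import numpy as np

def remove_friction(f_measured, v_motion):
    """Friction acts along the motion direction; subtracting that
    component of the measured force recovers the surface-normal part."""
    v_hat = np.asarray(v_motion, dtype=float)
    v_hat = v_hat / np.linalg.norm(v_hat)
    f = np.asarray(f_measured, dtype=float)
    return f - (f @ v_hat) * v_hat
```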
Motion Sequence
Summary of model estimation and visual motion control
– Model estimation is on-line and requires no special calibration movements
– The resulting Jacobians both model/constrain the visual situation and provide the visual-motor transform
– Motion control goes directly from image-based error functions to motor control; no 3-D world space is needed
2. How to specify a visual task sequence?
Example sequences:
– 1. Grasp, 2. Move in, 3. Cut
– 1. Grasp, 2. Reach close, 3. Align, 4. Turn
Recall: parallel composition
Example: a wrench-alignment error function "spelled out" stacks several point-to-point differences of tracked features (y_i − y_j terms) into one error vector E(y).
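Parallel composition can be sketched as stacking several simple error terms into one vector (the feature indices here are illustrative, not the ones from the slide):

```python
import numpy as np

def parallel_error(y, pairs):
    """Stack point-to-point differences y[i] - y[j] for each (i, j)
    pair into one composite visual error vector E(y)."""
    return np.concatenate([y[i] - y[j] for i, j in pairs])
```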
Serial composition: solving whole real tasks
A task is a chain of primitives/"links", where each link specifies:
1. Acceptable initial (visual) conditions
2. Visual or motor constraints to be maintained during the motion
3. The final desired condition
"Natural" primitive links
1. Transportation
– Coarse primitive for large movements
– Up to 3-DOF control of the object centroid
– Robust to disturbances
2. Fine manipulation
– For high-precision control of both position and orientation
– 6-DOF control based on several object features
Example: a pick-and-place type of movement
3. Alignment? Needed to match the transport primitive's final conditions to the fine-manipulation primitive's initial conditions.
More primitives
4. Guarded move
– Move along some direction until an external constraint (e.g. contact) is satisfied
5. Open-loop movements
– When the object is obscured, or for ballistic fast movements
– Note: these can be done based on previously estimated Jacobians
Solving the puzzle…
Teaching and programming in visual space
1. Tele-assistance
– A tele-operator views the scene through stereo cameras
– Objects to be manipulated are pointed out on-line
2. Visual programming (off-line)
– Like xfig or MacPaint, but with a palette of motor actions
3. Teaching by showing
– A (human) manipulation is tracked in visual space
– The tracked data is used to (automatically?) generate a sequence of visual goals
HCI: direct manipulation
Example: the xfig drawing program
– Icons afford use
– Results are visible
– Direct spatial action-result mapping
Matlab drawing, by contrast:
line([10, 20],[30, 85]);
patch([35, 22],[15, 35], C);  % C complex structure
text(70,30,'Kalle');          % potentially add font, size, etc.
Example: Visual programming
Task control summary
Servoing alone does not solve whole tasks:
– Parallel composition: stacking of visual constraints to be simultaneously satisfied
– Serial composition: linking several small movements into a chain of continuous movements
Vision-based user interfaces:
– Tele-assistance
– Visual programming
– Teaching by showing
Types of robotic systems, arranged along axes of autonomy and generality: preprogrammed systems, programming by demonstration, tele-assistance, supervisory control.
3. Software systems for vision-based control
Hand-Eye System
System requirements
– Solve many very different motion tasks: flexible, teachable/re-programmable
– Real time: runs on special embedded computers or on general workstations
– Supports different special hardware and multiprocessors
Toolbox
System design
An interpreted "scripting" language gives flexibility; a compiled language is needed for speed and hardware interfaces. Example pairings:
– Matlab with dynamic linking (mex) to C, C++, Fortran
– Haskell with Greencard bindings to C, C++
– PVM with C, C++
Usage example
Specialize the robot:
  projandwait(zero3,'robotmovehill',A3D,'WaitForHill');
Initialize goals and trackers:
  [TrackCmd3D,N] = InitTrackers([1 1],[0,1]);
  PU = GetGoals([1 1],[0,1]);
Servo control:
  J3s = LineMove('projandwait',TrackCmd3D,J3i,PU,Ndi,err);
Software systems summary
Most current demos solve one specific movement. For solving many everyday tasks we need flexibility and reprogrammability:
– compiled primitives for visual tracking, combined with
– an interpreted scripting language, and
– higher-order functions
Workshop conclusions?
– Sensing is unreliable and incomplete: we can't reliably build internal 3-D world models, but we can use the real world as an external reference.
– A-priori object and world models are uncommon in human environments: estimate on-line, and only what's needed.
– Human users require human interaction techniques: interacting by visual pointing and gestures is natural.
Action/perception division in human and machine hand-eye systems
Open questions
– For particular tasks, what are the most natural representations and frames?
– Global convergence of arbitrarily composed visual error functions?
– Robustness?
– Interaction with other sensing modalities?
Feedback system: fast internal feedback, with slower external trajectory corrections.
Short and long control loops
Applications for vision in user interfaces
Interaction with machines and robots:
– Service robotics
– Surgical robots
– Emergency response
Interaction with software:
– A store or museum information kiosk
Service robots: mobile manipulators, semi-autonomous. Examples from DIST, TU Berlin, and KAIST.
TORSO with 2 WAMs
Service tasks
This is completely hardwired! No real task demonstration was found on the WWW.
But maybe the first applications are in tasks humans can't do?
Why is humanlike robotics so hard to achieve?
See the human task:
– tracking motion, seeing gestures
Understand:
– motion understanding: translate to the correct reference frame
– high-level task understanding?
Do:
– vision-based control