1
Learn how to make your drawings come alive… Lecture 3: SKETCH RECOGNITION. Analysis, implementation, and comparison of sketch recognition algorithms, including feature-based, vision-based, geometry-based, and timing-based recognition algorithms; examination of methods to combine results from various algorithms to improve recognition using AI techniques, such as graphical models.
2
Class Overview Paper discussion (led by Joshua Peschel) Discussion of previous homework and implementation issues Lecture on classification details of Rubine Introduction to Long Homework
3
Read Paper Discussion: Rubine (led by Joshua Peschel) Thoughts?
4
Class Overview Paper discussion (led by Joshua Peschel) Discussion of previous homework and implementation issues Lecture on classification details of Rubine Introduction to Long Homework
5
Homework Implementation Discussion Compare average feature values for each gesture type. What did people get?
6
Implementation Issues (Repeat Slide from Lecture 2) Tablets are faster now than they were; the Rubine paper was based on slower data (mouse?). But, for correct feel, the pen needs to be fast. Issues you may have:
–Duplicate location (2 consecutive points in the same place)
–Duplicate time (2 consecutive points at the same time)
–Divide by zero (because of the above problems)
–Make sure you convert to double before dividing in Java
–Remove the second point, not the first, for duplicate points
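A minimal Java sketch of this cleanup, assuming points arrive as x, y, and timestamp values (the Pt record and names are illustrative, not from the paper):

```java
// Filter out consecutive duplicate points (same location or same timestamp)
// before computing Rubine features, to avoid divide-by-zero errors.
import java.util.ArrayList;
import java.util.List;

public class PointCleanup {
    // A simple point record; field names are illustrative.
    static class Pt { double x, y; long t; Pt(double x, double y, long t) { this.x = x; this.y = y; this.t = t; } }

    static List<Pt> removeDuplicates(List<Pt> raw) {
        List<Pt> clean = new ArrayList<>();
        for (Pt p : raw) {
            if (!clean.isEmpty()) {
                Pt prev = clean.get(clean.size() - 1);
                boolean sameLocation = prev.x == p.x && prev.y == p.y;
                boolean sameTime = prev.t == p.t;
                if (sameLocation || sameTime) continue; // drop the second point, keep the first
            }
            clean.add(p);
        }
        return clean;
    }

    public static void main(String[] args) {
        int dx = 3, dy = 7;
        // Cast to double before dividing, otherwise Java performs integer division.
        double slope = (double) dy / (double) dx;
        System.out.println(slope);
    }
}
```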
7
Performing Recognition on these Features What do you think about the features? Can you foresee any recognition problems? Which features will be more helpful?
8
Rubine Classification Evaluate each gesture class c, 0 ≤ c < C. v_c = value = goodness of fit for gesture class c. Pick the largest v_c and return gesture class c.
9
Class Overview Paper discussion (led by Joshua Peschel) Discussion of previous homework and implementation issues Lecture on classification details of Rubine Introduction to Long Homework
10
Rubine Classification v_c = w_c0 + Σ_{i=1..F} w_ci · f_i, where w_c0 = initial weight of gesture class c, w_ci = weight for the i-th feature, and f_i = i-th feature value. Sum the weighted features together.
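A minimal Java sketch of this evaluation, assuming the weights have already been trained (array names are illustrative):

```java
// Evaluate v_c = w_c0 + sum_i w_ci * f_i for every class and return the best class.
public class RubineClassifier {
    // initialWeights[c] = w_c0; weights[c][i] = w_ci; both assumed precomputed during training.
    static int classify(double[] initialWeights, double[][] weights, double[] features) {
        int best = -1;
        double bestValue = Double.NEGATIVE_INFINITY;
        for (int c = 0; c < weights.length; c++) {
            double v = initialWeights[c];
            for (int i = 0; i < features.length; i++) {
                v += weights[c][i] * features[i];
            }
            if (v > bestValue) { bestValue = v; best = c; }
        }
        return best; // index of the gesture class with the largest v_c
    }
}
```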
11
Compute gesture covariance matrix How are the features of the shape related to each other? For one gesture class: take two features – see how much each feature differs from its mean – combine over all examples – that is one entry in the matrix. http://mathworld.wolfram.com/Covariance.html Is there a dependency (umbrellas/raining)?
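A minimal Java sketch of the per-class covariance, assuming examples[e][i] holds the i-th feature of the e-th training example of one gesture class (names are illustrative). Following the next slide, the entries are left as sums of deviation products rather than averages:

```java
// Per-class covariance: sum over examples of (f_i - mean_i) * (f_j - mean_j).
public class ClassCovariance {
    static double[][] covariance(double[][] examples) {
        int numExamples = examples.length;
        int numFeatures = examples[0].length;

        // Mean of each feature over this class's examples.
        double[] mean = new double[numFeatures];
        for (double[] example : examples)
            for (int i = 0; i < numFeatures; i++) mean[i] += example[i];
        for (int i = 0; i < numFeatures; i++) mean[i] /= numExamples;

        // Unnormalized covariance entries (sums, not averages).
        double[][] cov = new double[numFeatures][numFeatures];
        for (double[] example : examples)
            for (int i = 0; i < numFeatures; i++)
                for (int j = 0; j < numFeatures; j++)
                    cov[i][j] += (example[i] - mean[i]) * (example[j] - mean[j]);
        return cov;
    }
}
```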
12
Normalize cov(X) or cov(X,Y) normalizes by N-1, if N>1, where N is the number of observations. This makes cov(X) the best unbiased estimate of the covariance matrix if the observations are from a normal distribution. For N=1, cov normalizes by N. They don’t normalize here, for ease of the next step (so just a sum, not an average).
13
What does it mean to “Normalize”? Taking the average. But… we want to find the true variance. Note that our sample mean is not exactly the true mean. By definition, our data is closer to the sample mean than to the true mean, so the numerator is too small. So we reduce the denominator to compensate.
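As a concrete statement of this correction (a standard result, not spelled out on the slide), the unbiased sample variance divides by N-1 rather than N:

```latex
% Unbiased sample variance (Bessel's correction):
% dividing the summed squared deviations from the sample mean by N-1
% compensates for the sample mean being closer to the data than the true mean is.
s^2 = \frac{1}{N-1} \sum_{i=1}^{N} \left( x_i - \bar{x} \right)^2,
\qquad \bar{x} = \frac{1}{N} \sum_{i=1}^{N} x_i
```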
14
Common Covariance Matrix How are the features related across all the examples? Σ̄_ij = ( Σ_c Σ_c,ij ) / ( Σ_c E_c − C ), where E_c is the number of examples of class c. Top (numerator) = non-normalized total covariance, summed over classes. Bottom (denominator) = normalization factor = total number of examples − total number of gesture types (classes).
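A minimal Java sketch of pooling the per-class sums, assuming classCov[c] is the unnormalized covariance from the previous sketch and numExamples[c] is the number of training examples for class c (names are illustrative):

```java
// Common (pooled) covariance: sum the per-class covariance sums and divide by
// (total number of examples - number of classes).
public class CommonCovariance {
    static double[][] commonCovariance(double[][][] classCov, int[] numExamples) {
        int numClasses = classCov.length;
        int numFeatures = classCov[0].length;

        int totalExamples = 0;
        for (int n : numExamples) totalExamples += n;
        double denominator = totalExamples - numClasses;

        double[][] common = new double[numFeatures][numFeatures];
        for (int c = 0; c < numClasses; c++)
            for (int i = 0; i < numFeatures; i++)
                for (int j = 0; j < numFeatures; j++)
                    common[i][j] += classCov[c][i][j] / denominator;
        return common;
    }
}
```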
15
Weights w_cj = weight for the j-th feature of the c-th gesture class: w_cj = Σ_i (Σ̄⁻¹)_ij · f̄_ci. Sum, for each feature i:
–the (i, j) entry of the inverted Common Covariance Matrix
–times the average value f̄_ci of the i-th feature for the c-th gesture
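A minimal Java sketch of the weight computation, assuming the JAMA library recommended later in the slides is on the classpath and classMeans[c][i] holds the average value of feature i for class c (names are illustrative):

```java
// Per-class feature weights: w_cj = sum_i (commonCov^-1)_ij * mean_ci.
import Jama.Matrix;

public class RubineWeights {
    static double[][] featureWeights(double[][] commonCov, double[][] classMeans) {
        // Invert the common covariance matrix with JAMA.
        double[][] inv = new Matrix(commonCov).inverse().getArray();
        int numClasses = classMeans.length;
        int numFeatures = commonCov.length;

        double[][] weights = new double[numClasses][numFeatures];
        for (int c = 0; c < numClasses; c++)
            for (int j = 0; j < numFeatures; j++)
                for (int i = 0; i < numFeatures; i++)
                    weights[c][j] += inv[i][j] * classMeans[c][i];
        return weights;
    }
}
```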
16
Initial Weight Initial gesture weight w_c0 = −½ Σ_i w_ci · f̄_ci: −½ times the sum, over each feature in the class, of feature weight * average feature value.
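Continuing the previous sketch, a minimal computation of the initial weights (the −1/2 factor is from Rubine's paper; the slide text only mentions the weighted sum):

```java
// Initial weights: w_c0 = -1/2 * sum_i w_ci * mean_ci.
public class InitialWeights {
    static double[] initialWeights(double[][] weights, double[][] classMeans) {
        double[] w0 = new double[weights.length];
        for (int c = 0; c < weights.length; c++) {
            double sum = 0;
            for (int i = 0; i < weights[c].length; i++)
                sum += weights[c][i] * classMeans[c][i];
            w0[c] = -0.5 * sum; // the -1/2 factor comes from Rubine's formula for w_c0
        }
        return w0;
    }
}
```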
17
Rubine Classification Evaluate each gesture class c, 0 ≤ c < C. v_c = value = goodness of fit for gesture class c. Pick the largest v_c and return gesture class c.
18
Rubine Classification v_c = w_c0 + Σ_{i=1..F} w_ci · f_i, where w_c0 = initial weight of gesture class c, w_ci = weight for the i-th feature, and f_i = i-th feature value. Sum the weighted features together.
19
Rubine Adjustments: Eliminate Jiggle Any input point within 3 pixels of the previous point is discarded
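A minimal sketch of this filter, again using an illustrative point record:

```java
// Discard any input point within 3 pixels of the previously kept point.
import java.util.ArrayList;
import java.util.List;

public class JiggleFilter {
    static class Pt { double x, y; Pt(double x, double y) { this.x = x; this.y = y; } }

    static List<Pt> eliminateJiggle(List<Pt> raw) {
        List<Pt> kept = new ArrayList<>();
        for (Pt p : raw) {
            if (!kept.isEmpty()) {
                Pt prev = kept.get(kept.size() - 1);
                double dx = p.x - prev.x, dy = p.y - prev.y;
                if (Math.sqrt(dx * dx + dy * dy) < 3.0) continue; // within 3 pixels: drop it
            }
            kept.add(p);
        }
        return kept;
    }
}
```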
20
Rubine Adjustments: Rejection Technique 1 If the top two gestures are near to each other, reject. With v_i ≥ v_j for all j ≠ i, estimate the probability of correct classification as P̃(i|g) = 1 / Σ_j e^(v_j − v_i). Reject if less than 0.95.
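A minimal sketch of this check, assuming values[] holds v_c for every class and best is the index of the winning class:

```java
// Estimate the probability that the winning class is correct and reject
// ambiguous gestures whose estimate falls below 0.95.
public class AmbiguityRejection {
    static boolean reject(double[] values, int best) {
        double sum = 0;
        for (double v : values) {
            sum += Math.exp(v - values[best]); // each term <= 1; the winner contributes exactly 1
        }
        double probability = 1.0 / sum;
        return probability < 0.95;
    }
}
```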
21
Rubine Adjustments: Rejection Technique 2 Mahalanobis distance: the number of standard deviations gesture g is from the mean of its chosen class i; reject gestures that lie too far from the class mean.
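A minimal sketch of the squared Mahalanobis distance, under the same assumptions as before (inverse common covariance and per-class means already computed). The rejection threshold is left as a parameter since the slide does not give one:

```java
// Squared Mahalanobis distance of a feature vector from its chosen class mean,
// using the inverse of the common covariance matrix.
public class MahalanobisRejection {
    static double squaredDistance(double[] features, double[] classMean, double[][] invCommonCov) {
        int n = features.length;
        double d2 = 0;
        for (int j = 0; j < n; j++)
            for (int k = 0; k < n; k++)
                d2 += invCommonCov[j][k] * (features[j] - classMean[j]) * (features[k] - classMean[k]);
        return d2;
    }

    static boolean reject(double[] features, double[] classMean, double[][] invCommonCov, double threshold) {
        return squaredDistance(features, classMean, invCommonCov) > threshold; // threshold is a tunable parameter
    }
}
```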
22
Implementation Issues Java matrix class JAMA for Java: http://math.nist.gov/javanumerics/jama/ or (unused) for C++: http://www.techsoftpl.com/matrix/download.htm or http://www.programmersheaven.com/download/30784/download.aspx
23
Class Overview Paper discussion (led by Joshua Peschel) Discussion of previous homework and implementation issues Lecture on classification details of Rubine Introduction to Long Homework
24
Long features (from Rubine)
1. Cosine of initial angle
2. Sine of initial angle
3. Size of bounding box
4. Angle of bounding box
5. Distance between first and last points
6. Cosine of angle between first and last points
7. Sine of angle between first and last points
8. Total length
9. Total angle
10. Total absolute angle
11. Sharpness
25
Long Features: New
12. Aspect [abs(45° – #4)]
13. Curviness
14. Total angle traversed / total length
15. Density metric 1 [#8 / #5]
16. Density metric 2 [#8 / #3]
17. Non-subjective “openness” [#5 / #3]
18. Area of bounding box
19. Log(area)
20. Total angle / total absolute angle
21. Log(total length)
22. Log(aspect)
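A minimal sketch of how several of the new features derive from already-computed Rubine features; the 45° in the aspect feature follows Long's definition, angles are assumed to be in radians, and all variable names are illustrative:

```java
// A few of Long's additional features, derived from precomputed Rubine features.
public class LongFeatures {
    static double[] derivedFeatures(double boundingBoxAngle, double boundingBoxSize,
                                    double firstLastDistance, double totalLength,
                                    double totalAngle, double totalAbsoluteAngle,
                                    double boundingBoxArea) {
        double aspect = Math.abs(Math.toRadians(45) - boundingBoxAngle); // #12, angles assumed in radians
        double density1 = totalLength / firstLastDistance;               // #15: #8 / #5
        double density2 = totalLength / boundingBoxSize;                 // #16: #8 / #3
        double openness = firstLastDistance / boundingBoxSize;           // #17: #5 / #3
        double logArea = Math.log(boundingBoxArea);                      // #19
        double angleRatio = totalAngle / totalAbsoluteAngle;             // #20
        double logLength = Math.log(totalLength);                        // #21
        double logAspect = Math.log(aspect);                             // #22
        return new double[] { aspect, density1, density2, openness, logArea, angleRatio, logLength, logAspect };
    }
}
```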
26
Class Overview Paper discussion (led by Joshua Peschel) Discussion of previous homework and implementation issues Lecture on classification details of Rubine Introduction to Long Homework
27
Read Long paper – who wants to present? Implement Classifier Apply Rubine features to your classifier (w/out adjustments – save for weekend). Compute accuracy on Math data. Implement Long Features Apply Long features to your classifier (compute accuracy on Math data) Bring in Classification results