Introduction of Real-Time Image Processing

1 Introduction of Real-Time Image Processing
Parya Jandaghi, Prof. Arabnia, Spring 2016

2 Outline
Key Parameters in Image Processing
Differences between Real-Time and Non-Real-Time Image Processing
Examples of Real-Time Image Processing
Face Recognition
Emotion Recognition
QR Code Detection
Post Processing in Video Games
Speed Detection

3 Image Processing
Input -> Processor -> Output
Processing runs either one time or continuously.
The processor can extract data from the image, modify it, or add to it.
The output can be one image, a sequence of images, or an array of data.

4 Real-Time vs. Not Real-Time
Real-Time: output is produced simultaneously with the (continuous) input; the output has no value when delivered too late.
Not Real-Time: input is not continuous; processing time is not the priority.

5 Real-Time Image Processing – Multi-Resolution Encoding

6 Face Recognition Find a person in videos

7 Emotion Recognition

8 QR Code Detection

9 QR Code Decoding

10 QR Code Decoding

11 QR Code Detection

12 Post Processing in Video Games
Bloom Effect
Anti-Aliasing Effect

13 Bloom Effect in the real world

14 Bloom Effect in video games
Frame Buffer -> Binary Version -> Applied Gaussian Filter
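The pipeline on this slide (frame buffer, binary bright-pass version, applied Gaussian filter) matches the usual bloom post-process: isolate the bright pixels, blur them, and add the glow back onto the frame. A minimal sketch in Python with OpenCV; the brightness threshold, kernel size, and blend weight are illustrative values, not taken from the slides.

```python
import cv2

def bloom(frame, threshold=200, blur_size=(21, 21), intensity=0.6):
    """Simple bloom: bright-pass -> Gaussian blur -> additive blend.

    `threshold`, `blur_size`, and `intensity` are illustrative values.
    `frame` is an 8-bit BGR image.
    """
    # Bright-pass: keep only pixels whose luminance exceeds the threshold
    # (the slide's "binary version" of the frame buffer).
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    bright = cv2.bitwise_and(frame, frame, mask=mask)

    # Spread the bright regions with a Gaussian filter.
    glow = cv2.GaussianBlur(bright, blur_size, 0)

    # Add the glow back onto the original frame.
    return cv2.addWeighted(frame, 1.0, glow, intensity, 0)
```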

15 Bloom Effect in video games

16 Anti Aliasing Effect What is aliasing? Solution?

17 Anti Aliasing Effect

18 Anti Aliasing Effect Solution?

19 Anti Aliasing Effect
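Slides 16–19 pose the question and show images; as one possible answer: aliasing is the staircase look that appears when hard edges are sampled on a coarse pixel grid, and a classic solution is supersampling, i.e. rendering at a higher resolution and averaging blocks of samples down to the display resolution. A minimal NumPy sketch of that idea (the deck does not say which anti-aliasing technique it uses):

```python
import numpy as np

def supersample_aa(render, factor=4):
    """Downsample a `factor`x oversampled grayscale image by box-averaging.

    `render` has shape (H*factor, W*factor); the oversampling factor is
    an illustrative choice.
    """
    h, w = render.shape[0] // factor, render.shape[1] // factor
    # Group each factor x factor block of samples and average it, so edge
    # pixels take intermediate values instead of hard black/white steps.
    blocks = render[:h * factor, :w * factor].reshape(h, factor, w, factor)
    return blocks.mean(axis=(1, 3))

# Example: a hard-edged diagonal rendered at 4x resolution, then smoothed.
big = np.fromfunction(lambda y, x: (x > y).astype(float), (64, 64))
smooth = supersample_aa(big, factor=4)   # 16x16 image with softened edge
```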

20 Speed Detection Camera System Using Image Processing
Usage: measuring vehicle speeds on highways, in sports, competitions, etc.
Stages:
Object Detection Phase
Object Tracking Phase (Segmentation, Labelling, Center Extraction)
Speed Calculation Phase

21 Speed Controller
Diagram: Frame T, Frame T+1, ..., Frame T+29, Frame T+30 span a distance of 18 meters.
Video recorder at 30 frames per second -> time for 30 frames = 1 second.
Distance ~= 18 meters.
v = dx/dt -> Speed = Distance / Time = 18 meters / 1 second = 18 m/s ~= 40 mph.
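The slide's arithmetic can be wrapped in a small helper; the function name and arguments are illustrative, the unit conversion is standard.

```python
def speed_from_frames(distance_m, frame_count, fps=30):
    """Speed from the distance travelled over a number of video frames."""
    elapsed_s = frame_count / fps        # 30 frames at 30 fps -> 1 second
    speed_ms = distance_m / elapsed_s    # v = dx/dt
    speed_mph = speed_ms * 2.23694       # metres/second -> miles/hour
    return speed_ms, speed_mph

# 18 metres covered in 30 frames at 30 fps -> 18 m/s (about 40 mph).
print(speed_from_frames(18, 30))
```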

22 Extracting Motion: Frame n-1, Frame n, Difference
I(n, x, y) = color of pixel (x, y) in the nth frame
D(n, n-1, x, y) = 0 if |I(n, x, y) - I(n-1, x, y)| < epsilon (epsilon ~ 0), 1 otherwise
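A minimal NumPy sketch of this difference mask, assuming grayscale frames and an illustrative epsilon:

```python
import numpy as np

def difference_mask(frame_prev, frame_curr, epsilon=10):
    """D(n, n-1, x, y): 1 where the pixel changed by more than epsilon, else 0.

    Frames are grayscale uint8 arrays; `epsilon` is an illustrative
    threshold, not a value taken from the presentation.
    """
    diff = np.abs(frame_curr.astype(int) - frame_prev.astype(int))
    return (diff >= epsilon).astype(np.uint8)
```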

23 Extracting Motion: Frame n, Frame n+1, Difference
I(n, x, y) = color of pixel (x, y) in the nth frame
D(n+1, n, x, y) = 0 if |I(n+1, x, y) - I(n, x, y)| < epsilon (epsilon ~ 0), 1 otherwise

24 Extracting Motion: Difference of n-1 & n, Difference of n & n+1, Common
Common(n-1, n, n+1, x, y) = 1 if D(n, n-1, x, y) = 1 and D(n+1, n, x, y) = 1 (both pixels are white, so the common pixel is white), 0 otherwise
Equivalently: Common = D(n, n-1, x, y) * D(n+1, n, x, y)
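Reusing the difference_mask sketch above, the common mask is just the element-wise product of the two binary masks:

```python
def common_mask(d_prev, d_next):
    """Common(n-1, n, n+1): white only where both difference masks are white."""
    return d_prev * d_next   # element-wise product of the 0/1 masks

# Usage with three consecutive grayscale frames f0, f1, f2:
# d01 = difference_mask(f0, f1)
# d12 = difference_mask(f1, f2)
# moving = common_mask(d01, d12)
```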

25 Object Tracking: Object Segmentation
Scan the foreground image horizontally
Scan the foreground image vertically
First iteration

26 Object Tracking: Object Segmentation
Scan the foreground image horizontally
Scan the foreground image vertically
Second iteration
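One way to read the horizontal-then-vertical scanning on slides 25–26 is as row and column projections of the foreground mask: rows containing foreground pixels form bands, and within each band, columns containing foreground pixels form object boxes. A rough sketch of a single pass under that reading (the deck's exact procedure may differ):

```python
import numpy as np

def runs(projection):
    """(start, end) index pairs of consecutive non-zero entries in a 1-D projection."""
    active = np.flatnonzero(projection > 0)
    if active.size == 0:
        return []
    breaks = np.flatnonzero(np.diff(active) > 1)
    starts = np.concatenate(([active[0]], active[breaks + 1]))
    ends = np.concatenate((active[breaks], [active[-1]]))
    return list(zip(starts, ends))

def segment(mask):
    """One horizontal-then-vertical scan of a binary foreground mask.

    Returns (top, bottom, left, right) boxes; a sketch of the row/column
    scanning the slides describe, not necessarily the author's procedure.
    """
    boxes = []
    # Horizontal scan: rows that contain foreground pixels form bands.
    for top, bottom in runs(mask.sum(axis=1)):
        band = mask[top:bottom + 1]
        # Vertical scan inside each band: columns that contain foreground pixels.
        for left, right in runs(band.sum(axis=0)):
            boxes.append((top, bottom, left, right))
    return boxes
```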

27 Object Labelling
To keep track of the moving objects, labelling is essential: each object must be assigned a unique label and must keep that label unchanged from the moment it enters the scene (at frame F0) until it leaves the scene (at frame Fn).
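A simple way to keep labels stable in the way the slide describes is to match each object detected in the current frame to the nearest previously labelled object by its center point. A sketch only; the class name, matching rule, and distance gate are illustrative assumptions, not the presentation's method.

```python
import math

class Labeller:
    """Keep object labels stable across frames by nearest-center matching."""

    def __init__(self, max_jump=50.0):
        self.next_label = 0
        self.tracks = {}          # label -> last known center (x, y)
        self.max_jump = max_jump  # illustrative gating distance in pixels

    def update(self, centers):
        """Assign a label to every center detected in the current frame."""
        assigned = {}
        unmatched = dict(self.tracks)
        for c in centers:
            # Pick the closest previously seen object, if one is near enough.
            best = min(unmatched,
                       key=lambda lbl: math.dist(c, unmatched[lbl]),
                       default=None)
            if best is not None and math.dist(c, unmatched[best]) <= self.max_jump:
                label = best                 # same object, same label
                del unmatched[best]
            else:
                label = self.next_label      # a new object entered the scene
                self.next_label += 1
            assigned[label] = c
        self.tracks = assigned               # objects no longer seen are dropped
        return assigned
```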

28 Center Extraction
The object is now ready for the tracking phase. For efficiency, there is no need to track the whole object pixel by pixel; a single descriptive point representing the object is enough.
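The usual choice for that descriptive point is the centroid of the object's foreground pixels; a minimal NumPy sketch:

```python
import numpy as np

def center_of(mask):
    """Centroid (x, y) of the non-zero pixels in a binary object mask."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                 # empty mask: no object present
    return float(xs.mean()), float(ys.mean())
```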

29 Speed Calculation

30 Challenges
Dealing with noise
Object dismissal
Advantages compared to Doppler devices

31 Thank you 

