
1 Submitted by: Mark Gakman, Herzel Abramov. Supervisors: Ina Rivkin, Eli Shoushan, Vitaly Savuskan, Avi Hohama, Prof. Yael Nemirovsky

2

3 “Shooting detection in maximum light conditions.” We would like to use the newly available SPAD technology to build a device capable of reaching this goal.

4

5

6 Purpose – a device suitable for a helmet: ◦ Small size ◦ Low weight ◦ Low power

7 A definition of a “frame” is required for the algorithm. The SPAD emits a two-level digital signal: - ‘high’ in the presence of a photon (1.8 V) - ‘low’ in the absence of a photon (0 V)

8 While collecting all of the SPAD samples, we build the frame by counting the signal transitions (high->low or low->high, either works) and saving this count for each pixel (8 bits per pixel).

9 For example, if we count the ‘low’ to ‘high’ changes in the sequence 1->0->1->1->1->0->0->1->0->1, there are three such transitions, so the stored pixel value is 3 (see the sketch below).
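A minimal MATLAB sketch of this per-pixel counting, using the sample sequence above; the variable names and the 8-bit saturation are our own illustrative choices, not taken from the project code:
  bits = [1 0 1 1 1 0 0 1 0 1];               % SPAD output samples for one pixel (the example above)
  risingEdges = sum(diff(bits) == 1);         % count the 'low' -> 'high' transitions
  pixelValue  = min(risingEdges, 255);        % stored as an 8-bit count per pixel
  fprintf('pixel count = %d\n', pixelValue);  % prints 3 for this sequence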

10 Saving four frames (y1,...,y4). Defining: ◦ x1=y2-y1 ◦ x2=y3-y2 ◦ x3=y4-y3. If all three values are bigger than a given threshold => suspicious scene (a sketch follows below).
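A minimal MATLAB sketch of this per-pixel test; the example counts and the threshold value are illustrative assumptions:
  y = [64 100 150 230];             % four consecutive counts (y1..y4) of one pixel
  x = diff(y);                      % x1 = y2-y1, x2 = y3-y2, x3 = y4-y3
  threshold = 20;                   % assumed value, set by the algorithm parameters
  suspicious = all(x > threshold);  % suspicious scene only if all three differences exceed it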

11 Main parameters on which our algorithm is based: 1. mu – the “average” of the frame-subtraction products, which we subtract from our X’s. 2. sigma – the “variance”, against which we compare our X’s. 3. c – a parameter that multiplies the variance.

12 The algorithm divides into 2 categories: 1. NON-ADAPTIVE: mu and sigma are given. 2. ADAPTIVE: mu and sigma are adapted through the algorithm, updating with every new frame (second part of the project). (c is given in both cases.) A sketch follows below.
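A minimal MATLAB sketch of the detection test with mu, sigma and c, including a simple running update of mu and sigma for the adaptive case; the starting values and the exponential-averaging update rule (weight alpha) are illustrative assumptions, not the project's exact formulas:
  x = [36 50 80];                           % frame differences (x1, x2, x3) from the previous sketch
  mu = 2; sigma = 4; c = 3; alpha = 0.05;   % assumed parameters and update weight
  detect = false(1, numel(x));
  for k = 1:numel(x)
      detect(k) = (x(k) - mu) > c * sigma;                  % subtract mu, compare against c times sigma
      mu    = (1 - alpha) * mu    + alpha * x(k);           % adaptive update of mu
      sigma = (1 - alpha) * sigma + alpha * abs(x(k) - mu); % adaptive update of sigma
  end
  suspicious = all(detect);                 % suspicious scene if every difference triggers
In the non-adaptive case the two update lines are simply skipped and mu, sigma keep their given values.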

13 First, a reminder of the PCB

14

15

16 Now, a reminder of the FPGA Block Design

17

18 Gets data from the imager and arranges it into frames. Each frame: 64 us / 0.01 us = 6400 clks.

19

20

21 Processing one pixel at a time -> LESS POWER. 6400 clks / 1024 pixels = 6 clks (rounded down) to work on each pixel until the next frame arrives (working with a width of 8 bits). See the sketch below.
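A minimal MATLAB sketch of this timing budget, just restating the arithmetic from slides 18 and 21 (64 us frame, 0.01 us clock period, 1024 pixels):
  framePeriod  = 64e-6;                             % one frame every 64 us
  clkPeriod    = 0.01e-6;                           % 0.01 us clock period (100 MHz)
  clksPerFrame = round(framePeriod / clkPeriod);    % 6400 clocks per frame
  numPixels    = 1024;
  clksPerPixel = floor(clksPerFrame / numPixels);   % 6 clocks available per pixel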

22

23 Now to the implementation in the VIVADO tool

24 The Vivado software was used in order to integrate our system and implement it (gate level). Main blocks of the design: BRAM, ALGORITHM, FRAME ACQUIRE.

25

26 The simulations of the code were comprehensive; we’ll show a few highlights: ◦ 1. Example of frame building. ◦ 2. End of frame. ◦ 3. Algorithm detection.

27 1. Example of frame building: the first time data is received, and a second time (after 32 cycles) when more data is received. The waveforms show clk, Data (spad) and Data (bram).

28 2. End of frame – the frame-ready signal, and the change of the BRAM enables:

29 3. Algorithm detection: first, the constants are defined as given by Avi. The detection itself: after providing data that has an event in the 32nd pixel:

30 Hardware setup: LED board, PCB + ZedBoard, LED pipe.

31 Checking every port – successful. Voltage check: ◦ Vcc, GND – successful ◦ LDO and DC-DC – faced some issues. As a result, we used an external power supply via an external connector on our PCB.

32 A lot of hard work was required in order to receive data from the SPAD (a very sensitive sensor). Before powering up the device one must be extremely careful – only Vitaly is authorized to power it up. Using a scope, we were able to confirm the SPAD was “alive”.

33 Our system includes: ◦ PCB connected to ZedBoard (FPGA board) ◦ Verilog implementation in Vivado ◦ LED board ◦ MATLAB script. The system is able to sample data from the SPAD, build it into a frame and run the algorithm. Using the chip-scope we can pull a sample and a frame from the SPAD to the computer and convert them to a grayscale image (MATLAB script).

34 Using this tool enables us to view a sample or a frame of the SPADs using triggers: ◦ A sample – the trigger is on the row address ◦ A frame – the trigger is on the BRAM enables (which change with each finished frame).

35 Checking the signals on the hardware using the Vivado logic-analyzer tool. Notice the BRAM enable signals toggling, and the BRAM data changing as the SPAD data rises.

36 After completing the set-up work of the whole system, we configured the LED with a proper signal and faced it towards the lens. Unfortunately – no detection.

37 Since the detection failed (after simulations and debugging), Avi suggested that the problem is in the optics of the lens. We wrote a MATLAB program which takes a frame from our system (via Vivado) and shows it as a grayscale image (a sketch follows below). Using these images, it is possible to check: ◦ Whether the light on the SPAD is focused ◦ How strong the light from the LED is and how much it influences the frame.
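A minimal MATLAB sketch of that grayscale step; the random stand-in data and the assumption that the 1024 pixels form a 32x32 array are ours, and the real script reads the frame exported from Vivado instead:
  counts = randi([0 255], 1, 1024);                 % stand-in for a captured 1024-pixel frame of 8-bit counts
  frame  = reshape(counts, 32, 32);                 % assumed 32x32 layout of the 1024 pixels
  imagesc(frame); colormap(gray); axis image off;   % white = many counts (255), black = none (0)
  fprintf('AVG = %.0f\n', mean(counts));            % average count over the frame, as quoted on the next slides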

38 White – a lot of counts (255 max). Black – no counts (0 min). AVG = 64. *The black columns, especially on the left, are due to missing connections in the SPAD device.

39 White – a lot of counts (255 max). Black – no counts (0 min). AVG = 75. *The black columns, especially on the left, are due to missing connections in the SPAD device.

40

41 We can definitely see the difference between the two cases.

42 Although we saw the effects of the LED, the bad news is that they are spread over all of the working pixels! (We want the light focused on a few pixels in order to get a detection.) In order to calibrate the lens, one will have to build REAL-TIME software that takes the frames from the SPAD and turns them into informative images.

43 As mentioned before, a real-time image generator is required for the lens-calibration process. After completing the task of detection, the next goal is to transfer the frames to the ARM unit.

44 Our system includes: ◦ PCB connected to ZedBoard (FPGA board): the SPAD itself, a lens for light focus, a pipe for a dark environment, power cables, an external connector for the LED PCB ◦ Verilog implementation in Vivado: RTL code provided (Verilog) ◦ LED board: can be configured by a wave generator ◦ MATLAB script: able to generate a grayscale image

45 The system is able to sample data from the SPAD, build it into a frame and run the algorithm. The system is able to produce: - A snapshot of the SPAD - A whole frame generated by the SPAD - A grayscale image of the frame (MATLAB) - Additional triggers can be configured. All the system functions were approved by Avi.

46 Thank you for listening! ◦ Questions?

