1 OptiCane Group 18 ECE 445 Spring 2019
Angela Park Christian Reyes Yu Xiao Zhang

2 Introduction 36 million people are blind worldwide[1]
Current solutions include volunteer guides, service dogs, and white canes
White canes provide independence at relatively low cost
Christian: The purpose of this project is to explore the problem space for aiding individuals with visual impairment. The World Health Organization estimates that 36 million people worldwide are blind. Current aids include volunteer guides, service dogs, and traditional white canes. Of the three, the white cane maintains the user's independence while also being cost effective. For our project, we decided to improve upon the traditional design of the white cane, as Angela will describe on the next slide.

3 Objective Use sensor array for object detection in urban environments
Create haptic feedback in the form of vibration patterns of differing intensity
Angela: For our design, we wanted to retain much of the white cane's existing operation, such as the tapping and scraping motions the user performs to gather environmental context. To improve on this, we pursued an approach that gives the user more information about the relative size of surrounding objects and his or her distance from them, without sacrificing that functionality. Using an array of sensors and haptic motor feedback, we devised a system that uses differing pulse patterns and intensities to indicate these object characteristics in an easy-to-learn format. Shawn will now discuss the major design considerations we made to complete the physical project.

4 Design Now, we will go over our design in its entirety.

5 Physical Layout Sensors spaced 13 inches (33 cm) apart along the cane
Cane must be held by the user at a 45° angle to keep the sensor beams parallel to the ground
Shawn: For the final product, we used a cane 1 m (about 40 inches) long with a handle at least 10 cm long. Along the cane, we placed the sensors 33 cm apart, with the lowest sensor 5 cm from the bottom. Each sensor is mounted at a 45 degree angle from the cane so that, with the cane held at 45°, the IR beams run parallel to the ground.
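As an illustrative check of this geometry (an assumption-based calculation, not from the slides): with the cane held at 45°, a sensor mounted s cm up the cane sits at roughly s × sin 45° above the ground, and the 45° mount keeps its beam parallel to the ground at that height.

```cpp
// Illustrative geometry check (not from the slides): beam height above the
// ground for each sensor, assuming the cane is held at 45 degrees and each
// sensor's beam leaves the cane parallel to the ground.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi        = 3.14159265358979;
    const double kAngleRad  = 45.0 * kPi / 180.0;
    const double kSpacingCm = 33.0;  // sensor spacing along the cane
    const double kOffsetCm  = 5.0;   // lowest sensor, measured from the tip

    for (int i = 0; i < 4; ++i) {
        double alongCane = kOffsetCm + i * kSpacingCm;       // cm up the cane
        double heightCm  = alongCane * std::sin(kAngleRad);  // cm above ground
        std::printf("sensor %d: %.0f cm up the cane, beam about %.0f cm above the ground\n",
                    i + 1, alongCane, heightCm);
    }
    return 0;
}
```

Under these assumptions the four beams scan at roughly 4, 27, 50, and 74 cm above the ground, which is what lets the highest tripped sensor indicate obstacle height.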

6 Physical Layout Robust design while remaining lightweight
Aluminum construction
Rigid yet flexible enough to withstand tapping and bumping
Christian: Pictured here is our final product. As planned, the sensors are spaced accordingly and angled so that the projected beams are parallel to the ground. The main PCB is housed within the black plastic box mounted on the cane near the handle, and all of the wires are routed inside the cane itself. A few wires also lead from the box up to the vibration wearable in the bracelet attached to Shawn's wrist. We decided the cane would be primarily aluminum for its robustness and light weight. In addition, we kept the tip of the cane a smooth plastic to reduce friction with the ground so that a user can still feel the texture of different surfaces.

7 Hardware Overview Power Supply - 9V Li-Ion battery
Control Unit - microcontroller
Sensor Module
Feedback Module
Christian: Now we are going to break down each of the modules we needed to create for this project. On the next slide, Angela will show our block diagram of how these modules interact with each other and describe them in further detail.

8 Block Diagram Angela: Pictured here is our block diagram. We decided there would be four major subsystems to implement. Our power unit consists of a 9V battery and voltage regulators. The control unit is the brain of the project and contains our microcontroller, which coordinates communication between the components. The sensor module contains an array of 4 LIDAR sensors, and our haptic feedback module contains our vibration disc, which is controlled by a motor driver.

9 Power Supply 9V Li-Ion battery housed in the PCB enclosure (removable)
5V, 3.3V, and 2.8V voltage regulators
Angela

10 Total Current Draw
Device (quantity): current
Laser sensor, active ranging mode (4): 76 mA
Haptic motor driver, average battery current during operation (1): 2.5 mA
Motor disc, maximum rated current (1): 60 mA
Microcontroller, 16 MHz at 5 V (1): 10 mA
Total: 148.5 mA

11 Power Supply (Results)
Starting voltage: 9.69 V
Ending voltage: 8.57 V (after 3 hours)
Angela

12 Control Unit ATmega328p 16 MHz clock, 5V operating voltage Angela:
The ATmega328p was a natural choice for our project because it is easily programmable, handles I2C communication, and is inexpensive. There is also a lot of community support for the chip, so most problems or questions that came up could be resolved quickly with further research. To program the chip, we installed a bootloader onto it directly and programmed it using the Arduino IDE. Another benefit of the DIP package version of the ATmega is that it can be swapped in and out of a socket soldered directly to our PCB, making reprogramming and fine tuning much more streamlined.

13 Laser Sensor Module VL53L0X Time-of-Flight Laser Sensor
Detection range: 30 mm - 2.2 m
Christian: The sensors we chose are VL53L0X time-of-flight sensors. They work by emitting an invisible laser pulse and timing how long the reflection takes to return to the sensor; that time corresponds to the distance to the target. They can be programmed in different sensing modes (a default mode and a long-range mode) and can measure with millimeter precision. We chose IR sensors over ultrasonic because of their narrow field of view, precision, and faster acquisition time. For our design we used the long-range mode, which can measure up to 2.2 m, and then capped the maximum range at 1.5 m, a distance we considered reasonable for a user to gauge their proximity to an object.
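The slides do not include firmware, but the long-range configuration and the 1.5 m cap might look roughly like the minimal single-sensor sketch below, assuming the Pololu VL53L0X Arduino library; the library choice, wiring, and serial debugging are illustrative assumptions rather than the actual code.

```cpp
// Minimal single-sensor sketch (a sketch, assuming the Pololu VL53L0X
// Arduino library; pin wiring and constants are illustrative).
#include <Wire.h>
#include <VL53L0X.h>

VL53L0X sensor;
const uint16_t MAX_RANGE_MM = 1500;   // usable range chosen in the design

void setup() {
  Serial.begin(9600);
  Wire.begin();

  sensor.setTimeout(500);
  if (!sensor.init()) {
    Serial.println("VL53L0X init failed");
    while (true) {}
  }

  // Long-range profile from the library examples: lower the return-signal
  // rate limit and lengthen the laser pulse periods.
  sensor.setSignalRateLimit(0.1);
  sensor.setVcselPulsePeriod(VL53L0X::VcselPeriodPreRange, 18);
  sensor.setVcselPulsePeriod(VL53L0X::VcselPeriodFinalRange, 14);
}

void loop() {
  uint16_t mm = sensor.readRangeSingleMillimeters();
  if (sensor.timeoutOccurred() || mm > MAX_RANGE_MM) {
    mm = MAX_RANGE_MM;                // clamp anything beyond the usable range
  }
  Serial.println(mm);
  delay(50);
}
```

For the four-sensor array, each VL53L0X would additionally need a unique I2C address assigned at startup (for example via its XSHUT pin), since the parts all power up at the same default address.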

14 Laser Sensor Array (Results) - Plot
Christian: To test the accuracy of these sensors, we measured distances between 0 and 2.2 m in 20 cm increments. We then computed the percent error of each measurement by comparing the sensor reading to the actual distance, and found that the sensors became more accurate as distance increased, with the lowest percent error at about 0.25%. The large spike at short range can be attributed to the minimum distance the sensor can actually detect, which we recorded to be about 2.3 cm.

15 Laser Sensor Array (Results) - Analysis
Percent error calculation: |measured - actual| / actual × 100%
Average percent error: 2.017%
Christian: From the collected data, we found that the average percent error was approximately 2%, which was heavily influenced by the minimum distance at which the sensor could collect data.
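A minimal sketch of the calculation above, with placeholder values standing in for the recorded test points (not the actual data set):

```cpp
// Average percent error over a set of (actual, measured) distance pairs.
// The arrays are placeholders for the recorded 20 cm test points.
#include <cmath>
#include <cstdio>

int main() {
    // Fill in with the (actual, measured) distance pairs from the sweep.
    const double actualMM[]   = {200.0, 400.0, 600.0};
    const double measuredMM[] = {201.0, 401.0, 599.0};   // placeholder values
    const int n = sizeof(actualMM) / sizeof(actualMM[0]);

    double sum = 0.0;
    for (int i = 0; i < n; ++i) {
        // percent error = |measured - actual| / actual * 100
        sum += std::fabs(measuredMM[i] - actualMM[i]) / actualMM[i] * 100.0;
    }
    std::printf("average percent error: %.3f%%\n", sum / n);
    return 0;
}
```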

16 Feedback Module DRV2605L motor driver
Provides tactile information from the sensor array
Angela: To create the feedback module, we used a DRV2605L motor driver and a vibration motor disc from SparkFun. The distance data from the sensor array, communicated to the microcontroller over an I2C bus, varies the voltage that drives the motor, which in turn varies the vibration intensity. The values we relay to the motor driver are PWM values mapped linearly to the measured distance, with 128 being the weakest intensity and 0 the most intense setting.
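A minimal sketch of this mapping is shown below. It assumes the Adafruit DRV2605 library and an Arduino PWM pin wired to the driver's PWM input; the pin number, update rate, and the 1.5 m cap are illustrative assumptions rather than the actual firmware.

```cpp
// Distance-to-intensity mapping as described above (a sketch, not the actual
// firmware). Assumes the Adafruit DRV2605 library and a PWM pin wired to the
// driver's PWM input; pin and timing values are illustrative.
#include <Wire.h>
#include <Adafruit_DRV2605.h>

Adafruit_DRV2605 drv;

const uint8_t  PWM_PIN      = 9;     // assumed Arduino PWM pin to the DRV2605L
const uint16_t MAX_RANGE_MM = 1500;  // detection cap used by the design

void setup() {
  Wire.begin();
  drv.begin();
  drv.useERM();                         // ERM vibration disc
  drv.setMode(DRV2605_MODE_PWMANALOG);  // drive the motor from the PWM input
  pinMode(PWM_PIN, OUTPUT);
}

// Linear map matching the slide's convention: 1500 mm (far) -> 128 (weakest),
// 0 mm (close) -> 0 (most intense).
uint8_t distanceToPwm(uint16_t mm) {
  if (mm > MAX_RANGE_MM) mm = MAX_RANGE_MM;
  return (uint8_t)map(mm, 0, MAX_RANGE_MM, 0, 128);
}

void loop() {
  uint16_t mm = 900;                    // placeholder; comes from the sensor array
  analogWrite(PWM_PIN, distanceToPwm(mm));
  delay(50);
}
```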

17 Haptic Feedback (Results)
After programming the microcontroller to maintain a linear relationship between sensor distance and vibration intensity, we verified the mapping by comparing each measured distance to the PWM output sent to the motor driver. As the graph shows, the linear mapping held and we observed no failures in communication.

18 Software Overview Vibration pattern by sensor number (from bottom of cane to top)
1st sensor (bottom): a single vibration, separated by a pause.
2nd sensor: two vibrations in quick succession, separated by a pause.
3rd sensor: three vibrations in quick succession, separated by a pause.
4th sensor (top): four vibrations in quick succession, separated by a pause.
Shawn: This is an overview of our vibration patterns. The pattern corresponds to the highest sensor that is tripped.

19 Software Flowchart Shawn: First the sensors are initialized in long-range mode and the motor driver is set to PWM mode. We then read a measurement from each sensor; each reports the distance to the first object it detects, in millimeters. We save the minimum of these distances, then find the highest sensor that measured a distance below the maximum range and save that as the height. The minimum distance and the height together determine the vibration pattern, which we play on the motor, as sketched below.
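A compact sketch of this loop is shown below. readSensorMM() stands in for the VL53L0X reads shown earlier, the motor is driven with the 128-to-0 PWM convention from slide 16, and the pin numbers and pulse timings are illustrative assumptions rather than the actual firmware.

```cpp
// Sketch of the main loop from the flowchart (illustrative, not the actual
// firmware): find the minimum distance and the highest tripped sensor, then
// play that many pulses at an intensity mapped from the minimum distance.
#include <Arduino.h>

const uint8_t  NUM_SENSORS   = 4;
const uint16_t MAX_RANGE_MM  = 1500;
const uint8_t  MOTOR_PWM_PIN = 9;     // assumed PWM pin to the motor driver

// Stub: the real firmware would read VL53L0X sensor i over I2C here.
uint16_t readSensorMM(uint8_t i) {
  (void)i;
  return MAX_RANGE_MM;
}

// 128 = weakest, 0 = most intense, per the mapping on slide 16.
uint8_t distanceToPwm(uint16_t mm) {
  if (mm > MAX_RANGE_MM) mm = MAX_RANGE_MM;
  return (uint8_t)map(mm, 0, MAX_RANGE_MM, 0, 128);
}

// n short vibrations in quick succession, then a pause (slide 18 patterns).
void pulsePattern(uint8_t n, uint8_t pwm) {
  for (uint8_t i = 0; i < n; i++) {
    analogWrite(MOTOR_PWM_PIN, pwm);  // vibrate at the mapped intensity
    delay(120);
    analogWrite(MOTOR_PWM_PIN, 128);  // drop back to the weakest level
    delay(120);
  }
  delay(400);                         // pause separating bursts
}

void setup() {
  pinMode(MOTOR_PWM_PIN, OUTPUT);
}

void loop() {
  uint16_t minDist = MAX_RANGE_MM;
  uint8_t  highest = 0;               // highest tripped sensor; 0 = none

  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    uint16_t d = readSensorMM(i);     // sensors indexed bottom (0) to top (3)
    if (d < MAX_RANGE_MM) {
      highest = i + 1;
      if (d < minDist) minDist = d;
    }
  }

  if (highest > 0) {
    pulsePattern(highest, distanceToPwm(minDist));
  }
}
```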

20 Future Hardware Development
Incorporate a battery gauge
Battery rechargeability
Gravity sensors, such as a gyroscope or accelerometer, for orientation
Molded handle for the user to hold at the correct orientation
Angela: Add a way to measure the device's remaining battery life. Add a rechargeable battery and an interface to charge it, such as USB. Add a stabilization system, such as a gyroscope, to keep the sensors aligned; this would solve the problem of the sensors pointing at the ground. Add wireless wearability with a Bluetooth connection, and a way to detect specific objects such as stairs or ramps.

21 Future Software Development
Unique vibration patterns for stairs and ramps
Battery life monitor
Variable detection range
Shawn: Add new patterns for detecting stairs or ramps. Add a way for the user to check the battery life: the user could push a button and a specific pattern would play indicating the charge level, with a warning when the battery is low. Add a way to change the detection range easily.

22 Ethics and Safety Guide users around obstacles along a safe path
Will not purposefully guide users through dangerous environments
Will not fabricate false information
Shall not be used as a weapon
Violating these would infringe upon the IEEE Code of Ethics[2]
Shawn: Our project will accurately report information about the environment and attempt to convey it in a clear and concise manner. We will not fabricate false information or purposefully lead the user into dangerous environments. Our project should only be used as a navigation device and is not to be used as a weapon. Violating any of these principles would infringe upon the IEEE Code of Ethics.

23 Conclusion Sensor detection range tested and verified
Sensors successfully integrated with vibration motor feedback through I2C communication
Shawn: Our project was very successful. The sensor subsystem was able to accurately detect the range and height of objects in our tests, and the haptic feedback subsystem clearly conveyed the sensor data to the user. As mentioned before, our future work involves adding wireless wearability with Bluetooth, gyroscopic orientation sensing, and a rechargeable battery with a battery life monitor.

24 Questions?

25 References
[1] World Health Organization, "Blindness and vision impairment," [Online]. Available: [Accessed: 3 Feb. 2019].
[2] Institute of Electrical and Electronics Engineers, Inc., "IEEE Code of Ethics," 2006. [Online]. Available:

