SUPERB-IT Center For Hybrid & Embedded Software Systems COLLEGE OF ENGINEERING, UC BERKELEY July 29, 2005 SUPERB-IT.


Modeling of Distributed Camera Networks
Murphy Junior Gant, Diablo Valley College
Mentor: Yang Zhao
http://chess.eecs.berkeley.edu/superb/

Acknowledgements: SUPERB-IT Faculty & Staff, CHESS, NSF MEP, Family & Friends

Abstract
Camera and sensor networks are used for environmental observation, military monitoring, building monitoring, and healthcare; however, they face challenges such as packet loss, battery and power loss, collisions, and geographical restrictions. Through the functionality of VisualSense, a modeling and simulation framework for wireless and sensor networks built on Ptolemy II, it is possible to extend existing composite actors and Java classes designed for sensor data to handle data from camera networks. One application of VisualSense is modeling camera networks. In conjunction with a clustering and power algorithm, this research determined not only which cameras actively monitored a moving object, but also how many cameras could monitor it when constrained by a power budget. Additionally, tracking algorithms developed around the capabilities and limitations of each camera filtered out wasteful, underdeveloped data coming from cameras that were outside a reasonable scope.

Model of Camera Network: 3rd Floor, Cory Hall
[Figure legend: wireless channel, sensor node, wireless sound detector, sound source, omni-directional camera, rectilinear camera, moving object, head station, outer boundary, restricted area, restricted scope capability.]

Results
Plots the trajectory of an object in motion (omni-directional camera)
Successfully visualizes the camera network output by fusing sensor data
Provides a foundation for tracking several objects at any given time

Future Outlook
Central-server computation model for tracking
Simultaneously track multiple objects
Configure cameras with zoom in/out capabilities
Use the VisualSense framework to give feedback in addition to visualization

Finite-State Machine: "Election"
This finite-state machine, called "Election," is an addition to the camera composite actors that reduces data redundancy. When an object is detected by a camera, that camera broadcasts its self-computed visibility value; this initiates the "election" process, which includes an idle time to compensate for wireless communication delay. Each camera stores its own collection of gathered visibility values, which it uses to decide whether it is one of the top two leading cameras needed for tracking. The leading cameras transition into a high-resolution state while the rest transition into an idle state until further notice. The "election" process restarts whenever one of the two leading cameras loses sight of the moving object or a new camera detects it.

Process
The Merge Algorithms were created to handle issues of camera management, visibility, and energy consumption. This research focused on the simulation of a camera network that monitored the motion of a single object in a set of corridors in Cory Hall. Implementing reliable camera management techniques through state machines and intuitive procedures reinforced the proposed solutions; however, there are tradeoffs, such as between communication and power consumption.
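The election process described above can be sketched as a small simulation: each camera that sees the object broadcasts its visibility value, collects the values broadcast by its peers, and promotes itself to a high-resolution state only if it ranks in the top two. This is a minimal illustration, not the poster's actual Ptolemy II actors; all names (`Camera`, `run_election`) and the visibility figures are hypothetical.

```python
# Sketch of the "Election" finite-state machine across a set of cameras.
# States and helper names are illustrative, not from the original model.

IDLE, HIGH_RES = "idle", "high_res"

class Camera:
    def __init__(self, cam_id):
        self.cam_id = cam_id
        self.state = IDLE
        self.collected = {}          # cam_id -> broadcast visibility value

    def broadcast(self, peers, visibility):
        """Announce our visibility value; peers store it (models the radio)."""
        for peer in peers:
            peer.collected[self.cam_id] = visibility

    def decide(self):
        """After the idle/settling time, only the top-two cameras track."""
        top_two = sorted(self.collected, key=self.collected.get, reverse=True)[:2]
        self.state = HIGH_RES if self.cam_id in top_two else IDLE

def run_election(visibilities):
    """visibilities: cam_id -> self-computed visibility of the object."""
    cams = [Camera(cid) for cid in visibilities]
    for cam in cams:                 # detection triggers the broadcasts
        cam.broadcast(cams, visibilities[cam.cam_id])
    for cam in cams:                 # election settles; each camera decides
        cam.decide()
    return {cam.cam_id: cam.state for cam in cams}

print(run_election({"A": 0.9, "B": 0.4, "C": 0.7}))
# -> {'A': 'high_res', 'B': 'idle', 'C': 'high_res'}
```

In the real model the decision is distributed and asynchronous; here the two loops stand in for the broadcast phase and the post-delay decision phase.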
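The power-budget constraint mentioned in the abstract (how many cameras can monitor the object under a power budget) could be realized as a greedy selection: rank the cameras that see the object by visibility per unit of power, then admit them until the budget is spent. This is an assumed formulation for illustration; the function name, cost figures, and greedy rule are not taken from the poster.

```python
# Sketch of choosing which cameras monitor under a power budget.
# Each camera is (cam_id, visibility, power_cost); budget is in the same
# power units. All concrete numbers here are hypothetical.

def select_cameras(cameras, budget):
    """Greedily pick cameras with the best visibility-per-watt ratio."""
    ranked = sorted(cameras, key=lambda c: c[1] / c[2], reverse=True)
    chosen, spent = [], 0.0
    for cam_id, visibility, cost in ranked:
        if spent + cost <= budget:
            chosen.append(cam_id)
            spent += cost
    return chosen

cams = [("A", 0.9, 2.0), ("B", 0.4, 1.0), ("C", 0.7, 2.5)]
print(select_cameras(cams, budget=3.0))
# -> ['A', 'B']
```

A greedy ratio rule is only one possible clustering/power policy; the poster does not specify which algorithm was used.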
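The abstract also mentions filtering out data from cameras that were outside a reasonable scope. One simple interpretation is a range check against the object's last known position: drop any reading whose camera is farther away than its sensing range. This sketch is an assumption about what "reasonable scope" means; the helper names and range values are hypothetical.

```python
# Sketch of scope-based filtering: keep only readings from cameras whose
# sensing range covers the object's position. Names and numbers are
# illustrative, not from the original tracking algorithms.
import math

def in_scope(cam_pos, cam_range, object_pos):
    """True if the object lies within this camera's sensing range."""
    return math.dist(cam_pos, object_pos) <= cam_range

def filter_readings(readings, object_pos):
    """readings: list of (cam_pos, cam_range, data); keep in-scope data."""
    return [data for cam_pos, cam_range, data in readings
            if in_scope(cam_pos, cam_range, object_pos)]

readings = [((0, 0), 5.0, "near reading"), ((20, 20), 5.0, "far reading")]
print(filter_readings(readings, object_pos=(1, 1)))
# -> ['near reading']
```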

