1
Review of the test results and plan for the final testing campaign
Panagiotis Mousouliotis
EDUSAFE ESR3
PhD Candidate, Aristotle University of Thessaloniki
Systems Engineer, NOVOCAPTIS
2
Overview
- Test Results
- FTC Plan
3
Test Results
- Testing-campaign-related work consisted of:
  - HW selection and integration for the AR prototype
  - SW setup, development, debugging, and integration for the AR prototype
- Given the time constraints, low-power PC-like HW (x86, 64-bit) was used (a small Intel-based low-power motherboard with the required I/O interfaces and a dedicated touchscreen for the UI)
- PC-like HW -> maximum SW support for application development -> faster SW development using mature SW
- The result is a functional AR prototype
4
Test Results
Some drawbacks:
- Consumer devices such as action cameras, HDMI splitters, frame-grabbers, and the PARALINX Arrow are difficult to integrate, because they are systems designed for a specific function and not to interact with each other as components of a bigger system
- Their integrated behavior is unknown until they are tested together (negotiation issues)
- Operating the prototype system means operating separate devices in a specific order (power the HDMI camera -> power the motherboard -> press buttons on the camera to select the desired mode -> etc.)
5
Test Results
Some drawbacks (continued):
- Powering each device separately further complicates testing and HW integration (e.g., using additional batteries and providing power sources with different voltage levels)
- The PARALINX Arrow requires pressing a synchronization button to connect the transmitter with the receiver, and the two devices (receiver and transmitter) must be side by side; when the connection is lost, the button must be pressed again to re-sync and regain connectivity
- It would be beneficial for the tests to use devices that can be integrated easily and operated and powered in a unified way
6
Test Results
Although we concentrated exclusively on testing the AR functionality, a few supervision-related results using the webcam AR prototype are:
- Successful H264 video streaming from the prototype to the server supervision GUI over UDP (using a GStreamer video pipeline) – the functionality is there, but further SW development is required for a complete application
- Tested the UDP H264 video streaming in the ATLAS environment
- Mini USB WiFi adapters (e.g., ASUS USB-N10 Nano) performed poorly (connectivity is lost easily – max range 10 meters); when in range, the framerate was 30 fps
- WiFi devices with external antenna(s) are required for better coverage (at the cost of more power consumption)
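The streaming setup above can be sketched as a pair of GStreamer pipeline descriptions. This is a minimal illustration, not the exact pipelines used in the EDUSAFE tests: the host address, port, bitrate, and element choices (x264enc on the sender, avdec_h264 on the receiver) are assumptions for the sketch.

```python
# Sketch of H264-over-UDP GStreamer pipeline descriptions (sender/receiver).
# All parameter values are illustrative assumptions, not the tested setup.

def sender_pipeline(host: str, port: int, device: str = "/dev/video0") -> str:
    """gst-launch-style description: webcam -> H264 encode -> RTP -> UDP."""
    return (
        f"v4l2src device={device} ! videoconvert ! "
        "x264enc tune=zerolatency bitrate=1000 ! "
        "rtph264pay config-interval=1 pt=96 ! "
        f"udpsink host={host} port={port}"
    )

def receiver_pipeline(port: int) -> str:
    """gst-launch-style description: UDP -> RTP depayload -> decode -> display."""
    return (
        f'udpsrc port={port} caps="application/x-rtp,encoding-name=H264,payload=96" ! '
        "rtph264depay ! avdec_h264 ! videoconvert ! autovideosink"
    )

if __name__ == "__main__":
    # Pass each description to gst-launch-1.0 (or gst_parse_launch in code)
    print("gst-launch-1.0 " + sender_pipeline("192.168.1.10", 5000))
    print("gst-launch-1.0 " + receiver_pipeline(5000))
```

In an application (rather than on the command line), the same description strings can be handed to GStreamer's gst_parse_launch to build the pipeline programmatically.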
7
FTC Plan
Due to the AR system prototyping, there was no time to work on my individual project (according to EDUSAFE Annex I and the related deliverables). So, my plan is:
- To develop the supervision SW (a very early version is already functional, but limited)
  - Client-server application: C++, threads, sockets, GStreamer
  - Application-level transmission protocol for reliable, real-time video streaming (additional work and research is required for this one!)
- To port the supervision SW to an embedded development board
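One direction the application-level transmission protocol could take is a small packet header carrying a sequence number and timestamp, so the receiver can detect loss, reorder packets, and measure latency. The sketch below (in Python for brevity; the plan above names C++) uses a hypothetical field layout that is an assumption for illustration, not the EDUSAFE protocol.

```python
import struct

# Hypothetical application-level header for real-time video over UDP:
# a sequence number for loss detection/reordering, a millisecond timestamp
# for latency measurement, and the payload length. The layout is an
# illustrative assumption, not the actual EDUSAFE protocol.
HEADER = struct.Struct("!IQH")  # seq: uint32, timestamp_ms: uint64, length: uint16

def pack_packet(seq: int, timestamp_ms: int, payload: bytes) -> bytes:
    """Prefix a payload (e.g. an H264 NAL fragment) with the header."""
    return HEADER.pack(seq, timestamp_ms, len(payload)) + payload

def unpack_packet(data: bytes):
    """Split a received datagram back into (seq, timestamp_ms, payload)."""
    seq, ts, length = HEADER.unpack_from(data)
    return seq, ts, data[HEADER.size:HEADER.size + length]

def missing_seqs(received):
    """Sequence numbers the receiver should consider lost (candidates for
    retransmission requests, or for concealment in a real-time stream)."""
    if not received:
        return []
    have = set(received)
    return [s for s in range(min(received), max(received) + 1) if s not in have]
```

The real research question hinted at in the plan is the policy built on top of such a header: whether a late packet is still worth retransmitting, or should be skipped to preserve real-time playback.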
8
FTC Plan
a. Selection and purchase of a low-cost, small-size, low-weight (wearable), low-power embedded board with adequate performance and a dedicated camera interface (dedicated camera)
b. Development of the software required to implement the supervision functionality
c. Porting of that software to the selected embedded board
d. Research on the adaptation to computer vision requirements (according to the Annex I WP2 description)
e. Development of the software required to implement the protocol for fast wireless video-data transmission
f. Porting of that software to the selected embedded board
g. Submission of the final EDUSAFE project deliverables, as set out in EDUSAFE Annex I
h. Comparison of the implemented solution with the theoretical solution resulting from the PhD-related research
9
FTC Plan
PhD Research (ATC-Today)
- Study of models of computation used for the formal description of embedded systems (formal models can be verified using analytic methods or by simulating the model) – the Berkeley Ptolemy II framework is currently being studied
- The idea is:
  1. To describe the embedded system using a formal model
  2. To use a SW library describing HW components (execution times of computation/communication, power consumption, etc.)
  3. To map the HW components to the embedded system model using multi-objective optimization algorithms (the objectives are usually cost, power, and performance) – using a MOEA framework
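The selection core of step 3 can be sketched as Pareto-dominance filtering over candidate HW mappings. The candidates and their (cost, power, latency) scores below are made-up numbers for illustration; a real MOEA framework would also evolve the candidate population, not just filter it.

```python
# Minimal sketch of the selection step in multi-objective optimization:
# keep only the Pareto-optimal HW mappings. Each candidate is scored on
# (cost, power, latency), all minimized. Values are illustrative assumptions.

def dominates(a, b):
    """True if mapping a is at least as good as b on every objective
    and strictly better on at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

if __name__ == "__main__":
    # (cost in $, power in W, latency in ms) for hypothetical HW mappings
    mappings = {
        "x86 mini-ITX":    (180, 25.0, 10.0),  # fast but costly and power-hungry
        "Raspberry Pi 2":  (60, 9.0, 40.0),    # cheap and frugal
        "older ARM board": (70, 9.5, 60.0),    # dominated by the Pi 2 on all three
    }
    print(pareto_front(list(mappings.values())))
```

A MOEA returns such a front rather than a single winner; picking one point on it (e.g., trading latency for power) remains an engineering decision.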
10
FTC Plan
Since my PhD studies won't end for (at least) another 4 years, and since I'm required to deliver an implementation for the EDUSAFE ESR3 individual project and its deliverables, EDUSAFE can't wait for my PhD research results in order to choose an optimal embedded system. So:
- The Raspberry Pi 2 embedded board plus its dedicated camera module will be used for the implementation of the supervision system
11
FTC Plan
Why Raspberry Pi 2:
- Sufficiently powerful for supervision purposes (900 MHz quad-core ARM Cortex-A7)
- Small size and weight (85.60 mm × 56.5 mm, 45 grams) and low power consumption (max 9 watts), supporting a mobile, wearable application
- Low cost (US$35 for the processor board and US$25 for the camera module)
- Provides a very small (around 25 × 20 × 9 mm, 3 grams) dedicated camera module with a built-in H264 encoder, which makes the system efficient (performance/power consumption) for network video streaming
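The 9 W maximum draw quoted above translates directly into battery runtime for the wearable use case. A back-of-the-envelope sketch, where the battery capacities are illustrative assumptions (typical USB power banks) and 9 W is the worst case, so real runtimes would be longer:

```python
# Back-of-the-envelope battery runtime for the wearable prototype.
# Battery capacities are illustrative assumptions; 9 W is the quoted
# worst-case Raspberry Pi 2 draw, so actual runtimes would be longer.

def runtime_hours(battery_wh: float, draw_w: float) -> float:
    """Ideal runtime: energy (Wh) divided by average power draw (W)."""
    return battery_wh / draw_w

if __name__ == "__main__":
    for wh in (37.0, 74.0):  # e.g. hypothetical 10 Ah / 20 Ah packs at 3.7 V
        print(f"{wh:5.1f} Wh -> {runtime_hours(wh, 9.0):.1f} h at worst-case 9 W")
```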
12
FTC Plan
Why Raspberry Pi 2 (continued):
- Support for many I/O interfaces (can be used to interface sensors directly – additional SW development may be required)
- Mature SW and a very large user community
- GStreamer support for the camera module (already tested!)
- A C/C++ library for using the camera module for CV purposes
- Component interconnection schematics are available, providing the flexibility to redesign the board to adapt it to specific needs
13
Thank you for your attention! Questions?