Programming HCI Yingcai Xiao

HCI [Slide diagram: Interaction as the Input -> Computation -> Output cycle between the user and the computer]

Interaction => events (software notifications of hardware status changes). [Slide diagram: Input Device Driver -> Computer Program -> Output Device Driver]

EDP EDP: event-driven programming. The application waits in an idle state and responds to user requests by calling the corresponding event handlers.

Events

#include <cmath>
#include <iostream>
using namespace std;

#define PI 3.14

int main() {
    char c;
    bool done = false;
    float out = 0.0f;
    int n = 1000;
    while (!done) {
        cout << "Please make your selection: ";
        cin >> c;
        switch (c) {
        case 's': // stop current computation
            break;
        case 'r': // run computation
            for (int i = 0; i < n; i++) {
                out = sqrt(PI);
                cout << i;
            }
            break;
        case 'q': // exit the program
            done = true;
            break;
        }
        cout << out;
    }
    return 0;
}

Key Components of EDP
- Event generators (input devices)
- Events
- Event loop
- Event mapping (compile time)
- Event dispatching (run time)
- Event handlers
- Computation
- Output

Event generators
- They are input devices.
- For computers: CLI, GUI, NUI devices.
- Smartphones are computers: all of the above, plus sensors.
- IoT: sensors, embedded devices, etc.

Events
- Notifications of status changes in the input devices.
- Generated by the OS when users interact with input devices.
- Sent to the active application (the application that owns the active window).
- Queued in the active application's event queue.
- Removed by the application one by one to be processed.
- Dispatched to event handlers according to the event mapping.

Events
- All other queued events and new incoming events wait while the current event is being processed.
- Software can generate events too: collision events in gaming, idle events in background animation.
- An interactive event should NOT take too long to handle; otherwise it may delay the processing of upcoming events.
- Other events: network events, background events, timed events, ...

Event representation An event can be represented by a char, a string, an encoded number, or an object-oriented message. Encoded numbers and object-oriented messages are the most commonly used. Simple events can be represented by characters; complicated events need to be represented by object-oriented messages.
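For illustration, here is a minimal C++ sketch of the two common representations; the event codes and the Event struct are hypothetical, not taken from any particular API:

#include <string>

// Encoded-number representation: each event type is an integer code.
enum EventCode { EV_KEY_DOWN = 1, EV_MOUSE_MOVE = 2, EV_QUIT = 3 };

// Object-oriented representation: an event object carries its type
// plus whatever payload the handler needs (cursor location, key, ...).
struct Event {
    EventCode   type;
    int         x = 0, y = 0;   // cursor location for mouse events
    char        key = 0;        // key for keyboard events
    std::string source;         // e.g., the generating device
};

int main() {
    Event e{EV_MOUSE_MOVE, 120, 45};   // a mouse-move event at (120, 45)
    return e.type == EV_MOUSE_MOVE ? 0 : 1;
}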

Event Loop The event loop checks whether there are any events (including idle events) in the queue. If so, it dispatches the event at the front of the queue to its handler. If not, it waits for input, or puts an idle event into the queue if an idle-event handler is registered.
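A minimal sketch of such a loop in C++; the pollInput and dispatch stand-ins are assumptions added only to make the sketch self-contained:

#include <deque>
#include <iostream>
#include <optional>

struct Event { int type; };                 // simplified event record
enum { EV_IDLE = 0, EV_QUIT = 1 };

std::deque<Event> q;                        // the application's event queue
bool haveIdleHandler = true;                // true if an idle handler is registered
int idleCount = 0;

// Stand-in for the OS input source: pretends the user quits after a few idle cycles.
std::optional<Event> pollInput() {
    if (idleCount > 3) return Event{EV_QUIT};
    return std::nullopt;
}

// Stand-in for the dispatcher: would invoke the mapped handler(s).
void dispatch(const Event& e) {
    std::cout << "dispatching event " << e.type << '\n';
    if (e.type == EV_IDLE) ++idleCount;
}

int main() {
    for (;;) {
        if (!q.empty()) {                   // dispatch the event at the front of the queue
            Event e = q.front();
            q.pop_front();
            if (e.type == EV_QUIT) break;   // a quit event ends the loop
            dispatch(e);
        } else if (auto e = pollInput()) {  // otherwise check for new input
            q.push_back(*e);
        } else if (haveIdleHandler) {       // no input: enqueue an idle event
            q.push_back({EV_IDLE});
        }
    }
    return 0;
}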

Event Mapping Event mapping is done when a programmer registers an event handler to an event, usually at compile time. A handler can be registered to multiple events, and an event can have multiple handlers. Make sure each handler declares the proper arguments for its events.
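A sketch of such a registration table in C++; the registerHandler function and Event type are hypothetical:

#include <functional>
#include <iostream>
#include <unordered_map>
#include <vector>

struct Event { int type; int x = 0, y = 0; };
using Handler = std::function<void(const Event&)>;

// One event type can map to several handlers, and one handler
// can be registered under several event types.
std::unordered_map<int, std::vector<Handler>> table;

void registerHandler(int eventType, Handler h) {
    table[eventType].push_back(std::move(h));
}

// The handler's arguments must match what the event carries.
void onMove(const Event& e) { std::cout << "move to " << e.x << ',' << e.y << '\n'; }

int main() {
    registerHandler(1, onMove);              // e.g., mouse-move
    registerHandler(2, onMove);              // same handler, a second event type
    for (auto& h : table[1]) h({1, 10, 20}); // invoke all handlers mapped to event 1
    return 0;
}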

Event Dispatching Event dispatching is done at run time. The dispatcher invokes the corresponding handlers one by one for each current event, usually in the order of registration. Sometimes a pre- and a post-event are added, e.g., ButtonDown and ButtonUp. Most of the time, only one handler is registered for each event.

Event Dispatching For most applications, dispatching and mapping are implemented via a switch statement. Large systems, like the Windows OS, use a hash table to speed up the search for event handlers.
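A sketch of switch-based dispatching, with illustrative handler names; a hash-table dispatcher would replace the switch with a lookup in a table like the one in the previous sketch:

#include <iostream>

struct Event { int type; };
enum { EV_BUTTON_DOWN = 1, EV_BUTTON_UP = 2 };

void onButtonDown(const Event&) { std::cout << "button down\n"; }
void onButtonUp(const Event&)   { std::cout << "button up\n"; }

// Run-time dispatching: route each event to its registered handler.
void dispatch(const Event& e) {
    switch (e.type) {
    case EV_BUTTON_DOWN: onButtonDown(e); break;
    case EV_BUTTON_UP:   onButtonUp(e);   break;
    default: break;                       // unhandled event types are ignored
    }
}

int main() {
    dispatch({EV_BUTTON_DOWN});   // pre event
    dispatch({EV_BUTTON_UP});     // post event
    return 0;
}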

Event handlers Code that processes events. Usually a callback function, e.g., OnMouseMove. Most likely has input arguments, e.g., the cursor location. Note: no new event can be processed while an event handler is running. One challenge is passing a computed value to event handlers; commonly, a global variable is used.
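A minimal sketch of the global-variable pattern; the OnMouseMove signature here is illustrative, not a specific toolkit's callback:

#include <iostream>

struct MouseEvent { int x, y; };   // event data passed in as arguments

int gScore = 0;   // global variable shared between the computation and the handler

// Callback handler: receives the cursor location from the event,
// and reads the computed value from the global.
void OnMouseMove(const MouseEvent& e) {
    std::cout << "cursor at " << e.x << ',' << e.y
              << ", score " << gScore << '\n';
}

int main() {
    gScore = 42;             // the computation updates the global ...
    OnMouseMove({10, 20});   // ... and the handler reads it when invoked
    return 0;
}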

Event and event handlers For your program to support a hardware system (e.g., a smartphone), you need to code event handlers for the events generated by that hardware. Your program can support multiple hardware systems if you code event handlers for each of them. One handler can be registered for two different events on two different hardware systems to perform the same action. For example, a left-move handler can be registered to handle the left-arrow-key event on a PC and the left-tilt event on a smartphone; the game object then moves left when the user hits the left-arrow key or tilts the phone to the left.
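A sketch of that cross-device mapping; the event codes and registerHandler function are hypothetical:

#include <functional>
#include <iostream>
#include <unordered_map>
#include <vector>

enum { EV_LEFT_ARROW_KEY = 1,    // generated by a PC keyboard
       EV_LEFT_TILT      = 2 };  // generated by a phone's accelerometer

using Handler = std::function<void()>;
std::unordered_map<int, std::vector<Handler>> table;

void registerHandler(int ev, Handler h) { table[ev].push_back(std::move(h)); }

void onMoveLeft() { std::cout << "move game object left\n"; }

int main() {
    // One handler, two registrations: the same action on both devices.
    registerHandler(EV_LEFT_ARROW_KEY, onMoveLeft);
    registerHandler(EV_LEFT_TILT, onMoveLeft);
    for (auto& h : table[EV_LEFT_TILT]) h();   // simulate a tilt event on a phone
    return 0;
}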

Computation Computation is usually done in event handlers, sometimes in the background. Any long-running computation in an event handler blocks the system from processing subsequent events and hence slows the system down. Long-running computation can be broken into small background tasks to increase the interactivity of the system. The best approach is to create separate threads for input events and for computation.
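A minimal sketch of the separate-thread approach using std::thread; the summation is a stand-in for a real long-running computation:

#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

std::atomic<bool> done{false};      // lets the event thread stop the worker
std::atomic<double> result{0.0};    // progress published to the event thread

// The long-running computation lives on its own thread ...
void compute() {
    double sum = 0.0;
    for (long i = 1; i <= 100000000 && !done; ++i) {
        sum += 1.0 / i;
        if (i % 10000000 == 0) result = sum;   // publish progress periodically
    }
    result = sum;
}

int main() {
    std::thread worker(compute);
    // ... while this thread stays free to service input events.
    for (int frame = 0; frame < 3; ++frame) {
        std::cout << "handling events; partial result " << result << '\n';
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }
    done = true;     // request the worker to stop
    worker.join();
    return 0;
}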

EDP in programming languages
- C++ has no predefined events other than characters from keystrokes, and no constructs for users to define their own events.
- Java 2 has EDP constructs built in (Java Swing) and predefined event classes.
- C# has full support for EDP, including constructs that let users define their own events.

HCI Example NUI Kinect

Natural User Interfaces:
- Voice controls
- Kinect 3D sensor
- LeapMotion

NUI-based Interactive Computer Graphics [Slide diagram: User -> Controller -> Graphics Application -> Display]

NUI Three parts of NUI:
- Hardware: e.g., Kinect
- Software: drivers (e.g., OpenNI), middleware
- Application: integration of the hardware-enabling software with applications.

OpenNI

OpenNI Production Nodes: a set of components that have a productive role in the data-creation process required for Natural Interaction based applications. The API of the production nodes only defines the language; the logic of data generation must be implemented by the modules that plug into OpenNI. For example, for a production node that represents the functionality of generating hand-point data, the logic of hand-point data generation must come from an external middleware component that is both plugged into OpenNI and has the knowledge of how to produce such data.

OpenNI [Slide figures: (1) body imaging, (2) joint recognition, (3) hand waving]

OpenNI: Sensor-Related Production Nodes
- Device: represents a physical device (a depth sensor or an RGB camera). Its main role is to enable device configuration.
- Depth Generator: generates a depth map. Must be implemented by any 3D sensor that wishes to be certified as OpenNI compliant.
- Image Generator: generates colored image maps. Must be implemented by any color sensor that wishes to be certified as OpenNI compliant.
- IR Generator: generates IR image maps. Must be implemented by any IR sensor that wishes to be certified as OpenNI compliant.
- Audio Generator: generates an audio stream. Must be implemented by any audio device that wishes to be certified as OpenNI compliant.

OpenNI: Middleware-Related Production Nodes
- Gestures Alert Generator: generates callbacks to the application when specific gestures are identified.
- Scene Analyzer: analyzes a scene, including separating the foreground from the background, identifying figures in the scene, and detecting the floor plane. The Scene Analyzer's main output is a labeled depth map, in which each pixel holds a label that states whether it represents a figure or is part of the background.
- Hand Point Generator: supports hand detection and tracking. This node generates callbacks that provide alerts when a hand point (meaning, a palm) is detected, and when a hand point that is currently being tracked changes its location.
- User Generator: generates a representation of a (full or partial) body in the 3D scene.

OpenNI: Recording Production Nodes
- Recorder: implements data recording.
- Player: reads data from a recording and plays it.
- Codec: used to compress and decompress data in recordings.

OpenNI: Capabilities OpenNI supports the registration of multiple middleware components and devices. OpenNI is released with a specific set of capabilities, with the option of adding further capabilities in the future. Each module can declare the capabilities it supports. Currently supported capabilities:
- Alternative View: enables any type of map generator to transform its data to appear as if the sensor were placed in another location.
- Cropping: enables a map generator to output a selected area of the frame.
- Frame Sync: enables two sensors producing frame data (for example, depth and image) to synchronize their frames so that they arrive at the same time.

OpenNI: Capabilities Currently supported capabilities (continued):
- Mirror: enables mirroring of the data produced by a generator.
- Pose Detection: enables a user generator to recognize when the user holds a specific pose.
- Skeleton: enables a user generator to output the skeletal data of the user, including the locations of the skeletal joints, the ability to track skeleton positions, and user-calibration capabilities.
- User Position: enables a Depth Generator to optimize the output depth map for a specific area of the scene.

OpenNI: Capabilities Currently supported capabilities (continued):
- Error State: enables a node to report that it is in an "Error" status, meaning that on a practical level the node may not function properly.
- Lock Aware: enables a node to be locked outside the context boundary.
- Hand Touching FOV Edge: raises an alert when the hand point reaches the boundary of the field of view.

OpenNI: Generating and Reading Data Production nodes that produce data are called Generators. Once created, they do not immediately start generating data, so that the application can set the required configuration. The xn::Generator::StartGenerating() function is used to begin generating data; xn::Generator::StopGenerating() stops it. Data generators "hide" new data internally until explicitly requested to expose the most updated data to the application, using the UpdateData request function. OpenNI enables the application to wait for new data to be available and then update it, using the xn::Generator::WaitAndUpdateData() function.
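A minimal sketch of this flow using the OpenNI 1.x C++ wrapper; error-status checks are omitted, and the 640x480 depth-map size is an assumption about the sensor configuration:

#include <XnCppWrapper.h>   // OpenNI 1.x C++ API
#include <iostream>

int main() {
    xn::Context context;
    context.Init();                     // initialize OpenNI

    xn::DepthGenerator depth;           // a data-producing node (a generator)
    depth.Create(context);

    depth.StartGenerating();            // generators stay idle until asked to start

    for (int i = 0; i < 30; ++i) {
        depth.WaitAndUpdateData();      // wait for new data, then expose it
        const XnDepthPixel* map = depth.GetDepthMap();
        std::cout << "center depth: " << map[240 * 640 + 320] << '\n';
    }

    depth.StopGenerating();
    context.Release();                  // release the OpenNI context
    return 0;
}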

Interactive Game with Kinect

Video Game Interactive animation: user -> interface -> game-object action -> feedback (A/V, haptic). Game objects can represent data.

Video Game [Slide diagram: User -> Controller -> Game (Software) -> Display]

Video Game [Slide diagram: Input Device Driver -> Game (Software) -> Display Device Driver (GDI)]

Software for Kinect-based game development
- OpenNI: a general-purpose framework for obtaining data from 3D sensors
- SensorKinect: the driver for interfacing with the Microsoft Kinect
- NITE: a skeleton-tracking and gesture-recognition library
- Unity3D: a game engine
- ZigFu: a Unity package for Kinect (assets and scripts)