Chapter 8: Implementation support
Implementation support
programming tools
 – levels of services for programmers
windowing systems
 – core support for separate and simultaneous user–system activity
programming the application and control of dialogue
interaction toolkits
 – bring programming closer to the level of user perception
user interface management systems
 – control the relationship between presentation and functionality
8.1 Introduction
How does HCI affect the programmer?
Advances in coding have elevated programming
 – from hardware specific
 – to interaction-technique specific
Layers of development tools
 – windowing systems
 – interaction toolkits
 – user interface management systems
8.2 Elements of windowing systems
Device independence
 – programming the abstract terminal
 – device drivers
 – image models for output and (partially) input:
   pixels
   PostScript (Mac OS X, NeXTstep)
   Graphical Kernel System (GKS)
   Programmers' Hierarchical Interface to Graphics (PHIGS)
Resource sharing
 – achieving simultaneity of user tasks
 – window system supports independent processes
 – isolation of individual applications
Roles of a windowing system (figure)
Architectures of windowing systems
three possible software architectures
 – all assume the device driver is separate
 – differ in how multiple-application management is implemented
1. each application manages all processes
 – everyone worries about synchronization
 – reduces portability of applications
2. management role within the kernel of the operating system
 – applications tied to the operating system
3. management role as a separate application
 – maximum portability
The client–server architecture (figure)
X Windows architecture (figure)
X Windows architecture (continued)
 – pixel imaging model with some pointing mechanism
 – X protocol defines server–client communication
 – separate window manager client enforces policies for input/output:
   how to change input focus
   tiled vs. overlapping windows
   inter-client data transfer
8.3 Programming the application
The read–evaluation loop paradigm:

  repeat
    read-event(myevent)
    case myevent.type
      type_1: do type_1 processing
      type_2: do type_2 processing
      ...
      type_n: do type_n processing
    end case
  end repeat
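The same paradigm as a minimal Java sketch, rather than a real windowing API: EventType, AppEvent and the blocking queue are our own stand-ins for whatever the window system would actually deliver.

  // Read-evaluation loop sketch: the application owns the main loop, blocks
  // for the next event and dispatches on its type. EventType, AppEvent and
  // the queue are hypothetical stand-ins for the window system's event feed.
  import java.util.concurrent.BlockingQueue;
  import java.util.concurrent.LinkedBlockingQueue;

  enum EventType { KEY_PRESS, MOUSE_CLICK, QUIT }
  record AppEvent(EventType type, String data) {}

  public class ReadEvalLoop {
      private final BlockingQueue<AppEvent> queue = new LinkedBlockingQueue<>();

      public void post(AppEvent e) { queue.add(e); }      // the window-system side delivers events here

      public void run() throws InterruptedException {
          while (true) {
              AppEvent e = queue.take();                  // read-event(myevent)
              switch (e.type()) {                         // case myevent.type
                  case KEY_PRESS   -> System.out.println("key: " + e.data());
                  case MOUSE_CLICK -> System.out.println("click at " + e.data());
                  case QUIT        -> { return; }         // end repeat
              }
          }
      }
  }

A main method would construct a ReadEvalLoop, post a few AppEvents and call run(); the contrast with the notification style on the next slide is that here the application itself owns the control loop.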
Notification-based programming paradigm
Pseudocode sketch (the Menu class and its setOption/setAction methods are illustrative, not a real Java API):

  void main(String[] args) {
      Menu menu = new Menu();
      menu.setOption("Save");
      menu.setOption("Quit");
      menu.setAction("Save", mySave);
      menu.setAction("Quit", myQuit);
      ...
  }

  int mySave(Event e) {
      // save the current file
  }

  int myQuit(Event e) {
      // close down
  }
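For comparison, a minimal runnable Swing version of the same idea; the frame title, menu labels and handler name mirror the pseudocode above but are otherwise our own choices. The toolkit's event loop, not the application, decides when the registered callbacks run.

  // Notification-based version in Swing: handlers are registered with the
  // toolkit, and the toolkit's own event loop invokes them when the menu
  // items are selected.
  import javax.swing.*;
  import java.awt.event.ActionEvent;

  public class NotificationDemo {
      public static void main(String[] args) {
          SwingUtilities.invokeLater(() -> {
              JFrame frame = new JFrame("Notification demo");
              JMenu menu = new JMenu("File");
              JMenuItem save = new JMenuItem("Save");
              JMenuItem quit = new JMenuItem("Quit");
              save.addActionListener(NotificationDemo::mySave);   // register callback
              quit.addActionListener(e -> System.exit(0));        // myQuit: close down
              menu.add(save);
              menu.add(quit);
              JMenuBar bar = new JMenuBar();
              bar.add(menu);
              frame.setJMenuBar(bar);
              frame.setSize(200, 100);
              frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
              frame.setVisible(true);
          });
      }

      private static void mySave(ActionEvent e) {
          System.out.println("save the current file");  // stand-in for real save logic
      }
  }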
Notification-based programming paradigm (continued)
A simple program to demonstrate notification-based programming (XView):

   1. /*
   2.  * quit.c -- simple program to display a panel button that says "Quit".
   3.  * Selecting the panel button exits the program.
   4.  */
   5. #include <xview/xview.h>
   6. #include <xview/frame.h>
   7. #include <xview/panel.h>
   8. Frame frame;
   9. main (argc, argv)
  10. int argc;
  11. char *argv[];
  12. {
  13.     Panel panel;
  14.     void quit();
  15.
  16.     xv_init(XV_INIT_ARGC_PTR_ARGV, &argc, argv, NULL);
  17.     frame = (Frame) xv_create(NULL, FRAME,
  18.                               FRAME_LABEL, argv[0],
  19.                               XV_WIDTH, 200,
  20.                               XV_HEIGHT, 100,
  21.                               NULL);
  22.     panel = (Panel) xv_create(frame, PANEL, NULL);
  23.     (void) xv_create(panel, PANEL_BUTTON,
  24.                      PANEL_LABEL_STRING, "Quit",
  25.                      PANEL_NOTIFY_PROC, quit,
  26.                      NULL);
  27.     xv_main_loop(frame);
  28.     exit(0);
  29. }
  30. void quit()
  31. {
  32.     xv_destroy_safe(frame);
  33. }
Notification-based programming paradigm (continued)
The program produces a window, or frame, with one button, labeled Quit, which, when selected with the pointer device, causes the program to quit, destroying the window. Three objects are created in this program: the outermost frame, a panel within that frame, and the button in the panel. The procedure xv_create, called on lines 17, 22 and 23 of the source code, registers these objects with the XView notifier. In the last instance, on line 23, the application programmer informs the notifier of the callback procedure to be invoked when the object, a button, is selected. The application program then starts the notifier with the procedure call xv_main_loop. When the notifier receives a select event for the button, control is passed to the procedure quit, which destroys the outermost frame and requests termination.
Going with the grain
system style affects the interfaces
 – modal dialogue box
   easy with event loop (just have an extra read-event loop)
   hard with notification (need lots of mode flags)
 – non-modal dialogue box
   hard with event loop (very complicated main loop)
   easy with notification (just add an extra handler)
beware!
 – if you don't explicitly design, it will just happen
 – implementation should not drive design
8.4 Using toolkits
Interaction objects
 – input and output intrinsically linked
   (figure: a button interaction object handling the event sequence move, press, release, move)
Toolkits provide this level of abstraction
 – programming with interaction objects (also called techniques, widgets, gadgets)
 – promote consistency and generalizability
 – through similar look and feel
 – amenable to object-oriented programming
Using toolkits (continued, figure)
Interfaces in Java
Java toolkit – AWT (Abstract Window Toolkit)
 – Java classes for buttons, menus, etc.
 – notification based:
   AWT 1.0 – need to subclass basic widgets
   AWT 1.1 and beyond – callback objects
Swing toolkit
 – built on top of AWT – higher-level features
 – uses MVC architecture (see later)
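A small sketch of the AWT 1.1 callback-object style (our own example, not from the book): a listener object is registered with a plain java.awt.Button, instead of subclassing the widget and overriding its event methods as AWT 1.0 required.

  // AWT 1.1 callback-object style: separate listener objects are registered
  // with the widgets and are notified when events occur.
  import java.awt.Button;
  import java.awt.FlowLayout;
  import java.awt.Frame;
  import java.awt.event.ActionEvent;
  import java.awt.event.ActionListener;
  import java.awt.event.WindowAdapter;
  import java.awt.event.WindowEvent;

  public class AwtCallbackDemo {
      public static void main(String[] args) {
          Frame frame = new Frame("AWT 1.1 callbacks");
          frame.setLayout(new FlowLayout());
          Button press = new Button("Press me");
          press.addActionListener(new ActionListener() {       // callback object
              public void actionPerformed(ActionEvent e) {
                  System.out.println("button pressed");
              }
          });
          frame.add(press);
          frame.addWindowListener(new WindowAdapter() {         // another callback object
              public void windowClosing(WindowEvent e) { System.exit(0); }
          });
          frame.setSize(200, 100);
          frame.setVisible(true);
      }
  }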
8.5 User Interface Management Systems
UIMS add another level above toolkits
 – toolkits too difficult for non-programmers
concerns of a UIMS
 – conceptual architecture
 – implementation techniques
 – support infrastructure
non-UIMS terms:
 – UI development system (UIDS)
 – UI development environment (UIDE), e.g. Visual Basic
UIMS as conceptual architecture
A major issue in this area of research is the separation between the semantics of the application and the interface provided for the user to make use of those semantics. There are many good arguments for this separation of concerns:
Portability – to allow the same application to be used on different systems, it is best to consider its development separately from its device-dependent interface.
Reusability – separation increases the likelihood that components can be reused, cutting development costs.
Multiple interfaces – to enhance the interactive flexibility of an application, several different interfaces can be developed to access the same functionality.
Customization – the user interface can be customized by both the designer and the user to increase its effectiveness without having to alter the underlying application.
UIMS tradition – interface layers / logical components
 – linguistic: lexical / syntactic / semantic
 – Seeheim
 – Arch/Slinky
Seeheim model (figure)
 – components between USER and APPLICATION: Presentation – Dialogue Control – Functionality (application interface)
 – corresponding roughly to the lexical, syntactic and semantic levels, plus a lower 'switch' box
Conceptual vs. implementation
Seeheim
 – arose out of implementation experience
 – but its principal contribution is conceptual
 – concepts are part of 'normal' UI language
   ... because of Seeheim ... we think differently!
e.g. the lower box, the switch
 – needed for implementation, but not conceptual
Semantic feedback
different kinds of feedback:
 – lexical – movement of mouse
 – syntactic – menu highlights
 – semantic – sum of numbers changes
semantic feedback often slower
 – use rapid lexical/syntactic feedback
but may need rapid semantic feedback
 – freehand drawing
 – highlight trash can or folder when file dragged
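As a small illustration of rapid semantic feedback (our own example, not from the book): the semantic check here, whether the text parses as a number, is recomputed on every keystroke and reflected immediately in the widget's colour.

  // Rapid semantic feedback sketch: the semantic check (does the text parse
  // as a number?) runs on every edit and is shown immediately in the view.
  import javax.swing.*;
  import javax.swing.event.DocumentEvent;
  import javax.swing.event.DocumentListener;
  import java.awt.Color;

  public class SemanticFeedbackDemo {
      public static void main(String[] args) {
          SwingUtilities.invokeLater(() -> {
              JTextField field = new JTextField(12);
              field.getDocument().addDocumentListener(new DocumentListener() {
                  private void check() {
                      boolean ok;
                      try { Double.parseDouble(field.getText().trim()); ok = true; }
                      catch (NumberFormatException ex) { ok = false; }
                      field.setBackground(ok ? Color.WHITE : Color.PINK);  // immediate semantic feedback
                  }
                  public void insertUpdate(DocumentEvent e) { check(); }
                  public void removeUpdate(DocumentEvent e) { check(); }
                  public void changedUpdate(DocumentEvent e) { check(); }
              });
              JFrame frame = new JFrame("Semantic feedback");
              frame.add(field);
              frame.pack();
              frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
              frame.setVisible(true);
          });
      }
  }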
What's this? (figure)
The bypass/switch
 – rapid semantic feedback
 – direct communication between application and presentation
 – but regulated by dialogue control
More layers! (figure)
Arch/Slinky
more layers! – distinguishes lexical/physical
like a 'slinky' spring
 – different layers may be thicker (more important) in different systems
 – or in different components
Monolithic vs. components
Seeheim has big components
 – often easier to use smaller ones
 – especially if using object-oriented toolkits
Smalltalk used MVC – model–view–controller
 – model – internal logical state of the component
 – view – how it is rendered on screen
 – controller – processes user input
MVC: model–view–controller
A concern not addressed by the Seeheim model is how to build large and complex interactive systems from smaller components. Object-based toolkits are amenable to such a building-block approach, and several conceptual architectures for interactive system development, such as the model–view–controller paradigm, have been proposed to take advantage of this.
MVC, first suggested in the Smalltalk programming environment, supports building new interactive systems based on existing ones. The link between application semantics and presentation can be built up in units by means of the MVC triad: the model represents the application semantics; the view manages the graphical and/or textual output of the application; and the controller manages the input.
Models are very general, as they must be able to portray any possible application semantics. A model can be associated with several MVC triads, so that the same piece of application semantics can be represented by different input–output techniques. Each view–controller pair is associated with only one model.
MVC: model–view–controller (figure)
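A minimal MVC sketch in Java (our own illustration, with a counter standing in for the application semantics): the model knows nothing about its views, the view re-renders when notified, and the controller maps user input onto model operations. Listener registration stands in for Smalltalk's change/update dependency mechanism.

  // Minimal MVC sketch: the model holds the application semantics (a counter),
  // the view renders it, and the controller turns user input into model updates.
  import java.util.ArrayList;
  import java.util.List;

  class CounterModel {
      private int value = 0;
      private final List<Runnable> listeners = new ArrayList<>();
      void addListener(Runnable r) { listeners.add(r); }
      int getValue() { return value; }
      void increment() { value++; listeners.forEach(Runnable::run); }  // notify views
  }

  class CounterView {
      private final CounterModel model;
      CounterView(CounterModel model) {
          this.model = model;
          model.addListener(this::render);     // re-render whenever the model changes
      }
      void render() { System.out.println("count = " + model.getValue()); }
  }

  class CounterController {
      private final CounterModel model;
      CounterController(CounterModel model) { this.model = model; }
      void onClick() { model.increment(); }    // user input mapped to a model operation
  }

  public class MvcDemo {
      public static void main(String[] args) {
          CounterModel model = new CounterModel();
          new CounterView(model);              // one of possibly several views on the same model
          CounterController controller = new CounterController(model);
          controller.onClick();                // simulated user input
          controller.onClick();
      }
  }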
MVC issues
MVC is largely a pipeline model:
  input → control → model → view → output
but in a graphical interface
 – input only has meaning in relation to output
e.g. mouse click
 – need to know what was clicked
 – controller has to decide what to do with the click
 – but the view knows what is shown where!
in practice the controller 'talks' to the view
 – separation not complete
PAC model: presentation–abstraction–control
PAC is closer to Seeheim
 – abstraction – logical state of the component
 – presentation – manages input and output
 – control – mediates between them
There are three important differences between PAC and MVC:
 – first, PAC groups input and output together, whereas MVC separates them
 – secondly, PAC provides an explicit component whose duty is to keep abstraction and presentation consistent with each other, whereas MVC does not assign this task to any one component, leaving it to the programmer/designer to decide where that chore resides
 – finally, PAC is not linked to any programming environment, though it is certainly conducive to an object-oriented approach; it is because of this difference that PAC could so easily isolate the control component
PAC is more of a conceptual architecture than MVC because it is less implementation dependent.
PAC model: presentation–abstraction–control (figure)
PAC: presentation–abstraction–control (figure: a hierarchy of PAC agents, each combining its own abstraction, presentation and control components)
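A minimal PAC sketch in Java (our own illustration): the presentation handles both input and output, the abstraction holds the logical state, and the control component is the only one that connects the two and keeps them consistent. All class names are ours.

  // Minimal PAC sketch: presentation never talks to abstraction directly;
  // control mediates and keeps the two consistent.
  class Abstraction {
      private String text = "";
      String getText() { return text; }
      void setText(String t) { text = t; }
  }

  class Presentation {
      private Control control;                     // input goes to control, never to abstraction
      void setControl(Control c) { control = c; }
      void userTyped(String input) { control.inputReceived(input); }   // simulated input event
      void display(String output) { System.out.println("display: " + output); }
  }

  class Control {
      private final Abstraction abstraction;
      private final Presentation presentation;
      Control(Abstraction a, Presentation p) {
          abstraction = a;
          presentation = p;
          p.setControl(this);
      }
      void inputReceived(String input) {
          abstraction.setText(input);                      // update logical state
          presentation.display(abstraction.getText());     // keep presentation consistent
      }
  }

  public class PacDemo {
      public static void main(String[] args) {
          Presentation p = new Presentation();
          new Control(new Abstraction(), p);
          p.userTyped("hello");                            // simulated user input
      }
  }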
Implementation of UIMS
It is important to determine how the components of a conceptual architecture can be realized. Implementations based on the Seeheim model must determine how the separate components of presentation, dialog controller and application interface are realized. The use of callback procedures in notification-based programming is one way to implement the application interface as a notifier. In MVC, callback procedures are also used for communication between a view or controller and its associated model. Myers has outlined the various implementation techniques used to specify the dialog controller separately. Some of the techniques that have been used for dialog modeling in UIMS are listed below.
Implementation of UIMS (continued)
Techniques for the dialogue controller:
 – menu networks: the communication between application and presentation is modeled as a network of menus and submenus. To control the dialog, the programmer simply encodes the levels of menus and the connections between one menu and the next submenu or an action. Links between menu items and the next displayed menu model the application's response to previous input.
 – grammar notations: the dialog between application and presentation can be treated as a grammar of actions and responses, and therefore described by a formal context-free grammar notation such as BNF (Backus–Naur form). These are good for describing command-based interfaces, but not so good for more graphically based interaction techniques.
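A sketch of the menu-network idea in Java (our own illustration): the dialog controller is reduced to a table mapping each menu item either to the next submenu or to an action. The '!' prefix marking actions is an arbitrary encoding for this sketch.

  // Menu-network sketch: the dialog is just data -- a table that maps each
  // (menu, item) pair to either the next submenu or an action to perform.
  import java.util.Map;

  public class MenuNetworkDemo {
      // Each menu maps an item label to the name of the next menu, or to an
      // action label prefixed with "!".
      static final Map<String, Map<String, String>> NETWORK = Map.of(
          "main", Map.of("File", "fileMenu", "Quit", "!quit"),
          "fileMenu", Map.of("Save", "!save", "Back", "main")
      );

      public static void main(String[] args) {
          String menu = "main";
          // A canned sequence of user selections stands in for real input.
          for (String choice : new String[] { "File", "Save" }) {
              String next = NETWORK.get(menu).get(choice);
              if (next.startsWith("!")) {
                  System.out.println("perform action: " + next.substring(1));
              } else {
                  menu = next;                     // descend into the submenu
                  System.out.println("show menu: " + menu);
              }
          }
      }
  }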
Implementation of UIMS (continued)
 – state transition diagrams: these can be used as a graphical means of expressing the dialog. The difficulty with these notations lies in linking dialog events with the corresponding presentation or application events; it is not clear how communication between application and presentation is represented.
 – event languages: these are similar to grammar notations, except that they can be modified to express directionality and to support some semantic feedback. Event languages are good for describing localized input–output behavior in terms of production rules: a production rule is activated when input is received and results in some output responses. This control of the input–output relationship comes at a price – it is now more difficult to model the overall flow of the dialog.
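A production-rule sketch in Java (our own illustration): each rule pairs a condition on the incoming event with an output response, which makes localized input–output behavior easy to state but leaves the overall dialog flow implicit. The event strings and rule contents are made up.

  // Event-language sketch: localized input-output behaviour expressed as
  // production rules; each rule fires when its condition matches the event.
  import java.util.List;
  import java.util.function.Predicate;

  public class ProductionRuleDemo {
      record Rule(Predicate<String> condition, Runnable response) {}

      static final List<Rule> RULES = List.of(
          new Rule(e -> e.equals("mouse-over-trash"), () -> System.out.println("highlight trash can")),
          new Rule(e -> e.equals("mouse-leave-trash"), () -> System.out.println("unhighlight trash can")),
          new Rule(e -> e.startsWith("key:"), () -> System.out.println("echo character"))
      );

      static void handle(String event) {
          for (Rule r : RULES) {
              if (r.condition().test(event)) r.response().run();   // rule activated by input
          }
      }

      public static void main(String[] args) {
          handle("mouse-over-trash");
          handle("key:a");
          handle("mouse-leave-trash");
      }
  }

Note that no single rule sees the overall flow of the dialog, which is exactly the trade-off described above.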
Implementation of UIMS (continued)
 – declarative languages: a declarative approach concentrates on describing how presentation and application are related. Declarative languages therefore describe what should result from the communication between application and presentation, not how it should happen in terms of event sequencing.
 – constraints: constraint systems are a special subset of declarative languages. Constraints can be used to make explicit the connection between otherwise independent information in the presentation and the application.
 – graphical specification: these techniques allow the dialog specification to be programmed graphically, in terms of the presentation language itself; this can be referred to as programming by demonstration.
N.B. constraints
 – instead of saying what happens, say what should be true
 – used in groupware as well as single-user interfaces (ALV – abstraction–link–view)
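A small constraint-style sketch in Java (our own illustration, not any particular UIMS): rather than scripting an event sequence, we declare that the displayed value should always equal a function of an application value, and the relation is re-established whenever that value changes. Cell and constrain are invented names.

  // Constraint sketch: declare *what should be true* (display == formula over
  // the model) and re-establish it automatically whenever the model changes.
  import java.util.ArrayList;
  import java.util.List;
  import java.util.function.Supplier;

  public class ConstraintDemo {
      static class Cell<T> {                        // an observable application value
          private T value;
          private final List<Runnable> watchers = new ArrayList<>();
          Cell(T initial) { value = initial; }
          T get() { return value; }
          void set(T v) { value = v; watchers.forEach(Runnable::run); }
          void onChange(Runnable r) { watchers.add(r); }
      }

      // Declares the constraint: whenever any source changes, recompute and display.
      static void constrain(Supplier<String> formula, Cell<?>... sources) {
          Runnable maintain = () -> System.out.println("display: " + formula.get());
          for (Cell<?> c : sources) c.onChange(maintain);
          maintain.run();                            // establish the constraint initially
      }

      public static void main(String[] args) {
          Cell<Double> celsius = new Cell<>(20.0);   // application-side value
          constrain(() -> celsius.get() * 9 / 5 + 32 + " F", celsius);
          celsius.set(100.0);                        // constraint re-satisfied automatically
      }
  }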
Summary
Levels of programming support tools
Windowing systems
 – device independence
 – multiple tasks
Paradigms for programming the application
 – read–evaluation loop
 – notification-based
Toolkits
 – programming interaction objects
UIMS
 – conceptual architectures for separation
 – techniques for expressing dialogue