Informed Discussion in Information Technology Survey Courses
Amber Settle, CTI, DePaul University
Joint work with André Berthiaume, Evelyn Lulis, and Abdulrahman Mirza
ISECON, November 7, 2003
Outline
– Introduction
– Informed discussion
  – Definition and motivation
– Courses
– Debate structure and topics
– Discussion formats and topics
– Conclusions
  – General
  – Graduate versus undergraduate
– Course evaluations
– Future work
Introduction
– Approach: introduce ethical, legal, and social topics into existing technical survey courses (Cohen and Cornwall 1989)
– Previous work: highly structured debates requiring prior research to promote interactive learning (Settle and Berthiaume 2002)
– Contribution: a re-examination of the debates
  – Less structure versus more structure
  – Undergraduate versus graduate
  – Course evaluations
Informed discussion
– Informed discussion: informal debates based on significant prior research
– Motivation
  – Formal debates are effective for graduate courses
    – Well structured
    – Good argumentation
  – Undergraduates are stifled by the formality
    – Lack of spontaneity
    – Poor involvement
The courses
– Undergraduate
  – ECT 250: Survey of e-commerce technology
    – ECT, IS, and NT B.S. degree programs
    – Dual purpose: survey of e-commerce technology and preparation for client-side Web application development
  – CSC 200: Survey of computing technology
    – CS B.S. degree program (during the relevant time period)
    – Strictly an orientation for majors
– Graduate
  – DS 420: Foundations of distributed systems
    – ECT M.S. degree program
    – Foundational issues in building distributed systems
– Similarities: technological survey courses
– Differences: level and maturity of students
Debate structure
– Positions: pro and con
  – Example: offensive Web content
    – Pro: content must be controlled to protect minors
    – Con: Web content is protected speech
– Preparation: a written research summary
  – Context for the debate
  – A summary of the position taken
  – A list of references with short quotes
– Debates took place during special class sessions
– Grades were based on the research summary and debate performance for participants, or on exam questions for non-participants
In-class debates
– Pro's opening statement (5 minutes): gives context and states the position
– Con's cross-examination (3 minutes): rebuttal of the pro's position
– Con's position statement (4 minutes): statement of the position
– Pro's cross-examination (3 minutes): response to the con's statements
– Comments/questions (8 minutes): assigned interrogators, audience, instructor
– Closing statements (2 minutes each): the points of each side are recapped
Debate topics
– Offensive Web content: controlling content viewing
– Copyrighting digital media: the Napster case and others
– The U.S. government vs. the Microsoft Corporation
– Legal issues in e-commerce: digital signatures
– The Sklyarov case and code breaking in general
– U.S. draft bill: government-imposed software security
– The French government versus Yahoo!
– Virtual child pornography
– Internet taxation
– The Americans with Disabilities Act and Southwest Airlines
– The Berman bill
– The Verizon case and the DMCA
Discussion formats
– Twenty minutes of each class session
– Current events (Fall 2001)
  – Terrorism and technology
  – Articles brought to class
– Course topics (Winter/Spring 2002)
  – Based on weekly course topics
  – Articles brought to class
– Informal debates (Fall 2002)
  – Controversial topics in IT
  – Material brought to class
  – Students prepared to defend either side
Discussion topics
– Technology and terrorism, including privacy rights
– Monitoring Web content
– Credit card transactions online
– The ethics of data gathering/sharing on commercial sites
– The feasibility of electronic voting
General conclusions
– Best response:
  – Current events
  – Practical topics
  – Ethical considerations
  – Material integrated into the course
  – "Warm-up" exercises provided
– Worst response:
  – Topics taken strictly from lecture
  – Material separated from the course
– Variation between graduate and undergraduate students
Undergraduate courses
– Data from seven quarters of classes (Spring/Fall 2001, Winter/Spring/Fall 2002, Winter/Spring 2003)
– Positives:
  – Enthusiastic students
  – Better engagement in the course
– Negatives:
  – Unstructured responses
  – Little analysis of material
  – Inability to draw connections
Graduate course
– Data from three quarters of classes (Winter/Spring/Fall 2002)
– Results:
  – Evaluated technical aspects well
  – Very articulate and objective
  – Good audience participation
Course evaluations
– Conducted in CTI every quarter for every class
– Evaluations are:
  – Mandatory, strictly anonymous, and completed online
  – Completed during the last two weeks of the quarter
  – Twenty-two multiple-choice questions plus comments
  – Used to rate aspects of the course and the instructor
– For this work, data from six questions was used; the questions concerned:
  – Overall estimate of the course
  – Technical development
  – Interest in the course
  – Relationship to other fields
  – Participation in the course
Impact on evaluations
– Data were not compared between instructors
– Current events integrated into the course improve:
  – Satisfaction with participation
  – Satisfaction with the instructor's motivation of the material
  – The perceived technical merit of the course
– The increase was dramatic in some cases (15%)
– Comparisons between the different informal discussion formats were inconclusive:
  – Taught by an instructor new to CTI
  – Population shifts over time
  – The purpose of the course changed
Future work
– Continue work in the three courses
– Gather more evaluation data
– Integrate debates into other CTI courses
– Liberal Studies (general education) courses:
  – CSC 200 (new incarnation)
  – Capstone courses
  – Possible new course: a sophomore seminar on multiculturalism (women in IT, the Digital Divide, etc.)
– Investigate the use of debates across the university as a whole