Beth Rubin, School for New Learning


Community of Inquiry and the Effects of Technology on Online Teaching and Learning Beth Rubin, School for New Learning; Ron Fernandes, School of Public Service; Maria Avgerinou, School of Education. DePaul University, Chicago. AERA, Vancouver, Canada, April 16, 2012

Overview Mixed-method study. Extends research on the Community of Inquiry (CoI) model to include the effects of Learning Management System (LMS) affordances. Proposes a model to explain the effects of LMS technology and the Community of Inquiry on course satisfaction. Generates and tests two hypotheses from the model.

Community of Inquiry A constructivist model (Garrison, Anderson & Archer, 2000). In the CoI diagram, social presence, cognitive presence, and teaching presence (structure/process) overlap; their pairwise intersections are supporting discourse, setting climate, and selecting content, and the educational experience lies at the center. Cognitive presence is “the extent to which learners are able to construct and confirm meaning through reflection and discourse” (Boston et al., 2009, p. 69). Social presence is “the ability of learners to project themselves socially and emotionally as well as their ability to perceive other learners as ‘real people’” (p. 68). Teaching presence is “the design, facilitation and direction of cognitive and social processes for the purpose of realizing personally meaningful and educationally worthwhile learning outcomes” (Anderson, Rourke, Garrison, & Archer, 2001, p. 5). Social presence is associated with student satisfaction, engagement, course completion, and actual learning (Swan & Shih, 2005); less research exists on cognitive and teaching presence, but some. Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. Internet and Higher Education, 2, 87-105. http://communitiesofinquiry.com/papers

Mediated by Technology ALL instruction in fully online courses is mediated by technology. All of it. So let's consider: what do people do with the teaching technology?

Role of LMS Most online courses are taught in a Learning Management System (LMS); examples include Blackboard, Desire2Learn, and Moodle. In the LMS, coursework is organized and paced, learning resources are accessed, student work is collected and returned, communication occurs (among students, and between students and the instructor), and feedback is delivered (Lohr, 2000). Communication can be asynchronous (discussion, email, wiki) or synchronous (chat, webinar, etc.), if the system allows chat, wiki, blog, voice thread, and so on. What does the system allow you to do? This leads us to the concept of affordances.

Affordances Definition: what a tool lets you do (Norman, 1999; Bower, 2008; Siemens & Tittenberger, 2009). All tools have affordances: physical tools, virtual tools, all tools. A chair lets you sit or stand on it. Roll it across the room? Move boxes on it? Rock backwards on it?

What can you do with a door? A door lets you open or close it. Keep it ajar; wedge it open. Lock it?


Slide it. Look through it. Hang plants on it. It all depends on how the door is designed.

LMS Affordances Actions enabled by learning technologies (Norman, 1999; Bower, 2008). Some types: course integration, instructor feedback, communication, and overall ease of use. What does an LMS let you do? An LMS that allows all the materials needed in one week to be visually grouped on a single page, with contiguous placement of all learning elements, makes it easier for students to find the materials (Clark & Mayer, 2008; Mayer, 2005; Vicario, 1998). LMS designs that allow extensive interlinking with “smart links” that copy from term to term, “next” and “back” buttons, and checklists with links to all learning elements support integration of the course learning elements and let students find tools when they need them. How easy is it to do these things? How many clicks does it take? How long do you have to wait? How easy is it to figure out?

An Effective LMS… supports active engagement and meaningful connections between segments of a course, facilitates easy communication, and facilitates formative feedback. Actions that are made easy by the system are more likely to occur, while those that present barriers are less likely to. The last point is our premise: the LMS matters because of the connections between ideas and people that it enables, or doesn't. This position is not universally supported. Clark (1983, 1994) argued that media never influence learning: “Teaching technology is like a truck that takes your vegetables to market; any truck will do, you care about the vegetables.” Others disagreed, notably Kozma (1991, 1994) and Swan (2005). We think that is wrong; the technology matters. To push the metaphor, a refrigerated truck will keep your milk a lot more effectively.

Model

Hypotheses Hypothesis 1: LMS affordances as perceived by students (ease of finding resources, ease of communication, and overall ease of using the system) are positively associated with cognitive, social, and teaching presence. Hypothesis 2: Controlling for cognitive, social, and teaching presence, satisfaction with an LMS is positively associated with satisfaction with the online course supported by that LMS.

Research Methods Mixed-method comparative study. We identified two LMSs with different affordances for structuring course materials, supporting automated communication, and providing student feedback: Blackboard and Desire2Learn (D2L). Focus on: tools to integrate course elements (durable links, forward/back buttons, grouping all tools used in a week together); the number and ease of communication tools (places to email, availability of automated email, ease of the interface); and the interface for providing and accessing student feedback (formative feedback on discussions, ease of reviewing student posts, closed- and open-ended feedback).

Research Methods-Phase 1 Setting: a large private Midwestern university. A set of existing, pre-designed online courses that had previously been taught in Blackboard was copied into D2L. All course materials, including readings, assignments, discussions, syllabi, and other aspects, were copied exactly, although tools available to interconnect elements of the courses (e.g., checklists, internal links) were used, and automated emails were enabled when requested by faculty.

Research Methods-Phase 1 (Cont.) Twelve pilot faculty from five schools volunteered to participate: ten taught their courses in Blackboard and six (trained) taught in D2L; four did both. Faculty participated in a semi-structured interview after the course ended to describe their view of the LMS affordances. Surveys were administered to students and faculty via an online survey tool, using the Community of Inquiry (CoI) survey (Arbaugh et al., 2008; Swan, Richardson, Ice, Garrison, Cleveland-Innes & Arbaugh, 2008) to measure social, teaching, and cognitive presence; this tool has been repeatedly validated in many settings (Shea & Bidjerano, 2009; Diaz et al., 2010). Other items were added to measure satisfaction with the course and the LMS and perceived affordances (ease of finding materials: “Once I got used to it, it was easy to find what I needed in this course management system”; ease of communication: “The tools in this course management system made it easy to communicate with others in the course”; and overall ease of use: “The course management system was easy to use”).

Research Methods-Phase 2 Extended the study to collect survey data from faculty and students in a broader range of online courses in Blackboard; data collection in D2L continues. We aggregated the student survey data for analysis across both phases. As the faculty data are still being analyzed, this paper presents only student survey data.

Participants 605 adult students: 127 graduate and 478 undergraduate, in fully online courses in five different schools (Business, Education, Public Administration, Computer Science, and interdisciplinary fields). 108 sections of 43 unique courses (12 in D2L), taught over seven terms. Courses were pre-set and standardized across faculty and terms.

Descriptive Statistics Nearly two-thirds (64%) were female. Most had prior online experience: 20% were in their first online course, 21% had taken ten or more, and the rest were distributed in between. Ages ranged from 19 to 69, with an average of 34.9. Most were in Blackboard (76.7%); the rest were in D2L (23.3%).

Methods Factor analysis of the student CoI data confirmed the expected factor loadings. We created separate scales of teaching presence (TP), social presence (SP), and cognitive presence (CP) scores for each respondent, then combined them into a single Community of Inquiry score per student.
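The scale construction on this slide can be sketched in a few lines. This is a minimal sketch under assumptions: the item groupings and the equal-weight averaging of the three subscales are illustrative, since the slides do not say how items were combined (the actual instrument is described in Arbaugh et al., 2008).

```python
import numpy as np

# Illustrative column layout: each row is one student's Likert responses,
# grouped by subscale. The real CoI survey's item counts and order differ;
# these slices are assumptions for the sketch.
TP_COLS = slice(0, 3)   # teaching presence items
SP_COLS = slice(3, 6)   # social presence items
CP_COLS = slice(6, 9)   # cognitive presence items

def build_coi_scores(responses: np.ndarray) -> dict:
    """Average each presence subscale per respondent, then combine the
    three scales into a single Community of Inquiry score per student."""
    tp = responses[:, TP_COLS].mean(axis=1)
    sp = responses[:, SP_COLS].mean(axis=1)
    cp = responses[:, CP_COLS].mean(axis=1)
    coi = (tp + sp + cp) / 3.0  # assumed equal weighting of the scales
    return {"TP": tp, "SP": sp, "CP": cp, "CoI": coi}
```

Averaging items into subscale scores is the common practice once a factor analysis has confirmed the loadings, as reported on this slide.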

Methods (Cont.) For the first part of the model (Hypothesis 1), we regressed student perceptions of the CoI on the LMS affordances and student satisfaction with the faculty, controlling for faculty age, sex, and number of online courses taught; student age, sex, and number of online courses taken; and the LMS. For the second part (Hypothesis 2), we regressed student satisfaction with the online course (the dependent variable) on TP, SP, CP, and student-reported satisfaction with the LMS, controlling for student age, sex, and number of prior online courses.
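Both regressions are ordinary least squares with control variables. A minimal sketch of that estimation step, assuming simple numeric coding of all variables (the function and variable layout are illustrative, not the authors' actual analysis code):

```python
import numpy as np

def ols_with_controls(y: np.ndarray, predictors: np.ndarray,
                      controls: np.ndarray) -> np.ndarray:
    """Regress y on the predictors of interest plus control variables.

    Returns the coefficient vector ordered as
    [intercept, predictor coefficients..., control coefficients...].
    """
    # Design matrix: intercept column, then predictors, then controls
    X = np.column_stack([np.ones(len(y)), predictors, controls])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

For Hypothesis 2, for example, y would be course satisfaction, the predictors would be the TP, SP, and CP scales plus LMS satisfaction, and the controls would be student age, sex, and number of prior online courses.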

Results The sample was reduced because we included only data where the faculty member also completed the survey, to obtain faculty information (age, gender, number of courses taught online, etc.). With all other factors accounted for, including satisfaction with the faculty, ease of communication significantly predicted student CoI, and ease of finding needed resources was a marginally significant predictor. Overall ease of use did not predict CoI.

Results With all else held constant, and all three presences entered, satisfaction with the LMS significantly predicted course satisfaction.

Discussion Hypothesis 1: Two affordances had a positive effect on student perceptions of the creation of a community of inquiry in online courses; the model was statistically significant, with an adjusted R2 of 0.57. Hypothesis 2: Satisfaction with the LMS had a significant independent effect on satisfaction with the online course, after controlling for the three Community of Inquiry presences; the overall model was statistically significant, also with an adjusted R2 of 0.57.
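For reference, adjusted R2 penalizes raw R2 for the number of predictors; the standard formula, as a small helper (not the authors' code):

```python
def adjusted_r2(r2: float, n: int, p: int) -> float:
    """Adjusted R-squared for n observations and p predictors
    (the intercept is not counted in p)."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
```

With several hundred respondents and a handful of predictors, adjusted R2 is close to raw R2, so 0.57 indicates the models explain a substantial share of variance in both outcomes.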

Conclusion Technology used in online courses affects student outcomes. Students care about the LMS, in addition to teaching presence, cognitive presence, and social presence. However, overall ease of use was not significant, while “Reading All” had a highly significant effect on CoI.

Future Research Consider faculty perceptions and behavior as well as students': if the LMS affects students, does it also affect faculty perceptions and behaviors, such as their view of the LMS and the CoI, and their giving of feedback? Compare the same class, with the same faculty, across LMSs, isolating the LMS in a sample large enough for statistical significance. Examine “ease of use” at a larger scale.

References Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conference environment. Journal of Asynchronous Learning Networks, 5(2), 1-17. Boston, W., Diaz, S. R., Gibson, A. M., Ice, P., Richardson, J., & Swan, K. (2009). An exploration of the relationship between indicators of the Community of Inquiry framework and retention in online programs. Journal of Asynchronous Learning Networks, 13(3), 67-83. Bower, M. (2008). Affordance analysis: Matching learning tasks with learning technologies. Educational Media International, 45(1), 3-15. Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-459. Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(1), 21-29. Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. Internet and Higher Education, 13(1-2), 5-9. Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. Internet and Higher Education, 2, 87-105. Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133-148. Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. Internet and Higher Education, 13(1-2), 31-36. Kozma, R. B. (1991). Learning with media. Review of Educational Research, 61(2), 179-211.

References Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42(2), 7-19. Lohr, L. L. (2000). Designing the instructional interface. Computers in Human Behavior, 16, 161-182. Mayer, R. E. (2005). Principles for reducing extraneous processing in multimedia learning: Coherence, signaling, redundancy, spatial contiguity, and temporal contiguity. In R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (pp. 183-200). New York, NY: Cambridge University Press. Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218. Norman, D. A. (1999). Affordance, conventions, and design. Interactions, 6(3), 38-43. Richardson, J., & Swan, K. (2003). Examining social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88. Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A review of the literature. Journal of Distance Education, 23(1), 19-48. Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119-144. Swan, K. (2005). Interface matters: What research says about the mediating effects of course interfaces. Paper presented at the 20th Annual Conference on Distance Teaching and Learning, Madison, Wisconsin. Retrieved January 3, 2012 from http://www.uwex.edu/disted/conference/Resource_library/proceedings/04_1076.pdf. Swan, K., Richardson, J. C., Ice, P., Garrison, D. R., Cleveland-Innes, M., & Arbaugh, J. B. (2008). Validating a measurement tool of presence in online Communities of Inquiry. E-mentor, 2.

Q&A Contact Information Beth: brubin@depaul.edu Ron: rfernan7@depaul.edu Maria: mavgerin@depaul.edu