Evaluation of two audience response systems in the college classroom
Stephen L. Firsing III, PhD, MPA, MA; John Delport, PhD; Fredanna A.D. M’Cormack McGough, PhD, RD; John Yannessa, PhD; Mariel Celina Po, BS; Kaitlyn Brown, Student of Public Health

Let’s try out the TurningPoint Audience Response System!

What is your favorite superhero?
A. Superman  B. Batman  C. Iron Man  D. Captain America  E. Wonder Woman

If you had to be some kind of animal, what kind of animal would you be?
A. Eagle  B. Shark  C. Cheetah  D. Dolphin  E. Turtle

Now let’s try out Poll Everywhere!

Why use audience response systems in the classroom?
1. Immediate feedback
2. Formative assessment
3. Students compare responses

What was the Purpose of this Research Project? The purpose of this research project was to evaluate the adequacy of two popular clicker systems in the Coastal Carolina University classroom: TurningPoint and Poll Everywhere.

What are the major differences between the clicker systems? TurningPoint (paid; dedicated RF clicker) vs. Poll Everywhere (free for audiences up to 40; responses submitted online or by text)

What was the Research Methodology?
R = 13 Public Health courses (3P) + 4 Special Education courses (1P)
Xa = TurningPoint; Xb = Poll Everywhere; O = observation point
Design: R  Xa  O1  Xb  O2  O3
Timeline: Xa from the start of the semester, O1 at mid-semester, Xb through the second half, O2 and O3 at the end of the semester (Fall 2014 and Spring 2015)

What were the Outcomes for Observation #1 (TurningPoint)?
Quantitative = Audience Response Technology Questionnaire, or ART-Q (25 items): Appraisal/Learning, Negative Grade Impact, Attendance, Enjoyment, Preparation/Motivation, Ease of Use, Instructor Performance
Qualitative = Overall likes & dislikes (2 items)

What were the Outcomes for Observation #2 (Poll Everywhere)?
Quantitative = Audience Response Technology Questionnaire, or ART-Q (25 items): Appraisal/Learning, Negative Grade Impact, Attendance, Enjoyment, Preparation/Motivation, Ease of Use, Instructor Performance
Qualitative = Overall likes & dislikes (2 items)

What were the Outcomes for Observation #3 (Comparison)?
Quantitative = 5 ARS comparison items (response options: TurningPoint, Poll Everywhere, Neither)
- Was easier for you to use?
- Engaged you more in the course material?
- Was easier for your instructor to use?
- Was more reliable or had fewer technical problems?
- Did you prefer using overall?
Qualitative = General comments about the ARS systems and about the use of ARS (of any kind) in the classroom (2 items)

Study Participants = Undergraduate Students
TurningPoint (O1): N = 306
Poll Everywhere (O2): N = 317
Comparison survey (O3): N = 323

Appraisal/Learning Results (O1 vs. O2)
ART-Q items (SA = 5, SD = 1):
1. Because of the ARS system, I had a better idea of what to expect on exams, quizzes, or assignments (p = .057)
2. I believe that I knew more about what would be emphasized on exams, quizzes, or assignments because of the ARS system (p = .244)
3. Using the ARS system gave me a preview of what I needed to know for exams, quizzes, or assignments (p = .130)
Note: p values calculated by paired-samples t-test; means differ significantly at p < .05
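
A minimal sketch of that paired-samples t-test in Python (not the authors’ code; the arrays below are hypothetical 5-point Likert ratings for one ART-Q item, not study data):

import numpy as np
from scipy import stats

# Hypothetical ratings (SA = 5 ... SD = 1) from the same students for one
# ART-Q item under each system; a paired test because each student rated both.
turningpoint = np.array([4, 5, 3, 4, 4, 5, 2, 4])
poll_everywhere = np.array([3, 4, 3, 4, 3, 4, 2, 3])

t_stat, p_value = stats.ttest_rel(turningpoint, poll_everywhere)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# Per the slide's criterion, the item means differ significantly when p < .05.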

Negative Grade Impact Results (O1 vs. O2)
ART-Q items (SA = 5, SD = 1):
4. The ARS system interfered with my getting a good grade (p = .259)
5. Because we used the ARS system, I expect to get a lower grade than I would have otherwise (p = .026)
6. Using the ARS system negatively impacted my grade (p = .083)

Attendance Results (O1 vs. O2)
ART-Q items (SA = 5, SD = 1):
7. Because the ARS system was used, I attended class more regularly than I would have otherwise (p = .561)
8. Using the ARS system increased my likelihood of attending class on time (p = .969)
9. The ARS system motivated me to attend class on time (p = .750)

Enjoyment Results (O1 vs. O2)
ART-Q items (SA = 5, SD = 1):
10. Using the ARS system was fun (p = .000)
11. It was exciting to answer questions using the ARS system (p = .000)
12. I enjoyed using the ARS system (p = .000)
13. I did not like using the ARS system (p = .001)
14. I had a good experience with the ARS system (p = .001)

Preparation/Motivation Results (O1 vs. O2)
ART-Q items (SA = 5, SD = 1):
15. Using the ARS system made me more likely to review my notes prior to class (p = .177)
16. Using the ARS system encouraged me to do readings prior to class (p = .967)
17. Because we used the ARS system, I prepared for class more than I would have otherwise (p = .252)
18. The ARS system boosted my enthusiasm for studying the material we learned in the course (p = .259)

Ease of Use Results (O1 vs. O2)
ART-Q items (SA = 5, SD = 1):
19. Using the ARS system was easy (p = .000)
20. I had no problems using the ARS system (p = .000)
21. Using the ARS system was pretty hard (p = .000)

Instructor Performance Results (O1 vs. O2)
ART-Q items (SA = 5, SD = 1):
22. The ARS questions used by the instructor were easy (p = .543)
23. The ARS questions used by the instructor were clear and understandable (p = .015)
24. The ARS questions used by the instructor were asked at the appropriate times during class (p = .018)
25. Overall, I feel the ARS system improved the instructor’s ability to teach (p = .120)

Overall Mean Scores Results (O1 vs. O2)
The TurningPoint (O1) and Poll Everywhere (O2) overall ART-Q mean scores were significantly different (p = .021)
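
A sketch of how that overall comparison could be computed, assuming (for illustration only; the slides do not specify the scoring procedure) that each student’s overall ART-Q score is the mean of their 25 item ratings, with negatively worded items such as 4-6, 13, and 21 reverse-scored:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical ratings: 306 students x 25 ART-Q items, scored 1-5.
tp = rng.integers(1, 6, size=(306, 25))
pe = rng.integers(1, 6, size=(306, 25))

# Assumed reverse-scored items (0-based indices for items 4-6, 13, and 21).
negative = [3, 4, 5, 12, 20]
tp[:, negative] = 6 - tp[:, negative]
pe[:, negative] = 6 - pe[:, negative]

tp_overall = tp.mean(axis=1)  # one overall ART-Q score per student
pe_overall = pe.mean(axis=1)

t_stat, p_value = stats.ttest_rel(tp_overall, pe_overall)
print(f"TP = {tp_overall.mean():.3f}, PE = {pe_overall.mean():.3f}, p = {p_value:.3f}")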

Comparison Results (O3)
Of the two audience response systems used in this class, which system…
                                                        TurningPoint   Poll Everywhere   Neither
28. Was easier for you to use?                              57.4%           39.8%           2.8%
29. Engaged you more in the course material?                53.0%           39.8%           7.2%
30. Was easier for your instructor to use?                  45.7%           46.3%           8.0%
31. Was more reliable or had fewer technical problems?      43.8%           44.1%          12.2%
32. Did you prefer using overall?                           54.3%           41.6%           4.0%

Qualitative Results: Dislikes about Poll Everywhere
Did not like texting long codes (28)
“Too many numbers to text, shorten numbers”
“The codes to enter were too long. Maybe if it were like one letter it would have been better”
“It confused some students about which numbers to text”
“I have a flip phone – too challenging”
“Too much to keep texting numbers over and over instead of one click”
“Not as fast as using the clickers”

Qualitative Results: Dislikes about Poll Everywhere
Did not like the technical issues/problems (24)
“Does not work on my phone; message block”
“My texts never go through”
“Sometimes my answers wouldn’t submit”
“Technical difficulties”
“Tech problems!!!”
“Sometimes the texts don’t go through”
“Sometimes technology is slow, doesn’t work”

Qualitative Results: Dislikes about Poll Everywhere
Did not like using cell phones/texting in class (19)
“Having to use my phone for it”
“If your cell phone is dead/has no service then you can’t use it”
“Incorporating cell phones into the classroom seems inherently distracting”
“Hard to stay focused with my phone in front of me”
“Having to use my cell phone. It was so tedious”
“Might not always have cell phone or data service to use it”

Qualitative Results: Dislikes about Poll Everywhere
Did not like that responses were shown in real time (7)
“Everyone went with the majority answer”
“No challenge because it shows the answers selected by others as they answer”
“You could see everyone’s response as they were going – could influence vote”
“You could see the most popular answers before you vote”

Qualitative Results: Dislikes about TurningPoint
Did not like the technical issues/problems (41)
“It did not work any time we used it”
“It wasn’t 100% reliable”
“It has glitches a lot of the time”
“Sometimes it worked and sometimes it didn’t”
“A lot of technical problems, wish we were able to use it more”

Qualitative Results: Likes about Poll Everywhere
Students liked that Poll Everywhere was:
- Engaging/interactive (37)
- Simple to use (36)
- A good way to review material (35)
- Anonymous (17)
- Usable on their cell phones (11)

Qualitative Results: Likes about TurningPoint
Students liked that TurningPoint was:
- Engaging/interactive (67)
- Good for reviewing course material (57)
- Simple to use (42)
- Anonymous (26)

Conclusion Findings indicate, both quantitatively and qualitatively, that students at Coastal Carolina University favor using the TurningPoint system over the Poll Everywhere system in the classroom.