How Technologically Literate are EMCC Students?

Presentation transcript:

How Technologically Literate are EMCC Students?

Technological Literacy: A Comparison of 2012 and 2015 Assessments (SAAC Report, 11/25/15)

Purpose and Methodology
- Spring 2012: Technological Literacy assessment rubric developed
- Designed to assess student technological literacy across the College
- Spring 2015: Reassessed with same rubric

Purpose and Methodology
Seven common "deliverable components" were assessed:
- Data management
- References
- Assignment content
- Communication
- Layout
- Embedded objects
- Conventions

What was Assessed?
- Data management: was the respondent's name properly displayed, was the course identified, and was the file saved in the correct format?
- References: were the appropriate number of references used, were all references listed in APA or MLA style, and were the references embedded in the document?
- Assignment content: was there evidence of effective research, did the student achieve the purpose of the assignment, and was the correct audience targeted?
- Communication: was the assignment successfully transmitted, and did the student provide evidence that the file could be retrieved?
- Layout: did headings/subheadings reflect a logical structure, was there a consistent visual theme, were appropriate font color and size used, and were animations used appropriately?
- Embedded objects: did the deliverable contain the following types of objects: pictures, video, audio files, animation, hyperlinks, and other (per instructor)?
- Conventions: were phrases of appropriate length for the medium, was punctuation appropriate, was grammar usage appropriate, and was spelling accurate?
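The transcript does not include the scoring rubric itself. As a rough illustration only, the Python sketch below shows one way the seven deliverable components could be represented and a per-student aggregate computed on the 4-point scale; the component list comes from the slide, while the `RubricScore` structure, the sample ratings, and the unweighted-mean aggregation are assumptions rather than EMCC's actual procedure.

```python
from dataclasses import dataclass
from statistics import mean

# The seven deliverable components named on the slide.
COMPONENTS = [
    "data_management", "references", "assignment_content",
    "communication", "layout", "embedded_objects", "conventions",
]

@dataclass
class RubricScore:
    """Hypothetical per-student record: one 1-4 rating per component."""
    student_id: str
    ratings: dict  # component name -> rating on the 4-point scale

    def aggregate(self) -> float:
        """Unweighted mean across the seven components (an assumed rule)."""
        return mean(self.ratings[c] for c in COMPONENTS)

# Fabricated example: strong overall, weaker on embedded objects and references.
example = RubricScore(
    student_id="S001",
    ratings={c: 4 for c in COMPONENTS} | {"embedded_objects": 2, "references": 3},
)
print(round(example.aggregate(), 2))  # 3.57
```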

Participant Information (2012 vs. 2015)
- Prefixes: 8 (2012); 7 (2015)
- Sections: 15 (2012); 14 (2015)
- Instructors: 11 (2012); 6 (2015)
- Students:
- Class level: 0 Dev Ed, 22% 100-level, 78% 200-level (2012); 1 Dev Ed, 28% 100-level, 68% 200-level (2015)
- Class standing: 19% sophomores, 42% new freshmen (2012); 47% sophomores, 20% new freshmen (2015)
- Lowest areas: Embedded Objects, References

Overall Proficiency (scored at 3 or 4)
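"Proficient" here means a component score of 3 or 4 on the 4-point rubric. A minimal sketch of how such a proficiency rate could be computed from per-student ratings, using made-up numbers rather than the committee's data:

```python
def proficiency_rate(ratings: list[int], cutoff: int = 3) -> float:
    """Share of ratings at or above the cutoff (3 or 4 on a 4-point rubric)."""
    return sum(r >= cutoff for r in ratings) / len(ratings)

# Fabricated ratings for a single component: 5 of 6 students score 3 or above.
print(proficiency_rate([4, 3, 2, 4, 4, 3]))  # 0.8333...
```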

Overall Differences in the Means
The aggregate score on the Technological Literacy assessment was high in both years:
- 2012: 3.80 (on a 4-point scale)*
- 2015: 3.60 (on a 4-point scale)*
* The difference between years is statistically significant.
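The slides report the 2012 vs. 2015 difference in aggregate means as statistically significant but do not name the test used. One common choice for comparing two cohort means is a two-sample (Welch's) t-test; the sketch below assumes that approach, and the score lists are placeholders, not the actual data:

```python
from scipy import stats

# Placeholder per-student aggregate scores; not the actual EMCC data.
scores_2012 = [3.9, 3.7, 3.8, 4.0, 3.6, 3.9, 3.8]
scores_2015 = [3.5, 3.7, 3.4, 3.8, 3.6, 3.5, 3.7]

# Welch's t-test: compares the two cohort means without assuming equal variances.
result = stats.ttest_ind(scores_2012, scores_2015, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

The committee's actual test, sample sizes, and significance threshold may of course differ from this sketch.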

Comparing New Freshmen to Sophomores: 2015 Results
Mean scores for sophomores were statistically significantly different from those of freshmen only in Data Management:
- Freshmen: 3.83 (on a 4-point scale)*
- Sophomores: 3.98 (on a 4-point scale)*
* The difference is statistically significant.
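For the freshman/sophomore comparison, the same kind of test can be run separately for each component, flagging only those where the difference clears a significance threshold (the slide reports that only Data Management did). This sketch uses invented ratings and an assumed 0.05 threshold, not the committee's analysis:

```python
from scipy import stats

# Invented 2015 per-component ratings for the two subgroups (illustration only).
freshmen = {
    "data_management": [4, 4, 3, 4, 4, 3, 4, 4],
    "references":      [3, 4, 3, 4, 3, 4, 3, 4],
}
sophomores = {
    "data_management": [4, 4, 4, 4, 4, 4, 4, 3],
    "references":      [4, 3, 4, 3, 4, 3, 4, 3],
}

# Run one test per component and flag those below the assumed 0.05 threshold.
for component in freshmen:
    t = stats.ttest_ind(freshmen[component], sophomores[component], equal_var=False)
    flag = "significant" if t.pvalue < 0.05 else "not significant"
    print(f"{component}: p = {t.pvalue:.3f} ({flag})")
```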

Analysis: Limitations
- The results cannot be extrapolated to represent the college population due to the small sample size.
- Scoring methodologies differed between 2012 and 2015: faculty did not identify which specific deliverable components were missing in 2012 (only whether the total number was sufficient). This was corrected in 2015, but it may compromise the ability to accurately compare results between the two assessments.
- Proportional changes between new freshmen and sophomores further compromise the ability to accurately compare results between the two years.
- With only one instructor responsible for almost half of all 2015 assessments, the final results cannot be taken as representative of the college at large.
- Anecdotally, three instructors (representing more than half the students sampled) indicated their grading rigor increased between 2012 and 2015.

Improvements for the Next Cycle
- Feedback from Leadership Council (11/25)
- Feedback from Classroom Conversations (12/2)
- Get more faculty involved
- Review the rigor of the assessment (is it too easy?)
- Inform faculty that references and embedded objects were the lowest areas of proficiency for two cycles in a row
- Work with Instructional Computing faculty members to create a tutorial video on how to embed objects in student work
- Work with the Writing Center to integrate appropriate reference formatting across the curriculum

Questions and Suggestions
Thanks for your input!