How Technologically Literate are EMCC Students?

Technological Literacy: A Comparison of the 2012 and 2015 Assessments
SAAC Report, 11/25/15

Purpose and Methodology

- Fall 2011: Technological Literacy assessment rubric developed by SAAC, designed to assess student technological literacy across the College
- Spring 2012: Technological Literacy assessment implemented
- Spring 2015: Reassessed with the same rubric

Purpose and Methodology

Seven common "deliverable components" were assessed:
- Data management
- References
- Assignment content
- Communication
- Layout
- Embedded objects
- Conventions

What Was Assessed?

- Data management: was the respondent's name properly displayed, was the course identified, and was the file saved in the correct format?
- References: were the appropriate number of references used, were all references listed in APA or MLA style, and were the references embedded in the document?
- Assignment content: was there evidence of effective research, did the student achieve the purpose of the assignment, and was the correct audience targeted?
- Communication: was the assignment successfully transmitted, and did the student provide evidence supporting the retrievability of the file?
- Layout: did headings/subheadings reflect a logical structure, was there a consistent visual theme, were appropriate font color and size used, and were animations used appropriately?
- Embedded objects: did the deliverable contain the following types of objects: pictures, video, audio files, animation, hyperlinks, and other (per instructor)?
- Conventions: were phrases of appropriate length for the medium, was punctuation appropriate, was grammar usage appropriate, and was spelling accurate?

2012-2015 Participant Information

                2012                                      2015
Prefixes        8                                         7
Sections        15                                        14
Instructors     11                                        6
Students        237                                       234
Class level     0 Dev Ed; 22% 100-level; 78% 200-level    1 Dev Ed; 28% 100-level; 68% 200-level
Sophomores      19%                                       47%
New freshmen    42%                                       20%

Lowest areas: Embedded Objects, References

Overall Proficiency (scored at 3 or 4)

Overall Differences in the Means

The aggregate score on the Technological Literacy assessment was high in both years:
- 2012: 3.80 (on a 4-point scale)*
- 2015: 3.60 (on a 4-point scale)*

* The difference between the two years is statistically significant.
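The report states that the 2012-2015 difference in mean rubric scores is statistically significant, but the transcript does not name the test used. As an illustration only, the sketch below runs a Welch's two-sample t-test, one standard choice for comparing means of independent groups; the score lists are invented for demonstration and are not the actual assessment data.

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    # Sample variances (Bessel's correction)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical 4-point rubric scores, NOT the real 2012/2015 data.
scores_2012 = [4, 4, 4, 3, 4, 4, 4, 4]
scores_2015 = [4, 3, 4, 3, 4, 3, 4, 4]

t = welch_t(scores_2012, scores_2015)
```

With real data one would compare `t` against the t distribution (or use `scipy.stats.ttest_ind(..., equal_var=False)`) to obtain a p-value; significance at a 4-point ceiling also depends heavily on sample size, which is why the limitations slide below matters.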

Comparing New Freshmen to Sophomores: 2015 Results

Mean scores for sophomores differed from freshmen at a statistically significant level only in Data Management:
- Freshmen: 3.83 (on a 4-point scale)*
- Sophomores: 3.98 (on a 4-point scale)*

* The difference is statistically significant.

Analysis: Limitations

- The results cannot be extrapolated to represent the college population due to the small sample size.
- Scoring methodologies differed between 2012 and 2015: in 2012, faculty did not identify which specific deliverable components were missing (only whether the total number was sufficient). This was corrected in 2015 but may compromise the ability to accurately compare results between the two assessments.
- Proportional changes between new freshmen and sophomores further compromise the ability to accurately compare results between the two years.
- With only one instructor responsible for almost half of all 2015 assessments, the final results cannot be taken as representative of the college at large.
- Anecdotally, three instructors (representing more than half of the students sampled) indicated that their grading rigor increased between 2012 and 2015.

Questions and Suggestions

- Developmental history of students using technology: is PowerPoint really an appropriate assessment platform? Does PowerPoint reflect what is really needed for technological literacy? What about the LMS?
- Workforce reality: PowerPoint is widespread.
- Importance of establishing a baseline assessment of tech knowledge coming in, as well as going out.
- We are working on the assumption that students have a basic level of tech understanding (e.g., Canvas). The eLearning advisory committee is working on creating a course for students to become literate in Canvas.
- Could international technological standards be used for future reference?
- Challenge of a new generation with reading issues.

Questions and Suggestions

- Our students are hitting the ceiling: are the standards high enough? Gear the assessment to higher industry standards.
- Importance of transference of skills.
- Fall 2017: SAAC needs to take a deep look at appropriate assessment, industry standards, and international standards, and incorporate 21st-century technology skills.
- Can we take a look at where and when CIS105 is offered? What about involving them in the assessment? Future discussion of CIS105 involvement.
- Importance of faculty collaboration: use team leaders, cross-curricular heterogeneity, etc.
- What about students taking a tech literacy survey? Eliminate variables and bias: some type of standardized test in which each class could participate. Open-source assessments?
- Continue the discussion of certain curricula/areas having primary responsibility for assessment in their acknowledged discipline.

Questions and Suggestions

Ideas for increasing faculty involvement:
- A letter from the president.
- More interaction and an easier pathway to integration into what is already an overloaded curriculum.
- Department-specific ideas for each assessment.
- Establish workshops for faculty to work on future assessments.
- Possibly pare down assessments to be discipline-specific.
- For establishing criteria, ASU requirements may be used for developing gen ed assessments.
- Search the class schedule sorted by Gen Ed designation; see also the district curriculum site.

Suggestions from the SAAC and Leadership Council 11/26 Meeting

- Get feedback from Classroom Conversations (12/2).
- Need to get more faculty involved.
- Review the rigor of the assessment (is it too easy?).
- Inform faculty that references and embedded objects were the lowest areas of proficiency for two cycles in a row. Do we enable sloppy behavior?
- Work with Instructional Computing faculty members to create a tutorial video on how to embed objects into student work.
- Work with the Writing Center to integrate appropriate reference formats across the curriculum. More of an interactive tutorial? A separate assignment for students before the tech literacy assessment? Have the Writing Center remind faculty every semester that reference tutorials are available. Include PowerPoint citation guidelines, with a video specific to this.
- Continue having SAAC co-chairs visit divisional meetings to encourage participation.