We Need Your Help

We Need Your Help

What we need you to do for us: If we build it, will you use it? Be willing to test-drive the user interface and provide feedback. Help seed the database with assessments and data from your course(s).

What is the FIRST Database?

Faculty Institutes for Reforming Science Teaching (FIRST) has engaged faculty from more than 50 institutions in professional development focused on active, inquiry-based teaching designed to improve student learning. We are constructing a database to support research on undergraduate STEM education. The database will support storing, searching, and analyzing assessment data from undergraduate STEM courses, facilitating both data-driven instructional decision making (Figure 1) and research in science education.

Database Functionality

We are using a wide variety of assessment data collected at Michigan State University and from colleges and universities across the nation to design the FIRST Assessment Database. Faculty from all STEM disciplines will input and retrieve data from the database to explore questions about effective teaching and learning in undergraduate education (Figure 2). This project will facilitate cross-institutional studies using assessment data from large numbers of students and classes. The database is the bridge between teaching and research that enables faculty to become both expert users of and contributors to the scholarship of scientific teaching.

FIRST Assessment Database: Your Role

Your Research Question: Did teaching students to create, interpret, and critique models result in better student understanding of evolution?

Figure 1. Two models of instructional design.

Metadata: Data About Data

To promote STEM education research, you tag assessment items with metadata, such as discipline-specific concepts. In biology, for example, we plan to use the National Biological Information Infrastructure (NBII) Biocomplexity Thesaurus (Figure 5).
Additional metadata tags will include standard psychometrics such as difficulty and discrimination, Bloom's Taxonomy of Educational Objectives, professional society tags, and whether an assessment item is copyrighted (as in a textbook question or a published concept inventory). Metadata collected by the FIRST database will also encompass information about courses and institutions (Figure 6). These data will facilitate longitudinal studies in addition to comparisons among courses across institutions.

Figure 5. Example of concepts retrieved from the NBII Biocomplexity Thesaurus.

Figure 3. Data sources from a faculty member, including: (a) output from course management software, (b) Excel spreadsheets of student exams, homework, and in-class assignments, (c) clicker data, (d) student open-ended responses, and (e) exams.

Your Data Sources

You upload to the FIRST Database all assessments and related data from your introductory biology course (Figure 3), including assessment questions and grading rubrics. You tag each assessment item with both a concept category (see Figure 5) and an instructional strategy (e.g., modeling, pair-share, JiTT). You have the option to tag assessments in several additional ways (see the discussion of Metadata), or you may bypass this task for now.

Security, Intellectual Property, IRB

Institutional review boards (IRBs) are unique to each college and university. This reality demands a flexible database that is both secure and protective of student data. The FIRST Assessment Database will support two levels of data access: restricted (available only to the faculty member who uploaded the data) and public (available to all registered database users). Many assessment items are copyrighted, as in textbook questions or published concept inventories. Faculty members using the database must be aware of and take responsibility for obtaining appropriate permissions for copyrighted or otherwise protected material.
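The tagging scheme described above (concept category, instructional strategy, optional psychometric and Bloom's tags, copyright status, and a restricted/public access level) can be sketched as a simple record. All field names below are illustrative assumptions for discussion, not the actual FIRST database schema.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of an assessment-item record. Field names are
# assumptions for illustration, not the actual FIRST database schema.
@dataclass
class AssessmentItem:
    item_id: str
    concept_tags: List[str]                # e.g., Biocomplexity Thesaurus terms
    instructional_strategy: str            # e.g., "modeling", "pair-share", "JiTT"
    bloom_level: Optional[str] = None      # optional Bloom's Taxonomy tag
    difficulty: Optional[float] = None     # standard psychometrics, if known
    discrimination: Optional[float] = None
    copyrighted: bool = False
    access: str = "restricted"             # "restricted" or "public"

    def can_publish(self) -> bool:
        # Copyrighted items cannot enter the public database without
        # documented permission (permission tracking not modeled here).
        return not self.copyrighted

item = AssessmentItem(
    item_id="evo-q17",
    concept_tags=["Natural selection", "Genetic drift"],
    instructional_strategy="modeling",
    bloom_level="Analysis",
)
print(item.can_publish())  # True
```

Making the optional tags default to `None` mirrors the poster's point that faculty may bypass the extra tagging step and still contribute items.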
Without such permissions, an assessment item cannot be part of the public database. For more information on copyright issues and intellectual property rights, please see our website.

Student responses in the FIRST Assessment Database are de-identified by a hashing function that ensures student anonymity. To promote longitudinal studies, however, the database will link students across courses at an institution. A faculty member may choose, or may be directed by their institution's IRB, to remove identifying information about themselves and the institution.

This material is based upon work supported by the National Science Foundation (NSF). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

Figure 2. Overview of the FIRST Assessment Database.

Figure 4. Potential data output from the FIRST database.

Data Analysis

To analyze whether teaching students to make models improved their learning of evolution, you query your current course and two past courses for data on evolution and modeling. The FIRST database returns a single spreadsheet file with the data from each course (Figure 4). You import the data into your favorite statistical analysis software. As a related question, you are interested in comparing your students' modeling abilities with those of students from other institutions. You again search the FIRST database for courses at other colleges or universities using model-based education to teach evolution. You limit your query to include only large introductory biology courses at institutions similar in size to yours. You then limit the returned data to include only assessments related to modeling in evolution. The FIRST database again returns a spreadsheet that you can import into statistical software for analysis.
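A de-identification step with the two properties described (students stay anonymous, yet the same student remains linkable across courses within an institution) could be implemented with a keyed hash. The sketch below is one plausible approach under that assumption; the function name and the use of HMAC-SHA256 are illustrative, not the database's actual implementation.

```python
import hashlib
import hmac

def deidentify(student_id: str, institution_key: bytes) -> str:
    """Return a stable pseudonymous token for a student.

    A keyed hash (HMAC-SHA256) maps the same student ID to the same
    token within an institution, so longitudinal studies can link a
    student's records across courses, while the token cannot be
    reversed without the institution's secret key.
    """
    return hmac.new(institution_key, student_id.encode(), hashlib.sha256).hexdigest()

key = b"per-institution secret"  # hypothetical key, held by the institution, not the database
a = deidentify("A12345678", key)
b = deidentify("A12345678", key)
print(a == b)  # True: same student yields the same token across courses
```

Keying the hash per institution (rather than hashing the raw ID alone) also prevents anyone outside that institution from testing guessed student IDs against the stored tokens.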
Based on your analysis, model-based learning led to significant gains in student learning of evolution in your course and in courses at other universities. These results prompt you to redesign additional portions of your course to include modeling.

Figure 6. Examples of course- and assessment-level metadata captured by the FIRST Assessment Database.
Course-level metadata: institution type/size; course format (lab, lecture, discussion, etc.); course size; targeted students (majors, non-majors, lower or upper level); course syllabus.
Assessment-level metadata: type of assessment (e.g., in-class, open book, exam); proportion of final grade; Bloom's level of understanding; concept category.

Figure 1 (detail). Content-driven instructional design: discipline-based knowledge (textbooks) informs the cycle of designing the course, teaching the course, and assessing student outcomes. Data-driven instructional design: the same design-teach-assess cycle with formative feedback, in which analysis of student learning outcomes drives course modification.

For more information, please contact Diane Ebert-May or Mark Urban-Lurain.

Figure 5 (detail). Thesaurus terms retrieved for "Selection" and "Evolution" include: Evolution, Genetic drift, Group selection, Host selection, Kin selection, Natural selection, Adaptations (biological), Ecophenes, Fluctuating asymmetry, Genetics, Mutation, Phylogeny, Selective media, Speciation, Competition, Diversity indices, Dominant species, Environmental effects, Fitness, Genetic load, Survival value.
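The analysis workflow in the Data Analysis scenario (query the database, receive a spreadsheet, filter to modeling assessments, compare courses) can be reproduced with plain Python on an exported CSV. The column names and scores below are invented for illustration; the poster does not specify the export format.

```python
import csv
import io
from statistics import mean

# Hypothetical CSV export standing in for the spreadsheet the FIRST
# database is described as returning; columns and values are invented.
export = io.StringIO("""course,concept,score
BIO101-2007,modeling,0.54
BIO101-2007,modeling,0.61
BIO101-2009,modeling,0.78
BIO101-2009,modeling,0.82
""")

# Keep only assessment items tagged with the "modeling" concept,
# grouping normalized scores by course offering.
by_course = {}
for row in csv.DictReader(export):
    if row["concept"] == "modeling":
        by_course.setdefault(row["course"], []).append(float(row["score"]))

# Compare mean performance across offerings of the course.
for course, scores in sorted(by_course.items()):
    print(course, mean(scores))
```

In practice the exported spreadsheet would be loaded into whatever statistical package the instructor prefers; this sketch only shows that the single-file export makes the course-to-course comparison a simple group-and-aggregate step.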