Faculty use of digital resources and its impact on digital libraries

Presentation transcript:

Faculty use of digital resources and its impact on digital libraries Josh Morrill, Morrill Solutions Research Alan Wolf, University of Wisconsin - Madison Ellen Iverson, Carleton College Flora McMartin, Broad Based Knowledge Cathy Manduca, Carleton College Glenda Morgan, California State University

Outline: background, focus groups, survey design, survey delivery, current results, follow-up.

Study background. Research questions: What are the characteristics of online collections that make them useful for teaching? How do faculty employ materials in useful collections? How are collections, resources, and services best aligned with faculty work patterns?

Overview of methodology: focus groups; code transcripts and determine themes; survey design; institution recruitment; survey delivery; analysis.

Focus groups. Conducted across the range of higher education institutions, STEM disciplines, and instructor ranks. Questions covered: finding materials (types, methods, barriers); sharing materials (types, methods, barriers); communities and their roles in teaching improvement and sharing; how faculty use the web for professional development.

What they know about DLs. Personal definitions of DLs vary widely, and very few participants knew about NSF digital library efforts. Barriers: information overload; concern about copyright and use; "not invented here"; Google as content aggregator; concern about the availability of resources.

Themes arising from the focus groups:
- Characteristics of online collections that make them useful: the kinds of content, features, and services that attract users to a collection and bring them back.
- Faculty work patterns vis-a-vis digital learning materials and teaching: how faculty members prepare for a class or course, spanning the planning process through teaching.
- Alignment between DLs and faculty work patterns: which features and services do faculty desire, which do digital libraries provide, how do faculty perceive them, and what might make them easier to use?
- Faculty use of DLs and digital materials: the level to which faculty use digital libraries, why they say they do or don't, and what gets in the way of their use of these services.
- Obstacles to faculty use of DLs.
- Faculty use of the web for P&T purposes.

Survey design. Three facets of survey validity: face validity, external validity, internal validity.

Face validity (definition): Does this "seem" right? Does the survey address what we want it to address? How we addressed face validity: developed the survey over five months (Feb-Jun) with the grant research group, in an open process with extensive feedback, meetings, and revisions.

External validity (definition): Do other people think this survey seems "right," cohesive, and clear? How we addressed external validity: pre-tested the survey on approximately 20 faculty members (R1, technical college, master's, and liberal arts colleges); 6 were followed up with in-depth interviews, and the remainder were given a survey version with comment boxes on each page.

Internal validity (definition): Are you measuring what you THINK you are measuring? How can you minimize error in your measurement? How we addressed internal validity: reverse-coded items to assess mindfulness; Likert scales throughout, weighing forced-choice against ambivalent scale points (skip logic helps this immeasurably); factor analysis and reliability testing (more to come here).
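As a concrete illustration of the reliability-testing step, here is a minimal sketch (not from the presentation; the item names, the 5-point scale, and the pandas-based approach are assumptions) of how reverse-coded Likert items can be flipped and a Cronbach's alpha estimate computed for a set of scale items:

```python
# Illustrative sketch only: reverse-code negatively worded Likert items and
# estimate internal consistency with Cronbach's alpha. Column names and the
# 5-point scale are hypothetical, not taken from the actual survey.
import pandas as pd

LIKERT_MAX = 5  # assuming a 5-point scale

def reverse_code(df: pd.DataFrame, items: list[str]) -> pd.DataFrame:
    """Flip negatively worded items so all items point in the same direction."""
    out = df.copy()
    for col in items:
        out[col] = (LIKERT_MAX + 1) - out[col]
    return out

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Classic internal-consistency estimate for a set of scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical usage with three items, one of them reverse-worded:
responses = pd.DataFrame({
    "q1_useful": [5, 4, 4, 2, 5],
    "q2_easy":   [4, 4, 5, 2, 4],
    "q3_waste":  [1, 2, 1, 4, 1],   # negatively worded item
})
responses = reverse_code(responses, ["q3_waste"])
print(round(cronbach_alpha(responses), 2))
```

With only a handful of items this is just a sanity check; the factor analysis mentioned on the slide would typically be run in a dedicated statistics package.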

Recruiting. Recruiting is ongoing; we have attempted contact with all higher education institutions for which we could identify contacts. Contacts attempted were typically library directors, deans of science colleges, learning technology centers, and occasionally chairs of science departments.

Recruiting. We have received responses from almost 150 institutions; more than 75 have agreed to participate. The majority have sent the survey to all their faculty (all disciplines, all ranks). There has been an outstanding response from the community colleges.

Recruiting - what is left to do. We believe that, given the current rate of participation, we are on track to gather a statistically significant sample. We may need to recruit more doctoral and master's degree-granting institutions.
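For a rough sense of what a "statistically significant sample" can mean in practice, here is a back-of-envelope sketch (illustrative only; the 95% confidence level, 5% margin of error, and simple-random-sampling assumption are not from the presentation) of the standard sample-size formula for estimating a proportion:

```python
# Minimum respondents needed to estimate a proportion within +/-5% at 95%
# confidence, assuming simple random sampling and worst-case p = 0.5.
# This is an illustration, not the study's actual power calculation.
from math import ceil

z, p, e = 1.96, 0.5, 0.05
n = ceil(z**2 * p * (1 - p) / e**2)
print(n)  # 385
```

The study's actual target would depend on the analyses planned (for example, comparisons across institution types and disciplines), so this figure is only a lower-bound intuition.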

Preliminary results. To date, much of the focus group data is being confirmed, but data is still coming in (and we still want more!). The level of granularity in the survey is adding some unique nuance and differentiation to the focus group data. For example...

How does this compare with the focus groups? [Slide figure: paired views of the focus group and survey findings, each annotated "Low-complexity," "High granularity," and "Google."] BUT there are some interesting trends emerging.

Follow up - faculty development (the original idea for this grant): analyze which practices work under which circumstances, and provide guidance for faculty developers as to which practices might work best in local circumstances.

Follow up - web search observational studies. The survey gives half the story, but what happens BEFORE people enter a digital library? What are the decision points in a search process that now lead to "Google" but could be changed to lead to a digital library?

Contact information. If you are interested in participating, contact alanwolf@wisc.edu. For more details, visit http://serc.carleton.edu/facultypart. We wish to acknowledge the National Science Foundation for its support (DUE-0435398).