Slide 1: 2017 Experiment, University of Minnesota
Agenda: (1) Overview of Experiment, (2) Results of Experiment, (3) Applying to Other Schools
This portion of my presentation focuses on the SERU experiment conducted at the University of Minnesota this year. I will provide a brief overview of the experiment, discuss its results, and then go over how we can apply aspects of this experiment to other research universities going forward.
Slide 2: Purpose
Eliminate the existing idiosyncratic email structure:
Emails sent from 11:00 p.m. to 5:00 a.m. on Sunday/Monday
Emails sent in batches of 500 from an OIR employee
Emails sent only once per week
Introduce flexibility and ownership to the process
Add time-sensitive and customized incentives
Customize messages based upon collaborations with folks across campus
Rapidly respond to lack of representation among groups of students
Capitalize upon technology to achieve a higher response rate
Slide 3: Email Overview
Variety of sponsored partnerships: President, Associate Vice Provost for Undergraduate Education, Deans, Associate Deans, academic advisors
Targeted populations: response rates available weekly; emails sent from sponsors directly to groups with lower response rates
Incentives: over 1,000 incentives collected from local donors (mostly restaurants and local businesses); three guaranteed incentives for everyone
Timeliness: flash emails ("the next 500 students who respond…", "if you respond this evening…")
Slide 4: Email Overview
Messaging: how the data are used; links to new programs that utilize SERU data; examples of research projects and results
Twitter: tweets of in-progress analyses, response rates, incentives, and prize winners
Slide 5: Twitter Analytics
Slide 6: Twitter Analytics
Slide 7: Email Overview: Messaging to Students
Slide 8: Email Overview: Messaging
Emails to advisors
Slide 9: Email Overview: Messaging
Emails to Associate Deans and Directors of Undergraduate Studies
Slide 10: Results of Experiment
Let's take a look at the results of the experiment. I will be focusing solely on response rates for this presentation. This graph shows the response rates for the first 10 weeks of the 2015 and 2017 administrations of SERU at the University of Minnesota. As you can see, the experiment was tremendously successful at increasing response rates. Some other notable points: weeks 2 through 4 are when UMN sent targeted reminders based on columns that Minnesota added to the seed file, such as Greek life, ethnicity, and honors status. You can see that these kept the response rate growing steadily after launch. Perhaps the most notable point of all is how well MN did in week 1: they received a higher response rate in the first week than the 2015 administration received over 10 weeks. This was tied to many factors, such as social media, posters, flyers, and buy-in from colleges and units.
Slide 11: Overview of Experiment
Purpose of Experiment: OIR wanted to be in charge of sending reminders to target specific groups
Experiment Process: OMS trained OIR; OIR dedicated one full-time employee annually to SERU
Additional Components of Experiment: time-sensitive incentive reminders; locally sourced incentives; seed file additions to target more specific audiences; social media campaign; departments consulted when creating the wild card module to increase buy-in across campus
Purpose: The purpose of the experiment was to give OIR the ability to send out reminders to very specific groups whenever they wanted to.
Process: To accomplish this, OIR dedicated one full-time employee to SERU. It's worth noting that this person works on SERU full time, year-round, not just during administration. OMS trained this individual on how to administer SERU through Qualtrics.
Additional components: While the main purpose of the experiment was to give MN the ability to send emails to targeted populations, several other components of the experiment played a role in its success. Here are a few examples:
Time-sensitive reminders: This is something OMS had pilot-tested on a small scale with a previous project. It is the idea of sending out a reminder and offering an incentive to anyone who completes the survey within a certain timeframe. I will touch on a few examples shortly.
Locally sourced incentives: All of the incentives used by MN were free, donated by local businesses in the area. Emails from students suggest that these were highly motivational in convincing students to take the survey.
Seed file additions to target more specific audiences: As part of the experiment, we allowed MN to add additional fields to their seed file. These were then used when sending out reminders so that more specific groups could be targeted. Examples of added fields include Greek life, financial aid, and ethnicity. Some colleges and other units also encouraged competition between units to boost response. (A sketch of how such targeting might look follows these notes.)
Social media campaign: A social media campaign was used throughout administration, which helped raise awareness across campus.
Departments consulted when creating the wild card module: By working with various departments when designing the wild card, MN was able to include questions that met the needs of several different departments. This in turn led to individual departments increasing their promotion of SERU in an effort to get the valuable data they wanted from the wild card.
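As an illustration of the seed-file targeting described above, here is a minimal, hypothetical Python sketch. The file name, column names (responded, email, honors), and CSV layout are assumptions for illustration only, not the actual SERU seed-file schema or the tooling MN used:

```python
# Hypothetical sketch of seed-file targeting; column names and file layout
# are illustrative assumptions, not the actual SERU seed-file schema.
import csv

def build_reminder_list(seed_file_path: str, column: str, value: str) -> list[str]:
    """Return email addresses of non-respondents matching one added seed-file column."""
    targets = []
    with open(seed_file_path, newline="") as f:
        for row in csv.DictReader(f):
            # Remind only students who have not yet responded and who
            # belong to the targeted group (e.g. honors, Greek life).
            if row["responded"] == "no" and row[column] == value:
                targets.append(row["email"])
    return targets

# Example: build a reminder list for honors students who have not responded.
honors_targets = build_reminder_list("seru_seed.csv", "honors", "yes")
print(f"{len(honors_targets)} honors students to remind")
```

The point of the added columns is exactly this kind of filter: each reminder wave can be scoped to whichever group is underrepresented that week.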
Slide 12: Applying the Experiment
Many unique factors led to success, and adequate resources are a large one. We believe the concept of training schools and having them be in charge of reminders is not ready to be adopted by the entire consortium. However, there are a number of steps that OMS and schools can take to get similarly effective outcomes:
We can allow schools to add more specific columns to the seed file for targeted emails.
OMS can be even more adaptable in sending highly targeted emails on short notice.
We can update our existing guides based on lessons learned from the UMN experiment and other presentations today.
We can help schools implement time-sensitive reminders.
Slide 13: Time-Sensitive Incentive Reminders
What are they? Reminders that offer desirable incentives tied to a time-sensitive deadline. They can include guaranteed incentives or sweepstakes.
Use of incentives can be low-cost depending on implementation. These reminders can use costly or inexpensive incentives, and we have seen improved outcomes with either option (Michigan and Florida spent $500 on their time-sensitive reminders and achieved effective outcomes; however, we have also seen effective outcomes using as little as $50 to $100).
There are a number of ways to make this work. Here are a few examples of what we saw during this year's administration:
Complete the survey within the next 24 hours to receive…
The next 100 people who complete the survey will receive…
The 20th, 50th, and 100th people who complete the survey starting now will receive…
The major theme is to think about how to effectively combine time sensitivity with your budget for incentives. This can be a high-cost or low-cost proposition. For example, Michigan offered a $5 incentive to the first one hundred respondents, which cost $500 (see the sketch below). Minnesota implemented this process in a number of different low-cost ways, sometimes offering an entry into a raffle drawing for $100 and sometimes offering guaranteed incentives donated by community businesses.
Participating schools: Minnesota, Michigan, Florida, Delaware (see handout for outcomes)
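To make the "first N respondents" mechanic and its arithmetic concrete ($5 each for 100 respondents is $500), here is a small, hypothetical Python sketch. The respondent IDs, timestamps, and function are invented for illustration; in practice, winner selection would come from the survey platform's completion records:

```python
# Hypothetical sketch of a "first N respondents" flash incentive.
# Respondent IDs and timestamps are invented; real completion records
# would come from the survey platform.
from datetime import datetime

FLASH_START = datetime(2017, 3, 6, 18, 0)  # assumed send time of the flash email
N_WINNERS = 100                            # "the next 100 people who complete..."
INCENTIVE_EACH = 5                         # $5 each -> 100 * $5 = $500, as in the Michigan example

def flash_winners(completions, start=FLASH_START, n=N_WINNERS):
    """Return the first n respondent IDs completing at or after the flash email."""
    eligible = [(rid, ts) for rid, ts in completions if ts >= start]
    eligible.sort(key=lambda pair: pair[1])  # earliest completion first
    return [rid for rid, _ in eligible[:n]]

# Tiny worked example with three completions and a two-winner window.
completions = [
    ("s001", datetime(2017, 3, 6, 18, 5)),
    ("s002", datetime(2017, 3, 6, 17, 55)),  # completed before the flash email: not eligible
    ("s003", datetime(2017, 3, 6, 19, 10)),
]
winners = flash_winners(completions, n=2)
print(winners, "cost:", len(winners) * INCENTIVE_EACH)  # ['s001', 's003'] cost: 10
```

The same structure covers the other variants: a 24-hour deadline is a `ts <= deadline` filter instead of a first-n cut, and the "20th, 50th, and 100th person" variant picks specific positions from the sorted eligible list.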
Slide 14: Questions?