ResLife Student Staff Training: Is It Working? (Part 3)

Across the country, a week or two before students return to campus, housing student leaders come back early to prepare and train for the upcoming year. This often involves team building, crisis management, face time with campus resource leaders, and hopefully some time carved out for hall preparations. After the whirlwind of training season and opening is complete and we have caught our breath, a question lingers over housing departments: did the training work?

Posts in this series:
Part 1 | Part 2 – Behind Closed Doors | Part 3

In the first part of this series, we discussed Kirkpatrick’s Four Levels of Training Evaluation (1994) and how to assess if learners are enjoying training and if learning transfer occurred. As a reminder, here is Kirkpatrick’s model:

Kirkpatrick’s Four Levels of Training Evaluation, visualized as a four-level pyramid. Level 1 is Reaction (“Did the learners enjoy the training?”), Level 2 is Learning (“Did learning transfer occur?”), Level 3 is Impact (“Did the training change behavior?”), and Level 4 is Results (“Did the training influence performance?”).

Source: KudoSurvey

In the second part of this series, we looked at how to use Behind Closed Doors as a form of observational assessment and discussed how to build rubrics. Now that the bottom two layers of the pyramid are built, we can look up at the two that remain: impact and results.

Level 3 and Level 4 – Impact and Results

I am going to be very honest with you here. It is extremely difficult, nearly impossible, to say with any level of confidence that changes in behavior and performance are a result of your training practices. Is that to say it cannot be done? Of course not. But the effort, time, and statistical finessing required are tedious and rife with error.

Why is that? Because as soon as training is over, the training isn’t really over. There are departmental continuing developments, staff meeting trainings, 1:1 meetings, continued coaching from peers, and a multitude of other learning moments that continue throughout the entire contract period. Since these experiences are not universal and will very likely create differences in behavior and performance, they make it nearly impossible to say that those changes came from your training. The steps needed to confidently attribute changes in behavior and performance strictly to your training involve statistical gymnastics that make my head hurt just thinking about them (and I am getting my doctorate in this kind of stuff). Add in a constantly changing population, as people leave the position and others join the team, and it makes for an absolute mess.

I actually did something like this for a class project; take it as a cautionary tale. For most of the semester it took me about 10 hours a week to align the training session outcomes to the job action letters on file and to the end-of-semester evaluations, then log all of the information and run statistical magic with the assistance of my faculty member. We problem-solved through issues that came from an inconsistent sample, considering mid-year hires and accounting for people leaving the position. What did we learn in the end? Folks who didn’t complete their daily assessments during fall training were more likely to be terminated from their role due to grades below the GPA requirement. Nothing else had any level of significance, or even a meaningful pattern, to bring back to training practices. Would I ever do this again? Absolutely not. Did it get me a really solid class project? Totally. Ultimately, I would recommend not trying to directly connect your training to potential changes in behavior and performance.

Does this mean we should completely ignore levels 3 and 4 of Kirkpatrick’s model? I don’t think so! Instead, I would encourage your team to look more holistically at changes in behavior and performance on a semester-by-semester or annual basis. Here are a few ways your team can use activities likely already happening to track these changes over time.

Level 3 – Impact

Job Action

One of the first ways to explore the impact of your training and continued development opportunities is to look at the job action happening in your department. I would strongly encourage your team to keep your job action letters in a centralized location and to get into the habit of reviewing them regularly through a campus-wide lens, looking for where student staff may not be meeting expectations. Please note, I am not saying to hand student staff employment files to just anyone in your department. It is important to talk with an HR representative at your institution about the privacy laws and regulations that apply in your city, county, and state before creating your plan for how to do this.

This is an especially helpful way to consider whether specific changes to your training had an actual impact on performance. For example, if you implement some of these recommended changes to Behind Closed Doors, you could compare job action trends from last year to this year to see if those changes affected how consistently student staff follow duty protocol (assuming you have a culture where that is documented through job action). This could be something as simple as counting job actions around duty protocol violations from last fall and this fall and comparing them; no need for intense statistical analysis.
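That fall-to-fall comparison really can stay as simple as a tally. Here is a minimal Python sketch of the idea; the records, terms, and category names are hypothetical placeholders for whatever fields your centralized job action log actually tracks.

```python
# A minimal sketch of a year-over-year job action comparison.
# The data below is invented for illustration only.
from collections import Counter

# Each record: (term, category) pulled from your centralized job action log
job_actions = [
    ("Fall 2023", "duty protocol"),
    ("Fall 2023", "duty protocol"),
    ("Fall 2023", "missed meeting"),
    ("Fall 2024", "duty protocol"),
    ("Fall 2024", "missed meeting"),
    ("Fall 2024", "missed meeting"),
]

def count_by_category(records, term):
    """Tally job action letters for one term, grouped by category."""
    return Counter(cat for t, cat in records if t == term)

last_fall = count_by_category(job_actions, "Fall 2023")
this_fall = count_by_category(job_actions, "Fall 2024")

# Print the change in each category between the two terms
for category in sorted(set(last_fall) | set(this_fall)):
    change = this_fall[category] - last_fall[category]
    print(f"{category}: {last_fall[category]} -> {this_fall[category]} ({change:+d})")
```

In practice the records would come from an export of your job action tracking system rather than a hand-typed list, but the counting logic stays the same.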

Of The Months (Or Other Shout Out Style Nominations)

The qualitative data that exists within Of The Months, or a similar shout-out-style nomination platform, has a lot of potential to help you understand the impact RAs are having in their communities. I personally believe that housing departments are often swimming in data to the point that they don’t realize all of the data around them. It is a bit like being in the water: do you even notice that you are wet? Here are a couple of questions your Of The Months might be able to answer for you:

  • What aspects of the RA role are people consistently being nominated for? Are these what we value as a department?
  • What areas of work are receiving recognition? How do these connect back to what they have been trained on?
  • Are RAs valuing the same things that leadership values regarding work performance? What gaps may exist?
  • What kinds of programs are being nominated? What might that say about the values of the RA staff?
  • What categories are consistently not receiving nominations? Why might that be the case?

Level 4 – Results

Performance Evaluations

Very similar to the job action letters, using performance evaluations to see the long-term results of training is a solid move! This is an activity your department likely already does, so building it into your long-term assessment plan is a way to work smarter, not harder. Reviewing all performance evaluations through a campus-wide lens can help you identify where your student staff are excelling, where they may need additional assistance, and where gaps may exist in your current training and development plan. Again, discuss this with an HR representative at your institution before creating your plan for how to do this.

While this blog wasn’t as resource-heavy, I hope it did help you to see how you can build upon the assessment work you’re doing with training testing and training observations to make your long-term assessment plan meaningful. Remember, by practicing meaningful and strategic assessment your department can more effectively tell your story and have actionable data.

I hope you enjoyed this series and thank you for sticking with me through it! I look forward to bringing you more easily usable assessment resources very soon that can connect to the work your department will be doing during the academic year.
