This week, Dustin is joined by Danielle to discuss her recent work beginning to implement an assessment plan at her institution. She reflects on accomplishing similar work in prior roles and gives advice to others on how best to navigate the emotional and logistical aspects of this work.
Guests:
- Danielle Parker – Assistant Director for Academic Initiatives & Student Engagement at Temple University
Listen to the Podcast:
Watch the Video:
Read the Transcript:
Dustin Ramsdell:
Welcome back, everyone, to Roompact’s ResEdChat Podcast. If you’re new to the show, every episode we feature a variety of topics of interest to higher ed professionals who work in and with university housing, residence life, residential education, whatever we might call it. We’re covering a wide variety of topics, as I said, on the show. With this episode, hopefully, we’re kicking off a string of episodes focused on different aspects of “assessment work,” a core component of the curricular approach and so much that’s happening nowadays in residence life.
I feel like we’re starting a little bit broad here, so we’ll see where we go in the conversation, but just building an assessment plan, why it’s important, how to utilize it and use technology to assist in that process. I will let our guest, let our expert get more into detail on that.
We’ll start, as we always do. Danielle, if you want to briefly introduce yourself and go over your professional background, and then we’ll get into the assessment plan that you built and get into a little bit of the details of it.
Danielle Parker:
Hi, everybody. My name is Danielle Parker. I use she/her pronouns. My journey in higher ed started obviously when I was an undergrad, super involved in the RA world, in the student government world. Then I started going to grad school. I actually changed my career path from medicine and a STEM background, transitioning into higher ed. I went to a school called Indiana University of Pennsylvania. At that time, I was working at Carnegie Mellon. They were in the process of creating rubrics for their curriculum, so that was my first exposure to what curricular design is and how it can look within a residence hall.
I knew from that point I just really enjoyed the structure of curriculum, I really enjoyed res life. I started applying for jobs and found myself at an institution in Florida called Embry-Riddle Aeronautical University, where they also were utilizing the curricular approach. It was a small, little flight school. We were really trying to evolve our curriculum. I think it was in year two of implementation. Did a lot of really good work down there, and I was known as the person who always asked the question of, “How are we going to assess this program? How are we going to assess this strategy?”
To the point where, I remember I was sitting in meetings and they were like, “Aren’t you going to ask the question, Danielle?” I was like, “But what question?” They were like, “How are we going to assess it?” I go, “Good thing I have a good impact, because we’re getting into the mindset that we need to start assessing.”
I then decided it was a little too hot down in Florida, a little too far away from family, so I landed at UMBC, which is in Baltimore, Maryland, where I really got the cool opportunity to work with restorative practices as well as their curriculum. That’s really where my passion for assessment bloomed. I was really able to get experience in understanding how you design and implement assessment plans in order to improve a residential curriculum for students.
Most recently, I went on the job search again because I was like, “I need more of a challenge.” Now I’m at Temple University as the assistant director for academic initiatives and student engagement. Now I oversee and chair our curriculum, which is referred to as SOAR. I’m in the process of learning how SOAR is implemented, what works, what doesn’t work. Hopefully soon, we’ll be able to have a really robust assessment plan to help with improving the strategies and improving our student staff’s experience with SOAR and implementing SOAR.
Dustin Ramsdell:
Yeah. Exciting stuff. I feel like we’ve had a handful of people from UMBC specifically, and I think they’re doing good work down there. Yeah. Excited to have you on because I know Paul, the architect of our show, he obviously originated it and helps host some episodes, and orchestrate them behind-the-scenes. He recommended you come on because of the presentation that you gave at the Institute on the Curricular Approach, ICA, which we’ve covered on the show before. It was talking about your assessment plan, covering this academic year. We’re recording this in December 2024, just to put us in place, and time, and space here.
I would say you’re probably in the thick of it, of developing it, implementing it, all that kind of stuff. If you could explain your assessment plan for this academic year and how it took shape, and then we’ll go from there.
Danielle Parker:
Let’s see. A lot of my assessment planning is revolving around the assessment cycle. I’m sure, for the folks that went to grad school and know about the assessment cycle, it’s all about planning, doing, figuring it out, and then tweaking, and then planning, doing. It revolves in a circle.
The cool thing I learned while I was in grad school, because I took an assessment-specific course, was it doesn’t matter where your starting point is. You just need to jump in, and you need to start going in the cycle. Which is a really cool opportunity for anybody who’s doing the curricular approach and you’re like, “I don’t know where to start.” You can always pull up the assessment cycle and pick where you want to start.
Then the other thing I like to think a lot about is when you’re thinking about assessment, what do you want to know? Sometimes people can see that as Pandora’s box, and they don’t want to know the truth. Or maybe we do want to know the truth and we’re upset about the outcome. I really try not to let that distract us from what our ultimate goal is.
When I’m thinking about this year, I think it’s really hard for me to deep dive into this is the whole cycle that we have. But I can share little spotlights that we do have. The first thing that I really wanted to know when I started at this job was I wanted to know how our strategies were actually doing with our residents. Because right now, Temple has a ton of good information about student implementation of our curriculum, the satisfaction or where they are spending most of their time. But we didn’t really have a ton of information about how the residents are interacting with our curriculum.
What I ended up doing, I said, “Let’s start tracking some things. Let’s do the basics. Let’s look at how many students are showing up, and let’s see what they’re learning from it.” I think the challenge with assessment is sometimes we get stuck in satisfaction and we don’t really look at the learning or the impact. I was trying to be really intentional with when we were building out assessment tools, having no more than five questions. It’s all survey-based, which we have now run into a challenge with that.
I told the team, I said, “Listen, I want two questions to be learning and two questions to be satisfaction, and then one question to gather their information.” We use Anthology, or Campus Labs as some people refer to it. They have an assessment branch called Basecamp, so we use that to help with gathering all that data, and it puts it into a report for us so we’re able to see what students are gaining.
So far what we’ve learned is a lot of our community builders are actually having a really good impact on students. They’re meeting the goals that we wanted, which makes me happy. It goes, “Okay, we did the thing, we can keep doing the thing proceeding forward.” That’s one thing.
The other thing I really wanted to know with student staff is what do they know about SOAR? And what do they know about implementation of SOAR? That assessment was a 10-question survey that all student staff members took some time to do. It really analyzed whether they were able to identify the different learning goals that we have within our curriculum. Are they able to identify high-impact engagement strategies? They were able to say, “Our Owl Chats are doing really, really well, but our bulletin boards have no engagement.” Which was good information for us, because it allowed us to then really reframe for the spring where we want to put our emphasis and stress with the student staff. Do we want to put all this time into bulletin boards, or do we want to put all this time into our Owl Chats and community meetings that have the highest engagement?
That’s just one aspect of it. I think we are still in the process of figuring out, okay, what do we want to do and where do we want to go? I think the spring will help us figure that out. The challenge, once again, is I started in May, so really this first year is about me sitting and watching, and seeing if I can build some tools to learn more about what we’re doing, which falls into the assessment cycle. Hopefully that was helpful.
Dustin Ramsdell:
Yeah. Yeah, it is. I appreciate your willingness and comfort, being new in your position and in the thick of getting those early-stage insights and everything. I think it’s definitely really helpful to hear, even with as complicated and longitudinal as this work is, that idea of what you’re saying, because I say this a lot just generally: start somewhere.
Then in this work, it could be start wherever it makes sense. If it is the idea of, “Well, we’ve been doing a lot of work but we’re doing no assessment.” Or, “Actually, we’ve done a lot of assessment, but we’ve done no analysis,” or something. It can be just figuring out where maybe can be the most impactful. Or if it’s, “Oh, we’ve gotten really far, all the way through three-quarters of this process, but we just haven’t really done that last part.” Let’s at least go through, so that we can hopefully, yeah, start to repeat the cycle of adapting what we’re doing with the insights that we’ve gotten from our feedback, or whatever else, and go from there.
Yeah. The one thing I jokingly thought in my head is that idea of this work generally, I think or often is, the person who is the assessment person. But then, the emotional part of this work is that people don’t want to look under the rock. The idea of, “Oh my gosh, this is going to be super gross. I don’t want to see what’s under here.” Or the idea of people who don’t go to the doctor because then I’m not sick. “I know I’m in pain or whatever, but I don’t want somebody to tell me I have a problem because then it doesn’t exist,” or whatever. Or, “I’m not going to open the mail because that actually means it’s a real thing.”
There is this emotional process of you have to get into that mindset of, “Okay, it’s not about me.” Just get very objective about it, and logical, rational, whatever. Because I think it is that virtuous cycle that can start is doing work that is aiming at a goal, how close did we get there, why didn’t we hit it or why did we hit it. Let’s do more of what’s working, less of what’s not, and just keep that going, and keep that continuous improvement going.
I did want to focus in on one part of it, I think is the technology piece, because I think obviously that just makes it certainly more efficient, and maybe just more optimized. You name-dropped utilizing a tool, the Anthology. If you want to talk more about that, or just your thinking, even if it’s, I don’t know, your whole background and history of how technology? Because I know you’re in the thick of it right now, so certainly talk about your current example and/or the whole longitudinal experience of you utilizing technology to assess learning and do assessment, because I think that is certainly a major factor here.
Danielle Parker:
I’ve already started name-dropping. I think here at Temple, we use Anthology, I’ve already talked about that in the Basecamp aspect. We also use Advocate to help with the data tracking for our Owl Chats, our intentional interactions. They have actually launched a really cool tool to help with building Owl Chats within Advocate, building reports that help with pulling where each resident assistant is at, where they are going. What is the most commonly asked question? Which really, in the long term, will help us out with sequencing for the upcoming year of, “Okay, this is when this question was most frequently asked. Why don’t we build some strategies focused around some of those questions to help students feel more successful?”
For example, at the start of the school year we had a lot of RAs asking questions about roommate relationships, which has been a little bit of a challenge for us due to the high occupancy and limited ability to have residents change rooms. What we’ve learned is we have a lot of students asking these questions at the beginning of the year. Why don’t we build a strategy to help with residents navigating roommate relationship challenges? I think that’s one other example here at Temple that we’re using.
During my time at UMBC, I think that’s the assessment cycle I’m the most proud of right now because I saw it from start to finish. We surveyed our student staff, very similar to here at Temple. We wanted to know how the implementation of the curriculum was going. Because we really believed at UMBC that having students implement and be a part of that implementation of the curriculum was critical. When we assessed our students, we used Qualtrics, very similar to Basecamp. We wanted something online, something that they could put in, but also a tool that would be able to generate the reports. Because if you don’t have that tool, then you’re responsible for generating that report, which can take a lot of time. Which we all know in higher ed, we don’t have a ton of time.
We used Qualtrics there. We used a lot of the assessment tools on Roompact as well. The report generation feature that Roompact now has, I forget what the technical name is. That tool has been uber helpful with pulling strategy numbers, with pulling attendance numbers, with pulling Retriever Chat numbers there as well. Those are my other two experiences, really Qualtrics and Roompact, to help with guiding a lot of that assessment planning.
I also think, outside of technology, we used a lot of space just to encourage dialogue. For example, we utilized focus groups, creating a space where students felt comfortable. Otherwise, they didn’t really feel like there was that autonomy or that power in the room to really express, “This isn’t going well so let’s talk more about it.” Obviously technology-wise, we used a lot of recording features to hear what they were saying, to understand exactly the wording that they were expressing.
Dustin Ramsdell:
Yeah. My one big reflection, because I don’t know when certain episodes will come out. But it’s fresh in my mind, we just recorded it, me and Paul talking about intentional conversations just as a component of strategy here. This is, I think, a very unintentionally relevant conversation to be a neighboring episode. That idea of, okay, we did the intentional conversations one-on-one episode, and now this sets it within a whole context of you harnessing them.
I think it’s that idea for me of everything that you’re mentioning, it’s not like having people do something in a radically different way. It’s just setting it within a more appropriate structure. What I’m struggling maybe to articulate here is the idea that you’re having these intentional conversations and everything else, and being able to maybe quantify them and measure them, and all these different things. It’s going to, my feeling is, result in a place where we’re not relying on people’s gut checks and anecdotes, and all these things. Historically, sometimes just for lack of any other better way, we just didn’t have a way. Whether it’s like, “Oh, I have all my paper notes here,” or something. “I can read off of these or something,” I don’t know.
What we’re doing is deeply human. It’s like, “Oh, how are your residents doing,” would be a question people would ask. It’s like, “Okay, I’ll give you my gut check, anecdotal response here.” But we could say, “Oh, wow. The RAs have talked to a lot of their residents, or not many of them.” Or so many of them at day whatever, or at the end of the first month, or the end of the first semester, are all remarking about the same things. You can say that for sure because everybody should be logging conversations, or flagging themes, or whatever else. You can just do that more appropriate analysis and respond to it in the best way, and everything. Do the right thing, at the right time, for the right people, all that kind of stuff.
Any reflections on that? It sounds like you’ve always had the assessment mindset and everything. Maybe if you’ve encountered people who maybe almost relish or are just so used to operating off of anecdotes, how you, through using technology, through being able to do that measurement, and analysis, and tracking, and everything else, how maybe you’ve been able to bridge those gaps? My mind feels like that’s a major connecting of dots. You’re that person who literally people are looking to. Of being like, “Oh, you’re the assessment person. How are we going to measure this?” That the next step is, “Well, we did measure it. Here’s how we can try to operationalize it into how we’re thinking about how we’re doing programming or anything else.”
To succinctly put this into a question, how do you take everything that you’re doing to try to bridge the gaps between your colleagues, your student staff, or whoever else, who are just overly reliant on gut checks and anecdotes?
Danielle Parker:
I have to give a quick shout out. I think my assessment mindset really comes from grad school, where my professor really drilled into our heads that assessment helps with improving, but it also helps with telling our story. I think we’re in an era of higher ed where we are being questioned left and right about why we are doing what we’re doing, why we are pouring so much money into the things that we do. I think that bridges the gap between what you’re talking about.
I think a lot of your question is really asking how do you bridge the gap between feelings, but also using data to help with driving your decisions? I think that’s a good question. I think starting here at Temple has really challenged me to think about that. And think about how do we create a space where we can have hall directors, assistant directors, directors look at the data and really make some informed decisions about it.
For example, I think it was September, end of September, I had the goal in mind that each hall director would get a monthly report. The monthly report would reflect how many programs have been logged, how many residents have shown up, what they learned from that program. But it also broke down some of the Owl Chat numbers. We strictly looked at the numbers: this is how many Owl Chats we currently have logged, this is how many we’re trying to shoot for.
I think creating that space, that packet … I made them a physical packet they could hold, and look at, and write notes on. Then I also created reflective questions. I had them reflect on how are they feeling so far about the numbers. And then I had them reflect what are your goals, not feelings, what are your goals based off of the numbers for the next couple of months. I think creating a space and having us slow down to take a look, and really reflect and be honest with each other was helpful. I saw a lot of our DEs and hall directors just go, “My numbers could be better. I did not know I was doing this good or bad.” That was up to their interpretation of their own expectations. I think that really helps with bridging some of the gap.
I think another thing that helped is recognition. I think people want to be recognized for doing the hard thing. Assessment is super hard. Figuring out implementation is hard. Figuring out how you get your student staff on board for implementation. Figuring out how you navigate that one student who you can’t get a hold of. It’s just all hard. Taking some time to say thank you and be like, “Hey, I really appreciate the work that you’re doing,” helps with resetting that mindset of she’s not out to get us based off of data. She’s actually here trying to help us move forward.
I also have been reading this book, it’s called Switch. It talks a lot about how you navigate change, while also keeping in mind people’s logistical side as well as their feelings. I think one key takeaway I took from that book so far is it’s really important to have some of those clear goals because it makes it a little bit easier to help with navigating some of the feelings that are coming into play.
I think that’s another key tip is having very clear goals of this is where you want to go helps with navigating some of those hard feelings and tense feelings that may come up based off of data, or based off of direction, or things like that.
Dustin Ramsdell:
Yeah. I guess that’s where I was feeling and going as well, is that idea of, yeah, there’s the logistical side, and I think that’s where the technology comes in. It can make it very easy to do the work, track the work, and all that. But then, the emotional side, too. They’re equally important in their own ways of doing something like this, where a lot of people can relish, or be fine, or be happy with just the anecdotes and that sort of thing. When you can respect both sides of the equation of the change management, and then also the cherry on top is being very clearly present to people, “Here’s how you’re doing, why it’s important.”
Even like you’re getting at, show people that they’re doing better than they think they are. Or, “Hey, well, here’s where you’re at, and here’s the areas that you can improve,” or whatever else. Because again, if you’re just relying on anecdotes, yeah, you get very comfortable I think with the status quo, or however you’re measuring it, or whatever. I think it’s all mixed together, of how you approach overseeing and implementing an assessment plan like this.
The other detail that I wanted to focus on was you’re getting students engaged in the process. More specifically, I’ll start with this question more specifically on student staff, of that idea of how is this being implemented and all that. Our conversation is focusing a bit more on that. From what you were saying, you’re surveying student staff members, you’re getting their input on how things are being implemented and how things are going. Any further reflections on that, on why that is an important part of the process of what you’re doing here?
Danielle Parker:
To be honest, I think student staff play that vital role at the table. They are the people that are implementing, they are the people doing the hard work that we are trying to envision for the students. I think having them at the table has been a critical piece.
My whole time at UMBC, I remember starting there and I was very much in the mindset of, “I am the educator, I have a whole Master’s degree. Students should not be at the table because they don’t know what educational component they need to implement.” I think UMBC really challenged me to think differently about that.
I think student staff have that light on what are residents actually responding to and what are they not responding to. I think here at Temple, we’re experiencing the challenge of intentional interactions are a very sticky point because students don’t want to interact with us. I think we need to take some time and listen to that.
At UMBC, we had student staff sitting on the SOAR committee. They would come in once a month. Sometimes we would survey them and be like, “How did this strategy go? How did that strategy go? Let’s pick it apart.” Or sometimes they would come in and we would talk about, we had four pillars there, and one of them was building community. That was “our fun pillar.” They worked with us. We came in with the educational components, but they also worked with us to figure out how do we want to implement that strategy at the table. Sometimes they would look at us and be like, “Students aren’t going to respond to this.” I’m like, “But it’s Bingo! You all love Bingo.” But them giving us feedback is a part of the assessment plan. Of you know what, this isn’t actually going to work. Let’s figure out how it is going to work to make it more acceptable or received better by the students.
I also think a lot about our student staff, and I think the more that they are at the table, the more that they are bought into what we’re doing and what assessment process that we are trying to implement. Which is my goal for here at Temple, is hopefully by next semester we will be able to have student staff coming in once a month, helping us figure out what’s working and what’s not working to help strengthen the curriculum.
Because I tell them every day. I said, “Listen, I think we need to work together in the sense of I think RDs, hall directors, grads, professional staff, we’re the experts of education, what skills you need to learn by the time you leave. But you all are the experts of fun. How do we bring you to the table to have more fun, while also trying to implement the thing that we’re trying to do?”
Dustin Ramsdell:
Right. It’s the idea of we’re not trying to lecture at anybody, so let’s try to do whatever you’re doing in the most fun way possible. It’s even the idea of not necessarily taking these student staff members word as infallible gospel in everything, but it is just at least trying to check your work and check your thinking, and engage with them in dialogue. If they are just going to be like, “Oh, this isn’t going to work.” It’s like, “Tell me more.” It might be just that flimsy thing where it’s like, “I don’t know, I just don’t like it.” It’s like, “Well, let’s try it still and we’ll see.” That’ll obviously be taken into consideration.
It expedites that process of you’re doing something, getting feedback, analyzing it, and then trying to learn from that to do better next time. It’s like before we even try to do something, can we just get some quick insights of this is what we’re thinking, this is why we want to do it this way, this is the outcome we’re going for. How successful do you think this will be? Yeah. There can be certain ideas where it’s just like, “Oh my gosh, yeah. Don’t try that.” Or, “Yeah, that’ll be really great. Or maybe if you do it this way, it’ll be even better.” You don’t have to wait so long to start generating insights that you can use. You could use that immediately, before you even start some sort of initiative for the first time, and be like, “Oh, hey. Yeah, we already got some great feedback to make adjustments here.”
It’s really smart. And I think, yeah, it’s going to generate a lot more of that buy in. Certainly by having them feel like their voice is heard. But if nothing else it is just like, “Okay, people who will be doing this stuff, let us give you some time and space here to understand what we’re doing, why we’re doing it, the outcome we’re hoping for,” and all that. Versus just being like, “Well, just do it because I told you so. Just deal with it, whatever.”
Yeah, all of that is very good. All the aspects of just the broader assessment plan, its importance, and the things to be considerate of as you’re getting it implemented. As we’re wrapping up, I feel like this is a great topic to keep the conversation going on as folks are doing this work. Any further advice or resources that you’d want to recommend to folks? I know you mentioned that one book. Yeah. Anything else come to mind that you would like to share?
Danielle Parker:
I think it’s important to remember with assessment, sometimes things go to plan and you’re like, “Check, check, check, did the thing.” But also, sometimes there are loops that are going to make you either change it or throw it out the window, or whatever it is. I encourage people to be resilient and try not to lose sight of what you’re trying to do. I think it’s very easy in assessment to get in the weeds and to forget about that big picture. It’s about being resilient.
I think it’s also, like you mentioned, we in higher ed need to do a better job of having students at the table to give some of that live feedback to us. Because honestly, we can leverage that and create more impactful experiences.
I think lastly, I’m not an expert on it, I’m just trying to figure it out as we’re all trying to figure it out, but I think it’s very hard, we talked about it earlier, managing some of the feelings that come up with it. Of, “We’re doing such a bad job.” I think you have some people at the table that have been here forever and they’re like, “I don’t want to let go of this.” I’m like, “But I have the data saying this isn’t good!” I think it’s about how do you create space to engage in that dialogue so feelings can be navigated as you’re trying to persist forward.
Greatest words of advice: be resilient. Please. Because it’s going to help you out in the long run, and then you can tell your story of the impact that you’re having on the students. Which obviously gets good PR with your VPSA, provost, president, and you can keep going up the chain. Be resilient.
Dustin Ramsdell:
Yeah. It’s good advice for those who are the standard bearers for this. But also, if your colleague is coming to you and trying to work with you on this, be resilient, be patient, be understanding of their approach and why they’re doing this. Because yeah, I think so much of this work is rhythm, so if you get really into a rhythm, it makes it a little bit easier than being reactionary. Just because it’s people living their lives, it could be a little bit of controlled chaos of living in the halls and dealing with the problems as they come up.
I think that idea of finding resilience through that, taking the moments to have these conversations, which can feel hard, and taking a hard look at what you’re doing well or not so well, or whatever else. But slowing down and taking those moments of reflection and everything, I think, can then end up paying dividends. You’re making an investment to have those conversations and to do the assessment work so it doesn’t feel as though you’re always pushing against the grain, and all that. It’s like, “Why aren’t students responding to this or that? Why are they always flouting the rules, or doing whatever else?” It’s like, “Oh, because we’re not explaining them clearly enough, or we’re not doing X, Y, or Z in our contact process,” whatever.
I think it is just, yeah, important to have that resilience to have these conversations on both sides, and to be really empathetic to all of those factors and emotions, and everything else. Yeah, to bring students into the process. And also, to use the right technology tools to try and make it as easy as possible.
Yeah. I really appreciate this. This feels like a very good general conversation around assessment. Hoping to kick off a few different episodes on this topic in the coming weeks. Just really appreciate you, your work, and your time, and for jumping on and having this conversation.
Danielle Parker:
Yeah. Tickled pink to be here. Best of luck, folks.