JMU's Student Affairs Assessment Support Services (SASS) Team Wins National NASPA Award
SUMMARY: Andrea Pope, Caroline Prendergast, Morgan Crewe, and Dr. Sara Finney (the SASS team) were selected as the recipients of the 2019 NASPA Assessment, Evaluation and Research Knowledge Community Innovation Award for their educational program “Student Affairs Assessment 101.”
The Student Affairs Assessment 101 program was honored during the 2019 NASPA Assessment, Persistence, and Data Analytics Conference in Orlando, Florida.
Description of the Award-Winning Program
Student Affairs Assessment 101 is a three-day, hands-on workshop designed specifically for student affairs graduate students and professionals. This introductory workshop targets many of the foundational and intermediate knowledge and skills in the ACPA/NASPA AER and SLD Professional Competency Areas. More specifically, participants practice 1) writing measurable student learning outcomes, 2) developing theory-based programs, 3) selecting high-quality instruments, 4) drawing appropriate inferences from assessment results, 5) communicating results effectively, and 6) using results to improve student learning and development. Rather than encountering each of these topics in isolation, participants are repeatedly challenged to integrate and apply what they have learned to realistic assessment problems. In the process, they experience the importance of alignment and intentionality when specifying feasible outcomes, building theory-based programming that should be effective, selecting theory-based measures that are sensitive to programming, and collecting data that will afford valid inferences. Participants come to understand, and can articulate, that this holistic view of programming and assessment is necessary to improve programs and enhance student learning.
In addition to building knowledge and skills, Student Affairs Assessment 101 targets attitudes related to one’s willingness to engage in assessment activities. Two crucial outcomes of the workshop are increasing participants’ value for assessment and building participants’ self-efficacy, both in their ability to conduct assessment and in their ability to advocate for it. These workshop outcomes are intentionally grounded in expectancy-value theory, which holds that individuals are more motivated to engage in a task when they value it and feel confident they can complete it.
How is Student Affairs Assessment 101 Innovative?
INTENTIONAL SELECTION OF PARTICIPANTS: Student Affairs Assessment 101 provides a unique opportunity for student affairs graduate students, mid-level professionals, and upper administrators to learn from and with one another while pursuing a common goal: the enhancement of student learning and development. Not only does this diversity of experiences and perspectives enrich learning, it also allows for important conversations about assessment culture. For example, during the workshop, participants articulated several organizational barriers that limited engagement in high-quality assessment work. The Vice President for Student Affairs, himself a participant in the workshop, was able to acknowledge these concerns immediately and share his commitment to improving the culture of assessment within the division. This exchange led to a conversation about future opportunities for professional development and strategies for incorporating assessment into existing structures.
EMPHASIS ON THEORY-BASED PROGRAMMING AND ASSESSMENT: Assessment 101 is innovative because it couples building theory-based programming with using outcomes assessment for improvement; these topics are often not integrated. A major barrier to student affairs professionals’ engagement in high-quality assessment is a lack of knowledge of relevant theory (Bresciani, 2010): “Those student affairs professionals who understand the nature of their profession (e.g., the theories that underlie their work) were able to more effectively engage in outcomes-based assessment and identify how their programs contribute to student learning and development. Without an understanding of theories, others were having difficulty evaluating their programs, even though they had a general understanding of how to implement outcomes-based assessment” (p. 86). Despite the important role of theory and research, few assessment trainings address the application of theory to practice, as theory-to-practice models are often vague (Reason & Kimball, 2012). In contrast, Assessment 101 provides opportunities to practice articulating theory-based learning outcomes and developing theory-based programming. This necessitates training on logic models, which are critical to coherent programming and assessment yet are rarely taught formally. Participants also demonstrate how specifying program theory facilitates selecting and designing well-aligned instruments and interpreting and using results.