I recently attended the 2025 Assessment Institute held in Indianapolis from Oct. 26-28. The conference showcased a wide-ranging exploration of emerging priorities, innovative practices, and cross-sector collaboration in higher education assessment. Several common themes surfaced across the Institute's sessions that I think are important for us to consider in the context of our work.
1. Building Cultures of Assessment and Continuous Improvement
Many sessions emphasized assessment as a living process rather than a compliance task. Presentations illustrated strategies to embed assessment into daily operations, nurture data literacy, and cultivate shared ownership of evidence-based improvement. Institutions described using journey mapping, logic models, and five-year review cycles to sustain momentum and accountability.
The session Assessment at Multiple Organizational Levels for Improvement and Accountability challenged my thinking about assessment happening at the department or program level versus taking the time for a more in-depth exploration of how our institution-wide surveys, like the NSSE, BCSSE, ACHA, and Grad SERU, can practically influence our work and what we know about the student experience. The session posed the question: what if we did more with the data we already have, such as swipe data, information in our data warehouse (in our case, REX), and other sources, at a divisional level rather than having individual departments run one-off projects that may overlap with what another department is interested in knowing more about? Our October Assessment 365 event featured Dr. Laila Shishineh sharing the results of the BCSSE. After attending this presentation at the Assessment Institute, I am committed to having others from UMBC come to future Assessment 365 events to engage us in conversations about the results of other national surveys administered to our students and how these may impact our work.
2. Integrating Academic and Student Affairs for Holistic Learning
A strong focus was the movement toward aligning curricular and co-curricular learning. Sessions highlighted collaborative models where faculty and student affairs professionals co-define learning outcomes and integrate data systems to demonstrate student success comprehensively. This integration was framed as essential to institutional effectiveness and student belonging.
I attended Creating a High-Impact Assessment Framework: Aligning Student and Academic Affairs for Collaborative Student Success, which focused on aligning learning outcomes, goals, and assessment with divisional and institutional priorities. An emphasis in this session was starting with data you are already collecting rather than trying to implement a collection of novel assessment methods, and focusing on the priorities you care most about, such as retention, persistence, and well-being. One takeaway from this session was how intentionally they mapped their work in student affairs to the Middle States accreditation areas, not just for compliance, but because doing so helps provide a path forward for the work. I encourage you to learn more about UMBC's Middle States self-study process.
3. Student Partnership and Engagement
The Institute underscored the importance of students as co-creators of assessment rather than subjects of it. Sessions offered frameworks for authentically engaging students in the design, analysis, and use of assessment data. Discussions frequently connected partnership to empowerment, transparency, and institutional trust.
One session I attended that really captured this theme was From Insights to Action: Driving Student Success Through Student Partnerships in Assessment. This session has me thinking about how we can continue to shift from seeing students as consumers, who provide information about their experiences but are not involved in the process of collecting it, to working with students as consultants and, eventually, partners in assessment and program design. This builds on the topic Dr. Ricky Blissett described at our September Assessment 365 event.
4. Leveraging Data, Analytics, and Artificial Intelligence
A surge of sessions explored data analytics, automation, and generative AI as transformative forces in assessment. Presenters demonstrated the use of learning analytics platforms and AI-supported qualitative coding to increase efficiency and insight while maintaining human interpretation. Ethical use and data validity were recurring concerns, with several sessions cautioning that technology should enhance, not replace, professional judgment.
The session Assessment with Purpose: Designing Sustainable and Actionable Plans Through Stakeholder Engagement emphasized how you might leverage AI tools to align outcomes with assessment. Specifically, when thinking about the three pillars of effective assessment (sustainable, evidence-based, and collaborative), AI may be able to assist in crafting effective measurement tools that help identify areas of strength and areas for enhancement. This session offered a reminder that when working with AI, you bring your human perspectives and ideas, work with AI to develop a product, and then follow through by analyzing the output from a human perspective to make sure it accurately reflects your needs, your language, and how other people will engage with the product. UMBC AI is a great myUMBC group to follow to learn about advancements in AI here on campus, professional development opportunities, and more.
Across program tracks, there was a unifying call to view assessment as a vehicle for transformation: advancing equity, fostering collaboration, and driving institutional adaptability in a rapidly changing environment. I left the Institute with a renewed focus on purposeful, people-centered, and technology-informed approaches to measuring and improving learning and student success.