What does it mean to “Close the Loop”?
“Closing the Loop” is one of the most important stages in the assessment process. Once a department has (a) decided what it wants its students to learn, (b) gathered samples of students’ work, and (c) analyzed the data, faculty members evaluate whether students actually learned what was expected of them and use that information to improve teaching and learning.
During this phase of the assessment process, faculty collaboratively:
- discuss the assessment results,
- reach conclusions about their meaning,
- decide what changes are needed, if any,
- determine the implications of those changes, and
- follow through to implement the changes.
Assessment results are also meant to inform planning and influence decision making; therefore, reporting results to the various stakeholders (e.g., students, administration, accrediting agencies, alumni) is an integral part of “closing the loop”.
Why is it important?
When results are carefully considered and discussed, and those discussions generate questions that people really care about, assessment makes a difference.
How do we use the results?
Here are some suggestions to consider when engaging in a collaborative review of assessment results:
- Present the results in several ways: a face-to-face meeting, a written report, or a workshop in which the report serves as the springboard for brainstorming possible next steps.
- Use multiple sources of information when making decisions. Ideally review data from both direct and indirect measures of assessment.
- Engage the program faculty members, staff, and students in discussions about the results and how they might be used.
- Ask questions and probe the data for complete understanding. What are all the possible explanations for the findings?
- Don’t let assessment results dictate decisions. Results should only advise faculty members, who then use their professional judgment to make suitable decisions.
Here are some questions to start the conversation:
- What do the data say about your students’ mastery of subject matter, of research skills, or of writing and speaking?
- What do the data say about your students’ preparation for taking the next step in their careers?
- Are there areas where your students are outstanding? Are they consistently weak in some respects?
- Are graduates of your program getting good jobs, being accepted into reputable graduate schools, and reporting satisfaction with their undergraduate education?
- Do you see indications in student performance that point to weakness in any particular skills, such as research, writing, or critical thinking skills?
- Do you see areas where performance is okay, but not outstanding, and where you would like to see a higher level of performance?
- Do the results live up to the expectations we set?
- Are our students meeting our standards?
- Are our students meeting external standards?
- How do our students compare to their peers?
- How do our students compare to the best of their peers?
- Are our students doing as well as they can?
- Are our expectations appropriate? Should expectations be changed?
- Does the curriculum adequately address the learning outcome? Are our teaching and curricula improving?
- What were the most effective tools to assess student learning? Do they clearly correspond to our program learning outcomes as we defined them? Do the learning outcomes need to be clarified?
If results suggest the need for change, FACULTY might consider one or more of these types of change:
- Pedagogy—e.g., changing course assignments; providing better formative feedback to students; using more active learning strategies to motivate and engage students; assigning better readings; expanding community service learning, fieldwork, or internship opportunities
- Curriculum—e.g., adding a second required speech course; designating writing-intensive courses; changing prerequisites; substituting new courses for existing ones; resequencing courses for scaffolded learning; adding internships or service learning opportunities to deepen course-learning
- Student support—e.g., improving tutoring services; adding on-line, self-study materials; developing specialized support by library or writing center staff; improving advising (or registration software) to ensure students take required courses in sequence; coordinating course-learning with student affairs programming; creating opportunities for students to engage with faculty or other mentors outside of class
- Faculty support—e.g., providing a writing-across-the-curriculum workshop; campus support for TAs or specialized tutors; professional development for improving pedagogy or curricular design; campus support for establishing community service learning, fieldwork, or internship sites
- Equipment/Supplies/Space—e.g., new or updated computers or software; improvements or expansions of laboratories; expanded space or equipment for student projects
DEANS and DEPARTMENT CHAIRS/DIRECTORS might consider these types of changes to support proposed improvements:
- Budgeting and planning—e.g., reallocating funds within the division to support improvement plans based on assessment findings; budgeting for new resources (software, staff, professional development training) to support assessment processes; reallocating staff; supporting an annual assessment forum to share results and best practices
- Management practices—e.g., establishing new procedures to ensure assessment results are tracked and used for follow-up planning and budgeting at various levels within the division/department
- Once there is consensus on the action(s) to be taken, create an action plan that describes what the program will do, who will do it, and the timeline for implementation.
- Monitor changes as they are implemented to determine whether they have the desired effect(s).
- As you accumulate years of assessment results, periodically review them for trends. Have the changes you’ve made in previous years made a difference in the quality of student learning? Do you need to try something different?
- Sometimes results support the status quo. Celebrate! Share successes with colleagues and students so everyone can be aware of these achievements.
How do we report the results?
At a minimum, your report should provide enough information to answer five basic questions:
1) What did you do?
2) Why did you do it?
3) What did you find?
4) How will you use it?
5) What is your evaluation of the assessment plan itself?
A template of the Annual Assessment Report (Form C) is available.
Additional Resources:
Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing Company, Inc.
Baker, G., Jankowski, N., Provezis, S., & Kinzie, J. (2012). Using assessment results: Promising practices of institutions that do it well. Available at http://learningoutcomesassessment.org/UsingAssessmentResults.htm
Banta, T. W., & Blaich, C. (2011). Closing the assessment loop. Change, 43(1), 22-27.
Blaich, C., & Wise, K. (2011). From gathering to using assessment results: Lessons from the Wabash National Study. Available at http://www.learningoutcomeassessment.org/documents/Wabash_001.pdf
Driscoll, A., & Wood, S. (2007). Developing outcomes-based assessment for learner-centered education: A faculty introduction (Chapter 9). Sterling, VA: Stylus Publishing.
Finley, A. (2011). Assessment of high-impact practices: Using findings to drive change in the Compass Project. Peer Review, 13(2), 29-33. Available at http://www.aacu.org/peerreview/pr-sp11/finley.cfm
Funk, K., & Klomparens, K. L. (2006). Using the assessment process to improve doctoral programs. In P. L. Maki & N. A. Borkowski (Eds.), The assessment of doctoral education (Chapter 5). Sterling, VA: Stylus Publishing.
Palomba, C. A., & Banta, T. W. (Eds.). (2001). Assessing student competence in accredited disciplines: Pioneering approaches to assessment in higher education. Sterling, VA: Stylus Publishing, LLC.
Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.), Chapters 15-18. San Francisco, CA: Jossey-Bass.