How Indirect Assessment Shapes Business Education
Sponsored Content
- Indirect assessment tools complement traditional methods such as standardized tests, cumulative exams, portfolios, presentations, research projects, and internships.
- Through indirect assessment, a school can discover if students are engaged with program content and if the curriculum aligns with industry demands.
- To increase student participation in indirect assessments such as questionnaires, a school can administer a survey concurrently with the final exam.
When AACSB held its Assessment and AoL Conference in New York City last March, one topic repeatedly emerged as a focal point among participants and presenters: indirect assessment. No doubt the same topic will be a major theme of the next conference, scheduled for May 12–14, 2025, in London.
The concept of indirect measures is not new, but the AACSB requirement for such measures is relatively recent. Standard 5 of AACSB’s 2020 business accreditation standards states, “The school employs well-documented assurance of learning (AoL) processes that include direct and indirect measures for ensuring the quality of all degree programs that are deemed in scope for accreditation purposes.”
By adding the requirement, AACSB is communicating the importance of gathering AoL data through a diverse array of assessment methods, including alumni feedback, learner surveys, and employer/industry focus groups. These indirect measures provide insights into teaching and learning that direct measures may not capture. Therefore, these tools act as valuable complements to traditional assessment practices and ensure that programs promote academic excellence and align with industry needs.
For fifteen years, Peregrine Global Services has equipped schools with customizable assessments designed to meet AoL requirements, including those set out by AACSB. Our Business Administration Assessment, for instance, offers a comprehensive online instrument that allows for direct and indirect measurement of learning outcomes across various degree levels.
The What and Why of Student Learning
The key difference between direct and indirect measures is that direct assessments reveal the what of student learning, while indirect measures uncover the why behind those outcomes. Together, they form the whole assessment picture.
Direct measures—which might include standardized tests, cumulative exams, portfolios, presentations, research projects, and internships—demonstrate learners’ knowledge, skills, and abilities. By collecting direct measures, institutions can identify a program’s strengths, as well as areas where it might need improvement. This lays the groundwork for schools to make informed, data-driven decisions that impact program quality.
Using indirect measures such as surveys, interviews, and focus groups, schools can capture the nuanced aspects of student experiences, including engagement, confidence in competencies, and perception of learning.
Through small focus groups and interviews with stakeholders, an institution can gather rich, qualitative insights into an academic program’s quality and its alignment with industry demands. In these conversations, stakeholders will discuss the relevancy of the curriculum, the effectiveness of the school’s teaching methods, and the experiences these stakeholders have had when interacting with students or faculty.
Such discussions frequently reveal the competencies graduates need when entering the job market or progressing through their careers. These conversations also might highlight potential gaps in the school’s curriculum.
Institutions can interview and survey alumni, employers, and the community, as well as current learners. By collecting indirect measures from so many groups, institutions can quantify and set targets related to the perceptions of different stakeholders:
- By surveying alumni, a school can put together a longitudinal review of what students need for career progression and professional development.
- By surveying employers, an institution can keep its finger on the pulse of changes within the field.
- By surveying students, an institution can compare indirect and direct assessment measures to gain a comprehensive picture of student learning. The school also can evaluate the results of changes it has made to its programs.
Peregrine’s assessment service includes a fully customizable student survey and unlimited access to corresponding reports and data. By integrating direct and indirect assessment measures, Peregrine provides meaningful data a school can use to effectively close the loop in the continuous improvement cycle.
Flexible and Integrated
According to AACSB, direct and indirect measures are required across the entire assessment portfolio. However, programs may vary in how they are evaluated, as long as their assessment plans are consistent with the school’s mission and strategic initiatives. As a result, one program might be assessed entirely through direct measures while another relies solely on indirect measures. This approach gives a school flexibility while still holding it accountable to its learners.
Despite the flexibility, there’s a compelling argument to be made for adopting an integrated assessment strategy. By combining assessment methods, an institution can cross-validate findings and take a data-informed approach to continuous improvement.
However, as Karen Tarnoff stresses in a recent article in AACSB Insights, assurance of learning is not solely a data collection process. Schools also must use the data to identify ways to improve learning and performance. Following are two examples of times that schools used direct and indirect assessment to identify areas for improvement, make changes, and close the loop.
Boosting Quantitative Skills
In the first instance, when a school reviewed its assessment data, it found that graduate business learners scored much lower in quantitative research and techniques than they did in other content areas. It is not unusual for learners to face challenges in quantitative subjects, but students at this school also underperformed against the U.S. aggregate benchmark.
On the surface, there was no clear explanation for the shortfall. Therefore, the school added a targeted question to its exit survey, while faculty made minor adjustments to the curriculum. Despite these efforts, the next assessment cycle showed only slight improvement. Fortunately, student survey data revealed the problem: Students felt they didn’t have a strong foundation in quantitative methods before starting the graduate program.
The school considered various ways to improve its learners’ scores, including adding a prerequisite course or further revising the curriculum. Eventually, administrators decided to adopt Peregrine’s Academic Leveling Modules. This modular experience is designed to provide graduate learners with a foundation in specific content areas, including quantitative research and techniques.
Assessing and Enhancing Creativity
In the second case study, a school was evaluating its undergraduate entrepreneurship-focused business program by having learners complete a comprehensive exam and deliver a portfolio assignment during the capstone course.
The students scored well on their comprehensive exams, and their portfolios showed a mastery of technical skills. However, a standardized rubric captured and quantified a noticeable lack of creativity.
To uncover the root cause of this problem, the school conducted exit interviews, which revealed that learners felt the program emphasized memorization and technological proficiency rather than creative thinking. In response, the program revised its curriculum to include more projects that encouraged creativity.
Finally, the program added questions related to creativity to its exit survey. This allowed administrators to monitor progress and effectively close the loop.
Leveraging Indirect Assessment
While the value of indirect assessment is high, so are the hurdles that higher education institutions face when collecting stakeholder feedback. Institutions frequently note that their most significant challenge is securing meaningful response rates. Low participation in surveys and focus groups can yield results that do not accurately reflect the group’s views. This means institutions might make decisions based on incomplete data.
Institutions are leveraging technology to ensure more effective data collection and analysis. For example, Peregrine offers an integrated student survey. Because the survey is administered concurrently with the final exam, student participation rates rise significantly. Institutions can use this link to download 70 examples of student survey questions.
Such an integrated strategy aims to make the feedback process more engaging and representative, which ultimately provides a school with actionable insights. When schools strategically incorporate indirect assessments within their AoL frameworks, they ensure that academic programs remain aligned with their missions and stay relevant to stakeholder needs.