Why Michigan Tech Needs Assessment
Assessment is a systematic process for the continuous improvement of student learning that assures educational quality. It enables the university community to identify opportunities to improve courses and curricula, teaching practices, and student life activities, as well as make informed decisions about degree programs.
There is really only one fundamental reason to do assessment: to assure that our students are learning at the level we expect for graduates of Michigan Tech. Assessment is also driven by external accountability: our professional accreditations (such as ABET, AACSB, and SAF) depend on our institutional accreditation by the Higher Learning Commission.
Assessment at Michigan Tech strives to find a balance between the internal drive for educational improvement and external demands for accountability.
What is Assessment?
While faculty implicitly engage in continuously improving student learning when they teach classes, evaluate student performance, and respond with changes to their courses, assessment of student learning makes this process explicit. The assessment cycle consists of five iterative steps; when the cycle is completed, we say we have “closed the loop.” It can be used to improve any program of study, from courses to degree programs to other institutional outcomes.
Establish learning goals
Develop clear, explicit expectations for learning that identify what students will know or be able to do at the end of a program of study. The goals must be measurable or observable. Sometimes these are called “learning outcomes” because they are stated in terms of expected outcomes.
Gather evidence that students have achieved the learning goals
Programs need to provide sufficient opportunities for students to achieve the learning goals. Faculty need to identify evidence of student learning that can be used to determine whether a goal has been achieved, and that evidence must be both appropriate and feasible to collect. Evidence could include exams, essays, reports, presentations, or other products of student work. Indirect measures, such as surveys, the rate of student participation in professional conferences, or success rates on professional exams, can also be used to augment direct measures obtained from student work.
Analyze the evidence to assess whether goals are met, and report results
Faculty work collectively to determine whether the evidence shows that student learning matches expectations stated in the learning goal. Faculty can develop a rubric that establishes criteria for goal achievement to guide analysis. Results are then summarized in a clear and concise way so that action can be taken. Actions could include designing new opportunities for student learning.
Take action to improve learning
Use the results of assessment to make changes in programs of study to improve student learning. These actions should be clear and specific, and responsibility for carrying them out should be assigned to specific faculty.
Assess whether actions improved learning
After action has been taken, go back to Step 2: gather and analyze evidence to see if the action has led to the desired changes. Repeat the cycle until you have met the target. Even once the target is met, student achievement of that goal should be monitored and additional improvements should be considered (e.g., What can we do better? Does the target need to be moved?).
While assessment of specific courses is very important, a university assessment program supports students in achieving learning goals that cannot be achieved in a single course. Scaffolded learning builds student competencies by offering instruction and opportunities for practice throughout the learning experience in an intentional way. General education, the degree program, and co-curricular student life programs work together to achieve the learning goals of the university.
Assessment as a Strategy of Inquiry
From “Assessment of Learning: Are We Selling Out or Buying In?”, a presentation by Susan R. Hatfield, Ph.D., Visiting Scholar, Higher Learning Commission, at the Higher Learning Commission Annual Conference, April 2013.