Every graduate program assessment plan should have the following key components:

• Program Learning Outcomes
• Program Description
• Methods of Assessment
• Timeframe for Collecting and Analyzing Data
• Use of Assessment Data

1. Program Learning Outcomes

PLOs are the accumulated knowledge, skills, and attitudes that students develop during a course of study in the program. Essentially, PLOs tell us what students will learn in the program. PLOs should be written as specific, measurable statements describing what students will be able to do upon completion of the program. Each PLO should contain an action verb and a learning statement. (For help in developing learning outcomes, see “Program Assessment Overview” under Resources on the Provost’s assessment page: https://www.montana.edu/provost/assessment/program_assessment.html)

2. Program Description

The program plan (Plan A: Thesis; Plan B: Professional; or Plan C: Course Work) will define the nature of your PLOs. Ideally, the assessment plan would cover all plans, but that will depend on the nature of your Master’s program.

3. Methods of Assessment

Every assessment plan needs evidence to demonstrate student learning at the program level. This evidence can take the form of direct measures or indirect measures of student learning.

  • Direct measures provide data that show specific student progress in achieving
    the learning outcomes set by the program. For graduate programs, examples of direct measures could include:
    • a comprehensive or qualifying examination
    • a research project
    • a professional paper
    • a defense (oral and written)
    • a thesis or dissertation
    • manuscripts or peer-reviewed articles published by the student
    • other papers, reports, or course exams
    • examinations and presentations that demonstrate mastery of the knowledge,
      skills, and professionalism that a student is expected to learn in the program
    • observations of student performance conducted by the mentor or committee members and evaluated with an assessment-specific rubric
  • Indirect measures provide information related to student learning rather than direct evidence of it.
    Examples could include:
    • course evaluations
    • annual reviews of student progress
    • results of a satisfaction or exit survey
    • focus group feedback
    • student placement data
    • completion rates and time to completion

Both direct and indirect assessment data must be tied to the program’s learning outcomes and collected within a timeframe determined by the program.

4. Timeframe for Collecting and Analyzing Data

Ideally, assessment data should be collected on an ongoing basis throughout the year. At a minimum, faculty should schedule an annual meeting to review these data and discuss student progress toward the PLOs. Both the collection schedule and the review process should be outlined in the Assessment Plan.

5. Use of Assessment Data

In the Assessment Plan, the Department/School must identify how assessment data will be used by the program. Data should be shared with the appropriate groups (as defined within the Assessment Plan) so that feedback can be gathered and continuing improvement plans developed. Program changes and improvements are then reported in Assessment Reports, where recommendations made in response to the analysis of assessment data are documented. If those recommendations include changes to measures, items assessed, or means of assessment, the Assessment Plan will need to be updated.