Download:  2023-24 Annual & Biennial Instructions & Report Template

Download the template or follow the structure as noted below. The template includes full instructions at the beginning. Please delete the instruction pages from the final report.

 

Annual and Biennial Program Assessment Report

Annual (undergraduate) and Biennial (graduate) Assessment Reports are to be submitted by October 15 every year. If more time is needed, please contact Assistant Provost Deb Blanchard for support (deborahblanchard@montana.edu).

Academic Year Assessed:
College:
Department:
Submitted by:

 

Program(s) Assessed:       
List all majors (including each option), minors, and certificates that are included in this assessment:

 

 


Have you reviewed your program's most recent Annual Program Assessment Report and the Assessment and Outcomes Committee feedback? (Please contact Assistant Provost Deborah Blanchard if you need a copy of either.)


 

The Assessment Report should contain the following elements, which are outlined in this template along with additional instructions and information.

The additional instructions and information should be deleted from final reports.

  1. Past Assessment Summary.
  2. Action Research Question.
  3. Assessment Plan, Schedule, and Data Source(s).
  4. What Was Done.
  5. What Was Learned.
  6. How We Responded.
  7. Closing The Loop(s).

 

Sample reports and guidance can be found at: https://www.montana.edu/provost/assessment/program_assessment.html


1.    Past Assessment Summary.

Briefly summarize the findings from the most recent assessment report related to the PLOs being assessed this year. Include any findings that influenced this cycle’s assessment approach. Alternatively, reflect on the program assessment conducted last year and explain how it informed any changes made to this cycle’s assessment plan.

 

2.    Action Research Question.

What question are you seeking to answer in this cycle’s assessment?

Note: Research questions should be meaningful (focused on an area where you need to know the answer), relatable (tied to program goals), and measurable. Focus on: what will we be able to improve if we answer this question? The question should be tied to the PLOs. Formulate the question so it is specific to an observable action, not something that is difficult to measure. For example, if you have a PLO related to students developing problem-solving skills, an actionable research question could be: Can students apply problem-solving steps?

 

3.    Assessment Plan, Schedule, and Data Source(s).

a)  Please provide a multi-year assessment schedule that shows when all program learning outcomes will be assessed and by what criteria (data).

Note: This schedule can be adjusted as needed. Attempt to assess all PLOs every three years. You may use the table provided, or you may delete it and use a different format.

ASSESSMENT PLANNING SCHEDULE CHART

| PROGRAM LEARNING OUTCOME | COURSES MAPPED TO PLOs | 2021-2022 | 2022-2023 | 2023-2024 | 2024-2025 |
|--------------------------|------------------------|-----------|-----------|-----------|-----------|
|                          |                        |           |           |           |           |
|                          |                        |           |           |           |           |
|                          |                        |           |           |           |           |
|                          |                        |           |           |           |           |
|                          |                        |           |           |           |           |
|                          |                        |           |           |           |           |
|                          |                        |           |           |           |           |

b) What are the threshold values at which your program demonstrates student achievement?

Note: The example provided in the table should be deleted before submission.

 

Threshold Values

| PROGRAM LEARNING OUTCOME | Threshold Value | Data Source |
|--------------------------|-----------------|-------------|
| Example: 6) Communicate in written form about fundamental and modern microbiological concepts | The threshold value for this outcome is for 75% of assessed students to score above 2 on a 1-4 scoring rubric. | Randomly selected student essays |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |
|                          |                 |             |

*Data sources should be examples of direct evidence of student learning: specifically designed exam questions, written work, performances, presentations, projects (using a program-specific rubric, not a course grading rubric); scores and pass rates on licensure exams that assess key learning goals; observations of student skill or behavior; summaries from classroom response systems; and student reflections.

 

Indirect evidence of student learning includes course grades, grade distributions, assignment grades, retention and graduation rates, alumni perceptions, and questions on end-of-course evaluation forms related to the course rather than the instructor. These may provide information for identifying areas of learning that need more direct assessment but should NOT be used as primary sources of direct evidence of student learning.

 

4.    What Was Done.

a)    Self-reporting Metric (required answer): Was the completed assessment consistent with the program’s assessment plan? If not, please explain the adjustments that were made.

Yes ☐          No ☐

b)    How were data collected and analyzed, and by whom? Please include the method of collection and sample size.

c)    Please provide a rubric that demonstrates how your data were evaluated.

Note: Rubrics are program-specific, NOT course grading rubrics. The example provided below should be deleted before submission. Your rubric may look very different from this example; it just needs to explain the criteria used to evaluate the student artifacts as they relate to the PLOs being assessed.

| Indicators | Beginning - 1 | Developing - 2 | Competent - 3 | Accomplished - 4 |
|------------|---------------|----------------|---------------|------------------|
| Analysis of Information, Ideas, or Concepts | Identifies problem types | Focuses on difficult problems with persistence | Understands complexity of a problem | Provides logical interpretations of data |
| Application of Information, Ideas, or Concepts | Uses standard solution methods | Provides a logical interpretation of the data | Employs creativity in search of a solution | Achieves clear, unambiguous conclusions from the data |
| Synthesis | Identifies intermediate steps required that connect previous material | Recognizes and values alternative problem-solving methods | Connects ideas or develops solutions in a clear, coherent order | Develops multiple solutions, positions, or perspectives |
| Evaluation | Checks the solutions against the issue | Identifies what the final solution should determine | Recognizes hidden assumptions and implied premises | Evaluates premises, relevance to a conclusion, and adequacy of support for the conclusion |

 

Note: This type of rubric can be used for all levels of assessment (the anticipated evaluation score may vary according to the course level). Some rubrics/assessments may be tailored to specific course levels (e.g., designed to assess outcomes in upper-division or lower-division courses), and therefore the scores might be similar across course levels. Alternatively, if you are assessing more basic learning outcomes, you might expect those outcomes to be established earlier in the academic career.

 

Student names must NOT be included in data collection. Reporting on successful completions, or on the manner of assessment (publications, thesis/dissertation, or qualifying exam), may be presented in table format if these apply to learning outcomes. In programs where numbers are very small and individual students could be identified, the focus should be on programmatic improvements rather than individual student success. Data should be collected throughout the year on an annual basis; this is especially helpful for biennial reporting. Proprietary program information (e.g., exam questions and examples) must not be included in the report if the program does not want that information to appear in any public-facing materials.

 

5.    What Was Learned.

a)    Based on the analysis of the data, and compared to the threshold values established, what was learned from the assessment?

b)    What areas of strength in the program were identified from this assessment process?

c)    What areas were identified through this assessment process that need improvement or could be improved in a different way?

 

6.    How We Responded.

a)    Describe how “What Was Learned” was communicated to the department or program faculty. How did faculty discussions re-imagine new ways that program assessment might contribute to program growth, improvement, or innovation beyond the bare minimum of achieving program learning objectives through assessment activities conducted at the course level?

b)    How are the results of this assessment informing changes to enhance student learning in the program? 

c)    If information outside of this assessment is informing programmatic change, please describe that. 

d)    What support and resources (e.g., workshops, training) might you need to make these adjustments?

 

7.    Closing The Loop(s).

Reflect on the program learning outcomes, how they were assessed in the previous cycle (refer to #1 of the report), and what was learned in this cycle. What action will be taken to improve student learning going forward?

a)    Self-Reporting Metric (required answer): Based on the findings and/or faculty input, will there be any curricular or assessment changes (such as plans for measurable improvements or realignment of learning outcomes)?

Yes ☐          No ☐


b)     In reviewing the last report that assessed the PLO(s) in this assessment cycle, what proposed changes were implemented and will be measured in future assessment reports?

c)     Have you seen a change in student learning based on other program adjustments made in the past? Please describe the adjustments made and subsequent changes in student learning.

 

Submit the report to programassessment@montana.edu

Update the department's program assessment report website.

Update PLO language in CIM if needed (map PLOs to Course LOs).