Initial Publication Date: February 19, 2009

Program Assessment Experiences

Dale Sawyer, Earth Science Department, Rice University


Explicit learning objectives, assessment tools, and evaluation plans became part of our world at Rice University about 3 years ago. The change was not welcomed by the faculty. It was driven by a coming reaccreditation review and the realization that the university had ignored the prior accreditation's instruction that such a system be put in place. The university's first effort was low key and generally unsuccessful. The faculty in each department charged with this process had little idea about either the language or the process of program assessment. Most faculty resisted strenuously, arguing that they knew what a good degree program was and that they did not need a "process" to improve what was currently in place. The prime exceptions were the engineering departments, which were accustomed to ABET certification procedures.

Then the accreditation review began, and we were found to be seriously deficient in this area and were instructed to remedy the situation quickly. A new high-level administrative position in Program Assessment was created and a hire made. What followed was a gradual implementation plan, which called for annual small steps toward a comprehensive system. The first year, each department was asked to define one learning objective and an assessment plan for one of its degree programs. These were reviewed, changes suggested, improvements made, and then accepted. The second year, each department was asked to report on the data collected (during that one year), the data evaluated, and the actions taken for that one learning objective in that one degree program. Each department was also asked to define one new learning objective and assessment plan for each of its degree programs. That brings us to the present.

The Earth Science Department at Rice offers four degrees: a BA, BS, MS, and PhD in Earth Science. The BA degree is rarely sought by students (one every four years or so), so it does not figure heavily in our thinking. We average 8 BS students per year, and they mostly go on to graduate study at first-tier research universities. We average 5 MS graduates per year; most seek employment in the energy or environmental sectors. We average 8 PhD graduates per year; about half seek academic positions and the other half seek positions in the energy and environmental industries. I have submitted the current versions of our learning objectives and assessment plans to the workshop's collection of such materials.

We are fairly comfortable with the learning outcomes that we have designed for each of our degree programs. We built them after perusing similar documents from other research departments around the US.

We are fairly uncomfortable with the strategies that we have identified for assessing the learning objectives for our degree programs. Historically, program assessment in our department has been done through informal discussion among faculty, largely uninformed by data.

One area where I think we are on the right track is the evaluation of communication skills in our graduate degree programs. We require each of our graduate students to prepare an annual report on their activities and submit it to our Graduate Committee. This report includes citations for all of the student's written and oral scientific presentations. We consider a poster or oral presentation at a regional, national, or international meeting to be a strong indicator of a student's communication skills, and the submission of a manuscript to a major scientific journal to be even better. Because this information is reported by the students themselves, it is easy for the Graduate Committee to assess our progress toward more student presentations and more publications. Having tracked this for several years, we are indeed seeing the culture of scientific presentation and publication grow in our students. We think this is good. A key to the success of this assessment strategy is that the reporting work is distributed among the students themselves.
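To give a sense of how lightweight this kind of tally can be, here is a minimal sketch of the counting the Graduate Committee might do once the citations have been pulled from the annual reports. The record format and field names are assumptions made for illustration, not a description of our actual system.

```python
from collections import Counter

# Hypothetical records distilled from students' self-reported annual
# reports. The (student, year, kind) layout is an assumption for
# illustration only.
reports = [
    ("Student A", 2007, "poster"),      # meeting presentation
    ("Student A", 2008, "manuscript"),  # journal submission
    ("Student B", 2008, "talk"),
    ("Student B", 2008, "manuscript"),
]

# Tally meeting presentations (posters and talks) and manuscript
# submissions per year, so the committee can watch the trend across
# annual reporting cycles.
presentations = Counter(
    year for _, year, kind in reports if kind in ("poster", "talk"))
manuscripts = Counter(
    year for _, year, kind in reports if kind == "manuscript")

for year in sorted(set(presentations) | set(manuscripts)):
    print(f"{year}: {presentations[year]} presentations, "
          f"{manuscripts[year]} manuscript submissions")
```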

We have struggled to find strategies for assessing our other learning objectives. I hope to learn about other successful approaches at this workshop.