Assessment and Project Evaluation
An external evaluation will focus on measuring the project's impact on programming, the associated impact on student learning, and the ultimate impact on students' ability and willingness to engage in societal roles addressing the sustainability of our civilization and our environment. The external evaluation team comprises a geoscientist who is independent of the development teams and a professional evaluation group. This combination provides an understanding of the nuances of the program's strategies and goals, and of the community striving toward those goals.
The project takes a two-fold approach to assessing the quality of the materials and courses. First, all curricula are independently reviewed prior to field-testing in the classroom. Second, all curricula are field-tested in three different classroom settings using a range of assessment measures to gauge student learning gains, changes in student attitudes and aspirations, and the role of the teaching circumstances in the success of the curriculum.
Curriculum Review
A materials rubric that uses a constructive alignment conceptual framework guides developers at inception, during testing, and through final field analysis of curricular materials to align the materials with project goals.
Before field-testing the materials in the classroom, all curricula are reviewed by three members of the assessment team using the InTeGrate Materials Review Rubric (Microsoft Word 46kB Sep21 12), which incorporates the broad goals of the InTeGrate project and research-based guidelines for best practices in curriculum development. All curricula must receive the full score on the overarching goals of the project, ensuring that all curricula:
- Address one or more geoscience-related grand challenges facing society
- Develop student ability to address interdisciplinary problems
- Improve student understanding of the nature and methods of geoscience and develop geoscientific habits of mind
- Make use of authentic and credible geoscience data to learn central concepts in the context of geoscience methods of inquiry
- Incorporate systems thinking
Measuring Impact of Curriculum on Student Learning
Second, all materials are field-tested in a minimum of three different classroom settings. All field-testing classrooms collect measures of student learning that include:
Geoscience Literacy Exam (GLE)
As part of the InTeGrate goal of improving geoscience literacy, the assessment team developed the Geoscience Literacy Exam (GLE) as one of the tools to quantify the effect of these materials on students' geoscience literacy. The GLE instrument addresses content and concepts in the Earth, Climate, and Ocean Science literacy documents. The instrument will be used to measure geoscience literacy from introductory, non-science students to upper-level geoscience majors.
The GLE testing schema is organized into three levels of increasing complexity.
- Level 1 questions are single answer, understanding- or application-level multiple choice questions. For example, selecting which type of energy transfer is most responsible for the movement of tectonic plates. They are designed such that most introductory level students should be able to correctly answer after taking an introductory geoscience course.
- Level 2 questions are more advanced multiple answer/matching questions, at the understanding- through analysis-level. Students might be asked to determine the types of earth-atmosphere interactions that could result in changes to global temperatures in the event of a major volcanic eruption. Because the answers are more complicated, some introductory students and most advanced students should be able to respond correctly.
- Level 3 questions are analyzing- to evaluating-level short essays, such as describing the ways in which the atmosphere sustains life on Earth. These questions are designed such that introductory students can probably formulate a rudimentary response. We anticipate that the detail and sophistication of responses will increase as students progress through the InTeGrate curriculum.
In year one testing of the curriculum, eight Level 1 questions are used to gauge students' prior understanding of geoscience content. Two Level 3 questions are used to assess the development of geoscience literacy: one addressing the specific literacy area (Earth, Ocean, Atmosphere, Climate) covered by the module and one addressing systems thinking. The systems thinking question is currently under development.
In addition to covering geoscience content knowledge and understanding, the GLE+ is also intended to probe InTeGrate students' ability and motivation to use their geoscience expertise to address problems of environmental sustainability. This student survey is administered pre- and post-instruction and also collects demographic information and career interests.
Embedded assessments play a critical role in the InTeGrate project's evaluation of whether instructional materials and strategies are meeting the stated learning objectives. These assessments are part of the flow of the class, are graded by faculty using a rubric, and provide a summative measure of students' achievement of the learning outcomes. All field-testing classrooms will use a set of three common embedded assessments.
Additional faculty surveys that provide structured reflection and student engagement measures will be collected to build case studies that describe the context for how faculty used the materials in their course.
The GLE and GLE+ assessments will be made available for use throughout the geoscience community in exchange for access to the resulting data. This will allow a better understanding of the current state of geoscience literacy across the community and provide data that can be used to situate the project results in a larger sample.
The project evaluation is being undertaken by Dr. Kim Kastens (Educational Development Center) and Carol Baldassari (Senior Research Associate, Program Evaluation and Research Group, Lesley University). Dr. Frances Lawrenz (University of Minnesota) serves as a consultant to the evaluation team.
Project evaluation will focus on the measurable impact on programming, the associated impact on student learning, and the ultimate impact on students' ability and willingness to engage in societal roles addressing the sustainability of our civilization and our environment. Because the community-based project design depends critically on successful collaboration among partners who are dispersed by geography, discipline, and institution type, the external evaluation team will provide formative feedback on the evolution of the partnership and sub-partnerships, their roles and responsibilities, and their ability to work together effectively. Key areas of interest will be the alignment of members' work with the project design and intended goals, potential benefits and costs of their participation in the project, their commitment to project goals and activities over time, and the effectiveness of the structures and processes the Center creates and maintains to foster communication and collaboration. Data collection efforts will include in-depth interviews, surveys, participation in on-line meetings and phone conferences, and reviews of program artifacts.
To evaluate the project's effectiveness at "...expanding the number of students who enroll..." (NSF, 2010, p. 5), patterns and trends in numbers and attributes of enrolled students will be analyzed throughout the grant period, building on data from AGI and using data collected from participating faculty and students in a database maintained by the project management team. The enrollment impact of the project over time will be tracked by demography (gender, race/ethnicity), by type of InTeGrate interaction (original development teams, implementation grant, or professional development workshop), by institution type (e.g., R1, 2YC, MSI), and by student specialization (general education, geoscience major, or geoscience-using field). To probe the reasons behind students' enrollment decisions, pre- and post-instruction assessments of selected classes will include survey questions about students' reasons for taking the class, what alternatives they considered, and their interest in future geoscience classes. The Center seeks to impact 400,000 students during its lifetime and to collect enrollment and assessment data from courses enrolling 75,000 students. To date over 200 educators have been involved in the project.
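The cross-tabulations described above can be sketched in a few lines. This is a hypothetical illustration only: the column names, category labels, and sample records are invented for the example and do not reflect the project's actual database schema.

```python
# Hypothetical sketch: cross-tabulating enrollment records by institution
# type and gender, as in the evaluation plan.  All field names and sample
# values are illustrative assumptions, not real project data.
import pandas as pd

records = pd.DataFrame({
    "gender": ["F", "M", "F", "F", "M"],
    "institution_type": ["R1", "2YC", "MSI", "R1", "2YC"],
    "interaction": ["development team", "workshop",
                    "implementation grant", "workshop", "workshop"],
})

# Enrollment counts broken out by institution type and gender; the same
# pattern extends to interaction type or student specialization.
by_inst = pd.crosstab(records["institution_type"], records["gender"])
print(by_inst)
```

The same `pd.crosstab` call, applied year over year, would expose the enrollment trends the evaluation team intends to track.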
To evaluate the project's effectiveness at achieving "...enhanced learning..." (NSF, 2010, p. 5), the evaluation team will analyze the same pre- and post-instruction assessments of student learning developed and deployed by the assessment team during their review of module effectiveness. However, the evaluation team will take a cross-institutional, project-wide view of these data. The evaluation team will look for patterns and trends in student learning gains, asking whether learning gains are evenly distributed across demographic groups, across institution types, and across modes of InTeGrate involvement (original development sites, implementation grant recipients, professional development workshop attendees). Based on the initial data collection and analysis, in Year Two the assessment team will set learning benchmarks for students enrolled in single courses and those enrolled in programs in the areas of geoscience literacy, understanding the process of science, and interdisciplinary problem solving. The Center seeks to have all students enrolled in courses supported by the Center make progress toward these benchmarks, and for 75% of students to meet the benchmark.
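One common way to compare pre/post learning gains across groups of the kind described above is the normalized gain, g = (post − pre) / (max − pre), the fraction of the available improvement a student actually achieved. The sketch below is illustrative only; the scores and group labels are invented, and the source does not state that the project uses this particular metric.

```python
# Illustrative sketch: comparing pre/post learning gains across groups
# with the normalized gain g = (post - pre) / (max_score - pre).
# Scores and group labels below are invented for the example.
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of the available improvement actually achieved."""
    if pre >= max_score:
        return 0.0  # already at ceiling; no room to improve
    return (post - pre) / (max_score - pre)

# Hypothetical (pre, post) score pairs for two institution types.
scores = {
    "R1":  [(40, 70), (55, 85)],
    "2YC": [(35, 61), (50, 75)],
}
for group, pairs in scores.items():
    gains = [normalized_gain(pre, post) for pre, post in pairs]
    print(group, round(sum(gains) / len(gains), 3))
```

Because the gain is normalized by each student's room for improvement, it allows fairer comparisons across groups that start from different pre-test baselines.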
To evaluate the project's effectiveness at achieving "...significant progress towards addressing the national challenge" of environmental sustainability (NSF, 2010, p. 6), the evaluation team will consider students' ability and motivation to use insights from the geosciences to address grand challenges of sustainability. Motivation will be assessed by including a career interest component on pre- and post-instruction surveys (GLE+). At present, many science educators are ambivalent about including human/environment interactions in science courses (Kastens and Turrin, 2006) or about devoting instructional time to "soft skills" such as interdisciplinary collaboration. Questions have been included on the faculty survey administered by the On the Cutting Edge project to gauge the degree of support among the geoscience professoriate for teaching toward each of the three InTeGrate learning goals. This survey will be administered in Fall 2012. The Center seeks to have 30% of geoscience faculty supportive of this goal.
To evaluate the program model, we will conduct a series of site studies probing the relationships among InTeGrate-supplied materials and activities, contextual and other factors influencing the faculty and student in the adoption site, and the consequent changes in programming, faculty practice, enrollment, and student learning. Integrating across all of the intensive study sites, we will identify circumstances or actions that either favor or undercut likelihood of successful adoption or adaptation of InTeGrate's materials and methods.
The evaluation is being guided by the following logic model, developed by the evaluation team and project leaders: InTeGrate logic model (Acrobat (PDF) 80kB Sep26 12)