
Student Pathways in STEM Majors

Towards better decision making

University of Colorado Boulder, Center for STEM Learning, Office of Information Technology, and Institutional Research

Noah Finkelstein, Daniel Reinholz, Joel Corbo, Robert Stubbs, Blake Redabaugh, Mark Werner

Contact: noah.finkelstein@colorado.edu

Purpose

Improving retention in STEM majors is a pressing priority on our campus and nationwide. As a first step toward improving retention of majors in STEM fields, we need to better understand which students are leaving, when they are leaving, and why. Our institution already collects a wealth of useful data about student pathways, and the goal of this project is to make those data more accessible in order to support better-founded, evidence-based decision-making.

Tool: Student Pathways

The Student Pathways tool has been developed as a collaboration among our Office of Institutional Research, the Office of Information Technology, and the Center for STEM Learning. The tool is built in Tableau, a powerful data visualization platform. Modeled on the UC Davis “ribbon plot,” an initial suite of visualizations captures student enrollments, grades, and routes through course sequences within a major; enrollments in a course over time; trajectories into and out of a major; and individual student outcomes. Figure 1 shows how we might select pathways associated with certain course enrollments (the figure shows pathways into a course, but we can also follow routes after a course, or look at subsets of students within a course, e.g., “B” students). Tableau also allows permissions to be set for different users; in this way, individual student records can be protected from most users while remaining available to advisers and other campus members who need access to them for their jobs.
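
The visualizations themselves are built entirely in Tableau on institutional data, but the aggregation behind a ribbon-plot view is simple to describe. The sketch below (Python/pandas, with hypothetical column names and made-up records) illustrates the kind of route-building and filtering the tool performs; it is not part of the tool itself.

import pandas as pd

# Hypothetical enrollment records; the real tool reads institutional data into Tableau.
enrollments = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3, 3],
    "term":       [1, 2, 3, 1, 2, 1, 2],   # chronological term index
    "course":     ["CALC1", "CALC2", "PHYS1", "CALC1", "CALC1", "CALC1", "PHYS1"],
    "grade":      ["B", "A", "B", "D", "C", "A", "B"],
})

# Order each student's courses by term and collapse them into a route --
# the kind of per-student sequence a ribbon plot aggregates.
routes = (enrollments.sort_values(["student_id", "term"])
          .groupby("student_id")["course"]
          .agg(" -> ".join))

# Count how many students followed each route (analogous to ribbon widths),
# or restrict to a subset first, e.g. students who earned a "B" in CALC1.
print(routes.value_counts())
b_students = enrollments.loc[
    (enrollments.course == "CALC1") & (enrollments.grade == "B"), "student_id"]
print(routes.loc[routes.index.isin(b_students)])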

The university already collects all of the data used in the Student Pathways tool; the tool simply makes those data more accessible. It is possible to look at subsets of students based on relevant characteristics (e.g., gender, race, major). We are currently in a prototyping phase, working with a variety of departments on campus to establish their use cases. Three departments and a dean's office are currently piloting these tools.
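
Again purely as an illustration rather than a description of the tool, a subgroup breakdown of this kind might be computed as follows, assuming a hypothetical student table with a demographic column and a flag for whether the student stayed in the major:

import pandas as pd

# Hypothetical student-level table; the demographic columns and retention flag
# are illustrative stand-ins for fields already held by Institutional Research.
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "gender":     ["F", "M", "F", "M", "F", "M"],
    "major":      ["PHYS", "PHYS", "CHEM", "PHYS", "CHEM", "CHEM"],
    "retained":   [True, False, True, True, False, True],  # still in the major after year 1
})

# Retention rate by major and gender: the kind of subgroup comparison the
# Tableau views expose through interactive filters.
print(students.groupby(["major", "gender"])["retained"].mean())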

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

 

Observation Protocol for Learning Environments (OPLE)

Fall 2015 Pilot

University of Colorado Boulder, Center for STEM Learning, Academic Technology Design Team

Noah Finkelstein, Mark Werner, Elias Euler, Viktoriya Oliynyk, Rebecca Kallemeyn, Joel Corbo

Center for STEM Learning http://www.colorado.edu/csl/

Academic Technology Design Team http://www.colorado.edu/oit/services/academic-technology

Purpose

The ultimate purpose of this project is to create widely accessible, flexible, research-based tools for observing the educational practices used by faculty and students in classrooms, for use in both formative and summative evaluation of teaching.

In Fall 2015, we are conducting classroom observations in courses across several disciplines in order to:

  • Pilot OPLE as a resource to help instructors reflect on their teaching practices and identify potential areas for improvement.
  • Pilot OPLE as a data source to inform classroom design.
  • Pilot OPLE as a method to assess effectiveness of interventions in course redesign projects.
  • Assess OPLE as a potential data source to identify patterns in teaching practices between instructors, across departments, and across course types.

Tool: GORP

OPLE is a code-based protocol built on the codes designed for the Teaching Dimensions Observation Protocol (TDOP) and run through the Generalized Observation & Reflection Protocol (GORP) web platform. We conducted observations over the summer to test the new codes, and in Fall 2015 we will observe a set of courses from various disciplines for data collection and analysis.
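
GORP handles the actual data capture in the browser, but conceptually the output of a code-based protocol is a series of short observation intervals, each tagged with the codes the observer marked. The minimal sketch below (Python) uses an illustrative two-minute interval structure and made-up code names; it is not GORP's export format.

from collections import Counter

# Hypothetical observation record: one entry per 2-minute interval,
# listing the (illustrative) codes the observer marked in that interval.
observation = [
    {"interval": 1, "codes": ["LEC", "SLIDES"]},
    {"interval": 2, "codes": ["LEC", "CQ"]},   # e.g., clicker question posed
    {"interval": 3, "codes": ["SGW"]},         # e.g., small-group work
    {"interval": 4, "codes": ["LEC", "SLIDES"]},
]

# Fraction of intervals in which each code appeared -- the basic summary
# an instructor might reflect on after an observed class session.
counts = Counter(code for interval in observation for code in interval["codes"])
for code, n in counts.most_common():
    print(f"{code}: {n / len(observation):.0%} of intervals")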

We decided to use TDOP codes for this project because the protocol's developers, a team from the University of Wisconsin, emphasized that it allows for unbiased observations that capture as much classroom activity as possible. Reflecting on previous experience with TDOP from an honors thesis [1] and a set of test observations, we revised some of the default TDOP codes to better meet the needs of our pilot. We wanted to characterize classroom interactions more comprehensively and to minimize variation in how observers interpret the codes. We made several rounds of changes based on observations of different classroom types and teaching methods, and we carefully documented every change and the reasoning behind it.

We first became interested in the GORP tool because of its visually appealing, accessible user interface. By the end of summer 2015, GORP had released a new custom protocol editor that was exactly what we needed for our modified TDOP codes. The features of GORP we found most helpful for developing a new protocol were (1) a flexible number of codes and code categories, (2) the ability to upload our own icons as code buttons, and (3) the ease of editing the protocol in future iterations. Figure 1 shows a screenshot of the observation window on the GORP website; the code names are listed in bold at the top of each rectangular button, with a short description of each code in the middle.
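
The protocol editor handles all of this configuration through GORP's web interface. Purely to make concrete what a custom protocol amounts to, the sketch below represents one as a plain Python structure; the categories, codes, and icon paths are hypothetical, and this is not GORP's internal schema.

# Hypothetical representation of a custom observation protocol: categories group
# codes, and each code has a short label, a description shown on its button,
# and an optional icon. Not GORP's actual schema.
protocol = {
    "name": "OPLE (modified TDOP) - draft",
    "categories": {
        "Instructor": [
            {"code": "LEC", "description": "Lecturing", "icon": "icons/lec.png"},
            {"code": "CQ",  "description": "Poses clicker question", "icon": "icons/cq.png"},
        ],
        "Students": [
            {"code": "SGW", "description": "Small-group work", "icon": "icons/sgw.png"},
            {"code": "SQ",  "description": "Student asks question", "icon": "icons/sq.png"},
        ],
    },
}

# Adding a code in a later iteration is a one-line change -- the kind of
# flexibility the custom protocol editor provides through its interface.
protocol["categories"]["Students"].append(
    {"code": "WC", "description": "Whole-class discussion", "icon": "icons/wc.png"})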

GORP is still under active development, and we continue to communicate our ideas to its team. The most recent update to the website added automatically generated visualizations, which we plan to explore further as part of our project.
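
We have not yet decided how these built-in visualizations will fit into our analysis. As a rough indication of the kind of display that interval-coded data supports, the sketch below (Python/matplotlib, using the same made-up data shape as above) renders a simple code-by-interval timeline; GORP's own visualizations may look quite different.

import matplotlib.pyplot as plt

# Hypothetical interval-coded data: for each code, the intervals in which it
# was marked. Mimics the kind of timeline a built-in visualization could show.
observation = {"LEC": [1, 2, 4], "SLIDES": [1, 4], "CQ": [2], "SGW": [3]}

fig, ax = plt.subplots(figsize=(6, 2))
for row, (code, intervals) in enumerate(observation.items()):
    ax.scatter(intervals, [row] * len(intervals), marker="s")
ax.set_yticks(range(len(observation)))
ax.set_yticklabels(list(observation))
ax.set_xlabel("2-minute interval")
plt.tight_layout()
plt.show()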

Overall, we are pleased to see the rapid improvements to GORP over the past few months, especially as we continue our correspondence with its team. They have been very responsive to our feedback and suggestions for changes to the tool. This flexibility and responsiveness make it easy for us to move forward with the project without having to work around the technological shortcomings of a particular tool.

Evidence-based Action

For additional information on evidence collected and actions taken, please join the Tools for Evidence-based Action group on Trellis.

References

  1. Euler, E., Finkelstein, N., & Corbo, J. C. (2015). Beliefs, intentions, actions, and reflections (BIAR): A new way to look at the interactions of students and teachers. Undergraduate honors thesis. Available at: http://www.colorado.edu/per/research/dissertations-theses.