By Catherine Uvarov

Student Pathways in STEM Majors

Towards better decision making

University of Colorado Boulder, Center for STEM Learning, Office of Information Technology, and Institutional Research

Noah Finkelstein, Daniel Reinholz, Joel Corbo, Robert Stubbs, Blake Redabaugh, Mark Werner

Contact: noah.finkelstein@colorado.edu

Purpose

Retention in STEM majors is a pressing concern on our campus and nationwide. As a first step to improving retention of majors in STEM fields, we need to better understand which students are leaving, when they are leaving, and why they are leaving. Our institution already collects a wealth of useful data about student pathways, and the goal of this project is to make such data more accessible to support better-founded, evidence-based decision-making.

Tool: Student Pathways

The Student Pathways tool has been developed as a collaboration among our Office of Institutional Research, Office of Information Technology, and the Center for STEM Learning. It is built in Tableau, a powerful platform for data visualization. Based on the UC Davis “ribbon plot,” an initial suite of visualizations captures student enrollments, grades, and routes through course sequences within a major; enrollments in a course over time; trajectories into and out of a major; and individual student outcomes. Figure 1 shows how we might select pathways associated with certain course enrollments (the figure shows pathways into a course, but we can also follow routes after a course, or look at subsets of students, e.g., “B” students, within a course). Tableau also allows permissions to be set for different users; in this way, individual student records can be hidden from most users but made available to advisers or other campus members who need access to them for their jobs.
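
The selection logic behind such a view (pick students by an enrollment event, then follow their routes forward) can be sketched in a few lines of pandas on a toy enrollment table. All column names, courses, and records below are invented for illustration and are not the actual campus schema:

```python
import pandas as pd

# Hypothetical enrollment records: one row per student per course per term.
enrollments = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "term":       [1, 2, 1, 2, 1, 2],
    "course":     ["CALC1", "PHYS1", "CALC1", "CALC2", "CHEM1", "PHYS1"],
    "grade":      ["B", "A", "B", "C", "A", "B"],
})

# Select a subset of students within a course, e.g. "B" students in CALC1 ...
b_students = enrollments.query("course == 'CALC1' and grade == 'B'")["student_id"]

# ... then follow their routes: what they enrolled in during the next term.
next_courses = enrollments[
    enrollments["student_id"].isin(b_students) & (enrollments["term"] == 2)
]
print(next_courses[["student_id", "course"]])
```

The Tableau tool performs this kind of filter-and-follow interactively; the sketch just makes the underlying query explicit.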

The university already collects all of the data used in the Student Pathways tool; the tool simply makes them more accessible. It is possible to look at subsets of students based on relevant characteristics (e.g., gender, race, major). We are currently in a prototyping phase, working with a variety of departments on campus to establish their use cases. Three departments and a dean's office are currently piloting these tools.

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

 

Observation Protocol for Learning Environments (OPLE)

Fall 2015 Pilot

University of Colorado Boulder, Center for STEM Learning, Academic Technology Design Team

Noah Finkelstein, Mark Werner, Elias Euler, Viktoriya Oliynyk, Rebecca Kallemeyn, Joel Corbo

Center for STEM Learning http://www.colorado.edu/csl/

Academic Technology Design Team http://www.colorado.edu/oit/services/academic-technology

Purpose

The ultimate purpose of this project is to create widely accessible, flexible, research-based tools for observing the educational practices of faculty and students in classrooms, for use in formative and summative evaluation of teaching.

In Fall 2015, we are conducting classroom observations in courses across different disciplines to:

  • Pilot OPLE as a resource to help instructors reflect on their teaching practices and identify potential areas for improvement.
  • Pilot OPLE as a data source to inform classroom design.
  • Pilot OPLE as a method to assess effectiveness of interventions in course redesign projects.
  • Assess OPLE as a potential data source to identify patterns in teaching practices between instructors, across departments, and across course types.

Tool: GORP

OPLE is a code-based protocol based on the codes designed for the Teaching Dimensions Observation Protocol (TDOP) and run through the Generalized Observation & Reflection Protocol (GORP) web platform. We conducted observations over the summer to test the new codes and, in Fall 2015, we will observe a set of courses from various disciplines for the purpose of data collection and analysis.

We chose TDOP codes for this project because its developers, a team from the University of Wisconsin, emphasized that the tool allowed for unbiased observations that captured as much classroom activity as possible. Reflecting on previous experience with TDOP for an honors thesis (1) and on a set of test observations, we revised some of the default TDOP codes to better meet the needs of our pilot: we wanted to characterize classroom interactions more comprehensively and to minimize variation in observer interpretation of the codes. We made several rounds of changes based on observing different classroom types and teaching methods, and we carefully documented all the changes and the reasoning behind them.
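
Code-based protocols such as TDOP reduce a class session to codes marked during short observation intervals. As a minimal sketch of how such data can be tallied afterward (the codes, interval length, and session below are invented for illustration, not our actual observation data):

```python
from collections import Counter

# Illustrative interval-coded observation: each entry is the set of codes
# an observer marked during one short interval of a class session.
intervals = [
    {"LEC"},            # lecturing
    {"LEC", "DEM"},     # lecturing with a demonstration
    {"SGW"},            # small-group work
    {"SGW", "MOV"},     # small-group work, instructor moving through the room
    {"LEC"},
]

# Fraction of intervals in which each code appears: a common way to
# summarize observation-protocol data across a session.
counts = Counter(code for interval in intervals for code in interval)
fractions = {code: n / len(intervals) for code, n in counts.items()}
print(fractions)
```

GORP records and summarizes this kind of data for the observer; the sketch only illustrates the shape of the underlying protocol data.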

We first became interested in the GORP tool because of its visually appealing, accessible user interface. By the end of summer 2015, GORP had released a new custom protocol editor that was exactly what we needed for our modified TDOP codes. The features of GORP we found most helpful for developing a new protocol were (1) a flexible number of codes and code categories, (2) the ability to upload our own icons as code buttons, and (3) the ease of editing the protocol in future iterations. Figure 1 shows a screenshot of the observation window on the GORP website. The code names are listed in bold at the top of each rectangular button, with a short description of each code in the middle.

GORP is still under development, and we continue to communicate our ideas to its team. The most recent update to the website added automatically generated visualizations, which we plan to explore further as part of our project.

Overall, we are pleased to see the rapid improvements to GORP over the past few months, especially as we continue our correspondence with its team. They have been very responsive to our feedback and suggestions for changes to the tool. This flexibility and responsiveness make it easy for us to move forward with the project without having to work around the technological shortcomings of a particular tool.

Evidence-based Action

For additional information on evidence collected and actions taken, please join the Tools for Evidence-based Action group on Trellis.

References

  1. Euler, E., Finkelstein, N., & Corbo, J. C. (2015). Beliefs, intentions, actions, and reflections (BIAR): A new way to look at the interactions of students and teachers. Undergraduate Honors Thesis. Available: http://www.colorado.edu/per/research/dissertations-theses.

 

Characterizing Instruction

Connecting student learning with Instructional Style in General Chemistry

University of California Davis, Center for Educational Effectiveness

Catherine Uvarov, Alberto Guzman-Alvarez, Greg Allen, Alan Gamage, Marco Molinaro

tea@ucdavis.edu

Purpose

Students in large introductory STEM courses often struggle, giving these courses a reputation as “gatekeeper” courses. The General Chemistry course sequence at UC Davis is one of the highest-enrollment course sequences on campus. Our goal was to collect evidence of student learning in General Chemistry and link those data with information on instructional practices in order to form a more complete picture of what is happening in the course series.

Tool: GORP

We used the GORP tool to collect classroom observation data on instructors using the COPUS protocol (1). Instructors agreed in advance to have their classes randomly observed over the course of the academic year. Observation data were mostly collected by trained undergraduates (not enrolled in the course), though some observations were also collected by graduate students or CEE staff.

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

References

  1. Smith, M. K., F. H. Jones, S. L. Gilbert and C. E. Wieman (2013). “The Classroom Observation Protocol for Undergraduate STEM (COPUS): a new instrument to characterize university STEM classroom practices.” CBE Life Sci Educ 12(4): 618-627.
  2. Lund, T. J., M. Pilarz, J. B. Velasco, D. Chakraverty, K. Rosploch, M. Undersander and M. Stains (2015). “The Best of Both Worlds: Building on the COPUS and RTOP Observation Protocols to Easily and Reliably Measure Various Levels of Reformed Instructional Practice.” CBE-Life Sciences Education 14(2).

Attrition and Retention in Engineering

Analyzing Enrollment, Transfer, Drop-out and Stop-out activity

University of Saskatchewan

Jim Greer, Sean Maw, Liz Kuley, Craig Thompson, Stephanie Frost, Ryan Banow

Jim.greer@usask.ca

Purpose

In an effort to improve retention in Engineering at the University of Saskatchewan, a study was undertaken to identify factors associated with attrition, evaluate recent efforts to reduce attrition, and implement new initiatives to attract and retain students from targeted populations.


Figure 1. Six-year graduation and attrition for engineering students.

Tool: Ribbon

Efforts to bring student data at the University of Saskatchewan into a University Data Warehouse have been successful, and it is now relatively easy to assemble a specific dataset for visualization in the Ribbon Tool. Engineering enrolment data for different demographic groups have been collected and structured since 2008. Admission policies, enrolment quotas, freshman promotion standards, and probationary actions have been adjusted in recent years, and the effects of these changes on attrition and retention have not yet been fully analyzed.

The Ribbon Tool helps track the changes in student flows into, through, and out of Engineering for various demographic subgroups.

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

 

Pathways to Graduation for Indigenous Students

University of Saskatchewan

Jim Greer, Graeme Joseph, Candace Wasacase-Lafferty, Stephanie Frost, Ryan Banow, Craig Thompson

Jim.greer@usask.ca

Purpose

Success for Aboriginal students has become a priority at the University of Saskatchewan. Because these students are an under-represented group facing systemic financial and social challenges, important goals include more accurately tracking their flows through various programs, providing adequate instructional and social supports, and improving retention and time to completion.


Figure 1. Six-year graduation and attrition for Aboriginal students.

Tool: Ribbon

Efforts to bring student data at the University of Saskatchewan into a University Data Warehouse have been successful, and it is now relatively easy to assemble a specific dataset for visualization in the Ribbon Tool. Aboriginal student enrolment data in various disciplines have been collected since 2008. Retention initiatives put into place in recent years can be quickly analyzed using the Ribbon Tool to determine whether success rates and dropout rates are affected for Aboriginal students in various demographic categories.

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

 

Math and Physical Sciences Attrition

Visualizing who leaves and when

University of California Davis, Center for Educational Effectiveness

Catherine Uvarov, Marco Molinaro

tea@ucdavis.edu

Purpose

There are many initiatives to increase the number of students with degrees in STEM fields. One way to do so is to reduce the number of students who leave STEM. As a first step to reducing attrition, one must understand who leaves and when, so that interventions can target at-risk populations prior to departure.


Figure 1. Field of freshmen Math and Physical Science students after 1, 2, and 6 years. Only 18% obtain degrees in Math and Physical Science fields by the 6 year mark.

Tool: Ribbon

The Ribbon Tool, developed as part of TEA, was chosen to help visualize student flows over multiple time points, with filters for student demographics. The Ribbon Tool is freely available (1). The Microsoft Windows Snipping Tool was used to grab screenshots of the visualizations since exporting of images is not currently available.

The student information data were obtained from the University Registrar. The tool allows a quick view of percentages and isolation of particular groups, and groups can be expanded to higher levels of detail. For instance, this data set has the following layers:

  1. Enrollment Status (Enrolled, Graduated, Dismissed, Left, etc.)
  2. Field (Biological Sciences, Engineering, Social Sciences, Math and Physical Sciences, etc.)
  3. Specific Major (Math, Chemistry, Physics, Undeclared, etc.)
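
As a sketch of how such layered drill-down works, the hypothetical snippet below groups a toy student table first by the top layer alone and then through all three layers. The records and counts are invented for illustration, not actual registrar data:

```python
import pandas as pd

# Hypothetical student snapshot with the three layers described above.
students = pd.DataFrame({
    "status": ["Enrolled", "Enrolled", "Graduated", "Left"],
    "field":  ["Math and Physical Sciences", "Engineering",
               "Math and Physical Sciences", "Math and Physical Sciences"],
    "major":  ["Math", "Civil Eng", "Chemistry", "Physics"],
})

# Collapsed view: counts at the top layer only ...
by_status = students.groupby("status").size()

# ... and the expanded view, drilling down through all three layers.
by_major = students.groupby(["status", "field", "major"]).size()
print(by_status, by_major, sep="\n\n")
```

The Ribbon Tool presents the same expand/collapse operation interactively across time points.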

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

 

 

Predicting Course Enrollment

Analyzing Student Flows through Course Sequence

University of California Davis, Center for Educational Effectiveness

Catherine Uvarov, Marco Molinaro

tea@ucdavis.edu

Purpose

Each term, course enrollment must be estimated long before students register, because departments need to determine how many sections of a class to offer, reserve adequate classroom space, and find instructors for the classes. Proper estimation is necessary to ensure students have access to the courses they need to graduate. Estimation is complicated when a multiple-course sequence can be taken discontinuously and enforcement of prerequisites is not automated during class registration periods. The purpose of these visualizations is to show how historical course grade data could be used to determine the enrollment composition of a course. The visualizations show student flows leading into enrollment in the third course of a three-course sequence (CHE 2A, 2B, and 2C), and how changes to prerequisite enforcement could change enrollment.

Figure 1. Previous CHE 2A and 2B courses and terms of students enrolled in CHE 2C during Spring Quarter.

Tool: Ribbon

The Ribbon Tool, developed as part of TEA, was chosen to help visualize student flows over multiple time points, with filters for student information. The Ribbon Tool is freely available (1). The Microsoft Windows Snipping Tool was used to grab screenshots of the visualizations since exporting of images is not currently available.

The data for these particular visualizations were created from a data file of compiled gradebook and assessment data for the courses and terms in question. Similar course enrollment and grade information could also have been obtained from the University Registrar.

The layers for the Ribbon Tool are:

  1. Enrollment Status (Enrolled, Not Enrolled)
  2. Grade – Rounded (A, B, C, D, F)
  3. Grade – Exact (A+, A, A-, B+, B, B-, etc.)
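
The relationship between the two grade layers is a simple many-to-one mapping. A minimal sketch, assuming the rounded layer simply drops the +/- modifier from the exact grade:

```python
# Illustrative mapping from exact (+/-) grades to the rounded grade layer;
# the assumption that rounding just strips the modifier is ours.
def round_grade(exact: str) -> str:
    """Strip the +/- modifier so 'B+', 'B', and 'B-' all fall in the 'B' layer."""
    return exact.rstrip("+-")

exact_grades = ["A+", "A-", "B", "B-", "C+", "F"]
rounded = [round_grade(g) for g in exact_grades]
print(rounded)
```

Expanding from the rounded layer to the exact layer in the Ribbon Tool reverses this collapse, splitting each grade band into its +/- variants.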

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

 

 

ALEKS Ribbon

Student Course Progress over Time

Summer Preparatory Chemistry Course

University of California Davis, Center for Educational Effectiveness

Catherine Uvarov, Derek Dockter

tea@ucdavis.edu, sp-chem@iamstem.ucdavis.edu

Purpose


Figure 1. Enrollment and completion patterns over the course of the summer. Highlighted are students who have not enrolled (blue), students who have enrolled but not started (red), and students who have finished (orange).

At UC Davis, General Chemistry is a required foundational course for a large number of incoming freshmen. However, many incoming freshmen have not had chemistry since their sophomore year of high school, if at all. Historically, all incoming freshmen have been required to take chemistry and math placement exams that screen out underprepared students and prevent them from enrolling in a class for which they are not ready. Underprepared students take a “Workload” course in the Fall term and General Chemistry in the Winter term; this pattern creates a disparity between the Fall and Winter student demographics. A few years ago, for financial and logistical reasons, the placement exams were moved online and unproctored, which makes them less effective screening tools. The Chemistry Department, in partnership with the Educational Effectiveness Hub (EEH), is piloting the use of ALEKS (1) for a Summer Preparatory course that prepares students over the summer so that they can enroll in General Chemistry in the Fall term. The purpose of these visualizations was to determine which students to target for interventions prior to the deadline to finish the preparatory course.

Tool: Ribbon

The Ribbon Tool, developed as part of TEA, was chosen to help visualize student flows over multiple time points, with filters for student information. The Ribbon Tool is freely available (2). The Microsoft Windows Snipping Tool was used to grab screenshots of the visualizations since exporting of images is not currently available.

The IRB determined that this pilot was not human subjects research. Approximately 1100 students were randomly selected to participate from the pool of incoming domestic freshmen who had completed a Student Intent to Register (SIR). Students were told that the pilot was by invitation only and that completing the ALEKS SP-Chem course would be accepted as a prerequisite, so they did not need to take the placement exam. The student course mastery data were obtained through the custom reporting feature of ALEKS and matched with the invited-student list via Microsoft Excel.
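
The matching step amounts to a left join of the invited-student roster against the ALEKS export. A hypothetical pandas equivalent (all IDs, column names, and values below are invented; the actual matching was done in Excel):

```python
import pandas as pd

# Hypothetical ALEKS custom-report export: one row per student account.
aleks = pd.DataFrame({
    "student_id":  [101, 102, 104],
    "mastery_pct": [97, 40, 0],
})

# Hypothetical invited-student roster.
invited = pd.DataFrame({
    "student_id": [101, 102, 103],
    "college":    ["Engineering", "Letters and Science", "Biological Science"],
})

# Left-join on the roster so invited students with no ALEKS account
# appear with missing mastery, i.e. the "Not-Enrolled" group.
matched = invited.merge(aleks, on="student_id", how="left")
matched["enrolled"] = matched["mastery_pct"].notna()
print(matched)
```

A left join on the roster (rather than an inner join) is what keeps never-enrolled invitees visible in the visualization.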

Students change status from “Not-Enrolled” to “Enrolled” once they create an account on ALEKS. Once enrolled, they are at 0% course mastery until they complete an initial assessment of their prior knowledge. The initial assessment plus any additional learning modules completed represents the student’s overall course mastery on a given day. Students with ≥ 95% course mastery have completed the Summer Preparatory course and can enroll in General Chemistry (CHE 2A). The Ribbon Tool represents snapshots of course mastery on different days.
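
The status logic just described can be sketched as a small classification function. The thresholds follow the text; the function name and status labels are illustrative, not the tool's actual categories:

```python
# Classify one daily snapshot of a student into a ribbon status.
# Threshold follows the text: >= 95% mastery completes the course.
def snapshot_status(enrolled: bool, mastery_pct: float) -> str:
    if not enrolled:
        return "Not-Enrolled"       # no ALEKS account yet
    if mastery_pct >= 95:
        return "Finished"           # eligible to enroll in CHE 2A
    return f"{mastery_pct:.0f}% mastery"

# One hypothetical student's trajectory across three snapshot days.
days = [(False, 0), (True, 30), (True, 96)]
trajectory = [snapshot_status(e, m) for e, m in days]
print(trajectory)
```

Applying this classification to every student on each snapshot day yields the per-day bands that the ribbons connect.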

The layers used were:

  1. Enrollment Status (Enrolled vs. Not-Enrolled).
  2. Course Mastery (0%, 1-10%, 11-20%, 21-30%, etc.)
  3. College (Engineering, Biological Science, Agriculture and Environmental Sciences, and Letters and Science)
  4. Major

The following filters were available:

  1. Chemistry placement test score (pass, did not pass, did not take)
  2. Chemistry Demand (High, Medium, Low) – This is based on historical enrollment numbers for a given major.

Evidence-based Action

For more details about what evidence was collected, and actions taken, join the Tools for Evidence-based Action group on Trellis.

 

 

References

  1. ALEKS, aleks.com
  2. Ribbon Tool, ucdavis.edu