Stephen Carroll is director of Core Writing and co-director of the Professional Writing Program at Santa Clara University. He works on a team supported by the National Science Foundation to upgrade the platform for the online Student Assessment of their Learning Gains (SALG) instrument, and he continues to develop new features that make the SALG easier to use and more effective for educators. He also contributes his expertise to SENCER's overall assessment and evaluation program by serving on the Assessment Advisory Committee. His current projects focus on the intersections of pedagogy, technology, assessment, writing, and learning. He has developed and taught a number of experimental, cross-disciplinary pilot courses to explore new ways to enhance student learning. He serves as a science writer for the National Science Foundation, having recently completed two reports on its Undergraduate Research Centers/Collaboratives project. His strong background in information technology stems from many years in the corporate world, where he served as a computer operations manager, help desk manager, and technical training manager. In addition to his work revamping the SALG instrument and website, Stephen is investigating the use of course-specific writing practices to enhance learning in the sciences. His recent publications focus on how to use assessment practices to drive innovation in teaching and learning and on leveraging existing technologies to enhance communication and accelerate learning, especially in undergraduate learning communities.
Stephen Carroll, Senior Research Fellow (Assessment)

By Dr. Stephen Carroll, Principal Investigator, SALG

“In fact, the margins by which SENCER faculty outperformed their non-SENCER colleagues tended to be slightly larger in the areas of higher-order learning gains (those pertaining to affective factors and habit formation). This indicates that the learning gains made by students in SENCER courses were more likely to be long-lasting than the gains made in non-SENCER courses.”

Analysis of the SALG (Student Assessment of their Learning Gains) data from September 2007 to September 2011 reveals that the SENCER project is succeeding on a number of levels. First, the data show that an ever-increasing number of SENCER faculty are using the SALG to assess and improve their teaching. Over the past four years, SENCER faculty conducted 1,314 SALG surveys, slightly more than 27% of the total number of SALG surveys delivered. The number of SENCER surveys has increased every year: the first year, 2008, saw the largest jump, from 67 to 1,019; the count then grew to 1,594 in 2009, 2,026 in 2010, and 2,174 in 2011 (1). This substantial and consistent growth suggests that SENCER faculty are getting useful data from the SALG.
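The year-over-year arithmetic behind these figures is simple enough to check directly. Below is a minimal sketch in Python using the counts reported above; the variable names are illustrative only, and the 2011 figure is the partial-year count (see endnote 1).

```python
# Year-over-year growth in SALG surveys run by SENCER faculty, using
# the counts reported above (the 2011 figure covers a partial year).
counts = {2008: 1019, 2009: 1594, 2010: 2026, 2011: 2174}

previous = 67  # starting count reported for the period before 2008
for year in sorted(counts):
    current = counts[year]
    growth = 100 * (current - previous) / previous
    print(f"{year}: {current:>5} surveys ({growth:+.0f}% over the prior year)")
    previous = current
```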

Second, consistent with this finding, the data show that SENCER faculty, even more than SALG users in general, use information gained from the SALG to make substantive revisions to their course designs and pedagogy. Over the past three years, 87 of the approximately 200 SENCER faculty using the SALG have made changes in 387 SALG instruments covering more than 200 courses. A detailed analysis of students' answers shows that these course revisions are working: the trend line shows that scores related to pedagogical goals are increasing for SENCER faculty (2). Moreover, this upward trend holds across all the main categories of pedagogical goals: those related to a) understanding course content, b) skill-building, c) changing attitudes toward science, and d) building habits of mind and behavior. In fact, the margins by which SENCER faculty outperformed their non-SENCER colleagues tended to be slightly larger in the areas of higher-order learning gains (those pertaining to affective factors and habit formation). This indicates that the learning gains made by students in SENCER courses were more likely to be long-lasting than the gains made in non-SENCER courses.
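To make "the trend line shows" concrete, the sketch below fits an ordinary least-squares line to per-year mean scores for a single category of pedagogical goals. The score values here are placeholders chosen for illustration, not actual SALG results.

```python
from statistics import mean

# Hypothetical per-year mean scores (on the SALG's 1-5 scale) for one
# category of pedagogical goals; placeholder values, not actual SALG data.
years = [2008, 2009, 2010, 2011]
scores = [3.6, 3.7, 3.8, 3.9]

# Ordinary least-squares slope; a positive slope corresponds to the
# upward trend described above.
x_bar, y_bar = mean(years), mean(scores)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(years, scores)) / sum(
    (x - x_bar) ** 2 for x in years
)
print(f"Trend: {slope:+.3f} points per year")
```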

Finally, in 2007, 2008, and 2009, the scores for SENCER faculty were higher in almost all categories than those for non-SENCER faculty. This is quite remarkable, since research shows that students tend to give lower evaluation scores to faculty using innovative pedagogies like the ones used in SENCER courses (3). In 2010 and 2011, the situation became more complicated because the redesign of the SALG website attracted a much larger number of non-science faculty. Humanities and social science faculty tend to get higher ratings than physical science faculty; major courses tend to get better ratings than general education courses; and upper-division courses tend to get better ratings than lower-division courses (4). (See Chart 1.) As SENCER courses are almost exclusively lower-division, general education, physical science courses, it is unsurprising that the large influx of humanities and social science faculty using the SALG has driven up the averages for non-SENCER faculty in the past two years. However, preliminary analysis of the data suggests that the averages for SENCER faculty remain above the averages for non-SENCER faculty in the sciences.
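The compositional effect described here, where a shift in who uses the SALG moves the raw averages, is the classic reason to stratify before comparing groups. A minimal sketch of that grouping logic follows; the field names and sample records are hypothetical, and course level could be added as a second stratification key.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey records; the field names and values here are
# illustrative, not drawn from the SALG dataset.
surveys = [
    {"group": "SENCER", "discipline": "physical science", "score": 3.9},
    {"group": "non-SENCER", "discipline": "physical science", "score": 3.6},
    {"group": "non-SENCER", "discipline": "humanities", "score": 4.2},
    {"group": "SENCER", "discipline": "physical science", "score": 4.0},
]

# Average within each (discipline, group) stratum so that an influx of
# higher-rated humanities surveys cannot skew the overall comparison.
strata = defaultdict(list)
for s in surveys:
    strata[(s["discipline"], s["group"])].append(s["score"])

for (discipline, group), vals in sorted(strata.items()):
    print(f"{discipline:17} {group:11} mean = {mean(vals):.2f}")
```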

Overall, the data clearly show that SENCER is improving science education and civic engagement across the nation, supporting the NSF's STEM education goals. The trends are strong and consistent: the SENCER project is not only working, it is getting better with time.

View Chart 1 (Scores in SENCER and non-SENCER Instruments): http://ncsce.net/wp-content/uploads/2016/09/scores_sencer_non-sencer_instr.pdf
The SALG website is a free course-evaluation tool that allows college-level instructors to gather learning-focused feedback from students. Once registered on the SALG site, you can create and use a SALG survey to measure students' learning gains in your course and their progress toward your course's learning goals. For information on creating a SENCER-SALG instrument, please visit http://new.sencer.net/sencer-salg/. The SALG is supported by the National Science Foundation.

Endnotes

  1. Because the dataset for 2011 ends in September, it captures only about 60% of the data for that year. Extrapolating the partial-year count to a full year (2,174 / 0.60 ≈ 3,623) suggests that about 3,623 surveys were delivered by SENCER faculty in 2011.
  2. The data for 2007 were left out of these calculations because the number of surveys returned was too small to support statistically reliable comparisons.
  3. See Seymour, E., Wiese, D., Hunter, A., & Daffinrud, S.M. (2000). Creating a Better Mousetrap: On-line Student Assessment of their Learning Gains. Boulder, CO: University of Colorado, Bureau of Sociological Research.
  4. See Benton, S.L., & Cashin, W.E. (2011). IDEA Paper #50: Student Ratings of Teaching: A Summary of Research and Literature. Kansas State University.