Grading and Evaluating
Student Experience Survey - 1J7
The approved policies below establish a fourteen-item common Student Experience Survey (SES), a policy for administering the SES, a policy governing the use of SES results, and a policy requiring review of the SES process every three years.
Preamble
There is a wide body of research indicating that student evaluations of teaching (SETs) may be influenced by factors such as instructor gender, physical attractiveness, race, and other characteristics (see reference list below). Prior student interest in the subject matter is also a factor, giving instructors of certain courses an advantage over others. For example, some instructors have the responsibility of teaching relatively unpopular courses, which may put them at a disadvantage.
Further, many faculty members have a responsibility to awaken students to discriminatory ideology and institutional practices that are hegemonic and oppressive to those not in the dominant group(s) in the world. Attempts to help students understand ableism, ageism, racism, sexism, and discrimination against those of non-dominant sexual orientation, ethnicity, or religious affiliation often lead to antipathy and confusion among students. These phenomena must be weighed when considering student evaluations of faculty teaching courses that expose racism, sexism, homophobia, and other forms of bigotry. Although this issue may be more relevant to some disciplines than others, it can be a factor in all disciplines and in any course.
As a result, Policy 1J7 initiates alternative approaches to evaluating teaching. Section I details the learning-focused Student Experience Survey (SES) that replaces the standard SET framework. Section II covers the administration of the survey. Section III describes additional avenues of evaluation that must be used in conjunction with the SES. Finally, Section IV covers the continuous review process for the SES.
- Policy on Student Experience Survey (SES)
The fourteen questions in I.A. will be used on all end-of-semester student experience surveys for all courses regardless of modality (e.g., face-to-face; asynchronous online; hybrid). Currently exempted course types include lab, studio, performance, field placement, practicum, and internship courses.
- Student Experience Survey Core Instrument Questions:
- I consistently prepare for this course.
- I consistently attend this course.
- I interact with the instructor outside of class (online or face-to-face).
- Expectations for graded work are explained clearly.
- Feedback on assignments for the course helps me learn from the experience.
- The course assignments have helped me learn.
- The course material is explained clearly.
- Help is available if I have questions or difficulties.
- The course content is readily accessible to me.
- There are opportunities for student-to-student interactions.
- I am learning to evaluate diverse ideas and perspectives.
- The course encourages me to consider new ideas.
- The course includes content from people with diverse backgrounds.
- The learning environment is welcoming for all students.
The instrument uses a five-point scale: Strongly Disagree, Disagree, Neither Disagree nor Agree, Agree, and Strongly Agree. Each item must be presented so that students can leave it blank or mark "not applicable" when it does not apply to them.
- Additional Considerations
Colleges/schools and departments can add quantitative and/or qualitative questions to enhance the utility of student feedback. Such additional feedback makes the SES instrument more useful for instructors' efforts to improve their teaching.
The Student Experience Survey shall be subject to rigorous and ongoing evaluation. It is important to assess potential threats to validity, possible bias, and patterns over time.
- Policies on Administration of SESs
- Process of administering the SESs during the required end-of-semester evaluation:
- SIUE forms for Student Experience Survey include the approved campus-wide core. In addition, each department, school, or college can add a second section of multiple-choice questions and a section of open-ended questions.
- Student Experience Survey may be administered in paper-pencil format or online. Regardless of mode of delivery, the process must ensure anonymity for students. (Note: tools such as Qualtrics and Survey Monkey may not ensure student anonymity.)
- Before students take the survey, instructors should provide a standard statement, in writing or verbally, explaining the importance and purpose of the survey and how the results will be used.
- The administrator should instruct students not to talk to each other while filling out SESs.
- The process must assure student anonymity on the SESs.
- If time is given during class for students to complete surveys, the instructor must not be present while surveys are being filled out.
- The department should develop a plan for administering student experience surveys. This plan should designate who will administer course surveys (whether the surveys are in paper-pencil format or online). If a departmental designee is unavailable, the instructor can use a "signed envelope" procedure: in such an instance, the department chair or instructor must designate a student in the class to collect all surveys in a single large envelope, seal it, sign it across the seal, and deliver it to the department secretary or other designated location.
- Instructors must not have contact with individual SESs once they have been distributed; for non-electronic administration, someone else must collect the completed SESs and deliver them to the person responsible for processing them. Instructors may not see the original survey forms after they have been completed.
- Handwritten comments must be typed before the instructor receives them.
- For paper administration, SES forms (both completed and blank) must be returned in the SES packet and accounted for. For online administration, faculty members and departments should note response rates and their potential impact on the results. Regardless of mode of delivery, surveys are anonymous; anonymity extends to whether or not a given student has completed the survey.
- The final results are provided to the instructor after the final grade submission period is over.
- Further suggestions
It is suggested that instructors administer a survey during the semester in addition to the end-of-semester survey. (A midterm survey may help instructors identify problems and remedy them while there is still time to do so.)
In the case of the midterm surveys, the survey practices should ensure anonymity. Departments should work to develop effective practices to support faculty members who wish to implement midterm surveys for the purpose of course improvement.
- Policy on Use of Results of Student Experience Surveys
- Policy on use of student experience surveys
- SES results shall not be used as the sole or primary indicator of teaching effectiveness for any faculty member, whether considered individually or collectively. It is the responsibility of each department to inform its faculty of the review policy. Specifically, multiple measures must be used to evaluate faculty teaching. Such additional measures may include the following:
- Peer evaluations through faculty development programs or through instructors in the department.
- See Gormally et al. (2014) for more information on giving instructional feedback to others.
- The Center for Faculty Development and Innovation can provide teaching professional development and facilitate these observations through programs such as Teaching Peer Consultants, who use the Group Instructional Feedback Technique (GIFT).
- Surveys that assess student perceptions of your classroom and include qualitative prompts for student feedback.
- Example surveys/tools can measure active learning (Owens et al., 2017) or student perceptions of pedagogy, including active learning, diversity, and sense of belonging (Owens et al., 2018).
- Assessments of learning (quizzes, pre-/post-learning tests, student assessment of course objective mastery)
- Midterm evaluations
- Faculty reflections on efforts to improve learning measures
- This reflection on how you perceive your teaching effectiveness can be triangulated with other sources, such as student and peer evidence.
- Teaching portfolios that can include the following (Berk, 2005):
- Personal and peer reflections
- Teaching awards
- Relevant course materials
- Teaching scholarship – presentations on teaching/learning effectiveness
- Student evidence, such as exit tickets, metacognitive reflections, pre-post assessments, etc.
- Video recordings can serve as a tool for you to reflect on your own teaching. This can also be a mechanism for peer feedback (Berk, 2005).
- The response to a single question on an SES shall never be used as the sole or primary indicator of faculty effectiveness drawn from that instrument, even when the instrument is used in conjunction with other measures. This applies both to individual faculty members and to groups of faculty members. Also, for quantitative SES items, the percentage of student responses in each answer category is more useful than the arithmetic mean for each item.
- Results of SESs shall not be used to compare faculty members or collections of faculty members for evaluation purposes. Rather, they shall be used in at least one of the following ways:
- to document faculty improvement or changes in a faculty member's results in the same class over time
- along with other indicators of teaching quality, to determine the quality of faculty teaching
- to assess the extent to which faculty use evaluation results to improve their teaching
- The Chair and/or other review committee should meet with faculty to interpret and discuss the results of student experience surveys, remaining aware of potential biases.
- Because student experience surveys are anonymous, no disciplinary action may be based solely on student experience surveys.
- SES Continuous Review Committee
The SES Continuous Review Committee, a subcommittee of the Committee on Assessment, meets every three years to oversee the continuous review and validation of the SIUE Student Experience Survey. The Committee shall consist of a minimum of four faculty members, including the Director of Assessment (as a voting member) and an additional liaison from the Committee on Assessment. Faculty members will be chosen for their expertise in psychometric measurement, survey design, and statistics. Appointments are made jointly by the Director of Assessment and the Committee on Assessment and approved through the Faculty Senate. Appointments to the Committee shall normally be for a three-year term; reappointment is permitted. All members of the Committee are voting members. The Committee shall be responsible for the continuous review and validation of the SIUE Student Experience Survey and for making recommendations to the Committee on Assessment and the Faculty Senate on the basis of the data collected.
- References for SES Policies
Arend (2018). Towards a comprehensive teaching evaluation framework: A summary of literature, recommendations, and examples from other institutions. White paper.
Baldwin, T., & Blattner, N. (2003). Guarding against potential bias in student evaluations: What every faculty member needs to know. College Teaching, 51(1), 27-32.
Basow, S. (1994). Student ratings of professors are not gender blind. Retrieved September 28, 2004, from http://www.awmmath.org/newsletter/199409/basow.html
Bennett, S. K. (1982). Student perceptions of and expectations for male and female instructors: Evidence relating to the question of gender bias in teaching evaluation. Journal of Educational Psychology, 74(2), 170-179.
Berk, R. A. (2005). Survey of 12 strategies to measure teaching effectiveness. International Journal of Teaching and Learning in Higher Education, 17(1), 48-62.
Boring, A., Ottoboni, K., & Stark, P. (2016). Student evaluations of teaching (mostly) do not measure teaching effectiveness. ScienceOpen Research, 1, 16.
Freeman, H. R. (1994). Student evaluations of college instructors: Effects of type of course taught, gender and gender role, and student gender. Journal of Educational Psychology, 86(4), 627-630.
Gormally, C., Evans, M., & Brickman, P. (2014). Feedback about teaching in higher ed: Neglected opportunities to promote change. CBE—Life Sciences Education, 13(2), 187-199.
Greenwald, A. G., & Gillmore, G. M. (1997). Grading leniency is a removable contaminant of student ratings. American Psychologist, 52(11), 1209-1217.
Hendrix, K. T. (1993). Guess who's coming to lecture? Two case studies in professor credibility. Paper presented at the Annual Meeting of the Western States Communication Association, Washington, DC.
Linse, A. R. (2017). Interpreting and using student ratings data: Guidance for faculty serving as administrators and on evaluation committees. Studies in Educational Evaluation, 54, 94-106.
Marsh, H. W., & Roche, L. A. (2000). Effects of grading leniency and low workload on students' evaluations of teaching: Popular myth, bias, validity, or innocent bystanders? Journal of Educational Psychology, 92(1), 202-228.
Naftulin, D. H., Ware, J. E., & Donnelly, F. A. (1973). The Doctor Fox lecture: A paradigm of educational seduction. Journal of Medical Education, 48, 630-635.
Nast, H. J. (1999). 'Sex', 'race' and multiculturalism: Critical consumption and the politics of course evaluations. Journal of Geography in Higher Education, 23(1), 102-115.
O'Reilly, M. T. (1987). Relationship of physical attractiveness to students' ratings of teaching effectiveness. Journal of Dental Medicine, 51(10), 600-602.
Owens, M. T., Seidel, S. B., Wong, M., Bejines, T. E., Lietz, S., Perez, J. R., Sit, S., Subedar, Z. S., Acker, G. N., Akana, S. F., Balukjian, B., Benton, H. P., Blair, J. R., Boaz, S. M., Boyer, K. E., Bram, J. B., Burrus, L. W., Byrd, D. T., Caporale, N., … Tanner, K. D. (2017). Classroom sound can be used to classify teaching practices in college science courses. Proceedings of the National Academy of Sciences of the United States of America, 114(12), 3085–3090. https://doi.org/10.1073/pnas.1618693114
Owens, M. T., Trujillo, G., Seidel, S. B., Harrison, C. D., Farrar, K. M., Benton, H. P., Blair, J. R., Boyer, K. E., Breckler, J. L., Burrus, L. W., Byrd, D. T., Caporale, N., Carpenter, E. J., Chan, Y. H. M., Chen, J. C., Chen, L., Chen, L. H., Chu, D. S., Cochlan, W. P., … Tanner, K. D. (2018). Collectively improving our teaching: Attempting biology department–wide professional development in scientific teaching. CBE Life Sciences Education, 17(1), 1–17. https://doi.org/10.1187/cbe.17-06-0106
SIUE Department of Historical Studies (2004). Operating Papers, Appendix I.
Spooren, P., Brockx, B., & Mortelmans, D. (2013). On the validity of student evaluation of teaching: The state of the art. Review of Educational Research, 83(4), 598–642.
Sproule, R. (2000). Student evaluation of teaching: A methodological critique of conventional practices. Education Policy Analysis Archives, 8(50).
Williams, W. M., & Ceci, S. J. (1997). "How'm I doing?" Problems with student ratings of instructors and courses. Change: The Magazine of Higher Learning, 29, 12-23.
Approved by Chancellor effective 2/20/26
This policy was issued on February 23, 2026, replacing the June 16, 2017 version.
Document Reference: 1J7
Origin: WC 6-06/07; CC 9-10/11, CC 10-11/12 & OC 5/3/13; CC 18-16/17; FS 01-25/26

