Do Exam Policies Matter in College?


This paper uses binary logistic regression to show how examination policies affect students’ learning outcomes. The types of examination employed by instructors are divided broadly into three: traditional, nontraditional, and project. Using data from an undergraduate business program, the study develops a binary logistic regression model predicting the effects of the three examination types on students’ learning outcomes. The results showed that the traditional (in-class) examination had the largest predictive power on students’ learning outcomes. The nontraditional examination and the project had significantly less predictive power than the traditional examination, with the project having the least. The findings suggest, first, that instructors’ examination policies may be less impactful, or may even have negative effects, on learning outcomes; second, that a particular combination of traditional, nontraditional, and project examinations may most effectively boost students’ learning outcomes; third, that students in an academic program with higher correctly classified estimates would be expected to attain higher learning outcomes than students in a program with significantly lower correctly classified estimates; fourth, that examination policies can be deployed as a critical tool for improving students’ learning outcomes; and, fifth, that periodic evaluation of examination policies in an academic program may be useful.
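The modeling approach the abstract describes can be sketched in code. The study's actual data are not public, so the example below simulates standardized scores for the three hypothetical examination-type predictors and a binary learning outcome, then fits a binary logistic regression and reports a "correctly classified" rate analogous to the classification-table estimates the abstract mentions. All variable names and effect sizes are illustrative assumptions, not the paper's estimates.

```python
# Illustrative sketch only: synthetic data standing in for the study's
# (non-public) undergraduate business program data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Hypothetical predictors: standardized scores on each examination type.
traditional = rng.normal(size=n)
nontraditional = rng.normal(size=n)
project = rng.normal(size=n)

# Simulated binary outcome, with the traditional exam given the largest
# weight to mirror the paper's finding about relative predictive power.
logit = 1.5 * traditional + 0.8 * nontraditional + 0.3 * project
outcome = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Fit the binary logistic regression on the three predictors.
X = np.column_stack([traditional, nontraditional, project])
model = LogisticRegression().fit(X, outcome)

# Share of cases correctly classified, analogous to the abstract's
# "correctly classified estimates" for an academic program.
accuracy = model.score(X, outcome)
print("Coefficients:", model.coef_.round(2))
print(f"Correctly classified: {accuracy:.1%}")
```

Because the simulation builds in the ordering the paper reports, the fitted coefficients recover traditional > nontraditional > project; on real program data, that ordering is of course an empirical question.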

Share and Cite:

Ikwueze, L. (2014). Do Exam Policies Matter in College? Creative Education, 5, 177-184. doi: 10.4236/ce.2014.54027.

Conflicts of Interest

The author declares no conflicts of interest.



Copyright © 2023 by authors and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.