Journal of Computer and Communications

Volume 10, Issue 4 (April 2022)

ISSN Print: 2327-5219   ISSN Online: 2327-5227

Google-based Impact Factor: 1.12

The Practice of a Method of Self-Study Students Counting in Classrooms Based on Head Detection in Colleges and Universities

PP. 51-62   DOI: 10.4236/jcc.2022.104005
Author(s): M. Liu and L. Yu

ABSTRACT

Because classroom resources are limited, students often struggle to find self-study classrooms. To address this problem, and in light of the current state of informatization in colleges and universities, a feasible method of counting students in classrooms based on head detection is proposed. The method first captures scene images in the classroom at regular intervals through the existing examination monitoring system, and then uses an offline-trained AdaBoost cascade detector to locate head candidate regions in the images. A trained CNN-SVM model then further verifies each candidate as a head; finally, the verification results are post-processed and the number of students in the classroom is counted. Testing and practical deployment show that a query system for the idle status of self-study classrooms, built by combining classroom seat capacity, classroom scheduling data, and the student counts obtained by this method, lets students easily check how crowded each classroom currently is, which effectively guides them in finding self-study classrooms. The method offers a strong reference and is readily transferable to similar problems at other universities.
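The pipeline described in the abstract (periodic capture, AdaBoost cascade candidate proposal, CNN-SVM verification, counting) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `propose` and `verify` callables stand in for the trained cascade detector and CNN-SVM model, and the width-threshold "verifier" in the example is a hypothetical rule used only to make the sketch runnable.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

# A candidate head region as (x, y, w, h), as a cascade detector would return.
Box = Tuple[int, int, int, int]

@dataclass
class ClassroomCounter:
    """Counts students in one captured classroom frame.

    propose: stand-in for the offline-trained AdaBoost cascade detector.
    verify:  stand-in for the trained CNN-SVM head verifier.
    """
    propose: Callable[[object], List[Box]]
    verify: Callable[[object, Box], bool]

    def count(self, frame: object) -> int:
        # Step 1: the cascade proposes head candidate regions.
        candidates = self.propose(frame)
        # Step 2: the verifier filters false positives; survivors are counted.
        return sum(1 for box in candidates if self.verify(frame, box))

if __name__ == "__main__":
    # Toy stand-ins: a fixed candidate list and a verifier that rejects
    # boxes narrower than 20 px (hypothetical rule, for illustration only).
    candidates = [(10, 10, 32, 32), (50, 40, 12, 12), (90, 20, 30, 30)]
    counter = ClassroomCounter(
        propose=lambda frame: candidates,
        verify=lambda frame, box: box[2] >= 20,
    )
    print(counter.count(None))  # two of the three candidates survive
```

In a real deployment the `propose` step would likely wrap something like OpenCV's `cv2.CascadeClassifier.detectMultiScale` on each periodically captured frame, with the CNN-SVM model applied to each proposed crop.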

Share and Cite:

Liu, M. and Yu, L. (2022) The Practice of a Method of Self-Study Students Counting in Classrooms Based on Head Detection in Colleges and Universities. Journal of Computer and Communications, 10, 51-62. doi: 10.4236/jcc.2022.104005.


Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.