Journal of Intelligent Learning Systems and Applications

Volume 16, Issue 1 (February 2024)

ISSN Print: 2150-8402   ISSN Online: 2150-8410

Google-based Impact Factor: 1.5

A CNN-Based Single-Stage Occlusion Real-Time Target Detection Method

PP. 1-11
DOI: 10.4236/jilsa.2024.161001

ABSTRACT

Aiming at the low accuracy of traditional target detection methods when detecting targets in endoscope imagery from substation environments, a CNN-based real-time detection method for occluded targets is proposed. The method adopts an overall design comprising a backbone network, a detection network, and an algorithmic parameter optimization procedure; the model is trained on a self-constructed occluded-target dataset, and a multi-scale perception approach is used for target detection. During training, a hard negative mining (HNM) algorithm screens positive and negative samples, and during detection a non-maximum suppression (NMS) algorithm post-processes the prediction results to improve detection efficiency. Experimental validation shows that the trained model's mean average precision (mAP) on the dataset gives it a general advantage over traditional target detection methods. The detection time for a single target on the FDDB dataset is 39 ms, which meets the requirements of real-time target detection. In addition, the project team has successfully deployed the method in substations at multiple sites in Beijing, which is significant for detecting anomalies involving occluded targets.
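The abstract names two standard single-stage-detector components, hard negative mining for balancing training samples and non-maximum suppression for pruning overlapping predictions, without giving implementation details. The sketch below shows generic NumPy versions of both as they are commonly used in such detectors; the function names, the IoU threshold of 0.5, and the 3:1 negative-to-positive ratio are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression over axis-aligned boxes.

    boxes: (N, 4) array of [x1, y1, x2, y2]; scores: (N,) confidences.
    Returns indices of the kept boxes, highest score first.
    """
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]          # process highest-scoring boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Intersection of box i with every remaining box.
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Suppress boxes overlapping box i above the threshold; keep the rest.
        order = order[1:][iou <= iou_threshold]
    return keep

def hard_negative_mining(conf_losses, is_positive, neg_pos_ratio=3):
    """Select all positive anchors plus the highest-loss negatives.

    conf_losses: (N,) per-anchor classification losses.
    is_positive: (N,) boolean mask of anchors matched to ground truth.
    The 3:1 ratio follows common single-stage practice (e.g. SSD);
    the paper does not state the ratio it uses.
    """
    num_pos = int(is_positive.sum())
    num_neg = min(neg_pos_ratio * num_pos, int((~is_positive).sum()))
    neg_losses = np.where(is_positive, -np.inf, conf_losses)
    hardest = np.argsort(neg_losses)[::-1][:num_neg]   # highest-loss negatives
    keep = is_positive.copy()
    keep[hardest] = True
    return keep
```

Greedy NMS keeps only the highest-scoring prediction in each cluster of overlapping boxes, which is why it reduces post-processing load on the dense outputs of a single-stage detector; capping negatives at a fixed multiple of the positives keeps the classification loss from being dominated by easy background anchors.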

Share and Cite:

Liu, L., Yang, N., Liu, S., Cao, Y., Tian, S., Liu, T. and Zhao, X. (2024) A CNN-Based Single-Stage Occlusion Real-Time Target Detection Method. Journal of Intelligent Learning Systems and Applications, 16, 1-11. doi: 10.4236/jilsa.2024.161001.

Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.