Journal of Computer and Communications

Volume 12, Issue 1 (January 2024)

ISSN Print: 2327-5219   ISSN Online: 2327-5227

Google-based Impact Factor: 1.12

Parallel Inference for Real-Time Machine Learning Applications

PP. 139-146
DOI: 10.4236/jcc.2024.121010

ABSTRACT

Hyperparameter tuning is a key step in developing high-performing machine learning models, but searching large hyperparameter spaces requires extensive computation with standard sequential methods. This work analyzes the performance gains of parallel over sequential hyperparameter optimization. Using scikit-learn's RandomizedSearchCV, this project tuned a Random Forest classifier for fake news detection via randomized search over the hyperparameter space. Setting n_jobs to -1 enabled full parallelization across CPU cores. Results show the parallel implementation achieved over 5× faster CPU times and 3× faster total run times compared to sequential tuning. However, test accuracy dropped slightly, from 99.26% sequentially to 99.15% with parallelism, indicating a trade-off between evaluation efficiency and model performance. Still, the significant computational gains allow more extensive hyperparameter exploration within reasonable timeframes, outweighing the small accuracy decrease. Further analysis could better quantify this trade-off across different models, tuning techniques, tasks, and hardware.
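The setup the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' exact experiment: the synthetic dataset, hyperparameter ranges, iteration count, and cross-validation folds are all assumptions made for the sake of a self-contained example; only the use of RandomizedSearchCV with n_jobs=1 (sequential) versus n_jobs=-1 (all cores) comes from the abstract.

```python
# Sketch of parallel vs. sequential hyperparameter tuning with scikit-learn's
# RandomizedSearchCV, as described in the abstract. Dataset and search ranges
# are illustrative assumptions.
from time import perf_counter

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Stand-in for the fake news dataset used in the paper.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hypothetical search space; the paper does not list its exact ranges.
param_distributions = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 10, 20],
    "min_samples_split": [2, 5, 10],
}

def tune(n_jobs):
    """Run the randomized search and report wall-clock time."""
    search = RandomizedSearchCV(
        RandomForestClassifier(random_state=0),
        param_distributions,
        n_iter=10,
        cv=3,
        n_jobs=n_jobs,  # 1 = sequential; -1 = use all CPU cores
        random_state=0,
    )
    start = perf_counter()
    search.fit(X_train, y_train)
    return search, perf_counter() - start

seq_search, seq_time = tune(n_jobs=1)    # sequential baseline
par_search, par_time = tune(n_jobs=-1)   # fully parallel across cores

print(f"sequential: {seq_time:.1f}s, parallel: {par_time:.1f}s")
print("parallel test accuracy:", par_search.score(X_test, y_test))
```

With a fixed random_state, both runs evaluate the same candidate configurations, so any accuracy difference between the two modes would stem from the tuning procedure and hardware rather than the parallelism flag itself; the speedup from n_jobs=-1 depends on the number of available cores.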

Share and Cite:

Bayyat, S. , Alomran, A. , Alshatti, M. , Almousa, A. , Almousa, R. and Alguwaifli, Y. (2024) Parallel Inference for Real-Time Machine Learning Applications. Journal of Computer and Communications, 12, 139-146. doi: 10.4236/jcc.2024.121010.


Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.