Journal of Computer and Communications

Volume 13, Issue 4 (April 2025)

ISSN Print: 2327-5219   ISSN Online: 2327-5227

Google-based Impact Factor: 1.98

BERT-Prompt Based Equipment Support Domain Sentence Vector Training

PP. 289-310
DOI: 10.4236/jcc.2025.134018

ABSTRACT

In the field of equipment support, generating sentence vectors from word vectors (e.g., by averaging or concatenating them) is simple and effective, but it ignores the order of and dependencies between the words in a sentence and therefore fails to capture the sentence's overall semantics. In contrast, deep learning models (such as RNNs, LSTMs, and Transformers) that generate sentence vectors directly can capture word order and dependency relationships, better represent the overall semantic information of a sentence, and avoid the information loss incurred by reducing a sentence to the average or concatenation of its word vectors. To address the characteristics of the equipment support domain, a BERT-Prompt based method for training equipment support domain sentence vectors is proposed to improve the semantic understanding and representation of equipment failure texts. Specifically, a pre-trained BERT model is applied to sentence vector training and combined with prompt learning: effective prompt-based sentence vector templates are designed and the InfoNCE loss function is adopted, further improving the representation of equipment support sentence vectors. Building on BERT-Prompt, the training of equipment support domain sentence vectors is explored. The paper surveys BERT sentence vector models and their development, introduces common BERT sentence vector models and BERT-Prompt (its main achievements and innovations, core ideas, and common strategies and methods), and details template-based prompt sentence vector representation, continuous prompt templates, the InfoNCE loss function, and the training and optimization process. The experimental analysis covers data preparation, evaluation metrics, experimental setup, comparative methods, and analysis of the results.
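To make the template-based prompt representation and the InfoNCE objective described above concrete, the following is a minimal PromptBERT-style sketch in PyTorch. The model name (bert-base-uncased), the prompt template, and the temperature value are illustrative assumptions for a sketch, not the authors' actual configuration; the paper's own templates and hyperparameters are described in the full text.

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.train()  # keep dropout active so two forward passes yield two "views"

# Hypothetical prompt template; the hidden state at the [MASK] position
# is taken as the sentence vector.
TEMPLATE = 'The equipment fault sentence "{}" means [MASK].'

def embed(sentences):
    """Wrap each sentence in the prompt template and return the hidden
    state at the [MASK] position as its sentence vector."""
    batch = tokenizer([TEMPLATE.format(s) for s in sentences],
                      padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state              # (B, L, H)
    mask_pos = batch["input_ids"] == tokenizer.mask_token_id
    return hidden[mask_pos]                                # (B, H): one [MASK] per input

def info_nce(z1, z2, temperature=0.05):
    """InfoNCE loss: the two views of each sentence are positives;
    all other in-batch pairs act as negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.T / temperature                       # (B, B) cosine similarities
    labels = torch.arange(z1.size(0))                      # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

sentences = ["The hydraulic pump shows abnormal vibration.",
             "Replace the worn seal on the actuator."]
z1, z2 = embed(sentences), embed(sentences)  # dropout produces two distinct views
loss = info_nce(z1, z2)
loss.backward()

In this unsupervised setup (as in SimCSE-style training), the only augmentation is the model's own dropout, and the temperature controls how sharply the loss separates the positive pair from in-batch negatives.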

Share and Cite:

Guo, W., Ling, H. and Pan, L. (2025) BERT-Prompt Based Equipment Support Domain Sentence Vector Training. Journal of Computer and Communications, 13, 289-310. doi: 10.4236/jcc.2025.134018.


Copyright © 2025 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.