Abstract
In this paper, high-speed channel simulators using neural language models are proposed. Given an input sequence of geometry design parameters for differential channels, the proposed channel simulator predicts signal integrity (SI) characteristic sequences such as insertion loss (IL) and far-end crosstalk (FEXT). Sequence-to-sequence (seq2seq) networks based on a recurrent neural network (RNN) and a long short-term memory (LSTM) are used as estimators. Moreover, a transformer network, the neural architecture underlying recent large language models (LLMs), is introduced to this task for the first time. Compared to seq2seq networks, the transformer-based simulator achieves shorter computing time owing to its parallelizable attention mechanism. The accuracy and training time of the seq2seq and transformer networks are validated and compared. All of the proposed simulators show ~1% error rates for both IL and FEXT, while the transformer network reduces training time by 75%-83% compared to the seq2seq networks.
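The abstract's speed argument rests on attention processing all sequence positions at once, whereas an RNN/LSTM must step through positions serially. The following minimal NumPy sketch of scaled dot-product self-attention illustrates that parallelism; the shapes (8 geometry-parameter tokens, model width 16) are purely illustrative assumptions, not values from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head scaled dot-product attention.

    Q, K, V: arrays of shape (seq_len, d_model). All positions are
    scored against all others in one matrix product, which is the
    parallel computation that lets a transformer avoid the serial
    time-step recurrence of an RNN/LSTM.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq_len, seq_len)
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Hypothetical input: 8 embedded geometry-parameter tokens, d_model = 16.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
out, attn = scaled_dot_product_attention(x, x, x)     # self-attention
print(out.shape)                                      # (8, 16)
```

In a full simulator, stacks of such attention layers (with learned projections and feed-forward blocks) would map the geometry sequence to the predicted IL/FEXT sequence; this sketch only shows the core operation responsible for the reported training-time reduction.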
Recommended Citation
H. Park et al., "High-speed Channel Simulator using Neural Language Models," IEEE International Symposium on Electromagnetic Compatibility, pp. 11-16, Institute of Electrical and Electronics Engineers, Jan. 2024.
The definitive version is available at https://doi.org/10.1109/EMCSIPI49824.2024.10705639
Department(s)
Electrical and Computer Engineering
Keywords and Phrases
High-speed channel; Neural language model; Signal integrity; Transformer network
International Standard Serial Number (ISSN)
2158-1118; 1077-4076
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2024 Institute of Electrical and Electronics Engineers. All rights reserved.
Publication Date
01 Jan 2024
Comments
National Science Foundation, Grant IIP-1916535