Paper
28 March 2024
Power system transient stability assessment based on improved graph attention broad learning system
Ruixue Ni, Haiquan Zhao
Proceedings Volume 13091, Fifteenth International Conference on Signal Processing Systems (ICSPS 2023); 130912O (2024) https://doi.org/10.1117/12.3022775
Event: Fifteenth International Conference on Signal Processing Systems (ICSPS 2023), 2023, Xi’an, China
Abstract
A fast and accurate transient stability assessment (TSA) method is essential for the safe and stable operation of power systems. Existing deep learning algorithms have achieved good results in TSA, but their training time grows as the number of network layers increases. Therefore, to improve both the accuracy of power system transient stability assessment and its computational efficiency, this paper proposes an improved graph attention broad learning system (IGAT-BLS) model, which combines the broad learning system with a graph attention network. The model leverages the deep feature extraction capability of graph attention networks to enhance its feature representation, and uses the flat structure of the broad learning system (BLS) to reduce the complexity of the network model. Simulation experiments on the New England 39-bus benchmark system show that, compared with other deep-learning-based TSA methods, the proposed method makes more accurate transient stability judgments with shorter training time.
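Only the abstract is available here, so the following is a rough, hypothetical sketch of the general idea it describes, not the authors' IGAT-BLS implementation: a graph-attention layer extracts features from bus-level measurements, these are widened with randomly weighted enhancement nodes, and the output weights are solved analytically by ridge regression (pseudo-inverse) in the spirit of a broad learning system. All class names, layer sizes, the single-head attention simplification, and the graph-level mean pooling are assumptions.

```python
# Hypothetical sketch only: GAT-style feature extraction + BLS-style readout.
# Not the authors' IGAT-BLS; layer sizes and structure are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGATLayer(nn.Module):
    """Single-head graph attention layer (simplified)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)
        self.leaky_relu = nn.LeakyReLU(0.2)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) adjacency with self-loops
        h = self.W(x)                                     # (N, out_dim)
        N = h.size(0)
        h_i = h.unsqueeze(1).expand(N, N, -1)
        h_j = h.unsqueeze(0).expand(N, N, -1)
        e = self.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1))).squeeze(-1)
        e = e.masked_fill(adj == 0, float("-inf"))        # attend to neighbours only
        alpha = torch.softmax(e, dim=-1)                  # attention coefficients
        return F.elu(alpha @ h)                           # aggregated node features


class GATBroadSketch(nn.Module):
    """GAT feature extractor feeding a broad-learning-style flat readout."""
    def __init__(self, in_dim, hid_dim, n_enhance, lam=1e-3):
        super().__init__()
        self.gat = SimpleGATLayer(in_dim, hid_dim)
        # Random, fixed enhancement weights, as in a broad learning system
        self.enhance = nn.Linear(hid_dim, n_enhance)
        for p in self.enhance.parameters():
            p.requires_grad_(False)
        self.lam = lam
        self.W_out = None                                 # solved analytically

    def features(self, x, adj):
        z = self.gat(x, adj).mean(dim=0, keepdim=True)    # graph-level feature
        h = torch.tanh(self.enhance(z))                   # enhancement nodes
        return torch.cat([z, h], dim=-1)                  # broad feature vector

    def fit_readout(self, A, Y):
        # Ridge-regression pseudo-inverse: W = (A^T A + lam I)^-1 A^T Y
        I = torch.eye(A.size(1))
        self.W_out = torch.linalg.solve(A.T @ A + self.lam * I, A.T @ Y)

    def forward(self, x, adj):
        return self.features(x, adj) @ self.W_out         # stability class scores
```

In this sketch, one would stack the broad feature vectors of all training samples into A, call fit_readout once against the stability labels Y, and then use forward for prediction; only the readout weights are solved, which is what gives the broad-learning part its short training time. How the authors improve or train the graph-attention features is not specified in the abstract.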
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Ruixue Ni and Haiquan Zhao "Power system transient stability assessment based on improved graph attention broad learning system", Proc. SPIE 13091, Fifteenth International Conference on Signal Processing Systems (ICSPS 2023), 130912O (28 March 2024); https://doi.org/10.1117/12.3022775
KEYWORDS
Performance modeling, Machine learning, Feature extraction, Deep learning, Matrices, Artificial neural networks, Convolution