ANALYSIS OF RECURRENT ARTIFICIAL NEURAL NETWORKS AND THEIR STRUCTURE

Authors

Kharkiv National University of Radio Electronics

Abstract

This conference paper provides an overview of Recurrent Neural Networks (RNNs) and their applications in domains such as language modeling and speech recognition. It covers key concepts essential for understanding RNNs, including Backpropagation Through Time and Long Short-Term Memory units. Recent advancements such as the attention mechanism and Pointer Networks are also explored, showing the improved performance they bring to RNN-based techniques. Challenges such as vanishing gradients are addressed, along with solutions including Deep Recurrent Neural Networks (DRNNs) and Bidirectional Recurrent Neural Networks (BRNNs). The encoder-decoder architecture, exemplified by Sequence to Sequence (seq2seq) models, is examined, and Pointer Networks (Ptr-Nets) are introduced as effective solutions for combinatorial optimization problems.


Radio Electronics and Youth in the 21st Century. Vol. 6: Conference "Information Intelligent Systems": Proceedings of the 28th International Youth Forum, April 16–18, 2024.

Pages

47–49

Published

December 12, 2024

Details about this monograph

ISBN-13

978-966-659-396-5