Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling
Posted 2024-09-27 · Updated 2024-12-21 · Review

Background: RNN
The paper first reviews how an RNN implements memory through its hidden state, then points out two problems: training suffers from vanishing/exploding gradients, and the memory decays exponentially as the sequence length grows, so plain RNNs lack long-term memory. Existing remedies for vanishing/exploding gradients, such as gradient clipping and second-order methods, have had limited success.

Gated RNN
[[On the Properties of Neural Machine Translation: Encoder–Decoder Approaches]]

Author: Chen Yulin
Tags: #NLP #RNN #Gated-NN
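As a side note, the gradient-clipping remedy mentioned above can be sketched in a few lines. This is a minimal illustration of clipping by global L2 norm, not code from the paper; the function name and signature are my own:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so that their combined L2 norm
    does not exceed max_norm (hypothetical helper, for illustration only)."""
    total_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm  # shrink all gradients uniformly
        grads = [g * scale for g in grads]
    return grads, total_norm

# Example: a gradient of norm 5 clipped down to norm 1
grads, norm = clip_by_global_norm([np.array([3.0, 4.0])], max_norm=1.0)
```

Clipping bounds the size of each update step, which mitigates exploding gradients, but it does nothing for vanishing gradients, which is part of why the paper turns to gated architectures instead.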
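For reference, the gated unit the paper evaluates (the GRU) addresses the long-term memory problem by letting learned gates decide how much of the previous hidden state to keep. A minimal single-step sketch, with hypothetical weight names and biases omitted for brevity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step (illustrative; weight names are my own, biases omitted)."""
    z = sigmoid(Wz @ x + Uz @ h_prev)               # update gate: keep old vs. take new
    r = sigmoid(Wr @ x + Ur @ h_prev)               # reset gate: how much history to read
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))   # candidate hidden state
    return (1 - z) * h_prev + z * h_tilde           # interpolate old and candidate state
```

When z is near 0 the old state passes through almost unchanged, which is the mechanism that lets gradients survive over long spans instead of decaying exponentially.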