Filename: rnn-from-scratch-master
Applications and concepts of RNNs, with RNN source code and usage. You can see that the parameters `(W, U, V)` are shared across all time steps, and the output at each time step can be a **softmax**. So you can use the **cross-entropy** loss as the error function and apply an optimization method (e.g. gradient descent) to learn the parameters `(W, U, V)`.
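As an illustrative sketch (not the repository's actual `rnn.py`; the function names and sizes here are assumptions), a single time step with shared `(W, U, V)`, a softmax output, and cross-entropy loss could look like:

```python
import numpy as np

def softmax(z):
    # subtract the max for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

def forward_step(x_t, s_prev, U, W, V):
    # hidden state: s_t = tanh(U @ x_t + W @ s_prev);
    # (U, W, V) are the same matrices at every time step
    s_t = np.tanh(U @ x_t + W @ s_prev)
    # output distribution over the vocabulary: o_t = softmax(V @ s_t)
    o_t = softmax(V @ s_t)
    return s_t, o_t

def cross_entropy(o_t, y_t):
    # y_t is the index of the true next word
    return -np.log(o_t[y_t])

# tiny example: vocabulary of 4 words, hidden size 3 (sizes chosen for illustration)
rng = np.random.default_rng(0)
U = rng.normal(size=(3, 4))
W = rng.normal(size=(3, 3))
V = rng.normal(size=(4, 3))
x_t = np.zeros(4)
x_t[1] = 1.0                               # one-hot input word
s_t, o_t = forward_step(x_t, np.zeros(3), U, W, V)
loss = cross_entropy(o_t, y_t=2)
```

Gradient descent would then update `(U, W, V)` with the gradients of this loss, accumulated over all time steps (backpropagation through time).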
Let's recap the equations of our RNN:
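The equations themselves appear to have been an image in the original README (likely `figures/rnn_equation.png`), which did not survive extraction. The standard formulation for a vanilla RNN of this kind is presumably:

```latex
s_t = \tanh(U x_t + W s_{t-1})
o_t = \operatorname{softmax}(V s_t)
```

Here `x_t` is the input at time step `t`, `s_t` the hidden state, and `o_t` the predicted output distribution.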
File list
rnn-from-scratch-master
    README.md
    __pycache__
        activation.cpython-34.pyc
        gate.cpython-34.pyc
        layer.cpython-34.pyc
        output.cpython-34.pyc
        preprocessing.cpython-34.pyc
        rnn.cpython-34.pyc
    activation.py
    data
        reddit-comments-2015-08.csv
    figures
        gradient.png
        init.png
        rnn-bptt-with-gradients.png
        rnn-bptt1.png
        rnn-compuattion-graph.png
        rnn-compuattion-graph_2.png
        rnn.jpg
        rnn_equation.png
        rnn_eval.png
        rnn_loss.png
        rnn_loss_2.png
    gate.py
    layer.py
    output.py
    preprocessing.py
    rnn.py
    rnnlm.py