LSTM¶
- class torch.ao.nn.quantizable.LSTM(input_size, hidden_size, num_layers=1, bias=True, batch_first=False, dropout=0.0, bidirectional=False, device=None, dtype=None, *, split_gates=False)[source][source]¶
A quantizable long short-term memory (LSTM).
For the description and the parameter types, please see
LSTM
- Variables
layers – instances of _LSTMLayer
Note
To access the weights and biases, you need to access them per layer. See the example below.
Examples
>>> import torch
>>> import torch.ao.nn.quantizable as nnqa
>>> rnn = nnqa.LSTM(10, 20, 2)
>>> input = torch.randn(5, 3, 10)
>>> h0 = torch.randn(2, 3, 20)
>>> c0 = torch.randn(2, 3, 20)
>>> output, (hn, cn) = rnn(input, (h0, c0))
>>> # To get the weights:
>>> print(rnn.layers[0].weight_ih)
tensor([[...]])
>>> print(rnn.layers[0].weight_hh)
AssertionError: There is no reverse path in the non-bidirectional layer
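Since the weights and biases live on the per-layer `_LSTMLayer` submodules rather than on the `LSTM` module itself, a general way to inspect them is to iterate over `rnn.layers` and use the standard `named_parameters()` API. The sketch below assumes a PyTorch installation that provides `torch.ao.nn.quantizable`; the shapes printed depend on the internal structure of `_LSTMLayer` and are shown generically.

```python
import torch
import torch.ao.nn.quantizable as nnqa

# Build a quantizable LSTM: input_size=10, hidden_size=20, num_layers=2
rnn = nnqa.LSTM(10, 20, num_layers=2)

# Each entry of rnn.layers is an _LSTMLayer; enumerate its parameters by
# name instead of hard-coding attribute paths, which may change.
for idx, layer in enumerate(rnn.layers):
    for name, param in layer.named_parameters():
        print(f"layer {idx}: {name} {tuple(param.shape)}")
```

This avoids the `AssertionError` shown above, which is raised when a reverse-direction accessor is used on a non-bidirectional layer.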