torch.Tensor.requires_grad_¶
Tensor.requires_grad_(requires_grad=True) → Tensor
Change if autograd should record operations on this tensor: sets this tensor's requires_grad attribute in-place. Returns this tensor.

requires_grad_()'s main use case is to tell autograd to begin recording operations on a Tensor tensor. If tensor has requires_grad=False (because it was obtained through a DataLoader, or required preprocessing or initialization), tensor.requires_grad_() makes it so that autograd will begin to record operations on tensor.

- Parameters
  requires_grad (bool) – If autograd should record operations on this tensor. Default: True.
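The parameter also accepts False: calling requires_grad_(False) on a leaf tensor tells autograd to stop recording operations on it. A minimal sketch:

```python
import torch

# A leaf tensor that currently tracks gradients
p = torch.randn(3, requires_grad=True)

# Passing False disables recording in-place on this leaf tensor
p.requires_grad_(False)
print(p.requires_grad)  # False
```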
Example:
>>> # Let's say we want to preprocess some saved weights and use
>>> # the result as new weights.
>>> saved_weights = [0.1, 0.2, 0.3, 0.25]
>>> loaded_weights = torch.tensor(saved_weights)
>>> weights = preprocess(loaded_weights)  # some function
>>> weights
tensor([-0.5503,  0.4926, -2.1158, -0.8303])
>>> # Now, start to record operations done to weights
>>> weights.requires_grad_()
>>> out = weights.pow(2).sum()
>>> out.backward()
>>> weights.grad
tensor([-1.1007,  0.9853, -4.2316, -1.6606])
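Since the `preprocess` function in the example above is left undefined, here is a self-contained variant that substitutes a simple arithmetic step (our assumption) for the preprocessing:

```python
import torch

# Weights built from saved data do not track gradients by default
loaded = torch.tensor([0.1, 0.2, 0.3, 0.25])
print(loaded.requires_grad)  # False

# Stand-in for some preprocessing (assumed here: scale and shift);
# the result is still an untracked leaf tensor
weights = loaded * 2.0 - 0.5

# Enable gradient recording in-place; requires_grad_() returns the tensor
weights.requires_grad_()
print(weights.requires_grad)  # True

out = weights.pow(2).sum()
out.backward()
print(weights.grad)  # d(out)/d(w_i) = 2 * w_i
```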