PyTorch Tutorial 04 - Backpropagation

The Python code from the tutorial is as follows:
import torch

x = torch.tensor(1.0)
y = torch.tensor(2.0)
w = torch.tensor(1.0, requires_grad=True)

# forward pass: compute the prediction and the loss
y_hat = w * x
loss = (y_hat - y) ** 2
print(loss)   # tensor(1., grad_fn=<PowBackward0>)

# backward pass: autograd computes dloss/dw = 2 * (y_hat - y) * x
loss.backward()
print(w.grad) # tensor(-2.)

# update the weight with one gradient descent step (outside the
# autograd graph), then zero the gradient before the next
# forward and backward pass
with torch.no_grad():
    w -= 0.01 * w.grad
w.grad.zero_()
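Repeating the forward pass, backward pass, and weight update in a loop is what drives the loss down. A minimal sketch of that loop is below; the learning rate of 0.01 and the 100 iterations are illustrative choices, not values from the tutorial:

```python
import torch

x = torch.tensor(1.0)
y = torch.tensor(2.0)
w = torch.tensor(1.0, requires_grad=True)

learning_rate = 0.01  # illustrative choice
for epoch in range(100):
    # forward pass
    y_hat = w * x
    loss = (y_hat - y) ** 2

    # backward pass: accumulates dloss/dw into w.grad
    loss.backward()

    # gradient descent update, outside the autograd graph
    with torch.no_grad():
        w -= learning_rate * w.grad

    # zero the gradient so it does not accumulate across iterations
    w.grad.zero_()

print(w.item())  # w moves toward the optimum w = 2
```

Because the gradient here is 2 * (w - 2), each step shrinks the error (w - 2) by a constant factor, so w converges toward 2 and the loss toward 0.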