# Automatic Differentiation¶

In [1]:
from mxnet import autograd, np, npx
npx.set_np()

x = np.arange(4)
x

Out[1]:
array([0., 1., 2., 3.])

Allocate space to store the gradient with respect to x.

In [2]:
x.attach_grad()


Record the computation within an autograd.record() scope, so that MXNet builds the computational graph needed to compute gradients later.

In [3]:
with autograd.record():
    y = 2.0 * np.dot(x, x)
y

Out[3]:
array(28.)

The gradient of the function $y = 2\mathbf{x}^{\top}\mathbf{x}$ with respect to $\mathbf{x}$ should be $4\mathbf{x}$.
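Spelling out the entry-wise derivative behind this claim:

$$y = 2\,\mathbf{x}^{\top}\mathbf{x} = 2\sum_{i} x_i^2, \qquad \frac{\partial y}{\partial x_i} = 4x_i,$$

so the gradient vector is $4\mathbf{x}$, which for our $\mathbf{x} = [0, 1, 2, 3]$ is $[0, 4, 8, 12]$.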

In [4]:
y.backward()
x.grad

Out[4]:
array([ 0.,  4.,  8., 12.])
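As a sanity check independent of autograd, the same gradient can be approximated by central finite differences in plain NumPy. This is a minimal sketch (the helper numerical_grad is illustrative, not part of the mxnet API):

```python
import numpy as np

def numerical_grad(f, x, eps=1e-6):
    # Central-difference approximation of the gradient of f at x:
    # grad_i ≈ (f(x + eps*e_i) - f(x - eps*e_i)) / (2*eps)
    grad = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        grad[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return grad

x = np.arange(4.0)
f = lambda v: 2.0 * np.dot(v, v)
print(numerical_grad(f, x))  # close to [0., 4., 8., 12.], i.e. 4x
```

For a quadratic function the central difference is exact up to floating-point rounding, so the result matches the analytic gradient 4x closely.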