
Description: Implementation of a small-scale automatic differentiation framework, in the spirit of PyTorch and TensorFlow.

The framework is used in the experiment as follows:

x = Variable(2)
y = Variable(5)
z = x*y
v = 1+2*z
v.derivative(z) # evaluates dv/dz = 2
v.derivative(y) # evaluates dv/dy = 4
v.derivative(x) # evaluates dv/dx = 10
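
These values check out by hand: substituting $z = xy$ gives $v = 1 + 2xy$, so

$$\frac{\partial v}{\partial z} = 2, \qquad \frac{\partial v}{\partial y} = 2x = 4, \qquad \frac{\partial v}{\partial x} = 2y = 10.$$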

Various rules of differentiation are implemented as follows (a minimal sketch of a Variable class wiring them together appears after the list):

  • d.mul: $d(uv) = u\,dv + v\,du$

  • d.truediv: $d(u/v) = \frac{du}{v} - \frac{u\,dv}{v^2}$

  • d.pow:

Assuming $f = u^v$,

$$df = v\,u^{v-1}\,du + u^v \ln(u)\,dv$$

When operand $v$ is a constant, the second term can be replaced with $0$; otherwise $\ln(u)$ is not defined for $u \le 0$.

  • d.tanh: $d(\tanh u) = (1 - \tanh^2 u)\,du$

  • d.sin: $d(\sin u) = \cos(u)\,du$

  • d.cos: $d(\cos u) = -\sin(u)\,du$

  • d.log: $d(\ln u) = \frac{du}{u}$
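
To make these rules concrete, here is a minimal sketch of how a Variable class might implement a few of them, with derivatives obtained by recursively walking the expression graph. Only derivative matches the usage shown earlier; the attribute names (value, parents) and the free function tanh are illustrative assumptions, not the project's actual internals.

import math

class Variable:
  def __init__(self, value, parents=()):
    self.value = value
    self.parents = parents  # pairs of (parent Variable, local gradient)

  def _wrap(self, other):
    return other if isinstance(other, Variable) else Variable(other)

  def __add__(self, other):
    other = self._wrap(other)
    # d(u + v) = du + dv
    return Variable(self.value + other.value, [(self, 1.0), (other, 1.0)])

  __radd__ = __add__

  def __sub__(self, other):
    other = self._wrap(other)
    # d(u - v) = du - dv
    return Variable(self.value - other.value, [(self, 1.0), (other, -1.0)])

  def __mul__(self, other):
    other = self._wrap(other)
    # d(uv) = v du + u dv
    return Variable(self.value * other.value,
                    [(self, other.value), (other, self.value)])

  __rmul__ = __mul__

  def derivative(self, x):
    # chain rule: sum the products of local gradients along every path to x
    if self is x:
      return 1.0
    return sum(local * parent.derivative(x) for parent, local in self.parents)

def tanh(u):
  t = math.tanh(u.value)
  # d(tanh u) = (1 - tanh^2 u) du
  return Variable(t, [(u, 1.0 - t * t)])

Note that this recursive derivative recomputes shared subgraphs on every call; a production implementation would memoize intermediate results or run a single reverse pass over the graph.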

Gradient descent is implemented as follows:

def gradient_descent(parameters, error_function, num_iters,
                     learning_rate, verbose=False):
  errors = []
  for i in range(num_iters):
    e = error_function(parameters)
    errors.append(e.value)
    for p in parameters:
      # step each parameter against its gradient
      # (assumes a Variable stores its scalar in .value)
      p.value = p.value - learning_rate * e.derivative(p)
    if verbose:
      print(i, e.value)
  return errors
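
As a hypothetical end-to-end check, reusing the Variable sketch above (the target value 3 and all constants here are illustrative), the routine can minimize a one-dimensional quadratic:

# minimize (x - 3)^2; the optimum is x = 3
x = Variable(0.0)
errors = gradient_descent(
  parameters=[x],
  error_function=lambda ps: (ps[0] - 3) * (ps[0] - 3),
  num_iters=100,
  learning_rate=0.1,
)
print(x.value)  # converges towards 3.0

Each iteration applies $x \leftarrow x - 0.1 \cdot 2(x - 3)$, so the error shrinks geometrically by a factor of $0.8$ per step.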