Autodiff or autograd

Jimmy (xiaoke) Shen
2 min read · May 1, 2020


This article is a quick summary of a lecture note.

Difference between autodiff and autograd

Page 3 of the lecture note.

  • Automatic differentiation (autodiff) refers to a general way of taking a program that computes a value and automatically constructing a procedure for computing derivatives of that value.
  • Backpropagation is the special case of autodiff applied to neural nets. But in machine learning, we often use backprop synonymously with autodiff.
  • Autograd is the name of a particular autodiff package. But lots of people, including the PyTorch developers, got confused and started using “autograd” to mean “autodiff”.

What is the relationship between the chain rule and backpropagation?

“In a way, backpropagation is just a fancy name for the chain rule.” (Jeremy Howard)
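To make the quote concrete, here is a minimal sketch (the functions and numbers are my own, just for illustration): for a composition L = f(g(x)), the chain rule says dL/dx = f'(g(x)) * g'(x), and backpropagation simply evaluates these local derivatives starting from the output and multiplying backwards.

```python
import math

# A tiny composition: L = f(g(x)) with g(x) = x**2 and f(u) = sin(u).
# The chain rule gives dL/dx = cos(x**2) * 2*x; backpropagation just
# evaluates the local derivatives from the output backwards and multiplies.

def forward(x):
    u = x ** 2            # g(x)
    L = math.sin(u)       # f(u)
    return L, u

def backward(x, u):
    L_bar = 1.0                   # dL/dL, start at the output
    u_bar = L_bar * math.cos(u)   # local derivative of sin
    x_bar = u_bar * 2 * x         # local derivative of x**2
    return x_bar

x = 1.5
L, u = forward(x)
print(backward(x, u))             # chain rule, evaluated backwards
print(math.cos(x ** 2) * 2 * x)   # the same derivative written in one line
```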

What autodiff is from the chain rule’s perspective

Page 6 of the lecture note.

Recall how we computed the derivatives of logistic least squares regression. An autodiff system should transform the left-hand side into the right-hand side.

The left part (the forward computation of the loss) is clear. The right part applies the chain rule backwards, from the loss toward the parameters.

For the right part, if we read the bar notation as derivatives of the loss, so that L bar means dL/dL and y bar means dL/dy, it becomes clear that this is just the chain rule.
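Since the note’s figure is not reproduced here, here is a sketch of that example in code, assuming the usual logistic least squares setup z = w*x + b, y = sigmoid(z), L = 0.5*(y - t)**2, and the note’s convention that a bar variable such as y bar means dL/dy. The forward pass corresponds to the left part; the backward pass is the right part, i.e. the chain rule applied from the loss back to the parameters.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Forward pass (the left part): compute the loss from input x, target t,
# and parameters w, b.
def forward(w, b, x, t):
    z = w * x + b
    y = sigmoid(z)
    L = 0.5 * (y - t) ** 2
    return z, y, L

# Backward pass (the right part): each bar variable is dL/d(that variable),
# obtained by applying the chain rule from the loss backwards.
def backward(x, t, z, y):
    L_bar = 1.0                   # dL/dL
    y_bar = L_bar * (y - t)       # dL/dy
    z_bar = y_bar * y * (1 - y)   # dL/dz, using sigmoid'(z) = y * (1 - y)
    w_bar = z_bar * x             # dL/dw
    b_bar = z_bar                 # dL/db
    return w_bar, b_bar

w, b, x, t = 0.3, -0.1, 2.0, 1.0
z, y, L = forward(w, b, x, t)
print(backward(x, t, z, y))       # (dL/dw, dL/db)
```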

Converting the computation into a sequence of primitive operations

This is cool, right? Once we break the computation down into a sequence of primitive operations, the backward pass becomes much clearer and much easier to implement in code, as the sketch below shows.
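Here is a rough sketch of what “a sequence of primitive operations” can look like in code (the tape structure and the names below are my own illustration, not the note’s or any library’s implementation): each primitive operation records how to pass a gradient from its output back to its inputs, and the backward pass just replays those local rules in reverse order.

```python
import math

# A toy reverse-mode autodiff. Each primitive operation is recorded on a
# tape together with a rule for turning the gradient of its output into
# gradients of its inputs (a vector-Jacobian product).
tape = []  # list of (output_name, input_names, vjp)

def prim(out, inputs, vjp):
    tape.append((out, inputs, vjp))

def grads(values, loss_name):
    bars = {loss_name: 1.0}
    for out, inputs, vjp in reversed(tape):       # walk the tape backwards
        out_bar = bars.get(out, 0.0)
        for name, g in zip(inputs, vjp(out_bar, values)):
            bars[name] = bars.get(name, 0.0) + g  # accumulate gradients
    return bars

# Forward pass, written as a sequence of primitive operations:
#   t1 = w*x,  z = t1 + b,  y = sigmoid(z),  e = y - t,  L = 0.5*e**2
v = {"w": 0.3, "b": -0.1, "x": 2.0, "t": 1.0}

v["t1"] = v["w"] * v["x"]
prim("t1", ["w", "x"], lambda g, val: (g * val["x"], g * val["w"]))

v["z"] = v["t1"] + v["b"]
prim("z", ["t1", "b"], lambda g, val: (g, g))

v["y"] = 1.0 / (1.0 + math.exp(-v["z"]))
prim("y", ["z"], lambda g, val: (g * val["y"] * (1 - val["y"]),))

v["e"] = v["y"] - v["t"]
prim("e", ["y", "t"], lambda g, val: (g, -g))

v["L"] = 0.5 * v["e"] ** 2
prim("L", ["e"], lambda g, val: (g * val["e"],))

bars = grads(v, "L")
print(bars["w"], bars["b"])   # dL/dw and dL/db, same values as the sketch above
```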

Python autograd tutorial

At this point, we may need a Python autograd tutorial to move forward. Here is a nice tutorial.
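For reference, here is a minimal usage sketch of the autograd package itself, assuming it is installed (e.g. via pip install autograd); the loss function is the same logistic least squares example as above.

```python
import autograd.numpy as np   # thinly wrapped NumPy that autograd can trace
from autograd import grad

# The same logistic least squares loss, written as an ordinary Python function.
def loss(w, b, x, t):
    y = 1.0 / (1.0 + np.exp(-(w * x + b)))
    return 0.5 * (y - t) ** 2

dloss_dw = grad(loss)            # differentiates w.r.t. the first argument
dloss_db = grad(loss, argnum=1)  # argnum selects another argument

print(dloss_dw(0.3, -0.1, 2.0, 1.0))   # dL/dw
print(dloss_db(0.3, -0.1, 2.0, 1.0))   # dL/db
```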

The source code of autograd can be found here:

Thanks for reading.
