Tangent fills a unique gap in the machine learning tools space

Gabriela Motroc

There’s a new, free, and open-source Python library for automatic differentiation! Tangent, an experimental release, offers extra autodiff features that other Python machine learning libraries don’t have. Plus, it’s compatible with TensorFlow and NumPy.

Tangent is a new, free, and open-source Python library for automatic differentiation. What does that mean exactly? Unlike existing machine learning libraries, Tangent is a source-to-source system: it consumes a Python function f and emits a new Python function that computes the gradient of f, as Alex Wiltschko, research scientist on the Google Brain team, explained in a blog post announcing the project.

Tangent fills a unique location in the space of machine learning tools.

This means that you can read your automatic derivative code just like the rest of your program. “Tangent is useful to researchers and students who not only want to write their models in Python but also read and debug automatically-generated derivative code without sacrificing speed and flexibility,” according to the GitHub repo. Furthermore, it’s compatible with both TensorFlow and NumPy.

SEE ALSO: Top 5 open source machine learning projects

Keep in mind that Tangent is an experimental release, and is under active development. This means there might be API changes as the team continues to build the project and respond to feedback from the community.

Tangent has a one-function API

You don’t need special tools or indirection to inspect and debug your models written in Tangent.

Calling tangent.grad on a Python function returns a new Python function that computes the gradient of the original.


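As an illustrative sketch of the source-to-source idea (a hand-written stand-in, not Tangent's actual emitted code), the derivative produced for a simple function is itself plain, readable Python:

```python
def f(x):
    return x * x

# A hand-written stand-in for the kind of code tangent.grad(f) emits:
# an ordinary Python function for df/dx that you can read and debug
# like any other part of your program.
def dfdx(x, b=1.0):
    # derivative of x * x is 2 * x, scaled by the incoming gradient b
    bx = 2 * x * b
    return bx

print(dfdx(2.0))  # 4.0
```

Because the result is ordinary Python source rather than a graph or a tape, standard tools — print statements, debuggers, stack traces — work on the derivative code directly.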
If you want to print out your derivatives, you can pass verbose=1 to tangent.grad.

Tangent also supports control flow (if statements and loops) as well as subroutines.
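To illustrate what differentiating through control flow can look like, here is a hand-written sketch (hypothetical, not Tangent's actual output) of a readable derivative for a ReLU-style function with a branch:

```python
def relu(x):
    if x > 0.0:
        return x
    return 0.0

# The derivative mirrors the branch structure of the original function,
# so it stays as easy to read as the code it was generated from.
def drelu(x, b=1.0):
    if x > 0.0:
        return b   # gradient passes through on the active branch
    return 0.0     # gradient is blocked on the inactive branch
```

The key point is that the branch appears unchanged in the derivative, so you can step through it with a debugger exactly as you would the forward function.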

We are working to add support in Tangent for more aspects of the Python language (e.g., closures, inline function definitions, classes, more NumPy and TensorFlow functions). We also hope to add more advanced automatic differentiation and compiler functionality in the future, such as automatic trade-off between memory and compute (Griewank and Walther, 2000; Gruslys et al., 2016), more aggressive optimizations, and lambda lifting.

If you want to learn more about the project, head on over to GitHub.

Gabriela Motroc
Gabriela Motroc was editor of JAXenter and JAX Magazine. Before working at Software & Support Media Group, she studied International Communication Management at The Hague University of Applied Sciences.
