New demo: Pipeline-level autodiff with Tesseracts
30 Jun 2025

TL;DR
We’ve prepared an optimization demo highlighting how Tesseracts unlock efficient, clean pipeline-level automatic differentiation.

We’re excited to share our new demo showcasing how Tesseracts enable pipeline-level automatic differentiation. In this demo, we use multiple Tesseracts to implement gradient-based parametric optimization of a finite element method (FEM) simulation.
First, a quick refresher: What’s a Tesseract?
Tesseracts are components that let scientists run complex scientific workflows at scale. They are self-contained, self-documenting, and self-executing, and can be invoked via the command line or over HTTP. Since each Tesseract is a standalone, stateless component, we can connect multiple Tesseracts into pipelines to create complex workflows. This is particularly useful for multi-step computations, including use cases in digital engineering and machine learning.
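To make the stateless-component idea concrete, here is a toy sketch in plain Python. This is not the real Tesseract API — the `Component` class, `apply` field, and `pipeline` helper are hypothetical names invented for illustration; in practice each step would sit behind its own CLI or HTTP boundary.

```python
# Toy illustration of the stateless-component idea behind Tesseracts.
# NOT the actual Tesseract API; all names here are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Component:
    """A self-contained, stateless compute step: output depends only on input."""
    name: str
    apply: Callable[[dict], dict]

def pipeline(components, inputs):
    """Chain stateless components: each consumes the previous one's outputs."""
    data = inputs
    for c in components:
        data = c.apply(data)
    return data

# Two toy steps that could each live behind a CLI or HTTP boundary.
double = Component("double", lambda d: {"x": 2 * d["x"]})
shift = Component("shift", lambda d: {"x": d["x"] + 1})

result = pipeline([double, shift], {"x": 3})
```

Because each step is a pure function of its inputs, steps can be swapped, replayed, or run on different machines without hidden shared state — which is what makes composing them into larger workflows tractable.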
… and as a reminder, Tesseract is an open source project! Check out our launch announcement from March 2025 and our Tesseract dev forum.
Tesseracts as pipeline-level autodiff enablers
Automatic differentiation (also known as 'autodiff', or just 'AD') is an important part of modern scientific computing. It enjoys particular success in the machine learning world, underpinning popular deep learning frameworks like PyTorch and TensorFlow.
Autodiff works really well when all calculations can be expressed in a single program. However, when it comes to differentiating a system of multiple loosely coupled components, tracking gradients becomes much more difficult. Enabling this sort of system-level automatic differentiation is what motivated us to create Tesseracts in the first place. And today, we’re keen to share this example highlighting Tesseracts in action.
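As a minimal sketch of the underlying idea (plain Python, not Tesseract code): if each component in a pipeline exposes both its output and a vector–Jacobian product (VJP), gradients can be chained across component boundaries by applying the chain rule in reverse, without a single shared autodiff graph.

```python
import math

# Each "component" returns its output plus a VJP closure, so gradients can
# be propagated across component boundaries without one shared autodiff graph.

def square(x):
    y = x * x
    vjp = lambda g: g * 2.0 * x      # d(x^2)/dx = 2x
    return y, vjp

def sine(y):
    z = math.sin(y)
    vjp = lambda g: g * math.cos(y)  # d(sin y)/dy = cos y
    return z, vjp

def pipeline_value_and_grad(x):
    # Forward pass through both components, keeping their VJPs.
    y, vjp1 = square(x)
    z, vjp2 = sine(y)
    # Reverse pass: chain the VJPs in reverse order (chain rule).
    grad = vjp1(vjp2(1.0))
    return z, grad

x = 0.5
value, grad = pipeline_value_and_grad(x)
# Analytic check: d/dx sin(x^2) = cos(x^2) * 2x
assert abs(grad - math.cos(x * x) * 2 * x) < 1e-12
```

This is the same contract that lets heterogeneous components (different codebases, even different machines) participate in one end-to-end gradient computation: each only needs to answer "given an upstream gradient, what is my contribution?"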
Demo: Parametric shape optimization with differentiable FEM simulation
Our demo is built using JAX-FEM and inspired by the 2D Topology Optimization with the SIMP Method. We’ve reformulated the problem and solved it as an optimization of a parameterized shape.
Of note, we’ve implemented this demo using multiple Tesseracts that communicate with each other, forming a multi-step computation pipeline. We then apply end-to-end automatic differentiation to carry out the optimization. The demo clearly shows the feasibility — and efficacy — of pipeline-level automatic differentiation with Tesseracts. In particular, Tesseracts simplify heterogeneous gradient computation, as well as the management of dependencies, compute resources, and components.
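The structure of such a gradient-based optimization loop can be sketched in a few lines of plain Python. This is a toy stand-in, not the demo itself: the linear "geometry" step, the quadratic "simulation" objective, the learning rate, and all names are made up for illustration, replacing the actual FEM pipeline.

```python
# Toy parametric optimization loop mirroring the demo's structure:
# a differentiable multi-step "pipeline" driven by gradient descent.
# The objective below is a made-up stand-in for the real FEM-based loss.

def pipeline_loss_and_grad(p):
    # Step 1 (stand-in for geometry generation): shape parameter -> field.
    u = 2.0 * p
    # Step 2 (stand-in for simulation + objective): field -> scalar loss.
    loss = (u - 6.0) ** 2
    # Reverse pass: chain gradients back through both steps.
    dloss_du = 2.0 * (u - 6.0)
    dloss_dp = dloss_du * 2.0
    return loss, dloss_dp

p, lr = 0.0, 0.1
for _ in range(100):
    loss, grad = pipeline_loss_and_grad(p)
    p -= lr * grad

# The loop converges to the minimizer p = 3 (where u = 6 and loss = 0).
```

In the real demo, each numbered step is a separate Tesseract, and the reverse pass is handled by the pipeline-level autodiff machinery rather than hand-written derivative lines.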
Eager to learn more?
👉 Join the Tesseract Community Forum
👉 Visit Tesseract Core on GitHub
👉 Visit Tesseract-JAX on GitHub