With OpenMDAO version 3.37.1 and later, JAX (a machine learning library from Google) is included as an optional dependency. It brings the ability to automatically differentiate (AD) components, provided they are written in a JAX-traceable way, which you can think of as ordinary Python code with a few modifications. We have updated doc pages on how to get up and running with JAX, as well as a video example.
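To give a flavor of what "JAX-traceable" means, here is a minimal sketch (not taken from the OpenMDAO docs; the function is just the familiar paraboloid example) of a plain Python function written with jax.numpy that JAX can trace and differentiate exactly:

```python
# Illustrative sketch: ordinary Python math written with jax.numpy so JAX can trace it.
import jax
import jax.numpy as jnp

def paraboloid(x, y):
    # Same math you would write with numpy, just using jnp.
    return (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

# JAX builds the derivative functions automatically.
df_dx = jax.grad(paraboloid, argnums=0)
df_dy = jax.grad(paraboloid, argnums=1)

print(paraboloid(5.0, 2.0))   # function value
print(df_dx(5.0, 2.0))        # exact derivative w.r.t. x
print(df_dy(5.0, 2.0))        # exact derivative w.r.t. y
```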

While the OpenMDAO code base is constantly improving, the inclusion of AD is perhaps the largest step change in OpenMDAO that we have seen in several years. AD is one of the most requested features from our users because it can reduce the effort required to build models and to maintain them over the long term. To give you an idea of the impact JAX AD can have: one of our explicit components was 225 lines of code, and with JAX AD we shortened it to just 80 lines. The majority of the effort in designing that component had gone into calculating its analytic derivatives. How the derivative computation time of JAX AD compares with analytic derivatives and finite differencing is something we still need to explore.
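As a rough illustration of where that line-count saving comes from (this is a small hypothetical stand-in, not the 225-line component mentioned above, and the variable names are invented for the example), a JAX-traceable compute function can have its full Jacobian built by jax.jacfwd instead of hand-derived analytic partials:

```python
# Illustrative sketch only: a stand-in for a component's compute written in a
# JAX-traceable way, with the Jacobian obtained automatically rather than by hand.
import jax
import jax.numpy as jnp

def compute_outputs(inputs):
    # Hypothetical vector-valued calculation (not the real component).
    area = inputs["chord"] * inputs["span"]
    lift = 0.5 * inputs["rho"] * inputs["v"] ** 2 * area * inputs["CL"]
    return {"area": area, "lift": lift}

inputs = {"chord": 1.5, "span": 10.0, "rho": 1.2, "v": 50.0, "CL": 0.6}

# jax.jacfwd differentiates the whole function at once, replacing the
# hand-derived analytic partials that previously dominated the component's code.
jac = jax.jacfwd(compute_outputs)(inputs)
print(jac["lift"]["v"])   # d(lift)/d(v), computed by AD
```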
We hope this new capability from the OpenMDAO development team will be very useful to you. Please keep us informed of your successes and challenges with this update. As always, you can get answers to your OpenMDAO questions on Stack Overflow.