DDPS Seminar Talk: Generalizing Scientific Machine Learning and Differentiable Simulation Beyond Continuous Models
November 12 2023 in Uncategorized | Tags: data-driven physics, ddps, physics-informed machine learning, piml, sciml | Author: Christopher Rackauckas
I’m pleased to share a talk I gave in the DDPS seminar series!
Data-driven Physical Simulations (DDPS) Seminar Series
Abstract: The integration of scientific models into deep learning structures, commonly referred to as scientific machine learning (SciML), has made great strides in the last few years in incorporating models such as ODEs and PDEs into deep learning through differentiable simulation. However, the vast space of scientific simulation also includes models like jump diffusions, agent-based models, and more. Is SciML constrained to the simple continuous cases, or is there a way to generalize to more advanced model forms? This talk will dive into the mathematical aspects of generalizing differentiable simulation, discussing cases like chaotic simulations, differentiating stochastic simulations like particle filters and agent-based models, and solving inverse … READ MORE
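To give a flavor of what "differentiating a stochastic simulation" can mean, here is a minimal, self-contained sketch (not taken from the talk) of the pathwise approach: freeze the random numbers, then differentiate the map from parameters to output along that fixed noise realization. The drift, diffusion, and step counts are illustrative placeholders.

```julia
# Pathwise differentiation of an Euler–Maruyama simulation of
# du = p*u dt + 0.1*u dW, with the noise realization held fixed.
using ForwardDiff, Random

function simulate(p; seed = 1, n = 100, dt = 0.01)
    rng = Xoshiro(seed)        # frozen noise: same draws for every p
    u = one(p)
    for _ in 1:n
        u += dt * p * u + sqrt(dt) * 0.1 * u * randn(rng)
    end
    return u
end

# Sensitivity d u(T)/dp along the fixed noise path.
ForwardDiff.derivative(simulate, 0.5)
```

Because the noise is held fixed across parameter values, the output is a smooth function of `p` and ordinary forward-mode AD applies; discrete-event models like agent-based simulations need different machinery, which is part of what the talk covers.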
Summary of Julia Plotting Packages
June 17 2023 in Julia | Tags: data science, ggplot2, julia, plots, programming language, startup time, ttfx, visualization | Author: Christopher Rackauckas
This is a repost of my response on the Julia Discourse on this topic. I was asked to make a blog post so here you go!
The “Main” Plotting Packages
Here’s a quick summary of the most widely used plotting packages. I may have missed one, but I haven’t missed one that is very widely used.
- Plots.jl is the most used. It’s probably the best documented, appears in the most tutorials, and is used in many videos.
- Pros: Its main draw is that it has a lot of plugins to other packages through its recipes system, which means that a lot of odd things like `plot(sol::ODESolution)` or showing the sparsity of a `BandedMatrix` just works. With all of these integrations, it’s normally what I would recommend first to newcomers since they will generally get the most done with the least work. It … READ MORE
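To make the recipe point concrete, here is a minimal sketch of the integration mentioned above, assuming DifferentialEquations.jl and Plots.jl are installed:

```julia
# The ODESolution plot recipe means `plot(sol)` "just works".
using DifferentialEquations, Plots

# Exponential decay: du/dt = -0.5u, u(0) = 1
f(u, p, t) = -0.5u
prob = ODEProblem(f, 1.0, (0.0, 10.0))
sol = solve(prob)

plot(sol)  # labeled axes and a dense interpolated curve, no setup needed
```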
Integrating equation solvers with probabilistic programming through differentiable programming
November 24 2022 in Julia, Programming, Science, Scientific ML | Tags: differential equations, julia, probabilistic programming, scientific machine learning, sciml | Author: Christopher Rackauckas
Part of the COMPUTATIONAL ABSTRACTIONS FOR PROBABILISTIC AND DIFFERENTIABLE PROGRAMMING WORKSHOP
Abstract: Many probabilistic programming languages (PPLs) attempt to integrate with equation solvers (differential equations, nonlinear equations, partial differential equations, etc.) from the inside, i.e. the developers of PPLs like Stan provide differential equation solver choices as part of the suite. However, as equation solvers are an entire discipline unto themselves with many active development communities and subfields, this places an immense burden on PPL developers to keep up with a changing landscape shaped by tens of thousands of independent researchers. In this talk we will explore how Julia PPLs such as Turing.jl support equation solvers from the outside, i.e. how the tools of differentiable programming allow equation solver libraries to be compatible with PPLs … READ MORE
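As a minimal sketch of what "support from the outside" looks like in practice, here is an illustrative Turing.jl model that calls a DifferentialEquations.jl solver directly; the priors, data, and parameter names are placeholders, not code from the talk:

```julia
using Turing, DifferentialEquations

decay(u, p, t) = -p[1] * u  # simple exponential-decay model

@model function fit_decay(data, ts)
    k ~ truncated(Normal(0.5, 0.2); lower = 0)   # decay rate
    σ ~ truncated(Normal(0.0, 0.1); lower = 0)   # observation noise
    prob = ODEProblem(decay, 1.0, (0.0, last(ts)), [k])
    pred = solve(prob; saveat = ts)
    for i in eachindex(data)
        data[i] ~ Normal(pred.u[i], σ)
    end
end

# chain = sample(fit_decay(data, ts), NUTS(), 1000)
```

Because the solver call is differentiable, gradient-based samplers like NUTS can propagate derivatives through `solve` without the PPL having to ship its own solver suite.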
Direct Automatic Differentiation of (Differential Equation) Solvers vs Analytical Adjoints: Which is Better?
October 11 2022 in Differential Equations, Julia, Mathematics, Science, Scientific ML | Tags: automatic differentiation, differentiable programming, sciml | Author: Christopher Rackauckas
Automatic differentiation of a “solver” is a subject with many subtle details that determine how to do it most effectively. For this reason, there are a lot of talks and courses that go into great depth on the topic. I recently gave a talk on some of the latest work in differentiable simulation with the American Statistical Association, and have some detailed notes on such adjoint derivations as part of the 18.337 Parallel Computing and Scientific Machine Learning graduate course at MIT. And there are entire organizations, like my SciML Open Source Software Organization, which work day-in and day-out on the development of new differentiable solvers.
I’ll give a brief summary of all my materials below.
Continuous vs Discrete Differentiation of Solvers
AD of a solver can be done in essentially two different ways: either directly performing automatic … READ MORE
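As a minimal sketch of the contrast on a toy problem, consider du/dt = p·u with u(0) = 1, where the exact solution u(T) = e^(pT) makes the continuous sensitivity available in closed form:

```julia
using ForwardDiff

# "Discrete" differentiation: AD pushes dual numbers through every
# step of a hand-written explicit Euler loop.
function euler_solve(p; dt = 1e-4, T = 1.0)
    u = one(p)
    for _ in 1:round(Int, T / dt)
        u += dt * p * u
    end
    return u
end

dudp_discrete = ForwardDiff.derivative(euler_solve, 0.5)

# "Continuous" differentiation: differentiate the mathematical
# solution u(T) = exp(p*T), giving du/dp = T*exp(p*T).
dudp_continuous = 1.0 * exp(0.5 * 1.0)

# The two agree only up to the discretization error of Euler's
# method, which is exactly the kind of subtlety the notes dig into.
```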
Is Differentiable Programming Actually Necessary? Can’t you just train the neural networks separately?
October 4 2022 in Scientific ML | Tags: | Author: Christopher Rackauckas
Is differentiable programming actually necessary, or can you just train the neural network in isolation against data and then stick the trained neural network into the simulation? We looked at this problem in detail in our new manuscript titled Capturing missing physics in climate model parameterizations using neural differential equations.
The goal of this project is to understand temperature mixing in large eddy simulations, essentially columns of water in the ocean. I.e., can we take a “true” 3D Navier-Stokes simulation and use it to build very quick and accurate models for how heat flows up and down in the water?
This isn’t a new problem: … READ MORE
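As a minimal sketch of what "differentiable programming" buys here, the following toy example (a one-parameter closure standing in for a neural network, with a placeholder loss) differentiates through the ODE solver itself, so the embedded term is trained on its effect on the simulated solution rather than fit in isolation:

```julia
using DifferentialEquations, SciMLSensitivity, Zygote

nn(u, p) = p[1] * u              # toy stand-in for a neural network
rhs(u, p, t) = -u + nn(u, p)     # known physics plus the learned term
prob = ODEProblem(rhs, 1.0, (0.0, 1.0), [0.1])

function loss(p)
    sol = solve(prob, Tsit5(); p = p, saveat = 0.1)
    return sum(abs2, sol.u)      # placeholder "data" of all zeros
end

# The gradient flows through the solver via SciMLSensitivity.
g = Zygote.gradient(loss, [0.1])
```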
Accurate and Efficient Physics-Informed Learning Through Differentiable Simulation (ASA Seminar Talk)
July 14 2022 in Uncategorized | Tags: | Author: Christopher Rackauckas
Abstract: Scientific machine learning (SciML) methods allow for the automatic discovery of mechanistic models by infusing neural network training into the simulation process. In this talk we will start by showcasing some of the ways that SciML is being used, from discovery of extrapolatory epidemic models to nonlinear mixed effects models in pharmacology. From there, we will discuss some of the increasingly advanced computational techniques behind the training process, focusing on the numerical issues involved in handling differentiation of highly stiff and chaotic systems. Viewers will leave with an understanding of how compiler techniques are being infused into the simulation stack to increasingly automate the process of developing mechanistic models.
Bio: Dr. Chris Rackauckas is the Director of Scientific Research at Pumas-AI, the Director of … READ MORE
Keynote: New Horizons in Modeling and Simulation with Julia (Modelica Conference 2021)
June 25 2022 in Uncategorized | Tags: | Author: Christopher Rackauckas
Keynote Address: New Horizons in Modeling and Simulation in Julia
Presenters: Viral Shah (Julia Computing, CEO and Co-Founder), Chris Rackauckas (Julia Computing, Director of Modeling and Simulation), and Christopher Laughman (Mitsubishi Electric Research Laboratories, Principal Member Research Staff)
Abstract: As modeling has become more ubiquitous, our models keep growing. The time to build models, verify their behavior, and simulate them is increasing exponentially as we seek more precise predictions. How will our tools change to accommodate the future? Julia’s language design has led to new opportunities. The combination of multiple dispatch, staged compilation, and Julia’s composable libraries has made it possible to build a next-generation symbolic-numeric framework. Julia’s abstract interpretation framework enables capabilities such as automatic differentiation, automatic surrogate generation, symbolic tracing, uncertainty propagation, and automatic … READ MORE
Composing Modeling and Simulation with Julia (2021 Modelica Conference)
April 8 2022 in Differential Equations, Julia, Science, Scientific ML | Tags: | Author: Christopher Rackauckas
In this paper we introduce JuliaSim, a high-performance programming environment designed to blend traditional modeling and simulation with machine learning. JuliaSim can build accelerated surrogates from component-based models, such as those conforming to the FMI standard, using continuous-time echo state networks (CTESN). The foundation of this environment, ModelingToolkit.jl, is an acausal modeling language which can compose the trained surrogates as components within its staged compilation process. As a complement, we present the JuliaSim model library, a standard library with differential-algebraic equations and pre-trained surrogates, which can be composed using the modeling system for design, optimization, and control. We demonstrate the effectiveness of the surrogate-accelerated modeling and simulation approach on HVAC dynamics by showing that the CTESN surrogates accurately capture the dynamics of an HVAC … READ MORE
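For readers unfamiliar with the acausal modeling layer, here is a minimal sketch of the ModelingToolkit.jl workflow the paper builds on, with a first-order system standing in for a real component model (JuliaSim itself and the CTESN surrogates are not shown):

```julia
using ModelingToolkit, DifferentialEquations

@variables t x(t)
@parameters τ
D = Differential(t)

# Declare the equations symbolically, then simplify the system
# structurally before handing it to a numerical solver.
@named sys = ODESystem([D(x) ~ -x / τ], t)
simplified = structural_simplify(sys)

prob = ODEProblem(simplified, [x => 1.0], (0.0, 10.0), [τ => 2.0])
sol = solve(prob)
```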
Engineering Trade-Offs in Automatic Differentiation: from TensorFlow and PyTorch to Jax and Julia
December 25 2021 in Julia, Programming, Science, Scientific ML | Tags: automatic differentiation, compilers, differentiable programming, jax, julia, machine learning, pytorch, tensorflow, XLA | Author: Christopher Rackauckas
To understand the differences between automatic differentiation libraries, let’s talk about the engineering trade-offs that were made. I would personally say that none of these libraries is “better” than another; they simply make different engineering trade-offs based on the domains and use cases they were aiming to satisfy. The easiest way to describe these trade-offs is to follow the evolution and see how each new library tweaked the trade-offs made by its predecessors.
Early TensorFlow used a graph-building system, i.e. it required users to essentially define variables in a specific graph language separate from the host language. You had to define “TensorFlow variables” and “TensorFlow ops”, and the AD would then be performed on this static graph. Control flow constructs were limited to the constructs that could be represented statically. For example, an `ifelse` function call is very different from … READ MORE
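The distinction carries over directly to Julia terms: `ifelse` is an ordinary function that eagerly evaluates both branches (and so can be recorded into a static graph), while an `if` statement is dynamic control flow that only executes the branch actually taken. A minimal sketch:

```julia
# Both x^2 and -x are computed before ifelse picks one; this is
# what makes the construct representable in a static graph.
branch_fn(x) = ifelse(x > 0, x^2, -x)

# Only the taken branch runs; a graph builder cannot record this
# without special control-flow ops.
branch_stmt(x) = x > 0 ? x^2 : -x
```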
Improved ForwardDiff.jl Stacktraces With Package Tags
December 19 2021 in Uncategorized | Tags: | Author: Christopher Rackauckas
You may have seen some hilariously long stacktraces when using ForwardDiff. In the latest releases of OrdinaryDiffEq.jl we have fixed this, and the fix is rather safe. I want to take a second to describe some of the technical details so that others can copy this technique.
The reason for these long stacktraces is the tag parameter. The Dual number type is given by `Dual{T,V,N}` where `V` is an element type (usually `Float64`), `N` is a chunk size (some integer), and `T` is the tag. What the tag does is prevent perturbation confusion by erroring if two incompatible dual numbers try to interact. The key requirement for it to prevent perturbation confusion is for the type to be unique in the context of the user. For example, if the user is differentiating `f`, and then differentiating `x->derivative(f,x)`, you want the tag to be … READ MORE
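As a minimal sketch of the nested case described above, each call to `ForwardDiff.derivative` constructs `Dual` numbers carrying its own tag, so the inner and outer perturbations stay distinct:

```julia
using ForwardDiff

f(x) = x^3

# Second derivative via nesting; distinct tags prevent the two
# perturbation levels from being confused with one another.
d2f = ForwardDiff.derivative(x -> ForwardDiff.derivative(f, x), 2.0)
# d2f == 12.0, since d²/dx² x³ = 6x evaluated at x = 2
```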