Differences Between Methods for Solving Stiff ODEs


April 6, 2024 in Differential Equations, Mathematics | Author: Christopher Rackauckas

I found these notes from August 2018 and thought they might be useful, so I am posting them verbatim.

A stiff ordinary differential equation is a difficult problem to integrate. However, many of the ODE solver suites offer quite a few different choices for this kind of problem; DifferentialEquations.jl, for example, offers almost 200. In this article we will dig into what the differences between these integrators really are, so that you can more easily find which one will be most efficient for your problem.

Quick Overview (tl;dr)

  1. BDF, Rosenbrock, and ESDIRK methods are standard (see the sketch after this list)
  2. For small equations, Rosenbrock methods have performance advantages
  3. For very stiff systems, Rosenbrock and Rosenbrock-W methods do not require convergence of Newton’s method and thus can take larger steps, being more efficient
  4. BDF integrators are only L-stable (and A-stable) to order 2, so if the problem is … READ MORE
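
As a rough illustration of what switching between these families looks like in DifferentialEquations.jl, here is a minimal sketch (my own, not taken from the post) using the classic ROBER stiff benchmark; Rosenbrock23, QNDF, and KenCarp4 are the standard OrdinaryDiffEq.jl solvers for the Rosenbrock, BDF, and ESDIRK families respectively:

using OrdinaryDiffEq

# ROBER: a classic stiff chemical kinetics test problem
function rober!(du, u, p, t)
    k1, k2, k3 = p
    du[1] = -k1*u[1] + k3*u[2]*u[3]
    du[2] =  k1*u[1] - k2*u[2]^2 - k3*u[2]*u[3]
    du[3] =  k2*u[2]^2
end

prob = ODEProblem(rober!, [1.0, 0.0, 0.0], (0.0, 1e5), (0.04, 3e7, 1e4))

sol_rosenbrock = solve(prob, Rosenbrock23())  # Rosenbrock: strong on small stiff systems
sol_bdf        = solve(prob, QNDF())          # BDF: the classic choice for larger stiff systems
sol_esdirk     = solve(prob, KenCarp4())      # ESDIRK: a common middle ground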

Direct Automatic Differentiation of (Differential Equation) Solvers vs Analytical Adjoints: Which is Better?


Automatic differentiation of a “solver” is a subject with many subtle details about how to do it most effectively. For this reason, there are a lot of talks and courses that go into depth on the topic. I recently gave a talk on some of the latest work in differentiable simulation with the American Statistical Association, and have detailed notes on such adjoint derivations as part of the 18.337 Parallel Computing and Scientific Machine Learning graduate course at MIT. And there are entire organizations, like my SciML Open Source Software Organization, which work day in and day out on the development of new differentiable solvers.

I’ll give a brief summary of all of these materials below.

Continuous vs Discrete Differentiation of Solvers

AD of a solver can be done in essentially two different ways: either directly performing automatic … READ MORE
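
As a minimal sketch of the “discrete” route, i.e. running automatic differentiation directly through the solver's internal steps, the following differentiates a DifferentialEquations.jl solve with ForwardDiff.jl; the Lotka-Volterra setup and the loss function are illustrative assumptions on my part, not taken from the talk materials:

using OrdinaryDiffEq, ForwardDiff

# Lotka-Volterra right-hand side with parameters p (illustrative example)
function lotka!(du, u, p, t)
    du[1] =  p[1]*u[1] - p[2]*u[1]*u[2]
    du[2] = -p[3]*u[2] + p[4]*u[1]*u[2]
end

# A scalar loss of the solution as a function of the parameters
function loss(p)
    u0 = eltype(p).([1.0, 1.0])   # promote the state so dual numbers flow through the steps
    prob = ODEProblem(lotka!, u0, (0.0, 10.0), p)
    sol = solve(prob, Tsit5(), abstol = 1e-8, reltol = 1e-8)
    sum(abs2, sol.u[end])
end

# Discrete (direct) differentiation: dual numbers are pushed through every solver step
grad = ForwardDiff.gradient(loss, [1.5, 1.0, 3.0, 1.0])

The continuous alternative instead derives adjoint or sensitivity equations analytically and solves them as an auxiliary differential equation; in the SciML stack that choice is exposed through the sensitivity-algorithm options of SciMLSensitivity.jl.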

Composing Modeling and Simulation with Julia (2021 Modelica Conference)


In this paper we introduce JuliaSim, a high-performance programming environment designed to blend traditional modeling and simulation with machine learning. JuliaSim can build accelerated surrogates from component-based models, such as those conforming to the FMI standard, using continuous-time echo state networks (CTESN). The foundation of this environment, ModelingToolkit.jl, is an acausal modeling language which can compose the trained surrogates as components within its staged compilation process. As a complementary factor we present the JuliaSim model library, a standard library with differential-algebraic equations and pre-trained surrogates, which can be composed using the modeling system for design, optimization, and control. We demonstrate the effectiveness of the surrogate-accelerated modeling and simulation approach on HVAC dynamics by showing that the CTESN surrogates accurately capture the dynamics of an HVAC cycle … READ MORE

Learning Epidemic Models That Extrapolate, AI4Pandemics


I think this talk was pretty good so I wanted to link it here!

Title: Learning Epidemic Models That Extrapolate

Speaker: Chris Rackauckas, https://chrisrackauckas.com/

Abstract:
Modern techniques of machine learning are uncanny in their ability to automatically learn predictive models directly from data. However, they do not tend to work beyond their original training dataset. Mechanistic models utilize characteristics of the problem to ensure accurate qualitative extrapolation but can lack predictive power. How can we build techniques which integrate the best of both approaches? In this talk we will discuss the body of work around universal differential equations, a technique which mixes traditional differential equation modeling with machine learning for accurate extrapolation from small data. We will showcase how incorporating different variations of the technique, such … READ MORE
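
For readers unfamiliar with the term, a universal differential equation embeds a trainable function (typically a neural network) inside an otherwise mechanistic model. The sketch below is a hypothetical minimal example built with Lux.jl and OrdinaryDiffEq.jl, not one of the models from the talk; the decay rates and network size are arbitrary placeholders:

using OrdinaryDiffEq, Lux, ComponentArrays, Random

# A small neural network standing in for the unknown interaction terms
nn = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))
ps, st = Lux.setup(Random.default_rng(), nn)
ps = ComponentArray(ps)

# Universal differential equation: known linear decay plus the learned correction
function ude!(du, u, p, t)
    learned, _ = nn(u, p, st)
    du[1] = -0.1u[1] + learned[1]
    du[2] = -0.3u[2] + learned[2]
end

prob = ODEProblem(ude!, [1.0, 1.0], (0.0, 5.0), ps)
sol = solve(prob, Tsit5())

Training would wrap that solve in a loss against data and optimize ps, with the mechanistic terms constraining how the model extrapolates beyond the training window.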

ModelingToolkit, Modelica, and Modia: The Composable Modeling Future in Julia


Let me take a bit of time here to write out a complete canonical answer to ModelingToolkit and how it relates to Modia and Modelica. This question comes up a lot: why does ModelingToolkit exist instead of building on tooling for Modelica compilers? I’ll start out by saying I am a huge fan of Martin and Hilding’s work and we work very closely with them on the direction of Julia-based tooling for modeling and simulation. ModelingToolkit, being a new system, has some flexibility in the design space it explores, and while we are following a different foundational philosophy, we have many of the same goals.

Composable Abstractions for Model Transformations

Everything in the SciML organization is built around a principle of confederated modular development: let other packages influence the capabilities of your own. This is highlighted in a … READ MORE
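
A concrete example of that principle (my own illustration, not from the post): because the DifferentialEquations.jl solvers are written generically, number types from an unrelated package such as Measurements.jl flow straight through them, giving uncertainty propagation that neither package implemented for the other. A minimal sketch on a simple pendulum:

using OrdinaryDiffEq, Measurements

# Parameters and initial state carry uncertainties from Measurements.jl;
# the solver propagates them with no glue code between the two packages.
g = 9.81 ± 0.02
L = 1.00 ± 0.01
u0 = [0.0 ± 0.0, (pi/60) ± 0.01]   # angular velocity, angle

pendulum!(du, u, p, t) = (du[1] = -(g/L)*sin(u[2]); du[2] = u[1])

prob = ODEProblem(pendulum!, u0, (0.0, 5.0))
sol = solve(prob, Tsit5(), reltol = 1e-6)   # each state is a value ± propagated uncertainty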

Generalizing Automatic Differentiation to Automatic Sparsity, Uncertainty, Stability, and Parallelism


Automatic differentiation is a “compiler trick” whereby a code that calculates f(x) is transformed into a code that calculates f'(x). This trick and its two forms, forward and reverse mode automatic differentiation, have become the pervasive backbone behind all of the machine learning libraries. If you ask what PyTorch or Flux.jl is doing that’s special, the answer is really that it’s doing automatic differentiation over some functions.

What I want to dig into in this blog post is a simple question: what is the trick behind automatic differentiation, why is it always differentiation, and are there other mathematical problems we can be focusing this trick towards? While very technical discussions on this can be found in our recent paper titled “ModelingToolkit: A Composable Graph Transformation System For Equation-Based Modeling” and descriptions of methods like intrusive uncertainty quantification, I want … READ MORE
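
To make the “trick” concrete, here is a minimal, self-contained sketch of forward-mode AD by operator overloading; the Dual type and its two overloads are an illustrative toy, not how ForwardDiff.jl or any production system is actually implemented:

# Carry (value, derivative) pairs through the same code that computes f(x)
struct Dual
    val::Float64
    der::Float64
end
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)

f(x) = x * x + x            # ordinary code for f(x) = x^2 + x
fx = f(Dual(3.0, 1.0))      # seed dx/dx = 1
# fx.val == 12.0 is f(3); fx.der == 7.0 is f'(3) = 2*3 + 1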

JuliaCall Update: Automated Julia Installation for R Packages


Some sneakily cool features made it into the JuliaCall v0.17.2 CRAN release. With the latest version there is now an install_julia function for automatically installing Julia. This makes Julia a great high performance back end for R packages. For example, the following is an example from the diffeqr package that will work, even without Julia installed:

install.packages("diffeqr")
library(diffeqr)
de <- diffeqr::diffeq_setup()
 
lorenz <- function (u,p,t){
  du1 = p[1]*(u[2]-u[1])
  du2 = u[1]*(p[2]-u[3]) - u[2]
  du3 = u[1]*u[2] - p[3]*u[3]
  c(du1,du2,du3)
}
u0 <- c(1.0,1.0,1.0)
tspan <- c(0.0,100.0)
p <- c(10.0,28.0,8/3)
prob <- de$ODEProblem(lorenz,u0,tspan,p)
fastprob <- diffeqr::jitoptimize_ode(de,prob)
sol <- de$solve(fastprob,de$Tsit5(),saveat=0.01)

Under the hood it’s using the DifferentialEquations.jl package and the SciML stack, but it’s abstracted from users so much that Julia is essentially an alternative to Rcpp with easier interactive development. The following example really brings the seamless … READ MORE

GPU-Accelerated ODE Solving in R with Julia, the Language of Libraries


R is a widely used language for data science, but due to performance concerns most of its underlying libraries are written in C, C++, or Fortran. Julia is a relative newcomer to the field which has busted out since its 1.0 release to become one of the top 20 most used languages, thanks to its high-performance libraries for scientific computing and machine learning. Julia's value proposition has been high performance in a high-level language, known as solving the two-language problem, which has allowed the language to build a robust, mature, and expansive package ecosystem. While this has been a major strength for package developers, the fact remains that there are still large and robust communities in other high-level languages like R and Python. Instead of spawning distracting language wars, we should ask the … READ MORE

COVID-19 Epidemic Mitigation via Scientific Machine Learning (SciML)


Chris Rackauckas
Applied Mathematics Instructor, MIT
Senior Research Analyst, University of Maryland, Baltimore School of Pharmacy

This was a seminar talk given to the COVID modeling journal club on scientific machine learning for epidemic modeling.

Resources:

https://sciml.ai/
https://diffeqflux.sciml.ai/dev/
https://datadriven.sciml.ai/dev/
https://docs.sciml.ai/latest/
https://safeblues.org/

Cheap But Effective: Instituting Effective Pandemic Policies Without Knowing Who’s Infected


Chris Rackauckas
MIT Applied Mathematics Instructor

One way to find out how many people are infected is to figure out who’s infected, but that’s working too hard! In this talk we will look into cheaper alternatives for effective real-time policy making. To this end we introduce SafeBlues, a project that simulates fake virus strands over Bluetooth and utilizes deep neural networks mixed within differential equations to accurately approximate infection statistics weeks before updated statistics are available. We then introduce COEXIST, a quarantine policy which utilizes inexpensive “useless” tests to perform accurate regional case isolation. This work is all being done as part of the Microsoft Pandemic Modeling Project, where the Julia SciML tooling has accelerated the COEXIST simulations by … READ MORE