ModelingToolkit, Modelica, and Modia: The Composable Modeling Future in Julia
April 19 2021 in Differential Equations, Julia, Mathematics, Programming, Science, Scientific ML | Tags: acausal, dae, language, modelica, modeling, modelingtoolkit, modia | Author: Christopher Rackauckas
Let me take a bit of time here to write out a complete canonical answer to ModelingToolkit and how it relates to Modia and Modelica. This question comes up a lot: why does ModelingToolkit exist instead of building on tooling for Modelica compilers? I’ll start out by saying I am a huge fan of Martin and Hilding’s work and we work very closely with them on the direction of Julia-based tooling for modeling and simulation. ModelingToolkit, being a new system, has some flexibility in the design space it explores, and while we are following a different foundational philosophy, we have many of the same goals.
Composable Abstractions for Model Transformations
Everything in the SciML organization is built around a principle of confederated modular development: let other packages influence the capabilities of your own. This is highlighted in a … READ MORE
Generalizing Automatic Differentiation to Automatic Sparsity, Uncertainty, Stability, and Parallelism
March 10 2021 in Differential Equations, Julia, Mathematics, Programming, Science, Scientific ML | Tags: abstract interpretation, automatic differentiation, non-standard interpretation, Pantelides algorithm | Author: Christopher Rackauckas
Automatic differentiation is a “compiler trick” whereby a code that calculates f(x) is transformed into a code that calculates f'(x). This trick and its two forms, forward and reverse mode automatic differentiation, have become the pervasive backbone behind all of the machine learning libraries. If you ask what PyTorch or Flux.jl is doing that’s special, the answer is really that it’s doing automatic differentiation over some functions.
What I want to dig into in this blog post is a simple question: what is the trick behind automatic differentiation, why is it always differentiation, and are there other mathematical problems we can be focusing this trick towards? While very technical discussions on this can be found in our recent paper titled “ModelingToolkit: A Composable Graph Transformation System For Equation-Based Modeling” and descriptions of methods like intrusive uncertainty quantification, I want … READ MORE
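The dual-number construction behind forward-mode AD can be sketched in a few lines. The following is an illustrative Python sketch, not the implementation used by PyTorch or Flux.jl: a number `a + b·ε` with `ε² = 0` carries the value in `a` and the derivative in `b`, and overloading arithmetic propagates the chain rule automatically.

```python
# Minimal sketch of forward-mode automatic differentiation via dual numbers.
# Illustrative only; real AD systems generalize this to whole programs.

class Dual:
    """Number of the form a + b*eps with eps^2 = 0; b carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        return Dual(self.val * other.val,
                    self.val * other.der + self.der * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Seed the derivative slot with 1.0 and read it back off the output."""
    return f(Dual(x, 1.0)).der

f = lambda x: 3 * x * x + 2 * x   # f'(x) = 6x + 2
print(derivative(f, 5.0))          # → 32.0
```

The "non-standard interpretation" framing in the post is exactly this move: run the same code on a richer number type so that a second quantity (here a derivative; elsewhere a sparsity pattern or an uncertainty) is computed alongside the value.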
JuliaCall Update: Automated Julia Installation for R Packages
January 18 2021 in Differential Equations, Julia, Mathematics, Programming, R | Tags: devops, differentialequations, installation, julia, juliacall, modelingtoolkit, r | Author: Christopher Rackauckas
Some sneakily cool features made it into the JuliaCall v0.17.2 CRAN release. With the latest version there is now an install_julia function for automatically installing Julia. This makes Julia a great high performance back end for R packages. For example, the following is an example from the diffeqr package that will work, even without Julia installed:
```r
install.packages("diffeqr")
library(diffeqr)
de <- diffeqr::diffeq_setup()
lorenz <- function (u,p,t){
  du1 = p[1]*(u[2]-u[1])
  du2 = u[1]*(p[2]-u[3]) - u[2]
  du3 = u[1]*u[2] - p[3]*u[3]
  c(du1,du2,du3)
}
u0 <- c(1.0,1.0,1.0)
tspan <- c(0.0,100.0)
p <- c(10.0,28.0,8/3)
prob <- de$ODEProblem(lorenz,u0,tspan,p)
fastprob <- diffeqr::jitoptimize_ode(de,prob)
sol <- de$solve(fastprob,de$Tsit5(),saveat=0.01)
```
Under the hood it’s using the DifferentialEquations.jl package and the SciML stack, but it’s abstracted from users so much that Julia is essentially an alternative to Rcpp with easier interactive development. The following example really brings the seamless … READ MORE
GPU-Accelerated ODE Solving in R with Julia, the Language of Libraries
August 24 2020 in Differential Equations, Julia, Programming, R, Uncategorized | Tags: diffeqr, differentialequations, gpu, high-performance, jit, r | Author: Christopher Rackauckas
R is a widely used language for data science, but due to performance concerns most of its underlying libraries are written in C, C++, or Fortran. Julia is a relative newcomer to the field which has busted out since its 1.0 to become one of the top 20 most used languages due to its high performance libraries for scientific computing and machine learning. Julia’s value proposition has been its high performance in a high-level language, known as solving the two-language problem, which has allowed the language to build a robust, mature, and expansive package ecosystem. While this has been a major strength for package developers, the fact remains that there are still large and robust communities in other high level languages like R and Python. Instead of spawning distracting language wars, we should ask the … READ MORE
COVID-19 Epidemic Mitigation via Scientific Machine Learning (SciML)
July 7 2020 in Differential Equations, Julia, Mathematics, Programming, Science, Scientific ML | Tags: covid-19, epidemic modeling, scientific machine learning, sciml | Author: Christopher Rackauckas
Chris Rackauckas
Applied Mathematics Instructor, MIT
Senior Research Analyst, University of Maryland, Baltimore School of Pharmacy
This was a seminar talk given to the COVID modeling journal club on scientific machine learning for epidemic modeling.
Resources:
https://sciml.ai/
https://diffeqflux.sciml.ai/dev/
https://datadriven.sciml.ai/dev/
https://docs.sciml.ai/latest/
https://safeblues.org/
Cheap But Effective: Instituting Effective Pandemic Policies Without Knowing Who’s Infected
July 2 2020 in Biology, Differential Equations, Julia, Mathematics, Science, Scientific ML | Tags: covid-19, scientific machine learning, sciml | Author: Christopher Rackauckas
Chris Rackauckas
MIT Applied Mathematics Instructor
One way to find out how many people are infected is to figure out who’s infected, but that’s working too hard! In this talk we will look into cheaper alternatives for effective real-time policy making. To this end we introduce SafeBlues, a project that simulates fake virus strands over Bluetooth and utilizes deep neural networks mixed within differential equations to accurately approximate infection statistics weeks before updated statistics are available. We then introduce COEXIST, a quarantine policy which utilizes inexpensive “useless” tests to perform accurate regional case isolation. This work is all being done as part of the Microsoft Pandemic Modeling Project, where the Julia SciML tooling has accelerated the COEXIST simulations by … READ MORE
Generalized Physics-Informed Learning through Language-Wide Differentiable Programming (Video)
March 31 2020 in Differential Equations, Mathematics, Science, Scientific ML | Tags: physics-informed machine learning, pinn, scientific machine learning, scientific ml, sciml | Author: Christopher Rackauckas
Chris Rackauckas (MIT), “Generalized Physics-Informed Learning through Language-Wide Differentiable Programming”
Scientific computing is increasingly incorporating the advancements in machine learning to allow for data-driven physics-informed modeling approaches. However, re-targeting existing scientific computing workloads to machine learning frameworks is both costly and limiting, as scientific simulations tend to use the full feature set of a general purpose programming language. In this manuscript we develop an infrastructure for incorporating deep learning into existing scientific computing code through Differentiable Programming (∂P). We describe a ∂P system that is able to take gradients of full Julia programs, making Automatic Differentiation a first class language feature and compatibility with deep learning pervasive. Our system utilizes the one-language nature of Julia package development to augment the existing package ecosystem with deep learning, supporting almost all … READ MORE
Scientific Machine Learning: Interpretable Neural Networks That Accurately Extrapolate From Small Data
January 14 2020 in Differential Equations, Julia, Mathematics, Science, Scientific ML | Tags: neural ode, physics-informed, sciml, small data, universal differential equations | Author: Christopher Rackauckas
The fundamental problems of classical machine learning are:
- Machine learning models require big data to train
- Machine learning models cannot extrapolate out of their training data well
- Machine learning models are not interpretable
However, in our recent paper, we have shown that this does not have to be the case. In Universal Differential Equations for Scientific Machine Learning, we start by showing the following figure:
[Figure from the paper: the model, trained on only a short initial segment of the time series, extrapolating the cyclic dynamics.]
Indeed, it shows that by only seeing the tiny first part of the time series, we can automatically learn the equations in such a manner that it predicts the time series will be cyclic in the future, in a … READ MORE
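The structure of a universal differential equation can be sketched briefly. The following is an illustrative Python sketch (not the paper's Julia code): known mechanistic terms are kept, and a small neural network `U(x, y)` stands in for the unknown interaction terms. The weights here are untrained placeholders; in practice they are fit to data by differentiating through the ODE solver.

```python
import numpy as np

# Sketch of the universal differential equation (UDE) idea: combine known
# mechanistic terms with a small neural network for the unknown terms.
# All names and parameter values here are illustrative placeholders.

rng = np.random.default_rng(0)
W1, b1 = 0.1 * rng.normal(size=(8, 2)), np.zeros(8)
W2, b2 = 0.1 * rng.normal(size=(2, 8)), np.zeros(2)

def U(x, y):
    """Tiny untrained MLP approximating the missing interaction terms."""
    h = np.tanh(W1 @ np.array([x, y]) + b1)
    return W2 @ h + b2

def ude_rhs(state, alpha=1.3, delta=1.8):
    x, y = state
    nn = U(x, y)
    # known linear growth/decay terms + learned nonlinear corrections
    return np.array([alpha * x + nn[0], -delta * y + nn[1]])

def euler(f, u0, dt, steps):
    """Simple forward-Euler integrator, enough to show the structure."""
    u = np.array(u0, dtype=float)
    traj = [u.copy()]
    for _ in range(steps):
        u = u + dt * f(u)
        traj.append(u.copy())
    return np.array(traj)

traj = euler(ude_rhs, [0.44, 4.6], 0.01, 100)
print(traj.shape)  # (101, 2)
```

Once trained, symbolic regression over the learned `U` can recover interpretable equations for the missing terms, which is how the extrapolation in the figure becomes possible.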
Recent advancements in differential equation solver software
October 16 2019 in Differential Equations, Julia, Mathematics, Scientific ML, Uncategorized | Tags: | Author: Christopher Rackauckas
This was a talk given at the Modelica Jubilee Symposium – Future Directions of System Modeling and Simulation.
Recent Advancements in Differential Equation Solver Software
Since the ancient Fortran methods like dop853 and DASSL were created, many advancements in numerical analysis, computational methods, and hardware have accelerated computing. However, many applications of differential equations still rely on the same older software, possibly to their own detriment. In this talk we will describe the recent advancements being made in differential equation solver software, focusing on the Julia-based DifferentialEquations.jl ecosystem. We will show how high order Rosenbrock and IMEX methods have proven advantageous over traditional BDF implementations in certain problem domains, and the types of issues that give rise to general performance characteristics between the methods. Extensions of these … READ MORE
A Collection of Jacobian Sparsity Acceleration Tools for Julia
October 6 2019 in Differential Equations, Julia, Programming, Scientific ML | Tags: | Author: Christopher Rackauckas
Over the summer, a whole suite of sparsity acceleration tools for Julia has been developed. These are encoded in the packages:
The toolchain is showcased in the following blog post by Pankaj Mishra, the student who built a lot of the Jacobian coloring and decompression framework. Langwen Huang set up the fast paths for structured matrices (tridiagonal, banded, and block-banded matrices) and also integrated these tools with DifferentialEquations.jl. Shashi Gowda then set up a mechanism for automatically detecting the sparsity of Julia programs (!!!).
A tutorial using these tools together is described in the SparseDiffTools.jl README. In summary, to use the tools you have the following flow:
- Find your sparsity pattern, Jacobian structure (i.e. Jacobian type), or automatically detect it with SparsityDetection.jl.
- Call `matrix_colors(A)` from SparseDiffTools.jl to get the `colorvec` for A. This is the vector that the … READ MORE
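The coloring idea behind `matrix_colors` can be sketched briefly. The following is an illustrative Python sketch, not SparseDiffTools.jl's algorithm: columns of a sparse Jacobian that share no nonzero row can be perturbed simultaneously, so the number of function evaluations needed to recover the Jacobian drops from `n` to the number of colors. A greedy distance-1 coloring of the column intersection graph is enough to show the principle.

```python
import numpy as np

# Sketch of column coloring for sparse Jacobian compression: assign colors so
# that columns sharing a nonzero row never share a color. Each color then
# corresponds to one combined perturbation (finite-difference or AD) pass.

def greedy_column_colors(S):
    """S: boolean sparsity pattern (m x n). Returns one color per column."""
    m, n = S.shape
    colors = [-1] * n
    for j in range(n):
        # colors already taken by earlier columns that overlap column j
        forbidden = {colors[k] for k in range(j)
                     if np.any(S[:, j] & S[:, k])}
        c = 0
        while c in forbidden:
            c += 1
        colors[j] = c
    return colors

# Tridiagonal sparsity pattern: 3 colors suffice regardless of n, so the
# Jacobian is recovered in 3 evaluations instead of n.
n = 6
S = np.zeros((n, n), dtype=bool)
for i in range(n):
    for j in (i - 1, i, i + 1):
        if 0 <= j < n:
            S[i, j] = True

print(greedy_column_colors(S))  # → [0, 1, 2, 0, 1, 2]
```

This is why structured matrices get fast paths: for banded patterns the optimal coloring is known in closed form, with no graph search needed.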