Cheap But Effective: Instituting Effective Pandemic Policies Without Knowing Who’s Infected


Chris Rackauckas
MIT Applied Mathematics Instructor

One way to find out how many people are infected is to figure out who’s infected, but that’s working too hard! In this talk we will look into cheaper alternatives for effective real-time policy making. To this end we introduce SafeBlues, a project that simulates fake virus strands over Bluetooth and utilizes deep neural networks mixed within differential equations to accurately approximate infection statistics weeks before updated statistics are available. We then introduce COEXIST, a quarantine policy which utilizes inexpensive “useless” tests to perform accurate regional case isolation. This work is all being done as part of the Microsoft Pandemic Modeling Project, where the Julia SciML tooling has accelerated the COEXIST simulations by 36,000x … READ MORE
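To make the “deep neural networks mixed within differential equations” idea concrete, here is a minimal Julia sketch, not the SafeBlues code itself: a tiny hand-rolled neural network parameterizes the transmission rate of a toy SIR model, so the rate becomes a learnable function of time. The network sizes and the recovery rate are illustrative assumptions.

```julia
# Hedged sketch (not the SafeBlues implementation): an SIR model whose
# transmission rate β(t) comes from a tiny hand-rolled neural network.
using OrdinaryDiffEq

# 1 input (time) -> 8 hidden units -> 1 output; weights would be trained.
W1, b1 = 0.1 .* randn(8, 1), zeros(8)
W2, b2 = 0.1 .* randn(1, 8), zeros(1)
β(t) = abs(only(W2 * tanh.(W1 * [t] .+ b1) .+ b2))  # learnable rate, kept positive

function sir!(du, u, p, t)
    S, I, R = u
    γ = 0.25                      # assumed recovery rate
    du[1] = -β(t) * S * I
    du[2] =  β(t) * S * I - γ * I
    du[3] =  γ * I
end

prob = ODEProblem(sir!, [0.99, 0.01, 0.0], (0.0, 30.0))
sol = solve(prob, Tsit5(), saveat = 1.0)  # infection curve under the learned β
```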

Glue AD for Full Language Differentiable Programming


June 15 2020 in Julia, Science, Scientific ML | Author: Christopher Rackauckas

No design choice will be the best choice for all possible users. That statement is provocative, yet I think everyone would readily agree with it. But it should make us question whether it’s ever a good idea to try to make all users happy with one piece of code. Under the differentiable programming mindset we are trying to make all code in the entire programming language differentiable, so why would we think that a single system with a single set of rules and assumptions would be best for everyone?

Optimized Algorithms Across Scientific Computing and Machine Learning

Differentiable programming is a subset of modeling in which the model is a program whose every step is differentiable, for the purpose of being able to find the correct program by parameter fitting using said … READ MORE
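As a minimal sketch of that idea (illustrative names and constants, not code from the post): an ordinary Julia program, a loop implementing discrete logistic growth, is differentiated end-to-end with Zygote, and the gradient is used to fit the program’s parameter to a target.

```julia
using Zygote

# An ordinary program: iterate a discrete logistic growth step.
function simulate(r; n0 = 1.0, steps = 10)
    n = n0
    for _ in 1:steps
        n += r * n * (1 - n / 10)
    end
    n
end

# "Finding the correct program" = fitting its parameter with gradients.
target = 8.0
loss(r) = (simulate(r) - target)^2

function fit(r; iters = 500, η = 1e-3)
    for _ in 1:iters
        g = Zygote.gradient(loss, r)[1]  # d(loss)/dr through the whole loop
        r -= η * g
    end
    r
end

r_fit = fit(0.1)  # growth rate that makes the program hit the target
```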

Generalized Physics-Informed Learning through Language-Wide Differentiable Programming (Video)


Chris Rackauckas (MIT), “Generalized Physics-Informed Learning through Language-Wide Differentiable Programming”

Scientific computing is increasingly incorporating the advancements in machine learning to allow for data-driven physics-informed modeling approaches. However, re-targeting existing scientific computing workloads to machine learning frameworks is both costly and limiting, as scientific simulations tend to use the full feature set of a general-purpose programming language. In this manuscript we develop an infrastructure for incorporating deep learning into existing scientific computing code through Differentiable Programming (∂P). We describe a ∂P system that is able to take gradients of full Julia programs, making Automatic Differentiation a first-class language feature and compatibility with deep learning pervasive. Our system utilizes the one-language nature of Julia package development to augment the existing package ecosystem with deep learning, supporting almost all language … READ MORE
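A small hedged illustration of what “gradients of full Julia programs” means in practice (the example itself is mine, not from the paper): Zygote differentiates plain Julia code containing a data-dependent while loop and local-variable rebinding, with no framework-specific DSL.

```julia
using Zygote

# Plain Julia with a data-dependent while loop; no tensor DSL required.
function compound(rate, principal)
    balance = principal
    while balance < 2principal    # grow the balance until it doubles
        balance *= 1 + rate
    end
    balance
end

# Sensitivity of the final balance to the interest rate:
g = Zygote.gradient(r -> compound(r, 100.0), 0.05)[1]
```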

Scientific Machine Learning: Interpretable Neural Networks That Accurately Extrapolate From Small Data


The fundamental problems of classical machine learning are:

  1. Machine learning models require big data to train
  2. Machine learning models cannot extrapolate well outside of their training data
  3. Machine learning models are not interpretable

However, in our recent paper, we have shown that this does not have to be the case. In Universal Differential Equations for Scientific Machine Learning, we start by showing the following figure:

Indeed, it shows that by seeing only the tiny first part of the time series, we can automatically learn the equations in such a manner that the model predicts the time series will be cyclic in the future, in a way … READ MORE
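The structure behind that figure is the universal differential equation: known mechanistic terms plus a neural network for the unknown interactions, trained against the short observed series. Here is a condensed sketch in the spirit of the paper, using DiffEqFlux.jl APIs from that era (`FastChain`, `sciml_train`), which may have changed in later versions; the hyperparameters and data generation are illustrative.

```julia
using OrdinaryDiffEq, DiffEqFlux, Flux

# Ground-truth Lotka-Volterra, used here only to generate a short data series.
function lotka!(du, u, p, t)
    x, y = u
    du[1] =  1.3x - 0.9x*y
    du[2] = -1.8y + 0.8x*y
end
u0, tspan = [0.44, 4.6], (0.0, 3.0)
data = Array(solve(ODEProblem(lotka!, u0, tspan), Vern7(), saveat = 0.1))

# UDE: keep the known linear terms, learn the interactions with a small NN.
NN = FastChain(FastDense(2, 16, tanh), FastDense(16, 2))
p0 = initial_params(NN)
function ude!(du, u, p, t)
    x, y = u
    nn = NN(u, p)
    du[1] =  1.3x + nn[1]
    du[2] = -1.8y + nn[2]
end
prob = ODEProblem(ude!, u0, tspan, p0)

loss(p) = sum(abs2, Array(solve(prob, Vern7(), p = p, saveat = 0.1)) .- data)
res = DiffEqFlux.sciml_train(loss, p0, ADAM(0.01), maxiters = 200)
```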

Recent advancements in differential equation solver software


This was a talk given at the Modelica Jubilee Symposium – Future Directions of System Modeling and Simulation.

Recent Advancements in Differential Equation Solver Software

Since the ancient Fortran methods like dop853 and DASSL were created, many advancements in numerical analysis, computational methods, and hardware have accelerated computing. However, many applications of differential equations still rely on the same older software, possibly to their own detriment. In this talk we will describe the recent advancements being made in differential equation solver software, focusing on the Julia-based DifferentialEquations.jl ecosystem. We will show how high-order Rosenbrock and IMEX methods have proven advantageous over traditional BDF implementations in certain problem domains, and the types of issues that give rise to the performance differences between the methods. Extensions of these … READ MORE
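For a concrete taste of the comparison (a hedged sketch with illustrative tolerances, not the talk’s benchmarks): the classic stiff Robertson problem solved with a high-order Rosenbrock method from DifferentialEquations.jl next to a traditional BDF code via Sundials.jl.

```julia
using DifferentialEquations, Sundials

# Robertson's chemical kinetics problem, a standard stiff benchmark.
function rober!(du, u, p, t)
    y1, y2, y3 = u
    du[1] = -0.04y1 + 1e4*y2*y3
    du[2] =  0.04y1 - 1e4*y2*y3 - 3e7*y2^2
    du[3] =  3e7*y2^2
end

prob = ODEProblem(rober!, [1.0, 0.0, 0.0], (0.0, 1e5))
sol_rosenbrock = solve(prob, Rodas5(),    abstol = 1e-8, reltol = 1e-8)
sol_bdf        = solve(prob, CVODE_BDF(), abstol = 1e-8, reltol = 1e-8)
```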

A Collection of Jacobian Sparsity Acceleration Tools for Julia


Over the summer a whole suite of sparsity acceleration tools for Julia has been developed. These are encoded in packages such as SparsityDetection.jl and SparseDiffTools.jl.

The toolchain is showcased in the following blog post by Pankaj Mishra, the student who built a lot of the Jacobian coloring and decompression framework. Langwen Huang set up the fast paths for structured matrices (tridiagonal, banded, and block-banded matrices) and also integrated these tools with DifferentialEquations.jl. Shashi Gowda then set up a mechanism for automatically detecting the sparsity of Julia programs (!!!).

A tutorial using this workflow together is described in the SparseDiffTools.jl README. In summary, to use the tools you have the following flow (a condensed sketch follows the list):

  1. Specify your sparsity pattern or Jacobian structure (i.e. the Jacobian type), or automatically detect it with SparsityDetection.jl.
  2. Call `matrix_colors(A)` from SparseDiffTools.jl to get the `colorvec` for A. This is the vector that the … READ MORE
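Here is that condensed sketch, with function names as in the 2019-era SparsityDetection.jl and SparseDiffTools.jl (keyword names may have changed in later releases); the test function is an illustrative tridiagonal stencil.

```julia
using SparsityDetection, SparseDiffTools, SparseArrays

# An in-place function with a tridiagonal Jacobian (a 1D Laplacian stencil).
function f!(dx, x)
    dx[1] = -2x[1] + x[2]
    for i in 2:length(x)-1
        dx[i] = x[i-1] - 2x[i] + x[i+1]
    end
    dx[end] = x[end-1] - 2x[end]
    nothing
end

x = rand(30); dx = similar(x)

# 1. Automatically detect the sparsity pattern.
pattern = jacobian_sparsity(f!, dx, x)
J = Float64.(sparse(pattern))

# 2. Color the columns so structurally independent ones share a chunk.
colors = matrix_colors(J)

# 3. Fill the sparse Jacobian using far fewer differentiation sweeps.
forwarddiff_color_jacobian!(J, f!, x, colorvec = colors)
```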

When do micro-optimizations matter in scientific computing?


September 3 2019 in Julia, Programming, Scientific ML | Author: Christopher Rackauckas

Something that has been bothering me about discussions of microbenchmarks is that people seem to ignore that benchmarks are highly application-dependent. The easiest way to judge whether a benchmark really matters to a particular application is to look at the operation overhead of the largest and most common calls. If a single operation dominates 99.9% of your runtime, making everything else 100x faster still won’t do anything to your real runtime performance. But at the same time, if some small operation inside a tight loop accounts for most of the runtime, then that small operation is exactly what’s worth optimizing. This is the classic tradeoff to keep in the back of your mind when considering optimizations.
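As a quick hedged illustration of that judgment call (illustrative sizes, not from the post): time the dominant call against the small one and compare orders of magnitude before bothering to micro-optimize.

```julia
using BenchmarkTools, LinearAlgebra

A = rand(1000, 1000); b = rand(1000)

@btime $A \ $b;      # the dominant O(n³) solve: this sets the real runtime
@btime dot($b, $b);  # a tiny O(n) call: a 100x speedup here changes nothing
```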

Here is a very brief overview on what to think about when optimizing code and how to figure out when to stop.

Function Call Overhead

When dealing with … READ MORE

The Essential Tools of Scientific Machine Learning (Scientific ML)


Scientific machine learning is a burgeoning discipline which blends scientific computing and machine learning. Traditionally, scientific computing focuses on large-scale mechanistic models, usually differential equations, that are derived from scientific laws that simplify and explain phenomena. On the other hand, machine learning focuses on developing non-mechanistic data-driven models which require minimal knowledge and prior assumptions. The two sides have their pros and cons: differential equation models are great at extrapolating, the terms are explainable, and they can be fit with small data and few parameters. Machine learning models, on the other hand, require “big data” and lots of parameters, but are not biased by the scientist’s ability to correctly identify valid laws and assumptions.

However, the recent trend has been to merge the two disciplines, allowing explainable models that are data-driven, require less data than traditional machine learning, and utilize the … READ MORE