A Collection of Jacobian Sparsity Acceleration Tools for Julia


Over the summer a whole suite of sparsity acceleration tools for Julia has been developed. These are encoded in packages such as SparsityDetection.jl and SparseDiffTools.jl:

The toolchain is showcased in the following blog post by Pankaj Mishra, the student who built a lot of the Jacobian coloring and decompression framework. Langwen Huang set up the fast paths for structured matrices (tridiagonal, banded, and block-banded matrices) and also integrated these tools with DifferentialEquations.jl. Shashi Gowda then set up a mechanism for automatically detecting the sparsity of Julia programs (!!!).

A tutorial tying this workflow together is described in the SparseDiffTools.jl README. In summary, to use the tools you have the following flow (a brief sketch follows the list):

  1. Find your sparsity pattern or Jacobian structure (i.e. the Jacobian type), or automatically detect it with SparsityDetection.jl.
  2. Call `matrix_colors(A)` from SparseDiffTools.jl to get the `colorvec` for A. This is the vector that the … READ MORE
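
As a rough illustration of this flow, here is a minimal sketch following the SparsityDetection.jl and SparseDiffTools.jl READMEs. The function names (`jacobian_sparsity`, `matrix_colors`, `forwarddiff_color_jacobian!`) come from those READMEs, but keyword names (e.g. `colorvec`) have shifted between package versions, so treat the exact signatures as assumptions:

```julia
using SparsityDetection, SparseDiffTools, SparseArrays

# An in-place function with a banded (tridiagonal) Jacobian
function f!(dx, x)
    dx[1] = -2x[1] + x[2]
    for i in 2:length(x)-1
        dx[i] = x[i-1] - 2x[i] + x[i+1]
    end
    dx[end] = x[end-1] - 2x[end]
    nothing
end

x = rand(30); dx = similar(x)

# Step 1: automatically detect the sparsity pattern
pattern = jacobian_sparsity(f!, dx, x)
jac = Float64.(sparse(pattern))

# Step 2: get the colorvec for the Jacobian
colors = matrix_colors(jac)

# Then fill the sparse Jacobian with far fewer differentiation passes
forwarddiff_color_jacobian!(jac, f!, x, colorvec = colors)
```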

Developing Julia Packages


October 4 2019 in Julia | Tags: | Author: Christopher Rackauckas

Have you ever wanted to develop your own package for the Julia programming language? Have you ever wanted to contribute a bug fix? Then this tutorial is for you! I will walk you through getting the community resources (Discourse and Slack) so that you can get help, get the Juno and GitKraken development environments going, and show all of the steps of building a package. In this video you will learn how to use modules, how to interactively update a package without recompiling, how to set up continuous integration testing, and how to get your package registered. In addition, I show how to “dev” a package to get a local copy to work on, and use this to contribute a bug fix by opening a pull request for an issue on … READ MORE

When do micro-optimizations matter in scientific computing?


September 3 2019 in Julia, Programming, Scientific ML | Tags: | Author: Christopher Rackauckas

Something that has been bothering me about discussions of microbenchmarks is that people seem to ignore that the benchmarks are highly application-dependent. The easiest way to judge whether a benchmark really matters to a particular application is the operation overhead of the largest and most common calls. If a single operation dominates 99.9% of your runtime, making everything else 100x faster still won’t do anything to your real runtime performance. But at the same time, if a small operation sits in a tight loop, then its per-call overhead may well be your bottleneck. A classic chart of operation latencies is worth keeping in the back of your mind when considering optimizations.
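
To make the two regimes concrete, here is a hedged sketch (timings are machine-dependent, and BenchmarkTools.jl is assumed) contrasting a single dominant kernel with a tiny operation whose per-call costs are visible:

```julia
using BenchmarkTools, LinearAlgebra

# Regime 1: one large operation dominates. The O(n^3) kernel is essentially
# all of the runtime, so shaving overhead off everything else changes nothing.
A = rand(1000, 1000); B = rand(1000, 1000); C = zeros(1000, 1000)
@btime mul!($C, $A, $B)

# Regime 2: a tiny operation in a tight loop. Here per-call overhead
# (dispatch, allocation, bounds checks) can be the actual bottleneck.
a = rand(3, 3); b = rand(3, 3); c = zeros(3, 3)
@btime mul!($c, $a, $b)
```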

Here is a very brief overview on what to think about when optimizing code and how to figure out when to stop.

Function Call Overhead

When dealing with … READ MORE

The Essential Tools of Scientific Machine Learning (Scientific ML)


Scientific machine learning is a burgeoning discipline which blends scientific computing and machine learning. Traditionally, scientific computing focuses on large-scale mechanistic models, usually differential equations, that are derived from the scientific laws that simplify and explain phenomena. On the other hand, machine learning focuses on developing non-mechanistic, data-driven models which require minimal knowledge and prior assumptions. The two sides have their pros and cons: differential equation models are great at extrapolating, their terms are explainable, and they can be fit with small data and few parameters. Machine learning models, on the other hand, require “big data” and lots of parameters, but are not biased by the scientist’s ability to correctly identify valid laws and assumptions.

However, the recent trend has been to merge the two disciplines, allowing explainable models that are data-driven, require less data than traditional machine learning, and utilize the … READ MORE

Neural Jump SDEs (Jump Diffusions) and Neural PDEs


This is just an exploration of some new neural models I decided to jot down for safekeeping. DiffEqFlux.jl gives you the differentiable programming tools to mix any DifferentialEquations.jl problem type (DEProblem) with neural networks. We demonstrated this before, not just with neural ordinary differential equations, but also with things like neural stochastic differential equations and neural delay differential equations.
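
For flavor, here is a minimal neural ODE sketch in the spirit of DiffEqFlux.jl. The `NeuralODE` constructor and Flux layers follow the package docs, but exact signatures have shifted across DiffEqFlux versions, so treat this as an assumption-laden sketch rather than the definitive API:

```julia
using DiffEqFlux, Flux, OrdinaryDiffEq

# A small network defines the right-hand side du/dt = NN(u)
dudt = Chain(Dense(2, 16, tanh), Dense(16, 2))

# Wrap it as a solvable, differentiable ODE over t in [0, 1]
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(), saveat = 0.1f0)

u0 = Float32[2.0, 0.0]
sol = node(u0)      # forward solve; gradients can flow through the solver
pred = Array(sol)   # 2 x length(saveat) trajectory to feed into a loss
```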

At the time we made DiffEqFlux, we were the “first to the gate” for many of these differential equation types and left it as an open question for people to find a use for these tools. And judging by the arXiv papers that went out days after NeurIPS submissions were due, it looks like people now have justified some machine learning use cases for them. There were two separate papers on neural … READ MORE

Some State of the Art Packages in Julia v1.0


August 27 2018 in Julia, Programming | Tags: | Author: Christopher Rackauckas

In this post I would like to explain some of the state-of-the-art packages in Julia and how they are pushing their disciplines forward. The part I would really like to make clear is the ways in which Julia gives these packages a competitive advantage, and how these features are being utilized to reach their end goals. Maybe this will generate some more ideas!

What are the advantages for Julia in terms of package development?

Using Python or R to call other people’s packages is perfectly fine: your code will be just as fast as the package code, provided all of your time is spent in package code calls (I purposely leave MATLAB off this list because its interoperability is much more difficult to use). There is one big exception where using an efficient package can still be slow even when … READ MORE

PuMaS.jl: Pharmaceutical Modeling and Simulation Engine


August 13 2018 in Biology, Julia, Programming | Tags: | Author: Christopher Rackauckas

Here is an introduction to a pharmaceutical modeling project which I will be releasing in the near future. More details to come.

Solving Partial Differential Equations with Julia


Here is a talk from JuliaCon 2018 where I describe how to use the tooling across the Julia ecosystem to solve partial differential equations (PDEs), and how the different areas of the ecosystem are evolving to give top-notch PDE solver support.

Why Numba and Cython are not substitutes for Julia


August 6 2018 in Julia, Programming | Tags: | Author: Christopher Rackauckas

Sometimes people ask: why does Julia need to be a new language? What about Julia is truly different from tools like Cython and Numba? The purpose of this blog post is to describe how Julia’s design gives a very different package development experience from something like Cython, and how that can lead to many more optimizations. What I really want to show is:

  1. Julia’s compilation setup is built for the specialization of labor that is required for scientific progress
  2. Composition of Julia code can utilize the compilation process to build new programs which are greater than the sum of their parts (see the sketch after this list)
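
To make point 2 concrete, here is a hedged sketch of the kind of composition meant, in the spirit of the well-known Measurements.jl + DifferentialEquations.jl example: neither package knows about the other, yet Julia's compiler specializes the generic solver code on the uncertainty-carrying number type, so error propagation comes for free:

```julia
using Measurements, OrdinaryDiffEq

# Pendulum parameters with measurement uncertainty
g = 9.79 ± 0.02
L = 1.00 ± 0.01
u0 = [0.0 ± 0.0, π/60 ± 0.01]

# Generic ODE definition; nothing here mentions Measurements
pendulum!(du, u, p, t) = (du[1] = u[2]; du[2] = -(g / L) * sin(u[1]))

prob = ODEProblem(pendulum!, u0, (0.0, 10.0))
sol = solve(prob, Tsit5())   # every state carries a propagated uncertainty
```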

I will also explain some of the engineering tradeoffs which were made to make this happen. There are many state-of-the-art scientific computing and data science packages in Julia and what I want to describe is how these are using the more “hardcore … READ MORE

DifferentialEquations.jl’s Confederated Modular API


I wrote a manuscript describing DifferentialEquations.jl’s confederated modular API and its effect on the local scientific computing ecosystem. It’s now on arXiv until we can find the right venue for it.