DifferentialEquations.jl 3.0 and a Roadmap for 4.0


I am pleased to announce the release of DifferentialEquations.jl 3.0. In the last DiffEq blog post I described the current state of JuliaDiffEq and DifferentialEquations.jl along with the trajectory that we hoped to take. We identified (at that time) the current shortcomings of the software and our plans to remedy them. I also recently did a survey of differential equation suites in order to understand where we stand and see where we need to improve. These research efforts were used to put together a list of goals that were systematically achieved during 3.0. What I would like to do this time around is give a broad overview of what we have released in the 3.0 timeframe, the goals that we have achieved, and the goals that we are putting off (for the next Google Summer of Code?). And then, more importantly, I want to set some milestones for the next version. If you want to dig into our new features and start using them, please see the documentation. If you want to read the release posts, see the official JuliaDiffEq blog. This post isn't the most exciting because a lot of the release details were mentioned elsewhere (like the DiffEq suite survey), but I hope that by putting the development into context users can better track how we are doing.

A Quick Review of DifferentialEquations.jl Pre-3.0

In 1.0, we made everything work with generic types and event handling. In all of the native Julia solvers you could use arbitrary arithmetic and use events to have the ODEs do crazy things like change size over time. This was about features. In 2.0, we expanded our capabilities to cover "most" of what users tend to need: a broad array of ordinary differential equation (ODE) solvers, a broad array of stochastic differential equation (SDE) solvers, delay differential equation (DDE) solvers, and some partial differential equation (PDE) solvers. We added addons for parameter estimation, sensitivity analysis, uncertainty quantification, etc. This was really exciting because it was the first set of differential equation solvers with this range of applicability. It made it possible to solve many different types of problems: you "could" solve them. There were some edge cases for sure, but the main areas where the vast majority of users were looking were covered. Two major warts remained, though: stiff problems and PDEs.

Introducing DifferentialEquations.jl 3.0

There was an issue. Some specific types of problems, namely stiff differential equations, require specific types of methods. We had wrappers to common C/Fortran solvers for these, but this meant that we lost the type flexibility and event handling when solving these equations. Nor could we handle some of the more difficult problems, like state-dependent delays. Thus these types of problems were the focus of 3.0: to have some semblance of "completeness" or "coverage" with native Julia methods. The quick summary of DifferentialEquations.jl 3.0 is the following: for hard problems, we now have methods specifically suited to the problem. We have methods for stiff ODEs, SDEs, DDEs, etc., and these work with the differential-algebraic forms of each of these equations. We still need to round out the suite, but I am pleased to say that for hard problems which require special methods that can be difficult to implement, we do have options available for you. Let's go into some details.

Solvers for Stiff ODEs and DAEs

This is probably the area that will impact the most individuals. In DifferentialEquations.jl 3.0 we are happy to announce the release of a vast array of methods for solving stiff differential equations. While before we had wrappers for methods like CVODE from Sundials, LSODA, and radau from Hairer's software, our offering here wasn't too unique. Now, however, we have a wide array of high order methods for solving stiff ODEs. The centerpieces here are the Rosenbrock methods and the (E)SDIRK methods.

Rosenbrock methods are generally very good at higher tolerances. Hairer's second book showed that high order Rosenbrock methods tend to be the most efficient methods when the required accuracy is around 5 digits or less. This is huge because this is the amount of accuracy many people want. Our new offering of ODE solvers includes pretty much every Rosenbrock method that we could find proposed in the literature. These methods have special interpolations so that sol(t) not only acts as a continuous function of the solution, but this continuous function is in some sense "stiffness-aware" and can reproduce the solution, including its sharp turns between the solver's steps, with high accuracy. Being "generic" Julia implementations, these all work with a wide array of Julia-defined number types, including high-precision arithmetic and (if the Jacobian is defined, see below) complex numbers. As far as we know, this is the first set of stiff ODE solvers with this flexibility. They all fully conform to the framework of DifferentialEquations.jl, meaning that they have event handling, the integrator interface, and all of the other extra goodies. They have adaptive timestepping with automatic initial dt calculation and all of the other features that make them "fully automatic" solvers. Using mass matrices, these solvers can also handle DAEs.

In addition to the Rosenbrock methods, we released a large set of (E)SDIRK methods. These methods have a specialized quasi-Newton solver for their implicit equations, making them highly efficient, especially during stretches where the Jacobian changes slowly (the solver skips factorizations when it determines that it can). As with the Rosenbrock methods, these are "fully automatic" and work with all of the event handling, generic numbers, etc.
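To make this concrete, here is a minimal sketch of solving a stiff ODE with the new methods. It is written against the current problem interface (the 3.0-era in-place syntax was f(t,u,du) rather than today's f(du,u,p,t)), so treat the exact signatures as illustrative:

```julia
using OrdinaryDiffEq

# Stiff Van der Pol oscillator (mu = 10^6)
function vanderpol!(du, u, p, t)
    du[1] = u[2]
    du[2] = 1e6 * ((1 - u[1]^2) * u[2] - u[1])
end

prob = ODEProblem(vanderpol!, [2.0, 0.0], (0.0, 6.3))
sol  = solve(prob, Rosenbrock23())  # a Rosenbrock method
sol2 = solve(prob, TRBDF2())        # an ESDIRK method with the quasi-Newton solver

sol(3.0)  # the stiffness-aware continuous interpolation between steps
```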

So, how did we do? What do the benchmarks look like? DiffEqBenchmarks.jl is how we've been tracking some of our progress. The benchmarks there show that on the common test problems for stiff ODEs, these newest methods are the fastest we have available, even faster than CVODE from Sundials and radau from Hairer, in the "range of reasonable tolerances", i.e. where the user wants the error to be in the 9th digit or lower. I think that satisfies most use cases, so we are pretty happy with the results. Many other tests from users report similar results: our new Rosenbrock and (E)SDIRK methods benchmark as the fastest for achieving the desired accuracy.

We are not surprised, though: multistep methods like CVODE are specialized to decrease the number of function (f) calls, which is only useful when the system is sufficiently large. And fully implicit methods use larger linear system solves to attain very high order, which pays off when high accuracy is required. Thus in the case where the system function is very costly or the number of ODEs is huge, Sundials will still be a good choice. And if you need really high accuracy, radau will still be a good choice (note: these methods are still wrapped so you can keep using them). But outside of these cases, we find our new methods to perform really well.

Let me mention a few remaining caveats. One of them is complex numbers. Complex number handling is a little bit spotty in Julia packages right now. DifferentialEquations.jl fully supports them in each of the *DiffEq (OrdinaryDiffEq.jl, StochasticDiffEq.jl, DelayDiffEq.jl) solvers, along with all of the addons. However, we run into issues when interfacing with other packages. For example, ForwardDiff.jl and Calculus.jl cannot handle complex numbers. This becomes an issue only for the stiff solvers, because the stiff solvers require the ability to calculate Jacobians, for which we use these packages. This is why I added the caveat "if you provide your own Jacobian". We do, however, gain the capability to numerically compute Jacobians with complex numbers by replacing Calculus.jl with DiffEqDiffTools.jl in DifferentialEquations.jl 3.0 (a little premature, but the PR is just about done here). In a week or so this will be a problem of the past, and we hope to integrate our tools into things like Optim.jl and NLsolve.jl so that more packages support complex numbers (you quantum physicists keep emailing me! 🙂 ).
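As a sketch of the "provide your own Jacobian" workaround for complex numbers, here is a complex linear ODE with an analytic Jacobian, written with the current ODEFunction keyword (the 3.0-era API declared Jacobians differently, via overloads on the function):

```julia
using OrdinaryDiffEq, LinearAlgebra

# Complex linear ODE u' = A*u; handing the solver the Jacobian directly
# means it never calls ForwardDiff.jl or Calculus.jl
A = [0.0 -1.0im; -1.0im 0.0]
f!(du, u, p, t) = mul!(du, A, u)
jac!(J, u, p, t) = (J .= A)

prob = ODEProblem(ODEFunction(f!; jac = jac!), ComplexF64[1.0, 0.0], (0.0, 1.0))
sol = solve(prob, Rosenbrock23())
```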

Solvers for Stiff Delay Differential Equations

Our new methods in OrdinaryDiffEq.jl, the high-order Rosenbrock and (E)SDIRK methods, extend to stiff delay differential equations. These are specialized so that they can re-use step information to be more efficient than classic designs. We only know of one other freely available stiff delay differential equation solver (Radar5), but since we couldn't figure out how to get it to work (it requires some really intricate compilation binding, so I don't think it can be wrapped), we don't have anything to benchmark against. From what we've seen, though, it works well! Once again (as always), DiffEqBenchmarks.jl is the open resource for seeing how things are doing.
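For example, a stiff solver composes with the method of steps like so (a sketch in the current DelayDiffEq.jl syntax; the argument order differed in the 3.0 era):

```julia
using DelayDiffEq, OrdinaryDiffEq

# Delayed logistic equation u'(t) = u(t) * (1 - u(t - 1))
f!(du, u, h, p, t) = (du[1] = u[1] * (1 - h(p, t - 1)[1]))
h(p, t) = [0.5]  # history function for t <= 0

prob = DDEProblem(f!, [0.5], h, (0.0, 20.0); constant_lags = [1])
sol = solve(prob, MethodOfSteps(Rosenbrock23()))  # stiff ODE core re-used for DDEs
```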

Solvers for Stiff Stochastic Differential Equations

We wouldn’t be complete without saying that we also have methods for stiff stochastic differential equations. These are based on the SDIRK architecture of the ODE solvers and thus employ the same tricks to get efficiency. Not much more to say here.
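Well, one thing: here is a sketch of the usage on a made-up mean-reverting test equation, with ImplicitEM (the drift-implicit Euler-Maruyama method) standing in for the family:

```julia
using StochasticDiffEq

# Stiff scalar SDE: du = -100u dt + 0.1 dW
f(u, p, t) = -100.0 * u  # strongly contracting drift
g(u, p, t) = 0.1         # additive noise

prob = SDEProblem(f, g, 1.0, (0.0, 1.0))
sol = solve(prob, ImplicitEM())  # drift-implicit method for stiff SDEs
```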

Solvers for Ordinary, Stochastic, and Delay Differential-Algebraic Equations

Many of our stiff solvers allow for defining a mass matrix. The mass matrix is allowed to be singular, in which case the stiff solver will solve a differential-algebraic version of the ODE, SDE, or DDE. As far as we know, this is the first available set of solvers for SDAEs and DDAEs.
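For example, the classic Robertson problem becomes a DAE when the third equation is the conservation law, encoded as a zero row of the mass matrix. This sketch uses the current mass_matrix keyword; the 3.0-era syntax for attaching the mass matrix differed:

```julia
using OrdinaryDiffEq

# Robertson chemical kinetics in mass matrix DAE form:
# row 3 of M is zero, so 0 = y1 + y2 + y3 - 1 is an algebraic constraint
function rober!(du, u, p, t)
    y1, y2, y3 = u
    du[1] = -0.04y1 + 1e4 * y2 * y3
    du[2] =  0.04y1 - 1e4 * y2 * y3 - 3e7 * y2^2
    du[3] =  y1 + y2 + y3 - 1.0
end

M = [1.0 0.0 0.0; 0.0 1.0 0.0; 0.0 0.0 0.0]
prob = ODEProblem(ODEFunction(rober!, mass_matrix = M), [1.0, 0.0, 0.0], (0.0, 1e5))
sol = solve(prob, Rodas4())  # Rosenbrock method that accepts singular mass matrices
```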

Solvers for Second Order ODEs

Okay, you can convert second order ODEs to first order ODEs and solve them like that. However, when doing so you don't make use of the full structure of the second order equation, and thus you don't get full efficiency. Runge-Kutta-Nyström methods are designed directly for second order ODEs, and we now have these methods implemented. In addition, in many cases one wants to solve an equation such that certain quantities are known to be preserved over long-time integrations. Methods with this property are known as symplectic integrators, and we have implemented a large array of symplectic integrators.

The format for these is what we call a "dynamical ODE". The basic way to specify a DynamicalODEProblem is as a second order ODE. However, we also allow one to directly specify the Hamiltonian of a physical system, from which we use automatic differentiation to derive the equations of motion. Additionally, we allow a partitioned ODE form which lets one specify the velocity component directly, allowing for more advanced dynamics than a simple second order ODE. All of these problems can also be solved directly by first order ODE solvers, which will automatically do the conversion.
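A sketch of the second order form with a symplectic integrator (current syntax; the argument order is acceleration function, initial velocity, initial position):

```julia
using OrdinaryDiffEq

# Simple pendulum as a second order ODE: x'' = -sin(x)
accel!(a, v, x, p, t) = (a[1] = -sin(x[1]))

prob = SecondOrderODEProblem(accel!, [0.0], [1.0], (0.0, 100.0))
sol = solve(prob, VelocityVerlet(), dt = 0.01)  # symplectic long-time integration
```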

Solvers for State-Dependent Delay Differential Equations

State-dependent delay differential equations are delay differential equations where the delay depends on the state of the differential equation itself. For example: the growth of a fish population depends on the population from a few weeks ago (since that's when conception would have occurred), but when there are more fish there's a longer delay, since development slows when resources are scarce. This means that the derivative now depends on the state in the past, but how far in the past depends on the state right now!

Delay differential equations are complicated to solve because these delays propagate discontinuities. If you don't properly handle the discontinuities then you will not achieve high accuracy. For constant delays you know exactly where all of the discontinuities will be a priori, so you can have the solver hit exactly those points in time to avoid issues. For state-dependent delays, the timepoints of the discontinuities depend on the solution itself, so you need the numerical solution in order to know how to handle the discontinuities!

If you ever step over a discontinuity, you will suffer from increased error. Or will you? Questioning this assumption gives you the residual control methods. These are adaptive methods with a robust form of error estimation which try to detect discontinuities by stepping over them and seeing if the resulting error is high. This is the method that MATLAB's ddesd uses, so we implemented it as well. However, shortly before doing so, I received an email from some numerical delay differential equation researchers who questioned the validity of this approach: they had run MATLAB's ddesd on some test problems and found its error to be quite high. Well, our residual control methods match this behavior: they don't tend to get more than 3 digits of accuracy, but they are pretty fast. To be fair, Shampine's paper on ddesd said that it was for getting plotting accuracy, not necessarily scientific computing accuracy.

So that method handles one case; what about the high accuracy case? The JuliaDiffEq contributor David Widmann is the one to thank for this. Using the event handling setup in the ODE solvers, we set up a system by which the solver continuously tracks and detects discontinuities, and uses this to pull back and hit discontinuities "exactly". Tested against known solutions, this method is able to reach full floating point accuracy. It is also compatible with all ODE solvers via the method of steps, and thus allows for using stiff solvers and solving differential-algebraic delay equations via mass matrices.
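Here is a sketch of how a state-dependent lag is declared so that the solver can track its discontinuities. This uses the current DelayDiffEq.jl keywords, and the model itself is made up purely for illustration:

```julia
using DelayDiffEq, OrdinaryDiffEq

# Scalar DDE whose lag grows with the state: u'(t) = -u(t - (1 + |u(t)|))
f(u, h, p, t) = -h(p, t - (1 + abs(u)))
h(p, t) = 1.0  # history for t <= 0

prob = DDEProblem(f, 1.0, h, (0.0, 10.0);
                  dependent_lags = ((u, p, t) -> 1 + abs(u),))
sol = solve(prob, MethodOfSteps(Tsit5()))  # discontinuity tracking happens here
```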

Solvers for Boundary Value Problems (BVPs)

This is a result for which you can thank Google Summer of Code. Boundary value problems are extensions of ODEs which allow you to set conditions that the solution must satisfy. Normally one thinks of the two-point boundary value problem, where these conditions specify values the solution must take at the start and the end of the solution interval. We did create a method for two-point BVPs which mirrors that of bvp4c (though adaptivity is coming soon), but we generalized the allowed BVPs quite a bit. For many of the methods, you are able to specify conditions using the full solution and its interpolation. Thus one can make a "boundary condition" requiring, say, that the maximum of the second derivative over the full interval is 1. We honestly do not know of problems which utilize this full generality yet (though I do know of "multipoint BVPs", which are of course a subset of this), so we'd like to hear if you end up using this for something crazy.
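For the two-point case, here is a sketch in the current BoundaryValueDiffEq.jl interface. Note how the boundary condition function receives the whole solution object, which is exactly what enables the more general conditions described above:

```julia
using BoundaryValueDiffEq, OrdinaryDiffEq

# Pendulum with conditions at both endpoints: theta(0) = theta(T) = pi/2
function pend!(du, u, p, t)
    du[1] = u[2]
    du[2] = -9.81 * sin(u[1])
end

function bc!(residual, sol, p, t)
    residual[1] = sol[1][1] - pi / 2    # condition at the left endpoint
    residual[2] = sol[end][1] - pi / 2  # condition at the right endpoint
end

prob = BVProblem(pend!, bc!, [pi / 2, 0.0], (0.0, 1.5))
sol = solve(prob, Shooting(Tsit5()))
```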

Partial Differential Equation (PDE) Toolkits: Linear Operators and FEniCS

Oh, PDEs. If you watch my JuliaCon workshop, you'll see that the same two questions always come up: what about stiff solvers, and what about PDEs? I just told you about solvers for stiff differential equations for a wide variety of problems, so now let's address our PDE tools.

Early on in DifferentialEquations.jl I created a finite element toolbox to go along with the software. It was very basic, and I realized that approach would not scale, so instead we decided to wrap the popular FEM library FEniCS. This was part of a Google Summer of Code project which created FEniCS.jl. You can read the blog post which introduces it. What's cool is that the pieces FEniCS creates, the assembled operator equations, can be directly converted to sparse matrices in Julia, which can be used to solve time-dependent PDEs with our ODE solvers (or, for time-independent problems, you can just solve the implicit equation using whatever linear solver you choose from Julia).

But not every problem needs finite element methods. To help with finite difference methods, another GSoC project developed DiffEqOperators.jl which makes it easy to discretize PDEs via finite difference methods. Essentially, you tell it the derivatives you want to discretize and it spits out lazy (matrix-free) linear operators which are fully multithreaded and perform the stencil. Once again, this makes it easy to define the discretized ODE system from the PDE and then solve it using the ODE solvers. We also include upwind operators for stable discretizations of hyperbolic PDEs.
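As a sketch of the workflow (the operator is built by hand with a sparse matrix here; DiffEqOperators.jl produces the lazy, multithreaded equivalent of this A):

```julia
using OrdinaryDiffEq, SparseArrays, LinearAlgebra

# Heat equation u_t = u_xx on (0,1) with zero Dirichlet boundaries,
# second order central differences on N interior points
N  = 100
dx = 1 / (N + 1)
A  = spdiagm(-1 => fill(1.0, N - 1), 0 => fill(-2.0, N), 1 => fill(1.0, N - 1)) / dx^2

heat!(du, u, p, t) = mul!(du, A, u)
u0 = [sinpi(i * dx) for i in 1:N]

prob = ODEProblem(heat!, u0, (0.0, 0.1))
sol = solve(prob, Rosenbrock23())  # the discretized PDE is just a stiff ODE
```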

As you can see, this is a toolbox for solving PDEs. People who have a little bit of prior knowledge in solving PDEs can easily use these tools to build a method that solves their specific PDE. However, what we plan to do next is to use this toolbox to make some pre-made solvers for some common PDEs like diffusion-advection equations. With this development it will really complete our PDE story.

Conclusion: DifferentialEquations.jl 3.0 addresses the major concerns of the past

The main conclusion is this: people wanted methods for all sorts of stiff ordinary, stochastic, and delay differential equations, along with the differential-algebraic and partial differential variants, and they wanted these to work with generic numbers, event handling, all of the addons, etc. This announcement's tl;dr is simply that we listened, and we released. Of course, we aren't done (there's always more to do), but what we can say is that it is highly likely that one of our offerings will solve your differential equation well.

Roadmap for DifferentialEquations.jl 4.0

So what’s next? Well, we can always add some more methods which handle specific special cases better. That’s the goal of DifferentialEquations.jl 4.0 (and beyond). Here’s a look into what we have planned.

Multistep Methods (Adams, BDF) and Implicit ODEs

Classic multistep method solvers like LSODE and CVODE are some of the most commonly used methods. In most cases they aren’t the most efficient: this is a fact noted in Hairer’s benchmarks and now in ours. However they have major upsides when the user’s function f is expensive, or when the system of ODEs is large. This is something that comes up in large PDE discretizations and is why these methods are central to solving large-scale PDEs. We have put this off because it is a niche area and we have pretty good wrappers to the classics like Sundials, LSODA, DASKR… but it is definitely time that we tackle these methods with a native Julia implementation.

One other thing to mention here is that multistep BDF methods are also what traditionally gave rise to fully implicit DAE solvers like DASSL, DASPK, and IDA. This is an area where we have been lacking quite a bit in terms of native solver capabilities (mostly relying on wrappers), so we will need to spend some time here. We also need to build tooling for finding consistent initial conditions, as is done in these solvers… we're a ways off here. Modeling tools like Sims.jl and Modia.jl directly utilize IDA since we don't offer anything of interest in this area, but hopefully we can develop some native Julia tooling here and link it to these other packages, expanding the capabilities of not only our software but the modeling packages as well. If we can provide a better solving backend and they provide a great modeling front end, Julia will be the star of this field.

Fully Implicit Runge-Kutta Methods (radau)

Fully implicit Runge-Kutta methods also have a niche. Some, like radau, are great for high-accuracy (low tolerance) solving of stiff equations. Others are high order symplectic methods for stiff differential equations. They also do really well with DAEs in mass matrix form. Now that the core architecture is in place, these are definitely areas that we will tackle in the near future as well, not just for ODEs but also for SDEs (and note that the ODE part gives us DDE solvers for free as well).

Exponential Integrators

Exponential integrators allow you to exploit linearity in the definition of an ODE, SDE, or DDE. There are two forms of interest: $$u' = A(t)u$$, where $$A$$ is a time-dependent linear operator, and $$u' = Au + f(u,t)$$, where $$A$$ is a time-independent linear operator. The first form shows up in a lot of quantum mechanics situations. The latter comes from discretizations of semilinear PDEs. Both of these can be solved with standard first-order ODE solvers, but the efficiency can be improved by using $$A$$ directly.
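To make the semilinear case concrete, here is a minimal sketch of the first order exponential Euler method, using a dense matrix exponential for clarity; the real gains come from Krylov expmv/phimv routines when $$A$$ is large:

```julia
using LinearAlgebra

# Exponential Euler for u' = A*u + f(u,t):
#   u_{n+1} = u_n + dt * phi1(dt*A) * (A*u_n + f(u_n, t_n))
# where phi1(z) = (exp(z) - 1)/z; this treats the linear part exactly
# and the nonlinearity explicitly
phi1(Z) = (exp(Z) - I) / Z

function exp_euler(A, f, u0, tspan, dt)
    u, t = copy(u0), first(tspan)
    P = phi1(dt * A)  # A is constant, so precompute once
    while t < last(tspan)
        u += dt * (P * (A * u + f(u, t)))
        t += dt
    end
    return u
end

A = [-2.0 1.0; 1.0 -2.0]  # stiff linear part
f(u, t) = [sin(t), 0.0]   # nonstiff forcing
exp_euler(A, f, [1.0, 0.0], (0.0, 1.0), 0.01)
```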

We have already made great strides in this direction. There are some solvers released for both types of equations, and we have developed an interface, the DiffEqOperator, for handling the definition of $$A$$ in a way the solvers can exploit. However, the crucial linear algebra tools were picked up by Marcelo Forets and implemented in ExpoKit.jl, and using their expmv and phimv implementations we can tackle the higher order methods. I wouldn’t expect this until the next summer since I see portions of this project as a great Google Summer of Code project, so if you’re interested please feel free to get in contact with us.

Implicit-Explicit (IMEX) Methods

IMEX methods, where the user splits the function f into two portions so that one part is treated explicitly and one part is implicitly solved (i.e. a nonstiff and a stiff part), have recently gained a lot of popularity for solving PDEs. We have most of the pieces for high order IMEX methods. The ESDIRK methods from Kennedy and Carpenter are the additive Runge-Kutta methods that are used in Sundials' ARKODE IMEX solvers. We just haven't added the explicit part.

But we have the architecture which allows the user to define IMEX methods. There are also plenty of other IMEX methods which can be implemented. Since the architecture here is "already done", it's simply a matter of coding the inner loop for a few new methods. To me, this sounds like a great Google Summer of Code project as well, so we may be holding off on development here until next summer.
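For reference, the user-facing form of the splitting already exists as the split problem definition. A sketch with the current interface (treat the solver choice as illustrative, since the IMEX pairing described above landed after this announcement):

```julia
using OrdinaryDiffEq

# u' = f1(u) + f2(u,t): stiff linear piece treated implicitly,
# nonstiff forcing treated explicitly
f1(u, p, t) = -50.0 * u
f2(u, p, t) = sin(t)

prob = SplitODEProblem(f1, f2, 1.0, (0.0, 1.0))
sol = solve(prob, KenCarp4())  # additive (IMEX) Runge-Kutta method
```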

Figuring Out Our Partial Differential Equation (PDE) Interface

I was happy to release (at least the beginnings of) PDE toolkits… but that satisfies a small group of people who know numerical methods for PDEs and want these pieces in order to write solvers more easily. In practice, many scientists probably don’t know how to (or don’t want to) do this (they are specialists in science! Not solving PDEs!). We need to provide something like MATLAB’s pdepe: simple interfaces for solving common PDEs. While one way we will build these solvers will be to use our toolbox and build method of lines integrators, we will need to make use of the distributed architecture of DifferentialEquations.jl in order to get good coverage of the PDE landscape. Indeed, I know of some individuals like John Gibson who are building spectral PDE solvers and really seem to know what they are doing, and so it would be a shame if we didn’t have a way to allow users to directly interface with these tools (it’ll also be a great way for methods researchers to easily benchmark, hopefully pulling an even larger community of developers in).

The answer will be some kind of problem hierarchy where the user defines something like a DiffusionAdvectionProblem containing a bunch of functions, there is a common interpretation of what the problem definition is and how solvers should treat it, and all solvers use that to spit out a similar solution. We have the pieces of how that can work in our (fantastically outdated) FEM Heat and Poisson methods, but the issue is figuring out how to handle things like boundary conditions so that we can make the most out of everyone's PDE solvers without complicating the common interface. Once we have solvers all wrapped together, there will still be a nightmare: how do you document this? If we have a different set of four packages available for five different PDEs, the combinatorial explosion in documenting the nuances means we can't add all of this to our current documentation (which is already huge). So we may need a "different section" of the docs somehow? To me it's very unclear how we will document this well, so my plan is to just start adding the functionality and figure out how to document it as we go along. It will probably need to be rewritten a few times before it's any good, so please bear with us!

High Order Adaptive Solvers for Stiff SDEs

This is actually one of my recent research projects. I have new methods for high order adaptive solvers for stiff stochastic differential equations, and will be submitting the publication soon. When this is published, the associated methods will be released and we will have high order adaptive methods for stiff SDEs. So stay tuned!

Flesh out the BVP solvers

Our Shooting methods are very flexible, but they will never do well on problems which are sensitive to the initial condition. We instead need to give our MIRK methods all of the bells and whistles: continuous extension, adaptivity, etc., and we need to wrap some of the classic Netlib solvers into the same interface so we can start to do some comprehensive benchmarking.

Parallel-in-time ODE Solvers

During the last Google Summer of Code we took a stab at developing some parallel-in-time ODE solvers using neural networks. While the student did a great job of trying a bunch of different strategies, we have come to realize that neural networks simply do not "know" enough about the structure of the differential equations to be efficient. When talking to a friend about PDE solvers, he mentioned that he had tested the efficiency of TensorFlow's neural net PDE solver (which it demonstrates in a tutorial) against more standard methods and saw similar efficiency issues. So we tried, and it was definitely a very interesting research project, but this direction didn't yield the results that we hoped for.

Thus instead I am hoping to take a step over to different methods. Parallel-in-time integration methods like parareal integration and the XBraid software do exist, so I plan on taking a stab at adding these to the repertoire of DiffEq. Note that for large and expensive ODEs, SDEs, and DDEs, one can already parallelize the calculation of f using the current tooling. Thus these methods are for when you want to solve problems where the set of ODEs is quite small, yet you need to solve over a large timespan and have a lot of parallel computing power available. So once again there's no rush, as this is quite a niche, but we plan to get to it in the next release cycle. In fact, I hear it's so niche that I was told in an email that parareal is only good for long time integration when you have >128 cores available… let's make an open-source implementation and try it out ourselves.

Automatic Stiffness Detection and Switching

What's a stiff differential equation? That can be hard to explain and predict. For this reason, many people like the idea of automatic stiffness detection, allowing the method to automatically switch between solvers to handle the different types of equations more effectively without user input. We have all of the tooling for building this, and the user can actually specify switching strategies themselves using our CompositeAlgorithm setup, but in the next release cycle I hope to release some methods which have this built in, akin to something like LSODA. Note that my newest paper on SDE methods includes stiffness detection as well, so in one fell swoop we plan to add stiffness detection and switching for ODEs, SDEs, and DDEs (and of course the differential-algebraic versions via mass matrices, and…, you get the picture of how all of this stuff composes together!).
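For reference, a hand-rolled switching strategy with CompositeAlgorithm might look like the following. The choice function here is a crude, made-up stiffness proxy (the step size collapsing), not the detection strategy described above:

```julia
using OrdinaryDiffEq

# index 1 = explicit Tsit5 by default; switch to the stiff Rosenbrock23
# whenever the adaptive step size collapses
choice(integrator) = integrator.dt < 1e-6 ? 2 : 1
alg = CompositeAlgorithm((Tsit5(), Rosenbrock23()), choice)

f(u, p, t) = -1e4 * (u - cos(t)) - sin(t)
prob = ODEProblem(f, 1.0, (0.0, 1.0))
sol = solve(prob, alg)
```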

This is actually at a point where it would not be too difficult to do but would take a good chunk of time and also take some research time, so I am putting it off and hoping this can be a Google Summer of Code project or another project with a student.

Efficiency Improvements for Cheap DEs

One final wart in the DiffEq architecture: due to issues with Julia, it seems that if an ODE is "sufficiently cheap", i.e. takes less than a millisecond to solve, our setup has some inefficiencies. This comes from how we do the setup of the integrator and limitations Julia has on type inference. This overhead is constant, meaning that for more expensive ODEs/SDEs/DDEs it's still just a millisecond. However, it's annoying and we hope to remedy it. In reality, we haven't seen this affect real-world benchmarks, since sub-millisecond ODEs are not usually where people are looking at performance, but it does start to become an issue when attempting something like parameter estimation on cheap ODEs, since this involves solving the same ODE thousands of times. Part of the issue is due to the use of keyword arguments, whose performance will be fixed in Julia v0.7. For the inference issues, we hope to get the right updates into an early Julia v1.x, along with making a few structural changes, and then this issue will go away. Other enhancements along this line will be a reinit interface to make it easier to reuse internal caches, and cleaning up the solution type pre-allocation interface.

Conclusion

At this point, we are very satisfied with our offering. The 30 people who make up the JuliaDiffEq team have really built software which has the methods to solve "most" differential equations that users encounter, and to do so efficiently. In the coming months we hope to add extra methods for specific and important niches and fill in some holes. But altogether, I think we have a pretty solid offering, and everything else is (important) icing on the cake.
