What Agentic AI “Vibe Coding” Looks Like In The Hands Of Actual Programmers / Engineers
February 12 2026 in Julia, Programming | Tags: agentic ai, Claude, codex, llm, sciml | Author: Christopher Rackauckas
I often have people ask how I’m using Claude Code so much, given that I have a bot account storming the SciML Open Source Software repositories with tens to hundreds of PRs a day, many of them successful. Then GSoC students come in with Claude/Codex and spit out things that are clearly just bot spam, and many people ask: what is different? The difference is actually knowing the codebase and the domains. It turns out that if you know how to actually program, you can use the LLM-based interfaces as an accelerator for some of the tedious work you have to do. I tend to think about it the same as working with a grad student: you need to give sufficient information for it to work, and if you don’t get good stuff back it’s … READ MORE
Why Julia’s GPU Accelerated ODE Solvers are 20x-100x Faster than Jax and PyTorch
January 11 2026 in Differential Equations, Julia, Mathematics, Programming | Tags: gpu, julia, python | Author: Christopher Rackauckas
You may have seen the benchmark results and thought, “how the heck are the Julia ODE solvers on GPUs orders of magnitude faster than the GPU-accelerated Python libraries? That can’t be true.” In this talk I go into detail about the architectural differences between the Julia approach to generating GPU-accelerated solvers and the standard ML library approach to GPU usage. By the end of the talk you’ll have a good enough understanding of models of GPU acceleration to see why this performance difference exists, and which applications can take advantage of this performance improvement.
Claude Code sucks but is still useful: experiences maintaining Julia’s SciML scientific computing infrastructure
October 6 2025 in Differential Equations, Julia, Mathematics, Programming, Science, Scientific ML | Tags: agentic ai, Claude, differential equations, julia, llm, numerical analysis, physics, scientific computing, vibecoding | Author: Christopher Rackauckas
So it’s pretty public that for about a month now I’ve had 32 processes set up on one of the 64-core, 128 GB RAM servers to just ssh in, tmux to a window, and tell it to slam on some things non-stop. And it has been really successful!… with the right definition of success. Let me explain.
This is a repost of the long post in the Julia Discourse.
* How is Claude being used, and how useful has it been?
j-bowhay, post:1, topic:131009
I think the first will answer the others. Basically, Claude is really not smart at all. There is no extensive algorithm implementation that has come from AI. I know some GSoCers and SciML Small Grants applicants have used AI (many without disclosure) but no wholesale usage has … READ MORE
Implicit ODE Solvers Are Not Universally More Robust than Explicit ODE Solvers, Or Why No ODE Solver is Best
September 4 2025 in Differential Equations, Julia, Mathematics, Programming | Tags: bdf, euler, explicit, implicit, numerical analysis, ode, runge-kutta, solver | Author: Christopher Rackauckas
A very common adage in ODE solvers is that if you run into trouble with an explicit method, usually some explicit Runge-Kutta method like RK4, then you should try an implicit method. Implicit methods, because they do more work (solving an implicit system via a Newton method) and have “better” stability, are supposed to be the thing you go to on the “hard” problems.
This is at least what I heard at first, and then I learned about edge cases. Specifically, you hear people say “but for hyperbolic PDEs you need to use explicit methods”. You might even intuit from this “PDEs can have special properties, so sometimes special things can happen with PDEs… but ODEs, that should use implicit methods if you need more robustness”. This turns out to not be true, and really understanding the ODEs will help us understand better … READ MORE
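To make the stiffness side of the adage concrete, here is a minimal illustrative sketch (my own, not code from the post), using OrdinaryDiffEq.jl on a classic stiff linear test problem: an explicit Runge-Kutta method is stability-limited to tiny steps, while an implicit (Rosenbrock) method takes far fewer, larger steps.

```julia
# A fast decay onto a slowly varying solution: the textbook stiff scenario.
using OrdinaryDiffEq

f(u, p, t) = -1000.0 * (u - cos(t))
prob = ODEProblem(f, 0.0, (0.0, 2.0))

sol_explicit = solve(prob, Tsit5())         # explicit RK: step size capped by stability
sol_implicit = solve(prob, Rosenbrock23())  # implicit: step size set by accuracy alone

# The explicit solve needs vastly more steps to cover the same interval.
println((length(sol_explicit.t), length(sol_implicit.t)))
```

The point of the post, though, is that this intuition cuts both ways: on problems that are not stiff in this sense, the implicit method’s extra per-step work (and its damping of dynamics you may care about) buys you nothing, which is why no single solver is best.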
A Guide to Gen AI / LLM Vibecoding for Expert Programmers
August 22 2025 in Programming, Science | Tags: chatgpt, Claude, Generative AI, llm | Author: Christopher Rackauckas
I get it, you’re too good to vibe code. You’re a senior developer who has been doing this for 20 years and knows the system like the back of your hand. Or maybe you’re the star individual contributor who is the only person who can ever figure out how to solve the hard problems. Or maybe you’re the professor who created the entire subject of the algorithms you’re implementing. I don’t know you, but I do know that you think you’re too good to vibe code. And guess what, you’re absolutely and totally wrong.
Facetious? Maybe… but I will go even further.
No, you’re not too good to vibe code. In fact, you’re the only person who should be vibe coding.
I would have thought this statement was crazy just a month ago because this label of “expert” coder also applies to me. … READ MORE
Machine learning with hard constraints: Neural Differential-Algebraic Equations (DAEs) as a general formalism
June 3 2025 in Differential Equations, Julia, Mathematics, Programming, Science, Scientific ML, Uncategorized | Tags: adjoint methods, differential-algebraic equations, julia, modelingtoolkit, neural dae, numerical solvers | Author: Christopher Rackauckas
We recently released a new manuscript, Semi-Explicit Neural DAEs: Learning Long-Horizon Dynamical Systems with Algebraic Constraints, where we showed a way to develop neural networks in which any arbitrary constraint function can be directly imposed throughout the evolution to near floating point accuracy. In true academic form, the paper focuses directly on getting to the point about the architecture. Here I want to elaborate on the mathematical structures that surround the object, particularly the differential-algebraic equation (DAE): how its various formulations lead to the various architectures (such as stabilized neural ODEs), and how you would build the other related architectures which haven’t had a paper yet (and in what circumstances they would make sense).
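For reference (standard DAE notation, not taken from the manuscript), the semi-explicit form at the center of this discussion couples learned differential dynamics to hard algebraic constraints:

```latex
\begin{aligned}
\dot{x} &= f_\theta(x, z, t), && \text{(differential variables, learned dynamics)} \\
0 &= g(x, z, t), && \text{(algebraic constraints, imposed exactly)}
\end{aligned}
```

Stabilized neural ODE variants can be read as relaxations of this form: instead of enforcing $0 = g$ exactly, they add a feedback or penalty term that pulls trajectories back toward the constraint manifold.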
How chaotic is chaos? How some AI for Science / SciML papers are overstating accuracy claims
May 26 2025 in Differential Equations, Julia, Mathematics, Programming, Science, Scientific ML | Tags: | Author: Christopher Rackauckas
Just how chaotic are chaotic systems? Many of you may have heard of “the butterfly effect” but don’t quite know the mathematics behind such systems. What I want to demonstrate is the “sensitive dependence on initial conditions” property of chaotic systems and just how sensitive these systems are. The reason this has come up is that I have seen some AI papers claiming to be able to predict the time series of a chaotic system (many more can be found online too; I’m just highlighting a few random ones). What I want to bring to the forefront is an examination of what is really being claimed: just how hard is it to actually forecast a chaotic system? And if they aren’t doing that, what have they done instead?
Quick Understanding of Chaos: Sensitive Dependence and the Shadowing Lemma
First of … READ MORE
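A minimal sketch of what sensitive dependence looks like numerically (my own illustrative code on the standard Lorenz system, not taken from the post): perturb the initial condition at the tenth decimal place, solve both copies with tight tolerances, and the trajectories still end up completely different.

```julia
using OrdinaryDiffEq

# Classic Lorenz system with the standard chaotic parameters (10, 28, 8/3).
function lorenz!(du, u, p, t)
    du[1] = 10.0 * (u[2] - u[1])
    du[2] = u[1] * (28.0 - u[3]) - u[2]
    du[3] = u[1] * u[2] - (8.0 / 3.0) * u[3]
end

u0 = [1.0, 0.0, 0.0]
prob1 = ODEProblem(lorenz!, u0, (0.0, 50.0))
prob2 = remake(prob1; u0 = u0 .+ 1e-10)  # perturb at the 10th decimal place

sol1 = solve(prob1, Vern9(), abstol = 1e-12, reltol = 1e-12)
sol2 = solve(prob2, Vern9(), abstol = 1e-12, reltol = 1e-12)

# The perturbation grows roughly like exp(λ t) with λ ≈ 0.9, so by the end of
# the interval the two trajectories typically differ at O(1).
println(abs.(sol1[end] .- sol2[end]))
```

This is exactly why pointwise forecasting of a chaotic trajectory over long horizons is so hard, and why claims of doing so deserve a close look at what is actually being measured.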
A Hands on Introduction to Applied Scientific Machine Learning / Physics-Informed Learning
May 11 2025 in Julia, Scientific ML, Uncategorized | Tags: ml, neural networks, sciml | Author: Christopher Rackauckas
Presented at JuliaEO25
This is a hands-on introduction to Scientific Machine Learning that does not assume a background in machine learning. We start from scratch, showing the mathematical basis of “what is a neural network?” all the way up through adding physical intuition to the neural network and using it to solve problems ranging from epidemic outbreaks to improving sensor tracking of Formula 1 cars.
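The mathematical starting point mentioned above is simply that a (two-layer) neural network is a composition of affine maps with an elementwise nonlinearity:

```latex
\mathrm{NN}(x) = W_2 \, \sigma(W_1 x + b_1) + b_2
```

where $W_1, W_2$ are weight matrices, $b_1, b_2$ are bias vectors, and $\sigma$ is a nonlinear activation (e.g. $\tanh$) applied componentwise. Everything else in the tutorial builds on this object: training fits the parameters $(W_i, b_i)$, and physics-informed approaches constrain or augment $\mathrm{NN}$ with mechanistic terms.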
Open Source Component-Based Modeling with ModelingToolkit
May 5 2025 in Differential Equations, Julia, Mathematics, Programming | Tags: | Author: Christopher Rackauckas
Component-based modeling systems such as Simulink and Dymola allow for building scientific models in a way that can be composed. For example, Bob can build a model of an engine, and Alice can build a model of a drive shaft, and you can then connect the two models and have a model of a car. These kinds of tools are used all throughout industrial modeling and simulation in order to allow for “separation of concerns”, allowing experts to engineer their domain and compose the final digital twins with reusable scientific modules. But what about open source? In this talk we will introduce ModelingToolkit, an open source component-based modeling framework that allows for composing pre-built models and scales to large high-fidelity digital twins.
PyData is … READ MORE
The Numerical Analysis of Differentiable Simulation: Automatic Differentiation Can Be Incorrect
April 20 2025 in Differential Equations, Julia, Mathematics, Scientific ML | Tags: | Author: Christopher Rackauckas
The Numerical Analysis of Differentiable Simulation: How Automatic Differentiation of Physics Can Give Incorrect Derivatives
Scientific machine learning (SciML) relies heavily on automatic differentiation (AD), the process of constructing gradients of programs, including machine learning models integrated into mechanistic models, for the purpose of gradient-based optimization. While these differentiable programming approaches pitch the idea of “simply put the simulator into a loss function and use AD”, it turns out there are a lot more subtle details to consider in practice. In this talk we dive into the numerical analysis of differentiable simulation and ask the question: how numerically stable and robust is AD? We use examples from the Python-based Jax (diffrax) and PyTorch (torchdiffeq) libraries in order to demonstrate how canonical … READ MORE