# Algorithms
Detailed documentation for each optimisation and sampling algorithm in Diffid.
## Optimisers
Gradient-free and gradient-based algorithms for finding optimal parameters.
## Samplers
MCMC and nested sampling for uncertainty quantification and model comparison.
## Algorithm Comparison
| Algorithm | Type | Needs gradients | Best for | Parallelisable |
|---|---|---|---|---|
| Nelder-Mead | Local optimiser | No | < 10 params, noisy objectives | No |
| CMA-ES | Global optimiser | No | 10-100+ params | Yes |
| Adam | Local optimiser | Yes | Smooth objectives | No |
| Metropolis-Hastings | MCMC sampler | No | Uncertainty quantification | No |
| Nested Sampling | Nested sampler | No | Model comparison (evidence) | Yes |
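The sampler rows are the least self-explanatory, so here is a deliberately minimal random-walk Metropolis sketch targeting a 1D standard normal. It illustrates the general technique only, not Diffid's implementation; the inlined `XorShift` generator just keeps the example dependency-free.

```rust
// Minimal random-walk Metropolis-Hastings sketch (illustration only, not
// Diffid's implementation). Targets a 1D standard normal via its log-density.
struct XorShift(u64);

impl XorShift {
    fn next_f64(&mut self) -> f64 {
        // xorshift64: cheap pseudo-random numbers in [0, 1).
        self.0 ^= self.0 << 13;
        self.0 ^= self.0 >> 7;
        self.0 ^= self.0 << 17;
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
}

/// Unnormalised log-density of the target (standard normal).
fn log_target(x: f64) -> f64 {
    -0.5 * x * x
}

fn main() {
    let mut rng = XorShift(0x9E3779B97F4A7C15);
    let (mut x, step) = (0.0_f64, 1.0_f64);
    let mut samples = Vec::new();

    for _ in 0..10_000 {
        // Symmetric proposal: uniform step around the current state.
        let proposal = x + step * (rng.next_f64() - 0.5) * 2.0;
        // Accept with probability min(1, pi(proposal) / pi(x)).
        let log_alpha = log_target(proposal) - log_target(x);
        if rng.next_f64().ln() < log_alpha {
            x = proposal;
        }
        samples.push(x);
    }

    let mean: f64 = samples.iter().sum::<f64>() / samples.len() as f64;
    println!("posterior mean estimate: {mean:.3}"); // should be close to 0
}
```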
## Choosing an Algorithm
```mermaid
graph TD
    A[Start] --> B{Need uncertainty?}
    B -->|No| C{Gradients available?}
    B -->|Yes| D{Need evidence?}
    C -->|Yes| E[Adam]
    C -->|No| F{Problem size?}
    D -->|Yes| G[Nested Sampling]
    D -->|No| H[Metropolis-Hastings]
    F -->|< 10 params| I[Nelder-Mead]
    F -->|> 10 params| J[CMA-ES]
```
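The same decision logic, written out as a plain function. The `Problem` struct and `recommend` helper are hypothetical names used for illustration, not part of Diffid's API:

```rust
// Hypothetical helper mirroring the decision tree above.
struct Problem {
    needs_uncertainty: bool,
    needs_evidence: bool,
    has_gradients: bool,
    n_params: usize,
}

fn recommend(p: &Problem) -> &'static str {
    if p.needs_uncertainty {
        // Samplers: evidence estimation needs nested sampling.
        if p.needs_evidence { "Nested Sampling" } else { "Metropolis-Hastings" }
    } else if p.has_gradients {
        "Adam"
    } else if p.n_params < 10 {
        "Nelder-Mead"
    } else {
        "CMA-ES"
    }
}

fn main() {
    let p = Problem {
        needs_uncertainty: false,
        needs_evidence: false,
        has_gradients: false,
        n_params: 25,
    };
    println!("{}", recommend(&p)); // prints "CMA-ES"
}
```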
## Performance Characteristics

### Convergence Speed
Fast → Slow:
- Adam (with gradients)
- Nelder-Mead (small problems)
- CMA-ES (large problems)
- MCMC samplers (many evaluations)
- Nested sampling (most evaluations)
### Robustness to Local Minima
Least → Most robust:
- Adam (gradient descent)
- Nelder-Mead (local search)
- CMA-ES (global search)
- MCMC (explores posterior)
- Nested sampling (explores full space)
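A toy example makes the ranking concrete: plain gradient descent on a tilted double-well converges to whichever basin it starts in, which is exactly the failure mode that global search and sampling methods are more resistant to. This is a standalone sketch, not Diffid code:

```rust
/// Tilted double-well: global minimum near x = -1, local minimum near x = +1.
fn f(x: f64) -> f64 {
    (x * x - 1.0).powi(2) + 0.3 * x
}

/// Analytic gradient of the objective.
fn grad(x: f64) -> f64 {
    4.0 * x * (x * x - 1.0) + 0.3
}

fn main() {
    let mut x = 1.5; // start in the shallower (local) basin
    for _ in 0..1_000 {
        x -= 0.01 * grad(x); // fixed-step gradient descent
    }
    // Converges to the local minimum near x = 0.96, not the global one near x = -1.04.
    println!("x = {:.3}, f(x) = {:.3}", x, f(x));
}
```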
## Implementation Details
All algorithms are implemented in Rust for performance:
- Zero-copy: Efficient memory usage
- Parallel: Where applicable (CMA-ES, nested sampling); see the sketch below
- Numerically stable: Careful floating-point handling
- Well-tested: Comprehensive test suite
Source code: `rust/src/optimisers/`
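As a rough sketch of the population-level parallelism mentioned above (illustrative code using the `rayon` crate, not an excerpt from Diffid's source):

```rust
// Parallel evaluation of a CMA-ES-style candidate population.
// Requires the `rayon` crate; the objective here is a toy sphere function.
use rayon::prelude::*;

/// Toy objective: sum of squares.
fn objective(x: &[f64]) -> f64 {
    x.iter().map(|v| v * v).sum()
}

fn main() {
    // A population of candidate parameter vectors.
    let population: Vec<Vec<f64>> = (0..64)
        .map(|i| vec![i as f64 * 0.1, 1.0 - i as f64 * 0.05])
        .collect();

    // Candidates are evaluated concurrently on rayon's work-stealing thread pool.
    let fitness: Vec<f64> = population.par_iter().map(|x| objective(x)).collect();

    let best = fitness
        .iter()
        .enumerate()
        .min_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .unwrap();
    println!("best candidate index: {}, fitness: {:.3}", best.0, best.1);
}
```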
## References
Each algorithm page includes references to original papers and implementation details.