5-Minute Quickstart¶
This quickstart guide demonstrates scalar optimisation using the classic Rosenbrock function.
The Problem¶
The Rosenbrock function is defined as:
\[
f(x, y) = (1 - x)^2 + 100\,(y - x^2)^2
\]
The global minimum is at \((x, y) = (1, 1)\) with \(f(1, 1) = 0\).
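As a quick sanity check (plain Python, no diffid required), substituting \((1, 1)\) into the formula gives zero:
# Evaluate the Rosenbrock function at the stated global minimum (1, 1).
x, y = 1.0, 1.0
print((1 - x) ** 2 + 100 * (y - x ** 2) ** 2)  # 0.0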
Basic Example¶
import numpy as np
import diffid
def rosenbrock(x):
    """The Rosenbrock function - a classic optimisation test problem."""
    value = (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    return np.asarray([value])
# Build the problem
builder = (
    diffid.ScalarBuilder()
    .with_objective(rosenbrock)
    .with_parameter("x", 1.5)   # Initial guess
    .with_parameter("y", -1.5)  # Initial guess
)
problem = builder.build()
# Run optimisation (uses Nelder-Mead by default)
result = problem.optimise()
# Display results
print(f"Optimal parameters: {result.x}")
print(f"Objective value: {result.value:.3e}")
print(f"Success: {result.success}")
print(f"Iterations: {result.iterations}")
Output: the optimal parameters should land close to \((1, 1)\) with an objective value near zero, along with the success flag and the number of iterations taken.
Understanding the Code¶
- Define the objective function: the function must accept a NumPy array and return a NumPy array.
- Create a builder: ScalarBuilder() is used for scalar optimisation problems where you evaluate a function directly.
- Add parameters: use with_parameter(name, initial_value) to define the decision variables.
- Build the problem: call build() to create an optimisable problem instance.
- Run optimisation: call optimise() to run the default optimiser (Nelder-Mead).
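The same five steps apply to any scalar objective. As a minimal sketch (using only the calls shown above; the quadratic function and the parameter names a and b are made up for illustration), here is the pattern applied to a simple convex bowl:
import numpy as np
import diffid

def quadratic(x):
    """A hypothetical test objective: a convex bowl with its minimum at (2, -3)."""
    value = (x[0] - 2.0) ** 2 + (x[1] + 3.0) ** 2
    return np.asarray([value])

problem = (
    diffid.ScalarBuilder()           # create a builder
    .with_objective(quadratic)       # define the objective function
    .with_parameter("a", 0.0)        # add parameters with initial guesses
    .with_parameter("b", 0.0)
    .build()                         # build the problem
)

result = problem.optimise()          # run the default optimiser (Nelder-Mead)
print(result.x, result.value)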
Using Different Optimisers¶
You can specify which optimiser to use:
CMA-ES¶
# Use CMA-ES for global search
optimiser = diffid.CMAES().with_max_iter(1000).with_step_size(0.5)
result = optimiser.run(problem, [1.5, -1.5])
print(f"Optimal parameters: {result.x}")
print(f"Objective value: {result.value:.3e}")
Adam (Gradient-Based)¶
# Use Adam optimiser
optimiser = diffid.Adam().with_max_iter(1000).with_step_size(0.01)
result = optimiser.run(problem, [1.5, -1.5])
print(f"Optimal parameters: {result.x}")
print(f"Objective value: {result.value:.3e}")
Visualising the Optimisation¶
If you installed the plotting extra, you can visualise the optimisation landscape:
import numpy as np
import matplotlib.pyplot as plt
import diffid
def rosenbrock(x):
    value = (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    return np.asarray([value])
# Create a grid for plotting
x = np.linspace(-2, 2, 200)
y = np.linspace(-1, 3, 200)
X, Y = np.meshgrid(x, y)
Z = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        Z[i, j] = rosenbrock([X[i, j], Y[i, j]])[0]
# Plot contours
plt.figure(figsize=(10, 8))
levels = np.logspace(-1, 3.5, 20)
plt.contour(X, Y, Z, levels=levels, cmap='viridis')
plt.colorbar(label='f(x, y)')
# Mark the optimum
plt.plot(1.0, 1.0, 'r*', markersize=20, label='Global minimum')
# Run the optimisation and mark the point it finds
builder = (
    diffid.ScalarBuilder()
    .with_objective(rosenbrock)
    .with_parameter("x", -1.5)
    .with_parameter("y", -0.5)
)
problem = builder.build()
result = problem.optimise()
plt.plot(result.x[0], result.x[1], 'go', markersize=10, label='Found optimum')
plt.xlabel('x')
plt.ylabel('y')
plt.title('Rosenbrock Function Optimisation')
plt.legend()
plt.grid(True, alpha=0.3)
plt.show()
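The nested Python loop above is easy to read but slow for fine grids. Because the Rosenbrock function is built from elementwise operations, the same Z array can be computed directly on the meshgrid arrays with plain NumPy (no diffid involved):
# Vectorised equivalent of the nested loop above.
Z = (1 - X) ** 2 + 100 * (Y - X ** 2) ** 2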

Next Steps¶
- First ODE Fit: Learn how to fit differential equations to data
- Core Concepts: Understand the builder pattern and problem types
- Choosing an Optimiser: Learn when to use each optimiser
- Tutorials: Explore interactive Jupyter notebooks