Secant Method
Fundamental Principles
The Secant Method is an Open Method that mimics Newton’s Method but avoids calculating derivatives.
Geometric Basis
It approximates the tangent line in Newton’s method using a secant line passing through the two most recent points.
Core Concept
Unlike Bisection, it does not require the root to be bracketed. It uses two initial guesses to project a straight line to the x-axis.
Prerequisites & Conditions
As an open method, the requirements differ from bracketing methods:
Two Initial Guesses
Requires \( x_{n-1} \) and \( x_n \). They do not need to bracket the root (signs can be the same).
Continuity
The function \( f(x) \) should be continuous near the root.
Failure Condition
The method fails if the secant line is horizontal, i.e. the denominator \( f(x_n) - f(x_{n-1}) \) is zero (or numerically close to zero).
The Algorithm
The method generates a sequence \( x_0, x_1, x_2, \dots \) using the recurrence relation:

\[ x_{n+1} = x_n - f(x_n)\,\frac{x_n - x_{n-1}}{f(x_n) - f(x_{n-1})} \]

After each iteration, discard the oldest point and move the values forward:
- Old \( x_{n-1} \) is discarded.
- \( x_{n-1} \leftarrow x_n \)
- \( x_n \leftarrow x_{n+1} \)
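The recurrence and the shift can be sketched in Python as follows (the function and variable names are illustrative, not from any particular library):

```python
def secant_step(f, x_prev, x_curr):
    """One secant iteration: project the line through
    (x_prev, f(x_prev)) and (x_curr, f(x_curr)) onto the x-axis."""
    f_prev, f_curr = f(x_prev), f(x_curr)
    if f_curr == f_prev:
        # Failure condition: horizontal secant line (zero denominator)
        raise ZeroDivisionError("f(x_n) == f(x_{n-1}): horizontal secant line")
    return x_curr - f_curr * (x_curr - x_prev) / (f_curr - f_prev)

# Moving the values forward after each step:
#   x_prev, x_curr = x_curr, secant_step(f, x_prev, x_curr)
```

Note that only one new function evaluation is needed per iteration in a careful implementation, since \( f(x_n) \) can be reused from the previous step.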
Interactive Simulator
Simulate for \( f(x) = x^3 - x - 2 \), which has a root \( \approx 1.521 \).
Watch how quickly it converges compared to Bisection!
Initial guesses: \( x_0 = 1.0 \), \( x_1 = 2.0 \). The first step computes \( x_2 \).
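The first step of the simulation can be reproduced directly from the recurrence:

```python
f = lambda x: x**3 - x - 2

x0, x1 = 1.0, 2.0
f0, f1 = f(x0), f(x1)                    # f(1) = -2, f(2) = 4
x2 = x1 - f1 * (x1 - x0) / (f1 - f0)     # 2 - 4*(1)/6 = 4/3
print(x2)                                # 1.3333... (moving toward 1.521)
```

Note that \( x_2 \approx 1.333 \) lies *inside* the original interval here, but the secant method gives no such guarantee in general; later iterates may leave the interval entirely.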
Convergence & Performance
Superlinear Convergence
The error reduces according to the relation \( E_{n+1} \approx C \cdot E_n^\phi \), where \( \phi \) is the Golden Ratio.
- Order: \( \phi \approx 1.618 \).
- Speed: Significantly faster than Bisection (1.0) but slower than Newton-Raphson (2.0).
- Risk: Not guaranteed to converge. Can diverge if initial guesses are poor.
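The speed difference is easy to observe by counting iterations to the same tolerance for the example above. This is a sketch under the stated assumptions (tolerance on interval width / step size; `bisection_iters` and `secant_iters` are illustrative names):

```python
def bisection_iters(f, a, b, tol=1e-10):
    """Count bisection iterations until the bracket width drops below tol."""
    n = 0
    while (b - a) > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
        n += 1
    return n

def secant_iters(f, x0, x1, tol=1e-10, max_iter=100):
    """Count secant iterations until the step size drops below tol."""
    n = 0
    while abs(x1 - x0) > tol and n < max_iter:
        f0, f1 = f(x0), f(x1)
        if f1 == f0:                      # horizontal secant line: stop
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
        n += 1
    return n

f = lambda x: x**3 - x - 2
print(bisection_iters(f, 1.0, 2.0))       # 34 iterations to halve 1.0 below 1e-10
print(secant_iters(f, 1.0, 2.0))          # far fewer, thanks to superlinear order
```

Bisection needs \( \lceil \log_2(1/\epsilon) \rceil \) halvings regardless of the function, while the secant method's superlinear order finishes in single digits here.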
Visualizing Speed
Plotting the error against the iteration count shows a rapid drop compared to linear methods such as Bisection.
Method Comparison
| Feature | Bisection / Regula Falsi | Secant Method | Newton-Raphson |
|---|---|---|---|
| Type | Bracketing (Closed) | Open | Open |
| Convergence | Guaranteed | Not Guaranteed | Not Guaranteed |
| Speed (Order) | Linear (1.0) | Superlinear (~1.618) | Quadratic (2.0) |
| Derivatives? | No | No | Yes (Required) |
| Initial Guesses | 2 (Must bracket) | 2 (Any distinct) | 1 |
Stopping Criteria
The algorithm stops when:
1. Tolerance
Difference between steps is tiny: \( |x_{n+1} - x_n| < \epsilon \).
2. Function Value
Close to zero: \( |f(x_{n+1})| < \epsilon \).
3. Safety Limit
Max iterations reached (prevents infinite loops if divergent).
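Putting the recurrence, the failure guard, and all three stopping criteria together gives a complete loop. A minimal sketch (the name `secant` and the defaults are illustrative):

```python
def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Secant iteration with the three stopping criteria above."""
    for _ in range(max_iter):              # 3. safety limit on iterations
        f0, f1 = f(x0), f(x1)
        if f1 == f0:                       # horizontal secant line: give up
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:             # 1. tolerance on the step size
            return x2
        if abs(f(x2)) < tol:               # 2. function value close to zero
            return x2
        x0, x1 = x1, x2                    # shift values forward
    return x1                              # best estimate so far

root = secant(lambda x: x**3 - x - 2, 1.0, 2.0)
print(round(root, 3))                      # 1.521
```

In practice the step-size and function-value tolerances are often kept separate, since \( |x_{n+1} - x_n| \) and \( |f(x_{n+1})| \) live on different scales.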
