
Hill Climbing, A*, Annealing

Algorithmics (HESS)
StudyPulse

01 May 2026

Heuristic Search: Hill Climbing, A*, and Simulated Annealing

When problems are too large for exhaustive search, heuristic methods guide the search using problem-specific knowledge to find good (not necessarily optimal) solutions efficiently.

Heuristic Functions

A heuristic function $h(n)$ estimates the cost or quality of a solution from state $n$. Good heuristics are computationally cheap, correlated with actual solution quality, and — for A* — admissible (never overestimate true cost).


Hill Climbing

Concept: Start from an initial state. Repeatedly move to the neighbour with the best heuristic value. Stop when no neighbour improves the score (local maximum).

function HillClimbing(problem):
    current = initial_state()
    while True:
        neighbours = get_neighbours(current)
        best = argmax(h(n) for n in neighbours)
        if h(best) <= h(current):
            return current  # stuck at local maximum
        current = best
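The pseudocode above translates almost directly into runnable Python. In this sketch the objective `h` and the integer neighbourhood are illustrative choices (not from the course material), picked so that the search demonstrably stops at a local maximum:

```python
def hill_climbing(start, neighbours, h):
    """Greedy ascent: move to the best neighbour until none improves h."""
    current = start
    while True:
        best = max(neighbours(current), key=h)
        if h(best) <= h(current):
            return current  # no neighbour is strictly better: local maximum
        current = best

# Illustrative objective: a local peak at x = 2 and a global peak at x = 8.
def h(x):
    return -(x - 2) ** 2 if x < 5 else 20 - (x - 8) ** 2

neighbours = lambda x: [x - 1, x + 1]

print(hill_climbing(0, neighbours, h))  # climbs to the nearest peak, x = 2
print(hill_climbing(7, neighbours, h))  # starting in the other basin reaches x = 8
```

Note how the result depends entirely on the starting state: from x = 0 the climber is trapped at x = 2 even though h(8) = 20 is far better, which is exactly the local-maximum limitation listed below.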

Limitations of Hill Climbing

Problem        Description
Local maxima   Gets stuck at a peak that is not the global optimum
Plateaux       Flat regions where all neighbours have equal value
Ridges         Narrow optimal paths that are hard to follow step-by-step

COMMON MISTAKE: Hill climbing does not backtrack. Once stuck at a local maximum it cannot escape. It is not guaranteed to find the global optimum.


The A* Algorithm

A* is an informed graph search that finds the optimal path by combining actual cost and heuristic estimate:

$$f(n) = g(n) + h(n)$$

Where $g(n)$ is the actual cost from start to $n$ and $h(n)$ is the heuristic estimate from $n$ to the goal. A* always expands the node with the lowest $f(n)$.

function AStar(G, start, goal, h):
    open_set = priority queue {start: f(start)}
    g[start] = 0
    while open_set is not empty:
        current = open_set.extract_min()
        if current == goal: return reconstruct_path(came_from, goal)
        for each neighbour v of current with edge weight w:
            tentative_g = g[current] + w
            if tentative_g < g.get(v, infinity):
                g[v] = tentative_g
                came_from[v] = current            # remember the path
                open_set.insert(v, g[v] + h(v))  # priority f(v) = g(v) + h(v)
    return failure
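A runnable version of this pseudocode can be sketched with Python's `heapq`, using lazy deletion (stale queue entries are skipped when popped) instead of a decrease-key operation. The graph, node names, and zero heuristic below are illustrative; with h = 0, A* behaves exactly like Dijkstra's algorithm:

```python
import heapq

def a_star(graph, start, goal, h):
    """A* over a weighted graph given as {node: [(neighbour, weight), ...]}.
    Returns (cost, path) or None if the goal is unreachable."""
    g = {start: 0}
    came_from = {}
    open_set = [(h(start), start)]  # entries are (f = g + h, node)
    while open_set:
        f, current = heapq.heappop(open_set)
        if current == goal:
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return g[goal], path[::-1]
        if f > g[current] + h(current):
            continue  # stale entry superseded by a cheaper re-insertion
        for v, w in graph.get(current, []):
            tentative_g = g[current] + w
            if tentative_g < g.get(v, float("inf")):
                g[v] = tentative_g
                came_from[v] = current
                heapq.heappush(open_set, (tentative_g + h(v), v))
    return None

# Toy graph; h = 0 is always admissible (it never overestimates).
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)]}
print(a_star(graph, "A", "D", h=lambda n: 0))  # → (4, ['A', 'B', 'C', 'D'])
```

Re-inserting a node with a lower f rather than updating it in place is a common, simple implementation choice; the stale-entry check keeps it correct.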

Admissibility and Optimality

A heuristic is admissible if $h(n) \leq h^*(n)$ for all $n$, where $h^*(n)$ is the true optimal cost from $n$ to the goal — that is, the heuristic never overestimates. With an admissible heuristic, A* is guaranteed to find the optimal solution.

Example: In pathfinding on a 4-connected grid with unit step costs, the Manhattan distance is admissible because no path can reach the goal in fewer steps than the Manhattan distance.
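This can be checked empirically. The sketch below (grid size, wall positions, and helper names are all illustrative) computes the exact shortest 4-connected path length with a breadth-first search and compares it with the Manhattan estimate. Even when a wall forces a detour, the heuristic stays at or below the true cost:

```python
from collections import deque

def true_distance(width, height, walls, start, goal):
    """Exact shortest 4-connected path length via BFS (unit step costs)."""
    dist = {start: 0}
    q = deque([start])
    while q:
        x, y = q.popleft()
        if (x, y) == goal:
            return dist[(x, y)]
        for nx, ny in [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]:
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in walls and (nx, ny) not in dist):
                dist[(nx, ny)] = dist[(x, y)] + 1
                q.append((nx, ny))
    return None  # goal unreachable

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

# A wall between start and goal forces a detour: the true cost (6) exceeds
# the Manhattan estimate (2), but never the other way around.
walls = {(1, 0), (1, 1)}
start, goal = (0, 0), (2, 0)
print(manhattan(start, goal))                   # 2
print(true_distance(3, 3, walls, start, goal))  # 6
```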

KEY TAKEAWAY: A* finds the optimal path when the heuristic is admissible. It typically expands far fewer nodes than BFS or Dijkstra's algorithm because the heuristic steers the search toward the goal.


Simulated Annealing

Concept: Inspired by metallurgical annealing. Allows worsening moves with a probability that decreases over time, enabling escape from local optima.

function SimulatedAnnealing(problem, T_initial, T_min, cooling_rate):
    current = initial_state()
    T = T_initial
    while T > T_min:
        neighbour = random_neighbour(current)
        delta = quality(neighbour) - quality(current)
        if delta > 0:
            current = neighbour           # always accept improvement
        elif random() < exp(delta / T):
            current = neighbour           # accept worse state probabilistically
        T = T * cooling_rate
    return current
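As a runnable sketch of the pseudocode above, the version below maximises the same kind of bumpy objective used to illustrate hill climbing — a local peak at x = 2 and a global peak at x = 8. The objective, neighbourhood, and parameter values are illustrative assumptions, and the fixed seed makes the probabilistic run reproducible:

```python
import math
import random

def simulated_annealing(start, neighbour, quality, t_initial=10.0,
                        cooling_rate=0.995, t_min=1e-3, seed=0):
    """Always accept improvements; accept worse moves with prob exp(delta/T)."""
    rng = random.Random(seed)
    current, t = start, t_initial
    while t > t_min:
        candidate = neighbour(current, rng)
        delta = quality(candidate) - quality(current)
        if delta > 0 or rng.random() < math.exp(delta / t):
            current = candidate  # exp(delta/T) -> 0 as T cools, so worse
        t *= cooling_rate        # moves become rare (geometric cooling)
    return current

# Illustrative objective: local peak at x = 2, global peak at x = 8.
def quality(x):
    return -(x - 2) ** 2 if x < 5 else 20 - (x - 8) ** 2

neighbour = lambda x, rng: x + rng.choice([-1, 1])

best = simulated_annealing(0, neighbour, quality)
print(best, quality(best))
```

Unlike hill climbing from the same start (which is trapped at x = 2), the high-temperature phase lets the search accept downhill steps and cross into the better basin; by the time T is small, only improving moves are accepted, so the run finishes at a peak.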

Key Parameters

Parameter                Effect
Initial temperature $T$  High $T$ = more exploration (worse moves accepted more often)
Cooling rate             Slow cooling = better quality, slower runtime
$T_{\min}$               Stopping criterion: the search halts once $T$ drops below it

EXAM TIP: Simulated annealing is probabilistic and does not guarantee the optimal solution. The key insight is that temperature controls the probability of accepting worse moves — high temperature early allows exploration; low temperature later enforces exploitation.


Comparison Summary

Method                 Optimal?             Handles Local Optima?   Cost
Hill Climbing          No                   No                      Cheap
A* (admissible $h$)    Yes                  N/A (graph search)      Moderate
Simulated Annealing    No (probabilistic)   Yes                     Moderate

VCAA FOCUS: Know how each algorithm works, when to apply each, and their limitations. A* requires an admissible heuristic for optimality. Simulated annealing uses a temperature parameter to escape local optima. Hill climbing is fast but incomplete.
