The Forward-Backward algorithm is an iterative algorithm used to answer smoothing queries: computing p(Xk | e1:t) where 1 <= k < t. Note that we are computing an entire distribution over Xk, not a single probability.
FORWARD-BACKWARD-ALGORITHM(X, e, k, t, bn)
// Split the evidence
past_evidence = e1:k
future_evidence = ek+1:t
F = FORWARD(X, past_evidence, k, bn) // Returns a distribution p(Xk | e1:k) with length equal to the domain size of Xk
B = BACKWARD(X, future_evidence, k, t, bn) // Returns a vector p(ek+1:t | Xk) with length equal to the domain size of Xk
S = ()
for p in domain of Xk
S(p) = F(p)*B(p)
end
return S.normalize()
end
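The final multiply-and-normalize step of FORWARD-BACKWARD-ALGORITHM can be sketched in plain Python. The numbers in F and B below are hypothetical placeholder values for a binary Xk, not outputs specified anywhere in these notes:

```python
# Hypothetical results of the FORWARD and BACKWARD calls for a binary Xk.
F = [0.627, 0.373]  # assumed p(Xk | e1:k)
B = [0.69, 0.41]    # assumed p(ek+1:t | Xk)

# Pointwise product over the domain of Xk, then normalization.
S = [f * b for f, b in zip(F, B)]
total = sum(S)
S = [s / total for s in S]  # S is now the smoothed distribution p(Xk | e1:t)
print(S)
```

Because the product F(p)*B(p) is only proportional to p(Xk | e1:t), the normalization at the end is what turns S back into a distribution.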
The Backward algorithm is an iterative algorithm for computing p(ek+1:t | Xk).
Note that we must compute this quantity for every value of Xk, even though the result is not a distribution over Xk.
BACKWARD-ALGORITHM(X, e, k, t, bn)
// The initial backward vector is p(et+1:t | Xt), where et+1:t is an empty evidence sequence
// Thus we initialize B to all 1s
// The length of B is the domain size of the state variable X, even though B is not a distribution over X
B = (1...1)
// Each iteration computes p(ei:t | Xi-1)
// The last iteration, when i=k+1, is ultimately what we are interested in computing
for i=t...k+1
C = {} // Will hold the updated values
// We must compute the above quantity for each value of Xi-1
for u in domain of Xi-1 // u is a value of Xi-1; renamed to avoid clashing with the probability notation p(...)
// Sum over all possible values of Xi
sum = 0
for q in domain of Xi
sum += p(ei | Xi = q) * B(q) * p(Xi = q | Xi-1 = u)
end
C(u) = sum
end
B = C // Update the vector
end
return B
end
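The recursion above can be made runnable for a small two-state chain. The transition model T, the sensor model E, and the boolean evidence values below are assumptions chosen for illustration (umbrella-world-style numbers), not models given in these notes:

```python
# Assumed transition model: T[u][q] = p(Xi = q | Xi-1 = u)
T = [[0.7, 0.3],
     [0.3, 0.7]]
# Assumed sensor model: E[obs][q] = p(ei = obs | Xi = q)
E = {True:  [0.9, 0.2],
     False: [0.1, 0.8]}

def backward(evidence, k):
    """Return the vector p(ek+1:t | Xk), where evidence[i-1] holds ei
    and t = len(evidence)."""
    t = len(evidence)
    B = [1.0, 1.0]                 # p(et+1:t | Xt): empty evidence, all 1s
    for i in range(t, k, -1):      # i = t ... k+1, as in the pseudocode
        e_i = evidence[i - 1]
        C = [0.0, 0.0]
        for u in range(2):         # each value of Xi-1
            # Sum over all possible values q of Xi
            C[u] = sum(E[e_i][q] * B[q] * T[u][q] for q in range(2))
        B = C                      # update the vector
    return B

print(backward([True, True], 1))
```

Each pass through the outer loop folds one more evidence variable into B, so after the i = k+1 iteration B holds exactly p(ek+1:t | Xk).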