The Hamilton-Jacobi-Bellman (HJB) equation is a partial differential equation that is central to optimal control theory. Classical variational problems, such as the brachistochrone problem, can be solved with this method. The HJB equation also generalizes to stochastic systems.
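As a sketch of the standard deterministic form (the symbol names here are conventional, not taken from the source): for a value function V(x, t), system dynamics dx/dt = f(x, u), running cost C(x, u), and terminal cost D(x) at horizon T, the HJB equation reads

```latex
\frac{\partial V}{\partial t}(x,t)
  + \min_{u}\Bigl\{ \nabla_x V(x,t) \cdot f(x,u) + C(x,u) \Bigr\} = 0,
\qquad V(x,T) = D(x).
```

Solving this PDE backward in time from the terminal condition yields the optimal cost-to-go, and the minimizing u at each (x, t) gives the optimal feedback control.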