Optimal control of partial differential equations is an active branch of pure and applied mathematics, with applications in many fields such as biology, chemistry, and engineering control. Optimal control theory deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. It is an extension of the calculus of variations and provides mathematical optimization methods for deriving control policies. The theory is largely due to the work of Lev Pontryagin and Richard Bellman in the 1950s, following earlier contributions to the calculus of variations by Edward J. McShane. Optimal control can be viewed as a control strategy within control theory.
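As a concrete illustration of finding a control that achieves an optimality criterion (a standard model problem sketched here for orientation, not one taken from the discussion above), consider a linear-quadratic problem constrained by an elliptic PDE, where $y_d$ denotes a prescribed desired state and $\alpha > 0$ a regularization parameter:
\begin{equation*}
\min_{u \in L^2(\Omega)} \; J(y,u) = \frac{1}{2}\int_\Omega (y - y_d)^2 \, dx + \frac{\alpha}{2}\int_\Omega u^2 \, dx
\qquad \text{subject to} \qquad
\begin{cases}
-\Delta y = u & \text{in } \Omega,\\
y = 0 & \text{on } \partial\Omega.
\end{cases}
\end{equation*}
Here the control $u$ enters the state equation as a source term, and the optimality criterion balances tracking of the desired state $y_d$ against the cost of exerting the control.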