Control theory can be roughly classified as deterministic or stochastic, and each of these can further be subdivided into game theory and optimal control theory. The central problem of control theory is so-called constrained maximization (which, with slight modifications, is equivalent to minimization). One can say, heuristically, that the major problem of control theory is to find the maximum of some performance criterion (or criteria), given a set of constraints. The starting point is, of course, a mathematical representation of the performance criterion (or criteria), sometimes called the objective functional, along with the constraints. When the objective functional is single-valued (i.e., when there is only one objective to be maximized), one is dealing with optimal control theory. When more than one objective is involved, and the objectives are generally incompatible, one is dealing with game theory.

The first paper deals with stochastic optimal control, using the dynamic programming approach. The next two papers deal with deterministic optimal control, and the final two deal with applications of game theory to ecological problems.

In his contribution, Dr. Marc Mangel applies the dynamic programming approach, as modified by his recent work with Dr. Colin Clark of the University of British Columbia (Mangel and Clark 1987), to modelling the "behavioral decisions" of insects. The objective functional is a measure of fitness. Readers interested in a detailed development of the subject matter may consult Mangel (1985). My own contributions deal with two applications of optimal control theory.
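To fix ideas, the constrained maximization problem described above can be sketched, in generic form, as the choice of a control path that maximizes the objective functional subject to the system dynamics. The symbols below (state x, control u, horizon T, and the functions f and g) are generic placeholders rather than notation from any of the contributed papers:

\[
\max_{u(\cdot)} \; J[u] = \int_{0}^{T} f\bigl(x(t), u(t), t\bigr)\, dt
\quad \text{subject to} \quad \dot{x}(t) = g\bigl(x(t), u(t), t\bigr), \qquad x(0) = x_0 .
\]

Minimization is recovered simply by maximizing \(-J\). When several players each control their own \(u_i\) and pursue incompatible objectives \(J_i\), the problem becomes a game rather than a single-objective optimal control problem.

Likewise, the dynamic programming approach to behavioral decisions can be illustrated very schematically by backward induction over a single state variable, with expected fitness as the quantity being maximized. The following sketch is a hypothetical illustration under assumed numbers and behaviour labels; it is not the model of Mangel and Clark (1987):

```python
# Minimal backward-induction sketch of a state-variable dynamic programming
# model for behavioral decisions.  All numbers and behaviour names are
# hypothetical illustrations, not taken from Mangel and Clark (1987).

T = 10          # number of decision periods
X_MAX = 10      # maximum energy reserves (discrete state variable)

# Hypothetical behavioural options: (label, survival probability, energy change)
BEHAVIORS = [
    ("forage", 0.95, +2),   # risky but energetically rewarding
    ("rest",   0.99, -1),   # safe but reserves are slowly depleted
]

def terminal_fitness(x):
    """Fitness credited at the final period; here simply proportional to reserves."""
    return x / X_MAX

def solve():
    # F[t][x] = maximum expected fitness from period t onward with reserves x.
    F = [[0.0] * (X_MAX + 1) for _ in range(T + 1)]
    policy = [[None] * (X_MAX + 1) for _ in range(T)]

    # Terminal condition.
    for x in range(X_MAX + 1):
        F[T][x] = terminal_fitness(x)

    # Backward induction over decision periods.
    for t in range(T - 1, -1, -1):
        for x in range(X_MAX + 1):
            if x == 0:
                F[t][x] = 0.0          # reserves exhausted: zero fitness
                continue
            best_value, best_label = -1.0, None
            for label, survival, gain in BEHAVIORS:
                x_next = min(max(x + gain, 0), X_MAX)
                value = survival * F[t + 1][x_next]
                if value > best_value:
                    best_value, best_label = value, label
            F[t][x] = best_value
            policy[t][x] = best_label
    return F, policy

if __name__ == "__main__":
    F, policy = solve()
    print("Optimal behaviour at t=0 by reserve level:", policy[0][1:])
```

The essential feature illustrated here is that the optimal behaviour depends jointly on the animal's state (its reserves) and on the time remaining, which is the qualitative insight the state-variable approach is designed to capture.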