The use in statistical theory of approximate arguments based on such methods as local linearization (the delta method) and approximate normality has a long history. Such ideas play at least three roles. First, they may give simple approximate answers to distributional problems where an exact solution is known in principle but is difficult to implement. The second role is to yield higher-order expansions from which the accuracy of simple approximations may be assessed and, where necessary, improved. Thirdly, the systematic development of a theoretical approach to statistical inference that will apply to quite general families of statistical models demands an asymptotic formulation, as far as possible one that will recover 'exact' results where these are available.

The approximate arguments are developed by supposing that some defining quantity, often a sample size but more generally an amount of information, becomes large: it must be stressed that this is a technical device for generating approximations whose adequacy always needs assessing, rather than a 'physical' limiting notion. Of the three roles outlined above, the first two are quite close to the traditional roles of asymptotic expansions in applied mathematics, and much of the very extensive literature on the asymptotic expansion of integrals and of the special functions of mathematical physics is quite directly relevant, although the recasting of these methods into a probability mould is quite often enlightening.
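As a brief illustration of the first role, consider the delta method in its simplest first-order form; the notation here is introduced only for this sketch and is not taken from the surrounding text. If an estimator $\hat\theta_n$ based on a sample of size $n$ satisfies $\sqrt{n}\,(\hat\theta_n - \theta) \rightarrow N(0, \sigma^2)$ in distribution, and $g$ is differentiable at $\theta$ with $g'(\theta) \neq 0$, then local linearization of $g$ about $\theta$ gives
\[
\sqrt{n}\,\{g(\hat\theta_n) - g(\theta)\} \rightarrow N\bigl(0,\; g'(\theta)^{2}\sigma^{2}\bigr)
\]
in distribution, so that an approximate normal distribution for $g(\hat\theta_n)$ is available even when its exact distribution is awkward to obtain.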