The aim of this book is to design fast feed-forward neural networks as a method for solving two-point boundary value problems for ordinary differential equations; that is, to design fully connected networks, with links between all nodes in adjacent layers, that can speed up solution times, reduce solver failures, and increase the likelihood of obtaining the globally optimal solution. We train the suggested networks with the Levenberg-Marquardt, BFGS quasi-Newton, Bayesian regularization, and conjugate gradient (with Polak-Ribière update) training algorithms, and then accelerate the networks by modifying these algorithms, many of which have a very fast convergence rate for networks of reasonable size. The modified algorithms have a variety of computation and storage requirements; however, none of them possesses global properties, such as stability and convergence, suited to all problems, and all of them are applied to solving two-point boundary value problems. Finally, we illustrate the suggested networks by solving a variety of model problems and present comparisons with solutions obtained using other methods.
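As a concrete illustration of this neural-network approach, the sketch below solves a simple two-point boundary value problem with a one-hidden-layer tanh network whose trial solution satisfies the boundary conditions by construction, fitted with SciPy's Levenberg-Marquardt least-squares routine. This is a minimal sketch under assumed choices, not the book's exact formulation: the test equation y'' = -y, the trial-solution form, the network size, and the collocation grid are all illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the book's exact method):
# solve y'' = -y, y(0) = 0, y(1) = sin(1)  (exact solution y = sin x)
# with a one-hidden-layer tanh network and Levenberg-Marquardt least squares.
import numpy as np
from scipy.optimize import least_squares

H = 10                         # hidden units (illustrative size)
A, B = 0.0, np.sin(1.0)        # boundary values y(0), y(1)
x = np.linspace(0.0, 1.0, 50)  # collocation points

def unpack(p):
    w, b, v = p[:H], p[H:2*H], p[2*H:]
    return w, b, v

def net(p, x):
    """Network output N(x) and its first/second x-derivatives (analytic)."""
    w, b, v = unpack(p)
    z = np.outer(x, w) + b            # shape (len(x), H)
    t = np.tanh(z)
    s = 1.0 - t**2                    # sech^2(z)
    N  = t @ v
    N1 = (s * w) @ v                  # dN/dx
    N2 = (-2.0 * t * s * w**2) @ v    # d2N/dx2
    return N, N1, N2

def trial(p, x):
    """Trial solution T(x) = (1-x)A + xB + x(1-x)N(x) meets the BCs exactly."""
    N, N1, N2 = net(p, x)
    T  = (1 - x) * A + x * B + x * (1 - x) * N
    T2 = -2 * N + 2 * (1 - 2 * x) * N1 + x * (1 - x) * N2
    return T, T2

def residual(p):
    T, T2 = trial(p, x)
    return T2 + T                     # enforce y'' + y = 0 at collocation points

p0 = 0.1 * np.random.default_rng(0).standard_normal(3 * H)
sol = least_squares(residual, p0, method='lm')  # Levenberg-Marquardt fit

T, _ = trial(sol.x, x)
print("max error vs sin(x):", np.max(np.abs(T - np.sin(x))))
```

The same residual formulation can be fitted with the other optimizers mentioned above (e.g. BFGS or conjugate gradient on the sum of squared residuals); only the parameter-update rule changes, which is the point of comparing the training algorithms.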