Over the last decade, statistical learning theory has made rapid progress, driven by the introduction and study of classification algorithms such as support vector machines and boosting. Alongside their successful practical applications, the theoretical performance of these algorithms has become well understood in terms of margin bounds, Bayes risk consistency, and asymptotic rate analysis. This monograph investigates these algorithms further within regularization frameworks and from an approximation theory point of view. Error analysis frameworks based on error decomposition techniques are fully developed for two classes of regularization schemes, which cover support vector machines, regularized boosting, and support vector kernel networks based on linear programming and indefinite kernels. The results presented in this monograph are the best available to date. The error analysis frameworks have proved widely applicable in recent research and should shed light on future research on related topics in machine learning and artificial intelligence.
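For orientation, a schematic sketch of the kind of regularization scheme and error decomposition referred to above is given below; the notation (a convex loss \(\phi\), a reproducing kernel Hilbert space \(\mathcal{H}_K\), a regularization parameter \(\lambda\), and a regularizing function \(f_\lambda\)) is illustrative only, and the precise spaces, losses, and constants are those defined in the monograph itself:
\[
  f_{\mathbf{z},\lambda} \;=\; \arg\min_{f \in \mathcal{H}_K}
  \left\{ \frac{1}{m} \sum_{i=1}^{m} \phi\bigl(y_i f(x_i)\bigr)
          \;+\; \lambda \,\|f\|_K^2 \right\},
\]
and, writing \(\mathcal{E}\) and \(\mathcal{E}_{\mathbf{z}}\) for the expected and empirical \(\phi\)-risks and \(f_\phi\) for a minimizer of \(\mathcal{E}\), the excess risk admits the decomposition
\[
  \mathcal{E}(f_{\mathbf{z},\lambda}) - \mathcal{E}(f_\phi)
  \;\le\;
  \underbrace{\bigl\{\mathcal{E}(f_{\mathbf{z},\lambda}) - \mathcal{E}_{\mathbf{z}}(f_{\mathbf{z},\lambda})\bigr\}
    + \bigl\{\mathcal{E}_{\mathbf{z}}(f_\lambda) - \mathcal{E}(f_\lambda)\bigr\}}_{\text{sample error}}
  \;+\;
  \underbrace{\mathcal{E}(f_\lambda) + \lambda \,\|f_\lambda\|_K^2 - \mathcal{E}(f_\phi)}_{\text{approximation error}},
\]
which follows from the minimizing property \(\mathcal{E}_{\mathbf{z}}(f_{\mathbf{z},\lambda}) + \lambda\|f_{\mathbf{z},\lambda}\|_K^2 \le \mathcal{E}_{\mathbf{z}}(f_\lambda) + \lambda\|f_\lambda\|_K^2\). The sample error is controlled by concentration and capacity arguments, while the approximation error is estimated by approximation theory, which is the viewpoint emphasized throughout.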