The nearest neighbor classifier (NNC), a non-parametric pattern classification technique, is not only simple to use but also often performs well. It is applied in several domains, such as data mining, machine learning, and image/video/audio data analysis and retrieval. It has some shortcomings, however: it can be biased because of the curse of dimensionality, and it can impose a heavy computational burden. With a large number of training instances it can achieve better classification accuracy, but this worsens the computational burden. The monograph presents a series of techniques whereby the number of training instances can be artificially increased and stored in compact data representation schemes; that is, the training set size is virtually increased while the computational burden is not. In particular, the monograph presents pattern synthesis techniques, called partition-based pattern synthesis and overlap-based pattern synthesis, together with their respective compact data structures. It is shown, both theoretically and experimentally, that the proposed methods are effective. The monograph also presents some other improvements to NNC, including a constant-time NNC.
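
To make the baseline concrete, the following is a minimal sketch of the standard 1-NN rule under plain Euclidean distance. The function and variable names (`nn_classify`, `train_X`, `train_y`) are illustrative assumptions, not the monograph's notation, and the sketch does not include the partition-based or overlap-based synthesis methods; it only shows why the classification cost of plain NNC grows with the number of stored training instances.

```python
import numpy as np

def nn_classify(query, train_X, train_y):
    """Assign the query the class label of its nearest training instance
    (1-NN, Euclidean distance). Illustrative sketch only."""
    # Distance to every stored training instance: the cost is linear in
    # the training set size, which is the burden the monograph's compact
    # representations are designed to avoid.
    dists = np.linalg.norm(train_X - query, axis=1)
    return train_y[np.argmin(dists)]

# Toy usage with two classes in two dimensions.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.2, 0.1], [0.9, 1.1]])
y = np.array([0, 1, 0, 1])
print(nn_classify(np.array([0.1, 0.2]), X, y))  # -> 0
```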