Data fusion is the process of combining information obtained from many heterogeneous sensors, carried on many platforms, into a single composite picture. Data fusion research focuses on developing models that can unify images of different scales and complex information content. In this book, several fusion techniques are used to improve low-resolution remotely sensed images and to update their information, under the assumption that the high-resolution images are more recent than the restored image. The fusion methods used are: (1) fusion techniques based on wavelet transformations, namely the Haar and the 3/5-tap wavelet transformations; (2) the Lab, HLS, and YUV color-space transformation algorithms; and (3) the Principal Component Analysis (PCA) transformation technique. The quality of the produced images has been assessed subjectively, by presenting the obtained results, and quantitatively, by measuring the Normalized Mean Square Error (NMSE), the Signal-to-Noise Ratio (SNR), and the Peak Signal-to-Noise Ratio (PSNR) of the output images against the large-scale panchromatic SPOT scene.
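As a rough illustration of the quantitative evaluation step, the sketch below computes the three metrics in their commonly used forms with NumPy; the function names, the choice of the panchromatic reference, and the 8-bit peak value are assumptions for illustration, and the exact definitions used in the book may differ.

```python
import numpy as np

def nmse(reference, fused):
    """Normalized mean square error of the fused image against the reference."""
    reference = reference.astype(np.float64)
    fused = fused.astype(np.float64)
    return np.sum((reference - fused) ** 2) / np.sum(reference ** 2)

def snr(reference, fused):
    """Signal-to-noise ratio in decibels, treating the residual as noise."""
    reference = reference.astype(np.float64)
    fused = fused.astype(np.float64)
    noise_energy = np.sum((reference - fused) ** 2)
    return 10.0 * np.log10(np.sum(reference ** 2) / noise_energy)

def psnr(reference, fused, peak=255.0):
    """Peak signal-to-noise ratio in decibels (peak=255 assumes 8-bit imagery)."""
    reference = reference.astype(np.float64)
    fused = fused.astype(np.float64)
    mse = np.mean((reference - fused) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

# Hypothetical usage: compare a fused output band against the panchromatic SPOT scene.
# pan = load_band("spot_pan.tif"); out = load_band("fused_result.tif")
# print(nmse(pan, out), snr(pan, out), psnr(pan, out))
```

Lower NMSE and higher SNR/PSNR values indicate that the fused output is closer to the reference scene.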