Neural Machine Translation (NMT) uses artificial neural networks such as the Transformer, the state-of-the-art model that shows promising results over previous MT approaches. Numerous ancient manuscripts written in the Ge'ez language, held in Ethiopia and abroad, still need to be translated. Currently, young people and researchers are interested in learning about and engaging in research on Ge'ez and Amharic manuscripts. This thesis, therefore, aims to demonstrate the capabilities of deep learning algorithms on MT tasks for these morphologically rich languages. A bi-directional, text-based Ge'ez-Amharic MT system was tested on two different deep learning models, namely Seq2Seq with attention and the Transformer.
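The core operation shared by both models named above is attention: each decoder step computes a weighted sum over the source-side representations. The following is a minimal sketch of scaled dot-product attention with toy vectors chosen purely for illustration; it is not the thesis's implementation, and the vector values are hypothetical.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single decoder query.

    query: one vector (list of floats)
    keys, values: one vector per source token
    Returns the attention-weighted sum of the value vectors.
    """
    d = len(query)
    # Similarity of the query to each source key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    # Weighted sum of value vectors (the "context" vector).
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

# Toy example: the query is most similar to the second source key,
# so the context vector is pulled toward the second value vector.
q = [1.0, 0.0]
K = [[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]]
V = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
ctx = attention(q, K, V)
```

In Seq2Seq with attention this weighting is applied over encoder RNN states at every decoding step; in the Transformer the same operation, batched into matrix form and repeated across multiple heads and layers, replaces recurrence entirely.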