I have long been intrigued by computational linguistics. The paradox, however, is that the further I went, the less confident I became in statistical methods for dealing with human language. Over the past decade, the rapid development of NLP has brought with it deep-rooted drawbacks inherent in machine learning. These problems have proven more difficult than we expected, and their solution is unlikely to rest on a foundation of clean, beautiful mathematics. Partly because of the influence of Noam Chomsky, natural language researchers came to focus on syntax and parsing. Even so, AI systems remain a very long way from the field's original goal of creating flexible, integrated, human-like intelligence. Yet it is still meaningful to build valuable technologies in specific narrow domains, for AI as a field can pause occasionally to take advantage of these technologies, which in turn propel us toward that original goal. That is how Rumination on Natural Language Processing came about: it records the effort and contemplation that have brought me to where I am. I hope some of the thoughts in this book will be of help in your own work.