Multi-task learning is an approach to machine learning that learns a problem together with other related problems at the same time, using a shared representation. This often leads to a better model for the main task, because it allows the learner to exploit the commonality among the tasks; multi-task learning is therefore a form of inductive transfer. A minimal sketch of the shared-representation idea appears after the next paragraph.

Progol, an inductive logic programming system, allows arbitrary Prolog programs as background knowledge and arbitrary definite clauses as examples. Despite this generality, in benchmark tests the efficiency of Progol compares favourably with that of FOIL.
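To make the shared-representation idea concrete, the following is a minimal sketch of hard parameter sharing, assuming PyTorch and two hypothetical related tasks (a regression task and a binary classification task). The model, data, and dimensions are illustrative stand-ins, not drawn from the text above.

# A minimal sketch of multi-task learning via hard parameter sharing,
# assuming PyTorch and two hypothetical related tasks that share an input space.
import torch
import torch.nn as nn

class SharedRepresentationModel(nn.Module):
    """A shared trunk learns features used by both task-specific heads."""
    def __init__(self, input_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Shared representation: its parameters receive gradients from both tasks.
        self.shared = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
        )
        # Task-specific heads: each is updated only by its own task's loss.
        self.regression_head = nn.Linear(hidden_dim, 1)
        self.classification_head = nn.Linear(hidden_dim, 1)

    def forward(self, x):
        features = self.shared(x)
        return self.regression_head(features), self.classification_head(features)

# Toy training step on random data (stand-ins for real task datasets).
model = SharedRepresentationModel(input_dim=10)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 10)
y_reg = torch.randn(32, 1)                    # targets for task 1 (regression)
y_cls = torch.randint(0, 2, (32, 1)).float()  # targets for task 2 (classification)

pred_reg, logit_cls = model(x)
# Summing the per-task losses makes the shared layers learn a representation
# that serves both tasks, which is the source of the inductive transfer.
loss = nn.functional.mse_loss(pred_reg, y_reg) + \
       nn.functional.binary_cross_entropy_with_logits(logit_cls, y_cls)

optimizer.zero_grad()
loss.backward()
optimizer.step()

Summing the per-task losses is the simplest way to let the commonality among tasks shape a single representation; weighting the individual losses is a common refinement when one task should dominate.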