In this paper, we study an important but under-investigated transfer learning problem termed Distant Domain Transfer Learning (DDTL), a topic closely related to negative transfer. Unlike conventional transfer learning, which assumes that the source and target domains are more or less similar, DDTL aims to transfer knowledge effectively even when the domains or tasks are completely different. As an extreme example in image classification, the source domain may contain only unlabeled images of watches, airplanes, and horses, while the target domain has only a small set of labeled human face images. A few instance-based distant domain transfer algorithms have previously been proposed for such problems, but most of them are highly task-specific and handle only binary classification. In this study, we propose a novel feature-based distant domain transfer learning algorithm that requires only a tiny set of labeled target data and unlabeled source data from completely different domains. Instead of selecting intermediate instances, we introduce Distant Feature Fusion (DFF), a novel feature selection method that discovers features shared across distant domains and tasks by using a convolutional autoencoder with a domain distance measure as the feature extractor. Unlike prior work, DFF effectively handles both multi-class and binary distant domain image classification problems. Moreover, it achieves up to 19% higher classification accuracy than “non-transfer” algorithms and up to 9% higher than existing distant transfer algorithms.
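To make the autoencoder-plus-domain-distance idea concrete, the sketch below (our illustration, not the authors' released code) combines a reconstruction loss on both domains with a penalty on the distance between encoded source and target features. We use maximum mean discrepancy (MMD) with an RBF kernel as the domain distance; the MMD choice, the function names, and the weighting parameter `lam` are all assumptions, since the abstract does not specify the measure.

```python
import numpy as np

def mmd_distance(xs, xt, gamma=1.0):
    """Squared MMD with an RBF kernel between two batches of features.

    xs, xt: arrays of shape (n_samples, n_features). A value near zero
    means the two batches look like draws from the same distribution.
    """
    def k(a, b):
        # Pairwise squared Euclidean distances via broadcasting,
        # then the RBF kernel exp(-gamma * ||a_i - b_j||^2).
        d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d)
    return k(xs, xs).mean() + k(xt, xt).mean() - 2.0 * k(xs, xt).mean()

def dff_style_loss(x_src, rec_src, x_tgt, rec_tgt,
                   feat_src, feat_tgt, lam=0.5):
    """Sketch of a DFF-style objective: reconstruct inputs from both
    domains while pulling their encoded features together.

    rec_* are the autoencoder reconstructions, feat_* the encoder
    outputs; lam trades off reconstruction against domain alignment.
    """
    rec = ((x_src - rec_src) ** 2).mean() + ((x_tgt - rec_tgt) ** 2).mean()
    return rec + lam * mmd_distance(feat_src, feat_tgt)
```

Minimizing such a loss would push the encoder toward features that are both reconstructive and domain-invariant, which is one plausible reading of "general features across distant domains."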