The proposed work focuses on several fundamental research questions concerning the aggregate loss: Are there types of aggregate loss beyond the average of individual losses? If so, what is a general abstract formulation of these new aggregate losses? How can the new aggregate losses be adapted to different machine learning problems? And what are the statistical and computational behaviors of machine learning algorithms that use them? Pursuing answers to these questions, we organize the proposed project into four interrelated thrusts.

1. Rank-based aggregate losses for binary classification. Building upon our recent work, we will explore new types of rank-based aggregate losses for binary classification and study efficient algorithms for optimizing the learning objectives formed from them.

2. Theoretical analysis of rank-based aggregate losses. To deepen our understanding of the binary classification algorithms developed with rank-based aggregate losses, we will study their asymptotic behaviors, such as generalization and consistency.

3. Rank-based aggregate losses for other learning problems. The rank-based aggregate losses will be extended to other supervised problems (multi-class and multi-label learning and supervised metric learning) and to unsupervised learning.

4. General properties and new types of aggregate losses. An aggregate loss will be abstracted as a set function that maps the ensemble of individual losses to a single number. This abstraction will be exploited to study the general properties of aggregate losses and to propose new forms.
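To make the set-function abstraction concrete, the following sketch contrasts the standard average aggregate loss with two rank-based alternatives. The average top-k form shown here is one illustrative possibility, included as an assumption for exposition; it is not necessarily among the specific new forms the proposal will develop.

```python
import numpy as np

def average_loss(losses):
    """Standard aggregate: the mean of all individual losses."""
    return np.mean(losses)

def maximum_loss(losses):
    """Rank-based aggregate: the single largest individual loss."""
    return np.max(losses)

def average_top_k_loss(losses, k):
    """Rank-based aggregate: the mean of the k largest individual losses.

    Interpolates between the maximum loss (k = 1) and the average
    loss (k = n), illustrating how an aggregate loss acts as a set
    function over the ensemble of individual losses.
    """
    return np.mean(np.sort(losses)[-k:])

# Example: individual losses of four training samples.
individual_losses = np.array([0.1, 0.2, 0.5, 2.0])
```

Each function maps the whole ensemble of individual losses to a single number; only the rank-based variants depend on the ordering of the losses rather than their sum alone.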