06 June
Learning from Rankings
We consider learning from rankings, i.e., learning from a dataset containing subsets of samples ranked w.r.t. their relative order. For example, a medical expert presented with patient records can order them w.r.t. the relative severity of a disease. Rankings are often less noisy than class labels: human experts who disagree when assigning class labels often exhibit less variability when asked to compare samples instead. Rankings are also more informative, as they capture both inter- and intra-class relationships; the latter are not revealed by class labels alone. Nevertheless, the combinatorial nature of rankings significantly increases the computational cost of training. We propose spectral algorithms to accelerate training in this ranking regression setting; our main technical contribution is to show that the Plackett-Luce negative log-likelihood, augmented with a proximal penalty, has stationary points that satisfy the balance equations of a Markov chain. This observation yields fast spectral algorithms for ranking regression with both shallow and deep neural network regression models. Compared to state-of-the-art Siamese networks, the resulting algorithms are up to 175 times faster and improve predictions by up to 26% in Top-1 accuracy and 6% in Kendall-Tau correlation over five real-life ranking datasets.
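For context, here is a minimal sketch of the Plackett-Luce objective the abstract refers to, in its standard form; the exact parameterization and the proximal penalty used in the talk are not spelled out here. Given positive scores \(\pi_i\) for the samples in a subset, the Plackett-Luce probability of observing the ranking \(\sigma_1 \succ \sigma_2 \succ \dots \succ \sigma_m\) is
\[
P(\sigma_1 \succ \dots \succ \sigma_m) \;=\; \prod_{j=1}^{m} \frac{\pi_{\sigma_j}}{\sum_{k=j}^{m} \pi_{\sigma_k}},
\]
so the negative log-likelihood over a dataset \(\mathcal{D}\) of observed rankings is
\[
\mathcal{L}(\pi) \;=\; -\sum_{\sigma \in \mathcal{D}} \sum_{j=1}^{|\sigma|} \Big( \log \pi_{\sigma_j} - \log \sum_{k=j}^{|\sigma|} \pi_{\sigma_k} \Big).
\]
In a ranking regression setting one would typically tie the scores to sample features, e.g., \(\pi_i = \exp(f_\theta(x_i))\) for a shallow or deep regression model \(f_\theta\); this parameterization is an illustrative assumption, not necessarily the one used in the work presented.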
Stratis Ioannidis is an associate professor in the Electrical and Computer Engineering Department of Northeastern University, in Boston, MA, where he also holds a courtesy appointment with the Khoury College of Computer Sciences. He received his B.Sc. (2002) in Electrical and Computer Engineering from the National Technical University of Athens, Greece, and his M.Sc. (2004) and Ph.D. (2009) in Computer Science from the University of Toronto, Canada. Prior to joining Northeastern, he was a research scientist at the Technicolor research centers in Paris, France, and Palo Alto, CA, as well as at Yahoo Labs in Sunnyvale, CA. He is the recipient of an NSF CAREER Award, a Google Faculty Research Award, a Facebook Research Award, a Martin W. Essigmann Outstanding Teaching Award, and several best paper awards. His research interests span machine learning, distributed systems, networking, optimization, and privacy.