Large Scale Ranking Using Stochastic Gradient Descent
DOI: https://doi.org/10.7546/CRABS.2022.10.03

Keywords: ranking, stochastic gradient descent, least-squares, gradient methods, stochastic optimization

Abstract
Any ranking problem that minimizes a pairwise ranking loss can be represented as a system of linear equations. We use a fast variant of the gradient descent algorithm, with a near-optimal learning rate and momentum factor, to solve this linear system iteratively. Tikhonov regularization is integrated into the framework to avoid overfitting on very large, high-dimensional, but sparse data.
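The abstract outlines the approach without detail, so the following is a minimal sketch of the idea in Python/NumPy, not the authors' implementation: a pairwise least-squares ranking loss with Tikhonov regularization reduced to a linear system, solved by gradient descent with momentum. The heavy-ball step size and momentum formulas are Polyak's classical near-optimal choices, assumed here as one plausible reading of "near-optimal learning rate and momentum factor"; all names and the toy data are illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical): each row of D is the feature difference
# x_i - x_j of a preference pair (item i preferred to item j);
# y holds the target margins.
n_pairs, d = 1000, 20
D = rng.standard_normal((n_pairs, d))
w_true = rng.standard_normal(d)
y = D @ w_true + 0.1 * rng.standard_normal(n_pairs)

# Pairwise least-squares ranking with Tikhonov regularization reduces
# to the linear system (D^T D + lam*I) w = D^T y.
lam = 1.0
A = D.T @ D + lam * np.eye(d)
b = D.T @ y

# Near-optimal step size and momentum (Polyak heavy ball) from the
# extreme eigenvalues of A; on large sparse problems these would be
# estimated rather than computed exactly.
eigs = np.linalg.eigvalsh(A)
mu, L = eigs[0], eigs[-1]
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

# Heavy-ball iteration: w_{k+1} = w_k - alpha*(A w_k - b) + beta*(w_k - w_prev)
w = np.zeros(d)
w_prev = w.copy()
for _ in range(500):
    grad = A @ w - b
    w, w_prev = w - alpha * grad + beta * (w - w_prev), w

print("residual norm:", np.linalg.norm(A @ w - b))

The regularizer lam trades fit against coefficient size; for the large, sparse regime the abstract targets, A would be kept implicit (matrix-vector products through D) rather than formed explicitly.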