Talks and poster presentations
- In November 2019, at the KAUST-Tsinghua-Industry workshop, I presented a poster based on our NeurIPS 2018 paper (poster here).
- In June 2019, at DS3, I presented our poster "A New Randomized Method for Solving Large Linear Systems".
- In June 2019, at LJK, I gave a talk on our new asynchronous delay-tolerant distributed algorithm.
- In February 2018, at the Optimization and Big Data workshop, our team presented a poster on randomized linear algebra (poster here).
Stochastic Spectral and Conjugate Descent Methods
The state-of-the-art methods for solving optimization problems in high dimensions are variants of randomized coordinate descent (RCD). In this paper we introduce a fundamentally new type of acceleration strategy for RCD based on the augmentation of the set of coordinate directions by a few spectral or conjugate directions. As we increase the number of extra directions to be sampled from, the rate of the method improves, interpolating between the linear rate of RCD and a linear rate independent of the condition number. We also develop and analyze inexact variants of these methods in which the spectral and conjugate directions are allowed to be only approximate. We motivate the above development by proving several negative results which highlight the limitations of RCD with importance sampling.
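To give a rough flavor of the idea, here is a minimal illustrative sketch (not the paper's exact algorithm or step sizes): exact-line-search descent on a quadratic, where the usual coordinate directions are augmented with a few eigenvector ("spectral") directions of the system matrix. The problem sizes and the choice of `k` extra directions are arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric positive definite system A x = b,
# i.e. minimize f(x) = 0.5 x^T A x - b^T x.
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)

# Direction set: the n coordinate vectors, augmented with a few
# eigenvectors of A ("spectral directions"; k is an illustrative choice).
k = 5
eigvals, eigvecs = np.linalg.eigh(A)
directions = np.vstack([np.eye(n), eigvecs[:, -k:].T])

x = np.zeros(n)
for _ in range(3000):
    d = directions[rng.integers(len(directions))]  # uniform sampling
    grad = A @ x - b                               # gradient of f at x
    x -= (d @ grad) / (d @ A @ d) * d              # exact line search along d

print(np.linalg.norm(A @ x - b))  # residual shrinks toward zero
```

Along an eigenvector direction the quadratic is minimized exactly in that eigenspace, which is why mixing in spectral directions can accelerate plain coordinate descent.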
Eduard Gorbunov, Elnur Gasanov, Peter Richtárik
Creation of approximating scalogram description in a problem of movement prediction [in Russian]
The paper addresses the problem of thumb movement prediction from electrocorticographic (ECoG) activity. The task is to predict thumb positions from voltage time series of cortical activity. Scalograms are used as input features for this regression problem; they are generated by spatio-spectro-temporal integration of the voltage time series across multiple cortical areas. To reduce the dimension of the feature space, local approximation is used: every scalogram is approximated by a parametric model. Predictions are obtained with partial least squares regression applied to the local approximation parameters. Local approximation of scalograms does not significantly lower prediction quality while efficiently reducing the dimension of the feature space.
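A minimal sketch of the dimension-reduction idea, under loud assumptions: the "scalograms" below are synthetic random surfaces (not real ECoG data), the parametric model is a low-order polynomial fit per scale, and ordinary least squares stands in for the paper's partial least squares step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "scalograms": one (scales x time) power map per trial,
# faked here as smooth surfaces plus noise (stand-in for ECoG features).
n_trials, n_scales, n_times = 40, 8, 64
t = np.linspace(0, 1, n_times)
scalograms = np.empty((n_trials, n_scales, n_times))
targets = np.empty(n_trials)  # stand-in for thumb position
for i in range(n_trials):
    amp = rng.standard_normal(n_scales)
    scalograms[i] = (amp[:, None] * np.sin(2 * np.pi * 3 * t) ** 2
                     + 0.05 * rng.standard_normal((n_scales, n_times)))
    targets[i] = amp.sum()  # target driven by the underlying amplitudes

# Local approximation: replace each scale's time course by the
# coefficients of a low-order polynomial fit (deg + 1 numbers per scale).
deg = 3
def approximate(s):
    return np.concatenate([np.polyfit(t, row, deg) for row in s])

X = np.stack([approximate(s) for s in scalograms])
print(scalograms[0].size, "->", X.shape[1])  # 512 raw values -> 32 features

# Regress the target on the compressed features (OLS here, as a simple
# stand-in for partial least squares).
D = np.c_[X, np.ones(n_trials)]
w, *_ = np.linalg.lstsq(D, targets, rcond=None)
pred = D @ w
print(np.corrcoef(pred, targets)[0, 1])  # in-sample fit quality
```

The point of the sketch is the feature count: each 512-value scalogram is summarized by 32 fit coefficients, yet the regression on the compressed features still tracks the target well.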
Elnur Gasanov, Anastasia Motrenko
Journal of Machine Learning and Data Analysis (in Russian)