Search

Your search for '"Loizou, Nicolas"' returned 36 results

Search Constraints

You searched for: Author: "Loizou, Nicolas"; Publication Type: Reports

Search Results

1. Multiplayer Federated Learning: Reaching Equilibrium with Less Communication

2. Stochastic Polyak Step-sizes and Momentum: Convergence Guarantees and Practical Performance

3. Dissipative Gradient Descent Ascent Method: A Control Theory Inspired Algorithm for Min-max Optimization

4. Stochastic Extragradient with Random Reshuffling: Improved Convergence for Variational Inequalities

5. Remove that Square Root: A New Efficient Scale-Invariant Version of AdaGrad

6. Locally Adaptive Federated Learning

7. Communication-Efficient Gradient Descent-Ascent Methods for Distributed Variational Inequalities: Unified Analysis and Local Updates

8. Single-Call Stochastic Extragradient Methods for Structured Non-monotone Variational Inequalities: Improved Analysis under Weaker Conditions

9. A Unified Approach to Reinforcement Learning, Quantal Response Equilibria, and Two-Player Zero-Sum Games

10. Dynamics of SGD with Stochastic Polyak Stepsizes: Truly Adaptive Variants and Convergence to Exact Solution

11. Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods

12. Stochastic Extragradient: General Analysis and Improved Rates

13. Stochastic Mirror Descent: Convergence Analysis and Adaptive Variants via the Mirror Stochastic Polyak Stepsize

14. Extragradient Method: $O(1/K)$ Last-Iterate Convergence for Monotone Variational Inequalities and Connections With Cocoercivity

15. On the Convergence of Stochastic Extragradient for Bilinear Games using Restarted Iteration Averaging

16. Stochastic Gradient Descent-Ascent and Consensus Optimization for Smooth Games: Convergence Analysis under Expected Co-coercivity

17. AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods

18. Stochastic Hamiltonian Gradient Methods for Smooth Games

19. Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization

20. SGD for Structured Nonconvex Functions: Learning Rates, Minibatching and Interpolation

21. A Unified Theory of Decentralized SGD with Changing Topology and Local Updates

22. Stochastic Polyak Step-size for SGD: An Adaptive Learning Rate for Fast Convergence

23. Randomized Iterative Methods for Linear Systems: Momentum, Inexactness and Gossip

24. Revisiting Randomized Gossip Algorithms: General Framework, Convergence Rates and Novel Block and Accelerated Protocols

25. Convergence Analysis of Inexact Randomized Iterative Methods

26. A Privacy Preserving Randomized Gossip Algorithm via Controlled Noise Insertion

27. SGD: General Analysis and Improved Rates

28. Stochastic Gradient Push for Distributed Deep Learning

29. Provably Accelerated Randomized Gossip Algorithms

30. Accelerated Gossip via Stochastic Heavy Ball Method

31. Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods

32. Linearly convergent stochastic heavy ball method for minimizing generalization error

33. Privacy Preserving Randomized Gossip Algorithms

34. A New Perspective on Randomized Gossip Algorithms

35. Distributionally Robust Games with Risk-averse Players

36. Distributionally Robust Game Theory
