Search

Your search for "Horváth, Samuel" returned 101 results.

Search Results

1. Collaborative and Efficient Personalization with Mixtures of Adaptors

2. FedPeWS: Personalized Warmup via Subnetworks for Enhanced Heterogeneous Federated Learning

3. Methods for Convex $(L_0,L_1)$-Smooth Optimization: Clipping, Acceleration, and Adaptivity

4. Low-Resource Machine Translation through the Lens of Personalized Federated Learning

5. Decentralized Personalized Federated Learning

6. Gradient Clipping Improves AdaGrad when the Noise Is Heavy-Tailed

7. Redefining Contributions: Shapley-Driven Federated Learning

8. Enhancing Policy Gradient with the Polyak Step-Size Adaption

9. Generalized Policy Learning for Smart Grids: FL TRPO Approach

10. Remove that Square Root: A New Efficient Scale-Invariant Version of AdaGrad

11. Flashback: Understanding and Mitigating Forgetting in Federated Learning

12. Federated Learning Can Find Friends That Are Advantageous

13. Rethinking Model Re-Basin and Linear Mode Connectivity

14. Efficient Conformal Prediction under Data Heterogeneity

15. Dirichlet-based Uncertainty Quantification for Personalized Federated Learning with Improved Posterior Networks

16. Byzantine Robustness and Partial Participation Can Be Achieved at Once: Just Clip Gradient Differences

17. Byzantine-Tolerant Methods for Distributed Variational Inequalities

18. Accelerated Zeroth-order Method for Non-Smooth Stochastic Convex Optimization Problem with Infinite Variance

19. Handling Data Heterogeneity via Architectural Design for Federated Visual Recognition

20. High-Probability Convergence for Composite and Distributed Stochastic Minimization and Variational Inequalities with Heavy-Tailed Noise

21. Maestro: Uncovering Low-Rank Structures via Trainable Decomposition

22. Clip21: Error Feedback for Gradient Clipping

23. Global-QSGD: Practical Floatless Quantization for Distributed Learning with Theoretical Guarantees

24. Partially Personalized Federated Learning: Breaking the Curse of Data Heterogeneity

25. Balancing Privacy and Performance for Private Federated Learning Algorithms

26. Federated Learning with Regularized Client Participation

27. High-Probability Bounds for Stochastic Optimization and Variational Inequalities: the Case of Unbounded Variance

28. PaDPaF: Partial Disentanglement with Partially-Federated GANs

29. Convergence of Proximal Point and Extragradient-Based Methods Beyond Monotonicity: the Case of Negative Comonotonicity

30. Adaptive Learning Rates for Faster Stochastic Gradient Methods

31. Granger Causality using Neural Networks

32. Better Methods and Theory for Federated Learning: Compression, Client Selection and Heterogeneity

33. Variance Reduction is an Antidote to Byzantines: Better Rates, Weaker Assumptions and Communication Compression as a Cherry on the Top

34. FedShuffle: Recipes for Better Use of Local Work in Federated Learning

35. FL_PyTorch: optimization research simulator for federated learning

36. FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning

37. A Field Guide to Federated Optimization

38. FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout

39. Hyperparameter Transfer Learning with Adaptive Complexity

40. Optimal Client Sampling for Federated Learning

41. Lower Bounds and Optimal Algorithms for Personalized Federated Learning

42. A Better Alternative to Error Feedback for Communication-Efficient Distributed Learning

43. On Biased Compression for Distributed Learning

44. Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization

45. Natural Compression for Distributed Deep Learning

46. Stochastic Distributed Learning with Gradient Quantization and Variance Reduction

47. Don't Jump Through Hoops and Remove Those Loops: SVRG and Katyusha are Better Without the Outer Loop

48. Nonconvex Variance Reduced Optimization with Arbitrary Sampling
