Chen-Yu Ho
ByteDance Inc.
Verified email at bytedance.com
Title · Cited by · Year
Scaling Distributed Machine Learning with In-Network Aggregation
A Sapio, M Canini, CY Ho, J Nelson, P Kalnis, C Kim, A Krishnamurthy, ...
Proceedings of the 18th USENIX Symposium on Networked Systems Design and …, 2021
Cited by 475 · 2021
GRACE: A compressed communication framework for distributed machine learning
H Xu, CY Ho, AM Abdelmoniem, A Dutta, EH Bergou, K Karatsenidis, ...
2021 IEEE 41st International Conference on Distributed Computing Systems …, 2021
Cited by 177* · 2021
Natural compression for distributed deep learning
S Horváth, CY Ho, L Horvath, AN Sahu, M Canini, P Richtárik
Mathematical and Scientific Machine Learning, 129-141, 2022
Cited by 173 · 2022
On the discrepancy between the theoretical analysis and practical implementations of compressed communication for distributed deep learning
A Dutta, EH Bergou, AM Abdelmoniem, CY Ho, AN Sahu, M Canini, ...
Proceedings of the AAAI Conference on Artificial Intelligence 34 (04), 3817-3824, 2020
Cited by 99 · 2020
Efficient sparse collective communication and its application to accelerate distributed deep learning
J Fei, CY Ho, AN Sahu, M Canini, A Sapio
Proceedings of the 2021 ACM SIGCOMM 2021 Conference, 676-691, 2021
Cited by 95 · 2021
A Comprehensive Empirical Study of Heterogeneity in Federated Learning
AM Abdelmoniem, CY Ho, P Papageorgiou, M Canini
IEEE Internet of Things Journal, 2023
Cited by 83* · 2023
Tackling the Communication Bottlenecks of Distributed Deep Learning Training Workloads
CY Ho
PhD thesis, 2023