Tiansheng Huang
  • Email: tianshenghuangscut at gmail.com

Biography

I am a second-year CS PhD student at the Georgia Institute of Technology, Atlanta, USA, advised by Prof. Ling Liu. I received my B.E. and master's degrees from South China University of Technology, Guangzhou, China, advised by Prof. Weiwei Lin. My research interests include distributed machine learning, parallel and distributed computing, optimization algorithms, and machine learning security.

Publications

Conference

  • T. Huang, S. Hu, KH. Chow, F. Ilhan, S. Tekin, L. Liu, “Lockdown: Backdoor Defense for Federated Learning with Isolated Subspace Training,” NeurIPS 2023. [Paper] [Video] [Code]
  • F. Ilhan, S. Tekin, S. Hu, T. Huang, KH. Chow, L. Liu, “Hierarchical Deep Neural Network Inference for Device-Edge-Cloud Systems,” WWW 2023. [Paper]
  • Y. Sun, L. Shen, T. Huang, and D. Tao, “FedSpeed: Larger Local Interval, Less Communication Round, and Higher Generalization Accuracy,” ICLR 2023. [OpenReview]
  • S. Hu, T. Huang, KH. Chow, W. Wei, Y. Wu, L. Liu, “ZipZap: Efficient Training of Language Models for Ethereum Fraud Detection,” to appear, WWW 2024.

Journal

  • T. Huang, L. Shen, Y. Sun, W. Lin, and D. Tao, “Fusion of Global and Local Knowledge for Personalized Federated Learning,” accepted, Transactions on Machine Learning Research (TMLR). [OpenReview]
  • T. Huang, W. Lin, W. Wu, L. He, K. Li, and A. Y. Zomaya, “An Efficiency-boosting Client Selection Scheme for Federated Learning with Fairness Guarantee,” 2020, IEEE Transactions on Parallel and Distributed Systems (TPDS) (Special Section on Parallel and Distributed Computing Techniques for AI, ML, and DL). [arXiv]
  • T. Huang, W. Lin, L. Shen, K. Li, and A. Y. Zomaya, “Stochastic Client Selection for Federated Learning with Volatile Clients,” 2022, IEEE Internet of Things Journal. [arXiv]
  • T. Huang, W. Lin, X. Hong, X. Wang, Q. Wu, R. Li, CH. Hsu, A. Y. Zomaya, “Adaptive Processor Frequency Adjustment for Mobile Edge Computing with Intermittent Energy Supply,” 2021, IEEE Internet of Things Journal. [arXiv] [code]
  • T. Huang, W. Lin, C. Xiong, R. Pan and J. Huang, “An Ant Colony Optimization Based Multi-objective Service Replicas Placement Strategy for Fog Computing,” 2020, IEEE Transactions on Cybernetics.

Preprints & OpenReview

  • T. Huang, S. Liu, L. Shen, F. He, W. Lin, D. Tao, “Achieving Personalized Federated Learning with Sparse Local Models,” preprint. [arXiv]
  • T. Huang, S. Hu, L. Liu, “Vaccine: Perturbation-aware Alignment for Large Language Model,” preprint. [arXiv]

Industrial Experience

Research intern at JD Explore Academy, Beijing, China, with Dr. Li Shen. (Jun. 2021 ~ Oct. 2021)

  • Development of high-efficiency model compression algorithms for distributed ML.
  • Optimization for personalized federated learning.

Research intern at JD Explore Academy, Beijing, China, with Dr. Li Shen. (Mar. 2022 ~ Jun. 2022)

  • Low-rank + sparse compression for personalized federated learning.
  • Application of proximal algorithms.

Awards & Honors

  • National Scholarship for graduate students, 2021
  • National Scholarship for graduate students, 2020
  • The First-Class School Scholarship, 2019

Services

  • Reviewer (NeurIPS 2023, ICLR 2024, ICML 2024, IEEE TMC, IEEE TCOM, ACM TOIT)