I am an ECE Ph.D. student at Purdue University, advised by Professor Christopher Brinton. My research spans machine learning, deep learning, federated learning, network systems, fog learning, and large language models (LLMs). With six years of experience in neural networks, I aim to bridge the gap between theory and real-world applications, driving both academic advances and industrial impact.
My work focuses on designing efficient, scalable distributed AI systems, tackling challenges in resource-constrained environments, multi-tier communication networks, and communication-efficient model training. I am particularly interested in the intersection of federated learning and large-scale AI, with applications in edge computing, distributed intelligence, and privacy-preserving AI systems.
I have contributed to both fundamental research and practical deployments of scalable AI. My long-term goal is to develop innovative AI-driven solutions that advance the field while addressing real-world challenges.
[Sep 2025] Our paper "Towards Straggler-Resilient Split Federated Learning: An Unbalanced Update Approach" was accepted to NeurIPS 2025!
[May 2025] I will begin an internship as a Controls Research Engineer at Cummins Inc. in Summer 2025.
[Jan 2025] Our paper "Differentially-Private Multi-Tier Federated Learning" was accepted to ICC 2025!
[Sep 2024] Our paper "Hierarchical Federated Learning with Multi-Timescale Gradient Correction" was accepted to NeurIPS 2024!
See all Publications
Performing research on distributed optimization, decentralized machine learning, large language models, and fog communication networks.
Building real-world federated learning applications using NVIDIA FLARE and Meta ExecuTorch.
Conducting research on transformer models and video compression.
Conducting research on meta-learning, video compression, transformer models, and computer vision.
Advised by Professor Christopher Brinton
Advised by Professor Chen-Yi Lee