Evan Chen (Po-Yu Chen)

Elmore Family School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN · chen4388@purdue.edu · CV

I am an ECE PhD student at Purdue University, advised by Professor Christopher Brinton, specializing in machine learning, deep learning, federated learning, network systems, fog learning, and large language models (LLMs). With six years of experience in neural networks, I aim to bridge the gap between theory and real-world applications, driving both academic advances and industrial impact.

My work mainly focuses on designing efficient and scalable distributed AI systems, tackling challenges in resource-constrained environments, multi-tier communication networks, and communication-efficient model training. I am particularly interested in the intersection of federated learning and large-scale AI, with applications in edge computing, large-scale distributed intelligence, and privacy-preserving AI systems.

I have contributed to both fundamental research and practical deployments, pushing the boundaries of scalable AI. My long-term goal is to develop innovative AI-driven solutions that advance the field while addressing real-world challenges.

News

[Sep 2025] Our paper Towards Straggler-Resilient Split Federated Learning: An Unbalanced Update Approach has been accepted to NeurIPS 2025!
[May 2025] I will begin my internship as a Controls Research Engineer at Cummins Inc. in Summer 2025.
[Jan 2025] Our paper Differentially-Private Multi-Tier Federated Learning has been accepted to ICC 2025!
[Sep 2024] Our paper Hierarchical Federated Learning with Multi-Timescale Gradient Correction has been accepted to NeurIPS 2024!

Interests

  • RL-based Post-Training Strategies
  • On-device AI/device-cloud collaborative AI
  • Federated Learning
  • Distributed Optimization
  • Differential Privacy

NeurIPS 2025 - Towards Straggler-Resilient Split Federated Learning: An Unbalanced Update Approach

We propose MU-SplitFed, a straggler-resilient Split Federated Learning (SFL) algorithm that addresses a critical bottleneck: client-server synchronization delays caused by SFL’s dependency on client-side activations. MU-SplitFed introduces a simple yet effective unbalanced update mechanism that enables per-client local updates on the server, achieving a linear reduction in communication rounds, and it demonstrates superior performance over baselines under straggler conditions through adaptive tuning.

September 2025
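For intuition, the toy NumPy sketch below illustrates the unbalanced-update idea on a made-up two-layer linear model that I split between client and server: the server repeats its sub-model update U times for each activation upload, which is why communication rounds shrink. It is a simplified sketch under my own assumptions, not the full MU-SplitFed procedure.

```python
import numpy as np

# Illustrative split-learning round with unbalanced server-side updates.
# Hypothetical setup: a two-layer linear model split between client and server,
# squared loss, and U server updates per single upload of cut-layer activations.
rng = np.random.default_rng(0)
x, y = rng.normal(size=(32, 10)), rng.normal(size=(32, 1))  # one client's batch
W_c = rng.normal(scale=0.1, size=(10, 8))                   # client-side layer
W_s = rng.normal(scale=0.1, size=(8, 1))                    # server-side layer
lr, U = 0.05, 4                                             # U > 1 -> "unbalanced"

for rnd in range(20):                      # communication rounds
    h = x @ W_c                            # client uploads cut-layer activations once
    for _ in range(U):                     # server refines its sub-model U times
        err = h @ W_s - y
        W_s -= lr * (h.T @ err) / len(x)
    g_h = (h @ W_s - y) @ W_s.T            # gradient returned through the cut layer
    W_c -= lr * (x.T @ g_h) / len(x)       # single client-side update per round
    loss = float(np.mean((x @ W_c @ W_s - y) ** 2))
print(f"final loss: {loss:.4f}")
```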

IEEE Network - Federated Foundation Models in Harsh Wireless Environments: Prospects, Challenges, and Future Directions

We investigate Federated Foundation Models to bring reliable, adaptive intelligence to harsh environments, where connectivity is intermittent, devices are resource-limited, and data is noisy, by combining FM generalization with decentralized, communication-aware FL. Our work maps key system constraints and outlines research challenges in communication design, robustness, and energy-efficient personalization.

September 2025

ICC 2025 - Differentially-Private Multi-Tier Federated Learning

A privacy-enhanced FL framework that optimally injects differential privacy (DP) noise at different hierarchical layers of fog nodes to protect against untrusted models within subnetworks. Our method achieves strong privacy guarantees while maintaining superior model performance and convergence efficiency compared to baseline methods.

January 2025
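As a rough illustration of where noise can enter a multi-tier system, the sketch below clips each client update and adds Gaussian DP noise before it reaches an untrusted subnetwork aggregator, then averages up the hierarchy. The clipping threshold and noise scale are placeholder values chosen for the example, not the calibrated settings from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's exact mechanism): Gaussian noise is
# injected at the tier whose aggregator is untrusted; the upper tier then
# averages the already-noised results.
rng = np.random.default_rng(1)

def clip_and_noise(update, clip=1.0, sigma=0.5):
    """Clip an update to L2 norm `clip` and add Gaussian noise (placeholder values)."""
    update = update * min(1.0, clip / (np.linalg.norm(update) + 1e-12))
    return update + rng.normal(scale=sigma * clip, size=update.shape)

# Two subnetworks (edge tiers) with a few clients each; 5-dim toy model updates.
subnets = [rng.normal(size=(3, 5)), rng.normal(size=(4, 5))]
edge_aggregates = [
    np.mean([clip_and_noise(u) for u in clients], axis=0)  # noise added at the edge tier
    for clients in subnets
]
global_update = np.mean(edge_aggregates, axis=0)            # top-tier (cloud) aggregation
print(global_update)
```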

NeurIPS 2024 - Hierarchical Federated Learning with Multi-Timescale Gradient Correction

Hierarchical Federated Learning (HFL) faces challenges from multi-timescale model drift due to data heterogeneity across hierarchical levels. To address this, we propose Multi-Timescale Gradient Correction (MTGC), which achieves stable convergence independent of data heterogeneity and demonstrates superior performance in diverse HFL settings.

September 2024
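To give a flavor of the gradient-correction idea, the toy sketch below adds two correction terms, one tracking client-to-group drift and one tracking group-to-global drift, inside a client's local step. The objective and the correction values are made up for illustration and heavily simplified relative to MTGC itself.

```python
import numpy as np

# Toy illustration (a simplification, not the exact MTGC update rule): each
# local step is corrected by two terms estimating drift at different
# timescales, client vs. its group and group vs. the global model. In the full
# method these terms are refreshed at the group and global aggregation
# frequencies; here they are held fixed for brevity.
rng = np.random.default_rng(2)
dim = 5
w = rng.normal(size=dim)                          # client's local model copy
c_client_group = rng.normal(scale=0.1, size=dim)  # fast-timescale correction
c_group_global = rng.normal(scale=0.1, size=dim)  # slow-timescale correction
lr = 0.1

def local_gradient(w):
    """Hypothetical heterogeneous local objective: a shifted quadratic."""
    return w - np.ones(dim)

for _ in range(10):                               # local steps between aggregations
    g = local_gradient(w)
    w -= lr * (g + c_client_group + c_group_global)  # corrected descent direction
print(w)
```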

INFOCOM 2024 - Taming Subnet-Drift in D2D-Enabled Fog Learning: A Hierarchical Gradient Tracking Approach

The first Semi-Decentralized Federated Learning (SD-FL) framework that eliminates the need for data heterogeneity assumptions by incorporating tracking terms into device updates. Our method achieves significant improvements in model quality and communication efficiency over existing SD-FL methods.

May 2024

WACV 2023 - Cross-Resolution Flow Propagation for Foveated Video Super-Resolution

A framework that combines super-resolution techniques with foveated rendering to enable efficient low-bandwidth video streaming over low-power protocols like Bluetooth. By leveraging deformable convolutional networks (DCNs), our model achieves state-of-the-art video quality with fast, low-energy performance, making it ideal for VR/AR applications.

January 2023

ECCV 2020 - Meta-rPPG: Remote Heart Rate Estimation Using a Transductive Meta-Learner

A meta-learning approach for remote heart rate estimation from remote photoplethysmography (rPPG) signals, enabling self-supervised weight adjustment during deployment to adapt to distributional shifts. This method improves model robustness to variations in skin tone, lighting, and facial structure.

August 2020

There's more!

See all Publications


Experience

Graduate Researcher (PhD)

Performing research related to distributed optimization, decentralized machine learning, large language models, and fog communication networks.

August 2022 - Present

Controls Research Engineer

Building real-world federated learning applications using NVIDIA FLARE and Meta ExecuTorch.

May 2025 - Aug 2025

Research Assistant

Conducting research on transformer models and video compression.

July 2021 - December 2021

Undergraduate Researcher

Conducting research on meta-learning, video compression, transformer models, and computer vision.

February 2019 - June 2021

Academic Services

Reviewer

Conference on Neural Information Processing Systems (NeurIPS)

International Conference on Machine Learning (ICML)

IEEE International Conference on Computer Communications (INFOCOM)

IEEE Transactions on Mobile Computing (TMC)

IEEE/ACM Transactions on Networking (ToN)

Presenter

“Taming Subnet-Drift in D2D-Enabled Fog Learning: A Hierarchical Gradient Tracking Approach”, May 23, 2024, INFOCOM, Vancouver, Canada.

Education

Purdue University [West Lafayette, IN, USA]

Doctor of Philosophy
Elmore Family School of Electrical and Computer Engineering

Advised by Professor Christopher Brinton

2022-Current

National Chiao Tung University (NCTU) [Hsinchu, Taiwan]

Bachelor of Science
Electronics Engineering

Advised by Professor Chen-Yi Lee

2017-2021