Publications

arXiv - Parameter Tracking in Federated Learning with Adaptive Optimization

We introduce Parameter Tracking (PT), a generalization of Gradient Tracking (GT) for Federated Learning (FL), and propose two novel Adam-based FL algorithms, FAdamET and FAdamGT, that integrate PT for improved adaptive optimization. Theoretical analysis and experiments demonstrate that these methods enhance convergence efficiency, reducing both communication and computation costs across varying data heterogeneity levels.

January 2025
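A minimal sketch of the underlying idea, assuming a SCAFFOLD-style correction fed into a standard Adam step (the correction rule, variable names, and hyperparameters here are illustrative, not the exact FAdamET/FAdamGT updates from the paper):

```python
import numpy as np

def tracked_adam_step(w, grad, c_local, c_global, m, v, t,
                      lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One client-side Adam step on a tracking-corrected gradient (illustrative sketch)."""
    # Parameter/gradient tracking: compensate client drift before the adaptive update.
    g = grad + c_global - c_local
    # Standard Adam moment estimates computed on the corrected gradient.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

The tracking terms c_local and c_global would be refreshed at each communication round with the server.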

ICC2025 - Differentially-Private Multi-Tier Federated Learning

A privacy-enhanced FL framework that optimally injects differential privacy (DP) noise at different hierarchical layers of fog nodes, protecting model updates that pass through untrusted nodes within subnetworks. Our method achieves strong privacy guarantees while maintaining superior model performance and convergence efficiency compared to baseline methods.

January 2025
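The basic building block is the Gaussian mechanism applied where updates cross an untrusted tier; a minimal sketch follows (the clip-and-noise routine is standard, but the per-tier noise multipliers shown are purely illustrative, not the paper's optimized allocation):

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip an update to an L2 norm bound and add Gaussian noise (Gaussian mechanism)."""
    rng = rng or np.random.default_rng()
    scale = min(1.0, clip_norm / (np.linalg.norm(update) + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return update * scale + noise

# Hypothetical per-tier noise multipliers: less trusted tiers get more noise.
tier_noise = {"trusted_edge": 0.4, "untrusted_fog": 1.2}
noisy = privatize_update(np.random.randn(1000), noise_multiplier=tier_noise["untrusted_fog"])
```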

(Submitted) ToN Special Issue - Differentially-Private Multi-Tier Federated Learning: A Formal Analysis and Evaluation

A DP-secured FL framework that adapts noise injection across multiple network tiers based on heterogeneous trust models. Our method jointly optimizes privacy and performance, achieving significant improvements in convergence, energy, latency, and model quality across various privacy budgets and system configurations.

January 2025

NeurIPS2024 - Hierarchical Federated Learning with Multi-Timescale Gradient Correction

Hierarchical Federated Learning (HFL) faces challenges from multi-timescale model drift caused by data heterogeneity across hierarchical levels. To address this, we propose Multi-Timescale Gradient Correction (MTGC), which achieves stable convergence independent of data heterogeneity and demonstrates superior performance in diverse HFL settings.

September 2024
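A minimal sketch of what a multi-timescale correction can look like in a client's local step, assuming one term compensating client-to-group drift and one compensating group-to-global drift (the names and the recursion for updating these terms are illustrative, not MTGC's precise rules):

```python
def corrected_local_step(w, grad, c_client_group, c_group_global, lr=0.01):
    """One local SGD step with two drift-correction terms (illustrative sketch).

    c_client_group would be refreshed on the faster (intra-group) timescale,
    c_group_global on the slower (global aggregation) timescale.
    """
    g = grad - c_client_group - c_group_global
    return w - lr * g
```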

(Submitted) ToN - A Hierarchical Gradient Tracking Algorithm for Mitigating Subnet-Drift in Fog Learning Networks

We propose Semi-Decentralized Gradient Tracking (SD-GT) to enhance federated learning over fog networks by eliminating gradient diversity assumptions, enabling efficient hierarchical model aggregation and achieving superior model quality with reduced communication costs.

September 2024
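The method builds on the classic gradient-tracking recursion run over D2D links inside each subnet; a generic one-round sketch is below (the mixing matrix and learning rate are illustrative, and the full SD-GT hierarchy adds server-level aggregation on top):

```python
import numpy as np

def subnet_tracking_round(X, Y, grads_new, grads_old, W, lr=0.05):
    """One D2D gradient-tracking round within a subnet (generic form, not full SD-GT).

    X: (n_devices, dim) local models
    Y: (n_devices, dim) trackers estimating the subnet-average gradient
    W: (n_devices, n_devices) doubly stochastic mixing matrix over D2D links
    """
    X_next = W @ X - lr * Y                 # consensus mixing plus descent along the tracker
    Y_next = W @ Y + grads_new - grads_old  # tracker keeps estimating the average gradient
    return X_next, Y_next
```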

INFOCOM2024 - Taming Subnet-Drift in D2D-Enabled Fog Learning: A Hierarchical Gradient Tracking Approach

The first Semi-Decentralized Federated Learning (SD-FL) framework that eliminates the need for data heterogeneity assumptions by incorporating tracking terms into device updates. Our method achieves significant improvements in model quality and communication efficiency over existing SD-FL methods.

May 2024

WACV2023 - Cross-Resolution Flow Propagation for Foveated Video Super-Resolution

A framework that combines super-resolution techniques with foveated rendering to enable efficient low-bandwidth video streaming over low-power protocols like Bluetooth. By leveraging deformable convolutional networks (DCN), our model achieves state-of-the-art video quality with fast, low-energy performance, making it ideal for VR/AR applications.

January 2023
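The core alignment operation is a deformable convolution whose sampling offsets are predicted from the reference and neighbor frame features; a minimal PyTorch sketch is below (layer sizes and module names are illustrative, not the paper's full architecture):

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class AlignBlock(nn.Module):
    """Aligns neighbor-frame features to the reference frame via a DCN (illustrative sizes)."""
    def __init__(self, channels=64):
        super().__init__()
        # Predict an (x, y) offset per spatial position for each of the 3x3 kernel taps.
        self.offset_conv = nn.Conv2d(2 * channels, 2 * 3 * 3, kernel_size=3, padding=1)
        self.dcn = DeformConv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, ref_feat, nbr_feat):
        offsets = self.offset_conv(torch.cat([ref_feat, nbr_feat], dim=1))
        return self.dcn(nbr_feat, offsets)

ref, nbr = torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)
aligned = AlignBlock()(ref, nbr)  # -> (1, 64, 32, 32)
```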

ECCV2020 - Meta-rPPG: Remote Heart Rate Estimation Using a Transductive Meta-Learner

A meta-learning approach for remote heart rate estimation using rPPG, enabling self-supervised weight adjustment during deployment to adapt to distributional shifts. This method improves model robustness to variations in skin tone, lighting, and facial structure.

August 2020
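The deployment-time adaptation can be pictured as a short self-supervised fine-tuning loop on unlabeled clips before prediction; the sketch below is a generic version of that idea (the encoder, estimator, and loss names are placeholders, and the paper's actual transductive meta-learning procedure differs in its details):

```python
import torch

def adapt_then_predict(encoder, estimator, clips, self_sup_loss, steps=5, lr=1e-4):
    """Update the encoder on unlabeled deployment clips with a self-supervised
    loss, then predict the rPPG signal (generic test-time adaptation sketch)."""
    opt = torch.optim.SGD(encoder.parameters(), lr=lr)
    for _ in range(steps):
        for clip in clips:              # unlabeled face-video clips from deployment
            loss = self_sup_loss(encoder(clip))
            opt.zero_grad()
            loss.backward()
            opt.step()
    return estimator(encoder(clips[0]))
```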