Published Date: 6/9/2025
Can smart devices collaborate to train artificial intelligence (AI) models when they experience poor internet connections? Yes, and Xiaowen Gong, the Godbold Associate Professor in electrical and computer engineering, can prove it.
Gong's recently completed National Science Foundation-funded research, “Quality-Aware Distributed Computation for Wireless Federated Learning: Channel-Aware User Selection, Mini-Batch Size Adaptation, and Scheduling,” demonstrates how smart devices can collaborate to build better AI models regardless of connection quality, turning network limitations from a barrier into a manageable constraint. First funded in 2021, the work paves the way for smarter, faster, and safer technologies, powering innovations that could make robots more capable, augmented reality/virtual reality experiences more immersive, vehicles more autonomous, and wireless systems more intelligent.
“Our algorithms enable federated learning in wireless networked systems where devices often have unreliable, time-varying, and heterogeneous communication and computation capabilities,” Gong said. “Our research improves learning accuracy and accelerates the training process, all while enabling devices to participate with greater flexibility.”
Federated learning allows multiple devices — like smartphones, tablets, or sensors — to collaboratively train an AI model without sharing their raw data. Instead of sending sensitive information to a central server, devices process data locally and share only the learning updates. This approach protects privacy while enabling AI systems to learn from diverse data sources.
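The round-trip described above (local training, then sharing only model updates with a server that averages them) can be sketched in a few lines. This is a minimal, illustrative simulation of the general federated-averaging idea, not Gong's algorithm; the one-parameter linear model and the `local_update`/`federated_round` names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, local_data, lr=0.1):
    """One round of local training on a single device.

    Each device fits a 1-D linear model y = w*x with one gradient
    step on its own data; only the updated weight (never the raw
    data) leaves the device.
    """
    x, y = local_data
    grad = np.mean((weights * x - y) * x)  # mean-squared-error gradient
    return weights - lr * grad

def federated_round(global_w, devices):
    """Server broadcasts the global weight, devices train locally,
    and the server averages the returned updates."""
    updates = [local_update(global_w, d) for d in devices]
    return float(np.mean(updates))

# Five devices, each holding private data generated around a true
# weight of 2.0 (plus a little noise).
devices = []
for _ in range(5):
    x = rng.normal(size=20)
    y = 2.0 * x + 0.05 * rng.normal(size=20)
    devices.append((x, y))

w = 0.0
for _ in range(50):
    w = federated_round(w, devices)
print(round(w, 2))  # converges near the true weight 2.0
```

The key property is visible in `federated_round`: the server only ever sees the per-device weight updates, so the raw `(x, y)` data never leaves each device.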
“AI isn't just something that lives in massive data centers anymore,” Gong said. “It's happening on the devices we use every day, like phones, automobiles, and smart home systems. Our work helps these devices learn together, even when their internet connections are not perfect. That means smarter predictions, faster responses, and better performance in real-world conditions.”
Existing federated learning methods often do not perform well when devices have unreliable connections or different computational capabilities, leading to slower training and less accurate models. Gong's research tackles this problem through a method described as quality-aware distributed computation. The new algorithms intelligently select which devices participate in each training round and adjust how much work each device does based on its connection quality and computational power.
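As a rough illustration of the two levers described above, a server might rank devices by reported channel quality when choosing participants for a round, and scale each selected device's mini-batch to its relative compute speed. This sketch is hypothetical: the `channel_quality` and `compute_speed` fields, the top-k selection rule, and the linear batch-size scaling are stand-in assumptions, not the selection and adaptation policies from the paper.

```python
import random

random.seed(42)

def select_devices(devices, k):
    """Pick the k devices reporting the best channel quality."""
    ranked = sorted(devices, key=lambda d: d["channel_quality"], reverse=True)
    return ranked[:k]

def assign_batch_size(device, base_batch=64):
    """Scale the mini-batch to the device's speed relative to a
    baseline device (speed 1.0 gets the base batch size)."""
    return max(1, int(base_batch * device["compute_speed"]))

# Hypothetical device pool: channel quality in [0, 1], compute speed
# between a quarter and twice the baseline.
pool = [
    {"id": i,
     "channel_quality": random.random(),
     "compute_speed": random.uniform(0.25, 2.0)}
    for i in range(10)
]

chosen = select_devices(pool, k=4)
plan = {d["id"]: assign_batch_size(d) for d in chosen}
print(plan)  # chosen device ids mapped to their assigned batch sizes
```

Slow or poorly connected devices are either skipped for the round or given less work, so a single straggler no longer stalls the whole training round.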
“Our methods not only improve the learning accuracy of federated learning but also accelerate the training process, while allowing devices to participate with greater flexibility, even if some devices drop in and out,” he said.
“Imagine your smart assistant learning new things 30% faster, or your car reacting more quickly to changing traffic. That's the kind of improvement we're seeing. This isn't just about speed. It's about making AI more responsive and reliable in everyday life.”
Q: What is federated learning?
A: Federated learning is a method that allows multiple devices to collaboratively train an AI model without sharing their raw data. Devices process data locally and share only the learning updates, protecting privacy and enabling AI systems to learn from diverse data sources.
Q: How does quality-aware distributed computation improve federated learning?
A: Quality-aware distributed computation intelligently selects which devices participate in each training round and adjusts the workload based on each device's connection quality and computational power. This improves learning accuracy and accelerates the training process.
Q: What are the potential applications of Gong's research?
A: Gong's research can enhance various technologies, making robots more capable, augmented reality/virtual reality experiences more immersive, vehicles more autonomous, and wireless systems more intelligent.
Q: Why is this research important for smart devices with poor internet connections?
A: This research turns network limitations into manageable constraints, allowing smart devices to collaborate effectively in training AI models even when their internet connections are unreliable.
Q: What are the benefits of improved federated learning methods?
A: Improved federated learning methods lead to faster learning, more accurate models, and better performance in real-world conditions, making AI more responsive and reliable.