Abstract

In the ambitious realm of space AI, the integration of Federated Learning (FL) with Low Earth Orbit (LEO) satellite constellations holds immense promise. However, many challenges persist in terms of feasibility, learning efficiency, and convergence. These hurdles stem from the communication bottleneck, characterized by sporadic and irregular connectivity between LEO satellites and ground stations, coupled with the limited computation capability of satellite edge computing (SEC). This paper proposes a novel FL-SEC framework that empowers LEO satellites to execute large-scale machine learning (ML) tasks onboard efficiently. Its key components include i) personalized learning via divide-and-conquer, which identifies and eliminates redundant satellite images and converts complex multi-class classification problems into simple binary classification, enabling rapid and energy-efficient training of lightweight ML models suitable for IoT/edge devices on satellites; and ii) orbital model retraining, which generates an aggregated "orbital model" per orbit and retrains it before sending it to the ground station, significantly reducing the number of required communication rounds. We conducted experiments using Jetson Nano, an edge device closely mimicking the limited compute on LEO satellites, and a real satellite dataset. The results underscore the effectiveness of our approach, highlighting SEC's ability to run lightweight ML models on real and high-resolution satellite imagery. Our approach dramatically reduces FL convergence time by nearly 30 times and cuts satellite energy consumption to as low as 1.38 watts, all while maintaining an accuracy of up to 96%.
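
The sketch below is a minimal, illustrative reading of the two components described in the abstract, assuming PyTorch and a one-vs-rest interpretation of the "multi-class to binary" conversion. The redundancy filter, the TinyBinaryClassifier architecture, the class indices, and every hyperparameter are hypothetical placeholders, and the per-orbit aggregation is generic federated averaging rather than the authors' specific rule.

```python
# Hypothetical sketch of the FL-SEC ideas summarized in the abstract:
#   (i)  divide-and-conquer: drop near-duplicate images, then train a
#        lightweight one-vs-rest binary model instead of a multi-class one
#   (ii) orbital model retraining: average the satellite models of one orbit,
#        briefly retrain the aggregate, then send it to the ground station
# All names, shapes, and hyperparameters are illustrative only.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyBinaryClassifier(nn.Module):
    """Lightweight CNN answering 'is this image of class c or not?'."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, stride=2, padding=1)
        self.head = nn.Linear(8 * 16 * 16, 1)   # assumes 32x32 RGB inputs

    def forward(self, x):
        x = F.relu(self.conv(x))
        return self.head(x.flatten(1))          # single binary logit


def drop_redundant(images, threshold=0.99):
    """Crude redundancy filter: keep indices of images that are not nearly
    identical (cosine similarity >= threshold) to an already-kept image."""
    kept_idx, kept_flat = [], []
    for i, img in enumerate(images):
        flat = img.flatten()
        if all(F.cosine_similarity(flat, k, dim=0) < threshold for k in kept_flat):
            kept_idx.append(i)
            kept_flat.append(flat)
    return kept_idx


def local_binary_training(model, images, labels, target_class, epochs=1):
    """One satellite trains its binary model: target_class vs. everything else."""
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    binary_labels = (labels == target_class).float().unsqueeze(1)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.binary_cross_entropy_with_logits(model(images), binary_labels)
        loss.backward()
        opt.step()
    return model


def aggregate_orbit(models):
    """Federated averaging of the models produced by satellites in one orbit."""
    orbital = copy.deepcopy(models[0])
    with torch.no_grad():
        for name, param in orbital.state_dict().items():
            stacked = torch.stack([m.state_dict()[name].float() for m in models])
            param.copy_(stacked.mean(dim=0))
    return orbital


if __name__ == "__main__":
    torch.manual_seed(0)
    # Fake per-satellite data standing in for high-resolution satellite imagery.
    data = [(torch.randn(20, 3, 32, 32), torch.randint(0, 4, (20,))) for _ in range(3)]

    # (i) Each satellite filters redundant images and trains a binary model
    #     for class 0 vs. the rest.
    local_models = []
    for images, labels in data:
        idx = drop_redundant(images)
        model = local_binary_training(TinyBinaryClassifier(),
                                      images[idx], labels[idx], target_class=0)
        local_models.append(model)

    # (ii) Build the orbital model and retrain it briefly before downlink,
    #      standing in for the communication rounds this step replaces.
    orbital_model = aggregate_orbit(local_models)
    orbital_model = local_binary_training(orbital_model, *data[0], target_class=0)
    print("Orbital model ready for the ground station.")
```

The one-vs-rest decomposition and plain parameter averaging are design assumptions made to keep the sketch self-contained; the paper's actual redundancy criterion, model architecture, and retraining schedule may differ.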

Department(s)

Computer Science

Comments

National Science Foundation, Grant 2008878

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2024 Institute of Electrical and Electronics Engineers, All rights reserved.

Publication Date

01 Jan 2024
