A federated learning architecture for secure and private neuroimaging analysis
Abstract
The amount of biomedical data continues to grow rapidly. However, collecting data from multiple sites for joint analysis remains challenging due to security, privacy, and regulatory concerns. To overcome this challenge, we use federated learning, which enables distributed training of neural network models across multiple data sources without sharing data. Each site trains the neural network on its private data for some time and then shares the neural network parameters (i.e., weights and/or gradients) with a federation controller, which in turn aggregates the local models and sends the resulting community model back to each site, and the process repeats. Our federated learning architecture, MetisFL, provides strong security and privacy. First, sample data never leave a site. Second, neural network parameters are encrypted before transmission and the global neural model is computed under fully homomorphic encryption …
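The training loop the abstract describes (each site updates the model on its private data, only the parameters travel to a federation controller, and the controller averages them into a community model) can be sketched as a minimal FedAvg-style round. Everything below is illustrative: the function names, the toy linear-regression "sites", and the plain-NumPy averaging are assumptions for exposition, not MetisFL's actual API, and the homomorphic-encryption step is omitted.

```python
import numpy as np

def local_update(weights, data, lr=0.1, epochs=1):
    """Toy local training: a few gradient-descent steps on a
    least-squares objective over one site's private data
    (a hypothetical stand-in for neural-network training)."""
    X, y = data
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(community_weights, site_data):
    """One federation round: each site trains locally, only the
    resulting parameters are shared, and the controller averages
    them weighted by each site's sample count (FedAvg-style)."""
    local_models = [local_update(community_weights, d) for d in site_data]
    sizes = np.array([len(d[1]) for d in site_data], dtype=float)
    mix = sizes / sizes.sum()
    return sum(m * model for m, model in zip(mix, local_models))

# Toy demo: three "sites", each holding private regression data
# drawn from the same underlying model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
sites = []
for n in (50, 80, 30):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    sites.append((X, y))

w = np.zeros(2)            # initial community model
for _ in range(50):        # repeated federation rounds
    w = federated_round(w, sites)
```

In a deployment with the security properties the abstract claims, the controller would receive and average encrypted parameters (e.g., under a fully homomorphic scheme) rather than the plaintext arrays used in this sketch.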
- Date: August 9, 2024
- Authors: Dimitris Stripelis, Umang Gupta, Hamza Saleem, Nikhil Dhinagar, Tanmay Ghai, Chrysovalantis Anastasiou, Rafael Sánchez, Greg Ver Steeg, Srivatsan Ravi, Muhammad Naveed, Paul M Thompson, Jose-Luis Ambite
- Journal: Patterns
- Volume: 5
- Issue: 8
- Pages: 101031
- Publisher: Cell Press