A Personalized and Privacy-Preserving Federated Transformer Framework for Multilingual Sentiment Analysis
Summary
FedPerX is a federated transformer framework (a system in which multiple computers train an AI model together without sharing raw data) designed for sentiment analysis across multiple languages while protecting privacy. It combines residual adapters (lightweight, customizable modules added to a shared language model) with differential privacy (a mathematical technique that adds calibrated noise to prevent identifying individuals) so that each participant can personalize the model without compromising data privacy. The framework outperforms existing methods on multilingual benchmarks, with improved accuracy and significantly reduced communication overhead.
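The core privacy mechanism described above, clipping each client's adapter update and adding noise before aggregation, can be sketched as follows. This is a minimal illustration of the general clip-and-noise pattern (as in DP-SGD/DP-FedAvg), not the paper's actual implementation; all function names, the clipping bound, and the noise scale are assumptions for illustration.

```python
import math
import random

def clip_update(update, clip_norm):
    """Scale an update vector so its L2 norm is at most clip_norm."""
    norm = math.sqrt(sum(u * u for u in update))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    return [u * scale for u in update]

def privatize(update, clip_norm, noise_std, rng):
    """Clip the update, then add Gaussian noise calibrated to the clip bound."""
    clipped = clip_update(update, clip_norm)
    return [u + rng.gauss(0.0, noise_std * clip_norm) for u in clipped]

def federated_round(client_updates, clip_norm=1.0, noise_std=0.1, seed=0):
    """Average noised adapter updates; raw updates never leave each client."""
    rng = random.Random(seed)
    noised = [privatize(u, clip_norm, noise_std, rng) for u in client_updates]
    n = len(noised)
    return [sum(col) / n for col in zip(*noised)]
```

Only the small adapter updates are communicated, which is consistent with the reduced communication overhead the summary mentions: the shared transformer backbone stays fixed and only adapter parameters travel each round.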
Classification
Original source: http://ieeexplore.ieee.org/document/11391653
First tracked: May 7, 2026 at 08:03 PM
Classified by LLM (prompt v3) · confidence: 85%