Embracing open-source values, Capital One’s federated model aggregation paves the way for collaborative AI training, unlocking possibilities beyond centralised data. The future of AI starts here.
Capital One has introduced Federated Model Aggregation (FMA), an open source project designed to facilitate federated learning – an innovative method for training AI models without centralising data. Kenny Bean, machine learning software engineer at Capital One, explains, “Federated learning decentralises training, eliminating the need for data to be gathered in one place.”
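The idea Bean describes is commonly implemented as federated averaging: each client trains on its own private data and sends only model weights to a central aggregator, which combines them. A minimal sketch of that loop, using illustrative names that are not part of FMA's actual API:

```python
# Minimal sketch of federated averaging, the core idea behind federated
# learning: clients train locally and share only model updates, never
# raw data. All names here are illustrative, not FMA's real interface.

def local_update(weights, client_data, lr=0.1):
    """Stand-in for local training: nudge weights toward this
    client's data without the data ever leaving the client."""
    return [w - lr * (w - x) for w, x in zip(weights, client_data)]

def aggregate(client_weights, client_sizes):
    """Average client models, weighted by each client's data size."""
    total = sum(client_sizes)
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(len(client_weights[0]))
    ]

# Two clients with private data; the server only ever sees weights.
global_weights = [0.0, 0.0]
clients = [([1.0, 2.0], 10), ([3.0, 4.0], 30)]  # (data, sample count)

for _ in range(5):  # a few federated rounds
    updates = [local_update(global_weights, data) for data, _ in clients]
    global_weights = aggregate(updates, [n for _, n in clients])
```

Each round the global model moves toward the sample-weighted consensus of the clients, which is what lets training proceed without pooling the data in one place.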
FMA takes it a step further, offering a customisable framework for deploying machine learning workflows in a federated environment. It encompasses various Python components, fostering smooth communication between them. Bean highlights, “Connectors streamline communication and offer flexibility for integration.”
This project includes a client for managing interactions, an aggregator for consolidating model updates, and an API service for seamless coordination. Bean notes, “FMA introduces federated learning to distributed models.”
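One plausible way those three pieces fit together is sketched below; this is a hypothetical illustration of the division of labour (client, aggregator, coordinating API service), not the project's actual interfaces, and the in-memory service stands in for real network communication:

```python
# Hypothetical sketch of the client / aggregator / API-service split
# described above. Class and method names are assumptions for
# illustration, not FMA's real API.

class ApiService:
    """Coordinates a round: collects client updates for the aggregator."""
    def __init__(self):
        self.updates = []

    def submit(self, weights, n_samples):
        self.updates.append((weights, n_samples))

class Aggregator:
    """Consolidates submitted model updates into a new global model."""
    def consolidate(self, updates):
        total = sum(n for _, n in updates)
        dim = len(updates[0][0])
        return [sum(w[i] * n for w, n in updates) / total
                for i in range(dim)]

class Client:
    """Holds private data; only trained weights leave the client."""
    def __init__(self, data):
        self.data = data

    def train_and_submit(self, service):
        weights = [x * 0.5 for x in self.data]  # stand-in for training
        service.submit(weights, len(self.data))

service = ApiService()
for data in ([2.0, 4.0], [6.0, 8.0]):
    Client(data).train_and_submit(service)
new_global = Aggregator().consolidate(service.updates)
```

Keeping the coordination logic in a separate service is what allows the connectors Bean mentions to swap in different transports or storage backends without touching the training code.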
Customisation and ease of deployment were central design goals: the team wanted FMA to slot into existing training paradigms rather than replace them. Bean elaborates, “We aimed to integrate seamlessly with pre-existing workflows, giving birth to the FMA service.” Deployment is further simplified through integration with HashiCorp’s Terraform tool.
FMA was not originally intended for open source, but its potential quickly became apparent. “As we saw its flexibility and user-friendliness, it became clear that FMA could benefit a larger community,” Bean says.
Looking ahead, the FMA team plans to enhance collaboration with the community and expand the tool’s components to accommodate different languages.
Capital One’s FMA project democratises AI model training and could expedite the adoption of federated learning. Its combination of customisation, user-friendliness, and open source ethos positions it as a noteworthy player in AI advancement.