Federated Learning

Protecting Data

Using Federated Learning, SAIL trains research algorithms locally at each hospital. As a result, only the insights are transferred back; the data itself never leaves the data owner's infrastructure.


On the SAIL platform, researchers collect insights by training algorithms. A researcher selects the algorithms they want to use to train models. Through the SAIL platform, those algorithms, along with the researcher's code, are sent to the managed datasets the researcher has identified as useful to the model.


At the data owner's site, the code arrives on a SAIL server within the data owner's infrastructure. Locally, the data owner then pairs the code with its datasets to create insights. Those insights are transferred back to the researcher.
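The workflow above follows the federated averaging pattern: each site trains on its own data, and only the updated model parameters (the "insights") travel back to be aggregated. A minimal sketch of that idea on a toy mean-estimation model; the function names and numbers are illustrative assumptions, not SAIL's actual API:

```python
# Minimal federated-averaging sketch. `local_update` and `federated_round`
# are hypothetical names for illustration; the "model" is a single weight
# being pulled toward each site's data.

def local_update(global_weight, local_data, lr=0.1, epochs=5):
    """Train locally at one hospital; only the updated weight leaves the site."""
    w = global_weight
    for _ in range(epochs):
        for x in local_data:
            w -= lr * (w - x)  # gradient step toward this site's data
    return w

def federated_round(global_weight, sites):
    """Each site trains on its own data; only the insights (weights) return."""
    updates = [local_update(global_weight, data) for data in sites]
    return sum(updates) / len(updates)  # aggregate the per-site insights

# Three hospitals, each keeping its raw records on-site.
sites = [[1.0, 1.2, 0.8], [2.0, 2.1, 1.9], [3.0, 2.9, 3.1]]
w = 0.0
for _ in range(10):
    w = federated_round(w, sites)
# w converges toward the overall mean (~2.0) without any raw data
# ever being pooled in one place.
```

The key property is that `federated_round` only ever sees the trained weights, never the rows in `sites`, which is what lets the data stay behind each owner's firewall.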


The Power of FL

P-Value Without FL

0.88

P-Value With FL

0.88

We demonstrated this accuracy through work with researchers at Novartis. Read about our study and findings.

Read the White Paper

Secure Enclaves

Protecting Commercial IP

Secure enclaves provide hardware-assisted confidentiality through Intel SGX technology. SAIL uses Intel SGX and remote attestation to secure the training environments; not even SAIL has access to the data or code inside the secure enclave.

Differential Privacy

Preserving Patient Privacy

Differential privacy injects random noise into a dataset to obfuscate the data layer. Because the obfuscation happens in a controlled (but random) manner, it can be accounted for when generating the trained model, preserving the model's accuracy.
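The standard instance of this idea is the Laplace mechanism: noise with a known, zero-mean distribution is added to a query result, so any single patient's contribution is masked while the analyst can still reason about (and average out) the noise. A minimal sketch, assuming a bounded-mean query; `private_mean` and its parameters are illustrative, not SAIL's implementation:

```python
import math
import random

def private_mean(values, lo, hi, epsilon):
    """Release the mean of a bounded attribute with Laplace noise.

    The sensitivity of a mean over n records clipped to [lo, hi] is
    (hi - lo) / n, so Laplace noise with scale sensitivity / epsilon
    masks any single record's contribution (epsilon-DP for this query).
    """
    n = len(values)
    clipped = [min(max(v, lo), hi) for v in values]
    sensitivity = (hi - lo) / n
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) via the inverse-CDF method.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return sum(clipped) / n + noise

# Example: a noisy mean age over a small cohort (output varies per run).
ages = [34, 51, 47, 62, 38]
print(private_mean(ages, lo=0, hi=100, epsilon=1.0))
```

Because the noise has known mean zero and known scale, repeated or aggregated releases remain unbiased estimates of the true statistic, which is the sense in which the obfuscation "can be accounted for" during training.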

Want to geek out about our technology? Join our Discord Community.


Chat with Us