Imagine a world where decisions are supported by data analysis and collaboration, without exposing sensitive data. A world where one financial institution can privately and securely share sensitive data with another financial institution – and in turn serve their customers better, mitigate risks, and anticipate fraud and financial crimes. Or a world where healthcare organizations can safely share valuable patient information, so healthcare providers can make more meaningful decisions and improve patient care and health outcomes.
How do privacy-enhancing technologies (PETs) make this kind of collaboration possible while preserving privacy? To answer that, you first need to understand the PETs landscape.
Get to know the most common PETs:
- Homomorphic encryption: perhaps the best known of the PETs; allows computations to be performed directly on encrypted data, producing encrypted results without ever decrypting the inputs
- Multiparty computation: allows multiple parties to perform joint computations on individual inputs without revealing the underlying data
- Differential privacy: adds calibrated statistical noise to query results, so aggregate patterns can be analyzed without revealing (or reverse engineering) any individual's data
- Federated learning: statistical analysis or model training on decentralized data sets, without moving the raw data to a central location
- Secure enclave/trusted execution environment: data analysis performed inside an isolated, hardware-protected region of memory that even the machine's operator cannot inspect
- Zero-knowledge proofs: cryptographic method by which one party can prove to another party that a given statement is true, without conveying any additional information
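To make the multiparty computation idea concrete, here is a minimal Python sketch of its simplest building block, additive secret sharing. The scenario (three banks jointly computing a total) and all values are hypothetical; real MPC protocols are far more involved, but the core trick is the same: each party's input is split into random-looking shares that only reveal the answer when combined.

```python
import random

PRIME = 2_147_483_647  # public modulus; all arithmetic is done mod PRIME

def share(secret: int, n_parties: int) -> list[int]:
    """Split a secret into n additive shares that sum to it mod PRIME.

    Any subset of n-1 shares is uniformly random and reveals nothing
    about the secret on its own.
    """
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Three parties (e.g. banks) each secret-share a private value...
inputs = [120, 300, 45]
all_shares = [share(v, 3) for v in inputs]

# ...then party i sums the one share it holds of each input and
# publishes only that local sum. Combining the local sums reveals
# the joint total, but never any individual input.
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]
total = sum(partial_sums) % PRIME  # 465
```

The same share-locally, combine-publicly pattern generalizes (with more machinery) to products and full programs, which is what production MPC frameworks provide.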
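Differential privacy can also be sketched in a few lines. The example below implements the classic Laplace mechanism for a counting query; the function names, data, and epsilon value are illustrative, not taken from any particular library.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse-CDF sampling, using only the standard library."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Release a count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon gives the epsilon-DP guarantee.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical data: release roughly how many customers are 40 or older,
# without the exact count exposing any single customer's presence.
ages = [34, 45, 29, 61, 52, 38, 47]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)
```

Smaller epsilon means more noise and stronger privacy; the analyst trades accuracy for protection, which is exactly the "analyzed but not reverse engineered" property described above.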
To learn more about the PETs, use cases, when you need to combine them, and more, don't miss the full article here.