CAISI and OpenMined CRADA for Privacy-Preserving AI Evaluations
Summary
NIST's Center for AI Standards and Innovation (CAISI) signed a Collaborative Research and Development Agreement (CRADA) with OpenMined, a 501(c)(3) nonprofit, to conduct research into privacy-preserving methods for AI evaluations. The collaboration will enable rigorous measurement of AI systems when data, models, or benchmarks must remain confidential due to intellectual property, data protection, or national security concerns. The research will inform NIST's development of voluntary AI standards and best practices.
What changed
NIST CAISI and OpenMined have entered into a CRADA to research privacy-preserving techniques for conducting AI evaluations securely. The partnership will leverage OpenMined's PySyft software infrastructure to enable evaluations that satisfy the security requirements of AI developers and data owners while meeting the scientific rigor demanded by researchers. The insights generated will support NIST's AI security efforts and inform voluntary standards and recommendations for AI practitioners.
Compliance teams and regulated entities should note that this announcement does not create any new compliance obligations or deadlines. This is a research collaboration focused on developing methodologies that may inform future NIST guidance and standards. Organizations developing or deploying AI systems should monitor NIST's forthcoming publications on AI evaluation best practices, particularly regarding privacy-preserving evaluation methods that balance intellectual property protection with rigorous assessment.
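The core idea behind privacy-preserving evaluation can be illustrated with a minimal sketch. This is not the PySyft API; it is a hypothetical plain-Python illustration of the pattern the announcement describes: the data owner keeps the benchmark items confidential, the model developer keeps the model private, and only an aggregate score crosses the organizational boundary.

```python
# Hypothetical sketch of privacy-preserving evaluation (not the PySyft API).
# The data owner holds the benchmark; the model developer submits a callable.
# Only the aggregate score leaves the data owner's boundary.

class PrivateBenchmark:
    """Holds confidential (prompt, expected_answer) pairs on the data owner's side."""

    def __init__(self, items):
        self._items = items  # never exposed to the model developer

    def score(self, predict_fn):
        """Run the submitted model on each hidden prompt and return
        only the aggregate accuracy, not per-item results."""
        correct = sum(
            1 for prompt, expected in self._items
            if predict_fn(prompt).strip().lower() == expected.lower()
        )
        return correct / len(self._items)


# Data owner's side: a confidential benchmark (toy items for illustration).
benchmark = PrivateBenchmark([
    ("capital of France?", "Paris"),
    ("2 + 2 = ?", "4"),
    ("color of the sky on a clear day?", "blue"),
])

# Model developer's side: a toy "model" submitted for evaluation.
def toy_model(prompt):
    answers = {"capital of France?": "Paris", "2 + 2 = ?": "4"}
    return answers.get(prompt, "unknown")

accuracy = benchmark.score(toy_model)
print(accuracy)  # only this aggregate crosses the boundary
```

Real systems such as PySyft add remote execution, auditing, and cryptographic protections on top of this basic separation of concerns, but the design goal is the same: rigorous measurement without disclosing the underlying data or model.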
Source document (simplified)
Announcement: CAISI signs CRADA with OpenMined to Enable Secure AI Evaluations
March 27, 2026
The Center for AI Standards and Innovation (CAISI) has signed a collaborative research and development agreement (CRADA) with OpenMined, a 501(c)(3) non-profit that develops open-source software infrastructure for secure computation across organizational boundaries. Under this agreement, CAISI and OpenMined will collaborate on research into privacy-preserving methods for conducting AI evaluations, enabling rigorous measurement of AI systems even when the underlying data, models, or benchmarks must remain confidential due to, for example, intellectual property concerns, data protection requirements, or national security considerations.
As AI evaluations are increasingly intended to reflect or predict real-world deployments, access to real-world or sensitive data presents a challenge for researchers. It is simultaneously crucial that data be shared in a secure and decentralized manner, in order to safeguard intellectual property, encourage innovation, and maintain privacy.
This collaboration will leverage OpenMined's software infrastructure, including PySyft and subsequent advances, to conduct evaluations that satisfy the security requirements of AI developers and data owners while meeting the scientific rigor demanded by researchers and evaluators.
The insights generated from this collaboration will support NIST's efforts in AI security and applied AI evaluation. This research will inform the development of voluntary standards, best practices, and future recommendations for AI practitioners and adopters on how to effectively measure AI systems, e.g., for workforce or productivity uplift and other impacts.
For questions, please contact caisi-metrology [at] nist.gov.