Routine · Notice · Amended · Final

NIST AI Risk Management Framework Engagement and Updates

Source: NIST AI Risk Management Framework (www.nist.gov)
Detected March 27, 2026

Summary

NIST's Information Technology Laboratory (ITL) AI Program is organizing workshops and webinars to foster collaboration and advance the development of AI standards and guidelines. The page lists upcoming, recent, and past events related to AI trustworthiness and risk management, including updates to the AI Risk Management Framework (AI RMF 1.0).

What changed

NIST's ITL AI Program is actively engaging stakeholders through workshops and webinars to promote a shared understanding of trustworthy AI and to bolster the scientific underpinnings for assessing AI systems. The program focuses on advancing AI standards, guidelines, and related tools, with recent and upcoming events covering topics such as traceability in agentic AI ecosystems and the international AI standards landscape. The AI Risk Management Framework (AI RMF) 1.0 was launched in January 2023, and ongoing engagement aims to support its practical application and evolution.

While this page primarily serves as an informational hub for NIST's AI engagement activities, it highlights NIST's commitment to developing non-binding guidance and frameworks for AI risk management. Organizations involved in AI development or deployment should monitor these events and resources to stay informed about best practices, emerging standards, and NIST's ongoing contributions to trustworthy AI. No immediate compliance actions are mandated by this notice, but awareness of these initiatives is crucial for entities seeking to align with evolving AI governance and risk management principles.

Source document (simplified)



Artificial intelligence

ITL AI Engagement


Credit: N. Hanacek/NIST

To foster collaboration, develop a shared understanding of what constitutes trustworthy AI, and bolster the scientific underpinnings of how to assess and assure the trustworthiness of AI systems, the NIST Information Technology Laboratory (ITL) AI Program organizes workshops that bring together government, industry, academia, and other stakeholders from the US and around the world. The workshops focus on advancing the development of AI standards, guidelines, and related tools.

Upcoming Workshops & Events

  • ITL AI Webinar Series: Building Traceability into Agentic AI Ecosystems Through Measurement Probes. Learn More and Register.

Recent Workshops & Events

  • ITL AI Webinar Series: The International AI Standards Landscape and ITL’s Role, Priorities, and Progress (March 6, 2026) Watch Recording.

Past Workshops & Events

Ways to Engage

The ITL AI Program relies on and encourages robust interactions with industry, universities, nonprofits, and other government agencies in driving and carrying out its AI agenda. There are multiple ways to engage with NIST, including:

  • NIST AI Consortium: ITL has established the NIST AI Consortium to empower the collaborative establishment of a new measurement science that will enable the identification of proven, scalable, and interoperable techniques and metrics to promote the development and use of AI.
  • Requests for Information (RFIs): The ITL AI Program sometimes uses formal RFIs to inform the public about its AI activities and gain insights into specific AI issues. For example, an RFI was issued to help develop the AI Risk Management Framework.
  • Share your input on draft reports: The ITL AI Program counts on stakeholders to review drafts of reports on a variety of AI issues. Drafts typically are prepared based on input from private and public sector individuals and organizations and then posted for broader public review on NIST’s AI website and via email alerts. Public comments help improve these documents.
  • Student Programs: NIST offers a range of opportunities for students to engage with NIST on AI-related work, including the Professional Research Experience Program (PREP), which provides valuable laboratory experience and financial assistance to undergraduate, graduate, and post-graduate students.

Sign up for AI email alerts here. If you have questions or ideas about how to engage with NIST on AI topics, or about NIST’s AI activities, send an email to ai-inquiries [at] nist.gov.

Created June 16, 2020; Updated March 27, 2026


Source

Analysis generated by AI. Source diff and links are from the original.

Classification

Agency
NIST
Instrument
Notice
Legal weight
Non-binding
Stage
Final
Change scope
Minor

Who this affects

Industry sector
5112 Software & Technology; 9211 Government & Public Administration
Activity scope
AI Risk Management; AI Standards Development
Geographic scope
United States (US)

Taxonomy

Primary area
Artificial Intelligence
Operational domain
IT Security
Compliance frameworks
NIST CSF
Topics
AI Governance; AI Standards
