The National Institute of Standards and Technology (NIST) has released a video describing its responsibilities under the President’s Executive Order (EO) 14110 on Safe, Secure, and Trustworthy Artificial Intelligence, issued on October 30, 2023.
The EO directs NIST to develop evaluation, red-teaming, safety, and cybersecurity guidelines; facilitate development of consensus-based standards; and provide testing environments for evaluation of artificial intelligence (AI) systems. Guidelines and infrastructure developed as a result of the EO will serve as a voluntary resource for the AI community on trustworthy development and responsible use of AI.
USAISI Workshop on AI Safety
NIST has also opened registration for a related hybrid workshop: the U.S. AI Safety Institute (USAISI) Workshop on Collaboration to Enable Safe and Trustworthy AI. The event will take place on November 17, 2023, from 9:00 a.m. to 12:30 p.m. ET, with in-person attendance at the Department of Commerce in Washington, DC, and remote participation available.
The workshop will explore measurement gaps in AI safety and trust, as well as opportunities for collaboration to ensure that AI systems are designed, developed, and deployed with safety and trust in mind. Topics of particular interest include defining key community needs for working groups; understanding how to take advantage of resources such as data, models, and compute; and laying the foundation for collaborations under an open, transparent approach.
The goal of the workshop will be to understand how NIST and the community can work together to create the practice and policy of AI safety and trust.
To maximize the opportunity for all interested parties to attend, NIST requests that organizations limit in-person attendance to two representatives each. Registration for in-person attendance closes on November 15, 2023; there is no deadline to register for virtual attendance. Learn more and register.
See related:
NIST Issues Call for Participants in New AI Safety Consortium