

OpenAI Models Caught Handing Out Weapons Instructions

Tests by NBC News show that OpenAI chatbots can still be jailbroken into giving step-by-step instructions for producing chemical and biological weapons. The post OpenAI Models Caught Handing Out Weapons Instructions appeared first on TechRepublic.