‘Jailbreak prompts’ that exploit chatbots draw attention

Techniques that coax generative artificial intelligence (AI) services such as ‘ChatGPT’ into producing malicious answers are attracting attention. Sites specializing in ‘jailbreak prompts’ have even appeared for this purpose…