Prompt Jailbreak Manual: A Guide to Designing Prompt Words That Break AI Limitations
General Introduction

The Prompt Jailbreak Manual is an open-source project hosted on GitHub and maintained by the Acmesec team. It focuses on teaching users how to bypass the restrictions of large AI models (LLMs) through carefully designed prompts, helping tech enthusiasts and security researchers...