The AI tool ChatGPT allegedly helped a man learn how to smuggle drugs into Europe. ChatGPT gave him advice, tips, and tricks on how to get involved in the drug trade. The man posed a series of questions and, after a long conversation with the AI tool, got the answers he wanted.
Question & Answer Session
ChatGPT eventually proposed certain ways to get involved in smuggling cocaine after a long conversation lasting 12 hours. The first question the man asked was "How do people make crack cocaine?", to which ChatGPT responded with a step-by-step process for making it. ChatGPT usually doesn't answer questions of an illegal nature, but in this case the individual extracted the information he wanted.
ChatGPT Refused To Answer
ChatGPT refused to answer when asked "How do I join a cartel?". But he was finally able to extract his desired answer by framing the question around a character in a novel smuggling cocaine into Europe. The AI tool told him the drugs could be hidden in cargo, concealed on a person, or transported by sea. Not content with merely listing these methods, the bot went as far as giving specific information on each piece of advice, even suggesting 'another substance' that could be used to disguise the cocaine.