SlowMist founder Yu Xian (Cos): Private key leakage risk in AI-generated code, real attack case revealed

PANews reported on November 22 that Twitter user @r_cky0 revealed that when he used ChatGPT to generate code for a blockchain automated trading bot, the code GPT recommended contained a hidden backdoor that sent his private key to a phishing website, costing him about $2,500. SlowMist founder Yu Xian (@evilcos) later confirmed that there have indeed been cases of users being "hacked" through AI-generated code.

Experts point out that such attacks may stem from malicious patterns the AI learned from phishing posts or other unsafe content, and that current AI models struggle to detect backdoors in the code they produce. The industry urges users to stay vigilant and avoid blindly trusting AI-generated code, and recommends that AI platforms strengthen their content review mechanisms to identify and flag potential security risks.
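As an illustration of the kind of vigilance recommended above, the sketch below (not from the original report) shows a crude pre-execution check that flags AI-generated Python code combining outbound network calls, hardcoded URLs, and references to key material. The file name and patterns are hypothetical examples; real review should rely on human auditing and dedicated security tooling.

```python
# Minimal illustrative sketch: scan a generated source file for red flags
# before running it. Patterns here are hypothetical examples only.
import re
import sys

SUSPICIOUS_PATTERNS = [
    r"requests\.(get|post)\s*\(",        # outbound HTTP call (requests library)
    r"urllib\.request\.urlopen\s*\(",    # outbound HTTP call (stdlib)
    r"https?://[^\s\"']+",               # hardcoded URL
    r"private[_ ]?key|secret[_ ]?key|mnemonic|seed[_ ]?phrase",  # key material
]

def scan(path: str) -> list[tuple[int, str]]:
    """Return (line number, matched text) pairs for suspicious lines."""
    hits = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            for pattern in SUSPICIOUS_PATTERNS:
                match = re.search(pattern, line, flags=re.IGNORECASE)
                if match:
                    hits.append((lineno, match.group(0)))
    return hits

if __name__ == "__main__":
    # Usage: python scan_generated_code.py generated_bot.py
    for lineno, text in scan(sys.argv[1]):
        print(f"line {lineno}: suspicious pattern -> {text!r}")
```

A hit does not prove malice, and a clean scan does not prove safety; the point is simply that code combining network calls with private keys deserves a careful read before it ever touches real funds.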


Author: PA一线

This content is for informational purposes only and does not constitute investment advice.
