Coinbase said it would not integrate the popular artificial intelligence tool ChatGPT into its security review process because the tool does not meet the exchange's accuracy requirements.
Coinbase used ChatGPT to test the security standards of 20 unnamed ERC-20 tokens. According to the exchange, the results showed that the tool held "promise for its ability to quickly assess smart contract risks."
However, when ChatGPT's results were compared against the Coinbase security team's manual review, the tool gave eight incorrect answers, five of which were worst-case failures.
A breakdown of these errors showed that ChatGPT incorrectly labeled high-risk assets as low-risk; Coinbase noted that "underestimating a risk score is far more detrimental than overestimating."
(By Oluwapelumi Adejumo)