Post by account_disabled on Dec 2, 2023 2:21:02 GMT -8
If ChatGPT can offer coding solutions, its tendency to hallucinate creates opportunities for hackers.

Publication date: June 14, 2023

Given the rapid, widespread adoption of AI technology for virtually every business use case, the nature of software supply chains, and the broad reliance on open-source code libraries, Vulcan Cyber, the global specialist in remediation automation, believes it is necessary to give IT security and cybersecurity professionals early warning of the dangers that ChatGPT may present.

When ChatGPT hallucinates

Over the last year, millions of people have started using ChatGPT in their professional activities and found that it can significantly lighten their daily workload. That said, there are shortcomings. ChatGPT has been observed generating URLs, references, and even code libraries and functions that do not actually exist. These LLM (large language model) hallucinations have been reported before and may be the result of outdated training data.

If ChatGPT invents code libraries (packages), hackers could exploit these hallucinations to distribute malware without resorting to familiar techniques like typosquatting or spoofing, which look suspicious and are already detectable. But if a hacker can publish real software under the name of the "fake" package ChatGPT recommends, they may be able to trick a victim into downloading and using it. The impact of this problem becomes evident when we consider that, whereas developers previously looked for coding solutions online (e.g., on Stack Overflow), many now turn to ChatGPT for answers, creating a major opportunity for attackers.
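A minimal sketch of why this evades existing defenses: classic typosquat detectors flag package names that sit within a small edit distance of a popular name, but a hallucinated name is usually not close to anything real, so such checks pass it. The allowlist and the example names below are purely illustrative, not taken from the article.

```python
# Hypothetical sketch: typosquat detection via edit distance, and why a
# hallucinated package name slips past it. Names below are illustrative.

def edit_distance(a: str, b: str) -> int:
    """Plain Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

# Illustrative allowlist of popular packages a scanner might protect.
POPULAR = {"requests", "numpy", "pandas", "urllib3"}

def looks_like_typosquat(name: str, max_dist: int = 1) -> bool:
    """Flag names that are near-misses of a popular package (but not exact)."""
    return any(0 < edit_distance(name, p) <= max_dist for p in POPULAR)

# "requestes" is one edit from "requests", so it gets flagged.
# A fully invented name like "fastwebauthlib" is far from everything,
# so a distance-based check sees nothing suspicious about it.
```

This is why the attack described above is attractive: the attacker does not need a lookalike name at all, only the exact name the model hallucinated, which no distance-based heuristic associates with a legitimate package.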