- cross-posted to:
- hackernews@lemmy.bestiver.se
Now, with the help of AI, it's even easier to waste open source developers' time by creating fake security vulnerability reports.
Isn't this a bug in ChatGPT? Someone needs to file a high-priority bug report. I wonder if they can be sued for their tool being used for abuse. Gun makers are trying hard to prevent that kind of liability; if you disagree with them, then you should also demand that ChatGPT's makers be held responsible for how their tool is used.