This is a not-safe-for-work, as in lewd, instance.
It’s safe to assume that anything you see here will be lewd.
Just letting you know. If you view the content and then pull a surprised Pikachu when you see big anime tiddies, Judy Hopps getting railed, or some furry vore… then it’s on you.
Should this popup continue to show up, you may want to enable cookies or disable privacy-focused add-ons in your browser; I assure you we won’t track our users.
Should that fail, some users claim they got rid of it by hammering the OK button.
I appreciate the effort you put into the comment and your kind tone, but I’m not really interested in increasing LLM presence in my life.
I said what I said, and I experienced what I experienced. Providing an example where it works is in no way a falsification of the core of my original comment: LLMs have no place generating code for secure applications absent human review, because they have no mechanism to comprehend or prove their own work.
I’d also add that, depending on the language, the ways you can shoot yourself in the foot can be very subtle (cf. C/C++, which are popular languages for “secure” stuff).
It’s already hard not to write buggy code in the first place, and I don’t think you’ll catch the bugs just by reviewing LLM output, because detecting issues during code review is much harder than detecting them while you’re writing the code yourself.
Oh, and I assume it’ll be tough to get an LLM to follow MISRA conventions.
Definitely. That’s what I was trying to drive at, but you said it well.