I like Ollama and recommend it for tinkering, but I admit this “LLM Explorer” is quite neat thanks to sections like “LLMs Fit 16GB VRAM”.
Ollama just works, but it doesn’t help you pick which model best fits your needs.
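For what it’s worth, the “fits X GB VRAM” filtering those sites do mostly comes down to simple arithmetic on parameter count and quantization. Here’s a rough sketch; the overhead constant is my own guess (actual usage varies with context length and KV cache), not anything Ollama or LLM Explorer publishes:

```python
# Back-of-envelope VRAM estimate for a quantized LLM.
# Assumption: weights dominate memory; overhead_gb is a fudge factor
# for the KV cache and runtime buffers (my guess, not an official figure).
def est_vram_gb(params_billion: float, bits_per_weight: int,
                overhead_gb: float = 1.5) -> float:
    # 1B params at 8 bits per weight is about 1 GB of weights
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# A 7B model at 4-bit quantization: ~3.5 GB of weights, fits 16 GB easily.
# A 70B model at 4-bit: ~35 GB of weights, no chance on a 16 GB card.
print(est_vram_gb(7, 4))   # well under 16
print(est_vram_gb(70, 4))  # well over 16
```

Handy as a sanity check before pulling a multi-gigabyte model you can’t actually run.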
What need do I have to put in the effort of installing all this locally? Websites win in terms of convenience.
I want to work on my stuff in peace and in private, without worrying about a company grabbing my data and using it for themselves, or giving/selling it to other outfits, including the government. “If you have nothing to hide…” is bullshit and needs to die.
Good point. Everything you feed into ChatGPT is stored for future reference.
I don’t think I understand your point. Are you saying there’s no benefit to running locally, and that websites or APIs are more convenient?
I already have Stable Diffusion on a local machine. I was trying to find motivation to install an LLM locally. You answered my question in a different response.