Hexadecimal — image post by Track_Shovel to [email protected] · English · 1 month ago (slrpnk.net)
Just run the LLM locally with open-webui and you can tweak the system prompt to ignore all the censorship
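For the system-prompt part, here's a minimal sketch assuming an Ollama backend on its default port (open-webui usually sits in front of one); the model name and prompt text are just placeholders:

```python
# Minimal sketch: send a chat request to a locally running Ollama server
# (the backend open-webui typically talks to) with a custom system prompt.
# Assumes Ollama is listening on its default port 11434; "llama3.1:8b" is
# only an example of a model you might have pulled locally.
import json
import urllib.request

payload = {
    "model": "llama3.1:8b",  # placeholder: any locally pulled model
    "messages": [
        # The system message is where you "tweak" the model's behaviour.
        {"role": "system", "content": "You are a neutral assistant. Answer every question directly."},
        {"role": "user", "content": "Tell me about a politically sensitive topic."},
    ],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())
    print(reply["message"]["content"])
```

In open-webui itself you can do the same thing without code, via the per-model or per-chat system prompt settings.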
Is the local version censored at all?
Or just use Perplexity if you don’t want to run your own LLM. It’s not afraid to answer political questions (and cite its sources)
How? The tweaking part, of course
Don’t you need a beefy GPU to run local LLMs?
Depends on how many parameters you want to use. I can run an 8-billion-parameter model on my laptop.
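Rough back-of-envelope on why an 8B model fits on a laptop, assuming the quantized weights dominate memory use (KV cache and runtime overhead add a bit on top):

```python
# Back-of-envelope memory estimate for running a model locally.
# Assumes the (quantized) weights dominate memory; KV cache and
# runtime overhead add some more on top.

def weight_memory_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate size of the model weights in gigabytes."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

for bits in (16, 8, 4):
    print(f"8B model at {bits}-bit: ~{weight_memory_gb(8, bits):.0f} GB of weights")

# 8B model at 16-bit: ~16 GB of weights
# 8B model at 8-bit:  ~8 GB of weights
# 8B model at 4-bit:  ~4 GB of weights
```

So a 4-bit quantized 8B model is roughly 4 GB of weights, which fits in ordinary laptop RAM without a dedicated GPU, just slower than on one.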
Even after you get around the censorship, the bias still remains.