CallMeButtLove@lemmy.world to Selfhosted@lemmy.world • Self-Hosted AI is pretty darn cool
3 months ago
Is there a way to host an LLM in a Docker container on my home server but still leverage the GPU on my main PC?
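One common pattern (just a sketch, not something from this thread — the Ollama/Open WebUI split, the IP address 192.168.1.50, and the port numbers are all assumptions) is to run the inference server natively on the GPU PC and keep only the front end in Docker on the home server: start Ollama on the gaming PC with `OLLAMA_HOST=0.0.0.0` so it listens on the LAN, then point a containerized Open WebUI at its API.

```yaml
# docker-compose.yml on the home server (hypothetical setup: Ollama runs
# natively on the GPU PC at 192.168.1.50 and exposes its API on the LAN,
# so only the web front end needs to live in Docker here).
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"          # browse to http://<home-server>:3000
    environment:
      # Point the container at the Ollama API on the GPU machine.
      - OLLAMA_BASE_URL=http://192.168.1.50:11434
    volumes:
      - open-webui:/app/backend/data
    restart: unless-stopped

volumes:
  open-webui:
```

The trade-off is that the model itself runs on the main PC (so it has to be powered on), while the server only hosts the always-on web interface and chat history.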
I really hate when companies do that kind of crap. I just imagine a little toddler stomping around going “No! No! Nooo!”