Discovered: Aug 19, 2023, 17:57. "Running my own LLM" ¦ Nelson's log

NOTE: I too am interested in running a model locally (actually multiple models: a photo model over my 500K photos, and a support model built from the Thunderbird Support KB and Forums), without paying API fees or sharing my data with companies!

QUOTE: But I'm more interested in running the model locally. My example above uses llm-gpt4all, which downloads and runs GPT models downloadable in some standard format. (There's some 13 models there, most are about 7GB big.) There's also hosted plugins for Llama, MLC, and Mosaic.
