Lancern's Treasure Chest
12:09 · Dec 2, 2024 · Mon
llama.cpp guide - Running LLMs locally, on any hardware, from scratch
via steelph0enix.github.io
via hauleth · blog.steelph0enix.dev
Psst, kid, want some cheap and small LLMs?