Goat
A simple way to run LLMs locally
[Screenshot of the app UI]
Goat runs a llama.cpp server in the background and provides a simple UI to interact with it. It also includes a simple automation for downloading models from TheBloke.
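Under the hood, this pattern amounts to launching the llama.cpp server binary as a background process and talking to its local HTTP API. The Go sketch below illustrates the idea under a few assumptions (the `llama-server` binary path, the model file path, and the port are placeholders); it is a minimal illustration, not Goat's actual implementation.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os/exec"
	"time"
)

func main() {
	// Start the llama.cpp server in the background.
	// Binary name, model path, and port are assumptions for this sketch.
	srv := exec.Command("./llama-server", "-m", "models/model.gguf", "--port", "8080")
	if err := srv.Start(); err != nil {
		panic(err)
	}
	defer srv.Process.Kill()

	// Naive wait for the server to load the model; a real app would poll readiness.
	time.Sleep(5 * time.Second)

	// Ask the server for a completion via its HTTP API.
	body, _ := json.Marshal(map[string]any{
		"prompt":    "Hello, how are you?",
		"n_predict": 64,
	})
	resp, err := http.Post("http://127.0.0.1:8080/completion", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Print the generated text returned by the server.
	var out struct {
		Content string `json:"content"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Content)
}
```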
Subscribe to get notified when the alpha version is released.
cgeosoft - all rights reserved