
Adding another Ollama LLM model

Caution

This is not recommended: some features, such as chat with memory, will not work with additional models.

If you want to add another Ollama LLM model, search for one on the Ollama website.

Note

To do this you will need root access on the Jolla Mind 2. If you don't have it enabled, follow the steps in Enable Developer mode.

```shell
# Switch to root
devel-su
# Switch to the venho-ada-env environment
venho-ada-env
# Replace 'name of the model' with the model you want
nerdctl exec -it ollama ollama pull 'name of the model'
# Restart the llm-router so the new model shows up in the web GUI
nerdctl compose restart llm-router
```
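Model names on the Ollama website often include a tag after a colon. As a minimal sketch (runnable anywhere, no device needed), this builds the exact pull command for a given model name; `llama3.2:1b` is only an illustrative name from the Ollama library, and on the device you would run the resulting command as root inside the venho-ada-env environment as shown above.

```shell
# Illustrative model name; substitute the one you found on the Ollama website
model='llama3.2:1b'
# Build the command you would run inside venho-ada-env as root
pull_cmd="nerdctl exec -it ollama ollama pull $model"
echo "$pull_cmd"
```

Running this prints `nerdctl exec -it ollama ollama pull llama3.2:1b`, which is the concrete form of the pull step above.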