.. index:: Language Models

Language Models
====================

.. note::

    This was added in version 2.15.0

It's now possible to run the entire range of supported :term:`Large Language Models` on CoCalc OnPrem.
You can configure this like all other settings in the "Admin panel" → "Site settings".
Click on ``AI-LLM`` to filter the configuration values.
On top of that, you can limit the choices for your users by only including selected models in the list called ``User Selectable LLMs``.

In particular, running your own :term:`Ollama` server and registering it might be an interesting option.

.. note::

    Just like with all the other admin settings, you can also control their values via your :ref:`my-values.yaml `, in the ``globals.settings`` dict.

Ollama
------------------

To tell CoCalc OnPrem to talk to your own Ollama server, you have to set the ``ollama_configuration`` site setting.
Assuming your :term:`Ollama` server runs at the host and port ``ollama.local:11434``, set this in ``globals.settings`` to access a model named ``gemma``::

    ollama_configuration: '{"gemma": {"baseUrl": "http://ollama.local:11434/", "cocalc": {"display": "Gemma", "desc": "Google''s Gemma Model"}}}'

You can add more models by adding more dictionary entries.

User Defined Models
------------------------------

.. note::

    This is available since version ``3.0.3``

Users are also able to configure their own language model – either from a supported provider in the cloud or one hosted at another endpoint.
This can be enabled/disabled in the :term:`Site Settings` under "User Defined LLM", or in your :ref:`my-values.yaml `, in the ``globals.settings`` dict via ``user_defined_llm: "yes|no"``.

With that enabled, the following section appears in the account settings:

.. image:: ../_static/byollm.png
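
The following is a minimal sketch of how these settings could be combined in a ``my-values.yaml``, assuming your Helm values expose the ``globals.settings`` dict mentioned above. The second entry (host ``ollama2.local``, model ``mistral``) is a hypothetical illustration of registering additional models::

    globals:
      settings:
        # Site setting "ollama_configuration": JSON mapping of model name -> endpoint and display metadata.
        ollama_configuration: >-
          {"gemma": {"baseUrl": "http://ollama.local:11434/",
                     "cocalc": {"display": "Gemma", "desc": "Google's Gemma Model"}},
           "mistral": {"baseUrl": "http://ollama2.local:11434/",
                       "cocalc": {"display": "Mistral", "desc": "Hypothetical second model"}}}
        # Site setting "User Defined LLM": allow users to register their own models.
        user_defined_llm: "yes"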