Application Guide: Ollama¶
Note
To use Ollama you will need port forwarding, which is covered below. This requires your own terminal and cannot be done on the portal.
Ollama via Apptainer¶
You can obtain and run an Ollama container using the following steps:
Warning
Please make sure you exit your interactive job once you have finished, as an idle job will waste resources.
- Make sure you are in your project space, as the image you will obtain is large, then run:
unset APPTAINER_BIND
apptainer pull docker://ollama/ollama
This will download the latest version of Ollama as ollama_latest.sif. Specific versions can be found at https://hub.docker.com/r/ollama/ollama/tags
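If you need a specific release rather than the latest, you can pull a pinned tag instead; a sketch (the tag below is illustrative, check the tags page above for current versions):
# Pull a specific tagged version into a named image file (0.5.7 is an example tag)
apptainer pull ollama_0.5.7.sif docker://ollama/ollama:0.5.7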
- To run Ollama, start a shell in the container:
apptainer shell ollama_latest.sif
and then run
ollama --help
to see your options and to check that everything installed correctly. You should see:
Apptainer> ollama --help
Large language model runner

Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  stop        Stop a running model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
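You can also run a one-off command without an interactive shell via apptainer exec, which is handy for quick checks; for example:
# Check the installed Ollama version without opening a shell
apptainer exec ollama_latest.sif ollama --version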
- To start Ollama, run
ollama serve
Ollama defaults to port 11434.
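If port 11434 is already taken on the node (for example by another user's Ollama), the server can be bound to a different port through Ollama's OLLAMA_HOST environment variable; a sketch (11435 is an arbitrary example, adjust the tunnel commands in the next step to match):
# Bind an alternative port if 11434 is in use (11435 is an example)
OLLAMA_HOST=127.0.0.1:11435 ollama serve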
- Next is port forwarding/SSH tunnelling:
- Open a new terminal window.
- SSH onto BlueBEAR, opening a local tunnel:
ssh -L 11434:127.0.0.1:11434 username@bluebear.bham.ac.uk
- SSH onto the compute node, again opening a local tunnel:
ssh -L 11434:127.0.0.1:11434 bear-pgXXXXX
where XXXXX is the rest of the compute node's name.
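As an alternative to the two hops above, recent SSH clients can set up the same tunnel in a single command with the ProxyJump option; a sketch using the same placeholders:
# One-shot tunnel from your own machine via ProxyJump
ssh -L 11434:127.0.0.1:11434 -J username@bluebear.bham.ac.uk username@bear-pgXXXXX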
- In this new terminal you can start another container shell with
apptainer shell ollama_latest.sif
and follow an Ollama example such as those in the CLI reference: https://github.com/ollama/ollama?tab=readme-ov-file#cli-reference
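With the tunnel open, the Ollama server is also reachable from your own machine at localhost:11434, so you can talk to its REST API directly; a minimal sketch (llama3.2 is an example model, pull it inside the container first with ollama pull):
# Run on your local machine while the tunnel is open
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'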
.ollama/models¶
Using the ollama pull
command will download a local copy of a model. These models are very large and are automatically placed in a hidden directory (~/.ollama) inside your home directory.
To prevent this from filling up your home space, you can create a symlink to your project space (if a models directory already exists there, move its contents across first):
cd ~/.ollama
ln -s /rds/projects/_initial_/_projectname_/model_directory models
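To confirm the symlink is in place and to keep an eye on how much space your pulled models are using, something like the following works:
# Confirm that "models" points at your project space
ls -l ~/.ollama
# Show the total size of the downloaded models
du -sh ~/.ollama/models/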